
Computer Networking

2023-01-22 Source: 易榕旅网
(12) STANDARD PATENT APPLICATION (11) Application No. AU 2012201737 A1
(19) AUSTRALIAN PATENT OFFICE
(54) Title: Computer Networking
(51) International Patent Classification(s): G06Q 50/20 (2012.01)
(21) Application No: 2012201737
(22) Date of Filing: 2012.03.21
(30) Priority Data: (31) Number 2011900674 (32) Date 2011.02.25 (33) Country AU
(43) Publication Date: 2012.09.13
(43) Publication Journal Date: 2012.09.13
(71) Applicant(s): Antennahead Pty Ltd; Deane Nathan
(72) Inventor(s): Not Given
(74) Agent / Attorney: Antennahead Pty Ltd, 17 Cassinia Court, WATTLE GROVE, NSW, 2173

ABSTRACT: A system comprising plural networked computing devices with one supervisory device and plural supervised devices, wherein the supervisory device is used to construct: user groups; and interactive media content playlists from a library of interactive media; then, arrange the playlists according to a timetable and selectively provide the playlists to the supervised devices according to each user's group. Thereafter, while the supervised devices are being used by the users to interactively work through their respective playlists, the supervisory device automatically receives real-time progress information from each of the supervised devices and selectively displays that information as: the current content being displayed by one or more of the supervised devices, a graphical progress indicator for one or more of the supervised devices, or both.
And, the supervisory device displays an interaction icon by which the user is able to selectively interact with the supervised devices, by: sending a message to the one or more supervised devices, initiating a two-way audio-visual session with the one or more supervised devices, interacting with the interactive content on the one or more supervised devices, or any combination of these options.

AUSTRALIA
Patents Act 1990
Antennahead Pty Limited

SPECIFICATION

Invention Title: Computer Networking

The invention is described in the following statement:

Title
Computer Networking

Technical Field
This invention concerns computer networking, and in particular the display and control of content among plural networked computing devices. In a first aspect the invention is plural networked computing devices with one supervisory device and plural supervised devices. In another aspect the invention is a method for operating plural networked computing devices with one supervisory device and plural supervised devices.

Disclosure of the Invention
The invention is plural networked computing devices with one supervisory device and plural supervised devices, wherein the supervisory device is used to construct: user groups; and interactive media content playlists from a library of interactive media. Then, arrange the playlists according to a timetable and selectively provide the playlists to the supervised devices according to each user's group. Thereafter, while the supervised devices are being used by the users to interactively work through their respective playlists, the supervisory device automatically receives real-time progress information from each of the supervised devices and selectively displays that information as: the current content being displayed by one or more of the supervised devices, a graphical progress indicator for one or more of the supervised devices, or both.
And, the supervisory device displays an interaction icon by which the user is able to selectively interact with the supervised devices, by: sending a message to the one or more supervised devices, initiating a two-way audio-visual session with the one or more supervised devices, interacting with the interactive content on the one or more supervised devices, or any combination of these options.

Real-time progress information from each of the supervised devices may be displayed in an array of 'pods', where there is one pod for each student. Each pod may selectively display the current content being displayed by one or more of the supervised devices, or a graphical progress indicator for one or more of the supervised devices, or both. The supervisory device will typically display an array of pods, where the pods are automatically resized to fit within the display area of the supervisory device. The supervisory device may be operated, for instance using two fingers that are drawn apart, to enlarge one or more pods. The other pods may automatically downsize to fit the reduced area available to them. In the event the pods become too small, the operator may choose to move some of them temporarily off-screen. These pods will be moved back onto the screen once the enlarged pods are resized down. The pods may have several different modes of display that can be selected depending on the area available to them. For instance, when adequate area is available each pod may present a smaller version of the display as it appears on the corresponding device. However, when space is limited the pods may present either only part of the display, or information relating to the display, or both. A single pod may, of course, be selectively enlarged to fit the whole screen of the supervisory device.
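The automatic pod-resizing behaviour described above can be illustrated with a short sketch. This is a minimal model under assumed conventions, not the patented implementation; the names (`Pod`, `fit_pods`) and the 100-unit base size are all hypothetical:

```python
import math
from dataclasses import dataclass

@dataclass
class Pod:
    student: str
    scale: float = 1.0  # 1.0 = normal size; > 1.0 = enlarged by the operator

def fit_pods(pods, display_w, display_h, base=100):
    """Downsize the un-enlarged pods so that the total area claimed by
    all pods never exceeds the supervisory display's area."""
    total_area = display_w * display_h
    # Area already claimed by pods the operator has enlarged.
    used = sum((base * p.scale) ** 2 for p in pods if p.scale > 1.0)
    rest = [p for p in pods if p.scale <= 1.0]
    if not rest:
        return {}
    # Share whatever area remains equally among the other pods,
    # never growing them beyond their normal size.
    side = min(base, math.sqrt(max(total_area - used, 0) / len(rest)))
    return {p.student: side for p in rest}
```

On a small display, enlarging one pod to double size forces the remaining pods below their normal size, mirroring the behaviour in which the other pods automatically downsize to fit the reduced area available to them.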
Interacting with the interactive content on the one or more supervised devices may involve selectively switching control of the supervised device from its user to the user of the supervisory device. This may involve toggling control between the two devices. The interaction may also be selectively presented on one or more of the other supervised devices for demonstration purposes.

The supervisory device may be used to arrange the supervised devices into groups depending on the user's performance, and to provide different playlists for each group, or to monitor the groups separately. The supervisory device may automatically collect performance information from the supervised devices and arrange that information for display of group or individual performance, or both. Performance issues may be automatically highlighted for each user or group of users. The computing devices may be tablet computers or pads with touch-screens. An avatar may be displayed on the computing devices for interacting with the devices using voice recognition, motion recognition, or object recognition.

In a second aspect the invention is a method for operating plural networked computing devices with one supervisory device and plural supervised devices, comprising the steps of: constructing user groups and interactive media content playlists from a library of interactive media. Then, arranging the playlists according to a timetable and selectively providing the playlists to the supervised devices according to each user's group. Thereafter, while the supervised devices are being used by the users to interactively work through their respective playlists, the supervisory device automatically receives real-time progress information from each of the supervised devices and selectively displays that information as: the current content being displayed by one or more of the supervised devices, a graphical progress indicator for one or more of the supervised devices, or both.
And, the supervisory device displays an interaction icon by which the user is able to selectively interact with the supervised devices, by: sending a message to the one or more supervised devices, initiating a two-way audio-visual session with the one or more supervised devices, interacting with the interactive content on the one or more supervised devices, or any combination of these options.

Brief Description of the Drawings
An example of the invention, in the context of a classroom, will now be described with reference to the accompanying drawings, in which:
Fig. 1(a) shows a diagram of a teacher at the front of a classroom using a tablet type computer.
Fig. 1(b) is a zoomed-out view of the teacher at the front of a classroom showing the tablet computers available to the students in the classroom, and the reverse of a large tablet style blackboard.
Fig. 2 is a screenshot showing a number of student icons being collected together to form a user group.
Fig. 3 is a screenshot showing five different groups of children studying respective lessons in English.
Fig. 4 is a screenshot of a daily calendar for the teacher.
Fig. 5 is a screenshot showing how each pod selectively displays the current content being displayed by the supervised devices.
Fig. 6 is a screen full of pods and a popup menu.
Fig. 7 illustrates how a user expands and moves pods in one row.
Fig. 8 illustrates a screen full of collapsible pods.
Fig. 9 illustrates a screen full of collapsed pods.
Fig. 10 is a screenshot showing how different media items can be collected to create a lesson playlist.
Fig. 11 is a screenshot showing a class of students and their respective progress through a particular lesson.
Fig. 12 is a screenshot showing a messaging session between the teacher and a student.
Fig. 13 is a screenshot showing the participants of an audio-visual conversation together with the subject matter under discussion.
Fig. 14 is a screenshot showing the teacher activating an intervention screen for a student in the lesson.
Fig. 15 is a screenshot showing the teacher controlling the student's screen during an intervention.
Fig. 16 is a screenshot showing the student operating their own screen during an intervention.
Fig. 17 is a screenshot showing an intervention between the teacher and a group of students.
Fig. 18 is a report of student progress in a class.
Fig. 19 is a report of a particular student's performance in text reading and comprehension.
Fig. 20 is a screenshot showing graphical reporting of a class of students.
Fig. 21 is a schematic drawing showing how a touch-screen computer can be used to interact with objects.
Fig. 22 is a schematic diagram showing how a touch-screen computer may interact with three-dimensional letters.
Fig. 23 illustrates an interactive avatar displayed on a large surface display.
Fig. 24 illustrates two interactive avatars displayed on a large surface display.

Best Modes of the Invention
Referring first to Fig. 1(a), the teacher 10 operates the supervisory device 12, which in this case is a large 'iPad'. Fig. 1(b) shows the teacher 10 at the front of a classroom where each student has a smaller iPad 14 for their own use. A large surface display 16 is also provided as an electronic blackboard for the class. The supervisory iPad 12 is used by the teacher to supervise the students' iPads.

A first way the teacher uses the supervisory iPad is to create user groups, by dragging and dropping student icons into a stack, as shown in Fig. 2. The students in this stack become a group. In Fig. 3 five different groups of students have been created for five different levels of aptitude. Each group of students will simultaneously receive different interactive media content playlists to work through during their English lesson. The playlists for all the lessons are designed and arranged to provide a day's classes, which are organised according to a diary, as shown in Fig. 4. In Fig. 4 we see that Kylie Young, the second grade teacher of class 12A, has arranged an art class from 8am to 9am, a maths class from 10am to 11am and a science class from 12 to 1pm. By each subject she has also entered notes, for instance that an assignment is due for the art class and a worksheet is required for the maths class.

To create a particular class, for instance an English class from 9am to 10:30, the teacher creates a time-bound workspace and populates it with interactive media content making up a playlist. The playlist will play and be worked on interactively by the students during that lesson.

During the lesson, selection of that lesson by the teacher will lead to the display of an array of pods displaying real-time progress information from each of the supervised devices. As shown in Fig. 5, each pod selectively displays the current content being displayed by the supervised devices, and a photograph of the respective student.

Fig. 6 illustrates how the pods appear rectangular on-screen and are arranged side-by-side in a grid layout. Each pod contains a frame around the outside with a visual view of the remote display screen's contents inside the frame. The frames define the edge of each pod and separate its contents from those of the rest of the pods being displayed. There are a number of selectable regions on each pod. When a selectable region is activated, for instance by the teacher tapping it, a popup menu 602 appears. The menu displays shortcuts to commonly used actions available for the pod, relevant to the view mode option selected. For example, a library is represented by a graphical button with a small iconic picture of a library; upon tapping the button, the frame morphs open to display the library window view. The popup menu may also comprise a graphical button for maximising the popup menu to full screen.

Pods are arranged on the screen in a specific hierarchy. For instance, one hierarchy arranges students based on their skill level. Fig. 7 illustrates how the user can scale up a group of pods by touching the screen with two fingers and at the same time dragging the fingers apart. As a result, other pods that no longer fit inside the viewing area may be moved outside the viewing area, and therefore out of sight. In this example, the user has scaled up a row of pods. The shaded pods of the same row are shown at both sides of the screen to indicate that they no longer fit on the screen and are therefore not displayed. Similarly, the shaded row of pods shown on the bottom side of the screen were moved downwards to accommodate the scaled-up row of pods in the centre. Alternatively, the user can move a selected group of pods out of the viewing area, or back into the viewing area. The user may touch the screen with one finger and drag the finger sideways to push pods out of the screen and, at the same time, pull pods onto the screen that were not previously displayed. Similarly, the user may drag the finger vertically to move pods into and out of the screen over the top and bottom sides.

Fig. 8 illustrates a screen of collapsible pods and a number of collapsible pods (shaded) that do not fit onto the screen. Each collapsible pod comprises a permanent region A and an optional region B. As shown in Fig. 9, the user may choose to display only the permanent region of each pod. As a result, each pod occupies less area and the pods can be displayed more compactly. Therefore, those pods which did not fit onto the screen previously can now be displayed.

In cases when multiple groups, or pods, have been scaled up and the pod of interest has shifted outside of the viewing area, and the ability to locate that pod has become difficult, the user can select a "Fit to display" command, upon which all pods are automatically downsized until they all fit back within the viewing area. The pod of interest can now be located. Some users may prefer to keep a specific pod scaled up and reduce all other pods to fit within the viewing area. For instance, it may be desirable to keep a single pod enlarged to enable increased visibility of events playing on its screen. To do this the user can touch and hold the pod of interest and select the "Fit to display" command, upon which all other pods will reduce in size until they all fit back within the viewing area.

In order to create the playlist for a class the teacher can select interactive media content items from a library, as shown in Fig. 10. Each item of content can be dragged into the lesson in the desired order, and then rearranged as required, in order to create the lesson playlist.

During the lesson the teacher is able to monitor the performance of all the students in the class as the lesson proceeds, as shown in Fig. 11. In this figure a student pod is presented for each student, and in the pod there is a progress bar which indicates the student's progress through the playlist in real time. It can be seen in this particular screen that all the students have progressed 20% through the playlist. Also presented in each student's pod is their photograph and their accuracy rate. The first student can be seen to have a low accuracy rate of '9', whereas other students have a variety of accuracy rates, with the highest being '19'. The teacher can not only monitor the accuracy rate, but by touching a student's pod is also able to see exactly what the student is looking at and interacting with at that time. This ability to monitor all members of the class in real time enables the teacher to make timely interventions as required. A number of different types of intervention are possible, including sending the student a message, or even engaging in a messaging conversation, as shown in Fig. 12.
In this case the teacher has complimented the student and then entered into a short conversation about the student's progress and future work. An alternative is for the teacher to engage in an audio-visual session with the student, as shown in Fig. 13. In this case the student and teacher can see a full-screen live image of each other's face in order to assist communication. Also, in sub-screens they can see themselves, and the interactive media item on which they are working.

Fig. 14 illustrates how the teacher is also able to interact with the student using an intervention window which allows the teacher to operate the student's device. In fact the teacher is able to toggle control of the device so that they can demonstrate to the student, and then allow the student to resume control and try for themselves, as shown in Fig. 15 and Fig. 16. As well as interacting with a single student, the teacher is also able to interact with groups of students, as shown in Fig. 17. In this case the teacher is interacting with a number of students who are attempting the same task, and is able to control all their screens in order to demonstrate to each of them simultaneously, before allowing them each to try individually.

In addition to the ability to track progress and other performance criteria such as accuracy, instantaneous reporting takes place, as shown in Fig. 18. In this case the tasks in a particular lesson are shown together with whether the particular student was able to correctly complete each task or not. The incorrectly completed tasks are colour coded relative to the correctly completed tasks to enable the teacher to design a future lesson in which more time is spent teaching the student to deal with the incorrectly completed tasks. More sophisticated reporting can also be constructed for each student, as shown in Fig. 19, where a text reading and comprehension report is provided together with ratings for accuracy, error rates, self-correction and miscues.
These criteria are also useful in designing future classes. Overall reporting can also be made for the class as a whole, or for individual students over a period of time, as shown in Fig. 20.

Fig. 21 illustrates another functionality that may be provided, which is for the interactive devices to be able to respond to objects placed on them or near them. For instance a figurine placed upon a device's touch-screen may activate particular programming or lessons, in which the figurine may be moved or reoriented in order to complete the tasks of that lesson. Fig. 22 illustrates how three-dimensional letters can be placed on top of the touch-screen, which will automatically respond according to its lesson programming, for instance to check that the student is able to arrange the letters to correctly spell a word.

Figure 23 shows an example for controlling the large surface display 16. In this example, the system comprises additional input devices such as a realtime video camera 47 with integrated microphone, and a mouse 48. The surface display is connected to a user interpreter layer 42, which in turn is connected to an operating system 43. The operating system comprises data controllers 44 and applications 45. The camera 47 and mouse 48 are connected to the user interpreter layer 42 via a rules management system 46. An avatar 49 is displayed on the surface display 16 and faces the teacher 10. The camera 47 monitors the teacher 10 and, together with the rules management system 46, detects commands using voice, motion and object recognition. The teacher 10 interacts with the avatar 49 via commands detected by the camera 47 or via commands using the mouse 48. The commands are interpreted by the rules management system 46, which combines the input with information requested from the teacher 10 by the on-screen avatar 49 to produce a true interactive exchange of request and feedback. The teacher 10's intentions are more accurately interpreted with the aid of the on-screen avatar, and the resulting instructions are passed on to the data controllers 44 and applications 45 that make up the operating system 43. Of course, not only the teacher 10 but also any other student may interact with the avatar 49.

Fig. 24 illustrates the interactive control of a surface display 16 by the teacher 10 and a second person 10'. This second person may be another teacher or another student. The display 16 now displays two avatars 49 and 49' for the teacher 10 and the second person 10' respectively. In this example, a second camera 47' for detecting input from the second person 10' is provided. In a different example, the camera 47 may detect input from all persons interacting with the display. In another example, more than two persons, such as the whole class, may interact with the display 16 via voice, motion and object recognition. In this case, each student may interact with one individual avatar, or all students may interact with one single avatar. In yet another example, the interactive avatar 49 may be displayed on the students' pads 14 or the teacher's pad 12. In this case, the students and the teacher interact with the avatar on their respective screens via voice, motion and object recognition. The cameras for detecting the commands may be integrated into the pad devices or externally connected to the pads.

It will be appreciated by persons skilled in the art that numerous variations or modifications may be made to the invention as shown in the specific embodiments without departing from the scope of the invention as broadly described. For instance, items or content may be shared between remote screens, using drag and drop functionality. The present embodiments are, therefore, to be considered in all respects as illustrative and not restrictive.

Claims
1. A system comprising plural networked computing devices with one supervisory device and plural supervised devices, wherein the supervisory device is used to construct: user groups; and interactive media content playlists from a library of interactive media. Then, arrange the playlists according to a timetable and selectively provide the playlists to the supervised devices according to each user's group. Thereafter, while the supervised devices are being used by the users to interactively work through their respective playlists, the supervisory device automatically receives real-time progress information from each of the supervised devices and selectively displays that information as: the current content being displayed by one or more of the supervised devices, a graphical progress indicator for one or more of the supervised devices, or both. And, the supervisory device displays an interaction icon by which the user is able to selectively interact with the supervised devices, by: sending a message to the one or more supervised devices, initiating a two-way audio-visual session with the one or more supervised devices, interacting with the interactive content on the one or more supervised devices, or any combination of these options.
2. The system claimed in claim 1, wherein real-time progress information from each of the supervised devices is displayed in an array of 'pods' where there is one pod for each student.
3. The system claimed in claim 2, wherein each pod selectively displays the current content being displayed by one or more of the supervised devices, or a graphical progress indicator for one or more of the supervised devices, or both.
4. The system claimed in claim 2, wherein the supervisory device displays an array of pods, where the pods are automatically resized to fit within the display area of the supervisory device.
5. The system claimed in claim 2, wherein the supervisory device is operated, using two fingers that are drawn apart, to enlarge one or more pods.
The other pods automatically downsize to fit the reduced area available to them.
6. The system claimed in claim 2, wherein the pods have several different modes of display that can be selected depending on the area available to them.
7. The system claimed in claim 2, wherein a single pod can be selectively enlarged to fit the whole screen of the supervisory device.
8. The system claimed in claim 1, wherein interacting with the interactive content on the one or more supervised devices may involve selectively switching control of the supervised device from its user to the user of the supervisory device.
9. The system claimed in claim 1, wherein the interaction is selectively presented on one or more of the other supervised devices for demonstration purposes.
10. The system claimed in claim 1, wherein the supervisory device is used to arrange the supervised devices into groups depending on the user's performance, and to provide different playlists for each group, or to monitor the groups separately.
11. The system claimed in claim 1, wherein the supervisory device automatically collects performance information from the supervised devices and arranges that information for display of group or individual performance, or both.
12. The system claimed in claim 10, wherein performance issues are automatically highlighted for each user or group of users.
13. The system claimed in claim 1, wherein the computing devices are tablet computers or pads with touch-screens.
14. The system claimed in claim 1, wherein an avatar is displayed on the computing devices for interacting with the devices using voice recognition, motion recognition, or object recognition.
15. A method for operating plural networked computing devices with one supervisory device and plural supervised devices, comprising the steps of: constructing user groups and interactive media content playlists from a library of interactive media. Then, arranging the playlists according to a timetable and selectively providing the playlists to the supervised devices according to each user's group. Thereafter, while the supervised devices are being used by the users to interactively work through their respective playlists, the supervisory device automatically receives real-time progress information from each of the supervised devices and selectively displays that information as: the current content being displayed by one or more of the supervised devices, a graphical progress indicator for one or more of the supervised devices, or both. And, the supervisory device displays an interaction icon by which the user is able to selectively interact with the supervised devices, by: sending a message to the one or more supervised devices, initiating a two-way audio-visual session with the one or more supervised devices, interacting with the interactive content on the one or more supervised devices, or any combination of these options.
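The method steps recited in claim 15 (constructing groups and playlists, arranging them on a timetable, selectively providing them per group, and folding real-time progress into a display) can be sketched as follows. This is an illustrative data-flow model only, not the filed implementation; all names (`Timetable`, `dispatch`, `progress_summary`) are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class Playlist:
    name: str
    items: list  # interactive media items drawn from a library

@dataclass
class Timetable:
    # Maps (lesson slot, user group) to the playlist scheduled for it.
    slots: dict = field(default_factory=dict)

    def schedule(self, slot, group, playlist):
        self.slots[(slot, group)] = playlist

def dispatch(timetable, slot, memberships):
    """Selectively provide each supervised device with the playlist
    scheduled for its user's group at the given slot."""
    out = {}
    for device, group in memberships.items():
        playlist = timetable.slots.get((slot, group))
        if playlist is not None:
            out[device] = playlist.name
    return out

def progress_summary(reports):
    """Fold per-device progress reports into what the supervisory device
    displays: the current content plus a progress indicator per device."""
    return {dev: {"current": r["current_item"],
                  "progress": f"{r['done']}/{r['total']}"}
            for dev, r in reports.items()}
```

For example, two groups scheduled into the same 9:00-10:30 slot receive different playlists, and each supervised device is served only its own group's list.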
