WO1998026346A1 - Computersteuerung - Google Patents
- Publication number
- WO1998026346A1 (PCT/DE1997/002970)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- computer
- cursor
- function
- control according
- sensor area
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04812—Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/038—Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
Definitions
- the invention relates to a controller for a computer.
- Controls for multifunctional systems, e.g. computers, have been known in simple form for some time.
- computers are known in which a user triggers or influences functions of the computer via a trigger, e.g. a mouse-guided cursor on a screen, or via a data glove for manipulating spatial objects.
- Multimedia systems are understood to mean systems in which the senses of a person are addressed through different media, e.g. influenced by texts, images, videos, sounds, noises or music.
- multimedia systems also include tactile (touch) or olfactory (smell) stimuli that affect a person.
- the object of the present invention is to provide a controller for a computer and a method for computer control with which the functions of the computer can be controlled in a particularly differentiated manner and the functionality of the computer is increased.
- the control of a computer according to the invention has sensor means with which the position of at least one cursor on a display of the computer is detected. If the position of the cursor lies within a certain partial area of the display, the sensor area, the dwell time and the position of the cursor in the sensor area are measured. The information about the dwell time and the position is used to influence at least one function of the computer. The dwell time of the cursor is not measured outside the sensor area.
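The mechanism just described, detecting whether the cursor lies inside a sensor area and measuring dwell time and position only while it does, can be sketched as follows. This is a minimal Python sketch; the class name, the rectangular shape and the timing API are illustrative assumptions, since the patent prescribes neither a shape nor an implementation.

```python
import time

class SensorArea:
    """A rectangular sub-area of the display that measures cursor dwell time.

    Hypothetical sketch: names and the rectangular shape are assumptions,
    not taken from the patent text.
    """

    def __init__(self, x, y, width, height):
        self.x, self.y = x, y
        self.width, self.height = width, height
        self._entered_at = None   # timestamp of cursor entry, None while outside
        self.last_position = None

    def contains(self, cx, cy):
        return (self.x <= cx < self.x + self.width
                and self.y <= cy < self.y + self.height)

    def update(self, cx, cy, now=None):
        """Feed the current cursor position; returns the dwell time in seconds
        while the cursor is inside, or None outside (dwell is not measured)."""
        now = time.monotonic() if now is None else now
        if self.contains(cx, cy):
            if self._entered_at is None:
                self._entered_at = now   # cursor just entered the sensor area
            self.last_position = (cx, cy)
            return now - self._entered_at
        self._entered_at = None          # leaving resets the measurement
        return None
```

Outside the sensor area `update` returns `None`, mirroring the statement that dwell time is not measured there; on re-entry the dwell time starts again from zero.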
- a computer is understood to mean any data processing device which is equipped, inter alia, with a screen and a device for controlling a cursor (e.g. mouse, digitizer).
- the control of the computer according to the invention can be designed, for example, in the form of a processor or a program. In general, the functional units described below can be implemented either as software or hardware.
- An advantageous embodiment of the control according to the invention detects the speed and / or acceleration of at least one cursor and uses this information to influence at least one function of the computer.
- the functionality of the computer is increased by these additional kinematic parameters of the trigger.
- a controller according to the invention for a computer can, for example, react differently to a fast or a slow movement of a cursor.
- the trajectory described by at least one cursor on the computer screen is recorded, and this information is used to influence at least one function of the computer.
- the path curves of the cursor on the computer display are characteristic of certain situations when operating the computer or for certain users. This information can be used to improve the adaptation of the computer to a user.
- the control according to the invention has means with which the kinematic (dynamic) behavior of the cursor is quantified.
- the kinematic behavior of the cursor here is generally understood to mean the space-time behavior of the cursor on the computer screen, which in particular includes the dwell time, the position, the speed and the acceleration of the cursor.
- Quantification means that the kinematic behavior of the cursor is determined by parameters or functions which e.g. describe the dwell time or the shape of the trajectory. These parameters and functions form the input values for functional relationships that directly link the kinematic behavior of the cursor with a function of the computer. These functional relationships can be permanently stored in a database or can be changed over time.
- the quantification is used with particular advantage in combination with a random generator, so that novel effects can be achieved again and again, in particular in computers with multimedia applications or games.
- the control according to the invention advantageously has means of changing the position, shape and / or the function of at least one sensor area in the computer in a predeterminable or randomly controlled manner. This allows the sensor areas to be adapted to changing situations, which increases the flexibility of the control and the computer.
- a database is used to store the kinematic behavior of at least one cursor. It is also advantageous to record and store the spatial, temporal and / or functional changes in at least one sensor area in a database. In this way, for example, certain movements or movement patterns of the cursor can be stored and used in a particularly advantageous manner for influencing functions of the computer and / or at least one sensor area.
- in an advantageous embodiment of the control according to the invention there is a continuous transition between at least two different functions of the computer; this transition is referred to as fading.
- a particularly advantageous embodiment of the control system according to the invention has a database in which objects for influencing at least one function of the computer are stored. At least one of these objects has an attribute that describes a property of the object. This attribute can e.g. describe the type of object (e.g. text) or the content of the object (e.g. poem). By using attributes, the controller can easily establish relationships between different objects.
- At least one object stored in the database and / or an attribute of the object has a modifier.
- This modifier is a measure by which the control can compare different objects or attributes with each other.
- a modifier can be stored in a predeterminable manner in a database or changed by the controller over time.
- the control according to the invention advantageously has means with which at least one function of the computer can be controlled by the kinematic behavior of the cursor in connection with attributes and / or modifiers of at least one object. This makes it possible for the kinematic behavior of the cursor and the properties of the objects to influence the function of the computer, which enables very flexible control of the computer.
- the controller according to the invention also advantageously has means with which objects, in particular media, can be automatically stored in the database, sorted according to their type. This considerably speeds up the acquisition of objects (e.g. texts or images) that are to be used as functions of a computer.
- the controller can e.g. automatically assign certain attributes to the objects.
- At least one of the objects advantageously has information about a sensor area, an image, a text, a sound, a piece of music, control data for external devices, data for a three-dimensional representation, a modifier, an attribute or a group of objects.
- this information can be used in a uniform manner to influence the computer.
- At least one sensor area for a cursor is advantageously stored invisibly on the display of the computer.
- the display appears to the user in the usual form.
- the cursor on the display of the computer can be controlled by eye movements of a user of the computer.
- the eye movements can be recorded, for example, by video monitoring of the pupils or by deriving action potentials from facial muscles. Operating the cursor using eye movements is particularly useful for people who cannot use their hands when working on the computer.
- the position of at least one cursor on a display of the computer is first detected by sensor means.
- Computer functions, such as audio-visual signals, can be activated in this way.
- the information recorded by the sensor means is then transmitted to the control.
- the control determines whether the at least one cursor is located in a 1- or 2-dimensional partial area (sensor area) of the computer display. If the position lies within the sensor area, the control then determines the dwell time and the position of the at least one cursor within the sensor area. Depending on the dwell time and the position of the at least one cursor in the sensor area, the control finally influences the at least one function of the computer.
- FIG. 1 shows a schematic representation of a display of a computer, functions of the computer being influenced by a cursor and by sensor areas for the cursor;
- FIG. 2 shows a schematic representation of a sensor area on a display of a computer;
- FIG. 3 shows a schematic representation of a functional relationship between the position of a cursor on the display of the computer and a function of the computer (interaction graph);
- FIG. 4 shows a schematic representation of a functional relationship between the temporal behavior of a cursor and a function of the computer
- FIG. 6 shows a schematic illustration of the temporal sequence of positioning a cursor
- FIG. 1 shows, as an example, a schematic view of a display of a computer 1 which is equipped with a control according to the invention.
- a cursor 3 serves as a trigger for functions 10 of the computer 1.
- Functions 10 are, for example, the volume of a sound clip or the provision of program menus.
- the cursor 3 is moved with the aid of a mouse or another handling device over the display of the computer 1.
- the control of the computer 1 defines certain sub-areas at some points on the display; it registers when a position 8 of the cursor 3 lies inside such a sub-area. These sub-areas are called sensor areas 2 below.
- the control detects and stores not only the position 8 of the cursor 3 but also the dwell time of the cursor 3 in a sensor area 2.
- the shape of a sensor area 2 is not rigid, but can be adjusted as required in terms of position, shape and/or function on the display of the computer 1. It is also possible for the entire display of the computer 1 to be covered with sensor areas 2, so that the dwell time of the cursor 3 is measured at every point on the screen, different functions 10 of the computer 1 being triggered depending on the sensor area 2.
- An overlap of sensor areas 2 is also possible, with the control of the computer 1 then determining how the dwell times are processed (e.g. weighting, addition of the dwell times).
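Where sensor areas 2 overlap, the text leaves the combination rule to the control and names weighting and addition of dwell times as examples. A hedged sketch of both rules, with assumed function and parameter names:

```python
def resolve_overlap(dwell_times, mode="add", weights=None):
    """Combine the dwell times of sensor areas that overlap at the cursor
    position.

    dwell_times: dwell time (seconds) per overlapping area, in a fixed order.
    mode: "add" simply sums the dwell times; "weight" applies per-area
    weights first. Both rules are named in the text as examples; which rule
    the control actually uses is left open there.
    """
    if mode == "weight":
        weights = weights or [1.0] * len(dwell_times)
        return sum(w * t for w, t in zip(weights, dwell_times))
    return sum(dwell_times)
```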
- the sensor areas 2 are invisible on the display, i.e. the user sees e.g. the usual display of a multimedia application or a word processing system.
- the sensor areas 2 can, however, be made visible when a program is being created or when a program is being debugged, in order to check the function 10.
- a multimedia lexicon according to the invention shows e.g. texts, images and videos on a screen, sensor areas 2 being stored at certain points on the screen.
- a user of the multimedia lexicon guides the cursor 3 into the area of the display that is of particular interest to him. If the cursor 3 is guided into a sensor area 2, the control system detects the position 8 of the cursor 3 and its dwell time in this sensor area 2. For this purpose, the control system has timer functions.
- the control interprets the dwell time of the cursor 3 in a sensor area 2 as the interest of the viewer and quantifies this interest as the so-called energy value. In this way, the perception of a user can be described by a measure.
- the energy value is stored in a database and thus serves as a memory for the interest of a viewer.
- the controller ensures that the energy value is changed after some time, so that forgetting or a waning interest is simulated.
- the control system maintains an "energy budget" with which it can always be determined in which sensor areas 2 which energy was consumed.
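The energy bookkeeping described above, dwell time converted to an energy value per sensor area, stored as a memory of the viewer's interest and reduced over time to simulate forgetting, might look like this. The exponential decay rule and all names are assumptions made for illustration; the patent specifies neither.

```python
class EnergyBudget:
    """Tracks 'energy' (quantified interest) consumed per sensor area,
    with gradual decay so that forgetting or waning interest is simulated.
    The class name, the decay model and the threshold test are illustrative
    assumptions."""

    def __init__(self, decay_per_second=0.05):
        self.decay = decay_per_second
        self.energy = {}   # sensor area id -> accumulated energy value

    def consume(self, area_id, dwell_time):
        # dwell time in a sensor area is interpreted as interest
        # and accumulated as an energy value
        self.energy[area_id] = self.energy.get(area_id, 0.0) + dwell_time

    def tick(self, dt):
        # exponential decay models forgetting / waning interest
        factor = (1.0 - self.decay) ** dt
        for area_id in self.energy:
            self.energy[area_id] *= factor

    def reached(self, area_id, threshold):
        # the control triggers further functions once a threshold is reached
        return self.energy.get(area_id, 0.0) >= threshold
```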
- the state of a cursor 3 is detected at all times by the current position, the current speed and the dwell time at its current position 8. In a two-dimensional display of a computer 1, the state can therefore be described by five values.
- the control determines the further behavior of the computer 1 (see also FIGS. 2 to 4). After a certain time (when a threshold value for energy is reached), cross-references to related topics are displayed, for example, or a piece of music that matches the context is played. It is possible that the newly displayed pictures or recorded music overlap each other, thus creating a continuous transition between the scenes (fading).
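The fading between overlapping scenes mentioned above can be illustrated with a simple cross-fade; the linear ramp is an assumption, since the text only requires a continuous transition.

```python
def crossfade(old_value, new_value, progress):
    """Continuous transition (fading) between two scenes, applied e.g. to
    image opacity or music volume. progress runs from 0.0 (old scene only)
    to 1.0 (new scene only) and is clamped to that range; the linear blend
    is an illustrative assumption."""
    progress = max(0.0, min(1.0, progress))
    return (1.0 - progress) * old_value + progress * new_value
```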
- the controller can control the behavior of the computer 1 not only in a deterministic dependence on the kinematic behavior of the cursor 3. Rather, multimedia content can also be selected and presented via a random generator. In the case of an electronic lexicon, for example, this gives the possibility of "browsing". Through a combination of deterministic and random selection of content, certain associations of the user can be taken into account.
- random control can, for example, create images and atmospheres in an artistic multimedia program that are not repeatable and that challenge the creativity of a user.
- randomly controlled images and texts can be used in games, which always unfold new aspects.
- a sensor area 2 can e.g. also be a menu item of an operating system of the computer 1. If the cursor 3 remains on this sensitive menu item for a longer period of time, this is interpreted as increased interest by the control and an auxiliary text for this menu item is displayed. Additional functions can then be addressed via the position of the cursor 3 in the sensor area 2.
- the control of the computer 1 can detect and use the kinematic or dynamic behavior of the cursor 3 in another way.
- the control of the computer 1 not only registers the position 8 of the cursor 3, but also measures the speed, the acceleration and the trajectory of the cursor 3 on the display of the computer 1. Furthermore, the regions which a cursor 3 frames by opening a window are also detected.
- by detecting the trajectory of the cursor 3, the controller recognizes the order in which the cursor 3 visited certain sensor areas 2. The controller triggers different functions of the computer 1 depending on the sequence traversed.
- the control of the computer 1 can also carry out numerical differentiations at certain points on the trajectory, by means of which the speeds and the accelerations at the points of the trajectory are calculated.
- the kinematic behavior of the cursor 3 is thus completely captured. These measurements of the kinematic behavior of the cursor are also quantified as energy values.
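The numerical differentiation of the trajectory described above can be sketched with simple finite differences. Uniform sampling every `dt` seconds is an assumption made here for brevity; the patent does not fix a sampling scheme.

```python
def kinematics(trajectory, dt):
    """Numerically differentiate a cursor trajectory.

    trajectory: list of (x, y) cursor positions sampled every dt seconds.
    Returns (velocities, accelerations): first differences give per-step
    velocities, differencing those again gives per-step accelerations.
    """
    vel = [((x1 - x0) / dt, (y1 - y0) / dt)
           for (x0, y0), (x1, y1) in zip(trajectory, trajectory[1:])]
    acc = [((vx1 - vx0) / dt, (vy1 - vy0) / dt)
           for (vx0, vy0), (vx1, vy1) in zip(vel, vel[1:])]
    return vel, acc
```

Together with the dwell time, these values complete the five-value state (position, velocity, dwell time) mentioned for a two-dimensional display.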
- if a cursor 3 moves quickly over a text, the control system evaluates this as a small release of energy, i.e. the interest of the user is rated as low and only text is displayed. If, however, a cursor 3 moves slowly over a text, more energy is consumed. The interest is rated higher, which leads to a different behavior of the computer 1, e.g. the playing of a video.
- the kinematic behavior of the cursor 3 depends crucially on the person of the user of the computer.
- the kinematic behavior of a user is stored in a database.
- the controller can thus adapt functions 10 of the computer 1 to a specific user (e.g. using an expert system or a neural network). It is also possible that it recognizes from the kinematic behavior of the cursor 3 that a particular behavior of a user is not efficient, and it adjusts a function 10 of the computer 1 accordingly or indicates the inefficiency to the user. This can lead to a considerable improvement in learning progress, especially with learning software.
- the cursor 3 of the computer 1 can be controlled by the control according to the invention via the eye movements of a user.
- the eye movements can be recorded e.g. via video monitoring of the pupils.
- By controlling the kinematic behavior of the cursor 3 through the eye movements and the sensor areas 2, in particular people who cannot fully use their hands (e.g. physically disabled) can operate computers in an efficient and flexible manner.
- FIG. 2 shows a circular sensor area 2 on the display of a computer 1 with a radius 7. If a cursor 3 is located within the sensor area 2, as shown in FIG. 2, the kinematic behavior of the cursor 3 and its dwell time in the sensor area 2 are detected by the control of the computer 1 according to the invention.
- the position 8 of the cursor 3 is represented in a polar coordinate system with the center as the reference point 6 of the sensor area 2.
- the position 8 of the cursor 3 is determined from the distance of the cursor 3 from the reference point 6 and an angle (not shown here) to a reference line.
- a corner of the sensor area 2 or the center of gravity of the sensor area 2 serves as a reference point 6.
- the position 8 of the cursor 3 can also be represented in an absolute coordinate system of the display of the computer 1, i.e. the coordinates are counted from a corner of the display.
- the control according to the invention additionally evaluates the angular coordinate and the dwell time at different points in the sensor area 2 and determines at least one function 10 of the computer 1 therefrom.
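For the circular sensor area of FIG. 2, the polar representation of position 8 relative to reference point 6 is straightforward. Taking the positive x-axis as the reference line is an assumption here, since the reference line is not specified in the text.

```python
import math

def polar_position(cursor, reference):
    """Position 8 of the cursor relative to the reference point 6 of a
    sensor area, returned as (distance, angle). The angle is measured
    against the positive x-axis, an assumed reference line."""
    dx = cursor[0] - reference[0]
    dy = cursor[1] - reference[1]
    return math.hypot(dx, dy), math.atan2(dy, dx)

def inside_circular_area(cursor, center, radius):
    """The cursor lies in the circular sensor area when its distance
    from the center (reference point 6) does not exceed the radius 7."""
    distance, _ = polar_position(cursor, center)
    return distance <= radius
```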
- the relationship between the kinematic behavior of the cursor 3 and a function 10 of the computer 1 is shown in FIGS. 3 and 4.
- the functional relationship 9 is part of the control according to the invention.
- the functional relationships 9 between a function 10 of the computer 1 and the position 8 of a cursor 3 can be both linear and non-linear.
- the controller typically uses the following input variables: key presses, mouse movements, trackball movements, data glove actions, sensor information, camera information.
- the input variables are linked by the controller via interaction graphs to the functions 10 of the computer 1.
- the output variables are typically: visual 2D and 3D representations, video information, slide projections, sound, and tactile information via active sensors in data gloves.
- function 10 ' is the opacity of an image in a multimedia application.
- the start time 11 is defined by a specific action (e.g. pressing a key, exceeding a specific dwell time of the cursor 3 in a sensor area 2). From this point in time, the opacity of an image is determined by the functional relationship 9', i.e. the opacity increases and decreases again after a while. If the cursor 3 is removed from the sensor area 2 at any point in time 13, the opacity 10' of the image remains at the value assigned at this point in time 13.
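The temporal interaction graph of FIG. 4 can be sketched as a piecewise function of time together with the freezing behavior just described. The concrete rise-and-fall shape over four seconds is invented for illustration; only the qualitative behavior (opacity increases, decreases again, and is frozen when the cursor leaves the sensor area) comes from the text.

```python
def opacity_graph(t):
    """Hypothetical interaction graph 9': opacity rises for 2 s after the
    start time and falls again over the next 2 s (values in 0..1).
    The shape is an illustrative assumption."""
    if t < 0:
        return 0.0
    if t <= 2.0:
        return t / 2.0          # fade in
    if t <= 4.0:
        return (4.0 - t) / 2.0  # fade out
    return 0.0

class OpacityFunction:
    """Evaluates the graph from a start time 11; if the cursor leaves the
    sensor area at a time 13, the opacity reached at that moment is kept."""

    def __init__(self, start_time):
        self.start = start_time
        self.frozen = None      # frozen opacity value, None while active

    def value(self, now):
        if self.frozen is not None:
            return self.frozen
        return opacity_graph(now - self.start)

    def cursor_left(self, now):
        # the assigned opacity at time 13 remains
        self.frozen = self.value(now)
```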
- Both the spatial (see FIG. 3) and the temporal evaluation of interaction graphs (see FIG. 4) can be used in combination.
- Several functions 10 can be influenced as a function of them or independently of one another.
- FIG. 5 shows an example of how the control of the computer 1 according to the invention influences functions 10'', 10''' of a multimedia system via interaction graphs 9'', 9'''.
- a database is essential for the function of the control according to the invention, in which all signals measured by the control and output by the control are stored.
- the database contains objects 14, such as images, texts, music, sounds, videos, programs and control commands for external devices, which are made accessible to the user of the computer 1.
- information about sensor areas 2 is also treated as objects 14.
- Media are stored in the database as objects 14 of various types.
- the objects 14 are combined in a container 15 in terms of program technology, the objects 14 stored in a container 15 belonging together thematically (e.g. images, texts and music on one subject).
- a container 15 is also an object 14 from a program point of view.
- An object 14 can be a member of different containers 15.
- the selection of an object 14 or a specific number of objects 14 takes place as a function of the positions 8'', 8''' of the cursor 3 via the interaction graphs 9'', 9'''.
- a measure is determined from the positions 8'', 8''' and/or another kinematic parameter of the cursor 3 via the interaction graphs 9'', 9''' that are valid at the respective points and/or at the respective time.
- the control according to the invention determines which object 14 or which group of objects 14 from the suitable container 15 is displayed or played.
- Each object 14 has attributes 16 that describe the properties of the object 14. Using these attributes 16, the controller determines which objects 14 are displayed, among other things.
- the image of a Greek temple is stored, for example, which has the attributes 16 "building", "Greece", "religion" and "antiquity".
- the control system displays the image of the temple in different contexts. If the control has determined, for example, that a user requests information about Greece, it decides, depending on the kinematic behavior of the cursor 3 in sensor areas 2, whether the image of the temple is displayed in addition to travel information about Greece. If a user informs himself about the ancient world on the computer 1, the image of the temple can again be displayed depending on the kinematic behavior of the cursor 3.
- the attributes 16 thus establish cross-connections between different objects 14 stored in a database. Since all information is stored in the database as objects 14 in terms of program technology, diverse interactions between the information and the kinematic behavior of the cursor 3 can be established.
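The object-attribute model and its cross-connections can be sketched as follows, reusing the temple example from the text; the class and field names are assumptions.

```python
class MediaObject:
    """Database object 14 with attributes 16 describing its properties.
    Field names are illustrative assumptions."""

    def __init__(self, name, kind, attributes):
        self.name = name
        self.kind = kind                 # type of object, e.g. "image", "text"
        self.attributes = set(attributes)

def related(objects, attribute):
    """Cross-connections between objects: all objects that share a given
    attribute 16, in stable order."""
    return [o for o in objects if attribute in o.attributes]

# the temple image from the example, plus two assumed companion objects
temple = MediaObject("temple", "image",
                     {"building", "Greece", "religion", "antiquity"})
travel = MediaObject("greece-travel", "text", {"Greece", "travel"})
cathedral = MediaObject("cathedral", "image", {"building", "religion"})
```

Querying by attribute links the temple image to both the travel context ("Greece") and the religious-architecture context ("religion"), without any rigid hierarchy.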
- the control according to the invention does not specify a rigid information hierarchy in which, for example, only the sub-terms "travel information" and "pictures" can be called up under the generic term Greece. Rather, the range of information on the display is determined dynamically by the control as a function of the kinematic behavior of the cursor 3. Simply by letting the cursor 3 linger at a certain point in a sensor area 2, the focus (see FIG. 6), different information can be displayed or played back gradually; the controller interprets the stay in the sensor area 2 as increased interest and controls the display of the computer 1 on the basis of the respective energy values.
- each object 14 has a modifier 17 which assigns a measure (for example in the range 1 to 100) to the object 14.
- the modifier 17 can be used, for example, to determine the transparency with which an image is displayed. With a modifier 17 of value 100, the control system displays the image with full opacity; the background of the display is completely covered. With a value of 10, the image appears only faintly on the screen, so that elements behind the image shine through.
- a modifier 17 can also be used, for example, to influence the volume of a noise, the frequency with which images are displayed or music is played, the selection of an image from a container or the sensitivity of the energy output or the energy consumption.
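A modifier 17 as described, a measure in the range 1 to 100 controlling e.g. display opacity, reduces to a simple mapping. The linear scaling is an assumption; the text gives only the two anchor values.

```python
def opacity_from_modifier(modifier):
    """Modifier 17 in the range 1..100 mapped to display opacity (0..1):
    100 gives full opacity (background completely covered), 10 a faint
    image through which elements behind it shine. Linear scaling is an
    illustrative assumption; out-of-range values are clamped."""
    return max(1, min(100, modifier)) / 100.0
```

The same mapping could drive volume, display frequency, or the sensitivity of energy consumption, as listed above.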
- Both the attributes 16 and the modifiers 17 can be changed in a predeterminable manner by the control. Likewise, it is possible for attributes 16 or modifiers 17 to be changed by the kinematic behavior of the cursor 3 and thus to be influenced directly by the behavior of the user.
- an example is an audio system that controls the playback of pieces of music depending on the movement of a cursor 3. If the cursor 3 interacts with different sensor areas 2 one after the other, a sensor area 2 will not necessarily play the same pieces of music when it is visited again as it did the first time. Rather, it is possible to play thematically related pieces of music. Under certain circumstances, the interaction between the cursor 3 and the controller has signaled that the interest of a user has changed. After evaluating the information about the energy, the attributes 16 and the modifiers 17, the container content is therefore recompiled and the pieces of music then contained are played.
- the control coordinates the database and the evaluation of the kinematic behavior of the cursor 3 so that new information is always displayed. This creates a knowledge browser with completely new properties, namely the creation and viewing of data spaces and the possibility of interacting with them via a cursor.
- in FIG. 6, the influence of a function 10 of a computer 1 by the dwell time of a cursor (not shown here) at position 8 is shown schematically.
- position 8 lies in a sensor area 2 which is assigned to the image of a church.
- different views of the church are shown after a certain time, i.e. information that is directly related to the selected object 14.
- pictures of churches are shown that can be assigned to the same style.
- Church music from the corresponding epoch is played even later.
- the control according to the invention offers the user the possibility at any time of influencing the information provided by movements of the cursor 3 (i.e. by means of dwell time and position within the sensor area).
- the direction of the time axis 18 and the orientation of a so-called focal funnel thus indicate the “direction of interest”, that is, the focus of the user of the computer 1.
- the increasing interest is therefore shown in FIG. 6 by an expanding focal funnel 19; more and more objects 14 are being detected.
- a shift of the position 8 into another sensor area 2 therefore corresponds to a changed orientation of the focal funnel 19.
- the embodiment of the invention is not limited to the preferred exemplary embodiments specified above. Rather, a number of variants are conceivable which make use of the computer control according to the invention even in fundamentally different designs.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CA002274786A CA2274786A1 (en) | 1996-12-13 | 1997-12-15 | Computer control system |
JP10526102A JP2000512415A (ja) | 1996-12-13 | 1997-12-15 | コンピュータ制御システム |
EP97953649A EP1015960A1 (de) | 1996-12-13 | 1997-12-15 | Computersteuerung |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE1996153682 DE19653682C2 (de) | 1996-12-13 | 1996-12-13 | Steuervorrichtung und -verfahren für mindestens eine Einrichtung eines Raumes, und Raum mit Steuervorrichtung |
DE19653682.0 | 1996-12-13 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO1998026346A1 true WO1998026346A1 (de) | 1998-06-18 |
Family
ID=7815787
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/DE1997/002970 WO1998026346A1 (de) | 1996-12-13 | 1997-12-15 | Computersteuerung |
PCT/DE1997/002969 WO1998026345A1 (de) | 1996-12-13 | 1997-12-15 | Raumsteuerung |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/DE1997/002969 WO1998026345A1 (de) | 1996-12-13 | 1997-12-15 | Raumsteuerung |
Country Status (5)
Country | Link |
---|---|
EP (2) | EP1015960A1 (de) |
JP (2) | JP2000512415A (de) |
CA (2) | CA2274786A1 (de) |
DE (1) | DE19654944A1 (de) |
WO (2) | WO1998026346A1 (de) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7310609B2 (en) * | 1997-09-11 | 2007-12-18 | Unicast Communications Corporation | Tracking user micro-interactions with web page advertising |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE10125309C1 (de) * | 2001-05-21 | 2002-12-12 | Humatic Gmbh | Verfahren und Anordnung zum Steuern von audiovisuellen medialen Inhalten |
KR100575906B1 (ko) | 2002-10-25 | 2006-05-02 | 미츠비시 후소 트럭 앤드 버스 코포레이션 | 핸드 패턴 스위치 장치 |
JP2005242694A (ja) | 2004-02-26 | 2005-09-08 | Mitsubishi Fuso Truck & Bus Corp | ハンドパターンスイッチ装置 |
DE102007057799A1 (de) * | 2007-11-30 | 2009-06-10 | Tvinfo Internet Gmbh | Grafische Benutzerschnittstelle |
DE102011102038A1 (de) * | 2011-05-19 | 2012-11-22 | Rwe Effizienz Gmbh | Heimautomatisierungssteuerungssystem sowie Verfahren zum Steuern einer Einrichtung eines Heimautomatisierungssteuerungssystems |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4109145A (en) * | 1974-05-20 | 1978-08-22 | Honeywell Inc. | Apparatus being controlled by movement of the eye |
EP0342838A2 (de) * | 1988-05-20 | 1989-11-23 | International Business Machines Corporation | Benutzerschnittstelle zur Dateneingabe |
EP0403116A2 (de) * | 1989-06-16 | 1990-12-19 | International Business Machines Corporation | Handhabung der Triggerbereiche auf einem Anzeigegerät |
US5196838A (en) * | 1990-12-28 | 1993-03-23 | Apple Computer, Inc. | Intelligent scrolling |
WO1995014964A1 (en) * | 1993-11-23 | 1995-06-01 | Roca Productions, Inc. | Electronically simulated rotary-type cardfile |
EP0660218A1 (de) * | 1993-12-21 | 1995-06-28 | Xerox Corporation | Graphische Tastatur |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2183889B (en) * | 1985-10-07 | 1989-09-13 | Hagai Sigalov | Optical control means |
US5107746A (en) * | 1990-02-26 | 1992-04-28 | Will Bauer | Synthesizer for sounds in response to three dimensional displacement of a body |
JP3138058B2 (ja) * | 1992-05-25 | 2001-02-26 | 東芝キヤリア株式会社 | 換気扇の制御装置 |
US5534917A (en) * | 1991-05-09 | 1996-07-09 | Very Vivid, Inc. | Video image based control system |
JPH05108258A (ja) * | 1991-10-14 | 1993-04-30 | Nintendo Co Ltd | 座標データ発生装置 |
US5326028A (en) * | 1992-08-24 | 1994-07-05 | Sanyo Electric Co., Ltd. | System for detecting indoor conditions and air conditioner incorporating same |
US5448693A (en) * | 1992-12-29 | 1995-09-05 | International Business Machines Corporation | Method and system for visually displaying information on user interaction with an object within a data processing system |
DE4406668C2 (de) * | 1993-04-27 | 1996-09-12 | Hewlett Packard Co | Method and device for operating a touch-sensitive display device |
US5704836A (en) * | 1995-03-23 | 1998-01-06 | Perception Systems, Inc. | Motion-based command generation technology |
1996
- 1996-12-13 DE DE19654944A patent/DE19654944A1/de not_active Withdrawn
1997
- 1997-12-15 EP EP97953649A patent/EP1015960A1/de not_active Withdrawn
- 1997-12-15 WO PCT/DE1997/002970 patent/WO1998026346A1/de not_active Application Discontinuation
- 1997-12-15 CA CA002274786A patent/CA2274786A1/en not_active Abandoned
- 1997-12-15 EP EP97953648A patent/EP1015959A1/de not_active Withdrawn
- 1997-12-15 JP JP10526102A patent/JP2000512415A/ja active Pending
- 1997-12-15 JP JP10526101A patent/JP2000512467A/ja active Pending
- 1997-12-15 CA CA002274702A patent/CA2274702A1/en not_active Abandoned
- 1997-12-15 WO PCT/DE1997/002969 patent/WO1998026345A1/de not_active Application Discontinuation
Non-Patent Citations (4)
Title |
---|
"AUTOMATIC SELECTION WITH MOUSE POINTER", IBM TECHNICAL DISCLOSURE BULLETIN, vol. 38, no. 10, 1 October 1995 (1995-10-01), pages 549/550, XP000540586 * |
"SMART BUTTON", IBM TECHNICAL DISCLOSURE BULLETIN, vol. 36, no. 11, 1 November 1993 (1993-11-01), pages 57, XP000424775 * |
POON A ET AL: "GESTURAL USER INTERFACE TECHNIQUE FOR CONTROLLING THE PLAYBACK OF SEQUENTIAL MEDIA", XEROX DISCLOSURE JOURNAL, vol. 19, no. 2, 1 March 1994 (1994-03-01), pages 187 - 189, XP000435062 * |
ROBERTSON G G ET AL: "BUTTONS AS FIRST CLASS OBJECTS ON AN X DESKTOP", PROCEEDINGS OF THE SYMPOSIUM ON USER INTERFACE SOFTWARE AND TECHNOL ( UIST ), HILTON HEAD, S. CAROLINA, NOV. 11 - 13, 1991, no. SYMP. 4, 11 November 1991 (1991-11-11), ASSOCIATION FOR COMPUTING MACHINERY, pages 35 - 44, XP000315064 * |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7310609B2 (en) * | 1997-09-11 | 2007-12-18 | Unicast Communications Corporation | Tracking user micro-interactions with web page advertising |
Also Published As
Publication number | Publication date |
---|---|
DE19654944A1 (de) | 1998-06-25 |
WO1998026345A1 (de) | 1998-06-18 |
EP1015960A1 (de) | 2000-07-05 |
CA2274786A1 (en) | 1998-06-18 |
JP2000512467A (ja) | 2000-09-19 |
EP1015959A1 (de) | 2000-07-05 |
CA2274702A1 (en) | 1998-06-18 |
JP2000512415A (ja) | 2000-09-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
DE102018100809A1 (de) | Method, device and terminal for displaying a virtual keyboard | |
DE69233211T2 (de) | Device for manipulating an object displayed on a screen | |
DE60024655T2 (de) | Method of using keys associated with a display device for accessing and executing associated functions | |
DE112014000441T5 (de) | Dynamic user interactions for display control and customized gesture interpretation | |
DE60028894T2 (de) | Presentation system with an interactive display | |
DE60036261T2 (de) | Graphical control of a time-based setting characteristic in a video game | |
DE102009014555A1 (de) | Method for assisting control of the movement of a position indicator by means of a touchpad | |
WO2010046147A1 (de) | Method and device for displaying information ordered in lists | |
DE112006002954T5 (de) | Virtual interface system | |
DE202008005344U1 (de) | Electronic device with switchable user interface and electronic device with accessible touch operation | |
DE202014011483U1 (de) | Terminal | |
DE112016005818T5 (de) | Repositioning of augmented reality objects in physical and digital environments | |
DE202008005343U1 (de) | Electronic device with barrier-free operation | |
EP3040817A1 (de) | Device and method for entering text via virtual control elements with haptic feedback for simulating key feel, in particular in a motor vehicle | |
DE102015105022B4 (de) | Data processing method and electronic device | |
WO1998026346A1 (de) | Computer control | |
DE102012219129B4 (de) | Method for operating a device having a user interface with a touch sensor, and corresponding device | |
DE19653682A1 (de) | System control | |
DE10210799B4 (de) | Adaptation of a human-machine interface as a function of a user's psychological profile and current state | |
DE102014019648A1 (de) | Data processing method and electronic device | |
DE102009031158A1 (de) | Device and method for detecting a user's pointing gesture for interaction with an input surface | |
DE102019113133A1 (de) | Systems and methods for influencing spotlight effects | |
DE102017112040A1 (de) | Modifying a highlighting function on a display based on content | |
AT520234B1 (de) | Device for interactive presentation of visual content | |
DE102004027289A1 (de) | Method and arrangement for touchless navigation in a document |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): CA JP US |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): AT BE CH DE DK ES FI FR GB GR IE IT LU MC NL PT SE |
|
DFPE | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101) | ||
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
ENP | Entry into the national phase |
Ref document number: 2274786 Country of ref document: CA Ref country code: CA Ref document number: 2274786 Kind code of ref document: A Format of ref document f/p: F |
|
WWE | Wipo information: entry into national phase |
Ref document number: 09319819 Country of ref document: US |
|
ENP | Entry into the national phase |
Ref country code: JP Ref document number: 1998 526102 Kind code of ref document: A Format of ref document f/p: F |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1997953649 Country of ref document: EP |
|
WWP | Wipo information: published in national office |
Ref document number: 1997953649 Country of ref document: EP |
|
WWW | Wipo information: withdrawn in national office |
Ref document number: 1997953649 Country of ref document: EP |