Studying user-defined iPad gestures for interaction in multi-display environment.
Ekaterina Kurdyukova, Matthias Redlin, Elisabeth André
Published in: IUI (2012)
Keyphrases
- user defined
- data types
- real time
- human computer interaction
- agent environment
- physical space
- multi touch
- physical world
- human robot interaction
- input device
- dynamic environments
- query language
- mobile robot
- query processor
- gaze control
- hand gestures
- databases
- face to face interactions
- similarity queries
- gesture recognition
- metric space
- user interaction
- database management systems
- mobile devices
- data structure
- metadata