Touch-less interaction with medical images using hand & foot gestures.
Shahram Jalaliniya, Jeremiah Smith, Miguel Sousa, Lars Büthe, Thomas Pederson. Published in: UbiComp (Adjunct Publication) (2013)
Keyphrases
- medical images
- input device
- hand gestures
- human computer interaction
- human robot interaction
- anatomical structures
- medical imaging
- medical image analysis
- medical data
- deformable models
- mr images
- gesture recognition
- magnetic resonance imaging
- image intensity
- magnetic resonance
- computer aided diagnosis
- ct images
- hand movements
- imaging modalities
- medical diagnosis
- brain images
- augmented reality
- segmentation of medical images
- hidden markov models
- pointing gestures
- magnetic resonance images
- hand motion
- region of interest
- computed tomography
- face to face interactions
- hand postures
- medical image segmentation
- sign language
- computer tomography
- computer aided
- biomedical images
- clinical applications
- clinical practice
- x ray images
- three dimensional
- medical image retrieval
- fully automatic
- soft tissue
- multimodal medical images