Analyzing Deaf and Hard-of-Hearing Users' Behavior, Usage, and Interaction with a Personal Assistant Device that Understands Sign-Language Input.
Abraham Glasser, Matthew Watkins, Kira Hart, Sooyeon Lee, Matt Huenerfauth. Published in: CHI (2022)
Keyphrases
- sign language
- input device
- hand gestures
- gesture recognition
- sign language recognition
- user interaction
- human computer interaction
- sign recognition
- hand tracking
- american sign language
- hand gesture recognition
- information seeking
- usage patterns
- user interface
- end users
- content adaptation
- recommender systems
- video camera
- behavioral model