Development of benchmark datasets of multioriented hand gestures for speech and hearing disabled.
Soumi Paul, Hayat Nasser, Ayatullah Faruk Mollah, Arpan Bhattacharyya, Phuc Ngo, Mita Nasipuri, Isabelle Debled-Rennesson, Subhadip Basu
Published in: Multim. Tools Appl. (2022)
Keyphrases
- benchmark datasets
- hand gestures
- sign language
- hand movements
- human computer interaction
- gesture recognition
- hand gesture recognition
- hearing impaired
- software engineering
- ensemble methods
- uci machine learning repository
- hand tracking
- uci repository
- video camera
- motion patterns
- american sign language
- automatic speech recognition
- spatio-temporal
- speech recognition
- user interface