Egocentric Human Trajectory Forecasting With a Wearable Camera and Multi-Modal Fusion
Jianing Qiu, Lipeng Chen, Xiao Gu, Frank P.-W. Lo, Ya-Yen Tsai, Jiankai Sun, Jiaqi Liu, Benny Lo. Published in: IEEE Robotics Autom. Lett. (2022)
Keyphrases
- multi-modal fusion
- activity recognition
- video camera
- wearable devices
- short-term
- structure from motion
- hand-held
- real time
- gesture recognition
- field of view
- visual saliency
- multi camera
- multiple cameras
- augmented reality
- human-computer interaction
- vision system
- human activities
- camera calibration
- ambient intelligence
- trajectory data
- gaze tracking
- eye contact
- high quality