Egocentric Human Trajectory Forecasting with a Wearable Camera and Multi-Modal Fusion
Jianing Qiu, Lipeng Chen, Xiao Gu, Frank Po Wen Lo, Ya-Yen Tsai, Jiankai Sun, Jiaqi Liu, Benny Lo. Published in: CoRR (2021)
Keyphrases
- multi-modal fusion
- video camera
- activity recognition
- field of view
- hand-held
- datasets
- camera calibration
- wearable devices
- camera motion
- vision system
- human activities
- real time
- surveillance system
- gaze tracking
- multiple cameras
- stereo camera
- human walking
- wearable sensors
- inertial sensors
- real environment
- computer vision
- trajectory data
- gesture recognition
- sensor networks
- video sequences
- structure from motion
- ambient intelligence
- detection algorithm
- input image