Detecting start and end times of object-handlings on a table by fusion of camera and load sensors.
Ryuta Yasuoka, Atsushi Hashimoto, Takuya Funatomi, Michihiko Minoh. Published in: CEA@ACM Multimedia (2013)
Keyphrases
- data fusion
- multi sensor
- sensor fusion
- camera network
- real time
- pose determination
- camera views
- inertial sensors
- position and orientation
- video camera
- camera images
- database
- multiple cameras
- range sensors
- moving platform
- object motion
- hand held
- ground plane
- field of view
- time of flight
- moving objects
- infrared video
- 3d objects
- vision system
- object tracking
- camera motion
- multi camera
- computer vision
- inertial measurement unit
- acquired images
- pose estimation
- sensor data
- uncalibrated images
- structure from motion
- surveillance system
- stereo camera
- range images
- image sensor
- partial occlusion
- multi view
- optical axis
- imaging sensors
- fusion method
- multiple objects
- focal length