LiDAR-Camera Fusion for Depth Enhanced Unsupervised Odometry
Naida Fetic, Eren Aydemir, Mustafa Unel. Published in: VTC Spring (2022)
Keyphrases
- inertial sensors
- depth cameras
- sensor fusion
- view interpolation
- position estimation
- depth estimation
- defocused images
- hand held
- field of view
- scene geometry
- time of flight
- point cloud
- depth information
- camera calibration
- single shot
- infrared video
- real scenes
- vision system
- structure from motion
- scene depth
- depth map
- semi supervised
- real time
- information fusion
- visual odometry
- camera motion
- ego motion
- motion parallax
- depth images
- multiple cameras
- video camera
- surveillance system
- camera parameters
- multi camera
- multi sensor
- infrared
- defocus blur
- 3d scene
- three dimensional
- autonomous navigation
- supervised learning
- high resolution
- depth data
- calibration method
- camera views
- fusion method
- stereo camera
- focal length
- data fusion
- laser scanner
- unsupervised learning
- camera pose
- light field
- point correspondences
- image fusion
- stereo vision