Publication:
Automatic laser and camera extrinsic calibration for data fusion using road plane

Identifiers
ISBN: 978-84-9012-355-3
Publication date
2014-10-07
Publisher
IEEE
Abstract
Driving Assistance Systems and Autonomous Driving applications require reliable detections, and meeting these demanding requirements calls for sensor fusion. Data fusion, however, raises the problem of data alignment in both rotation and translation. Laser scanners and video cameras are widely used in sensor fusion: the laser scanner works in darkness and provides long-range detection and accurate measurements, but the limited information it delivers makes reliable classification difficult; the camera enables classification thanks to the richness of its data, but lacks measurement accuracy and is sensitive to illumination conditions. Data alignment processes typically require supervised, accurate measurements performed by experts, or demand specific calibration patterns or shapes. This paper presents an algorithm for inter-calibration of the two sensors in our system, requiring only a flat surface for pitch and roll calibration and an obstacle visible to both sensors for determining the yaw. The advantage of this approach is that no particular shape needs to be placed in front of the vehicle apart from a flat surface, which is usually the road itself. This way, calibration can be performed at virtually any time without human intervention.
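The pitch/roll step described above can be sketched as a RANSAC plane fit over the LIDAR point cloud, followed by extracting the two angles from the estimated road-plane normal. The following is a minimal illustrative sketch, not the paper's implementation: the function names, the inlier threshold, and the axis convention (x forward, y left, z up) are assumptions.

```python
import numpy as np

def fit_road_plane_ransac(points, n_iters=200, inlier_thresh=0.05, rng=None):
    """Fit a plane n.p + d = 0 to 3D points with RANSAC.

    points: (N, 3) array of LIDAR returns. Returns (n, d) with |n| = 1.
    The threshold (in metres) and iteration count are illustrative choices.
    """
    rng = np.random.default_rng(rng)
    best_inliers, best_model = 0, None
    for _ in range(n_iters):
        # Minimal sample: three random points define a candidate plane
        sample = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(n)
        if norm < 1e-9:
            continue  # degenerate (collinear) sample
        n /= norm
        d = -n @ sample[0]
        # Count points within the distance threshold of the plane
        inliers = np.count_nonzero(np.abs(points @ n + d) < inlier_thresh)
        if inliers > best_inliers:
            best_inliers, best_model = inliers, (n, d)
    return best_model

def pitch_roll_from_normal(n):
    """Pitch and roll (radians) of the sensor relative to the road plane,
    assuming a frame with x forward, y left, z up."""
    if n[2] < 0:
        n = -n  # make the normal point upwards
    pitch = np.arctan2(n[0], n[2])  # tilt about the y axis
    roll = np.arctan2(n[1], n[2])   # tilt about the x axis
    return pitch, roll

# Demo with synthetic data: a flat road (z = 0) plus random clutter points
rng = np.random.default_rng(0)
road = np.column_stack([rng.uniform(-5, 5, (500, 2)), np.zeros(500)])
clutter = rng.uniform(-5, 5, (50, 3))
n, d = fit_road_plane_ransac(np.vstack([road, clutter]), rng=1)
pitch, roll = pitch_roll_from_normal(n)
```

Because the road dominates the point cloud, the plane with the most inliers is the road plane even in the presence of obstacles, which is what allows the calibration to run unsupervised.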
Keywords
Calibration, Data Alignment, LIDAR, Extrinsic, RANSAC, Fusion
Bibliographic citation
17th International Conference on Information Fusion (FUSION), IEEE, 2014, pp. 1-6.