A novel method is presented here for the calibration of a sensor fusion system for intelligent vehicles. In this example, the sensors are a camera and a laser scanner that observe the same scene from different viewpoints. The method employs the Nelder-Mead direct search algorithm to minimize the sum of squared errors between the image coordinates and the re-projected laser data by iteratively adjusting the calibration parameters. The method is applied to a real data set collected from a test vehicle. Using only 11 well-spaced target points observable by both sensors, 12 intrinsic and extrinsic parameters, which together define the geometric relationship between the sensors, can be estimated to give an accurate projection. Experiments show that the method can project the laser points onto the image plane with an average error of 1.01 pixels (1.51 pixels worst case).
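The abstract does not specify the camera model or the exact parameter split, so the following is only a minimal sketch of the general approach: a hypothetical pinhole camera with two radial distortion terms (4 intrinsic + 2 distortion + 6 extrinsic = 12 parameters), calibrated by running SciPy's Nelder-Mead solver on the sum of squared reprojection errors over 11 synthetic target points. All numeric values are made up for illustration.

```python
import numpy as np
from scipy.optimize import minimize

def rotation(rx, ry, rz):
    # Rotation matrix from Euler angles (z-y-x composition); a modelling choice,
    # not necessarily the parameterization used in the paper.
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def project(params, pts3d):
    # params: fx, fy, cx, cy, k1, k2, rx, ry, rz, tx, ty, tz  (hypothetical split)
    fx, fy, cx0, cy0, k1, k2, rx, ry, rz, tx, ty, tz = params
    cam = pts3d @ rotation(rx, ry, rz).T + np.array([tx, ty, tz])  # laser -> camera frame
    x, y = cam[:, 0] / cam[:, 2], cam[:, 1] / cam[:, 2]            # perspective division
    r2 = x * x + y * y
    d = 1.0 + k1 * r2 + k2 * r2 * r2                               # radial distortion
    return np.stack([fx * x * d + cx0, fy * y * d + cy0], axis=1)  # pixel coordinates

def reprojection_sse(params, pts3d, uv):
    # Cost minimized by Nelder-Mead: sum of squared pixel errors.
    return np.sum((project(params, pts3d) - uv) ** 2)

# Synthetic stand-in for the real data set: 11 well-spaced 3-D target points
# and their images under an assumed ground-truth parameter vector.
rng = np.random.default_rng(0)
true = np.array([800.0, 800.0, 320.0, 240.0, 0.01, 0.0,
                 0.02, -0.03, 0.01, 0.10, -0.05, 0.30])
pts3d = rng.uniform([-1, -1, 4], [1, 1, 8], (11, 3))
uv = project(true, pts3d)

# Start from a deliberately perturbed guess and refine by direct search.
x0 = true + np.array([5, 5, 3, 3, 0.005, 0.005,
                      0.01, 0.01, 0.01, 0.02, 0.02, 0.02])
res = minimize(reprojection_sse, x0, args=(pts3d, uv), method="Nelder-Mead",
               options={"maxiter": 20000, "xatol": 1e-8, "fatol": 1e-10})

err = np.linalg.norm(project(res.x, pts3d) - uv, axis=1)
print(f"mean reprojection error: {err.mean():.3f} px, worst: {err.max():.3f} px")
```

Because the cost surface is smooth and the optimum of this synthetic problem is exactly zero, the derivative-free simplex search can drive the residual down without any gradient information, which is the appeal of Nelder-Mead for this kind of calibration.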
- Nelder-Mead optimization
- sensor fusion