SAFETYLIT WEEKLY UPDATE


Journal Article

Citation

Yuwen X, Chen L, Yan F, Zhang H, Tang J, Tian B, Ai Y. IEEE Trans. Intell. Transp. Syst. 2022; 23(1): 215-224.

Copyright

(Copyright © 2022, IEEE (Institute of Electrical and Electronics Engineers))

DOI

10.1109/TITS.2020.3009674

PMID

unavailable

Abstract

In unmanned vehicles, LiDAR and GPS/INS are the most popular sensors for perception and positioning. Precise calibration of the extrinsic parameters between the LiDAR and the GPS/INS is necessary for successful sensor fusion. The extrinsic transformation between the LiDAR and the GPS/INS is 6D (x, y, z, yaw, pitch, roll), but the motion of a vehicle is mainly 3D (x, y, yaw). The problem is therefore to compute the 6D extrinsic parameters under the limitation of 3D motion (the plane constraint). Previous solutions have broken the plane constraint by designing specific vehicle motions. This paper proposes a new trajectory-based hand-eye calibration method that makes full use of the large operating range of unmanned vehicles: trajectories with large and small ranges are used to solve the rotation and the translation, respectively. It is proved that the extrinsic parameters can be solved when the trajectory range of the unmanned vehicle is sufficiently large. The proposed method is tested on simulation, custom, and KITTI datasets and compared with state-of-the-art methods. The results demonstrate that its accuracy and efficiency are comparable to those of the state-of-the-art methods.
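The paper's full algorithm is not reproduced in this abstract. As a minimal sketch of the kind of trajectory alignment such methods build on, the classic Kabsch/SVD least-squares fit recovers the rigid rotation and translation between two sets of corresponding trajectory points (e.g. positions reported by LiDAR odometry and by the GPS/INS). All function and variable names below are illustrative, not from the paper, and this is only the point-alignment step, not the authors' hand-eye formulation:

```python
import numpy as np

def align_trajectories(src, dst):
    """Least-squares rigid fit: find R, t with dst ≈ R @ src_i + t.

    src, dst: (N, 3) arrays of corresponding trajectory points.
    Returns a proper rotation matrix R (3x3) and translation t (3,).
    """
    src_mean = src.mean(axis=0)
    dst_mean = dst.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (src - src_mean).T @ (dst - dst_mean)
    U, _, Vt = np.linalg.svd(H)
    # Sign correction keeps R a rotation (det = +1), never a reflection.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_mean - R @ src_mean
    return R, t

# Illustrative usage: a planar trajectory (z = 0, as under the plane
# constraint) transformed by a known yaw rotation and translation.
rng = np.random.default_rng(0)
src = rng.normal(size=(100, 3))
src[:, 2] = 0.0  # planar motion
yaw = 0.3
R_true = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
                   [np.sin(yaw),  np.cos(yaw), 0.0],
                   [0.0,          0.0,         1.0]])
t_true = np.array([1.0, 2.0, 0.0])
dst = src @ R_true.T + t_true
R, t = align_trajectories(src, dst)
```

With a large, well-spread trajectory the rotation is strongly constrained, which mirrors the abstract's observation that a sufficiently large trajectory range makes the extrinsic parameters solvable; out-of-plane components, by contrast, are exactly the part that planar motion leaves weakly observable.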


Language: en

Keywords

Calibration; hand-eye model; intelligent sensors; laser radar; LiDAR calibration; three-dimensional displays; trajectory; unmanned vehicles
