Estimation of the closest in-path vehicle by low-channel lidar and camera sensor fusion for autonomous vehicles

Research output: Contribution to journal › Article › peer-review

9 Scopus citations

Abstract

In autonomous driving, using a variety of sensors to recognize preceding vehicles at middle and long range helps improve driving performance and enables the development of various functions. However, if only a LiDAR or a camera is used in the recognition stage, the limitations of each sensor make it difficult to obtain the necessary data. In this paper, we propose a method for converting vision-tracked data into bird's-eye-view (BEV) coordinates using an equation that projects LiDAR points onto an image, together with a method for fusing LiDAR and vision-tracked data. Detection results for the closest in-path vehicle (CIPV) in various situations show that the proposed method is effective. In addition, when the fusion results were evaluated with the Euro NCAP autonomous emergency braking (AEB) test protocol, the improved perception performance led to better AEB performance than when using LiDAR alone. The performance of the proposed method was verified through real-vehicle tests in various scenarios. These results indicate that the proposed sensor fusion method significantly improves the adaptive cruise control (ACC) function in autonomous maneuvering, and we expect this improvement in perception performance to contribute to the overall stability of ACC.
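The abstract refers to an equation that projects LiDAR points onto an image. The paper's actual calibration parameters are not given here, but the standard pinhole projection it alludes to can be sketched as follows; the intrinsic matrix `K`, rotation `R`, and translation `t` below are hypothetical placeholders, assuming a calibrated LiDAR-to-camera transform.

```python
import numpy as np

# Hypothetical calibration values for illustration only; a real system
# would use the intrinsics/extrinsics obtained from sensor calibration.
K = np.array([[700.0,   0.0, 640.0],
              [  0.0, 700.0, 360.0],
              [  0.0,   0.0,   1.0]])   # camera intrinsic matrix
R = np.eye(3)                           # rotation: LiDAR frame -> camera frame
t = np.zeros(3)                         # translation: LiDAR frame -> camera frame

def project_lidar_to_image(points_lidar):
    """Project Nx3 LiDAR points into pixel coordinates with the standard
    pinhole model: p ~ K (R X + t), followed by the perspective divide."""
    pts_cam = points_lidar @ R.T + t        # transform points into camera frame
    pts_cam = pts_cam[pts_cam[:, 2] > 0]    # keep only points in front of camera
    proj = pts_cam @ K.T                    # apply intrinsics
    uv = proj[:, :2] / proj[:, 2:3]         # divide by depth to get pixels
    return uv, pts_cam[:, 2]                # pixel coordinates and depths

# Example: a point 10 m ahead on the optical axis projects to the
# principal point (640, 360) at depth 10.
uv, depth = project_lidar_to_image(np.array([[0.0, 0.0, 10.0]]))
```

The inverse of this mapping (pixel plus depth back to metric coordinates) is what allows vision-tracked boxes to be placed in BEV coordinates and associated with LiDAR clusters.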

Original language: English
Article number: 3124
Journal: Sensors
Volume: 21
Issue number: 9
DOIs
State: Published - 1 May 2021

Bibliographical note

Publisher Copyright:
© 2021 by the authors. Licensee MDPI, Basel, Switzerland.

Keywords

  • Alignment of point clouds to images
  • Autonomous emergency braking (AEB) test
  • Bird’s eye-view (BEV)
  • Closest in-path vehicle (CIPV)
  • Sensor fusion

