Multi-level fusion scheme for target classification using camera and radar sensors

Eugin Hyun, Young Seok Jin

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


Abstract

We propose a multi-level fusion scheme for target detection using camera and radar sensors. For the proposed scheme, the radar sensor provides the target track information of the range, velocity, angle, and track ID. This data is applied during the vision processing step as the ROI (region of interest) where the target may exist. Next, for feature-level fusion, the Doppler spectrum of the ROI is provided to the sensor-fusion-based target classifier. In the classifier, we then determine the class of the target using an image database and a Doppler pattern database. In the experimental results, we verify the proposed processing scheme using a 24 GHz FMCW transceiver with a single antenna.
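The two-level flow described above — radar tracks seeding a vision ROI, then feature-level fusion of image and Doppler evidence in the classifier — can be sketched as follows. This is an illustrative reconstruction, not the paper's implementation: the function names, the pinhole ROI projection, and the weighted-score fusion rule are all assumptions for demonstration.

```python
import numpy as np

# Hypothetical sketch of the paper's two-level fusion flow (all names and
# parameters are illustrative, not taken from the paper). A radar track's
# range/angle is projected into the image plane to form an ROI, and scores
# from the image database and Doppler-pattern database are then fused.

def radar_track_to_roi(range_m, angle_deg, img_w=640, img_h=480,
                       hfov_deg=60.0):
    """Map a radar track (range, azimuth angle) to a pixel ROI.

    Assumes a simple pinhole-like camera model with a known horizontal
    field of view; the ROI shrinks with range since nearer targets
    occupy more pixels.
    """
    cx = img_w / 2 * (1 + angle_deg / (hfov_deg / 2))  # azimuth -> column
    size = int(max(20, 4000 / range_m))                # range -> ROI size
    x0 = int(np.clip(cx - size / 2, 0, img_w - size))
    y0 = int(np.clip(img_h / 2 - size / 2, 0, img_h - size))
    return x0, y0, size, size

def classify(image_score, doppler_score, w=0.5):
    """Feature-level fusion: weighted combination of the two matchers."""
    fused = w * image_score + (1 - w) * doppler_score
    return "pedestrian" if fused > 0.5 else "vehicle"

# Example: a track at 20 m and +10 degrees azimuth.
roi = radar_track_to_roi(range_m=20.0, angle_deg=10.0)
label = classify(image_score=0.8, doppler_score=0.7)
```

In this sketch the ROI step stands in for level-one (track-level) fusion and the score combination for level-two (feature-level) fusion; the actual classifier in the paper matches against an image database and a Doppler pattern database.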

Original language: English
Title of host publication: Proceedings of the 2017 International Conference on Image Processing, Computer Vision, and Pattern Recognition, IPCV 2017
Editors: Hamid R. Arabnia, Leonidas Deligiannidis, Fernando G. Tinetti
Publisher: CSREA Press
Pages: 111-114
Number of pages: 4
ISBN (Electronic): 1601324642, 9781601324641
State: Published - 2017
Event: 2017 International Conference on Image Processing, Computer Vision, and Pattern Recognition, IPCV 2017 - Las Vegas, United States
Duration: 17 Jul 2017 - 20 Jul 2017

Publication series

Name: Proceedings of the 2017 International Conference on Image Processing, Computer Vision, and Pattern Recognition, IPCV 2017

Conference

Conference: 2017 International Conference on Image Processing, Computer Vision, and Pattern Recognition, IPCV 2017
Country/Territory: United States
City: Las Vegas
Period: 17/07/17 - 20/07/17

Bibliographical note

Publisher Copyright: CSREA Press ©

Keywords

  • Automotive radar
  • Radar sensor
  • Sensor fusion

