ITSC 2024 Paper Abstract

Paper FrBT6.1

Chen, Sihan (Beijing University of Chemical Technology), Wang, Keqiu (Beijing University of Chemical Technology), Wang, Yadong (Beijing University of Chemical Technology), Shen, Tianyu (Beijing University of Chemical Technology), Wang, Kunfeng (Beijing University of Chemical Technology)

DG-BEV: Depth-Guided BEV 3D Object Detection with Sparse LiDAR Data

Scheduled for presentation during the Regular Session "LiDAR-based perception" (FrBT6), Friday, September 27, 2024, 13:30−13:50, Salon 14

2024 IEEE 27th International Conference on Intelligent Transportation Systems (ITSC), September 24-27, 2024, Edmonton, Canada

Keywords: Sensing, Vision, and Perception; Driver Assistance Systems; Multi-modal ITS

Abstract

In the field of autonomous driving, Bird's Eye View (BEV) technology has garnered significant attention due to its effective use of multi-view, multi-modal data. However, current BEV detection frameworks still face challenges arising from insufficient incorporation of image semantic information and the sparsity of LiDAR data. This paper introduces a depth-guided BEV (DG-BEV) 3D detection method comprising a depth-guided view transformation module (DG-VTM) and a vision-based depth completion module, which together mitigate the limitations of sparse LiDAR data and enhance overall perception. Additionally, a multi-scale semantic enhancement module (MSEM) is proposed to ensure a holistic and nuanced integration of semantic details into the detection process. The DG-VTM and MSEM are designed as plug-and-play units, making them adaptable for integration into various BEV detection models. In experiments on the nuScenes validation set, DG-BEV reaches an NDS of 71.87%, exceeding several state-of-the-art methods.
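
The following is a minimal, hypothetical sketch of how modules with the roles described in the abstract (depth completion from sparse LiDAR, depth-guided view transformation, and multi-scale semantic enhancement) could compose in a lift-splat-style BEV pipeline. Only the module names come from the abstract; every layer choice, tensor shape, and interface below is an assumption for illustration, not the authors' implementation.

    # Hypothetical DG-BEV-style sketch; module names from the abstract,
    # internals are assumptions, not the paper's code.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F


    class DepthCompletion(nn.Module):
        """Densify sparse LiDAR depth using image guidance (assumed design)."""
        def __init__(self, img_channels=64):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(img_channels + 1, 64, 3, padding=1), nn.ReLU(inplace=True),
                nn.Conv2d(64, 1, 3, padding=1),
            )

        def forward(self, img_feat, sparse_depth):
            # img_feat: (B, C, H, W); sparse_depth: (B, 1, H, W), zero where no LiDAR return
            return F.relu(self.net(torch.cat([img_feat, sparse_depth], dim=1)))


    class DGVTM(nn.Module):
        """Depth-guided view transformation: weight image features by a depth
        distribution conditioned on the completed depth map (assumed design)."""
        def __init__(self, img_channels=64, depth_bins=60):
            super().__init__()
            self.depth_head = nn.Conv2d(img_channels + 1, depth_bins, 1)

        def forward(self, img_feat, dense_depth):
            depth_logits = self.depth_head(torch.cat([img_feat, dense_depth], dim=1))
            depth_prob = depth_logits.softmax(dim=1)                 # (B, D, H, W)
            # Outer product lifts features into a per-pixel depth frustum (lift-splat style).
            return depth_prob.unsqueeze(1) * img_feat.unsqueeze(2)   # (B, C, D, H, W)


    class MSEM(nn.Module):
        """Multi-scale semantic enhancement: fuse fine and coarse image features (assumed design)."""
        def __init__(self, channels=64):
            super().__init__()
            self.fuse = nn.Conv2d(channels * 2, channels, 1)

        def forward(self, feat_fine, feat_coarse):
            up = F.interpolate(feat_coarse, size=feat_fine.shape[-2:],
                               mode="bilinear", align_corners=False)
            return self.fuse(torch.cat([feat_fine, up], dim=1))


    if __name__ == "__main__":
        B, C, H, W = 1, 64, 32, 88
        img_feat = torch.randn(B, C, H, W)
        coarse = torch.randn(B, C, H // 2, W // 2)
        sparse_depth = torch.zeros(B, 1, H, W)        # mostly empty, as with sparse LiDAR
        sparse_depth[:, :, ::8, ::8] = 10.0

        sem_feat = MSEM(C)(img_feat, coarse)
        dense_depth = DepthCompletion(C)(sem_feat, sparse_depth)
        frustum = DGVTM(C)(sem_feat, dense_depth)     # to be splatted onto the BEV grid
        print(frustum.shape)                          # torch.Size([1, 64, 60, 32, 88])

Because the two named modules operate on standard image-feature and depth tensors, a sketch like this illustrates the "plug-and-play" claim: the frustum output can feed the BEV pooling stage of existing lift-splat-style detectors without changing their downstream heads.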

 

 
