ITSC 2024 Paper Abstract

Paper FrAT17.8

Tang, Xuewei (Tsinghua University), Jiang, Kun (Tsinghua University), Yang, Mengmeng (Tsinghua University), Wen, Tuopu (Tsinghua University), Jia, Peijin (Tsinghua University), Cui, Le (DiDi Voyager), Luo, Mingshan (DiDi Voyager), Sheng, Kehua (DiDi Voyager), Zhang, Bo (Tsinghua University), Yang, Diange (State Key Laboratory of Automotive Safety and Energy, Collaborat)

TSCMapNet: Temporal Spatial Consistency for Online Mapping

Scheduled for presentation during the Poster Session "Transportation Data Analysis and Calibration" (FrAT17), Friday, September 27, 2024, 10:30–12:30, Foyer

2024 IEEE 27th International Conference on Intelligent Transportation Systems (ITSC), September 24–27, 2024, Edmonton, Canada

This information is tentative and subject to change. Compiled on December 26, 2024

Keywords: Sensing, Vision, and Perception; Network Modeling; Other Theories, Applications, and Technologies

Abstract

High-definition (HD) maps are an essential component of autonomous vehicles, providing the precise environmental information critical for navigation and safety. Despite recent progress in online mapping, current methods frequently neglect the temporal and spatial coherence intrinsic to map data. This oversight leads to each frame being predicted in isolation, causing inconsistencies in the shape and distribution of the same map elements across frames. Such discontinuities not only hinder the seamless integration required to construct a unified global HD map but also degrade the accuracy of the map representation. This paper introduces a network architecture that addresses these limitations by exploiting the temporal and spatial consistency of map elements. The proposed network operates on the premise that, from a bird's-eye view (BEV), the geometric form and spatial arrangement of a map element are stable across successive frames. By capitalizing on this temporal stability, the network improves the precision and integrity of map element detection, a key factor in assembling comprehensive global maps. Our approach achieves a mean Average Precision (mAP) of 64.1% on the nuScenes dataset, outperforming current state-of-the-art techniques and underscoring its ability to leverage spatiotemporal correlations. This advancement represents a significant step forward in HD map construction for autonomous driving.
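The abstract's core premise — a map element's BEV geometry is world-fixed, so successive frames should agree once ego motion is compensated — can be illustrated with a minimal sketch. The snippet below is not the paper's actual architecture: the function names (`warp_to_current`, `fuse_predictions`), the polyline-vertex representation, and the planar SE(2) ego-motion model are illustrative assumptions. It warps the previous frame's predicted vertices into the current ego frame and averages them with the current prediction.

```python
import numpy as np

# Hypothetical illustration of the temporal-consistency idea: map elements
# (here, polyline vertices in the ego BEV frame) are static in the world, so
# the previous frame's prediction can be warped into the current ego frame
# via the ego motion and fused with the current frame's prediction.

def warp_to_current(points_prev, dx, dy, dtheta):
    """Warp 2D points from the previous ego frame into the current one.

    (dx, dy, dtheta) is the ego pose change between frames, expressed in
    the previous frame. For a world-fixed point p: p_curr = R^T (p_prev - t).
    """
    c, s = np.cos(dtheta), np.sin(dtheta)
    R = np.array([[c, -s], [s, c]])
    return (points_prev - np.array([dx, dy])) @ R  # row-vector form of R^T p

def fuse_predictions(points_warped, points_curr, alpha=0.5):
    """Blend the warped previous prediction with the current one
    (a simple exponential moving average over matched vertices)."""
    return alpha * points_warped + (1.0 - alpha) * points_curr

# Ego drove 1 m forward (+x) with no rotation; a lane vertex previously
# 2 m ahead should now appear 1 m ahead.
prev_vertices = np.array([[2.0, 0.0], [4.0, 0.0]])
warped = warp_to_current(prev_vertices, 1.0, 0.0, 0.0)
curr_vertices = np.array([[1.1, 0.05], [3.0, -0.05]])
print(fuse_predictions(warped, curr_vertices))  # smoother than either frame alone
```

In a learned system the averaging step would be replaced by a network module (e.g. attention over warped BEV features), but the warp-then-fuse structure is what makes the per-frame predictions temporally consistent.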

All Content © PaperCept, Inc.

