ITSC 2025 Paper Abstract


Paper FR-EA-T43.1

Li, Ziwei (The University of Tokyo), Hirano, Masahiro (The University of Tokyo), Guo, Qitong (The University of Tokyo), Jia, Ruoyu (The University of Tokyo), Yamakawa, Yuji (The University of Tokyo)

SG-NLOSTrack: Semantic-Guided Non-Line-of-Sight Target Tracking

Scheduled for presentation during the Regular Session "S43b-Multi-Sensor Fusion and Perception for Robust Autonomous Driving" (FR-EA-T43), Friday, November 21, 2025, 13:30−13:50, Stradbroke

2025 IEEE 28th International Conference on Intelligent Transportation Systems (ITSC), November 18-21, 2025, Gold Coast, Australia

This information is tentative and subject to change. Compiled on October 18, 2025

Keywords: Real-time Object Detection and Tracking for Dynamic Traffic Environments, Advanced Sensor Fusion for Robust Autonomous Vehicle Perception, Deep Learning for Scene Understanding and Semantic Segmentation in Autonomous Vehicles

Abstract

Perceiving and tracking non-line-of-sight (NLOS) targets remains an open and underexplored problem for autonomous driving systems. Under occlusion, visual sensors fail to provide reliable observations, significantly increasing collision risk. While radar can detect NLOS targets via multipath propagation, existing methods typically rely on prior knowledge of the surrounding environment. To address these challenges, we propose SG-NLOSTrack, a dual-branch semantic-guided framework that enhances 4D radar-based NLOS tracking with image semantics. One branch leverages image semantics to identify environmental structures in radar point clouds and then reconstructs their geometry. The other performs 3D point cloud segmentation to detect NLOS targets, whose positions are inferred by mirroring their first-order reflections. A Kalman filter then tracks the recovered targets over time to ensure temporal consistency. We evaluate our approach on the UT-NLOS Dataset, which features typical urban NLOS scenarios with point-wise semantic labels. Experimental results show that SG-NLOSTrack achieves a mean absolute error (MAE) of 0.4190 m and a total tracking success rate (TSR_total) of 80%, demonstrating its feasibility and effectiveness under NLOS conditions.
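The paper's implementation is not public; as a loose illustration of the two geometric ingredients the abstract names — mirroring a first-order radar reflection across a reconstructed wall, and Kalman smoothing of the recovered positions — here is a minimal 2D sketch. All function names, parameters, and noise values below are hypothetical, not taken from the paper:

```python
import math

def mirror_across_wall(px, py, ax, ay, bx, by):
    """Mirror a radar ghost detection (px, py) across the wall through
    (ax, ay)-(bx, by). Under a first-order (single-bounce) specular
    reflection, the true NLOS target sits at the mirror image of its
    ghost with respect to the wall plane."""
    dx, dy = bx - ax, by - ay
    length = math.hypot(dx, dy)
    nx, ny = -dy / length, dx / length          # unit normal to the wall
    dist = (px - ax) * nx + (py - ay) * ny      # signed distance to wall
    return px - 2.0 * dist * nx, py - 2.0 * dist * ny

def kalman_1d(x, P, z, Q=0.01, R=0.25):
    """One predict + update step of a scalar Kalman filter with a static
    motion model, applied per coordinate to smooth recovered positions.
    Q: process noise variance, R: measurement noise variance (made up)."""
    P = P + Q                                   # predict: inflate variance
    K = P / (P + R)                             # Kalman gain
    return x + K * (z - x), (1.0 - K) * P       # corrected state, variance
```

For example, a ghost detected at (1, 2) behind a wall lying along the x-axis mirrors to a true target position of (1, -2); feeding the mirrored coordinates of successive frames through `kalman_1d` then damps frame-to-frame jitter.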

All Content © PaperCept, Inc.
