ITSC 2025 Paper Abstract

Paper FR-LM-T37.2

Kösel, Michael (Ulm University), Schreiber, Marcel (Robert Bosch GmbH), Ulrich, Michael (Robert Bosch GmbH), Gläser, Claudius (Robert Bosch GmbH), Dietmayer, Klaus (Ulm University)

ALOOD: Exploiting Language Representations for LiDAR-Based Out-Of-Distribution Object Detection

Scheduled for presentation during the Regular Session "S37a-Reliable Perception and Robust Sensing for Intelligent Vehicles" (FR-LM-T37), Friday, November 21, 2025, 10:50−11:10, Coolangatta 1

2025 IEEE 28th International Conference on Intelligent Transportation Systems (ITSC), November 18-21, 2025, Gold Coast, Australia

This information is tentative and subject to change. Compiled on October 18, 2025

Keywords: Deep Learning for Scene Understanding and Semantic Segmentation in Autonomous Vehicles

Abstract

LiDAR-based 3D object detection plays a critical role in reliable and safe autonomous driving systems. However, existing detectors often produce overly confident predictions for objects that do not belong to any known category. These so-called out-of-distribution (OOD) objects were not part of the training data and therefore lead to incorrect predictions, posing significant safety risks. To address this challenge, we propose ALOOD (Aligned LiDAR representations for Out-Of-Distribution Detection), a novel approach that incorporates language representations from a vision-language model (VLM). By aligning the object features of the object detector with the feature space of the VLM, we can treat the detection of OOD objects as a zero-shot classification task. We demonstrate competitive performance on the nuScenes OOD benchmark, establishing a novel approach to OOD object detection in LiDAR using language representations. The source code is available at https://github.com/uulm-mrm/mmood3d.
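The core idea — scoring a detector feature, once aligned to a VLM's embedding space, against text embeddings of the known classes — can be illustrated with a minimal sketch. This is not the paper's implementation (see the linked repository for that); the function name, the cosine-similarity scoring, the temperature value, and the max-softmax OOD criterion are all illustrative assumptions.

```python
import numpy as np

def zero_shot_ood_score(object_feature, text_embeddings, temperature=0.07):
    """Score one aligned object feature against known-class text embeddings.

    Illustrative sketch: cosine similarity to each class prompt, softmax over
    the known classes; a low maximum probability suggests the object is OOD.
    """
    # L2-normalize so dot products become cosine similarities
    f = object_feature / np.linalg.norm(object_feature)
    t = text_embeddings / np.linalg.norm(text_embeddings, axis=1, keepdims=True)
    sims = t @ f                       # cosine similarity to each class
    logits = sims / temperature        # temperature-scaled, CLIP-style
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    return sims, probs.max()           # low max probability => likely OOD

# Toy usage with orthogonal "class" embeddings: a feature matching class 0
# yields high confidence, while a feature orthogonal to all classes yields
# a uniform (low-confidence) distribution.
text = np.eye(3)
sims_in, conf_in = zero_shot_ood_score(np.array([1.0, 0.0, 0.0]), text)
sims_out, conf_out = zero_shot_ood_score(np.array([0.0, 0.0, 0.0, 1.0])[:3] + 0.0
                                         if False else np.ones(3), text)
```

In practice the text embeddings would come from the VLM's text encoder applied to class-name prompts, and the object features from the detector head trained with the alignment objective.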
