ITSC 2025 Paper Abstract

Paper VP-VP.21

Gao, Zihui (Zhejiang University), Zhao, Xinhai (Shanghai Jiao Tong University), Chen, Hao (Zhejiang University)

Improving Monocular 3D Object Detection for Far-Field Scenarios

Scheduled for presentation during the Video Session "On-Demand Video Presentations" (VP-VP), Saturday, November 22, 2025, 08:00−18:00, On-Demand Platform

2025 IEEE 28th International Conference on Intelligent Transportation Systems (ITSC), November 18-21, 2025, Gold Coast, Australia

This information is tentative and subject to change. Compiled on April 2, 2026

Keywords Deep Learning for Scene Understanding and Semantic Segmentation in Autonomous Vehicles, Real-time Object Detection and Tracking for Dynamic Traffic Environments

Abstract

Monocular 3D object detection has been intensively studied for its vital role in autonomous driving and robotics. Because the problem is ill-posed, instance depth estimation is one of its most critical factors. Distance estimation for far objects (over 30 m) is especially challenging due to the scarcity of annotated far objects and their limited image resolution, and improving it is a pressing problem for monocular 3D perception in the wild. We propose a real-time, single-stage, anchor-free framework named MonoFar. On the one hand, the framework adopts a feature fusion module that enhances features for 3D object detection with a local contextual depth representation, which has a crucial effect on far objects; the module adaptively selects region-of-interest contextual features guided by predicted heatmaps. On the other hand, it benefits from a 3D scale data augmentation that mitigates the imbalance between close and far objects. Among dense-depth-based methods, MonoFar achieves state-of-the-art accuracy and the best result on Hard objects, with an inference time only 1/15 that of CaDDN, on the KITTI 3D object detection benchmark.
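The abstract describes a module that adaptively selects region-of-interest contextual features guided by predicted heatmaps. The following is a minimal, hypothetical sketch of that general idea (not the paper's actual module): pick the strongest heatmap peaks and pool local feature patches around them as context vectors. All names and the patch-pooling choice here are illustrative assumptions.

```python
import numpy as np

def select_roi_context(features, heatmap, k=2, patch=3):
    """Hypothetical sketch: pool small feature patches around the k
    strongest heatmap responses, yielding per-object context vectors.
    features: (C, H, W) feature map; heatmap: (H, W) predicted centers."""
    C, H, W = features.shape
    # Indices of the k strongest heatmap responses (proxy for object centers).
    top = np.argsort(heatmap.ravel())[::-1][:k]
    r = patch // 2
    rois = []
    for idx in top:
        y, x = divmod(idx, W)
        # Clip the patch window to the feature-map borders.
        y0, y1 = max(0, y - r), min(H, y + r + 1)
        x0, x1 = max(0, x - r), min(W, x + r + 1)
        # Average-pool the local patch into one context vector of length C.
        rois.append(features[:, y0:y1, x0:x1].mean(axis=(1, 2)))
    return np.stack(rois)  # shape (k, C)

feats = np.random.rand(8, 16, 16)
hm = np.zeros((16, 16))
hm[4, 5], hm[10, 12] = 1.0, 0.9      # two synthetic "object center" peaks
ctx = select_roi_context(feats, hm)
print(ctx.shape)  # (2, 8)
```

In a real detector, the pooled context vectors would be fused back into the detection features (e.g. by concatenation or attention) before the depth head; this sketch only illustrates the heatmap-guided selection step.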

All Content © PaperCept, Inc.
