ITSC 2024 Paper Abstract


Paper ThBT15.1

Chaghazardi, Zahra (University of Surrey), Fallah, Saber (University of Surrey), Tamaddoni-Nezhad, Alireza (University of Surrey)

Trustworthy Vision for Autonomous Vehicles: A Robust Logic-Infused Deep Learning Approach

Scheduled for presentation during the Poster Session "Safety and Reliability Techniques for Autonomous Vehicles" (ThBT15), Thursday, September 26, 2024, 14:30−16:30, Foyer

2024 IEEE 27th International Conference on Intelligent Transportation Systems (ITSC), September 24-27, 2024, Edmonton, Canada


Keywords: Advanced Vehicle Safety Systems; Sensing, Vision, and Perception; Human Factors in Intelligent Transportation Systems

Abstract

Deep learning is a fundamental pillar of image recognition in autonomous vehicles (AVs), enabling accurate predictions from raw, unprocessed data. However, unlike human cognition, deep learning models are susceptible to adversarial attacks. This paper proposes a novel approach, termed Robust Logic-infused Deep Learning (RLDL), for traffic sign recognition. RLDL employs Inductive Logic Programming (ILP) to derive logical rules from positive and negative examples. These rules are then transformed into a matrix of logical constraints, which is used to assess the logical consistency of the model's predictions. This consistency measure is incorporated into the neural network's training through the loss function. The study examines how integrating logical constraints into deep learning models affects the reliability of vision tasks in AVs. Our experiments demonstrate that the proposed method substantially improves traffic sign recognition accuracy under adversarial attacks.
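
The abstract does not give the exact formulation of the RLDL loss. The following is a minimal, hypothetical sketch of one way an ILP-derived constraint matrix could be folded into a training loss, assuming a PyTorch-style setup; all names, tensor shapes, and the weighting term lam are illustrative assumptions, not the authors' implementation.

    import torch
    import torch.nn.functional as F

    def logic_infused_loss(logits, targets, constraint_matrix, lam=0.5):
        # logits:            (batch, num_classes) raw network outputs
        # targets:           (batch,) ground-truth class indices
        # constraint_matrix: (num_rules, num_classes) binary matrix; entry [r, c] = 1
        #                    means class c is consistent with logical rule r
        #                    (a hypothetical encoding of the ILP-derived rules)
        # lam:               weight of the consistency term (assumed hyperparameter)

        ce = F.cross_entropy(logits, targets)            # standard supervised term
        probs = F.softmax(logits, dim=-1)                # predicted class distribution

        # probability mass each example places on classes consistent with each rule
        rule_satisfaction = probs @ constraint_matrix.T  # (batch, num_rules)

        # penalise probability mass placed on rule-violating classes
        consistency_penalty = (1.0 - rule_satisfaction).mean()

        return ce + lam * consistency_penalty

In such a setup, the constraint matrix would be built offline from the ILP-learned rules, and at training time the penalty term would discourage predictions that are logically inconsistent with those rules.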
