ITSC 2024 Paper Abstract

Paper ThBT5.3

Schmitt, Jonas (Karlsruhe Institute of Technology (KIT)), Liu, Ruiping (Karlsruhe Institute of Technology), Zheng, Junwei (Karlsruhe Institute of Technology), Zhang, Jiaming (Karlsruhe Institute of Technology (KIT)), Stiefelhagen, Rainer (Karlsruhe Institute of Technology)

Comb, Prune, Distill: Towards Unified Pruning for Vision Model Compression

Scheduled for presentation during the Regular Session "Sensing, Vision, and Perception IV" (ThBT5), Thursday, September 26, 2024, 15:10−15:30, Salon 13

2024 IEEE 27th International Conference on Intelligent Transportation Systems (ITSC), September 24-27, 2024, Edmonton, Canada

This information is tentative and subject to change. Compiled on December 26, 2024

Keywords: Sensing, Vision, and Perception; Driver Assistance Systems; Other Theories, Applications, and Technologies

Abstract

Lightweight and effective models are essential for resource-constrained devices such as intelligent vehicles. Structured pruning offers a promising approach to model compression and efficiency enhancement. However, existing methods often tie pruning techniques to specific model architectures or vision tasks. To address this limitation, we propose a novel unified pruning framework, Comb, Prune, Distill (CPD), which addresses model-agnostic and task-agnostic concerns simultaneously. Our framework employs a combing step to resolve hierarchical layer-wise dependency issues, enabling architecture independence. Additionally, the pruning pipeline adaptively removes parameters based on importance scoring metrics, regardless of the vision task. To help the model retain its learned information, we introduce knowledge distillation during the pruning step. Extensive experiments demonstrate the generalizability of our framework across both convolutional neural network (CNN) and transformer models, as well as image classification and segmentation tasks. In image classification we achieve a speedup of up to 4.3× with an accuracy loss of only 1.8%, and in semantic segmentation a speedup of up to 1.89× with a 5.1% loss in mIoU.
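The abstract describes a pipeline of importance-scored parameter removal combined with knowledge distillation. The sketch below illustrates those two ingredients in minimal form; it is not the paper's implementation, and all names (`channel_importance`, `prune_channels`, `distill_loss`) as well as the L1-norm importance metric and squared-error mimic term are illustrative assumptions.

```python
# Hypothetical sketch of importance-based structured pruning with a
# distillation term, loosely following the pipeline named in the abstract.
# The L1-norm score and the squared-error teacher-mimic loss are assumptions,
# not the metrics used by CPD.

def channel_importance(weights):
    """L1-norm importance score per output channel (assumed metric)."""
    return [sum(abs(w) for w in channel) for channel in weights]

def prune_channels(weights, keep_ratio):
    """Keep the top `keep_ratio` fraction of channels by importance score."""
    scores = channel_importance(weights)
    n_keep = max(1, int(len(weights) * keep_ratio))
    keep = sorted(range(len(weights)), key=lambda i: scores[i],
                  reverse=True)[:n_keep]
    # Preserve the original channel order among the survivors.
    return [weights[i] for i in sorted(keep)]

def distill_loss(student_logits, teacher_logits, task_loss, alpha=0.5):
    """Blend the task loss with a mean-squared mimic of the teacher's logits."""
    mimic = sum((s - t) ** 2
                for s, t in zip(student_logits, teacher_logits)) / len(student_logits)
    return alpha * task_loss + (1 - alpha) * mimic
```

For example, pruning three channels with `keep_ratio=0.67` drops the channel with the smallest L1 norm, while `distill_loss` would be minimized during the pruning step so the compressed student keeps tracking the teacher's outputs.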


All Content © PaperCept, Inc.