
International Journal of Advanced Engineering, Management and Science


Research on Recovery under Pruning Degeneration Using LoRA Technology

(Vol-11, Issue-6, November-December 2025)

Author(s): Hao-Lin Ye, Chih-Ying Chuang


Page No: 009-018
DOI: 10.22161/ijaems.116.2

Keywords:

L1-Norm Pruning, SNIP Pruning, Taylor Pruning, LoRA, Model Sparsity.

Abstract:

Convolutional Neural Networks (CNNs) have made impressive progress across many fields in recent years, but their large model size and high computational cost limit their deployment on resource-constrained devices. Neural network pruning has become one of the most important techniques for addressing this problem, and the choice of importance criterion has a significant impact on pruning's effectiveness. Most research to date, however, has proposed individual criteria or compared criteria under loosely controlled conditions, without systematic comparisons at identical pruning ratios. Moreover, pruning frequently causes a performance drop that must be recovered through fine-tuning; parameter-efficient fine-tuning methods such as LoRA offer a fresh alternative to the high computational cost of conventional global fine-tuning, but how they interact with different pruning criteria remains unknown. This work addresses these gaps through controlled experiments on the CIFAR-10 dataset that evaluate three widely used pruning criteria, L1-Norm pruning, SNIP pruning, and Taylor pruning, at pruning ratios ranging from 30% to 60%. LoRA is systematically incorporated into the pruning-recovery stage for the first time, demonstrating that it is a versatile and effective fine-tuning method that can significantly mitigate pruning-induced performance loss. The study thus provides empirical evidence for selecting suitable pruning and fine-tuning procedures according to practical deployment objectives, whether prioritizing compression rate or accuracy, in support of efficient neural network deployment.
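To make the two ingredients of the abstract concrete, the following is a minimal NumPy sketch (not the authors' code, and all shapes and hyperparameters are illustrative assumptions): it scores convolutional filters by the L1-Norm criterion and prunes the lowest-scoring ones, then shows the LoRA update form W + (alpha/r)·BA used during recovery, where only the low-rank factors A and B would be trained while W stays frozen.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy conv layer: 8 filters of shape (in_channels=3, kH=3, kW=3).
weights = rng.standard_normal((8, 3, 3, 3))

# L1-Norm criterion: a filter's importance is the sum of its absolute weights.
scores = np.abs(weights).reshape(8, -1).sum(axis=1)

# Prune the lowest-scoring filters at a 50% pruning ratio.
ratio = 0.5
n_keep = int(len(scores) * (1 - ratio))
keep = np.sort(np.argsort(scores)[-n_keep:])   # indices of surviving filters
pruned = weights[keep]
print(pruned.shape)                            # (4, 3, 3, 3)

# LoRA recovery sketch for a frozen dense weight W (d_out x d_in):
# the adapted weight is W + (alpha / r) * B @ A, and only
# A (r x d_in) and B (d_out x r) receive gradient updates.
d_out, d_in, r, alpha = 16, 32, 4, 8
W = rng.standard_normal((d_out, d_in))
A = rng.standard_normal((r, d_in)) * 0.01
B = np.zeros((d_out, r))           # B starts at zero, so the adapter is a no-op
W_adapted = W + (alpha / r) * B @ A
assert np.allclose(W_adapted, W)   # before training, outputs are unchanged
```

Initializing B to zero is the standard LoRA choice: the pruned network's behavior is untouched at the start of recovery, and fine-tuning moves only the 2·r·d parameters of the adapter rather than the full weight matrix.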

Article Info:

Received: 28 Sept 2025; Received in revised form: 31 Oct 2025; Accepted: 04 Nov 2025; Available online: 08 Nov 2025
