GD-FPS: Growth-Driven Feedforward Parameter Selection for Efficient Fine-Tuning
arXiv:2510.27359v2 Announce Type: replace
Abstract: Parameter-Efficient Fine-Tuning (PEFT) has emerged as a key strategy for adapting large-scale pre-trained models to downstream tasks, but existing approaches face notable limitations. Addition-based methods, such as Adapters, introduce inference latency and engineering complexity, whereas selection-based methods like Gradient-based Parameter Selection (GPS) require a full backward pass. This reliance on gradients not only incurs massive memory usage and substantial computational latency, but also leaves the selection vulnerable to the randomness of stochastic batch sampling. To resolve this, we propose Growth-Driven Feedforward Parameter Selection (GD-FPS). Operating entirely via forward passes, this strictly gradient-free method identifies the optimal parameter subset by scaling intrinsic weight magnitudes by their relative activation growth against a pre-trained anchor. Evaluated on $26$ visual tasks spanning image classification and semantic segmentation, GD-FPS achieves competitive or superior performance over state-of-the-art PEFT baselines. Crucially, compared to GPS, it reduces peak memory usage by nearly $18\times$ and runs over $2.7\times$ faster during the parameter selection stage. By guaranteeing deterministic selection, GD-FPS offers a memory-efficient, fast, and robust solution for fine-tuning.
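
The abstract specifies the selection rule only in words: score each weight by its magnitude scaled by the relative growth of its input activation over a frozen pre-trained anchor, using forward passes alone. The sketch below illustrates one plausible reading of that rule in PyTorch; the function names (gdfps_scores, select_mask), the per-feature mean-absolute-activation growth statistic, and the top-k thresholding are assumptions for illustration, not the authors' released implementation.

```python
import torch


@torch.no_grad()  # forward-only: no gradients are ever computed
def gdfps_scores(weight, act_anchor, act_current, eps=1e-8):
    """Score weights by |w| scaled by relative activation growth.

    weight:      (out_dim, in_dim) weight matrix of a linear layer.
    act_anchor:  (batch, in_dim) input activations from the frozen
                 pre-trained model (the anchor).
    act_current: (batch, in_dim) input activations on the downstream data.
    The exact growth statistic is an assumption; here it is the ratio of
    mean |activation| per input feature against the anchor.
    """
    growth = act_current.abs().mean(0) / (act_anchor.abs().mean(0) + eps)
    # Broadcast the per-input-feature growth over the rows of the weight.
    return weight.abs() * growth.unsqueeze(0)


@torch.no_grad()
def select_mask(weight, act_anchor, act_current, keep_ratio=0.01):
    """Return a boolean mask keeping the top-scoring fraction of weights.

    Deterministic given fixed activation statistics, so repeated runs
    select the same subset (no stochastic-batch dependence).
    """
    scores = gdfps_scores(weight, act_anchor, act_current)
    k = max(1, int(keep_ratio * scores.numel()))
    threshold = scores.flatten().topk(k).values.min()
    return scores >= threshold


# Usage on toy shapes: mask out all but the top 1% of weights for tuning.
if __name__ == "__main__":
    w = torch.randn(256, 128)
    a_anchor = torch.randn(64, 128)
    a_current = torch.randn(64, 128) * 1.5  # pretend activations grew
    mask = select_mask(w, a_anchor, a_current, keep_ratio=0.01)
    print(f"trainable weights: {mask.sum().item()} / {mask.numel()}")
```

Because the scores depend only on weight magnitudes and aggregate activation statistics, the selection stage needs no optimizer state or backward graph, which is consistent with the memory and speed savings over GPS that the abstract reports.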