MemFlow: A Lightweight Forward Memorizing Framework for Quick Domain Adaptive Feature Mapping

arXiv:2402.14598v3

Abstract: Deploying pretrained visual models in real-world environments often suffers significant performance degradation due to the diversity of testing scenarios. Continuously adapting models on edge devices using unlabeled data collected from the target domain is highly effective for improving generalization. However, gradient-backpropagation-based optimization of the massive parameter sets in deep neural networks is far more time-consuming than forward inference, rendering online learning infeasible on low-power edge devices. To address this challenge, we propose MemFlow, a lightweight gradient-free forward-memorizing framework that keeps the backbone frozen and efficiently fine-tunes the mapping between features and predictions. Specifically, MemFlow employs randomly connected neurons to memorize feature-label associations; spiking signals propagate through the network, and predictions are generated by combining neuron-stored memories according to their confidence levels. Notably, MemFlow also supports reinforced memorization of feature mappings from unlabeled data, enabling rapid adaptation to new domains. Extensive experiments on four real-world cross-domain datasets demonstrate that MemFlow achieves performance improvements of up to 10% while consuming less than 1% of the computational time required by traditional domain adaptation methods. The code is available at https://github.com/so-link/MemFlow.
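To make the forward-memorizing idea concrete, here is a minimal sketch of one plausible reading of such a layer. All names (`MemoryLayer`, `memorize`, `predict`, `top_k`) are hypothetical and not taken from the MemFlow repository: features from a frozen backbone are routed through fixed random connections, the most strongly activated neurons "spike", and each spiking neuron accumulates class counts as its memory. Prediction combines the spiking neurons' stored class distributions, weighted by how peaked (confident) each distribution is, with no gradients computed anywhere.

```python
import numpy as np

class MemoryLayer:
    """Hypothetical gradient-free memory layer (a sketch, not MemFlow's code).

    Fixed random projections route backbone features to neurons; each neuron
    stores per-class counts ("memories") for the features that activate it.
    """

    def __init__(self, feat_dim, n_neurons, n_classes, top_k=8, seed=0):
        rng = np.random.default_rng(seed)
        # Random feature-to-neuron connections, frozen (never trained).
        self.W = rng.standard_normal((feat_dim, n_neurons))
        # Per-neuron memory of feature-label associations.
        self.memory = np.zeros((n_neurons, n_classes))
        self.top_k = top_k

    def _spiking(self, feat):
        # The top_k most strongly activated neurons "spike" for this feature.
        act = feat @ self.W
        return np.argsort(act)[-self.top_k:]

    def memorize(self, feat, label):
        # Forward-only update: each spiking neuron strengthens its memory
        # of this label. No backpropagation is involved.
        self.memory[self._spiking(feat), label] += 1.0

    def predict(self, feat):
        # Combine the memories of spiking neurons, weighting each neuron by
        # the confidence (peakedness) of its stored class distribution.
        mem = self.memory[self._spiking(feat)]
        probs = mem / (mem.sum(axis=1, keepdims=True) + 1e-9)
        conf = probs.max(axis=1, keepdims=True)
        return int(np.argmax((conf * probs).sum(axis=0)))
```

Under this reading, the "reinforced memorization with unlabeled data" mentioned in the abstract could be sketched as self-training: calling `layer.memorize(feat, layer.predict(feat))` on unlabeled target-domain features so that confident predictions reinforce the stored mappings. That, too, is an assumption rather than the paper's documented procedure.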
