cs.CL, cs.LG

Sparse Memory Finetuning as a Low-Forgetting Alternative to LoRA and Full Finetuning

arXiv:2605.03229v1 Announce Type: new
Abstract: Adapting a pretrained language model to a new task often degrades the general capabilities it already had, a problem known as catastrophic forgetting. Sparse Memory Finetuning (SMF) aims to avoid this by a…
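The truncated abstract does not spell out SMF's mechanism, so the following is only a generic illustration of the contrast it names, not the paper's method: full finetuning moves every parameter, while a sparse update restricted to a small masked subset keeps the model closer to its pretrained weights. All names and sizes here are hypothetical stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a pretrained model's parameters (hypothetical toy scale).
pretrained = rng.normal(size=1000)

# A gradient for the new task (hypothetical; in practice from backprop).
task_grad = rng.normal(size=1000)
lr = 0.1

# Full finetuning: every parameter moves, so the model can drift far
# from its pretrained state (one driver of catastrophic forgetting).
full = pretrained - lr * task_grad

# Sparse update: only a small, fixed subset of parameters may change;
# the remaining parameters stay exactly at their pretrained values.
mask = np.zeros(1000)
mask[rng.choice(1000, size=50, replace=False)] = 1.0
sparse = pretrained - lr * task_grad * mask

drift_full = np.linalg.norm(full - pretrained)
drift_sparse = np.linalg.norm(sparse - pretrained)
print(drift_sparse < drift_full)  # the sparse update stays closer to the pretrained weights
```

The masked update perturbs at most 50 of the 1000 toy parameters, so its distance from the pretrained weights is necessarily smaller for the same gradient and learning rate; whether SMF realizes this idea this way is not stated in the visible abstract.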