Crowded in B-Space: Calibrating Shared Directions for LoRA Merging
arXiv:2604.16826v1 Announce Type: new
Abstract: Merging separately trained LoRA adapters is a practical alternative to joint multi-task training, but it often hurts performance. Existing methods usually treat the LoRA update $\Delta W = BA$ as a singl…
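A minimal NumPy sketch (not the paper's method; all matrices are random stand-ins) of why treating the LoRA update $\Delta W = BA$ as a single matrix matters when merging: averaging the full updates $B_i A_i$ differs from averaging the $B$ and $A$ factors separately, because the factor average introduces cross terms.

```python
import numpy as np

rng = np.random.default_rng(0)
d, k, r = 8, 6, 2  # output dim, input dim, LoRA rank (illustrative sizes)

# Two independently trained LoRA adapters (random stand-ins here)
B1, A1 = rng.normal(size=(d, r)), rng.normal(size=(r, k))
B2, A2 = rng.normal(size=(d, r)), rng.normal(size=(r, k))

# Merging the full updates Delta W = B A, each treated as one matrix
delta_avg = 0.5 * (B1 @ A1 + B2 @ A2)

# Naively averaging the factors first, then multiplying
factor_avg = (0.5 * (B1 + B2)) @ (0.5 * (A1 + A2))

# The two generally differ: the factor average expands to
# 0.25 * (B1A1 + B1A2 + B2A1 + B2A2), whose cross terms B1A2
# and B2A1 are absent from the update average.
gap = np.linalg.norm(delta_avg - factor_avg)
print(gap)
```

The discrepancy equals $\tfrac14 (B_1 - B_2)(A_1 - A_2)$, which is nonzero whenever the adapters' factor spaces disagree.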