What Happens When You Use LoRA On a CNN
Exactly what you would expect

Introduction

LoRA usually comes up in the context of LLMs, where the whole idea is pretty simple: freeze a large pretrained model, inject a small low-rank adapter, and fine-tune only that instead of updating the entire pa…
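The frozen-weight-plus-low-rank-adapter idea can be sketched in a few lines of NumPy. This is a minimal illustration under my own naming (`lora_forward`, `alpha`, `r` are assumptions, not the article's code): the pretrained weight `W` stays frozen, and only the small factors `A` and `B` would be trained.

```python
import numpy as np

rng = np.random.default_rng(0)

d_out, d_in, r = 8, 16, 2            # rank r is much smaller than d_out, d_in
W = rng.normal(size=(d_out, d_in))   # pretrained weight: frozen, never updated
A = rng.normal(size=(r, d_in))       # trainable low-rank factor
B = np.zeros((d_out, r))             # trainable factor, zero-init so the
                                     # adapter starts as a no-op
alpha = 4.0                          # scaling hyperparameter

def lora_forward(x):
    # y = W x + (alpha / r) * B A x
    # Only A and B would receive gradients during fine-tuning.
    return W @ x + (alpha / r) * (B @ (A @ x))

x = rng.normal(size=(d_in,))
# With B initialized to zero, the adapted layer matches the frozen one exactly.
assert np.allclose(lora_forward(x), W @ x)
```

Because `B` starts at zero, fine-tuning begins from the pretrained model's behavior, and the extra parameter count is only `r * (d_in + d_out)` per adapted layer instead of `d_in * d_out`.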