Ortho-Hydra: Orthogonalized Experts for DiT LoRA
arXiv:2605.03252v1 Announce Type: new
Abstract: LoRA fine-tuning of diffusion transformers (DiT) on multi-style data suffers from \emph{style bleed}: a single low-rank residual cannot represent several distinct artist fingerprints, and the optimizer c…
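The abstract attributes style bleed to several styles sharing one low-rank residual; the title suggests the fix is a set of LoRA experts kept mutually orthogonal. As a minimal sketch of that idea (the penalty form and all names here are assumptions, not the paper's actual formulation), one can penalize overlap between the row spaces of each expert's down-projection matrix:

```python
import numpy as np

def ortho_penalty(As):
    # Hypothetical pairwise penalty: sum over i<j of ||A_i @ A_j^T||_F^2.
    # Zero exactly when every pair of expert down-projections spans
    # mutually orthogonal row subspaces (no shared "style directions").
    total = 0.0
    for i in range(len(As)):
        for j in range(i + 1, len(As)):
            total += np.sum((As[i] @ As[j].T) ** 2)
    return total

rng = np.random.default_rng(0)
d, r = 8, 2  # toy hidden dim and per-expert LoRA rank

# Two experts that share directions -> positive penalty (bleed).
shared = rng.standard_normal((r, d))
print(ortho_penalty([shared, shared]) > 0)   # True

# Experts built from disjoint rows of an orthonormal basis -> zero penalty.
q, _ = np.linalg.qr(rng.standard_normal((d, d)))
experts = [q.T[:r], q.T[r:2 * r]]
print(round(ortho_penalty(experts), 6))      # 0.0
```

Added to a per-style reconstruction loss, such a regularizer would push each expert toward its own subspace, so one artist's fingerprint cannot leak into another expert's update.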