Omnimodal Dataset Distillation via High-order Proxy Alignment
arXiv:2604.10666v1 Announce Type: cross
Abstract: Dataset distillation compresses large-scale datasets into compact synthetic sets while preserving training performance, but existing methods are largely restricted to single-modal or bimodal settings. …
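To make the core idea concrete, here is a minimal sketch of dataset distillation in its simplest distribution-matching form: each class in a toy two-class dataset is compressed to a single synthetic point (its class-conditional feature mean), and a classifier trained only on those two points is evaluated on the full real data. This is a generic illustration of the dataset-distillation setting, not the paper's high-order proxy alignment method; all names and the toy data are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "real" dataset: two well-separated Gaussian classes in 2-D.
n_per_class, d = 500, 2
means = np.array([[-2.0, 0.0], [2.0, 0.0]])
X = np.vstack([rng.normal(means[c], 1.0, size=(n_per_class, d))
               for c in range(2)])
y = np.repeat([0, 1], n_per_class)

# Distillation, minimal distribution-matching form: one synthetic
# example per class, chosen to match the class-conditional mean.
X_syn = np.vstack([X[y == c].mean(axis=0) for c in range(2)])
y_syn = np.array([0, 1])

# "Train" a nearest-centroid classifier on the 2-point synthetic set
# and evaluate it on all 1000 real examples.
dists = np.linalg.norm(X[:, None, :] - X_syn[None, :, :], axis=2)
pred = y_syn[dists.argmin(axis=1)]
acc = (pred == y).mean()
print(f"accuracy of model trained on 2 synthetic points: {acc:.3f}")
```

With well-separated classes, the two synthetic points recover nearly the same decision boundary as training on the full dataset, which is the compression-with-performance property the abstract describes; real methods replace the closed-form mean with learned synthetic examples optimized by gradient or distribution matching.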