Label-efficient underwater species classification with logistic regression on frozen foundation model embeddings
arXiv:2604.00313v2 Announce Type: replace
Abstract: Automated species classification from underwater imagery is bottlenecked by the cost of expert annotation, and supervised models trained on one dataset rarely transfer to new conditions. We investigate whether a simple classifier operating on frozen foundation model embeddings can close this gap. Using frozen DINOv3 ViT-B/16 embeddings with no fine-tuning, we train a logistic regression classifier and evaluate it on the AQUA20 benchmark (20 marine species). At full supervision, logistic regression achieves 88.5% macro F1 versus ConvNeXt's 88.9%, a gap of 0.4 percentage points, while outperforming the supervised baseline on 8 of 20 species. Under label scarcity, with 21 labeled examples per class (approximately 6% of the training labels), macro F1 exceeds 80%. This near-parity with end-to-end supervised learning demonstrates that these general-purpose, frozen representations exhibit strong linear separability at the species level in the underwater domain. Our approach requires no deep-network training, no domain-specific data engineering, and no underwater-adapted models, establishing a practical, immediately deployable baseline for label-efficient marine species recognition. All results are reported on the held-out test set over 100 random seed initialisations. This is a preliminary report; further evaluations and ablations are forthcoming.
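The pipeline the abstract describes, a multinomial logistic regression fitted on frozen embeddings and scored by macro F1, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the frozen DINOv3 ViT-B/16 embeddings are replaced here by synthetic class-clustered vectors, and the class count, dimensionality, and hyperparameters are illustrative stand-ins (AQUA20 has 20 species; ViT-B/16 produces 768-dimensional embeddings).

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for frozen foundation-model embeddings: class-clustered Gaussians.
# (Illustrative sizes only; the paper uses 20 species and 768-dim embeddings.)
n_classes, dim, per_class = 5, 32, 40
centers = rng.normal(size=(n_classes, dim))
X = np.vstack([c + 0.3 * rng.normal(size=(per_class, dim)) for c in centers])
y = np.repeat(np.arange(n_classes), per_class)

def train_logreg(X, y, n_classes, lr=0.5, epochs=300):
    """Multinomial logistic regression by full-batch gradient descent."""
    W = np.zeros((X.shape[1], n_classes))
    b = np.zeros(n_classes)
    Y = np.eye(n_classes)[y]                      # one-hot labels
    for _ in range(epochs):
        logits = X @ W + b
        logits -= logits.max(axis=1, keepdims=True)   # numerical stability
        P = np.exp(logits)
        P /= P.sum(axis=1, keepdims=True)             # softmax probabilities
        G = (P - Y) / len(X)                          # cross-entropy gradient
        W -= lr * (X.T @ G)
        b -= lr * G.sum(axis=0)
    return W, b

def macro_f1(y_true, y_pred, n_classes):
    """Unweighted mean of per-class F1 scores."""
    f1s = []
    for k in range(n_classes):
        tp = np.sum((y_pred == k) & (y_true == k))
        fp = np.sum((y_pred == k) & (y_true != k))
        fn = np.sum((y_pred != k) & (y_true == k))
        denom = 2 * tp + fp + fn
        f1s.append(2 * tp / denom if denom else 0.0)
    return float(np.mean(f1s))

W, b = train_logreg(X, y, n_classes)
pred = np.argmax(X @ W + b, axis=1)
print("macro F1:", round(macro_f1(y, pred, n_classes), 3))
```

In practice one would substitute the synthetic `X` with embeddings extracted once from the frozen backbone and use an off-the-shelf solver (e.g. scikit-learn's `LogisticRegression`); no gradients ever flow into the backbone, which is what makes the approach cheap and label-efficient.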