G-Loss: Graph-Guided Fine-Tuning of Language Models
arXiv:2604.25853v2 Announce Type: replace-cross
Abstract: Traditional loss functions, including cross-entropy, contrastive, triplet, and supervised contrastive losses, used for fine-tuning pre-trained language models such as BERT, operate only within…