cs.AI

When Do We Need LLMs? A Diagnostic for Language-Driven Bandits

arXiv:2604.05859v1 Announce Type: new
Abstract: We study Contextual Multi-Armed Bandits (CMABs) for non-episodic sequential decision-making problems where the context includes both textual and numerical information (e.g., recommendation systems, dynam…
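To make the CMAB setting concrete, here is a minimal sketch of a contextual bandit over numerical context features. This is an illustrative toy (epsilon-greedy with one linear reward model per arm), not the paper's method; the arm count, feature size, exploration rate, and learning rate are all assumptions for the example.

```python
import random

class EpsilonGreedyLinearBandit:
    """Toy contextual bandit: one linear reward model per arm,
    epsilon-greedy action selection. Illustrative only."""

    def __init__(self, n_arms, n_features, epsilon=0.1, lr=0.05):
        self.epsilon = epsilon
        self.lr = lr
        # one weight vector per arm, initialized to zero
        self.weights = [[0.0] * n_features for _ in range(n_arms)]

    def _score(self, arm, context):
        # predicted reward: dot product of arm weights and context
        return sum(w * x for w, x in zip(self.weights[arm], context))

    def select(self, context):
        if random.random() < self.epsilon:  # explore uniformly
            return random.randrange(len(self.weights))
        scores = [self._score(a, context) for a in range(len(self.weights))]
        return max(range(len(scores)), key=scores.__getitem__)  # exploit

    def update(self, arm, context, reward):
        # one SGD step on squared error between predicted and observed reward
        err = reward - self._score(arm, context)
        self.weights[arm] = [w + self.lr * err * x
                             for w, x in zip(self.weights[arm], context)]

random.seed(0)
bandit = EpsilonGreedyLinearBandit(n_arms=2, n_features=2)
# synthetic environment: arm 0 pays off when feature 0 is high, arm 1 when feature 1 is
for _ in range(2000):
    ctx = [random.random(), random.random()]
    arm = bandit.select(ctx)
    reward = ctx[arm]  # reward depends on the chosen arm and the context
    bandit.update(arm, ctx, reward)
```

In the language-driven setting the abstract describes, the numerical context vector here would be augmented with (or replaced by) features derived from text, which is where the question of when an LLM is needed arises.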