Multi-Armed Bandits With Best-Action Queries
arXiv:2605.08287v1 Announce Type: cross
Abstract: We study \emph{multi-armed bandits} (MABs) augmented with \emph{best-action queries}, in which the learner may additionally query an oracle that reveals the best arm in the current round. This setting …
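The interaction protocol the abstract describes can be sketched in a few lines: a learner runs an ordinary bandit policy, but on some rounds spends a query to learn (and play) the best arm. This is an illustrative simulation only, assuming Bernoulli rewards, an epsilon-greedy baseline, and an evenly spaced query schedule; none of these choices reflect the paper's actual algorithm or analysis.

```python
import random

def simulate(mu, T, query_budget, seed=0):
    """Bernoulli bandit with a budget of best-action queries.

    On query rounds an oracle reveals the best arm (here, the arm
    with the highest mean), which the learner plays; otherwise the
    learner follows epsilon-greedy. The policy, query schedule, and
    all names are illustrative assumptions, not the paper's method.
    """
    rng = random.Random(seed)
    k = len(mu)
    counts = [0] * k
    sums = [0.0] * k
    best = max(range(k), key=lambda i: mu[i])
    interval = max(1, T // max(query_budget, 1))  # spread queries evenly
    queries_left = query_budget
    regret = 0.0
    for t in range(T):
        if queries_left > 0 and t % interval == 0:
            arm = best            # spend a query: oracle reveals best arm
            queries_left -= 1
        elif rng.random() < 0.1:
            arm = rng.randrange(k)  # explore uniformly
        else:
            # exploit the empirically best arm; untried arms first
            arm = max(range(k), key=lambda i: sums[i] / counts[i]
                      if counts[i] else float("inf"))
        reward = 1.0 if rng.random() < mu[arm] else 0.0
        counts[arm] += 1
        sums[arm] += reward
        regret += mu[best] - mu[arm]  # pseudo-regret accumulated
    return regret
```

Under this toy model, raising the query budget can only replace suboptimal pulls with best-arm pulls on the scheduled rounds, which is the intuition behind studying how query access reduces regret.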