Which Pieces Does Unigram Tokenization Really Need?

arXiv:2512.12641v2 Announce Type: replace

Abstract: The Unigram tokenization algorithm offers a probabilistic alternative to the greedy merge heuristics of Byte-Pair Encoding. Despite its theoretical elegance, its implementation in practice is complex, limiting its adoption to the SentencePiece package and adapters thereof. We bridge this gap between theory and practice by providing a clear guide to implementation and parameter choices. We also identify a simpler algorithm that accepts slightly higher training loss in exchange for improved compression.
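At inference time, a Unigram model segments text into the sequence of vocabulary pieces with the highest product of unigram probabilities, which can be found with Viterbi dynamic programming. The sketch below illustrates that core step under stated assumptions: the function name and toy vocabulary are illustrative, not from the paper, and the vocabulary is assumed to contain every single character so a segmentation always exists.

```python
import math

def viterbi_segment(text, vocab):
    """Return the max-probability segmentation of `text` into pieces.

    vocab: dict mapping piece -> probability. Assumes every single
    character of `text` is in vocab, so a segmentation always exists.
    Returns (pieces, total_log_prob).
    """
    n = len(text)
    max_len = max(map(len, vocab))  # longest piece we ever need to try
    # best[i]: best log-prob of segmenting text[:i]
    # back[i]: start index of the last piece in that best segmentation
    best = [0.0] + [-math.inf] * n
    back = [0] * (n + 1)
    for end in range(1, n + 1):
        for start in range(max(0, end - max_len), end):
            piece = text[start:end]
            if piece in vocab:
                score = best[start] + math.log(vocab[piece])
                if score > best[end]:
                    best[end] = score
                    back[end] = start
    # Walk the back-pointers to recover the pieces in order.
    pieces, i = [], n
    while i > 0:
        pieces.append(text[back[i]:i])
        i = back[i]
    return pieces[::-1], best[n]
```

Because longer pieces can carry higher probability than the product of their parts, the whole word wins over a character-by-character split; training (not shown) alternates between this E-step-like segmentation and re-estimating piece probabilities, pruning low-value pieces.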
