Author: Wenshuo Wang

cs.AI

LLM Reasoning Is Latent, Not the Chain of Thought

arXiv:2604.15726v1 Announce Type: new
Abstract: This position paper argues that large language model (LLM) reasoning should be studied as latent-state trajectory formation rather than as faithful surface chain-of-thought (CoT). This matters because cl…
