LocalLLaMA

Anthropic admits to having made its hosted models dumber, proving the importance of open-weight, local models

TL;DR: On March 4, we changed Claude Code's default reasoning effort from high to medium to reduce the very long latency—enough to make the UI appear frozen—some users were seeing in high mode. This was the wrong tradeoff. We reverted this c…