LocalLLaMA: I don’t believe this benchmark. A 27B model next to Opus 4.5? Can anyone confirm with a real agentic workflow? /u/Wonderful-Ad-5952, April 22, 2026
LocalLLaMA: Opus = 0.5T × 10 = ~5T parameters? /u/Wonderful-Ad-5952, April 9, 2026