LocalLLaMA

My experience using OpenClaude and Gemma 4 26b

Hi guys, I'm relatively new to the local LLM scene, and today I downloaded my first local model, Gemma 4 26b. I'm using Ollama on an M1 Max with 32GB of RAM. When I use Gemma 4 directly inside Ollama, it works like a charm. It …