LocalLLaMA

We just shipped Gemma 4 support in Off Grid 🔥 — an open-source mobile app with on-device inference and zero cloud. Android is live; iOS is coming soon.

We shipped Gemma 4 (E2B and E4B edge variants) in Off Grid today — our open-source, offline-first AI app for Android and iOS. What makes this different from other local LLM setups:

→ No server, no Python, no laptop. Runs entirely on your phone's NP…