Hi everyone, I've been working on Pocket LLM, an Android app for running local LLMs fully offline for private, real-time chat. The latest v1.3.0 update adds:

• LiteRT support for Gemma 3n E2B, Gemma 3n E4B, and Qwen3-0.6B
• Persistent local chat history with a Previous Chats view
• Thinking Mode for supported models
• Better markdown rendering
• Themes, font-size settings, and a more polished chat UI

The goal is to make local LLMs on Android usable as an actual app, not just a basic demo.

Repo: https://github.com/dineshsoudagar/local-llms-on-android

Releases / prebuilt APKs: https://github.com/dineshsoudagar/local-llms-on-android/releases

Would love feedback, especially on model support, performance across devices, and UI/UX.