turning my phone into a local AI server (open source project update)

I made an app called A.I.R.I that runs LLMs locally on your phone. It's had a pretty big upgrade since its initial release, and it's starting to feel like more than just a chat app.

The main idea now is: your phone = a personal AI server

It can:

- run models locally
- be accessed by other devices on your Wi-Fi
- support voice conversations (TTS + STT)
- handle documents with a simple RAG pipeline
- manage and download models inside the app
- keep chat history + user profiles for context

I also completely refactored the architecture so it's modular and easier to extend (which was badly needed).
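To make the "other devices on your Wi-Fi" idea concrete, here's a rough sketch of what a client on your laptop might look like. The IP address, port, and endpoint path are all assumptions (A.I.R.I's actual API may be completely different); the payload shape follows the common OpenAI-style chat format that many local LLM servers expose.

```python
# Hypothetical client talking to the phone over the local network.
# PHONE_ADDR and the endpoint path are placeholders, not A.I.R.I's real API.
import json
import urllib.request

PHONE_ADDR = "http://192.168.1.50:8080"  # assumed phone IP and port


def build_chat_request(prompt: str, model: str = "local-model") -> dict:
    """Build an OpenAI-style chat completion payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }


def ask_phone(prompt: str) -> str:
    """POST the prompt to the phone and return the reply text."""
    payload = json.dumps(build_chat_request(prompt)).encode()
    req = urllib.request.Request(
        f"{PHONE_ADDR}/v1/chat/completions",  # assumed endpoint path
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

The nice part of this setup is that any device on the network can use the phone's model with nothing more than an HTTP client.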

Still a work in progress, but this is the first time it feels like the original idea is actually working. Repo: Link
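For anyone unfamiliar with what a "simple RAG pipeline" means in practice, the flow is roughly: split documents into chunks, score each chunk against the user's question, and prepend the best match to the prompt. The toy sketch below uses word-overlap scoring instead of real embeddings, purely to show the shape of the pipeline (it is not A.I.R.I's actual implementation).

```python
# Toy RAG retrieval: chunk -> score -> prepend to prompt.
# Word-overlap scoring stands in for real embedding similarity.
def chunk(text: str, size: int = 40) -> list[str]:
    """Split a document into fixed-size word chunks."""
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]


def score(query: str, passage: str) -> int:
    """Count shared words between query and passage (toy similarity)."""
    return len(set(query.lower().split()) & set(passage.lower().split()))


def retrieve(query: str, chunks: list[str], k: int = 1) -> list[str]:
    """Return the k highest-scoring chunks for the query."""
    return sorted(chunks, key=lambda c: score(query, c), reverse=True)[:k]


def build_prompt(query: str, chunks: list[str]) -> str:
    """Prepend retrieved context to the user's question."""
    context = "\n".join(retrieve(query, chunks))
    return f"Context:\n{context}\n\nQuestion: {query}"
```

A real pipeline would swap `score` for embedding cosine similarity, but the control flow stays the same.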

submitted by /u/amithatprogrammer