What is the most unexpected thing you have gotten a local model to do?
Most local LLM use cases I see are chat, coding, and RAG. But with vision models getting better and faster on consumer hardware, I feel like there is a lot of untapped territory. I got a local VLM to play a board game by just looking at the screen and…
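For anyone wanting to try something similar, the screen-to-VLM loop can be sketched roughly like this. It assumes a local Ollama server with a vision model such as `llava` and follows Ollama's `/api/generate` API (base64 images alongside the text prompt); the actual screenshot step is only hinted at in a comment, since capture libraries vary by platform:

```python
import base64
import json
from urllib import request

# Default local Ollama endpoint; adjust if your server runs elsewhere.
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_payload(image_bytes: bytes, prompt: str, model: str = "llava") -> dict:
    """Build the JSON body Ollama's /api/generate expects for a vision
    prompt: images are passed as base64 strings next to the text prompt."""
    return {
        "model": model,
        "prompt": prompt,
        "images": [base64.b64encode(image_bytes).decode("ascii")],
        "stream": False,  # get one complete JSON response instead of a stream
    }


def ask_vlm(image_bytes: bytes, prompt: str, model: str = "llava") -> str:
    """Send a screenshot plus a question to the local VLM and return its answer."""
    body = json.dumps(build_payload(image_bytes, prompt, model)).encode("utf-8")
    req = request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


# Usage sketch (needs a running Ollama server and a capture library,
# e.g. PIL's ImageGrab or mss, to produce the PNG bytes):
#   png_bytes = grab_screen()  # hypothetical capture helper
#   print(ask_vlm(png_bytes, "Describe the board state and suggest a move."))
```

From there the game loop is just: capture, ask, parse the suggested move, apply it, repeat.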