What is the most unexpected thing you have gotten a local model to do?

Most local LLM use cases I see are chat, coding, and RAG. But with vision models getting better and faster on consumer hardware, I feel like there is a lot of untapped territory.

I got a local VLM to play a board game by just looking at the screen and it worked way better than I expected.
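One way a setup like that can work is to feed screenshots to a local VLM behind an OpenAI-compatible endpoint (as served by llama.cpp or Ollama, for example). This is just a hedged sketch of that idea, not the poster's actual code; the model name and endpoint are assumptions.

```python
# Hypothetical sketch: ask a local VLM for a board-game move from a
# screenshot, via an OpenAI-compatible chat completion payload.
import base64
import json


def build_move_request(png_bytes: bytes, model: str = "qwen2.5-vl") -> dict:
    """Build a chat payload with the screenshot inlined as a data URL."""
    b64 = base64.b64encode(png_bytes).decode("ascii")
    return {
        "model": model,  # assumed model name; substitute whatever you run locally
        "messages": [{
            "role": "user",
            "content": [
                {"type": "text",
                 "text": "Here is the current board. Reply with your next move only."},
                {"type": "image_url",
                 "image_url": {"url": f"data:image/png;base64,{b64}"}},
            ],
        }],
    }


# In a real loop you would capture the screen each turn (e.g. with the
# `mss` package) and POST this payload to something like
# http://localhost:8080/v1/chat/completions, then parse the reply as a move.
payload = build_move_request(b"fake png bytes for illustration")
print(json.dumps(payload)[:40])
```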

What is the weirdest or most unexpected thing you have used a local model for?

submitted by /u/Enough-Astronaut9278
