What is the most unexpected thing you have gotten a local model to do?

Posted by Enough-Astronaut9278@reddit | LocalLLaMA | 28 comments

Most local LLM use cases I see are chat, coding, and RAG. But with vision models getting better and faster on consumer hardware, I feel like there is a lot of untapped territory.

I got a local VLM to play a board game just by looking at the screen, and it worked far better than I expected.
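The post doesn't name a stack, but the screenshot-to-move loop could be sketched roughly like this. Everything here is an assumption: the `ollama` Python client, a locally pulled vision model such as `llava`, Pillow for the screen grab, and the `build_move_request` / `next_move` helper names are all hypothetical, not the poster's actual setup.

```python
import io


def build_move_request(png_bytes: bytes, game: str = "chess") -> dict:
    """Package a screenshot plus an instruction into one Ollama-style
    chat message (the `images` field carries raw image bytes)."""
    return {
        "role": "user",
        "content": f"This is a screenshot of a {game} board. "
                   "Reply with your next move in standard notation only.",
        "images": [png_bytes],
    }


def next_move(model: str = "llava") -> str:
    """Grab the screen, send it to a local VLM, return the suggested move."""
    # Third-party deps (Pillow, ollama) imported here so the pure helper
    # above stays dependency-free; both are assumptions, not confirmed.
    from PIL import ImageGrab
    import ollama

    shot = ImageGrab.grab()                # capture the current screen
    buf = io.BytesIO()
    shot.save(buf, format="PNG")           # encode the grab as PNG bytes
    reply = ollama.chat(
        model=model,
        messages=[build_move_request(buf.getvalue())],
    )
    return reply["message"]["content"].strip()


if __name__ == "__main__":
    print(next_move())
```

In practice you would loop this: apply the model's move in the game UI, re-capture, and ask again, with some validation since VLMs happily propose illegal moves.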

What is the weirdest or most unexpected thing you have used a local model for?