16GB VRAM x coding model

I’m looking for recommendations on coding models. I have a 5060 Ti with 16GB of VRAM. It’s a modest GPU, but it has been helping me build a lot of cool stuff at work.

Yesterday we had downtime with Codex and Claude Code, and I realized I really need a local “backup” model for coding.

I downloaded Qwen2.5 14B Coder, but I couldn’t get it to run properly in OpenCode: it would start generating and then stop. After searching online, I saw several people reporting the same issue.

So I started wondering: what other models could I run on my setup? What are you guys using?
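For a back-of-the-envelope answer to "what fits in 16GB", here's a rough VRAM estimate. The quantization level (4-bit) and the ~20% overhead factor for KV cache and runtime buffers are assumptions for illustration, not exact figures for any particular runtime:

```python
def estimate_vram_gib(params_billions: float, bits_per_weight: int = 4,
                      overhead: float = 0.2) -> float:
    """Rough VRAM needed (GiB) for a quantized model.

    Assumptions (hypothetical): weights quantized to `bits_per_weight`,
    plus `overhead` fraction extra for KV cache and buffers.
    """
    weight_bytes = params_billions * 1e9 * bits_per_weight / 8
    total_bytes = weight_bytes * (1 + overhead)
    return total_bytes / 2**30

# A 14B model at 4-bit quantization:
print(round(estimate_vram_gib(14), 1))  # well under 16 GiB
```

By this estimate a 14B model at Q4 takes around 8 GiB, so it fits on a 16GB card with room for a decent context window; the same math says an unquantized (FP16) 14B model would not fit.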

I’d love some recommendations, since I never know when I might need them (what if everything goes down at the same time lol).

submitted by /u/Junior-Wish-7453
