Why doesn’t any OSS tool treat llama.cpp as a first-class citizen?
Whether it's opencode, the VS Code Copilot extension, or whatever other "open source" AI tool, I rarely see llama.cpp treated as a first-class provider. Every single one of them supports Ollama, and sometimes LM Studio. Engineering-wise there's literally zero effort…
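
To make the "zero effort" point concrete: llama.cpp ships `llama-server`, which exposes an OpenAI-compatible HTTP API, so a provider integration mostly reduces to a configurable base URL. A minimal sketch (the helper name is mine; 8080 is `llama-server`'s default port, 11434 is Ollama's):

```python
import json
import urllib.request

def build_chat_request(base_url: str, messages: list[dict]) -> urllib.request.Request:
    """Build an OpenAI-style chat completion request against any base URL.

    Hypothetical helper: the point is that the wire format is identical
    for llama.cpp, Ollama, and LM Studio -- only base_url changes.
    """
    payload = json.dumps({
        "model": "default",  # llama-server serves whatever model it was launched with
        "messages": messages,
    }).encode("utf-8")
    return urllib.request.Request(
        url=f"{base_url}/v1/chat/completions",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Same helper, different backend -- just swap the base URL:
req = build_chat_request("http://localhost:8080",   # llama-server default
                         [{"role": "user", "content": "hi"}])
# urllib.request.urlopen(req)  # would hit a local llama-server if one is running
```

That is the whole integration surface for a non-streaming chat call, which is why treating llama.cpp as yet another OpenAI-compatible endpoint costs a tool almost nothing.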