Built myself a bit of a local LLM workhorse. What's a good model to try out with llama.cpp that will put my 56GB of VRAM to good use? Any other fun suggestions?

submitted by /u/SBoots
