LocalLLaMA

Venturing into the world of local LLMs, would love some pointers!

Hi everyone! These are very exciting times, when we can run models on laptops and consumer GPUs that would have been SOTA four years ago. I have been working with cloud models for years, and I am now starting to dig into local models. At work, I am …