LocalLLaMA

Do not fall into the trap of chasing the next scale or upgrade.

I mean, don't get me wrong, I love me some improvements and enhancements, and it keeps on giving… and with MTP (multi-token prediction) making its way to llama.cpp soon, a lot of you who aren't already running custom compiles are about to get a boost in inference speed.