Hi folks, I built this GUI application and have now open sourced it for others. What does it solve? It's basically a UI wrapper on top of llama-server. As you know, llama-server has a lot of flags, and people often post the optimal flags they found for their GPU here, so it's annoying to keep coming back to those threads or dig through your shell history when you actually need to run the command. This manager is where you can keep those flag sets and spin up llama-server whenever you need to.

Using AI tools, I also made some cool-looking graphs, and you can configure global settings such as host/port that stay consistent across all launches. Everything is completely local (SQLite). I do have plans to make community-shared recipes available, but I still have to figure out the security implications and a backend for that; until then, I hope you like it. There are prebuilt binaries for Windows, Linux, and macOS.

You can find the project here: https://github.com/coder3101/llama-recipe-manager
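For anyone curious about the core idea, here's a minimal sketch of what "storing flag recipes locally and rebuilding the launch command" looks like. This is illustrative only, not the app's actual code or schema: the table name, column names, and the `rtx3090-q4` recipe are made up, and the flags shown (`-m`, `-ngl`, `-c`) are standard llama-server options.

```python
import shlex
import sqlite3

# Hypothetical sketch: persist per-GPU llama-server flag "recipes" in a
# local SQLite table, then rebuild the full command on demand.
db = sqlite3.connect(":memory:")  # the real app would use a file on disk
db.execute("CREATE TABLE recipes (name TEXT PRIMARY KEY, flags TEXT)")
db.execute(
    "INSERT INTO recipes VALUES (?, ?)",
    ("rtx3090-q4", "-m models/llama-7b-q4.gguf -ngl 99 -c 8192"),
)

def build_command(name: str, host: str = "127.0.0.1", port: int = 8080) -> list[str]:
    """Combine global settings (host/port) with a saved flag recipe."""
    (flags,) = db.execute(
        "SELECT flags FROM recipes WHERE name = ?", (name,)
    ).fetchone()
    return ["llama-server", "--host", host, "--port", str(port), *shlex.split(flags)]

cmd = build_command("rtx3090-q4")
print(" ".join(cmd))
# From here a launcher would hand `cmd` to subprocess.Popen(cmd)
# to actually spin up the server.
```

The global host/port settings get merged with the per-recipe flags at launch time, which is why they stay consistent across every saved recipe.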