Author name: /u/FeiX7

LocalLLaMA

Creating Pi Extension with Pi and Qwen3.5 27B

Following my latest post about setting up Claude Code for use with local models, I received a recommendation in the comments to try **Pi**. The suggestion was based on its customizability and superior harness for local models. Unlike Claude Code, whi…

LocalLLaMA

Local Claude Code with Qwen3.5 27B

After long research into the best alternative for using a local LLM in OpenCode with llama.cpp, aiming for a fully local environment for coding tasks, I found this article: How to connect Claude Code CLI to a local llama.cpp server, how to disable telemetry, an…
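A minimal sketch of the kind of setup the excerpt describes, with several assumptions: the model filename and port are placeholders, and `llama-server` speaks an OpenAI-style API, so in practice Claude Code usually needs a translation proxy in between rather than pointing `ANTHROPIC_BASE_URL` straight at it.

```shell
# Start a local llama.cpp server (model path and port are illustrative)
llama-server -m ./Qwen3.5-27B-Q4_K_M.gguf --port 8080 &

# Point Claude Code at a local endpoint (typically a proxy that translates
# Anthropic-style requests to the llama.cpp server) and disable telemetry
export ANTHROPIC_BASE_URL="http://127.0.0.1:8080"
export DISABLE_TELEMETRY=1
```

This is a configuration sketch, not the article's exact recipe; the linked post covers the details.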
