GLM-5.1
submitted by /u/danielhanchen
Hey guys, you can now fine-tune Gemma 4 E2B and E4B in our free Unsloth notebooks! Training Gemma-4-E2B locally needs only 8GB of VRAM. Unsloth trains Gemma 4 ~1.5x faster with ~50% less VRAM than FA2 setups: https://github.com/unslothai/unsloth We al…