At the moment, models such as Qwen 3.6 35B/27B crush the competition, yet I can't help but notice a pattern. While the local RP scene abounds with Western model tunes (LLaMA, Mistral in all sizes, Nemo, and more recently Gemma 4, which is a powerhouse when set up correctly), we have an absolute tumbleweed desert of small local creative writing / RP models of Chinese origin. This is a shame, because the Chinese side holds much more relaxed views on copyright (and sometimes even on questionable content), and they could have made exceptional base models for the community.
To my knowledge, there are simply no prominent Chinese base models under 100B parameters suited for this (not even speaking of <40B). The entire Qwen series is atrocious for writing: dry and STEM-focused. By contrast, we have hundreds of vibrant Western tunes and merges on basically every theme, plus an entire ecosystem with players such as TheDrummer, ReadyArt, and SicariusSicarii. Granted, tuners can only alter so much when the data has been filtered out of the pretrain, as Google and Mistral do, but it's the best we have.
Why don't Chinese companies want to fill the creative writing / role-playing niche for local users the way they do with coding, image, and (formerly) video generation? They could have swayed a large portion of enthusiasts toward them and boosted their standing. Will this situation change in the future, or will small creative models continue to be ignored by them?