Why is no open weight model inference provider hosting Mimo-v2.5 or Mimo-v2.5-pro?

Literally no third-party API inference provider is hosting the Mimo-2.5 series models from Xiaomi. They seem to be really good.

High token efficiency and a very low hallucination rate compared to Kimi-k2.6, Deepseek-V4, or GLM-5.1, and yet no provider, not even Chutes, is hosting it other than Xiaomi themselves.

I find it very strange.

submitted by /u/True_Requirement_891
