MachineLearning

Why do big ML labs dominate widely used models when smaller labs could do RL on the many open-source pretrained models? [D]

I’m trying to understand why models from major labs (GPT, Claude, etc.) dominate real-world usage. You might say it's due to the expensive pretraining compute budget, but there already exist many open-source pretrained models at the same scale (e.g…