"Hardware is the only moat."
I read that quote yesterday, and at first, I thought it was just another person trying to sound smart on Twitter. But after the latest Anthropic + xAI developments, I’m starting to believe it.
Open source will probably win in the long run, and even xAI seems to have realized that. Based on what we’ve seen over the last couple of months from leading AI researchers, LLMs alone don’t seem capable of reaching AGI. Because of that, most frontier labs now appear to be focusing more on building products around their models and staying competitive rather than pursuing AGI directly.
If LLMs really do have a theoretical ceiling, then it’s only a matter of time before open source catches up completely.
What we do know is that inference is going to become even more competitive in the near future. Companies will likely start buying even more hardware and compute resources at massive scale to guarantee good performance for increasingly large models.
There’s also the trend of consumer hardware becoming even more expensive, since manufacturers are now prioritizing data center demand over consumer GPUs, creating shortages for regular users.
We’re already seeing just how happy the people who bought stacks of NVLink-capable 3090s are right now.
So, what do you guys think?
Should we wait, or should we upgrade ASAP?