
Meta just went proprietary. After years of championing open-source AI with Llama, the company’s first model from Superintelligence Labs signals a complete strategy reversal. Here’s what happened — and who gets burned.
For three years, Meta was the loudest voice in open-source AI. Llama models powered thousands of startups, research labs, and indie projects. Developers built entire companies on the assumption that Meta would keep shipping open weights.
That assumption just died.
On April 8, 2026, Meta released Muse Spark — its first major AI model from the newly formed Meta Superintelligence Labs. It’s proprietary. It’s closed. And it marks the most dramatic strategic pivot in the AI industry this year.
The timing isn’t accidental. After Llama 4 failed to impress developers, after Meta poured $14.3 billion into acquiring Scale AI talent, and after watching Anthropic and OpenAI race ahead — Mark Zuckerberg decided open source wasn’t fast enough.
I’ve been tracking Meta’s AI strategy closely, and this move has consequences that most news coverage is missing. This isn’t just a new model. It’s the end of an era.
Quick Answer: Meta Muse Spark is Meta’s first proprietary AI model, built by Alexandr Wang’s Superintelligence Labs team, marking a strategic shift away from Meta’s open-source Llama approach. The model powers Meta AI across Facebook, Instagram, and WhatsApp with multimodal understanding, reasoning modes, and an AI-driven shopping experience.
TL;DR — What You’ll Learn:
- Muse Spark is Meta’s first closed, proprietary AI model — a sharp break from its Llama open-source strategy
- The model was built in 9 months by Meta Superintelligence Labs after the Llama 4 launch flopped
- Meta spent $14.3 billion acquiring Scale AI’s Alexandr Wang to lead the effort
- Muse Spark includes a “Shopping Mode” that could reshape how 3.3 billion Meta users buy things online
- Developers who built on Llama face real uncertainty about Meta’s future open-source commitments
Table of Contents
- What Is Meta Muse Spark?
- Why Did Meta Abandon Open-Source AI?
- Inside Meta Superintelligence Labs
- How Does Muse Spark Compare to ChatGPT and Claude?
- The Shopping Mode Play That Nobody Is Talking About
- What This Means for Developers Who Built on Llama
- The Bigger Picture: Open Source vs. Proprietary in 2026
- FAQ
- Key Takeaways
What Is Meta Muse Spark?
Meta Muse Spark (originally codenamed “Avocado”) is a multimodal AI model that accepts voice, text, and image inputs and generates text-based responses. It’s the first model to ship from Meta Superintelligence Labs — the new AI division Meta created after CEO Mark Zuckerberg grew frustrated with the pace of Llama development.
The model is available now on Meta AI’s web app and the meta.ai website, with rollouts planned across Facebook, Instagram, and WhatsApp. It includes a standard fast mode for casual questions, multiple reasoning modes for complex problems, and a “Contemplating” mode that uses multiple AI agents working in parallel to solve harder tasks.
According to Meta’s official blog, the model was built with improved training techniques on rebuilt infrastructure, which let the team produce a smaller model that matches the performance of the older, larger Llama 4 variant while using roughly one-tenth the compute. All versions of Muse Spark are free to use, though Meta may eventually introduce rate limits.
What makes this significant isn’t the model’s raw capabilities — Meta’s own executives acknowledged to Axios that Muse Spark doesn’t represent a new state-of-the-art. Where it stands out is what it represents strategically: Meta’s first step toward building proprietary, closed AI products designed to compete directly with ChatGPT and Claude.
Key Insight: Muse Spark isn’t about beating benchmarks — it’s about Meta owning its AI stack instead of giving it away.
Why Did Meta Abandon Open-Source AI?
This is the question nobody in the official coverage is asking directly. So let me piece it together.
The Llama 4 failure was the trigger. When Meta released Llama 4 in April 2025, it was supposed to cement Meta’s position as the leader in open-source AI. Instead, according to CNBC, the release “failed to captivate developers.” Community reception was lukewarm. The models didn’t deliver the performance leap people expected. While Meta was shipping incremental open-source updates, OpenAI was building ChatGPT into a platform with 900 million weekly active users, and Anthropic’s Claude was dominating enterprise adoption with Cowork and Claude Code.
The competitive gap was widening, not shrinking. Meta’s open-source strategy was great for developer goodwill but terrible for competitive positioning. Every Llama improvement was immediately available to competitors. Google fine-tuned on Llama outputs. Startups used Llama to build products that competed with Meta AI. The open-source approach meant Meta was funding its own competition.
The Zuckerberg frustration factor. TechCrunch reported that Zuckerberg was “unhappy with the progress of Meta” and how Llama models lagged behind ChatGPT and Claude. That dissatisfaction led to a complete organizational restructuring — the creation of Meta Superintelligence Labs, the $14.3 billion deal with Scale AI, and aggressive hiring from OpenAI, Anthropic, and Google.
The financial pressure is real. Meta disclosed in its latest earnings that AI-related capital expenditures in 2026 will hit $115 billion to $135 billion — nearly double its capex from the previous year. When you’re spending that kind of money, the calculus on giving away your best work for free changes fast.
I think this is the honest version of events that Meta won’t say publicly: open source was a distribution strategy when Meta was behind. Now that they’re spending $130 billion a year on AI, they want to capture the value instead of sharing it.
Key Insight: Meta didn’t abandon open source because it stopped believing in it. Meta abandoned open source because it started spending $130 billion a year on AI and needed a return.
Inside Meta Superintelligence Labs
Meta Superintelligence Labs is the AI division that built Muse Spark, led by Alexandr Wang, the former co-founder and CEO of Scale AI, whom Meta brought on board in a deal worth $14.3 billion for a 49% stake in the data labeling company.
Wang joined Meta roughly nine months before Muse Spark launched, and his team rebuilt Meta’s entire AI stack from the ground up during that period. According to Meta’s blog, the lab “moved faster than any development cycle we have run before.” That’s a remarkable claim for a company that employs tens of thousands of engineers.
The team recruited aggressively from the top AI labs. TechCrunch reported that Meta hired researchers from OpenAI, Anthropic, and Google to staff the new division. The organizational signal was clear: Meta Superintelligence Labs operates with a different mandate than FAIR (Meta’s existing research division). This isn’t about publishing papers. It’s about shipping products.
Wang’s background at Scale AI is particularly relevant here. Scale AI is the dominant data labeling and AI training data company, with contracts across every major AI lab. Wang understands the data pipeline better than almost anyone in the industry. His approach to building Muse Spark reportedly emphasized training data quality and infrastructure efficiency over brute-force model scaling — which explains how the team achieved “order of magnitude less compute” for equivalent performance.
Meta has also positioned Muse Spark as “a first step” toward what they’re calling “personal superintelligence.” The vision, as described in their blog, is an AI that doesn’t just answer questions but understands users deeply enough to serve as a personal assistant, shopping advisor, and creative collaborator across all of Meta’s apps.
Key Insight: Alexandr Wang didn’t just bring talent to Meta — he brought Scale AI’s data infrastructure philosophy, which prioritizes training efficiency over parameter count.
How Does Muse Spark Compare to ChatGPT and Claude?

Let me be direct: Muse Spark is not the best AI model available right now, and Meta knows it.
In interviews with Axios, a Meta executive said Muse Spark is “competitive with the latest models from leading labs at certain tasks” — including multimodal understanding and health information processing — but acknowledged there’s still a gap in areas like coding.
Here’s how it stacks up based on available information:
Muse Spark’s strengths: Multimodal input (voice, text, images), visual STEM questions, health information, interactive experiences (minigames, appliance troubleshooting), and shopping recommendations powered by Meta’s social data.
Where it trails: Coding tasks, complex multi-step reasoning, and enterprise workflows where Claude (Cowork, Code) and ChatGPT (Codex) have established strong positions.
The distribution advantage nobody can match: Muse Spark is free, runs across Facebook, Instagram, and WhatsApp, and reaches 3.3 billion monthly active users. ChatGPT has roughly 900 million weekly users. Claude doesn’t publish user numbers but is significantly smaller. Meta doesn’t need the best AI; it needs an AI that’s good enough for 3.3 billion people.
The privacy trade-off: Users must log in with a Meta account. Meta’s privacy policy places few limits on how the company uses data shared with its AI system. This is fundamentally different from Anthropic’s approach (which emphasizes data minimization) and even OpenAI’s approach (which has clearer data usage boundaries for paid tiers). For developers and privacy-conscious users, this is a critical consideration.
The honest comparison: ChatGPT leads in breadth and coding. Claude leads in enterprise workflows and reasoning. Muse Spark leads in distribution and social integration. None of them wins across every dimension.
Key Insight: Meta’s AI strategy isn’t “build the smartest model.” It’s “build a good-enough model and put it in front of 3.3 billion people.”
The Shopping Mode Play That Nobody Is Talking About
While every headline focused on the open-source pivot, I think the most commercially significant feature in Muse Spark is Shopping Mode.
Here’s what it does: Shopping Mode combines Muse Spark’s language model capabilities with Meta’s enormous dataset on user interests, browsing behavior, and social connections to surface product recommendations. According to Meta, it “draws from the styling inspiration and brand storytelling already happening across our apps, surfacing ideas from the creators and communities people already follow.”
Translation: Meta is building an AI shopping assistant that knows what you like based on your Instagram follows, Facebook groups, and browsing history — then recommends products using that information.
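Meta hasn't described Shopping Mode's internals, but the mechanism above — combining interest signals and creator follows to rank products — can be sketched as a toy scorer. The tags, products, and the assumption that follow-graph overlap is weighted more heavily than generic interests are all invented for illustration.

```python
# Toy illustration of social-signal product ranking. Signals, products, and
# scoring weights are invented; Meta's actual system is not public.

def score(product_tags: set[str], interests: set[str], follows: set[str]) -> float:
    # Assumption: overlap with followed creators counts double vs. generic interests.
    return len(product_tags & interests) + 2 * len(product_tags & follows)

user_interests = {"running", "minimalist", "outdoors"}
user_follows = {"trailrunner_jo", "gearlab"}  # creator accounts the user follows

products = {
    "trail shoes": {"running", "outdoors", "trailrunner_jo"},
    "desk lamp": {"minimalist"},
    "rain shell": {"outdoors", "gearlab"},
}

ranked = sorted(products, key=lambda p: score(products[p], user_interests, user_follows),
                reverse=True)
print(ranked[0])  # → trail shoes
```

Even this toy version shows why the data moat matters: the ranking quality comes entirely from the signals, and Meta is the only company that has this particular set of them at this scale.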
The business implications are massive. Meta’s advertising business generated over $160 billion in revenue in 2025. If Muse Spark can turn Meta AI into a product discovery and purchase channel, that changes the entire economics of social commerce. Instead of showing users ads and hoping they click, Meta can deploy an AI that proactively recommends products during natural conversations.
I initially assumed Shopping Mode was a gimmick. But after reading Meta’s technical blog and the Axios interview, I think this might be the real reason Meta went proprietary. You don’t give away an AI model that’s designed to monetize your platform’s social graph. Those are the crown jewels.
Meta also mentioned experimenting with offering Muse Spark’s technology to third-party developers via an API — creating a new revenue stream. This is the OpenAI playbook: free consumer product, paid enterprise API.
Key Insight: Shopping Mode isn’t a feature — it’s Meta’s plan to turn AI into a revenue engine that monetizes 3.3 billion users’ social data.
What This Means for Developers Who Built on Llama
This is the section that matters most to the developer community, and I want to be honest about the uncertainty here.
Meta has said it plans to release a version of Muse Spark under an open-source license eventually. Axios reported this, and Meta’s blog expressed a “hope to open-source future versions of the model.” But “hope” and “plan” are doing a lot of heavy lifting in those statements.
Here’s what developers building on Llama need to consider:
Your existing Llama deployments still work. Meta hasn’t pulled Llama models. Llama 4 is still available. Nothing changes for current production systems running on Llama weights.
Future investment in Llama is uncertain. If Meta’s best researchers are now working on proprietary Muse models at Superintelligence Labs, who’s still working on Llama? Meta hasn’t clarified how resources will be split between the open-source Llama roadmap and the proprietary Muse roadmap. History suggests the proprietary track will get priority.
The “eventually open source” pattern has risks. Meta might release an older version of Muse Spark once the newer version ships — similar to how some companies open-source last-generation tech. That’s valuable but not the same as having access to the frontier model.
Alternatives exist and are improving. Google’s Gemma 4 (released this month) offers strong open models for reasoning and agentic workflows. Mistral continues shipping competitive open models. The open-source ecosystem is bigger than Meta.
My take: if you built critical infrastructure on the assumption that Meta would always ship frontier open-source models, you need a backup plan. Not because Llama is going away, but because the best Meta models will now be proprietary first and open-source (maybe) later.
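One concrete shape for that backup plan is a thin provider abstraction: route completions through your own interface so swapping Llama for another open model is a configuration change, not a rewrite. The sketch below uses stub backends; the class names and the `complete()` signature are my own, and real backends would wrap whatever inference client you actually run.

```python
from typing import Protocol

class ChatModel(Protocol):
    def complete(self, prompt: str) -> str: ...

class LlamaBackend:
    def complete(self, prompt: str) -> str:
        return f"[llama] {prompt}"  # stub; wrap your Llama serving stack here

class GemmaBackend:
    def complete(self, prompt: str) -> str:
        return f"[gemma] {prompt}"  # stub; wrap an alternative open model here

BACKENDS: dict[str, ChatModel] = {"llama": LlamaBackend(), "gemma": GemmaBackend()}

def complete(prompt: str, provider: str = "llama") -> str:
    # Application code depends on this function, not on any one vendor's SDK.
    return BACKENDS[provider].complete(prompt)

print(complete("hello"))                    # → [llama] hello
print(complete("hello", provider="gemma"))  # → [gemma] hello
```

The design choice here is deliberate: the `Protocol` means backends don’t inherit from anything, so adding a Mistral or Gemma deployment later is one new class and one registry entry.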
If you’re exploring open-source AI alternatives for your stack, I maintain an awesome-genai-toolkit repository on GitHub that tracks the best tools, frameworks, and models across the ecosystem — including non-Meta options worth evaluating right now.
Key Insight: Llama isn’t dead, but it’s no longer Meta’s priority. Developers should diversify their model dependencies.
The Bigger Picture: Open Source vs. Proprietary in 2026

Meta’s pivot is the latest data point in a broader trend: the era of frontier open-source AI may be ending.
Consider the landscape. OpenAI was never open source. Anthropic has always been proprietary. Google open-sources smaller models (Gemma) but keeps Gemini proprietary. Now Meta — the biggest champion of open-source AI — has gone proprietary for its frontier model.
The economics explain why. According to multiple reports, AI venture capital hit $242 billion in Q1 2026 alone — 80% of all global VC funding. Companies spending this kind of money need returns. Open-source models generate goodwill. Proprietary models generate revenue.
This doesn’t mean open-source AI dies. Llama, Gemma, Mistral, and others will continue to improve. The community around open-weight models is massive and self-sustaining. But it does mean the gap between the best open-source and the best proprietary models is likely to widen, not shrink.
For the AI industry, the Muse Spark launch is a signal: 2026 is the year the AI race moved from “build the best model” to “build the best business.” And businesses are built on proprietary advantages, not shared infrastructure.
Key Insight: The open-source AI era isn’t ending — but the frontier is going proprietary, and Meta just made that official.
Frequently Asked Questions
What is Meta Muse Spark?
Meta Muse Spark is Meta’s first proprietary AI model, built by Meta Superintelligence Labs under Alexandr Wang’s leadership. It accepts voice, text, and image inputs, produces text-based outputs, and powers Meta AI across Facebook, Instagram, WhatsApp, and the meta.ai website. It was released on April 8, 2026.
Why did Meta stop making open-source AI models?
Meta hasn’t officially abandoned open-source AI — it said it “hopes to open-source future versions.” But Muse Spark is proprietary because Meta’s Llama 4 launch underperformed, competitive pressure from OpenAI and Anthropic intensified, and Meta’s AI spending reached $115–135 billion per year in 2026, creating pressure to capture value rather than share it.
Is Muse Spark better than ChatGPT or Claude?
Muse Spark is competitive with leading models in multimodal understanding and health information but trails in coding and complex reasoning. Its advantage is distribution — it reaches Meta’s 3.3 billion monthly users for free. ChatGPT leads in coding, Claude leads in enterprise workflows, and Muse Spark leads in social integration.
How much did Meta spend on Muse Spark and Superintelligence Labs?
Meta invested $14.3 billion in Scale AI (acquiring a 49% stake and Alexandr Wang’s leadership) and disclosed $115–135 billion in total AI capital expenditures for 2026. The specific cost of developing Muse Spark has not been disclosed separately.
Will Meta open-source Muse Spark?
Meta has stated it “hopes to open-source future versions of the model,” but has not committed to a timeline. The current version of Muse Spark is proprietary and closed-source. Historical patterns suggest older versions may be open-sourced after newer proprietary models ship.
What is Meta’s Shopping Mode AI?
Shopping Mode is a Muse Spark feature that combines AI language capabilities with Meta’s social data — user interests, browsing behavior, creator follows — to deliver personalized product recommendations. It represents Meta’s strategy to turn AI into a social commerce revenue channel.
Is Muse Spark safe to use with personal data?
Users must log in with a Meta account (Facebook or Instagram). Meta’s privacy policy places few limits on how the company can use data shared with its AI system. Privacy-conscious users should review Meta’s data policy before sharing sensitive information with Muse Spark.
What happens to Llama now?
Existing Llama models remain available and unchanged. However, Meta’s best AI researchers are now focused on the proprietary Muse series at Superintelligence Labs, which raises questions about the pace and priority of future Llama development. Developers should monitor Meta’s roadmap and consider diversifying model dependencies.
Key Takeaways
- Meta Muse Spark is a proprietary model, marking Meta’s pivot away from its open-source Llama strategy after three years of championing open weights.
- The trigger was Llama 4’s failure to impress developers, combined with competitive pressure from OpenAI ($122B raise, 900M weekly users) and Anthropic (Claude Cowork, Mythos).
- Alexandr Wang rebuilt Meta’s AI stack in 9 months at Superintelligence Labs, achieving equivalent performance to Llama 4 with ~10× less compute.
- Shopping Mode is the real story — it turns Meta’s social data into a personalized AI commerce engine for 3.3 billion users.
- Meta’s AI spending ($115–135B in 2026) makes the economics of free, open-source frontier models unsustainable for the company.
- Developers who built on Llama should diversify — Llama isn’t dead, but it’s no longer the priority track for Meta’s best researchers.
- The broader trend is clear: frontier AI is going proprietary in 2026, and Meta just made it official.
If this analysis helped you think about the AI market differently, give it a clap and drop a comment with your take — is Meta right to go proprietary? I respond to every comment. Follow me for weekly deep dives on AI strategy, open-source tools, and the business of artificial intelligence.
About the Author
Shubh is an AI-focused writer and developer who covers the open-source AI ecosystem, developer tools, and generative AI strategy on Medium. He maintains the awesome-genai-toolkit on GitHub — a curated resource tracking the best AI tools and frameworks. Follow for contrarian analysis on AI business trends and technical deep dives.
All claims verified as of April 9, 2026. Last updated: April 2026.
Meta Muse Spark: Why Meta Abandoned Open-Source AI (And What It Means) was originally published in Towards AI on Medium, where people are continuing the conversation by highlighting and responding to this story.