
Meta plans open-source relaunch for its AI models — first release under Alexandr Wang


Meta wants to release its next generation of AI models as open source. It would be the first model launch under new AI chief Alexandr Wang — and a clear signal against the company's recent drift toward closed source.


Meta is planning to release its next generation of AI models as open source. Gizmodo reported this on April 6, citing internal information — it would be the first major model launch under the leadership of Alexandr Wang, the Scale AI founder Meta installed as its new AI chief following the Scale acquisition. For anyone waiting on the Llama successor, this is the most important story of the week.

What we know so far

Gizmodo writes that Meta is internally working to release the new model family under an open-source licence. Concrete details about size, architecture or release date are still missing — what’s clear is that the decision is a deliberate return to the open-model approach. Meta had built up a strong open-source position with Llama 3 and 4 but more recently leaned toward a more closed stance with some models and research papers, which had created unease in the community.

Wang’s role makes this especially interesting. As the founder of Scale AI, he has a deep understanding of how training data moves the needle — and he comes from a world where open models and commercial data pipelines coexist just fine. If Meta is going open source again under Wang, it’s not a PR move but a strategic call about the company’s place in the ecosystem.

Why it matters

The past few months have been mixed for open-source LLMs. Alibaba's Qwen line has shipped impressive releases, ranging up to 30B+ parameters and 1M-token context windows, Gemma 4 is out, Mistral keeps delivering — but Meta, the original open-source champion, has been unusually quiet. A large model released under Apache 2.0 or a comparable licence would shift the balance noticeably.

Second, this is a signal to Meta’s own research org. FAIR has been losing talent in recent months, with some departures explicitly citing the shrinking flow of published work. An open-source relaunch would also be an internal statement: Wang positioning himself as someone who takes the open community seriously again.

Takeaway

For those of us building on top of these models, the pragmatic move is to wait for hard facts — model size, licence text, benchmarks. The big question is whether Meta shows up with a model that can actually compete with Qwen and DeepSeek in the open-source league, or whether the release remains a symbolic gesture. What we can already say today: if you’re planning an open-source strategy for 2026, keep the release window open.
