Zuckerberg Bets Big on ‘Super Intelligence’ with Over Half of New Team Poached from OpenAI

On June 30, local time, social media giant Meta announced a major restructuring of its AI operations, including the formation of Meta Superintelligence Labs. In an internal memo, Meta founder Mark Zuckerberg said the new lab will bring together all of the company’s foundational research and product teams, including the Facebook AI Research (FAIR) lab, along with a new division dedicated to developing the next generation of AI models.

The new Meta Superintelligence Labs will be led by two people: Alexandr Wang, former CEO of Scale AI, who will also serve as Meta’s Chief AI Officer, and Nat Friedman, former CEO of GitHub (owned by Microsoft), who will head AI products and applied research.

“We will simultaneously begin the development of the next generation of models, aiming to achieve industry-leading levels by next year,” Zuckerberg stated in the memo. He further mentioned that Meta’s long-term goal is to create a “superintelligent system” for everyone.

Meta employees have confirmed the content of the internal memo.

On June 12, Meta announced it had invested approximately $14.3 billion for a 49% stake in the startup Scale AI, bringing its CEO, Wang, on board. Wang, born in 1997 to Chinese immigrant parents, dropped out of MIT in 2016 and founded Scale AI in San Francisco. The company specializes in data labeling for tech companies such as Microsoft, OpenAI, and Meta; during the COVID-19 pandemic, Wang was even roommates with OpenAI CEO Sam Altman. Following Wang’s departure, Scale AI has been led on an interim basis by its Chief Strategy Officer, Jason Droege.

Meta refrained from fully acquiring Scale AI to avoid triggering a U.S. government review. In 2020, the Federal Trade Commission (FTC) sued Meta, alleging that its acquisitions of competitors such as Instagram and WhatsApp were aimed at maintaining its monopoly in the social networking market. The case is still pending.

Beyond acquiring a stake in Scale AI and recruiting its founder, Zuckerberg has been aggressively poaching talent across Silicon Valley, drawing strong complaints from Sam Altman. On a June podcast, Altman publicly complained that Meta views OpenAI as its biggest competitor and has gone as far as offering signing bonuses of up to $100 million, plus higher annual pay, to lure away OpenAI employees.

Zuckerberg was unapologetic in his internal memo. Besides the two new leaders, he announced that 11 people had joined Meta Superintelligence Labs, including 6 from OpenAI, 3 from Google, and 1 from the AI startup Anthropic. Zuckerberg singled out the new recruits’ track records: Trapit Bansal worked on OpenAI’s o-series models; Shuchao Bi helped develop GPT-4o’s voice mode and o4-mini; Huiwen Chang worked on GPT-4o’s image generation; Ji Lin contributed to o3/o4-mini, GPT-4o, GPT-4.1, GPT-4.5, and 4o-imagegen; Hongyu Ren worked on GPT-4o, 4o-mini, o1-mini, o3-mini, o3, and o4-mini; and Jiahui Yu contributed to o3, o4-mini, GPT-4.1, and GPT-4o.

“I’m very excited to join Meta! Building superintelligence is one of the greatest challenges and opportunities humanity faces,” said Jack Rae, who previously worked on pre-training for Google’s Gemini models.

Notably, 7 of the 11 new members are of Chinese descent. Most graduated from top Chinese universities such as Tsinghua University and Peking University, earned their PhDs in the U.S., and then worked for American tech companies.

Zuckerberg’s urgency in recruiting stems from Meta gradually falling behind in the large language model (LLM) race since the end of last year. After OpenAI launched ChatGPT in late 2022, Meta tried to catch up by open-sourcing its large model family, Llama, which briefly became the benchmark for open-source models. By the end of 2024, however, as DeepSeek’s open-source models drew global attention, Meta’s costlier models began to lose their competitive edge.

In April 2025, Meta open-sourced its multimodal Llama 4 models, its first to use the mixture-of-experts (MoE) architecture that DeepSeek had also adopted. However, Meta quickly drew criticism when it emerged that the model submitted for leaderboard scoring was not the same as the version released to the public. The tuned leaderboard model scored higher, while the publicly released model performed worse than the model Musk’s xAI had released earlier.
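For readers unfamiliar with the term, a mixture-of-experts layer replaces a single dense feed-forward block with several smaller “expert” networks and a router that sends each token to only a few of them, so total parameters can grow without a matching increase in per-token compute. The sketch below is a minimal, generic illustration in PyTorch under that assumption; all names (SimpleMoE, top_k, and so on) are hypothetical, and it does not reflect Meta’s Llama 4 or DeepSeek’s actual implementations.

```python
# Minimal top-k mixture-of-experts (MoE) layer, for illustration only.
# Hypothetical names; not Meta's or DeepSeek's implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleMoE(nn.Module):
    def __init__(self, d_model: int, d_hidden: int, n_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        # Router: scores each token against every expert.
        self.router = nn.Linear(d_model, n_experts)
        # Experts: small independent feed-forward networks.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_hidden), nn.GELU(), nn.Linear(d_hidden, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model) -> flatten to individual tokens.
        tokens = x.reshape(-1, x.size(-1))
        # Keep only the top-k experts per token and renormalize their weights.
        weights = F.softmax(self.router(tokens), dim=-1)
        top_w, top_idx = weights.topk(self.top_k, dim=-1)
        top_w = top_w / top_w.sum(dim=-1, keepdim=True)
        out = torch.zeros_like(tokens)
        # Each token is processed only by its selected experts, so per-token
        # compute stays well below a dense layer of equal total parameter count.
        for e, expert in enumerate(self.experts):
            mask = (top_idx == e)
            token_ids, slot = mask.nonzero(as_tuple=True)
            if token_ids.numel() == 0:
                continue
            out[token_ids] += top_w[token_ids, slot].unsqueeze(-1) * expert(tokens[token_ids])
        return out.reshape_as(x)

# Example: 4 tokens of width 16 routed across 8 experts, 2 active per token.
if __name__ == "__main__":
    layer = SimpleMoE(d_model=16, d_hidden=64, n_experts=8, top_k=2)
    y = layer(torch.randn(1, 4, 16))
    print(y.shape)  # torch.Size([1, 4, 16])
```

Production MoE systems add details this illustration leaves out, such as load-balancing losses and expert parallelism across devices.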

In the memo, Zuckerberg said he had spent the past few months meeting with experts across the industry, and that even more people would be joining Meta in the coming weeks. The same day, Meta’s stock rose 0.61% to $738.09 per share, an all-time high since the company went public in 2012.

