Meta Platforms scoops up AI networking chip team from Graphcore

(Reuters) - Meta Platforms Inc (META.O) has hired an Oslo-based team that until late last year was building artificial-intelligence networking technology at British chip unicorn Graphcore.

A Meta spokesperson confirmed the hiring in response to a request for comment after Reuters identified 10 people whose LinkedIn profiles said they worked at Graphcore until December 2022 or January 2023 and subsequently joined Meta in February or March of this year.

"We recently welcomed a number of highly-specialized engineers in Oslo to our infrastructure team at Meta. They bring deep expertise in the design and development of super-computing systems to support AI and machine learning at scale in Meta's data centers," said Jon Carvill, the Meta spokesperson.

The move brings additional muscle to the social media giant's bid to improve how its data centers handle AI work, as it races to cope with demand for AI-oriented infrastructure from teams across the company looking to build new features.

Meta, which owns Facebook and Instagram, has become increasingly reliant on AI technology to target advertising, select posts for its apps' feeds and purge banned content from its platforms.

On top of that, it is now rushing to join competitors like Microsoft Corp (MSFT.O) and Alphabet Inc's (GOOGL.O) Google in releasing generative AI products capable of creating human-like writing, art and other content, which investors see as the next big growth area for tech companies.

The 10 employees' job descriptions on LinkedIn indicated the team had worked on AI-specific networking technology at Graphcore, which develops computer chips and systems optimized for AI work.

Carvill declined to say what they would be working on at Meta.

Graphcore closed its Oslo office as part of a broader restructuring announced in October last year, a spokesperson for the startup said, as it struggled to make inroads against U.S.-based firms like Nvidia Corp (NVDA.O) and Advanced Micro Devices Inc (AMD.O), which dominate the market for AI chips.

Meta already has an in-house unit designing several kinds of chips aimed at speeding up and maximizing efficiency for its AI work, including a network chip that performs a sort of air traffic control function for servers, two sources told Reuters.

Efficient networking is especially useful for modern AI systems like those behind chatbot ChatGPT or image-generation tool Dall-E, which are far too large to fit onto a single computing chip and must instead be split up over many chips strung together.
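As a rough illustration only (not a description of Meta's or Graphcore's actual systems, and using hypothetical device counts and layer sizes), the sketch below shows the basic idea: a single large neural-network layer is split across several devices, each device computes a partial result, and the pieces must then be gathered back together. That gather step stands in for the cross-chip traffic that dedicated networking hardware is meant to keep flowing.

# Illustrative sketch of splitting one large layer across several "devices".
# In a real cluster each shard would live on a separate accelerator, and the
# final gather would travel over the data-center network.
import numpy as np

NUM_DEVICES = 4                     # hypothetical accelerator count
BATCH, D_IN, D_OUT = 8, 1024, 4096  # hypothetical layer dimensions

x = np.random.randn(BATCH, D_IN).astype(np.float32)
w = np.random.randn(D_IN, D_OUT).astype(np.float32)

# Shard the weight matrix column-wise, one shard per device.
w_shards = np.split(w, NUM_DEVICES, axis=1)

# Each "device" computes its slice of the output independently.
partial_outputs = [x @ shard for shard in w_shards]

# Reassembling the slices models the all-gather traffic that crosses
# the cluster network in practice.
y = np.concatenate(partial_outputs, axis=1)

assert y.shape == (BATCH, D_OUT)
print("combined output shape:", y.shape)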

A new category of network chips has emerged to help keep data moving smoothly within those computing clusters. Nvidia, AMD and Intel Corp (INTC.O) all make such network chips.

In addition to its network chip, Meta is also designing a complex computing chip to both train AI models and perform inference, a process in which the trained models make judgments and generate responses to prompts, although it does not expect that chip to be ready until around 2025.

Graphcore, one of the UK's most valuable tech startups, was once seen by investors including Microsoft and venture capital firm Sequoia as a promising challenger to Nvidia's commanding lead in the market for AI chip systems.

However, it faced a setback in 2020 when Microsoft scrapped an early deal to buy Graphcore's chips for its Azure cloud computing platform, according to a report by the UK newspaper The Times. Microsoft instead used Nvidia's GPUs to build the massive infrastructure powering ChatGPT developer OpenAI, which Microsoft also backs.

Sequoia has since written down its investment in Graphcore to zero, although it remains on the company's board, according to a source familiar with the relationship. The write-down was first reported by Insider in October.

The Graphcore spokesperson confirmed the setbacks but said the company was "perfectly positioned" to take advantage of accelerating commercial adoption of AI.