Meta and Broadcom Extend MTIA Partnership to 1GW+ Commitment — First AI Chip on 2nm Process
Meta and Broadcom announced an extended partnership on April 14, 2026, to co-develop multiple generations of Meta's MTIA (Meta Training and Inference Accelerator) custom AI chip, with Meta committing to deploy more than 1 gigawatt of MTIA-powered compute in the first phase of what the companies described as a sustained multi-gigawatt rollout. The agreement extends the existing collaboration through 2029, covers four new generations of MTIA silicon, and advances Meta's strategy of reducing dependence on third-party AI accelerators for its core ranking, recommendation, and generative AI workloads.
The new MTIA chips will be built on a 2nm process node, making them the first AI accelerators to reach that density milestone. They will use Broadcom's XPU platform, which enables custom silicon tightly optimized around Meta's specific training and inference workload profiles. The deal also encompasses co-development of advanced packaging and networking, meaning Broadcom will work with Meta across the full silicon stack rather than on chip design alone. Meta plans to deploy the four new MTIA generations within two years to support AI experiences across its apps, which serve more than 3.5 billion daily active users.
As part of the announcement, Broadcom President and CEO Hock Tan will step down from Meta's board of directors and transition to an advisory role focused on Meta's custom silicon roadmap. The move formalizes what had been an arm's-length relationship into a deeper structural partnership. Analysts at The Motley Fool described the deal as a significant win for Broadcom investors, noting that a multi-gigawatt commitment from Meta represents years of high-margin chip design revenue and positions Broadcom to capture a larger share of the custom AI silicon market alongside its existing Alphabet and Apple engagements.
Sources
CNBC, Meta, Broadcom, SiliconANGLE