Meta and Broadcom have entered a multi-year agreement to co-develop several generations of Meta’s MTIA silicon. The initial deployment will exceed one gigawatt of compute capacity, with a broader multi-gigawatt rollout extending through 2029. The collaboration deepens an existing relationship and places Broadcom’s XPU platform at the core of Meta’s AI infrastructure.
The agreement covers chip design, advanced packaging, and Ethernet-based networking, supporting both current and future MTIA models. The MTIA series targets inference and recommendation workloads across Meta’s platforms, while newer versions such as the MTIA 450 and MTIA 500 extend into generative AI inference, with large-scale deployment expected in 2027.
Shift Toward In-House AI Hardware
Meta is also reducing its reliance on merchant silicon. The company has developed multiple MTIA variants, including the 300, 400, 450, and 500 series, some of which are already running in production. It continues to accelerate its custom chip roadmap, targeting faster development cycles.
Limitations remain, however. Meta has paused development of its advanced training chip project and still depends on external GPUs for large-scale model training. Its strategy therefore blends internal innovation with selective reliance on existing hardware ecosystems.
Broadcom Strengthens Industry Position
Broadcom, meanwhile, continues to expand its role in custom AI silicon across major technology firms. The company has secured long-term agreements to develop advanced processing units and supply large-scale compute capacity, reinforcing its position as a central supplier of custom chip infrastructure.
Broadcom now plays a critical role in shaping AI hardware strategies across competing platforms. As demand for AI compute rises, such partnerships underscore an industry-wide shift toward specialized silicon and integrated system design.