Meta Drops 4 Custom MTIA Chips at Once — A New Chip Every 6 Months
Meta unveils MTIA 300/400/450/500 custom AI chips built on RISC-V. Modular design enables 6-month release cadence to reduce Nvidia dependency.
Hundreds of Thousands of Chips Already Running
Meta just announced four generations of its custom AI silicon (MTIA 300, 400, 450, and 500) all at once. In an industry where a single chip generation typically takes two to three years, Meta is shipping four in under two.
What Happened
MTIA (Meta Training and Inference Accelerator) chips are built on open-source RISC-V architecture, designed with Broadcom, and fabricated by TSMC. The secret sauce is modular design — like Lego blocks, components can be swapped and upgraded. That's how Meta achieves a new chip every six months.
MTIA 300 is already deployed in production, handling content ranking across Facebook and Instagram. The 400, 450, and 500 will take on GenAI inference (the process by which AI models generate responses) through 2027.
Why It Matters
The biggest bottleneck in AI right now is GPU supply. Depending on Nvidia means ceding control over both pricing and availability. Meta has already deployed hundreds of thousands of MTIA chips for inference workloads, actively reducing that dependency. This isn't just cost-cutting — it's a strategic supply chain play.
Going Deeper
Interestingly, Meta still buys Nvidia GPUs at massive scale for training while shifting inference to custom silicon — a hybrid strategy. Training stays on Nvidia; inference goes in-house. Following Google (TPU) and Amazon (Trainium/Inferentia), Meta is now the third hyperscaler going all-in on custom AI chips.
Bottom Line
Meta's simultaneous four-generation MTIA launch is the most aggressive move yet toward "AI without Nvidia." A six-month chip cadence is an industry first.