Microsoft has announced that Azure's US Central datacentre region is the first to receive its new artificial intelligence (AI) inference accelerator, Maia 200. The Maia 200 is Microsoft's latest custom-designed chip, built specifically to address the requirements of AI inference workloads, and aims to outperform Google's TPU v7 and AWS Inferentia with 10 petaflops of FP4 compute. Azure's Scott Guthrie claims the inference-optimised chip is 30% cheaper than any other AI silicon on the market today.