January 26, 2026
Microsoft announces powerful new chip for AI inference
Microsoft has announced its latest chip, the Maia 200, which the company describes as a silicon workhorse built for scaling AI inference.

TL;DR
- Microsoft announced the Maia 200, a chip built for scaling AI inference.
- The Maia 200 is designed to run AI models faster and more efficiently.
- It features over 100 billion transistors, delivering significant performance gains over the Maia 100.
- Optimizing inference costs is a key focus for maturing AI companies.
- Microsoft aims for Maia 200 to reduce disruption and power usage for AI businesses.
- The chip is part of a trend of tech giants developing in-house chips to reduce reliance on NVIDIA.
- Maia 200 is positioned to compete with Google's TPUs and Amazon's Trainium chips.
- Microsoft's Superintelligence team and Copilot are already using the Maia 200.
- A software development kit for the Maia 200 is available to external parties.