Maia 200 is the most efficient inference system Microsoft has ever deployed, with 30% better performance per dollar than the latest ...
The hyperscaler leverages a two-tier Ethernet-based topology, a custom AI Transport Layer, and software tools to deliver a tightly integrated, low-latency platform ...
Microsoft has unveiled its Maia 200 AI accelerator, claiming triple the inference performance of Amazon's Trainium 3 and superiority over Google's TPU v7.