Microsoft Azure Maia and Microsoft Azure Cobalt are the new custom silicon chips for cloud workloads
Following Google and Amazon’s AWS, Microsoft today announced its own custom silicon for cloud workloads. Google has TPUs for AI acceleration, while Amazon’s AWS offers Graviton, Inferentia and Trainium. Microsoft’s new custom-built silicon is targeted at AI and enterprise workloads in the Microsoft Cloud. Microsoft highlighted that the new custom silicon offerings will be available alongside established silicon offerings from Intel, AMD, NVIDIA and others.
Azure Maia and Azure Cobalt were built with a holistic view of hardware and software systems to optimize performance and price.
Microsoft today announced the following new custom silicon chips:
- Microsoft Azure Maia
- Microsoft Azure Cobalt
Microsoft Azure Maia is an AI accelerator chip designed to run cloud-based training and inferencing for AI workloads such as OpenAI models, Bing, GitHub Copilot and ChatGPT. Maia 100 is the first generation in the series and consists of 105 billion transistors, making it one of the largest chips manufactured on 5nm process technology. Azure Maia 100 is not just the silicon; it involves work spanning silicon, software, networking, racks, and cooling capabilities. Azure Maia will compete with Google TPU and AWS Inferentia and Trainium.
Microsoft Azure Cobalt is a cloud-native chip based on Arm architecture, optimized for performance, power efficiency and cost-effectiveness in general-purpose workloads. Cobalt 100 is the first generation in the series: a 64-bit, 128-core chip that can deliver up to a 40% performance improvement over the current generation of Azure Arm chips. Microsoft mentioned that Cobalt is already powering services such as Microsoft Teams and Azure SQL. Azure Cobalt will compete with AWS Graviton.
Both the Azure Maia and Azure Cobalt chips will start rolling out to Microsoft’s datacenters early next year. Initially, these chips will power Microsoft’s own services, such as Microsoft 365 Copilot and Azure OpenAI Service.
The Maia 100 AI Accelerator was also designed specifically for the Azure hardware stack, said Brian Harry, a Microsoft technical fellow leading the Azure Maia team. That vertical integration – the alignment of chip design with the larger AI infrastructure designed with Microsoft’s workloads in mind – can yield huge gains in performance and efficiency, he said.