Microsoft announces availability of cost-effective AMD MI300X accelerator-based VMs on Azure
Key notes
- Microsoft claims this new AMD accelerator is currently the most cost-effective GPU available for Azure OpenAI.
At Build 2024, Microsoft today announced the general availability of AMD MI300X accelerator-based VMs on Azure. The new ND MI300X VM series makes Azure the first cloud platform to offer AMD's Instinct MI300X accelerator to customers. Each ND MI300X VM combines eight AMD Instinct MI300X accelerators, delivering strong price-performance for inference. Microsoft claims this new AMD accelerator is currently the most cost-effective GPU available for Azure OpenAI.
The AMD MI300X is based on the next-gen AMD CDNA 3 accelerator architecture and supports up to 192 GB of HBM3 memory. With that much memory, customers can fit a large language model of up to 40B parameters on a single MI300X accelerator. The AMD Instinct Platform brings together eight MI300X accelerators in one system.
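As a rough back-of-the-envelope illustration of why a 40B-parameter model fits on one accelerator, the sketch below estimates weights-only memory. The 16-bit precision and weights-only assumption are illustrative choices, not Microsoft's or AMD's sizing guidance, and real deployments also need room for KV cache, activations, and runtime overhead.

```python
# Sketch: estimate whether a model's weights alone fit in one MI300X's 192 GB of HBM3.
# Assumptions (not from the announcement): 16-bit (2-byte) weights, no KV cache or
# activation memory counted.

HBM3_PER_MI300X_GB = 192

def weights_memory_gb(num_params_billions: float, bytes_per_param: int = 2) -> float:
    """Memory needed to hold the model weights alone, in gigabytes."""
    return num_params_billions * 1e9 * bytes_per_param / 1e9

if __name__ == "__main__":
    for params_b in (7, 13, 40, 100):
        needed = weights_memory_gb(params_b)
        verdict = "fits" if needed <= HBM3_PER_MI300X_GB else "does not fit"
        print(f"{params_b}B params @ 16-bit ~= {needed:.0f} GB -> {verdict} in {HBM3_PER_MI300X_GB} GB")
```

A 40B-parameter model at 16-bit precision needs roughly 80 GB for weights, well within the 192 GB of HBM3 on a single MI300X, which is the point behind Microsoft's single-accelerator claim.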