Intel Gaudi 2 crushes Nvidia H100 in AI training; Stable Diffusion 3 runs faster & cheaper, too

Nvidia boss once said H100 is "so good that even when the competitor's chips are free, it's not cheap enough."




Key notes

  • Nvidia H100 praised by CEO Huang as top chip for AI.
  • However, Stability AI claims Intel’s Gaudi 2 outperforms H100 in AI training.
  • Gaudi 2 offers solid performance, cheaper costs, and faster inference speeds, according to Stability AI.

Nvidia’s H100 has been the talk of the town for quite some time. The company’s boss, Jensen Huang, even went as far as saying it’s the best chip for AI training and inference — better even than Intel’s Gaudi 2.

He said on the record, during his keynote speech at the 2024 SIEPR Economic Summit, that the H100 is “so good that even when the competitor’s chips are free, it’s not cheap enough.”

However, that may not be the whole truth — or at least that’s what Stability AI has recently said. The AI startup, which created the Stable Diffusion models, claims that Intel’s Gaudi 2 chips have also shown impressive performance on Stable Diffusion 3, running its multimodal diffusion transformer architecture faster than Nvidia’s H100s in scaled training, even before FP8 optimizations.

The cost-effective Gaudi 2 also delivers fast inference speeds, with 673 tokens per second observed on the upcoming StableBeluga 2.5 70B model, Stability AI’s fine-tuned version of LLaMA 2 70B built on the Stable Beluga 2 model.

Positioned between Nvidia’s A100 and H100 in terms of performance, Gaudi 2 chips offer solid performance with 96 GB of VRAM and a 2.4 Tb/s interconnect at a lower cost, making them a compelling choice for AI tasks.

Stable Diffusion 3, Stability AI’s answer to OpenAI’s DALL-E 3, is the company’s upcoming text-to-image model, soon to be available in early preview. It will come in a range of sizes, from 800M to 8B parameters.

You can read more on Stability AI’s findings here.