Mixtral 8x22B open-source model has finally arrived, downloadable via Torrent

It's been a busy few days in the AI industry so far


Key notes

  • Mistral launched its frontier model, Mixtral 8x22B, as the race for AI heats up.
  • The open-source model has 176B parameters and a context length of 65K tokens.
  • It’s downloadable via Torrent.

Mistral is back with a big announcement. Earlier on Wednesday, the French, Microsoft-backed company launched its frontier model, Mixtral 8x22B, as the race for AI heats up.

The best part? It’s an open-source system, and you can download all 281GB of it via Torrent; the AI startup posted the magnet link on X (formerly known as Twitter). It’s also now available on Hugging Face and Perplexity AI Labs.

Mistral’s team is largely made up of former Google and Meta employees. The company’s predecessor model, Mixtral 8x7B, launched back in December last year and is said to have outperformed rivals like Llama 2 70B and OpenAI’s GPT-3.5 on benchmarks such as MMLU, ARC Challenge, and MBPP.

Now, Mixtral 8x22B has 176B parameters and a context length of 65K tokens. Despite its size, it’s a sparse mixture-of-experts model: only a fraction of those parameters (about 44B) is active for any given token, which makes it more affordable to run than its total size suggests.
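The arithmetic behind that 44B figure can be sketched as follows. This is a back-of-the-envelope illustration, not an official spec: the 8-experts / 2-active routing matches Mistral’s published Mixtral design, but treating all 176B parameters as evenly split expert capacity is a simplifying assumption.

```python
# Rough sketch of why a sparse mixture-of-experts (MoE) model like
# Mixtral 8x22B is cheaper to run per token than its total size suggests.

TOTAL_PARAMS_B = 176   # reported total parameter count, in billions
NUM_EXPERTS = 8        # experts per MoE layer (Mixtral design)
ACTIVE_EXPERTS = 2     # experts each token is routed to (Mixtral design)

# Simplification: treat all weights as expert capacity. The fraction
# actually exercised per token is then ACTIVE_EXPERTS / NUM_EXPERTS.
active_fraction = ACTIVE_EXPERTS / NUM_EXPERTS
active_params_b = TOTAL_PARAMS_B * active_fraction

print(f"Active per token: ~{active_params_b:.0f}B of {TOTAL_PARAMS_B}B")
# ~44B of 176B, matching the figure reported for the model
```

In practice some weights (attention, embeddings) are shared across all tokens, so the real split is not this clean, but the ratio explains the headline numbers.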

It has undoubtedly been one of the busiest 24 hours in the AI industry. Google has finally made Gemini 1.5 Pro available in most countries and rebranded its Duet AI for Developers as Gemini Code Assist, moving its infrastructure from Codey to Gemini. 

In other news, The Information has exclusively reported that Meta, Facebook’s owner, is ready to launch a smaller Llama 3 version next week.
