Mixtral 8x22B open source model has finally arrived, downloadable via Torrent


Mistral returns with a big announcement. Earlier on Wednesday, the Microsoft-backed French company launched its frontier model, Mixtral 8x22B, as the AI race heats up.

The best part? It is an open-source model, and you can download it via torrent; the file weighs in at 281 GB. The AI startup posted the magnet link on X (formerly known as Twitter). The model is also available on Hugging Face and Perplexity AI Labs.

Mistral is primarily made up of former Google and Meta employees. The company's predecessor model, Mixtral 8x7B, was released in December last year and is said to have outperformed rivals like Meta's Llama 2 70B and OpenAI's GPT-3.5 on certain benchmarks, including MMLU, ARC Challenge, and MBPP.

Mixtral 8x22B has 176B parameters and a context length of 65K tokens. Although the model is huge, it only activates a fraction of its parameters (about 44B) for each token, making it cheaper to run.
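The gap between total and active parameters comes from the sparse Mixture-of-Experts design implied by the "8x22B" name: each layer holds several expert blocks, but a router sends each token through only a couple of them. The back-of-envelope arithmetic can be sketched as below; the expert count and experts-per-token values are assumptions based on the naming convention, and real MoE models share attention weights across experts, so the true totals differ from this naive estimate.

```python
# Rough parameter accounting for a sparse Mixture-of-Experts (MoE) model.
# Assumptions (not confirmed by the article): 8 experts of ~22B parameters
# each, with 2 experts routed to per token (a common MoE configuration).

NUM_EXPERTS = 8               # expert blocks suggested by the "8x" in the name
EXPERTS_PER_TOKEN = 2         # experts the router activates for each token
PARAMS_PER_EXPERT = 22_000_000_000  # ~22B parameters per expert

# Total parameters stored on disk vs. parameters used per forward pass.
total_params = NUM_EXPERTS * PARAMS_PER_EXPERT
active_params = EXPERTS_PER_TOKEN * PARAMS_PER_EXPERT

print(f"total:  {total_params / 1e9:.0f}B parameters")   # ~176B
print(f"active: {active_params / 1e9:.0f}B parameters")  # ~44B
```

This is why a 176B-parameter model can have roughly the inference cost of a 44B dense model: the download size reflects the full expert set, while per-token compute touches only the routed experts.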

It has certainly been one of the busiest 24 hours in the AI industry. Google finally made Gemini 1.5 Pro available in most countries and renamed its Duet AI for Developers to Gemini Code Assist, moving its infrastructure from Codey to Gemini.

In other news, The Information has exclusively reported that Facebook owner Meta is set to launch a smaller version of Llama 3 next week.


Rafly is a reporter with years of journalistic experience spanning technology, business, social media, and culture. He currently reports news on Microsoft-related products, technology, and AI at Windows Report and MSPowerUser. Got a tip? Send it to [email protected].


