
Dec 9, 2023

Mixtral: French start-up Mistral releases what is essentially a small GPT-4

Posted in category: robotics/AI

French startup Mistral AI has released its new language model, Mixtral 8x7B, via a torrent link. Mixtral is a mixture-of-experts model, following the architecture OpenAI is rumored to use for GPT-4, though GPT-4 is believed to apply it at a much larger scale.
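The mixture-of-experts idea can be illustrated with a small sketch: a router scores a set of expert feed-forward layers and each token is processed only by its top-scoring experts. The sizes, names, and single-matrix "experts" below are illustrative assumptions for a minimal NumPy demo, not Mistral's actual implementation.

```python
import numpy as np

# Minimal sketch of sparse mixture-of-experts routing, the style of
# architecture Mixtral 8x7B uses: 8 experts, each token routed to its top 2.
# All dimensions and names here are illustrative, not Mistral's real code.

rng = np.random.default_rng(0)
d_model, n_experts, top_k = 16, 8, 2

# Each "expert" is stood in for by a single weight matrix.
experts = [rng.standard_normal((d_model, d_model)) * 0.1 for _ in range(n_experts)]
router = rng.standard_normal((d_model, n_experts)) * 0.1  # gating weights

def moe_forward(x):
    """Route one token vector x to its top-k experts and mix their outputs."""
    logits = x @ router                  # router score per expert
    top = np.argsort(logits)[-top_k:]    # indices of the top-k experts
    weights = np.exp(logits[top])
    weights /= weights.sum()             # softmax over the selected experts only
    # Only the chosen experts run; this sparsity is why an MoE model is
    # cheaper per token than a dense model with the same total parameters.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top))

token = rng.standard_normal(d_model)
out = moe_forward(token)
print(out.shape)  # (16,)
```

Because only `top_k` of the `n_experts` layers execute per token, total parameter count and per-token compute are decoupled, which is the main appeal of the design.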

There are no benchmarks, blog posts, or articles about the model yet, but Mistral 7B, the company's first model, generally performed very well and was quickly adopted by the open-source community. Mistral is thought to have used the MegaBlocks MoE library for training. The Paris-based company was recently valued at nearly $2 billion.
