{"id":177902,"date":"2023-12-09T10:22:57","date_gmt":"2023-12-09T16:22:57","guid":{"rendered":"https:\/\/lifeboat.com\/blog\/2023\/12\/mixtral-french-start-up-mistral-releases-what-is-essentially-a-small-gpt-4"},"modified":"2023-12-09T10:22:57","modified_gmt":"2023-12-09T16:22:57","slug":"mixtral-french-start-up-mistral-releases-what-is-essentially-a-small-gpt-4","status":"publish","type":"post","link":"https:\/\/lifeboat.com\/blog\/2023\/12\/mixtral-french-start-up-mistral-releases-what-is-essentially-a-small-gpt-4","title":{"rendered":"Mixtral: French start-up Mistral releases what is essentially a small GPT-4"},"content":{"rendered":"<p><a class=\"aligncenter blog-photo\" href=\"https:\/\/lifeboat.com\/blog.images\/mixtral-french-start-up-mistral-releases-what-is-essentially-a-small-gpt-42.jpg\"><\/a><\/p>\n<p><strong>French startup Mistral AI has released its new language model Mixtral 8x7B via a torrent link.<\/strong> Mixtral is a <a href=\"https:\/\/the-decoder.com\/large-ai-models-could-soon-become-even-larger-much-faster\/\">mixture-of-experts model<\/a>, following an architecture that OpenAI is <a href=\"https:\/\/the-decoder.com\/gpt-4-architecture-datasets-costs-and-more-leaked\/\">rumored to be using for GPT-4<\/a>, but on a much larger scale.<\/p>\n<p>There are no benchmarks, blog posts, or articles about the model yet, but <a href=\"https:\/\/the-decoder.com\/new-open-source-llm-mistral-7b-outperforms-larger-meta-llama-models\/\">Mistral-7B<\/a> \u2014 the first version of Mistral AI \u2014 generally performed very well and was quickly adopted by the open-source community. Mistral is thought to have used the <a target=\"_blank\" rel=\"noopener\" href=\"https:\/\/github.com\/mistralai\/megablocks-public\">MegaBlocks<\/a> MoE library for training. The Paris-based company was recently <a href=\"https:\/\/the-decoder.com\/french-ai-start-up-mistral-ai-is-approaching-a-valuation-of-2-billion-us-dollars\/\">valued at nearly $2 billion<\/a>.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>French startup Mistral AI has released its new language model Mixtral 8x7B via a torrent link. Mixtral is a mixture-of-experts model, following an architecture that OpenAI is rumored to be using for GPT-4, but on a much larger scale. 