{"id":137273,"date":"2022-03-22T23:04:31","date_gmt":"2022-03-23T06:04:31","guid":{"rendered":"https:\/\/lifeboat.com\/blog\/2022\/03\/microsoft-translator-enhanced-with-z-code-mixture-of-experts-models"},"modified":"2022-03-22T23:04:31","modified_gmt":"2022-03-23T06:04:31","slug":"microsoft-translator-enhanced-with-z-code-mixture-of-experts-models","status":"publish","type":"post","link":"https:\/\/lifeboat.com\/blog\/2022\/03\/microsoft-translator-enhanced-with-z-code-mixture-of-experts-models","title":{"rendered":"Microsoft Translator enhanced with Z-code Mixture of Experts models"},"content":{"rendered":"<p><a class=\"aligncenter blog-photo\" href=\"https:\/\/lifeboat.com\/blog.images\/microsoft-translator-enhanced-with-z-code-mixture-of-experts-models3.jpg\"><\/a><\/p>\n<p>Translator, a Microsoft Azure Cognitive Service, is adopting Z-code <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/blog\/deepspeed-powers-8x-larger-moe-model-training-with-high-performance\/\">Mixture of Experts models<\/a>, a breakthrough AI technology that significantly improves the quality of production translation models. As a component of Microsoft\u2019s larger <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/blog\/a-holistic-representation-toward-integrative-ai\/\">XYZ-code initiative<\/a> to combine AI models for text, vision, audio, and language, Z-code supports the creation of AI systems that can speak, see, hear, and understand. This effort is a part of <a href=\"https:\/\/azure.microsoft.com\/en-us\/services\/cognitive-services\/\" target=\"_blank\" rel=\"noreferrer noopener\">Azure AI<\/a> and <a href=\"https:\/\/turing.microsoft.com\/\" target=\"_blank\" rel=\"noreferrer noopener\">Project Turing<\/a>, focusing on building multilingual, large-scale language models that support various production teams. Translator is using NVIDIA GPUs and Triton Inference Server to deploy and scale these models efficiently for high-performance inference. 
Translator is the first machine translation provider to introduce this technology live for customers.<\/p>\n<p><b>Z-code MoE boosts efficiency and quality<\/b><\/p>\n<p>Z-code models use a new architecture called Mixture of Experts (MoE), in which different parts of the model can learn different tasks. The models learn to translate between multiple languages at the same time. A Z-code MoE model has a larger total number of parameters but dynamically selects which of them to activate for a given input. This enables the model to specialize subsets of its parameters (experts) during training. At runtime, the model invokes only the experts relevant to the task, which is more computationally efficient than activating all of the model\u2019s parameters.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Translator, a Microsoft Azure Cognitive Service, is adopting Z-code Mixture of Experts models, a breakthrough AI technology that significantly improves the quality of production translation models. As a component of Microsoft\u2019s larger XYZ-code initiative to combine AI models for text, vision, audio, and language, Z-code supports the creation of AI systems that can speak, see, 
[\u2026]<\/p>\n","protected":false},"author":662,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[1522,6],"tags":[],"class_list":["post-137273","post","type-post","status-publish","format-standard","hentry","category-innovation","category-robotics-ai"],"_links":{"self":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts\/137273","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/users\/662"}],"replies":[{"embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/comments?post=137273"}],"version-history":[{"count":0,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts\/137273\/revisions"}],"wp:attachment":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/media?parent=137273"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/categories?post=137273"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/tags?post=137273"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}