{"id":125927,"date":"2021-08-06T11:23:07","date_gmt":"2021-08-06T18:23:07","guid":{"rendered":"https:\/\/lifeboat.com\/blog\/2021\/08\/microsoft-ai-researchers-introduce-a-neural-network-with-135-billion-parameters-and-deployed-it-on-bing-to-improve-search-results"},"modified":"2021-08-06T11:23:07","modified_gmt":"2021-08-06T18:23:07","slug":"microsoft-ai-researchers-introduce-a-neural-network-with-135-billion-parameters-and-deployed-it-on-bing-to-improve-search-results","status":"publish","type":"post","link":"https:\/\/lifeboat.com\/blog\/2021\/08\/microsoft-ai-researchers-introduce-a-neural-network-with-135-billion-parameters-and-deployed-it-on-bing-to-improve-search-results","title":{"rendered":"Microsoft AI Researchers Introduce A Neural Network With 135 Billion Parameters And Deployed It On Bing To Improve Search Results"},"content":{"rendered":"<p><a class=\"aligncenter blog-photo\" href=\"https:\/\/lifeboat.com\/blog.images\/microsoft-ai-researchers-introduce-a-neural-network-with-135-billion-parameters-and-deployed-it-on-bing-to-improve-search-results3.jpg\"><\/a><\/p>\n<p>Transformer-based deep learning models like GPT-3 have been getting much attention in the machine learning world. These models excel at understanding semantic relationships, and they have contributed to large improvements in Microsoft Bing\u2019s search experience. However, these models can fail to capture more nuanced relationships between query and document terms beyond pure semantics.<\/p>\n<p>A team of Microsoft researchers developed a <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/blog\/make-every-feature-binary-a-135b-parameter-sparse-neural-network-for-massively-improved-search-relevance\/\">neural network with 135 billion parameters<\/a>, which is the largest \u201cuniversal\u201d artificial intelligence model they have running in production. The large number of parameters makes this one of the most sophisticated AI models publicly detailed to date. 
OpenAI\u2019s GPT-3 natural language processing model has 175 billion parameters and remains the world\u2019s largest neural network built to date.<\/p>\n<p>Microsoft researchers are calling their latest <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/blog\/make-every-feature-binary-a-135b-parameter-sparse-neural-network-for-massively-improved-search-relevance\/\">AI project MEB (Make Every Feature Binary)<\/a>. The 135-billion-parameter model is built to analyze the queries that Bing users enter. It then helps identify the most relevant pages from around the web, working alongside a set of other machine learning algorithms rather than performing the task entirely on its own.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Transformer-based deep learning models like GPT-3 have been getting much attention in the machine learning world. These models excel at understanding semantic relationships, and they have contributed to large improvements in Microsoft Bing\u2019s search experience. However, these models can fail to capture more nuanced relationships between query and document terms beyond pure semantics. 
The Microsoft [\u2026]<\/p>\n","protected":false},"author":396,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[41,418,6],"tags":[],"class_list":["post-125927","post","type-post","status-publish","format-standard","hentry","category-information-science","category-internet","category-robotics-ai"],"_links":{"self":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts\/125927","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/users\/396"}],"replies":[{"embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/comments?post=125927"}],"version-history":[{"count":0,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts\/125927\/revisions"}],"wp:attachment":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/media?parent=125927"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/categories?post=125927"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/tags?post=125927"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}