{"id":132120,"date":"2021-12-10T19:22:28","date_gmt":"2021-12-11T03:22:28","guid":{"rendered":"https:\/\/lifeboat.com\/blog\/2021\/12\/deepmind-says-its-new-ai-has-almost-the-reading-comprehension-of-a-high-schooler"},"modified":"2021-12-10T19:22:28","modified_gmt":"2021-12-11T03:22:28","slug":"deepmind-says-its-new-ai-has-almost-the-reading-comprehension-of-a-high-schooler","status":"publish","type":"post","link":"https:\/\/lifeboat.com\/blog\/2021\/12\/deepmind-says-its-new-ai-has-almost-the-reading-comprehension-of-a-high-schooler","title":{"rendered":"DeepMind Says Its New AI Has Almost the Reading Comprehension of a High Schooler"},"content":{"rendered":"<p><a class=\"aligncenter blog-photo\" href=\"https:\/\/lifeboat.com\/blog.images\/deepmind-says-its-new-ai-has-almost-the-reading-comprehension-of-a-high-schooler2.jpg\"><\/a><\/p>\n<p>Alphabet\u2019s AI research company DeepMind <a href=\"https:\/\/deepmind.com\/blog\/article\/language-modelling-at-scale\" class=\"\">has released<\/a> the next generation of its language model, and says the model has close to the reading comprehension of a high schooler \u2014 a startling claim.<\/p>\n<p>The company says the language model, called Gopher, significantly improved its reading comprehension by ingesting massive repositories of online text.<\/p>\n<p>DeepMind boasts that its algorithm, an \u201cultra-large language model,\u201d has 280 billion parameters, a measure of a model\u2019s size and complexity. 
That means it falls somewhere between OpenAI\u2019s GPT-3 (175 billion parameters) and Microsoft and NVIDIA\u2019s Megatron, which features 530 billion parameters, <a href=\"https:\/\/www.theverge.com\/2021\/12\/8\/22822199\/large-language-models-ai-deepmind-scaling-gopher\" class=\"\"><em>The Verge<\/em> points out<\/a>.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Alphabet\u2019s AI research company DeepMind has released the next generation of its language model, and it says that it has close to the reading comprehension of a high schooler \u2014 a startling claim. It says the language model, called Gopher, was able to significantly improve its reading comprehension by ingesting massive repositories of texts online. [\u2026]<\/p>\n","protected":false},"author":396,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[41,6],"tags":[],"class_list":["post-132120","post","type-post","status-publish","format-standard","hentry","category-information-science","category-robotics-ai"],"_links":{"self":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts\/132120","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/users\/396"}],"replies":[{"embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/comments?post=132120"}],"version-history":[{"count":0,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts\/132120\/revisions"}],"wp:attachment":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/media?parent=132120"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/categories?post=132120"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/lifeboat.co
m\/blog\/wp-json\/wp\/v2\/tags?post=132120"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}