{"id":107897,"date":"2020-05-30T20:05:42","date_gmt":"2020-05-31T03:05:42","guid":{"rendered":"https:\/\/lifeboat.com\/blog\/2020\/05\/openai-debuts-gigantic-gpt-3-language-model-with-175-billion-parameters"},"modified":"2020-05-30T20:05:42","modified_gmt":"2020-05-31T03:05:42","slug":"openai-debuts-gigantic-gpt-3-language-model-with-175-billion-parameters","status":"publish","type":"post","link":"https:\/\/lifeboat.com\/blog\/2020\/05\/openai-debuts-gigantic-gpt-3-language-model-with-175-billion-parameters","title":{"rendered":"OpenAI debuts gigantic GPT-3 language model with 175 billion parameters"},"content":{"rendered":"<p><a class=\"aligncenter blog-photo\" href=\"https:\/\/lifeboat.com\/blog.images\/openai-debuts-gigantic-gpt-3-language-model-with-175-billion-parameters.jpg\"><\/a><\/p>\n<p>A team of more than 30 OpenAI researchers has released a <a href=\"https:\/\/arxiv.org\/abs\/2005.14165\" target=\"_blank\" rel=\"noopener noreferrer\">paper about GPT-3<\/a>, a language model capable of achieving state-of-the-art results on a range of benchmark natural language processing tasks, from language translation to generating news articles to answering SAT questions. GPT-3 has a whopping 175 billion parameters. 
By comparison, the largest version of <a href=\"https:\/\/venturebeat.com\/2019\/08\/20\/openai-releases-curtailed-version-of-gpt-2-language-model\/\">GPT-2 had 1.5 billion parameters<\/a>, and the largest Transformer-based language model in the world \u2014 <a href=\"https:\/\/venturebeat.com\/2020\/05\/19\/microsofts-zero-2-with-deepspeed-trains-neural-networks-with-up-to-170-billion-parameters\/\">introduced by Microsoft earlier this month \u2014 has 17 billion parameters<\/a>.<\/p>\n<p><a href=\"https:\/\/venturebeat.com\/2019\/08\/20\/openai-releases-curtailed-version-of-gpt-2-language-model\/\" target=\"_blank\" rel=\"noopener noreferrer\">OpenAI released GPT-2 last year<\/a>, controversially taking a <a href=\"https:\/\/venturebeat.com\/2019\/08\/20\/openai-releases-curtailed-version-of-gpt-2-language-model\/\" target=\"_blank\" rel=\"noopener noreferrer\">staggered release approach<\/a> due to fear that the model could be used for malicious purposes. OpenAI was <a href=\"https:\/\/venturebeat.com\/2019\/02\/22\/ai-weekly-experts-say-openais-controversial-model-is-a-potential-threat-to-society-and-science\/\" target=\"_blank\" rel=\"noopener noreferrer\">criticized by some<\/a> for the staggered approach, while others applauded the company for demonstrating a way to carefully release an AI model with the potential for misuse. GPT-3 made its debut with a preprint arXiv paper Thursday, but no release details were provided. 
An OpenAI spokesperson declined to comment when VentureBeat asked whether OpenAI will release the full version of GPT-3 or one of the seven smaller versions, which range in size from 125 million to 13 billion parameters.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>A team of more than 30 OpenAI researchers has released a paper about GPT-3, a language model capable of achieving state-of-the-art results on a range of benchmark natural language processing tasks, from language translation to generating news articles to answering SAT questions. GPT-3 has a whopping 175 billion parameters. By comparison, [\u2026]<\/p>\n","protected":false},"author":396,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[6],"tags":[],"class_list":["post-107897","post","type-post","status-publish","format-standard","hentry","category-robotics-ai"],"_links":{"self":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts\/107897","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/users\/396"}],"replies":[{"embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/comments?post=107897"}],"version-history":[{"count":0,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts\/107897\/revisions"}],"wp:attachment":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/media?parent=107897"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/categories?post=107897"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/tags?post=107897"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}