{"id":161788,"date":"2023-04-09T12:22:54","date_gmt":"2023-04-09T17:22:54","guid":{"rendered":"https:\/\/lifeboat.com\/blog\/2023\/04\/memorygpt-is-like-chatgpt-with-long-term-memory"},"modified":"2023-04-09T12:22:54","modified_gmt":"2023-04-09T17:22:54","slug":"memorygpt-is-like-chatgpt-with-long-term-memory","status":"publish","type":"post","link":"https:\/\/lifeboat.com\/blog\/2023\/04\/memorygpt-is-like-chatgpt-with-long-term-memory","title":{"rendered":"MemoryGPT is like ChatGPT with long-term memory"},"content":{"rendered":"<p><a class=\"aligncenter blog-photo\" href=\"https:\/\/lifeboat.com\/blog.images\/memorygpt-is-like-chatgpt-with-long-term-memory2.jpg\"><\/a><\/p>\n<p>I ve been quite impressed so far. And, if they can be improved over night i would love to see it.<\/p>\n<hr>\n<p>\n<strong>With long-term memory, language models could be even more specific \u2013 or more personal. MemoryGPT gives a first impression.<\/strong><\/p>\n<p>Right now, interaction with language models refers to single instances, e.g. in <a class=\"\" href=\"https:\/\/the-decoder.com\/chatgpt-is-a-gpt-3-chatbot-from-openai-that-you-can-test-now\/\">ChatGPT<\/a> to a single chat. Within that chat, the language model can to some extent take the context of the input into account for new texts and replies.<\/p>\n<p>In the currently most powerful version of <a href=\"https:\/\/the-decoder.com\/open-ai-gpt-4-announcement\/\">GPT-4, this is up to 32,000 tokens<\/a> \u2013 about 50 pages of text. This makes it possible, for example, to chat about the contents of a long paper. To find new solutions, developers can talk to a larger code database. The context window is an important building block for the practical use of large language models, an innovation made possible by <a href=\"https:\/\/the-decoder.com\/transformer-the-building-block-of-ais-future\/\">Transformer networks<\/a>.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>I ve been quite impressed so far. 
And if they can be improved overnight, I would love to see it. With long-term memory, language models could be even more specific \u2013 or more personal. MemoryGPT gives a first impression. Right now, interaction with language models is limited to single instances, e.g. in ChatGPT to a [\u2026]<\/p>\n","protected":false},"author":359,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[1522],"tags":[],"class_list":["post-161788","post","type-post","status-publish","format-standard","hentry","category-innovation"],"_links":{"self":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts\/161788","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/users\/359"}],"replies":[{"embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/comments?post=161788"}],"version-history":[{"count":0,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts\/161788\/revisions"}],"wp:attachment":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/media?parent=161788"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/categories?post=161788"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/tags?post=161788"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}