{"id":187834,"date":"2024-04-21T16:26:40","date_gmt":"2024-04-21T21:26:40","guid":{"rendered":"https:\/\/lifeboat.com\/blog\/2024\/04\/llm-systems-will-soon-have-infinite-context-length"},"modified":"2024-04-21T16:26:40","modified_gmt":"2024-04-21T21:26:40","slug":"llm-systems-will-soon-have-infinite-context-length","status":"publish","type":"post","link":"https:\/\/lifeboat.com\/blog\/2024\/04\/llm-systems-will-soon-have-infinite-context-length","title":{"rendered":"LLM Systems Will Soon Have Infinite Context Length"},"content":{"rendered":"<p><a class=\"aligncenter blog-photo\" href=\"https:\/\/lifeboat.com\/blog.images\/llm-systems-will-soon-have-infinite-context-length2.jpg\"><\/a><\/p>\n<p>LLMs forget. Everyone knows that. The primary culprit is the finite <a href=\"https:\/\/analyticsindiamag.com\/busting-the-myth-of-context-length\/\">context length<\/a> of these models. Some even say that it is the biggest bottleneck when it comes to <a href=\"https:\/\/www.reddit.com\/r\/LocalLLaMA\/comments\/1aejoib\/if_this_is_true_it_is_over_unlimited_context\/\" target=\"_blank\" rel=\"nofollow\">achieving AGI<\/a>.<\/p>\n<p>It appears that the debate over which model boasts the largest context length will soon become irrelevant. Microsoft, Google, and Meta have all been taking strides in this direction \u2013 making context length infinite.<\/p>\n<p>While all current LLMs run on Transformers, the architecture might soon become a thing of the past. For example, Meta has introduced <a href=\"https:\/\/analyticsindiamag.com\/meta-releases-megalodon-efficient-llm-pre-training-and-inference-on-infinite-context-length\/\">MEGALODON<\/a>, a neural architecture designed for efficient sequence modelling with unlimited context length.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>LLMs forget. Everyone knows that. The primary culprit is the finite context length of these models. Some even say that it is the biggest bottleneck when it comes to achieving AGI. It appears that the debate over which model boasts the largest context length will soon become irrelevant. Microsoft, Google, and Meta [\u2026]<\/p>\n","protected":false},"author":578,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[6],"tags":[],"class_list":["post-187834","post","type-post","status-publish","format-standard","hentry","category-robotics-ai"],"_links":{"self":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts\/187834","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/users\/578"}],"replies":[{"embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/comments?post=187834"}],"version-history":[{"count":0,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts\/187834\/revisions"}],"wp:attachment":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/media?parent=187834"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/categories?post=187834"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/tags?post=187834"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}