{"id":184798,"date":"2024-03-09T11:31:21","date_gmt":"2024-03-09T17:31:21","guid":{"rendered":"https:\/\/lifeboat.com\/blog\/2024\/03\/korean-researchers-power-shame-nvidia-with-new-neural-ai-chip-claim-625-times-less-power-draw-41-times-smaller"},"modified":"2024-03-09T11:31:21","modified_gmt":"2024-03-09T17:31:21","slug":"korean-researchers-power-shame-nvidia-with-new-neural-ai-chip-claim-625-times-less-power-draw-41-times-smaller","status":"publish","type":"post","link":"https:\/\/lifeboat.com\/blog\/2024\/03\/korean-researchers-power-shame-nvidia-with-new-neural-ai-chip-claim-625-times-less-power-draw-41-times-smaller","title":{"rendered":"Korean researchers power-shame Nvidia with new neural AI chip \u2014 claim 625 times less power draw, 41 times smaller"},"content":{"rendered":"<p>A team of scientists from the Korea Advanced Institute of Science and Technology (KAIST) detailed their \u2018Complementary-Transformer\u2019 AI chip during the recent 2024 International Solid-State Circuits Conference (ISSCC). The new C-Transformer chip is claimed to be the world\u2019s first ultra-low-power AI accelerator chip capable of large language model (LLM) processing.<\/p>\n<p>In a <a data-analytics-id=\"inline-link\" href=\"http:\/\/koreabizwire.com\/kaist-develops-next-generation-ultra-low-power-llm-accelerator\/274663\" data-url=\"http:\/\/koreabizwire.com\/kaist-develops-next-generation-ultra-low-power-llm-accelerator\/274663\">press release<\/a>, the researchers power-shame Nvidia, claiming that the C-Transformer uses 625 times less power and is 41 times smaller than the green team\u2019s A100 Tensor Core GPU. 
It also reveals that the Samsung-fabbed chip\u2019s achievements largely stem from refined <a data-analytics-id=\"inline-link\" href=\"https:\/\/www.tomshardware.com\/news\/intel-loihi-chip-neuromorphic-computing,35537.html\" data-before-rewrite-localise=\"https:\/\/www.tomshardware.com\/news\/intel-loihi-chip-neuromorphic-computing,35537.html\">neuromorphic computing<\/a> technology.<\/p>\n<p>Though we are told that the KAIST C-Transformer chip can do the same LLM processing tasks as one of Nvidia\u2019s beefy <a data-analytics-id=\"inline-link\" href=\"https:\/\/www.tomshardware.com\/news\/nvidia-ampere-A100-gpu-7nm\" data-before-rewrite-localise=\"https:\/\/www.tomshardware.com\/news\/nvidia-ampere-A100-gpu-7nm\">A100 GPUs<\/a>, none of the press or conference materials we have seen provide any direct comparative performance metrics. That omission is conspicuous, and the cynical would probably surmise that a performance comparison doesn\u2019t do the C-Transformer any favors.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>A team of scientists from the Korea Advanced Institute of Science and Technology (KAIST) detailed their \u2018Complementary-Transformer\u2019 AI chip during the recent 2024 International Solid-State Circuits Conference (ISSCC). The new C-Transformer chip is claimed to be the world\u2019s first ultra-low-power AI accelerator chip capable of large language model (LLM) processing. 
In a press release, [\u2026]<\/p>\n","protected":false},"author":359,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[1635,6],"tags":[],"class_list":["post-184798","post","type-post","status-publish","format-standard","hentry","category-materials","category-robotics-ai"],"_links":{"self":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts\/184798","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/users\/359"}],"replies":[{"embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/comments?post=184798"}],"version-history":[{"count":0,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts\/184798\/revisions"}],"wp:attachment":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/media?parent=184798"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/categories?post=184798"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/tags?post=184798"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}