
GPT-4, PaLM, Claude, Bard, LaMDA, Chinchilla, Sparrow – the list of large language models on the market continues to grow. But behind their remarkable capabilities, users are discovering substantial costs. While LLMs offer tremendous potential, understanding their economic implications is crucial for businesses and individuals considering their adoption.


First, building and training LLMs is expensive. Doing so requires thousands of graphics processing units (GPUs), which provide the parallel processing power needed to handle the massive datasets these models learn from. The cost of the GPUs alone can amount to millions of dollars: according to a technical overview of OpenAI’s GPT-3 language model, training required at least $5 million worth of GPUs.
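To put figures like that in context, here is a rough, hedged back-of-the-envelope sketch in Python of how such a training bill can be estimated from total compute, GPU throughput, utilization and hourly price. Every constant in it is an illustrative assumption, not a figure reported by OpenAI or any hardware vendor.

```python
# Rough, illustrative estimate of LLM training cost.
# All constants below are assumptions for the sake of the sketch,
# not reported figures.

total_train_flops = 3.1e23   # assumed total compute for a GPT-3-scale training run
gpu_peak_flops    = 125e12   # assumed peak throughput of one training GPU (FLOP/s)
utilization       = 0.30     # assumed fraction of peak actually sustained
gpu_hour_price    = 3.00     # assumed price per GPU-hour (USD)

effective_flops_per_hour = gpu_peak_flops * utilization * 3600
gpu_hours = total_train_flops / effective_flops_per_hour
cost_usd = gpu_hours * gpu_hour_price

print(f"GPU-hours: {gpu_hours:,.0f}")
print(f"Estimated compute cost: ${cost_usd:,.0f}")
```

With these assumed inputs the estimate lands in the low millions of dollars for compute alone, which is consistent with the order of magnitude quoted above; changing any one assumption shifts the total proportionally.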

With the pace at which artificial intelligence (AI) and machine learning (ML) applications are ramping up, we can expect to see industries and companies use these systems and tools in everyday processes. As these data-intensive applications continue to grow in complexity, the demand for high-speed transmission and efficient communication between computing units becomes paramount.

This need has sparked interest in optical interconnects, particularly for short-reach connections between XPUs (CPUs, GPUs and memory). Silicon photonics is emerging as a promising technology here, offering gains in performance, cost-efficiency and thermal management over traditional electrical approaches that ultimately benefit AI/ML applications.
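To illustrate why interconnect bandwidth matters so much for these workloads, the short Python sketch below estimates how long a single gradient synchronization (a ring all-reduce) would take at two assumed link speeds. The model size, GPU count and bandwidth figures are illustrative assumptions chosen only to show the shape of the calculation, not measurements of any real electrical or optical link.

```python
# Illustrative estimate of how link bandwidth limits multi-GPU training.
# Parameter count, GPU count and bandwidths are assumptions chosen
# only to show the shape of the calculation.

params          = 70e9   # assumed model size (parameters)
bytes_per_param = 2      # fp16 gradients
n_gpus          = 8

def ring_allreduce_seconds(link_gbytes_per_s: float) -> float:
    """Time to all-reduce one full gradient with a ring algorithm.

    A ring all-reduce moves roughly 2*(N-1)/N of the payload over
    each link, so time is payload * 2*(N-1)/N / link bandwidth.
    """
    payload = params * bytes_per_param
    traffic = payload * 2 * (n_gpus - 1) / n_gpus
    return traffic / (link_gbytes_per_s * 1e9)

for label, bw in [("electrical link, 50 GB/s (assumed)", 50),
                  ("optical link, 400 GB/s (assumed)", 400)]:
    print(f"{label}: {ring_allreduce_seconds(bw):.2f} s per gradient sync")
```

The point of the sketch is simply that synchronization time scales inversely with link bandwidth, so faster, cooler, cheaper interconnects translate directly into less time spent waiting on communication.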


The key to getting the most out of artificial intelligence may lie in the use of silicon photonics, a powerful new technology.

Bobbi is SVP, Software Engineering at Loopio. She is a technology leader with over 25 years of diverse experience in the industry.

AI and emerging technologies under the AI umbrella—like generative pre-trained transformers (GPT)—are reshaping the business world. These technologies are fostering greater organizational efficiencies and innovations and are quickly becoming crucial for companies of all sizes.

The ability to automate processes and tasks opens up a plethora of new opportunities for organizations. When automation can scale with an organization, this can completely transform day-to-day operations. In this article, I’ll look at three ways that engineering organizations in particular can use AI to transform their organizational efficiencies, organizational structure and software practices and processes.

The world’s largest democracy is poised to transform itself and the world, embracing AI on an enormous scale.

Speaking with the press Friday in Bengaluru, in the context of announcements from two of India’s largest conglomerates, Reliance Industries Limited and Tata Group, NVIDIA founder and CEO Jensen Huang detailed plans to bring AI technology and skills to bear on the greatest challenges facing the world’s most populous nation.

“I think this is going to be one of the largest AI markets in the world,” said Huang, who was wrapping up a week of high-level meetings across the nation, including with Prime Minister Narendra Modi, leading AI researchers, top business leaders, members of the press and the country’s 4,000-some NVIDIA employees.

NEW YORK—Paige, a technology disruptor in healthcare, has joined forces with Microsoft in the fight against cancer, making headway in their collaboration to transform cancer diagnosis and patient care by building the world’s largest image-based artificial intelligence (AI) models for digital pathology and oncology.

“Unleashing the power of AI is a game changer in advancing healthcare to improve lives.”

Paige, a global leader in end-to-end digital pathology solutions and clinical AI, developed the first large foundation model of this kind using over one billion images from half a million pathology slides across multiple cancer types. Paige and Microsoft are now developing a new AI model that is orders of magnitude larger than any other image-based AI model in existence today, configured with billions of parameters. This model helps capture the subtle complexities of cancer and serves as the cornerstone for the next generation of clinical applications and computational biomarkers that push the boundaries of oncology and pathology.

Globalization is not dead, but it is changing. The United States and China are creating two separate spheres for technology, and artificial intelligence is on the front lines of this new “Digital Cold War.” If democracies want to succeed in this new era of “re-globalization,” they will need to coordinate across governments and between the private and public sectors. AI is coming, whether we like it or not. We are at a fork in the road, and all segments of society will need to pitch in to build AI systems that contribute to a just and democratic future where humans can thrive.

AI and the New Digital Cold War, by Hemant Taneja and Fareed Zakaria (Harvard Business Review digital article).

Companies and countries need to prioritize collaboration and transformation over competition and disruption.

This microbot can navigate precisely within clusters of cells.

In recent years, introducing tiny robots into biological studies and therapeutic delivery has generated significant excitement and is poised to revolutionize the medical field.

These mini robotic systems, often measuring just a few millimeters or even smaller, bring a range of capabilities and advantages that are transforming multiple aspects of medicine, including delivering drugs to precise tumor sites, cellular simulation, and even microsurgery.