Before chatbots exploded in popularity, a group of researchers, tech executives and venture capitalists had worked for more than a decade to fuel A.I.

Elon Musk discusses the progress and challenges of Tesla’s full self-driving technology, expressing less optimism about its timeline but recognizing its potential value and impact on the future of transportation.
Questions to spark discussion.
What is the status of Tesla’s full self-driving technology?
Elon Musk expects a delay in the release of the new version of Full Self-Driving, but believes it will learn dramatically faster.
Panelists include Albert Marinez, chief analytics officer at the Cleveland Clinic; Tatyana Fedotova, director, global data, platforms and partnerships, at Johnson & Johnson; and Christopher Larkin, chief technology officer at Concord Technologies.
Panelists will reveal the most critical questions to ask and decisions to be made at each phase of the AI journey, from build versus buy and tool selection to ensuring AI investments are targeted for maximum impact, and much more.
Supposedly only until next month, January 2024. But it is why we absolutely must keep the heat on full acceleration of AI. There is a clear camp that wants to stop it, turn it off, and walk it a decade backwards. And that is unacceptable.
A new report by The Information says Google has pushed back the launch of its next-gen AI, Gemini. The company was reportedly planning to introduce the new foundational model at events scheduled for next week, but has quietly delayed it until January after finding the model needed more work on its responses to non-English queries.
A researcher has just finished writing a scientific paper. She knows her work could benefit from another perspective. Did she overlook something? Or perhaps there’s an application of her research she hadn’t thought of. A second set of eyes would be great, but even the friendliest of collaborators might not be able to spare the time to read all the required background publications to catch up.
Kevin Yager—leader of the electronic nanomaterials group at the Center for Functional Nanomaterials (CFN), a U.S. Department of Energy (DOE) Office of Science User Facility at DOE’s Brookhaven National Laboratory—has imagined how recent advances in artificial intelligence (AI) and machine learning (ML) could aid scientific brainstorming and ideation. To accomplish this, he has developed a chatbot with knowledge in the kinds of science he’s been engaged in.
Rapid advances in AI and ML have given rise to programs that can generate creative text and useful software code. These general-purpose chatbots have recently captured the public imagination. Existing chatbots, based on large, diverse language models, lack detailed knowledge of scientific sub-domains. By leveraging a document-retrieval method, Yager's bot is knowledgeable in areas of nanomaterial science that other bots are not.
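To make the idea concrete, here is a minimal sketch of how document retrieval can ground a general-purpose model in a specialized literature: relevant passages are retrieved for a question and prepended to the prompt. This is not Yager's implementation; the toy bag-of-words embedding, the example documents, and the prompt format are all assumptions made for illustration.

```python
# Minimal retrieval-augmented prompting sketch: score documents against a
# question, keep the best matches, and build a context-stuffed prompt that
# would then be passed to whatever chat model is being used.
from collections import Counter
import math


def embed(text: str) -> Counter:
    """Toy embedding: lowercase bag-of-words counts (stands in for a real embedding model)."""
    return Counter(text.lower().split())


def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[w] * b[w] for w in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0


def retrieve(question: str, documents: list[str], k: int = 2) -> list[str]:
    """Return the k documents most similar to the question."""
    q = embed(question)
    ranked = sorted(documents, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]


def build_prompt(question: str, documents: list[str]) -> str:
    """Prepend the retrieved excerpts to the question as context for the chat model."""
    context = "\n\n".join(retrieve(question, documents))
    return f"Use the following excerpts to answer.\n\n{context}\n\nQuestion: {question}"


# Hypothetical domain snippets standing in for a curated publication corpus.
docs = [
    "Small-angle X-ray scattering probes nanoscale structure in thin films.",
    "Block copolymers self-assemble into periodic nanoscale morphologies.",
    "The cafeteria menu changes every Tuesday.",
]

print(build_prompt("How do block copolymers order at the nanoscale?", docs))
```

In a production system the bag-of-words scoring would be replaced by learned embeddings and a vector index, but the flow stays the same: retrieve, assemble context, then query the language model.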
OpenGPT is a promising toolkit for building custom chatbots similar to GPTs, but it is completely open source and offers even more configuration options, which also makes it more complicated.
With GPTs, OpenAI introduced the evolution of its plugin concept at DevDay in November 2023. The AI company is giving end users tools to create a chatbot tailored to their needs without having to write any code. OpenAI even plans to give successful GPT creators a share of ChatGPT Plus revenue in the future.
When setting up a GPT, users can upload their own files, link external APIs, assign a system prompt, and enable modules for web browsing, DALL-E, and the code interpreter.
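For illustration only, the sketch below collects those same pieces into a single configuration object. The field names and values are assumptions invented for this example, not OpenAI's actual GPT builder schema or API; the real setup is done through a point-and-click interface.

```python
# Hypothetical sketch of the ingredients a custom GPT combines.
# Field names are illustrative assumptions, not OpenAI's schema.
custom_gpt = {
    "name": "Nanomaterials Helper",                     # made-up example GPT
    "system_prompt": "You answer questions about polymer self-assembly.",
    "knowledge_files": ["group_publications.pdf"],      # uploaded reference documents
    "actions": [                                        # external APIs the GPT may call
        {"openapi_spec_url": "https://example.com/instrument-api.yaml"}
    ],
    "capabilities": {
        "web_browsing": True,
        "dall_e_image_generation": True,
        "code_interpreter": True,
    },
}

# A no-code builder walks the user through filling in exactly these pieces.
for key in custom_gpt:
    print(key)
```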
Combining AI with traditional wet lab work creates a virtuous circle from lab to data and back to the lab.
AI, with the right data, can span all of these scales and make sense of the data we collect on all of them. It’s poised to accelerate basic science, the business of biotechs, the behemoth pharmaceutical companies, and the broader bioeconomy.
A first-principles model accounts for the wide range of critical temperatures (Tc’s) for four materials and suggests a parameter that determines Tc in any high-temperature superconductor.
Since the first high-temperature superconducting materials, known as the cuprates, were discovered in 1986, researchers have struggled to explain their properties and to find materials with even higher superconducting transition temperatures (Tc’s). One puzzle has been the cuprates’ wide variation in Tc, ranging from below 10 K to above 130 K. Now Masatoshi Imada of Waseda University in Japan and his colleagues have used first-principles calculations to determine the order parameters—which measure the density of superconducting electrons—for four cuprate materials and have predicted the Tc’s based on those order parameters [1]. The researchers have also found what they believe is the fundamental parameter that determines Tc in a given material, which they hope will lead to the development of higher-temperature superconductors.
For each material, Imada and his colleagues applied the basic principles of quantum mechanics, focusing on the planes of copper and oxygen atoms that are known to host the superconducting electrons. They used a combination of numerical techniques, including one supplemented by machine learning, and did not require any adjustable parameters.