Here’s how our senior editor, Will Douglas Heaven, is thinking about the AI landscape in 2025.

Size matters. Economists have long known that; economies of scale are among the building blocks of their science. In the digital era, it quickly became apparent that value was directly proportional to the size of the network (the number of users linked by a particular technology or system).
The race to create scale is critical amid the sizzling geopolitical competition over leadership in new technologies. It has assumed even greater urgency in Western capitals in the wake of China’s success in that race. They’ve had to reconceptualize scale to overcome the advantages China has as a result of the size of its economy and its population. It’s a work in progress, and the results are mixed at best.
For those who’ve forgotten their introductory economics, economies of scale are cost advantages created by expanding operations. As companies build more products, they become more efficient, reducing cost per unit. This allows them to produce even more of that product, reinforcing their competitive advantage and keeping the virtuous circle turning.
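To make that mechanism concrete, here is a toy sketch in Python; the fixed and variable costs are invented for illustration and stand in for whatever a real producer would face.

```python
# Toy illustration (hypothetical numbers): as output grows, the fixed cost is
# spread across more units, so the cost per unit falls.
FIXED_COST = 1_000_000       # factory, tooling, R&D (hypothetical)
VARIABLE_COST_PER_UNIT = 5   # materials and labor per unit (hypothetical)

for units in (10_000, 100_000, 1_000_000):
    cost_per_unit = FIXED_COST / units + VARIABLE_COST_PER_UNIT
    print(f"{units:>9,} units -> ${cost_per_unit:,.2f} per unit")
```

With these made-up figures, the per-unit cost falls from $105.00 at 10,000 units to $6.00 at a million, which is the virtuous circle in miniature.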
Earth is spinning faster this summer, making the days marginally shorter and attracting the attention of scientists and timekeepers.
July 10 was the shortest day of the year so far, lasting 1.36 milliseconds less than 24 hours, according to data from the International Earth Rotation and Reference Systems Service and the US Naval Observatory, compiled by timeanddate.com. More exceptionally short days are coming on July 22 and August 5, currently predicted to be 1.34 and 1.25 milliseconds shorter than 24 hours, respectively.
The length of a day is the time it takes for the planet to complete one full rotation on its axis — 24 hours, or 86,400 seconds, on average. But in reality, each rotation is slightly irregular due to a variety of factors, such as the gravitational pull of the moon, seasonal changes in the atmosphere and the influence of Earth’s liquid core. As a result, a full rotation usually takes slightly less or slightly more than 86,400 seconds — a discrepancy of just milliseconds that doesn’t have any obvious effect on everyday life.
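As a quick check on that arithmetic, the snippet below subtracts the millisecond shortfalls quoted above from the nominal 86,400-second day; the dates and values are simply the figures reported by timeanddate.com.

```python
# Nominal day length and the reported shortfalls, in milliseconds.
NOMINAL_DAY_S = 86_400

shortfall_ms = {"July 10": 1.36, "July 22": 1.34, "August 5": 1.25}

for date, ms in shortfall_ms.items():
    day_length_s = NOMINAL_DAY_S - ms / 1000
    print(f"{date}: {day_length_s:.5f} s ({ms} ms short of 24 hours)")
```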
Welcome back to In the Loop, TIME’s new twice-weekly newsletter about the world of AI.
President Trump will deliver a major speech on Wednesday at an event in Washington, D.C., titled “Winning the AI Race,” where he is expected to unveil his long-awaited AI action plan. The 20-page, high-level document will focus on three main areas, according to a person with knowledge of the matter. It will come as a mixture of directives to federal agencies along with some grant programs. “It’s mostly carrots, not sticks,” the person said.
A research team successfully fabricated CuInSe₂ thin-film solar cells, composed of copper (Cu), indium (In), and selenium (Se), on transparent electrode substrates. Furthermore, the team developed a “bifacial solar cell technology” that receives sunlight from both the front and back sides to generate power. This technology can be fabricated at low temperatures, enabling a simpler production process, and is broadly applicable to building-integrated solar power, agricultural solar power, and high-efficiency tandem solar cells in the future.
In the race to develop AI that understands complex images like financial forecasts, medical diagrams and nutrition labels—essential for AI to operate independently in everyday settings—closed-source systems like ChatGPT and Claude currently set the pace. But no one outside their makers knows how those models were trained or what data they used, leaving open-source alternatives scrambling to catch up.
Now, researchers at Penn Engineering and the Allen Institute for AI (Ai2) have developed a new approach to train open-source models: using AI to create scientific figures, charts and tables that teach other AI systems how to interpret complex visual information.
Their tool, CoSyn (short for Code-Guided Synthesis), taps open-source AI models’ coding skills to render text-rich images and generate relevant questions and answers, giving other AI systems the data they need to learn how to “see” and understand scientific figures.
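CoSyn’s actual code and training recipe aren’t reproduced here, but the general code-guided idea can be sketched as follows: because the figure is rendered from code, its ground-truth values are known, so question-and-answer pairs can be generated automatically alongside the image. In the real pipeline an open-source language model writes the rendering code and the questions; this minimal Python sketch hard-codes both steps, and the chart contents and file names are invented.

```python
# Minimal sketch of code-guided synthesis (not CoSyn's actual implementation):
# render a text-rich figure from code, then derive Q&A pairs from the same
# values used to draw it.
import json
import random

import matplotlib
matplotlib.use("Agg")  # render off-screen, no display needed
import matplotlib.pyplot as plt

random.seed(0)
categories = ["A", "B", "C", "D"]                      # hypothetical chart data
values = [random.randint(10, 100) for _ in categories]

# 1. Render a synthetic chart (in CoSyn, an LLM would write this plotting code).
fig, ax = plt.subplots()
ax.bar(categories, values)
ax.set_title("Sample counts by group")
ax.set_ylabel("Count")
fig.savefig("synthetic_chart.png")

# 2. Generate question-answer pairs from the known ground-truth values.
qa_pairs = [
    {"question": f"What is the count for group {c}?", "answer": str(v)}
    for c, v in zip(categories, values)
]
qa_pairs.append({
    "question": "Which group has the highest count?",
    "answer": categories[values.index(max(values))],
})

with open("synthetic_chart_qa.json", "w") as f:
    json.dump(qa_pairs, f, indent=2)
```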
Mapping the N6-methyladenosine (m6A) transcriptome in prostate cancer has established its potential clinical value as a prognostic biomarker for this disease. A multidisciplinary approach that integrates genomics, transcriptomics, epitranscriptomics, proteomics and clinical oncology is essential to translate the intricacies of m6A modification into tangible benefits for patients.
Scientists at Rice University and the University of Houston have developed an innovative, scalable approach to engineering bacterial cellulose into high-strength, multifunctional materials. The study, published in Nature Communications, introduces a dynamic biosynthesis technique that aligns bacterial cellulose fibers in real time, resulting in robust biopolymer sheets with exceptional mechanical properties.
Plastic pollution persists because traditional synthetic polymers degrade into microplastics, releasing harmful chemicals like bisphenol A (BPA), phthalates and carcinogens. Seeking sustainable alternatives, the research team led by Muhammad Maksud Rahman, assistant professor of mechanical and aerospace engineering at the University of Houston and adjunct assistant professor of materials science and nanoengineering at Rice, leveraged bacterial cellulose — one of Earth’s most abundant and pure biopolymers — as a biodegradable alternative.