
Increasing energy demands and problems associated with burning fossil fuels have heightened interest in more sustainable energy sources, such as sunlight. But there are still areas where carbon-based fuel remains the standard, such as in the aviation industry. To address this need, scientists have been working to devise a way to use sunlight to generate solar-thermal heating that could then drive the chemical reactions that are needed to make jet fuel with net-zero carbon emissions.

Now, a team at Caltech that is part of a Department of Energy (DOE) Energy Innovation Hub known as the Liquid Sunlight Alliance, or LiSA, has developed such a solar-thermal heating system on a small scale and demonstrated that it can successfully drive an important reaction for jet fuel production.

Completely powered by solar energy, the so-called photothermocatalytic reactor incorporates a spectrally selective solar absorber to maximize the generation of solar-thermal heating. The modular design of the reactor takes advantage of current fabrication technologies and existing silicon solar panel production infrastructure.
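To see why a spectrally selective absorber matters, here is a rough energy-balance sketch. The absorptance and emittance values are illustrative assumptions, not measured properties of the LiSA reactor; the point is only that a surface which absorbs sunlight strongly but emits little infrared reaches a much higher equilibrium temperature.

```python
# Rough stagnation-temperature estimate for a solar absorber.
# alpha_s (solar absorptance) and eps_t (thermal emittance) are
# illustrative assumptions, not values from the Caltech/LiSA reactor.

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/m^2/K^4

def stagnation_temp_K(alpha_s, eps_t, irradiance=1000.0, concentration=1.0):
    """Temperature where absorbed solar power balances re-radiated power:
    alpha * C * G = eps * sigma * T^4 (convection losses ignored)."""
    return (alpha_s * concentration * irradiance / (eps_t * SIGMA)) ** 0.25

# Non-selective black surface: absorbs and emits equally well.
print(stagnation_temp_K(0.95, 0.95))  # ~364 K
# Spectrally selective surface: absorbs sunlight, emits little infrared.
print(stagnation_temp_K(0.95, 0.10))  # ~640 K
```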

Artificial Intelligence (AI) has made significant strides in recent years, transforming various aspects of our lives. From self-driving cars to personalized recommendations on streaming platforms, AI has become an integral part of our daily existence. However, the fear that AI will replace humans entirely is unfounded. Instead, a more nuanced perspective emerges: AI will augment human capabilities, leading to the emergence of “AI-powered humans.”

A research team at UNIST has identified the causes of oxygen generation in a novel cathode material called quasi-lithium and proposed a material design principle to address this issue.

Quasi-lithium materials theoretically enable batteries to store 30% to 70% more energy than existing technologies through high-voltage charging above 4.5 V. This advancement could allow electric vehicles to achieve a driving range of up to 1,000 km on a single charge. However, during high-voltage charging, oxygen trapped inside the material can oxidize and be released as gas, posing a significant explosion risk.
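As a quick sanity check on how those two figures relate, here is a back-of-the-envelope calculation. The 600 km baseline range is an assumption chosen for illustration, not a number from the UNIST study; only the 30% to 70% energy gain comes from the article.

```python
# Back-of-envelope range estimate. The 600 km baseline is an assumed
# figure for illustration; the 30-70% gain is the article's claim.
baseline_range_km = 600
for gain in (0.30, 0.70):
    print(f"+{gain:.0%}: ~{baseline_range_km * (1 + gain):.0f} km")
# +30%: ~780 km
# +70%: ~1020 km  -> consistent with the quoted ~1,000 km figure
```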

The research team, led by Professor Hyun-Wook Lee in the School of Energy and Chemical Engineering, discovered that oxygen oxidizes near 4.25V, causing partial structural deformation and gas release.

Structural adhesives play a crucial role in assembling automobiles, aircraft, and buildings. Among these, epoxy adhesives stand out for their exceptional mechanical strength and durability. However, traditional cured epoxy resins are often rigid and lack flexibility, resulting in low peel and impact strength.

Now, a groundbreaking advancement in structural adhesives has emerged from the laboratories of Nagoya University, promising to transform material bonding as we know it. This next-generation adhesive boasts an unprecedented impact strength – 22 times higher than that of conventional epoxy-based adhesives without rubbery additives.


New adhesive using elastomer makes lighter, more carbon-efficient vehicles possible.

Scientists are urging people who live in southcentral Alaska to begin preparing for a possible eruption of the Mount Spurr volcano.

The Alaska Volcano Observatory said now is a good time for Alaskans to “familiarize themselves with the possible hazards of a Spurr eruption” following last week’s announcement that the likelihood of an eruption has increased.

“The major hazards to Alaska residents from Spurr would be from ash risk to aviation and possible ashfall,” the observatory said in a Wednesday post on X.

To understand exactly what’s going on, we need to back up a bit. Roughly put, building a machine-learning model involves training it on a large number of examples and then testing it on a bunch of similar examples that it has not yet seen. When the model passes the test, you’re done.

What the Google researchers point out is that this bar is too low. The training process can produce many different models that all pass the test but—and this is the crucial part—these models will differ in small, arbitrary ways, depending on things like the random values given to the nodes in a neural network before training starts, the way training data is selected or represented, the number of training runs, and so on. These small, often random, differences are typically overlooked if they don’t affect how a model does on the test. But it turns out they can lead to huge variation in performance in the real world.

In other words, the process used to build most machine-learning models today cannot tell which models will work in the real world and which ones won’t.
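A minimal sketch of the effect, using a toy dataset and scikit-learn models chosen purely for illustration (this is not the setup from the Google study): several models trained with an identical pipeline but different random seeds typically tie on the held-out test set, yet their accuracy can spread out once the deployment data shifts slightly.

```python
# Toy illustration of underspecification: models that tie on the test set
# can disagree once the data distribution shifts. The dataset, model, and
# noise level are assumptions for illustration only.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Simulate a distribution shift by adding feature noise at "deployment" time.
rng = np.random.default_rng(0)
X_shifted = X_test + rng.normal(scale=1.5, size=X_test.shape)

for seed in range(5):  # identical pipeline, different random initialization
    model = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500,
                          random_state=seed).fit(X_train, y_train)
    print(f"seed {seed}: test acc = {model.score(X_test, y_test):.3f}, "
          f"shifted acc = {model.score(X_shifted, y_test):.3f}")
```

Under this kind of setup, the first column of scores tends to cluster tightly while the second varies from seed to seed, which is the gap between "passes the test" and "works in the real world."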

Mark Rober’s Tesla crash story and video on self-driving cars face significant scrutiny for authenticity, bias, and misleading claims, raising doubts about his testing methods and the reliability of the technology he promotes.

Questions to inspire discussion.

Tesla Autopilot and Testing

🚗 Q: What was the main criticism of Mark Rober’s Tesla crash video?
A: The video was criticized for failing to use Full Self-Driving mode, despite it being shown in the thumbnail and being capable of activation in the same way as Autopilot.

🔍 Q: How did Mark Rober respond to the criticism about not using Full Self-Driving mode?
A: Mark claimed it was a distinction without a difference and was confident the results would be the same if he reran the experiment in Full Self-Driving mode.

🛑 Q: What might have caused the Autopilot to disengage during the test?

NVIDIA has just unveiled the Isaac GR00T N1, a foundation model that is revolutionizing humanoid robotics. This AI-driven system can learn tasks, make decisions, and adapt like never before!

At GTC 2025, NVIDIA CEO Jensen Huang revealed the Isaac GR00T N1, a next-generation AI model designed to train humanoid robots with unprecedented efficiency. It uses a dual-system approach—one for instant reactions and another for strategic thinking. NVIDIA also introduced Newton, a physics engine developed in collaboration with Google DeepMind and Disney, aiming to enhance robotic motion.

Additionally, NVIDIA’s Isaac GR00T Blueprint enables large-scale training with synthetic data. In just 11 hours, the system generated over 780,000 training examples, drastically improving robot accuracy. These advancements could reshape industries by making humanoid robots more intelligent and useful in real-world applications.


Adopting liquid cooling technology could significantly reduce electricity costs across the data center.

Many Porsche “purists” reflect forlornly on the 1997 debut of the fifth-generation 911, the 996. It was the first water-cooled version of the iconic sports car, which had been based on air-cooled engines since its introduction in 1964; the 911 was itself the successor to the popular air-cooled 356. For over three decades, Porsche’s flagship 911 was built around an air-cooled engine. The two main reasons usually given for the shift from air to water cooling were 1) environmental (emissions standards) and 2) performance (in particular, cylinder-head cooling). The writing was on the wall: if Porsche was going to remain competitive in the sports car market and the racing world, the move to water-cooled engines was unavoidable.

Fast forward to current data centers trying to meet the demands of AI computing. For similar reasons, we’re seeing a shift toward liquid cooling. Machines relying on something other than air for cooling date back at least to the Cray-1 supercomputer, which used a Freon-based system, and the Cray-2, which used Fluorinert, a non-conductive liquid in which the boards were immersed. The Cray-1 was rated at about 115 kW and the Cray-2 at 195 kW, both a far cry from the tens of megawatts drawn by today’s most powerful supercomputers. Another distinguishing feature is that these were “supercomputers,” not just data center servers. Data centers have largely run on air-cooled processors, but with the incredible demand for computing created by the explosive growth of AI applications, data centers are being called on to provide supercomputing-like capabilities.
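To put the "reduced electricity costs" claim in rough numbers, here is a back-of-the-envelope comparison using PUE (power usage effectiveness, total facility power divided by IT power). The PUE values, facility size, and electricity price below are all illustrative assumptions, not figures for any particular data center.

```python
# Back-of-envelope cooling-cost comparison via PUE. All numbers are
# illustrative assumptions, not measured figures for any real facility.
IT_LOAD_MW = 10.0        # assumed IT load
PRICE_PER_MWH = 80.0     # assumed electricity price, USD
HOURS_PER_YEAR = 8760

def annual_cost_usd(pue):
    """Total facility energy cost per year at a given PUE."""
    return IT_LOAD_MW * pue * HOURS_PER_YEAR * PRICE_PER_MWH

air = annual_cost_usd(1.5)     # assumed typical air-cooled facility
liquid = annual_cost_usd(1.1)  # assumed well-run liquid-cooled facility
print(f"air:    ${air:,.0f}/yr")
print(f"liquid: ${liquid:,.0f}/yr")
print(f"saving: ${air - liquid:,.0f}/yr")  # ~$2.8M/yr at these assumptions
```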