
The sun, the essential engine that sustains life on Earth, generates its tremendous energy through the process of nuclear fusion. At the same time, it releases a continuous stream of neutrinos—particles that serve as messengers of its internal dynamics. Although modern neutrino detectors unveil the sun’s present behavior, significant questions linger about its stability over periods of millions of years—a timeframe that spans human evolution and significant climate changes.

Finding answers to these questions is the goal of the LORandite EXperiment (LOREX), which requires precise knowledge of the solar neutrino cross section on thallium. This information has now been provided by an international collaboration of scientists using the unique facilities at GSI/FAIR’s Experimental Storage Ring (ESR) in Darmstadt, yielding an essential measurement that will help researchers understand the long-term stability of the sun. The results have been published in the journal Physical Review Letters.

LOREX is the only long-term geochemical solar neutrino experiment still actively pursued. Proposed in the 1980s, it aims to measure the solar neutrino flux averaged over a remarkable four million years, corresponding to the geological age of the lorandite ore.

UNIVERSITY PARK, Pa. — A recently developed electronic tongue is capable of identifying differences in similar liquids, such as milk with varying water content; diverse products, including soda types and coffee blends; signs of spoilage in fruit juices; and instances of food safety concerns. The team, led by researchers at Penn State, also found that results were even more accurate when artificial intelligence (AI) used its own assessment parameters to interpret the data generated by the electronic tongue.



The tongue comprises a graphene-based ion-sensitive field-effect transistor, or a conductive device that can detect chemical ions, linked to an artificial neural network, trained on various datasets. Critically, Das noted, the sensors are non-functionalized, meaning that one sensor can detect different types of chemicals, rather than having a specific sensor dedicated to each potential chemical. The researchers provided the neural network with 20 specific parameters to assess, all of which are related to how a sample liquid interacts with the sensor’s electrical properties. Based on these researcher-specified parameters, the AI could accurately detect samples — including watered-down milks, different types of sodas, blends of coffee and multiple fruit juices at several levels of freshness — and report on their content with greater than 80% accuracy in about a minute.
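As a rough illustration of the pipeline described above — many numeric figures of merit in, a class label out — the following sketch trains a minimal classifier on synthetic 20-feature samples. This is not the team’s model or data; the feature distributions, sample counts, and learning rate are all invented for the example.

```python
import math
import random

random.seed(0)

N_FEATURES = 20  # stand-in for the 20 researcher-specified figures of merit

def make_sample(label):
    """Synthetic stand-in for one liquid measurement: 20 noisy features
    whose mean shifts with the class (e.g. diluted vs. whole milk)."""
    center = 0.5 if label == 1 else -0.5
    return [random.gauss(center, 1.0) for _ in range(N_FEATURES)], label

data = [make_sample(i % 2) for i in range(400)]

def sigmoid(z):
    z = max(-60.0, min(60.0, z))  # clamp to avoid overflow in exp
    return 1.0 / (1.0 + math.exp(-z))

# Logistic regression trained by plain stochastic gradient descent.
w, b, lr = [0.0] * N_FEATURES, 0.0, 0.1
for _ in range(20):
    for x, y in data:
        err = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b) - y
        w = [wi - lr * err * xi for wi, xi in zip(w, x)]
        b -= lr * err

correct = sum(
    (sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b) > 0.5) == (y == 1)
    for x, y in data
)
accuracy = correct / len(data)
print(f"training accuracy: {accuracy:.1%}")
```

With cleanly separated synthetic classes, this toy model classifies nearly perfectly; the hard part in the real experiment is extracting informative figures of merit from raw sensor signals in the first place.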

“After achieving a reasonable accuracy with human-selected parameters, we decided to let the neural network define its own figures of merit by providing it with the raw sensor data. We found that the neural network reached a near ideal inference accuracy of more than 95% when utilizing the machine-derived figures of merit rather than the ones provided by humans,” said co-author Andrew Pannone, a doctoral student in engineering science and mechanics advised by Das. “So, we used a method called Shapley additive explanations, which allows us to ask the neural network what it was thinking after it makes a decision.”

This approach uses game theory, a decision-making process that considers the choices of others to predict the outcome of a single participant, to assign values to the data under consideration. With these explanations, the researchers could reverse engineer an understanding of how the neural network weighed various components of the sample to make a final determination — giving the team a glimpse into the neural network’s decision-making process, which has remained largely opaque in the field of AI, according to the researchers. They found that, instead of simply assessing individual human-assigned parameters, the neural network considered the data it determined were most important together, with the Shapley additive explanations revealing how important the neural network considered each input data.
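The game-theory idea behind Shapley attribution is simple to state: a feature’s value is its average marginal contribution across all subsets of the other features. A minimal exact computation — with invented payoff numbers, not the study’s model — might look like:

```python
from itertools import combinations
from math import factorial

def shapley_values(features, value_fn):
    """Exact Shapley values: each feature's average marginal contribution
    to value_fn over all coalitions of the remaining features (the
    game-theory attribution that SHAP estimates for real models)."""
    n = len(features)
    phis = {}
    for i in features:
        others = [f for f in features if f != i]
        phi = 0.0
        for k in range(n):
            for S in combinations(others, k):
                weight = factorial(k) * factorial(n - k - 1) / factorial(n)
                phi += weight * (value_fn(set(S) | {i}) - value_fn(set(S)))
        phis[i] = phi
    return phis

# Toy "model": the accuracy a classifier reaches given a subset of sensor
# features. Purely illustrative numbers.
payoff = {
    frozenset(): 0.50,           # chance level with no features
    frozenset({"a"}): 0.70,
    frozenset({"b"}): 0.60,
    frozenset({"a", "b"}): 0.90,
}
phi = shapley_values(["a", "b"], lambda S: payoff[frozenset(S)])
print(phi)
```

The attributions sum to the total gain over the empty set (0.90 − 0.50 = 0.40), which is the bookkeeping property that makes Shapley values useful for explaining a model’s decision. Enumerating all subsets, as here, is only feasible for tiny feature sets; practical SHAP implementations approximate this sum.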

Here on planet Earth, as well as in most locations in the Universe, everything we observe and interact with is made up of atoms. Atoms come in roughly 90 different naturally occurring species, where all atoms of the same species share similar physical and chemical properties, but differ tremendously from one species to another. Once thought to be indivisible units of matter, we now know that atoms themselves have an internal structure, with a tiny, positively charged, massive nucleus consisting of protons and neutrons surrounded by negatively charged, much less massive electrons. We’ve measured the physical sizes of these subatomic constituents exquisitely well, and one fact stands out: the size of an atom, at around 10⁻¹⁰ meters, is much, much larger than that of the constituent parts that compose it.

Protons and neutrons, which compose the atom’s nucleus, are roughly a factor of 100,000 smaller in length, with a typical size of only around 10⁻¹⁵ meters. Electrons are even smaller, and are assumed to be point-like particles in the sense that they exhibit no measurable size at all, with experiments constraining them to be no larger than 10⁻¹⁹ meters across. Somehow, protons, neutrons, and electrons combine together to create atoms, which occupy much greater volumes of space than their components added together. It’s a mysterious fact that atoms, which must be mostly empty space in this regard, are still impenetrable to one another, leading to enormous collections of atoms that make up the solid objects we’re familiar with in our macroscopic world.

So how does this happen: that atoms, which are mostly empty space, create solid objects that cannot be penetrated by other solid objects, which are also made of atoms that are mostly empty space? It’s a remarkable fact of existence, but one that requires quantum physics to explain.
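The quoted length scales make the “mostly empty space” claim easy to quantify. Using only the order-of-magnitude sizes given above:

```python
# Rough size scales quoted in the text (orders of magnitude only).
atom_radius = 1e-10      # meters, typical atomic size
nucleus_radius = 1e-15   # meters, typical proton/neutron scale

linear_ratio = atom_radius / nucleus_radius          # ~100,000
volume_fraction = (nucleus_radius / atom_radius) ** 3  # ~1e-15

print(f"atom is ~{linear_ratio:.0e} times wider than its nucleus")
print(f"nucleus fills ~{volume_fraction:.0e} of the atom's volume")
```

So the nucleus, which carries nearly all of an atom’s mass, occupies only about one part in a thousand trillion of its volume — which is why the impenetrability of solids needs a quantum-mechanical explanation rather than a “billiard ball” one.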

Researchers have developed a device that can simultaneously measure six markers of brain health. The sensor, which is inserted through the skull into the brain, can pull off this feat thanks to an artificial intelligence (AI) system that pieces apart the six signals in real time.

Being able to continuously monitor biomarkers in patients with traumatic brain injury could improve outcomes by catching swelling or bleeding early enough for doctors to intervene. But most existing devices measure just one marker at a time. They also tend to be made with metal, so they can’t easily be used in combination with magnetic resonance imaging.



The interactions between light and nitroaromatic hydrocarbon molecules have important implications for chemical processes in our atmosphere that can lead to smog and pollution. However, changes in molecular geometry due to interactions with light can be very difficult to measure because they occur at sub-Angstrom length scales and femtosecond time scales.

Different types of cancer have unique molecular “fingerprints” which are detectable in early stages of the disease and can be picked up with near-perfect accuracy by small, portable scanners in just a few hours, according to a study published today in the journal Molecular Cell.

The discovery by researchers at the Centre for Genomic Regulation (CRG) in Barcelona sets the foundation for creating new, non-invasive diagnostic tests that detect different types of cancer faster and earlier than currently possible.

The study centers on the ribosome, the protein factory of the cell. For decades, ribosomes were thought to have the same blueprint across the human body. However, researchers discovered a hidden layer of complexity — tiny chemical modifications that vary between different tissues, developmental stages, and diseases.

Controlling matter at the atomic level has taken a major step forward, thanks to groundbreaking nanotechnology research by an international team of scientists led by physicists at the University of Bath.

This advancement has profound implications for fundamental scientific understanding. It is also likely to have important practical applications, such as transforming the way researchers develop new medications.

The result recalls the world’s smallest movie. In that film, single molecules, each consisting of two atoms bonded together, were magnified 100 million times and positioned frame by frame to tell a stop-motion story on an atomic scale.


Physicists are getting closer to controlling single-molecule chemical reactions – could this shape the future of pharmaceutical research?

Current laser technologies for the extended short-wave infrared (SWIR) spectral range rely on expensive and complex materials, limiting their scalability and affordability. To address these challenges, ICFO researchers have presented a novel approach based on colloidal quantum dots in an Advanced Materials article. The team managed to emit coherent light (a necessary condition to create lasers) in the extended SWIR range with large colloidal quantum dots made of lead sulfide (PbS).

This new CQD-based technology offers a solution to the aforementioned challenges while maintaining compatibility with silicon CMOS platforms (the technology used for constructing integrated circuit chips) for on-chip integration.

Their PbS colloidal quantum dots are the first semiconductor lasing material to cover such a broad wavelength range. Remarkably, the researchers accomplished this without altering the dots’ chemical composition. These results pave the way towards the realization of more practical and compact lasers.

A team of Rice University scientists has solved a long-standing problem in thermal imaging, making it possible to capture clear images of objects through hot windows. Imaging applications in a range of fields—such as security, surveillance, industrial research and diagnostics—could benefit from the research findings, which were reported in the journal Communications Engineering.

“Say you want to use a thermal camera to monitor a process inside a high-temperature reactor chamber,” said Gururaj Naik, an associate professor of electrical and computer engineering at Rice and corresponding author on the study. “The problem you’d be facing is that the thermal radiation emitted by the window itself overwhelms the camera, obscuring the view of objects on the other side.”

A possible solution could involve coating the window in a material that suppresses thermal light emission toward the camera, but this would also render the window opaque. To get around this issue, the researchers developed a coating that relies on an engineered asymmetry to filter out the thermal noise of a hot window, doubling the contrast of thermal imaging compared to conventional methods.

Scientists have developed the first electrically pumped continuous-wave semiconductor laser composed exclusively of elements from the fourth group of the periodic table—the “silicon group.”

Built from stacked ultrathin layers of silicon-germanium-tin and germanium-tin, this new laser is the first of its kind grown directly on a silicon wafer, opening up new possibilities for on-chip integrated photonics. The findings have been published in Nature Communications. The team includes researchers from Forschungszentrum Jülich (FZJ), the University of Stuttgart, and the Leibniz Institute for High Performance Microelectronics (IHP), together with their French partner CEA-Leti.

The rapid growth of artificial intelligence and the Internet of Things is driving demand for increasingly powerful, energy-efficient hardware. Optical data transmission, with its ability to transfer vast amounts of data while minimizing energy consumption, is already the preferred method for distances above 1 meter and is proving advantageous even for shorter distances. This development points towards future microchips featuring low-cost photonic integrated circuits (PICs), offering significant cost savings and improved performance.