
When 2D layered materials are made thinner (i.e., at the atomic scale), their properties can dramatically change, sometimes resulting in the emergence of entirely new features and in the loss of others. While new or emerging properties can be very advantageous for the development of new technologies, retaining some of the material’s original properties is often equally important.

Researchers at Tsinghua University, the Chinese Academy of Sciences and the Frontier Science Center for Quantum Information have recently been able to realize tailored Ising superconductivity in a sample of intercalated bulk niobium diselenide (NbSe2), a characteristic that is typically compromised in bulk samples of the material. The methods they used, outlined in a paper published in Nature Physics, could pave the way towards the fabrication of 2D thin-layered superconducting materials.

“Atomically thin 2D materials exhibit interesting properties that are often distinct from their bulk materials, which consist of hundreds and thousands of layers,” Shuyun Zhou, one of the researchers who carried out the study, told Phys.org. “However, atomically thin films/flakes are difficult to fabricate, and the emerging new properties are sometimes achieved by sacrificing some other important properties.”

“Dwell fatigue” is a phenomenon that can occur in titanium alloys held under sustained stress, as in a jet engine’s fan disc during takeoff. This peculiar failure mode can initiate microscopic cracks that drastically reduce a component’s lifetime.

The most widely used titanium alloy, Ti-6Al-4V, was not believed to exhibit dwell fatigue before the 2017 Air France Flight 66 incident, in which an Airbus en route from Paris to Los Angeles suffered fan disc failure over Greenland that forced an emergency landing. The analysis of that incident and several more recent concerns prompted the Federal Aviation Administration and European Union Aviation Safety Agency to coordinate work across the industry to determine the root causes of dwell fatigue.

According to experts, metals deform predominantly via dislocation slip—the movement of line defects in the underlying crystal lattice. Researchers hold that dwell fatigue can initiate when slip is restricted to narrow bands instead of occurring more homogeneously in three dimensions. The presence of nanometer-scale intermetallic Ti3Al precipitates promotes band formation, particularly when processing conditions allow for their long-range ordering.

When Courtney “CJ” Johnson pulls up footage from her Ph.D. dissertation, it’s like she’s watching an attempted break-in on a home security camera.

The intruder cases its target without setting a foot inside, looking for a point of entry. But this intruder is not your typical burglar. It’s a virus.

Filmed over two and a half minutes by pinpointing its location 1,000 times a second, the footage shows a tiny virus particle, thousands of times smaller than a grain of sand, as it lurches and bobs among tightly packed cells.
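The sampling figures quoted above imply an enormous number of position fixes per clip. A quick back-of-the-envelope check in Python (using only the rate and duration stated in the article):

```python
# Position fixes in the virus-tracking footage: the particle's location
# is pinpointed 1,000 times per second over two and a half minutes.
rate_hz = 1000          # localizations per second
duration_s = 2.5 * 60   # two and a half minutes, in seconds

n_points = int(rate_hz * duration_s)
print(n_points)  # 150000
```

That is 150,000 individual localizations for a single two-and-a-half-minute recording.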

Cosmological observations of the orbits of stars and galaxies enable clear conclusions to be drawn about the attractive gravitational forces that act between the celestial bodies.

The astonishing finding: Visible matter is far from sufficient to explain the formation and motion of galaxies. This suggests that there exists another, so far unknown, type of matter. Accordingly, in 1933 the Swiss physicist and astronomer Fritz Zwicky inferred the existence of what is now known as dark matter. Dark matter is a postulated form of matter that is not directly visible but interacts via gravity, and accounts for roughly five times more mass than the matter with which we are familiar.

Recently, following a precision experiment developed at the Albert Einstein Center for Fundamental Physics (AEC) at the University of Bern, an international research team succeeded in significantly narrowing the scope for the existence of dark matter. With more than 100 members, the AEC is one of the leading international research organizations in the field of particle physics. The findings of the team, led by Bern, have now been published in Physical Review Letters.

Gamma-ray bursts (GRBs) have been detected by satellites orbiting Earth as luminous flashes of the most energetic gamma-ray radiation lasting milliseconds to hundreds of seconds. These catastrophic blasts occur in distant galaxies, billions of light years from Earth.

A sub-type of GRB known as a short-duration GRB starts life when two neutron stars collide. These ultra-dense stars have the mass of our sun compressed down to half the size of a city like London, and in the final moments of their life, just before triggering a GRB, they generate ripples in space-time—known to astronomers as gravitational waves.

Until now, space scientists have largely agreed that the “engine” powering such energetic and short-lived bursts must always come from a newly formed black hole (a region of space where gravity is so strong that nothing, not even light, can escape from it). However, new research by an international team of astrophysicists, led by Dr. Nuria Jordana-Mitjans at the University of Bath, is challenging this scientific orthodoxy.

Research led by the University of Amsterdam has demonstrated that elusive radiation coming from black holes can be studied by mimicking it in the lab.

Black holes are the most extreme objects in the universe, packing so much mass into so little space that nothing—not even light—can escape their gravitational pull once it gets close enough.

Understanding black holes is key to unraveling the most fundamental laws governing the cosmos, because they represent the limits of two of the best-tested theories of physics: the general theory of relativity, which describes gravity as resulting from the (large-scale) warping of spacetime by massive objects, and the theory of quantum mechanics, which describes physics at the smallest length scales. To fully describe black holes, we would need to stitch these two theories together and form a theory of quantum gravity.

Aspiring bakers are frequently called upon to adapt award-winning recipes based on differing kitchen setups. Someone might use an eggbeater instead of a stand mixer to make prize-winning chocolate chip cookies, for instance.

Being able to reproduce a recipe in different situations and with varying setups is critical for both talented chefs and scientists, the latter of whom face a similar problem of adapting and reproducing their own “recipes” when trying to validate and work with new AI models. These models have applications in fields ranging from climate analysis to brain research.

“When we talk about data, we have a practical understanding of the digital assets we deal with,” said Eliu Huerta, scientist and lead for Translational AI at the U.S. Department of Energy’s (DOE) Argonne National Laboratory. “With an AI model, it’s a little less clear; are we talking about data structured in a smart way, or is it computing, or software, or a mix?”

Large AI networks like language models make mistakes or contain outdated information. MEND shows how to update LLMs without changing the whole network.

Large AI models have become standard in many AI applications, such as natural language processing, image analysis, and image generation. The models, such as OpenAI’s GPT-3, often have more diverse capabilities than small, specialized models and can be further improved via finetuning.

However, even the largest AI models regularly make mistakes and additionally contain outdated information. GPT-3’s most recent data is from 2019 – when Theresa May was still prime minister.
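MEND’s actual method trains a hypernetwork to transform fine-tuning gradients into targeted parameter edits; the details are beyond this article. As a toy illustration of the underlying idea of localized model editing (all names and sizes below are hypothetical), one can change a single input–output association of one weight matrix with a rank-1 update, leaving the rest of the matrix almost untouched:

```python
import numpy as np

# Toy sketch, NOT the MEND algorithm itself: edit one layer's weight
# matrix so that a specific input x now maps to a desired output y_new,
# via a rank-1 correction rather than retraining the whole network.
rng = np.random.default_rng(0)
W = rng.normal(size=(8, 8))   # stand-in for one layer's weights
x = rng.normal(size=8)        # input whose output we want to change
y_new = rng.normal(size=8)    # desired new output for x

# Rank-1 update: (W + delta) @ x == y_new exactly, while inputs
# orthogonal to x are entirely unaffected.
delta = np.outer(y_new - W @ x, x) / np.dot(x, x)
W_edit = W + delta

assert np.allclose(W_edit @ x, y_new)
```

The appeal of such localized edits is exactly what the article describes: correcting one outdated fact without the cost, and the side effects, of updating every parameter.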

Rats love to dance.

The team had two competing hypotheses: The first was that the optimal music tempo for beat synchronicity would be determined by the time constant of the body. This is different between species and much faster for rats compared to humans (think of how quickly a rat can scuttle). The second was that the optimal tempo would instead be determined by the time constant of the brain, which is surprisingly similar across species.

“After conducting our research with 20 human participants and 10 rats, our results suggest that the optimal tempo for beat synchronization depends on the time constant in the brain,” said Takahashi. “This demonstrates that the animal brain can be useful in elucidating the perceptual mechanisms of music.”

The rats were fitted with wireless, miniature accelerometers, which could measure the slightest head movements. Human participants also wore accelerometers on headphones. They were then played one-minute excerpts from Mozart’s Sonata for Two Pianos in D Major, K. 448, at four different tempos: 75%, 100%, 200% and 400% of the original speed.
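The four playback conditions are simple multiples of the original speed. As a sketch, assuming a purely illustrative base tempo of 120 bpm (the article does not state the tempo of K. 448), the conditions work out as:

```python
# The study's four playback conditions, as multipliers of the original speed.
# base_bpm is a HYPOTHETICAL value for illustration; the article does not
# give the piece's actual tempo.
base_bpm = 120.0
multipliers = [0.75, 1.0, 2.0, 4.0]

tempos = [base_bpm * m for m in multipliers]
print(tempos)  # [90.0, 120.0, 240.0, 480.0]
```

At the 400% condition, even a modest base tempo lands far above the range either species could plausibly track, which is what makes the comparison across conditions informative.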