By testing the boundaries of reality, Spanish-language authors have created a sublime counterpart to experimental physics.
Einstein’s theory of gravity, general relativity, has passed all tests to date with predictions that are spot-on. One prediction that remains unconfirmed is “gravitational wave memory”: the prediction that a passing gravitational wave will permanently change the distance between cosmic objects.
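In rough terms, the memory effect is a permanent offset left in the gravitational wave strain after the wave has passed. A minimal sketch, using the standard relation between strain and the separation of two free-falling test masses (the notation is generic, not drawn from the study itself):

```latex
% Schematic statement of gravitational wave memory (generic notation, not from the paper)
\Delta h_{\mathrm{mem}} = \lim_{t\to+\infty} h(t) - \lim_{t\to-\infty} h(t) \neq 0,
\qquad
\Delta L \approx \tfrac{1}{2}\,\Delta h_{\mathrm{mem}}\,L,
```

so a nonzero residual strain translates into a lasting change ΔL in a separation L, which is exactly the permanent displacement the memory effect describes.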
Supernovae, collapsing stars that explode outward, are thought to be generators of gravitational waves, though no gravitational waves from a supernova have yet been definitively detected by the interferometers on Earth. Nor has the gravitational wave memory effect been seen, from mergers or supernovae, owing to the interferometers’ limited sensitivity at wave frequencies below 10 hertz.
But a new study now presents an approach to detecting the effect with existing gravitational wave observatories. The paper is published in Physical Review Letters.
A group of Brazilian researchers has presented an innovative proposal to resolve a decades-old debate among theoretical physicists: How many fundamental constants are needed to describe the observable universe? Here, the term “fundamental constants” refers to the basic standards needed to measure everything.
The study is published in the journal Scientific Reports.
The group argues that the number of fundamental constants depends on the type of space-time in which the theories are formulated, and that in a relativistic space-time this number can be reduced to a single constant, used to define the standard of time. The study is an original contribution to the controversy sparked in 2002 by a famous article by Michael Duff, Lev Okun and Gabriele Veneziano published in the Journal of High Energy Physics.
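To give a feel for the argument (an illustrative reading, not a formula from the study): in a relativistic space-time the speed of light converts durations into lengths, so once c is fixed by convention a single time standard also fixes the length standard, which is how the SI metre is already defined.

```latex
% Illustrative only: with c fixed by convention, a time standard implies a length standard
\ell = c\,\tau,
\qquad
1\ \mathrm{m} \equiv c \times \tfrac{1}{299\,792\,458}\ \mathrm{s}.
```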
The notion of entropy grew out of an attempt at perfecting machinery during the industrial revolution. A 28-year-old French military engineer named Sadi Carnot set out to calculate the ultimate efficiency of the steam-powered engine. In 1824, he published a 118-page book titled Reflections on the Motive Power of Fire, which he sold on the banks of the Seine for 3 francs. Carnot’s book was largely disregarded by the scientific community, and he died several years later of cholera. His body was burned, as were many of his papers. But some copies of his book survived, and in them lay the embers of a new science of thermodynamics — the motive power of fire.
Carnot realized that the steam engine is, at its core, a machine that exploits the tendency for heat to flow from hot objects to cold ones. He drew up the most efficient engine conceivable, establishing a bound on the fraction of heat that can be converted to work, a result now known as Carnot’s theorem. His most consequential statement comes as a caveat on the last page of the book: “We should not expect ever to utilize in practice all the motive power of combustibles.” Some energy will always be dissipated through friction, vibration, or another unwanted form of motion. Perfection is unattainable.
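In modern notation (absolute temperature came later, so this is not how Carnot himself wrote it), that bound says the efficiency of any engine running between a hot and a cold reservoir cannot exceed:

```latex
% Carnot's theorem: maximum fraction of heat convertible to work
\eta = \frac{W}{Q_{\mathrm{h}}} \le 1 - \frac{T_{\mathrm{c}}}{T_{\mathrm{h}}},
```

where Q_h is the heat drawn from the hot reservoir, W the work extracted, and T_h and T_c the absolute temperatures of the hot and cold reservoirs.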
Reading through Carnot’s book a few decades later, in 1865, the German physicist Rudolf Clausius coined a term for the proportion of energy that’s locked up in futility. He called it “entropy,” after the Greek word for transformation. He then laid out what became known as the second law of thermodynamics: “The entropy of the universe tends to a maximum.”
Physicists of the era erroneously believed that heat was a fluid (called “caloric”). Over the following decades, they realized heat was rather a byproduct of individual molecules bumping around. This shift in perspective allowed the Austrian physicist Ludwig Boltzmann to reframe and sharpen the idea of entropy using probabilities.
Boltzmann distinguished the microscopic properties of molecules, such as their individual locations and velocities, from bulk macroscopic properties of a gas like temperature and pressure…
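As a toy illustration of that distinction (our own example, not taken from the excerpt): for a set of coins, the macrostate “number of heads” lumps together many microstates, the individual head/tail sequences, and Boltzmann’s entropy simply counts them. A minimal sketch in Python:

```python
# Toy illustration (not from the article): microstates vs. macrostates for N coins,
# with Boltzmann's entropy S = k_B * ln(W), where W counts the microstates.
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def microstate_count(n_coins: int, n_heads: int) -> int:
    """Number of head/tail sequences (microstates) realizing the macrostate 'n_heads'."""
    return math.comb(n_coins, n_heads)

def boltzmann_entropy(w: int) -> float:
    """Boltzmann's S = k_B ln W for a macrostate with W microstates."""
    return K_B * math.log(w)

N = 100
for heads in (0, 25, 50):
    w = microstate_count(N, heads)
    print(f"{heads:3d} heads: W = {w:.3e}, S = {boltzmann_entropy(w):.3e} J/K")

# The 50-heads macrostate has by far the most microstates, and hence the highest
# entropy -- the probabilistic reading of why entropy tends toward a maximum.
```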
When analyzing artworks, it is crucial to understand the visual clarity of a composition. Inspired by digital artists, Okinawa Institute of Science and Technology (OIST) researchers from the Mechanics and Materials Unit have created a metric to quantify clarity in digital images. As a result, scientists can accurately capture changes in structure during artistic processes and physical transformations.
This new metric can improve analysis and decision-making across the scientific and creative domains, potentially transforming how we understand and evaluate the structure of images. It has been tested on digital artworks and physical systems. The research is published in the journal PNAS.
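The summary does not say how the metric is computed, so the following is only a generic stand-in for the idea of scoring structural clarity in an image, not the OIST metric: a minimal sketch using the variance of the image Laplacian, a common sharpness proxy in image analysis.

```python
# Generic illustration only -- NOT the OIST metric described in the article.
# Scores the structural "clarity" of a grayscale image via the variance of its
# Laplacian: crisply delineated structure yields a higher score than smooth blur.
import numpy as np

def clarity_proxy(image: np.ndarray) -> float:
    """Variance of the Laplacian of a 2D grayscale image (a standard sharpness proxy)."""
    img = image.astype(float)
    gy, gx = np.gradient(img)        # first derivatives along rows and columns
    gyy, _ = np.gradient(gy)         # second derivative along rows
    _, gxx = np.gradient(gx)         # second derivative along columns
    return float(np.var(gxx + gyy))  # variance of the Laplacian

# A crisp step edge scores far higher than a smooth ramp spanning the same range.
crisp = np.zeros((64, 64)); crisp[:, 32:] = 1.0
smooth = np.tile(np.linspace(0.0, 1.0, 64), (64, 1))
print(clarity_proxy(crisp), clarity_proxy(smooth))
```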
In case dark matter didn’t seem mysterious enough, a new study proposes that it could have arisen before the Big Bang.
Conventional thinking holds that the Big Bang was the beginning of everything – matter, dark matter, space, energy, all of it. After the event itself, the Universe went through a period of cosmic inflation, which saw its size swell by a factor of 10 septillion within an unfathomable fraction of a second.
But some theories suggest that this inflation period actually occurred before what we call the Big Bang. And now, physicists at the University of Texas (UT) at Austin have proposed that dark matter was formed during this brief window.
The reliable control of traveling waves emerging from the coupling of oscillations and diffusion in physical, chemical and biological systems is a long-standing challenge within the physics community. Effective approaches to control these waves help to improve the present understanding of reaction-diffusion systems and their underlying dynamics.
Researchers at Université libre de Bruxelles (ULB) and Université de Rennes recently demonstrated a promising approach to controlling chemical waves in a type of fluid flow known as hyperbolic flow. Their experiments, outlined in a recent Physical Review Letters paper, control chemical waves via the stretching and compression of the fluid.
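For orientation, such systems are usually described by an advection-reaction-diffusion equation; the schematic below is a textbook-style form, not necessarily the exact model used in the paper, with a hyperbolic flow being a stagnation-point flow that stretches the fluid along one axis and compresses it along the other:

```latex
% Schematic advection-reaction-diffusion model (not necessarily the paper's exact equations)
\frac{\partial c}{\partial t} + \mathbf{v}\cdot\nabla c = D\,\nabla^{2} c + f(c),
\qquad
\mathbf{v} = (\varepsilon x,\, -\varepsilon y),
```

where c is the concentration of a reacting species, D its diffusivity, f(c) the reaction kinetics, and ε the strain rate that sets how strongly the flow stretches and compresses the medium.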
“At a summer school in Corsica, discussions between the Brussels and Rennes team triggered the curiosity to see how chemical waves studied at ULB in Brussels would behave in hyperbolic flows analyzed in Rennes,” Anne De Wit, senior author of the paper, told Phys.org. “The primary objective was to see how a non-trivial flow would influence the dynamics of waves.”