An old thought experiment now appears in a new light. In 1935 Erwin Schrödinger formulated a thought experiment designed to capture the paradoxical nature of quantum physics. A group of researchers led by Gerhard Rempe, Director of the Department of Quantum Dynamics at the Max Planck Institute of Quantum Optics, has now realized an optical version of Schrödinger’s thought experiment in the laboratory. In this instance, pulses of laser light play the role of the cat. The insights gained from the project open up new prospects for enhanced control of optical states that could in future be used for quantum communication.
Aeronautics giant Airbus today announced that it is creating a global competition to encourage developers to find ways quantum computing can be applied to aircraft design.
Quantum computing is one of many next-generation computing architectures being explored as engineers worry that traditional computing is reaching its physical limits.
Computers today process information using bits, either 0s or 1s, stored in electrical circuits made up of transistors. Quantum computers harness the power of quantum systems, such as atoms that can simultaneously exist in multiple states and can be used as “quantum bits” or “qubits.” These can theoretically handle far more complex calculations.
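The contrast between bits and qubits can be made concrete with a minimal sketch. This is an illustrative toy only (not tied to any quantum hardware or library): a qubit is represented as a pair of amplitudes, and a Hadamard gate puts the classical-like state |0⟩ into an equal superposition of 0 and 1.

```python
import math

# A single qubit is described by two amplitudes (a, b) with |a|^2 + |b|^2 = 1;
# |a|^2 is the probability of measuring 0, and |b|^2 the probability of measuring 1.

def hadamard(state):
    """Apply a Hadamard gate, which turns a basis state into an equal superposition."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def probabilities(state):
    """Measurement probabilities for outcomes 0 and 1 (the Born rule)."""
    a, b = state
    return (abs(a) ** 2, abs(b) ** 2)

qubit = (1.0, 0.0)        # start in the classical-like state |0>
qubit = hadamard(qubit)   # now simultaneously "0 and 1"
p0, p1 = probabilities(qubit)  # each outcome now has probability 0.5
```

A classical bit would sit at exactly (1, 0) or (0, 1); the power of qubits comes from states in between, and from entangling many of them, which this single-qubit sketch does not attempt to show.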
An international research team led by the University of Liverpool and McMaster University has made a significant breakthrough in the search for new states of matter.
CERN has revealed plans for a gigantic successor to the LHC, the biggest machine ever built. Particle physicists, it seems, will never stop asking for ever larger big-bang machines. But where do the limits lie for ordinary society, in terms of both cost and existential risk?
CERN boffins are already running a mega-experiment at the LHC, a 27 km circular particle collider costing several billion euros, to study matter as it existed fractions of a second after the big bang and to hunt for the smallest possible particle (though one may ask how they could ever know they had found it). Now they profess to be somewhat upset that they have found no particles beyond the standard model, that is, nothing unexpected. To remedy this, particle physicists would like to build an even larger “Future Circular Collider” (FCC) near Geneva, where CERN enjoys extraterritorial status, with a 100 km ring and a price tag of about 24 billion euros.
Experts point out that this line of research could be as limitless as the universe itself. The UK’s former Chief Scientific Advisor, Prof Sir David King, told the BBC: “We have to draw a line somewhere otherwise we end up with a collider that is so large that it goes around the equator. And if it doesn’t end there perhaps there will be a request for one that goes to the Moon and back.”
“There is always going to be more deep physics to be conducted with larger and larger colliders. My question is to what extent will the knowledge that we already have be extended to benefit humanity?”
This was the first part in an interview series with Scott Aaronson; this segment is on quantum computing. Other segments are on Existential Risk, consciousness (including Scott’s thoughts on IIT), and whether the universe is discrete or continuous.
Interview with Scott Aaronson, covering whether quantum computers could have subjective experience, whether information is physical, and what might be important for consciousness. He touches on classic philosophical conundrums, and on the observation that while people want to be thoroughgoing materialists, brain states, unlike the states of traditional computers, are not obviously copyable. Aaronson wrote about this in his paper ‘The Ghost in the Quantum Turing Machine’ (https://arxiv.org/abs/1306.0159). Scott also critiques Tononi’s integrated information theory (IIT).
Back in the first moment of the universe, everything was hot and dense and in perfect balance. There weren’t any particles as we’d understand them, much less any stars or even the vacuum that permeates space today. The whole of space was filled with homogeneous, formless, compressed stuff.
Then, something slipped. All that monotonous stability became unstable. Matter won out over its weird cousin, antimatter, and came to dominate the whole of space. Clouds of that matter formed and collapsed into stars, which became organized into galaxies. Everything that we know about started to exist.
Electronegativity is one of the most well-known models for explaining why chemical reactions occur. Now, Martin Rahm from Chalmers University of Technology, Sweden, has redefined the concept with a new, more comprehensive scale. His work, undertaken with colleagues including a Nobel Prize-winner, has been published in the Journal of the American Chemical Society.
The theory of electronegativity is used to describe how strongly different atoms attract electrons. By using electronegativity scales, one can predict the approximate charge distribution in different molecules and materials, without needing to resort to complex quantum mechanical calculations or spectroscopic studies. This is vital for understanding all kinds of materials, as well as for designing new ones. Used daily by chemists and materials researchers all over the world, the concept originates from Swedish chemist Jöns Jacob Berzelius’ research in the 19th century and is widely taught at high-school level.
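How an electronegativity scale is used in practice can be sketched with a toy example. This illustration uses the classic Pauling scale and common textbook cutoffs for bond character, not Rahm's new scale (the specific thresholds of 0.4 and 1.7 are the usual classroom convention, not a statement about his work):

```python
# Tabulated Pauling electronegativities (dimensionless), from standard tables.
PAULING = {"H": 2.20, "C": 2.55, "N": 3.04, "O": 3.44, "Na": 0.93, "Cl": 3.16}

def bond_character(elem_a, elem_b):
    """Classify a bond by electronegativity difference, using the common
    textbook cutoffs (~0.4 for polar covalent, ~1.7 for ionic)."""
    delta = abs(PAULING[elem_a] - PAULING[elem_b])
    if delta >= 1.7:
        return "ionic"
    if delta >= 0.4:
        return "polar covalent"
    return "nonpolar covalent"

# The more electronegative atom pulls electron density toward itself:
# Na-Cl has delta = 2.23, O-H has delta = 1.24, C-H has delta = 0.35.
examples = {pair: bond_character(*pair)
            for pair in [("Na", "Cl"), ("O", "H"), ("C", "H")]}
```

This is exactly the kind of back-of-the-envelope charge-distribution estimate the article describes: no quantum mechanical calculation is needed, just a lookup and a subtraction.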
Now, Martin Rahm, Assistant Professor in Physical Chemistry at Chalmers University of Technology, has developed a brand-new scale of electronegativity.
The production of entropy, which means increasing the degree of disorder in a system, is an inexorable tendency in the macroscopic world owing to the second law of thermodynamics. This makes the processes described by classical physics irreversible and, by extension, imposes a direction on the flow of time. However, the tendency does not necessarily apply in the microscopic world, which is governed by quantum mechanics. The laws of quantum physics are reversible in time, so in the microscopic world, there is no preferential direction to the flow of phenomena.
One of the most important aims of contemporary scientific research is knowing exactly where the transition occurs from the quantum world to the classical world and why it occurs — in other words, finding out what makes the production of entropy predominate. This aim explains the current interest in studying mesoscopic systems, which are not as small as individual atoms but nevertheless display well-defined quantum behavior.
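The macroscopic tendency toward entropy production can be illustrated with a toy, coarse-grained simulation (an illustrative sketch only, unrelated to any specific experiment): particles start concentrated in one cell of a box, and the Shannon entropy of their occupation histogram rises as they random-walk toward a uniform spread.

```python
import math
import random

random.seed(0)  # fixed seed so the run is reproducible

def shannon_entropy(counts):
    """Shannon entropy (in bits) of a coarse-grained occupation histogram."""
    total = sum(counts)
    h = 0.0
    for c in counts:
        if c > 0:
            p = c / total
            h -= p * math.log2(p)
    return h

N_BINS, N_PARTICLES, N_STEPS = 10, 1000, 500
positions = [0] * N_PARTICLES  # all particles start in bin 0: maximal order

initial = [0] * N_BINS
initial[0] = N_PARTICLES
h_start = shannon_entropy(initial)  # exactly 0 bits: a single occupied cell

# Each particle takes random +/-1 steps, reflected at the walls of the box.
for _ in range(N_STEPS):
    for i in range(N_PARTICLES):
        positions[i] = min(N_BINS - 1, max(0, positions[i] + random.choice((-1, 1))))

final = [0] * N_BINS
for p in positions:
    final[p] += 1
h_end = shannon_entropy(final)  # approaches log2(10) ~ 3.32 bits as the spread becomes uniform
```

Note that each individual step here is as reversible as its mirror image; the one-way growth of entropy appears only at the coarse-grained, statistical level, which is the distinction the passage above draws between the microscopic and macroscopic descriptions.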