However, these AI algorithms cannot explain the thought processes behind their decisions. A computer that masters protein folding and also tells researchers more about the rules of biology is much more useful than a computer that folds proteins without explanation.
Therefore, AI researchers like me are now turning our efforts toward developing AI algorithms that can explain themselves in a manner that humans can understand. If we can do this, I believe that AI will be able to uncover and teach people new facts about the world that have not yet been discovered, leading to new innovations.
Researchers at the Paul Scherrer Institute (PSI) have put forward a detailed plan of how faster and better defined quantum bits — qubits — can be created. The central elements are magnetic atoms from the class of so-called rare-earth metals, which would be selectively implanted into the crystal lattice of a material. Each of these atoms represents one qubit. The researchers have demonstrated how these qubits can be activated, entangled, used as memory bits, and read out. They have now published their design concept and supporting calculations in the journal PRX Quantum.
On the way to quantum computers, an initial requirement is to create so-called quantum bits or “qubits”: memory bits that can, unlike classical bits, take on not only the binary values of zero and one, but also any arbitrary combination of these states. “With this, an entirely new kind of computation and data processing becomes possible, which for specific applications means an enormous acceleration of computing power,” explains PSI researcher Manuel Grimm, first author of a new paper on the topic of qubits.
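As an illustrative sketch (not from the PSI paper), a qubit's state can be represented as a normalized two-component complex vector, and the "arbitrary combination" Grimm describes is a weighted sum of the zero and one states whose squared amplitudes give the measurement probabilities:

```python
import numpy as np

# A classical bit is 0 or 1; a qubit state is alpha*|0> + beta*|1>
# with |alpha|^2 + |beta|^2 = 1 (a normalized complex 2-vector).
ket0 = np.array([1.0, 0.0], dtype=complex)  # |0>
ket1 = np.array([0.0, 1.0], dtype=complex)  # |1>

# An equal superposition of zero and one:
psi = (ket0 + ket1) / np.sqrt(2)

# Born rule: the probability of measuring 0 or 1 is the squared amplitude.
probs = np.abs(psi) ** 2
print(probs)  # [0.5 0.5]
```

Any other normalized choice of the two amplitudes is an equally valid qubit state, which is the freedom classical bits lack.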
Scientists at the U.S. Department of Energy’s Ames Laboratory and collaborators at Brookhaven National Laboratory and the University of Alabama at Birmingham have discovered a new light-induced switch that twists the crystal lattice of the material, switching on a giant electron current that appears to be nearly dissipationless. The discovery was made in a category of topological materials that holds great promise for spintronics, topological effect transistors, and quantum computing.
Weyl and Dirac semimetals can host exotic, nearly dissipationless electron conduction, taking advantage of a unique state in the material’s crystal lattice and electronic structure that protects the electrons from scattering. These anomalous electron transport channels, protected by symmetry and topology, don’t normally occur in conventional metals such as copper. After decades of being described only in the context of theoretical physics, there is growing interest in fabricating, exploring, refining, and controlling their topologically protected electronic properties for device applications. For example, wide-scale adoption of quantum computing requires building devices in which fragile quantum states are protected from impurities and noisy environments. One approach to achieving this is topological quantum computation, in which qubits are based on “symmetry-protected” dissipationless electric currents that are immune to noise.
“Light-induced lattice twisting, or a phononic switch, can control the crystal inversion symmetry and photogenerate giant electric current with very small resistance,” said Jigang Wang, senior scientist at Ames Laboratory and professor of physics at Iowa State University. “This new control principle does not require static electric or magnetic fields, and has much faster speeds and lower energy cost.”
Direct observation of an ion moving through a Bose-Einstein condensate identifies the effect of ion-atom collisions on charge transport in an ultracold gas.
When you expose mobile electrical charges in a medium to an electric field, current flows. The charges are accelerated by the field, but collisions within the medium give rise to a kind of friction effect, which limits the velocity of the charges and thus the current. This universal concept, called diffusive transport, describes a large range of media, such as metallic conductors, electrolytic solutions, and gaseous plasmas. But in a quantum system, such as a superconductor or a superfluid, other collective effects can influence transport through the medium. Now, a group led by Florian Meinert and Tilman Pfau, both of the University of Stuttgart, Germany, has carried out charge-transport experiments with a single ion traversing a Bose-Einstein condensate (BEC), which is a quantum gas of cold neutral atoms [1]. The precise tracking of the ion shows that the transport is diffusive and reveals the character of the ion-atom collisions.
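For a back-of-the-envelope feel for diffusive transport, a Drude-style estimate (illustrative only, not the analysis from the Stuttgart experiment) treats the field as accelerating a charge between collisions, with the collisions capping the velocity at a steady drift value v = qEτ/m:

```python
# Drude-style sketch of diffusive transport: between collisions a charge q
# accelerates freely in a field E; collisions every tau seconds cap the
# velocity at v_drift = q * E * tau / m. All numbers below are illustrative.
Q_E = 1.602e-19   # elementary charge, in coulombs
M_E = 9.109e-31   # electron mass, in kilograms

def drift_velocity(e_field, tau, q=Q_E, m=M_E):
    """Steady-state drift velocity (m/s) for mean free time tau (s)."""
    return q * e_field * tau / m

# Example: a modest field of 100 V/m and a metal-like tau of 1e-14 s
# gives a drift velocity of only a fraction of a metre per second.
print(f"{drift_velocity(100.0, 1e-14):.3f} m/s")
```

The tiny drift velocity compared with the enormous thermal speeds of the carriers is exactly the "friction effect" the article describes.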
Martinus J.G. Veltman, a Dutch theoretical physicist who was awarded the Nobel Prize for work that explained the structure of some of the fundamental forces in the universe, helping to lay the groundwork for the development of the Standard Model, the backbone of particle physics, died on Jan. 4 in Bilthoven, the Netherlands. He was 89.
His death was announced by the National Institute for Subatomic Physics in the Netherlands. No cause was given.
There are four known fundamental forces in the universe: gravity, electromagnetism, the strong force that bonds subatomic particles together, and the weak force that is responsible for particle decay. Since the discovery of the last two forces in the first half of the 20th century, physicists have looked for a unified theory that could account for the existence of all four.
A team of researchers affiliated with several institutions in China has used drones to create a prototype of a small airborne quantum network. In their paper published in the journal Physical Review Letters, the researchers describe sending entangled particles from one drone to another and from a drone to the ground.
Computer scientists, physicists and engineers have been working over the last several years toward building a usable quantum network. Doing so would involve sending entangled particles between users, and the result would be the most secure network ever made. As part of that effort, researchers have sent entangled particles over fiber cables, between towers and even from satellites to the ground. In this new effort, the researchers have added a new element: drones.
To build a long-range quantum network, satellites appear to be the ideal solution. But for smaller networks, such as for communications between users in the same city, another option is needed. While towers can be of some use, they are subject to weather and blockage, intentional or otherwise. To get around this problem, the researchers used drones to carry the signals.
Scientists at the Institute of Physics of the University of Tartu have found a way to develop optical quantum computers of a new type. Central to the discovery are rare earth ions that have certain characteristics and can act as quantum bits. These would give quantum computers ultrafast computation speed and better reliability compared to earlier solutions. The University of Tartu researchers Vladimir Hizhnyakov, Vadim Boltrushko, Helle Kaasik and Yurii Orlovskii published the results of their research in the scientific journal Optics Communications.
While in ordinary computers the units of information are binary digits, or bits, in quantum computers the units are quantum bits, or qubits. In an ordinary computer, information is mostly carried by electricity in memory cells built from field-effect transistors, but in a quantum computer, depending on its type, the information carriers are much smaller particles, for example ions, photons or electrons. The qubit information may be carried by a certain characteristic of this particle (for example, the spin of an electron or the polarization of a photon), which can have two states. While an ordinary bit takes the value 0 or 1, a qubit can also take intermediate combinations of these values; such an intermediate state is called a superposition. This property gives quantum computers the ability to solve tasks that ordinary computers cannot complete within a reasonable time.
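One concrete way to see why classical machines struggle with such tasks is to count what it takes to describe many qubits at once: the joint state of n qubits needs 2**n complex amplitudes. A small sketch (illustrative, not tied to the Tartu work):

```python
import numpy as np

# The joint state of n qubits is a normalized vector of 2**n complex
# amplitudes; a classical simulator must track every one of them, which
# is why the classical cost grows exponentially with the qubit count.
def zero_state(n):
    state = np.zeros(2**n, dtype=complex)
    state[0] = 1.0  # all qubits in |0>
    return state

for n in (1, 10, 30):
    print(f"{n} qubits -> {2**n:,} amplitudes")
```

Thirty qubits already require over a billion amplitudes, while the quantum hardware carries that state natively in thirty physical carriers.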
The most recent observations at both quantum and cosmological scales are casting serious doubts on our current models. At the quantum scale, for instance, the latest measurement of the proton radius in electronic hydrogen yielded a radius about 4% smaller than the one predicted by the standard model of particle physics. At the cosmological scale, the observations of black holes and galactic formation pointing toward a radically different cosmological model are overwhelming: black holes have been shown to be much older than their host galaxies, galaxy formation appears much younger than our models estimate, and there is evidence of at least 64 black holes aligned along their axes of rotation, suggesting a large-scale spatial coherence in angular momentum that our current models cannot predict. In such a scenario, it should come as no surprise that the best available idea for unifying quantum theory and relativity, and thus connecting the very small to the very big, is that the universe is actually a neural network, and that a theory of everything would be based on it.
As explained in Targemann’s interview with Vanchurin on Futurism, Vanchurin’s work proposes that we live in a huge neural network that governs everything around us.
“It’s a possibility that the entire universe on its most fundamental level is a neural network… With this respect it could be considered as a proposal for the theory of everything, and as such it should be easy to prove it wrong,” says Vitaly Vanchurin.

The idea was born when he was studying deep machine learning. He wrote the book “Towards a theory of machine learning” in order to apply the methods of statistical mechanics to the behavior of neural networks, and he saw that in certain limits the learning (or training) dynamics of neural networks are very similar to quantum dynamics. So he decided to explore the idea that the physical world is a neural network.
The incredible physics behind quantum computing. While today’s computers—referred to as classical computers—continue to become more and more powerful, there is a ceiling to their advancement due to the physical limits of the materials used to make them. Quantum computing allows physicists and researchers to exponentially increase computation power, harnessing potential parallel realities to do so.
Quantum computer chips are astoundingly small, about the size of a fingernail. Scientists have to not only build the computer itself but also the ultra-protected environment in which they operate. Total isolation is required to eliminate vibrations and other external influences on synchronized atoms; if the atoms become ‘decoherent’ the quantum computer cannot function.
“You need to create a very quiet, clean, cold environment for these chips to work in,” says quantum computing expert Vern Brownell. The coldest temperature possible in physics is -273.15 degrees C. The rooms required for quantum computing are -273.14 degrees C, which is 150 times colder than outer space. It is complex and mind-boggling work, but the potential for computation that harnesses the power of parallel universes is worth the chase.
Check out Chris Bernhardt’s book “Quantum Computing for Everyone” (MIT Press) at http://amzn.to/3nSg5a8

TRANSCRIPT:
MICHIO KAKU: Years ago, we physicists predicted the end of Moore’s Law, which says that computer power doubles every 18 months. But we also, on the other hand, proposed a positive program: perhaps molecular computers and quantum computers can take over when silicon power is exhausted. In fact, already we see a slowing down of Moore’s Law. Computer power simply cannot maintain its rapid exponential rise using standard silicon technology. The two basic problems are heat and leakage. That’s the reason why the age of silicon will eventually come to a close. No one knows when, but as I mentioned we can already see the slowing down of Moore’s Law, and in 10 years it could flatten out completely. So what’s the problem? The problem is that a Pentium chip today has a layer almost down to 20 atoms across. When that layer gets down to about five atoms across, it’s all over. You have two effects. First, heat: the heat generated will be so intense that the chip will melt. You can literally fry an egg on top of the chip, and the chip itself begins to disintegrate. And second, leakage: you don’t know where the electron is anymore. The quantum theory takes over. The Heisenberg Uncertainty Principle says you don’t know where that electron is anymore, meaning it could be outside the wire, outside the Pentium chip or inside the Pentium chip. So there is an ultimate limit, set by the laws of thermodynamics and by the laws of quantum mechanics, on how much computing you can do with silicon.
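Kaku's 18-month doubling figure can be turned into a quick growth estimate. The sketch below is illustrative arithmetic only, compounding the doublings over a given span of years:

```python
# Moore's-law arithmetic: if computing power doubles every 18 months
# (1.5 years), it grows by a factor of 2**(years / 1.5) over `years`.
def moore_factor(years, doubling_period_years=1.5):
    return 2 ** (years / doubling_period_years)

print(f"{moore_factor(10):.1f}x over a decade")  # roughly a 100x increase
```

That hundredfold-per-decade pace is exactly what Kaku argues silicon can no longer sustain once feature sizes approach atomic scale.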
VERN BROWNELL: I refer to today’s computers as classical computers. They compute largely in the same way they have for the past 60 or 70 years, since John Von Neumann and others invented the first electronic computers back in the ‘40s. And we’ve had amazing progress over those years. Think of all the developments there’ve been on the hardware side and the software side over those 60 or 70 years and how much energy and development has been put into those areas. And we’ve achieved marvelous things with that classical computing environment, but it has its limits too, and people sometimes ask, “Why would we need any more powerful computers?” These applications, these problems that we’re trying to solve, are incredibly hard problems and aren’t well-suited for the architecture of classical computing. So I see quantum computing as another set of tools, another set of resources for scientists, researchers, computer scientists, programmers, to develop and enhance some of these capabilities to really change the world in a much better way than we’re able to today with classical computers.