
With Moore’s Law in doubt, eyes turn to quantum computing

Moore’s Law was already identified as a problem regardless of quantum computing. And the move to quantum happened regardless of Moore’s Law; the excitement around QC was not the result of Moore’s Law limitations. Just like all things, we evolve to a better level of maturity.


The chip industry is giving another sign that Moore’s Law is coming to an end, but IBM is offering a glimpse at what might be computing’s future.

Industry experts from around the world, who have been working together for years to forecast technology advances in the tech industry, are throwing in the towel.

The next version of the International Technology Roadmap for Semiconductors, which is produced jointly by the semiconductor industry associations of the United States, Europe, Japan, South Korea and Taiwan, will be the last, the New York Times reported.

Theorists smooth the way to modeling quantum friction: New paradigm offers a strategy for solving one of quantum mechanics’ oldest problems

Princeton’s answer to Quantum friction.


Abstract: Theoretical chemists at Princeton University have pioneered a strategy for modeling quantum friction, or how a particle’s environment drags on it, a vexing problem in quantum mechanics since the birth of the field. The study was published in the Journal of Physical Chemistry Letters.

“It was truly a most challenging research project in terms of technical details and the need to draw upon new ideas,” said Denys Bondar, a research scholar in the Rabitz lab and corresponding author on the work.

Quantum friction may operate at the smallest scale, but its consequences can be observed in everyday life. For example, when fluorescent molecules are excited by light, it’s because of quantum friction that the atoms are returned to rest, releasing photons that we see as fluorescence. Realistically modeling this phenomenon has stumped scientists for almost a century and recently has gained even more attention due to its relevance to quantum computing.
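To make the phenomenon concrete: the textbook starting point for a particle losing energy to its surroundings is a dissipative master equation. The Lindblad form below, for a decaying two-level emitter, is generic notation and not necessarily the formulation developed in the Princeton paper:

$$\dot{\rho} = -\frac{i}{\hbar}[H,\rho] + \gamma\left(\sigma_-\rho\,\sigma_+ - \tfrac{1}{2}\{\sigma_+\sigma_-,\rho\}\right)$$

Here $\rho$ is the density matrix, $\gamma$ the decay rate, and $\sigma_\pm$ the raising and lowering operators. The second (dissipator) term is the “friction”: it drains energy from the excited state, which escapes as the fluorescence photon described above.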

Google supercharges machine learning tasks with TPU custom chip

Posted by Norm Jouppi, Distinguished Hardware Engineer, Google.

Machine learning provides the underlying oomph to many of Google’s most-loved applications. In fact, more than 100 teams are currently using machine learning at Google today, from Street View, to Inbox Smart Reply, to voice search.

But one thing we know to be true at Google: great software shines brightest with great hardware underneath. That’s why we started a stealthy project at Google several years ago to see what we could accomplish with our own custom accelerators for machine learning applications.

New type of graphene-based transistor will increase the clock speed of processors

New graphene transistor makes for a faster processor.


Scientists have developed a new type of graphene-based transistor and using modelling they have demonstrated that it has ultralow power consumption compared with other similar transistor devices. The findings have been published in a paper in the journal Scientific Reports. The most important effect of reducing power consumption is that it enables the clock speed of processors to be increased. According to calculations, the increase could be as high as two orders of magnitude.

“The point is not so much about saving electricity — we have plenty of electrical energy. At a lower power, electronic components heat up less, and that means that they are able to operate at a higher clock speed — not one gigahertz, but ten for example, or even one hundred,” says the corresponding author of the study, the head of MIPT’s Laboratory of Optoelectronics and Two-Dimensional Materials, Dmitry Svintsov.
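Svintsov’s argument follows from the standard dynamic-power relation for switching logic; this is generic CMOS-style scaling, not a figure taken from the paper itself:

$$P_{\text{dyn}} \approx \alpha C V^2 f \quad\Rightarrow\quad f_{\max} \propto \frac{P_{\text{budget}}}{\alpha C V^2}$$

At a fixed thermal budget $P_{\text{budget}}$, cutting the per-switch energy (the $\alpha C V^2$ factor) by two orders of magnitude lets the clock frequency $f$ rise by roughly the same factor, which is where the jump from one gigahertz to one hundred comes from.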

A&S Physicist Awarded IBM Grant to Develop Quantum Computing

So, IBM is giving grant money to A&S to help develop a quantum computer. Hmmm; IBM has already announced a quantum computer and cloud-based quantum computing services. Guessing that machine is a pseudo version of QC, given this move.


A physicist in the College of Arts and Sciences has been awarded a major grant to help develop quantum computing technology.

Britton Plourde, associate professor of physics, is using a three-year, $900,000 grant from IBM to conduct research for the LogiQ Program. LogiQ is part of the Intelligence Advanced Research Projects Activity (IARPA), based in the Office of the Director of National Intelligence.

Short for “Logical Qubits,” LogiQ studies advanced and alternative computing platforms, quantum information science and qubit systems. LogiQ seeks to build the world’s first logical qubit, capable of storing quantum information immune to environmental influence or error.
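The intuition behind a logical qubit is redundancy: spread one bit of information across several noisy physical carriers so that errors can be caught and corrected. Real logical qubits require genuinely quantum codes, but the classical three-bit repetition code below is a minimal sketch of the same idea; the error rate and trial count are illustrative, not LogiQ parameters:

```python
import random

def encode(bit):
    """Encode one logical bit as three physical copies (bit-flip repetition code)."""
    return [bit, bit, bit]

def noisy_channel(codeword, p_flip):
    """Flip each physical bit independently with probability p_flip."""
    return [b ^ (random.random() < p_flip) for b in codeword]

def decode(codeword):
    """Majority vote recovers the logical bit if at most one physical bit flipped."""
    return int(sum(codeword) >= 2)

p, trials = 0.05, 100_000
raw_errors = sum(random.random() < p for _ in range(trials))
logical_errors = sum(decode(noisy_channel(encode(0), p)) != 0 for _ in range(trials))
print(f"physical error rate ~ {raw_errors / trials:.4f}")      # ~ 0.05
print(f"logical error rate  ~ {logical_errors / trials:.4f}")  # ~ 3*p**2 = 0.0075
```

A quantum code has to achieve this suppression without ever copying or directly reading the qubits, which is why building even one logical qubit is a flagship research goal.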

QC will change many industries and even some fortunes as well

QC will change many industries and even some fortunes as well. So, no wonder Canada and Australia both deem it a priority.


Mike Lazaridis, founder of BlackBerry Limited and the visionary who led the establishment of the Perimeter Institute for Theoretical Physics (PI), the Institute for Quantum Computing (IQC) at the University of Waterloo and Quantum Valley Investments, delivered a keynote address highlighting the Quantum Valley model in Waterloo Region, Ontario, Canada, and the emphasis that both the federal and provincial governments have placed on developing quantum technologies.

The Quantum Europe conference comes at a time when large-scale investments from tech companies and governments around the world, including in Canada, are being made as part of the “Second Quantum Revolution”, a new global industry fueled by the commercialization of transformative quantum technologies.

Mr. Lazaridis led a Canadian delegation to the Conference that included Lawrence Hanson, Assistant Deputy Minister, Innovation, Science and Economic Development Canada, Giles Gherson, Deputy Minister, Research and Innovation and Economic Development, Employment and Infrastructure Ontario and representatives from IQC and PI.

AI Research Tool Runs Experiment that won 2001 Nobel Prize in Physics

A team of Australian physicists has developed a new research assistant in the form of an artificial intelligence (AI) algorithm, which quickly took control of their quantum mechanics experiment, learned the tasks involved and even innovated. In a statement, co-lead researcher Paul Wigley from the Australian National University (ANU) Research School of Physics and Engineering said he had not expected the machine to learn to run the experiment itself from scratch in under an hour.

He added that a simple computer program would have needed longer than the age of the universe to work through all the possible combinations.
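A toy sketch shows why feedback beats enumeration: with 25 control knobs at 10 settings each there are 10^25 combinations, hopeless to try one by one, yet a loop that measures, tweaks one knob and keeps any non-worse result converges in a few hundred trials. The score function and knob counts below are invented for illustration and are not the ANU team’s actual algorithm:

```python
import random

n_params, settings = 25, 10  # 10**25 combinations: far too many to enumerate
target = [random.randrange(settings) for _ in range(n_params)]  # unknown optimum

def score(config):
    """Hypothetical 'measurement': how many knobs match the unknown optimum."""
    return sum(c == t for c, t in zip(config, target))

config = [random.randrange(settings) for _ in range(n_params)]
evals = 0
while score(config) < n_params:
    trial = config.copy()
    trial[random.randrange(n_params)] = random.randrange(settings)  # tweak one knob
    evals += 1
    if score(trial) >= score(config):  # greedy: keep any non-worse setting
        config = trial
print(f"optimum found after {evals} trials, versus {settings**n_params:.0e} to enumerate")
```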

The scientists set out to recreate an experiment that won the 2001 Nobel Prize in Physics: trapping an extremely cold gas, known as a Bose-Einstein condensate, in a laser beam.

U.S. Navy’s SPAWAR will pay D-Wave $11 million for quantum computer training

US Navy paying D-Wave to train them on QC.


A division of the U.S. Navy intends to pay Canadian company D-Wave $11 million to learn how to use its quantum computing infrastructure, according to a federal filing posted online on Monday.

The unit seeking this training is the Navy’s Space and Naval Warfare Systems Center Pacific, known as SPAWAR or SSC-PAC for short, which is headquartered in San Diego and has previously researched amphibious throwable robots, unmanned aerial vehicles, virtual reality, and many other technologies. The filing does not actually cover the cost of quantum computing hardware. But NASA has been allowing SPAWAR scientists to learn how to use the D-Wave machine that it operates with Google at the NASA Ames Research Center, the San Diego Union-Tribune reported last month.

Quantum computers employ quantum bits, or qubits, each of which can be zero, one or both at once, unlike the regular bits in classical computers. This superposition lets a quantum machine work through great numbers of possibilities simultaneously, making it highly desirable for certain types of processes. Google recently reported that quantum annealing on D-Wave hardware was as much as 100 million times faster than simulated annealing running on a single classical core, for a carefully chosen class of problems.
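A minimal state-vector sketch of what “zero, one or both” means, in the textbook gate-model picture (this assumes nothing about D-Wave’s annealing hardware):

```python
import numpy as np

ket0 = np.array([1.0, 0.0])                   # a qubit prepared as classical 0
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate
psi = H @ ket0                                # equal superposition of 0 and 1
print(np.abs(psi) ** 2)                       # measurement probabilities: [0.5 0.5]

# n qubits carry 2**n amplitudes at once, the source of quantum parallelism
n = 3
state = np.ones(2**n) / np.sqrt(2**n)         # uniform superposition over 8 states
print(f"{n} qubits -> {state.size} simultaneous amplitudes")
```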

Can We Receive Messages from the Future?

About ten years ago the scientist Dave Bacon, now at Google, showed that a time-travelling quantum computer could rapidly solve a class of problems known as NP-complete, which mathematicians have lumped together as hard. The catch was that Bacon’s quantum computer had to travel around “closed timelike curves”, paths through the fabric of spacetime that loop back on themselves. General relativity allows such paths to exist through contortions in spacetime known as wormholes.
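Subset sum is one such NP-complete problem, and the brute-force check below shows the exponential wall: it may inspect all 2^n subsets, which is exactly the cost a CTC-assisted computer would supposedly sidestep. The numbers are just an example:

```python
from itertools import combinations

def has_subset_with_sum(nums, target):
    """Brute force over all 2**len(nums) subsets: fine for 6 numbers, hopeless for 60."""
    return any(sum(c) == target
               for r in range(len(nums) + 1)
               for c in combinations(nums, r))

nums = [3, 34, 4, 12, 5, 2]
print(has_subset_with_sum(nums, 9))   # True  (4 + 5)
print(has_subset_with_sum(nums, 30))  # False: no subset reaches 30
```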

Why send a message back in time, but lock it so that no one can ever read the contents? Because it may be the key to solving presently intractable problems. That, at least, is the claim of an international collaboration.
