
Like this feature on QC.


If you have trouble wrapping your mind around quantum physics, don’t worry — it’s even hard for supercomputers. The solution, according to researchers from Google, Harvard, Lawrence Berkeley National Laboratory and others? Why, use a quantum computer, of course. The team accurately predicted chemical reaction rates using a supercooled quantum circuit, a result that could lead to improved solar cells, batteries, flexible electronics and much more.

Chemical reactions are inherently quantum themselves — the team even cited Richard Feynman’s quip that “nature isn’t classical, dammit.” The problem is that “molecular systems form highly entangled quantum superposition states, which require many classical computing resources in order to represent sufficiently high precision,” according to the Google Research blog. Computing the lowest energy state of propane, a relatively simple molecule, takes around ten days on a classical machine, for instance; that energy is what’s needed to derive the reaction rate.
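To see why classical machines struggle, here’s a rough numpy sketch (an illustration of the scaling, not the actual propane computation): finding a ground state by exact diagonalization needs a matrix whose dimension doubles with every quantum degree of freedom, so the cost blows up exponentially.

```python
import numpy as np

# Toy illustration of the classical bottleneck: a system of n two-level
# degrees of freedom needs a 2^n x 2^n Hamiltonian matrix, and we find
# the ground-state energy as its lowest eigenvalue.
rng = np.random.default_rng(0)

def ground_state_energy(n_qubits):
    dim = 2 ** n_qubits                # Hilbert-space dimension: 2^n
    a = rng.standard_normal((dim, dim))
    h = (a + a.T) / 2                  # random symmetric "Hamiltonian"
    return np.linalg.eigvalsh(h)[0]    # lowest eigenvalue = ground state

for n in range(2, 9):
    e0 = ground_state_energy(n)
    print(f"{n} qubits -> {2 ** n} x {2 ** n} matrix, E0 = {e0:.3f}")
# Each added qubit doubles the matrix dimension, so memory and runtime
# grow exponentially -- the bottleneck the article describes.
```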

That’s where the “Xmon” supercooled-qubit quantum computing circuit (shown above) comes in. The device, known as a “variational quantum eigensolver” (VQE), is the quantum equivalent of a classical neural network. The difference is that you train a classical neural network (like Google’s DeepMind AI) to model classical data, whereas you train the VQE to model quantum data. “The quantum advantage of VQE is that quantum bits can efficiently represent the molecular wave function, whereas exponentially many classical bits would be required,” the blog notes.
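For a feel of how a VQE works, here’s a toy numpy sketch under stated assumptions: the one-qubit Hamiltonian, the `Ry` ansatz and the grid-search optimizer are all illustrative stand-ins of mine, not the hardware experiment. A parameterized trial state plays the role of the quantum circuit, and a classical outer loop tunes its parameter to minimize the energy expectation.

```python
import numpy as np

H = np.array([[1.0, 0.5],       # toy 1-qubit Hamiltonian: Z + 0.5 X
              [0.5, -1.0]])

def trial_state(theta):
    # Ry(theta)|0> -- a one-parameter variational ansatz
    return np.array([np.cos(theta / 2), np.sin(theta / 2)])

def energy(theta):
    psi = trial_state(theta)
    return psi @ H @ psi        # expectation value <H> in the trial state

# Classical optimization loop: a coarse grid search stands in for the
# gradient-based optimizers a real VQE would use.
thetas = np.linspace(0.0, 2 * np.pi, 2001)
vqe_energy = min(energy(t) for t in thetas)
exact = np.linalg.eigvalsh(H)[0]            # exact ground-state energy

print(f"variational minimum: {vqe_energy:.5f}")
print(f"exact ground state:  {exact:.5f}")  # both ~ -1.11803
```

The point of the hybrid scheme is that only the expectation value comes from the quantum device; the parameter update stays classical.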

Luv it; more believers.


Quantum computers promise to enable faster, far more complex calculations than today’s silicon chip-based computers. But they also raise the possibility that future computers could retroactively break the security of any digital communications that exist today, which is why Google is experimenting with something called “post-quantum cryptography.”

While quantum computer development remains in its early stages, some such computers are already in operation. In theory, future generations of quantum computers could “decrypt any Internet communication that was recorded today, and many types of information need to remain confidential for decades,” software engineer Matt Braithwaite wrote yesterday in a post on Google’s security blog. “Thus even the possibility of a future quantum computer is something that we should be thinking about today.”

Preventing potential nightmares for cryptographers and security organizations will require post-quantum cryptography, Braithwaite said. But Google is far from the only organization researching the possibilities.
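The reason quantum computers threaten today’s public-key crypto is Shor’s algorithm, which reduces factoring (the hard problem behind RSA) to finding the period of a^x mod N. The sketch below walks that reduction classically on a tiny number; it illustrates the principle only, since the brute-force period search is exactly the exponential step a quantum computer would replace.

```python
from math import gcd

def find_period(a, n):
    # Smallest r > 0 with a^r = 1 (mod n). Brute force is exponential in
    # the bit-length of n -- the step where Shor's algorithm wins.
    x, r = a % n, 1
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_style_factor(n, a):
    # Classical walk-through of Shor's reduction from factoring to
    # period finding, on a toy modulus.
    assert gcd(a, n) == 1
    r = find_period(a, n)
    if r % 2:
        return None                 # odd period: retry with another a
    y = pow(a, r // 2, n)
    p, q = gcd(y - 1, n), gcd(y + 1, n)
    return (p, q) if 1 < p < n else None

print(shor_style_factor(15, 7))     # period of 7 mod 15 is 4 -> (3, 5)
```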

I shared this yesterday; however, here’s another article with a different spin (no pun intended).


Working at Fermilab, a physics laboratory in Illinois, a team of physicists from the Massachusetts Institute of Technology (MIT) studied the states of neutrinos, among the smallest known subatomic particles.

Neutrinos are pretty inert: they pass straight through matter, rarely interacting with it, and require extremely sensitive equipment to detect.

More information on DARPA’s efforts to build new interface standards for modular design & practical circuit blocks.


Is it possible to develop chip technology that combines the high-performance characteristics of ASICs with the speedy, low-cost features of printed circuit boards?

Scientists at the Defense Advanced Research Projects Agency this week said they were looking for information on how to build interface standards that would enable modular design and practical circuit blocks that could be reused to greatly shorten electronics development time and cost.


According to our best theories of physics, the universe is a fixed block where time only appears to pass. Yet if the flow of time is an illusion, how do we account for the distinction between past, present and future? In June, 60 physicists gathered for four days at the Perimeter Institute for Theoretical Physics to debate this and other questions about the mysteries of time.

Read more

Open the hood of just about any electronic gadget and you probably will find printed circuit boards (PCBs)—most often in a leaf-green color—studded with processing, memory, data-relaying, graphics, and other types of chips and components, all interconnected with a labyrinth of finely embossed wiring. By challenging the technology community to integrate the collective functions hosted by an entire PCB onto a device approaching the size of a single chip, DARPA’s newest program is making a bid to usher in a fresh dimension of technology miniaturization.

“We are trying to push the massive amount of integration you typically get on a printed circuit board down into an even more compact format,” said Dr. Daniel Green, manager of the new program, whose acronym, “CHIPS,” is itself a typographic feat of miniaturization; the program’s full name is the Common Heterogeneous Integration and Intellectual Property (IP) Reuse Strategies Program. “It’s not just a fun acronym,” Green said. “The program is all about devising a physical library of component chips, or chiplets, that we can assemble in a modular fashion.”

A primary driver of CHIPS is to develop a novel, industry-friendly architectural strategy for designing and building new generations of microsystems in which the time and energy it takes to move signals—that is, data—between chips is reduced by factors of tens or even hundreds. “This is increasingly important for the data-intensive processing that we have to do as the data sets we are dealing with get bigger and bigger,” Green said. Although the program does not specify applications, the new architectural strategy at the program’s heart could open new routes to computational efficiencies required for such feats as identifying objects and actions in real-time video feeds, real-time language translation, and coordinating motion on-the-fly among swarms of fast-moving unmanned aerial vehicles (UAVs).

Read more

Every day, modern society creates more than a billion gigabytes of new data. To store all this data, it is increasingly important that every single bit occupies as little space as possible. A team of scientists at the Kavli Institute of Nanoscience at Delft University of Technology managed to push this reduction to the ultimate limit: they built a 1-kilobyte memory (8,000 bits) in which each bit is represented by the position of a single chlorine atom.

“In theory, this storage density would allow all books ever created by humans to be written on a single postage stamp,” says lead scientist Sander Otte.
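As a sanity check on that claim, here’s some back-of-envelope arithmetic. The density figure matches the widely reported ~500 terabits per square inch; the stamp area, book count and bytes-per-book are rough assumptions of mine, not figures from the paper.

```python
# Back-of-envelope check of the "all books on a postage stamp" claim.
DENSITY_BITS_PER_IN2 = 500e12   # ~500 Tbit / in^2 (reported figure)
STAMP_AREA_IN2 = 1.0            # assumed stamp area
BOOKS = 130e6                   # assumed distinct books (a common estimate)
BYTES_PER_BOOK = 0.5e6          # assumed plain-text size per book

stamp_bytes = DENSITY_BITS_PER_IN2 * STAMP_AREA_IN2 / 8
library_bytes = BOOKS * BYTES_PER_BOOK

print(f"stamp capacity: {stamp_bytes / 1e12:.1f} TB")    # ~62.5 TB
print(f"all books:      {library_bytes / 1e12:.1f} TB")  # ~65 TB
print("claim plausible:", stamp_bytes >= 0.9 * library_bytes)
```

Under these assumptions the two totals land in the same ballpark, which is all the quote needs.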

Read More on Delft University
