
A high-tech version of an old-fashioned balance scale at the National Institute of Standards and Technology (NIST) has just brought scientists a critical step closer toward a new and improved definition of the kilogram. The scale, called the NIST-4 watt balance, has conducted its first measurement of a fundamental physical quantity called Planck’s constant to within 34 parts per billion — demonstrating the scale is accurate enough to assist the international community with the redefinition of the kilogram, an event slated for 2018.

The redefinition, which is not intended to alter the value of the kilogram’s mass but rather to define it in terms of unchanging fundamental constants of nature, will have little noticeable effect on everyday life. But it will remove a nagging uncertainty in the official kilogram’s mass, owing to its potential to change slightly in value over time, such as when someone touches the metal artifact that currently defines it.

Planck’s constant lies at the heart of quantum mechanics, the theory that is used to describe physics at the scale of the atom and smaller. Quantum mechanics began in 1900 when Max Planck described how objects radiate energy in tiny packets known as “quanta.” The amount of energy in each quantum is proportional to its frequency, and the proportionality factor is a very small quantity called h, known as Planck’s constant, which subsequently shows up in almost all equations in quantum mechanics. The value of h, according to NIST’s new measurement, is 6.62606983×10⁻³⁴ kg·m²/s, with an uncertainty of plus or minus 22 in the last two digits.
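To connect the numbers above: the value of h and its uncertainty come from the article, and the short script below is only an illustrative back-of-the-envelope check of the parts-per-billion figure; the example photon frequency is an arbitrary choice, not something from NIST.

```python
# Relative uncertainty of the NIST-4 measurement of Planck's constant,
# expressed in parts per billion (ppb).
h = 6.62606983e-34            # measured value, kg·m²/s (from the article)
uncertainty = 0.00000022e-34  # "plus or minus 22 in the last two digits"

relative_ppb = uncertainty / h * 1e9
print(f"relative uncertainty ≈ {relative_ppb:.0f} ppb")  # ≈ 33 ppb, in line with the ~34 ppb quoted above

# Planck's relation: the energy of one quantum is E = h * f.
f = 5.0e14                    # example frequency (visible light), Hz -- arbitrary
print(f"E = h*f ≈ {h * f:.3e} J")
```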

Read more

Robert Dunleavy had just started his sophomore year at Lehigh University when he decided he wanted to take part in a research project. He sent an email to Bryan Berger, an assistant professor of chemical and biomolecular engineering, who invited Dunleavy to his lab.

Berger and his colleagues were conducting experiments on tiny semiconductor particles called quantum dots. The optical and electronic properties of QDs make them useful in lasers, light-emitting diodes (LEDs), medical imaging, solar cells, and other applications.

Dunleavy joined Berger’s group and began working with cadmium sulfide (CdS), one of the compounds from which QDs are fabricated. The group’s goal was to find a better way of producing CdS quantum dots, which are currently made with toxic chemicals in an expensive process that requires high pressure and temperature.

Read more

Scientists have mixed a molecule with light between gold particles, creating a new way to manipulate the physical and chemical properties of matter.

Light and matter are usually separate and have distinct properties. However, molecules of matter can emit particles of light called photons. Normally, emitted photons leave the molecule and the two do not mix again.

Now, scientists have trapped a single molecule in such a tiny space that when it emits a photon, the photon cannot escape. This produces an oscillation of energy between the molecule and the photon, creating a mixing of the properties of matter and light.
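The “oscillation of energy between the molecule and the photon” is the signature of strong light–matter coupling. Below is a minimal sketch of that exchange, assuming an idealized lossless two-level emitter swapping a single excitation with a trapped photon mode (a Jaynes–Cummings-style toy model); the coupling rate g is an arbitrary illustrative number, not a value from the study.

```python
import numpy as np

# Toy model: one excitation swapping between a two-level molecule and a
# trapped photon.  With coupling rate g (rad/s), the probability that the
# excitation sits on the molecule oscillates as cos^2(g*t).
g = 2 * np.pi * 1e12          # illustrative coupling rate, not from the paper
t = np.linspace(0, 2e-12, 9)  # times in seconds

p_molecule = np.cos(g * t) ** 2  # excitation on the molecule
p_photon = np.sin(g * t) ** 2    # excitation stored in the photon

for ti, pm, pp in zip(t, p_molecule, p_photon):
    print(f"t = {ti:.2e} s  molecule: {pm:.2f}  photon: {pp:.2f}")
```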

Read more

Over 20 years ago, I was interviewed by a group that asked me about the future of technology. I told them that, thanks to advancements such as nanotechnology, technology would definitely go beyond laptops, networks, servers, etc., and that we would see even the threads and fibers in our clothing become digitized. The interviewers then gave me a look as though I must have walked off the planet Mars. However, I was proven correct. And in the past 10 years, I have again told others how and where quantum computing would change our lives forever. Again, same looks and comments.

And lately, folks have been coming out with articles claiming they have spoken with or interviewed QC experts. In many cases they added their own commentary and cherry-picked people’s comments to discredit the efforts of Google, D-Wave, UNSW, MIT, etc., which is very misleading and negatively impacts QC efforts. When I come across such articles, I often point out where and why the authors have misinformed their readers, hurt those efforts, and set up for failure the very people who should be planning for QC in their longer-term future-state strategy. Those people need to plan budgets and bring their teams up to date on QC now, because once QC goes live on a larger scale, companies and governments will not have time to catch up: once hackers (foreign government hackers, etc.) have this technology and you are not QC-enabled, you are exposed, and your customers are exposed. The QC revolution will be costly, and digital transformation across a large company generally takes years to complete, so it is best to plan and prepare early this time. QC is not the same as implementing a new cloud, an ERP, a new data center, or rationalizing a siloed enterprise environment.

The recent misguided view is that we’re 30 or 50 years away from a scalable quantum chip; that is definitely incorrect. UNSW has shown that scalable QC is achievable, and Google has been working on a scalable QC chip. And recently, RMIT researchers have shared a proven method for tracing particles in the deepest layers of entanglement, which means we can now build QC without needing analog technology and can take full advantage of quantum properties in QC, which has not been the case until now.

So, I am sharing these three news releases for my QC friends to pass along to their non-believers and the uninformed.

http://www.zdnet.com/article/googles-quantum-computer-inches…akthrough/

Congrats to David Dean, fellow Oak Ridge researcher and a leader of ORNL’s efforts on this impressive research.


OAK RIDGE, Tenn., June XX, 2016—Soon to be deployed at the Department of Energy’s Oak Ridge National Laboratory is an experiment to explore new physics associated with neutrinos. The Precision Oscillation and Spectrum Experiment, or PROSPECT, is led by Yale University and includes partners from 14 academic and governmental institutions. The DOE High Energy Physics program will support the experiment at the High Flux Isotope Reactor (HFIR), a DOE Office of Science User Facility at ORNL. The neutrino, the subject of a 2015 Nobel Prize, remains a poorly understood fundamental particle of the Standard Model of particle physics.

These electrically neutral subatomic particles are made in stars and nuclear reactors as a byproduct of radioactive decay processes. They interact with other matter via the weak force, making their detection difficult. As a result of this elusiveness, neutrinos are the subject of many interesting and challenging detection experiments, including PROSPECT.

“Unique capabilities of ORNL will enable us to broaden the understanding of neutrino properties,” said David Dean, director of ORNL’s Physics Division. “The expansion of neutrino experiments at Oak Ridge National Laboratory is a win for the lab because we have a new scientific focus area, and a win for the scientific community because ORNL has unique neutrino sources that physicists will utilize to explore neutrino science.”

Nice.


Chapman University Institute for Quantum Studies (IQS) member Yutaka Shikano, Ph.D., recently had research published in Scientific Reports. Superconductivity is one of the most remarkable phenomena in physics, with amazing technological implications. Some of the technologies that would not be possible without superconductivity are the extremely powerful magnets that levitate trains and the MRI machines used to image the human body. The reason that superconductivity arises is now understood as a fundamentally quantum mechanical effect.

The basic idea of quantum mechanics is that at the microscopic scale everything, including matter and light, has a wave property to it. Normally the wave nature is not noticeable: the waves are very small, and they are all out of synchronization with each other, so their effects wash out. For this reason, to observe quantum mechanical behavior, experiments generally have to be performed at very low temperatures and at microscopic length scales.

Superconductors, on the other hand, show a dramatic effect, the disappearance of electrical resistance, which changes the entire character of the material. The key quantum effect is that the quantum waves become highly synchronized and occur at a macroscopic level. This is now understood to be the same basic effect as that seen in lasers. The similarity is that in a laser, all the photons making up the light are synchronized and appear as one single coherent wave. In a superconductor the macroscopic wave is formed by the quantum waves of the electrons instead of the photons, but the basic quantum feature is the same. Such macroscopic quantum waves have also been observed in Bose-Einstein condensates, where atoms cooled to nanokelvin temperatures all collapse into a single state.
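One way to see why synchronization matters is to compare how waves with random phases add up against waves that are all in phase. The toy calculation below is purely illustrative and is not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 10_000  # number of waves (e.g., electrons or photons)

# Random phases: contributions largely cancel, total amplitude ~ sqrt(N).
random_phases = rng.uniform(0, 2 * np.pi, N)
incoherent = np.abs(np.exp(1j * random_phases).sum())

# Synchronized phases: contributions add up, total amplitude = N.
coherent = np.abs(np.exp(1j * np.zeros(N)).sum())

print(f"out of sync: |sum| ≈ {incoherent:.0f}  (~sqrt(N) ≈ {np.sqrt(N):.0f})")
print(f"in sync:     |sum| = {coherent:.0f}  (= N)")
```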

This is huge! They have been able to develop a method to trace high-dimensional entanglement.

Until now, we had methods that could only trace entanglement among particles to a limited degree; this method allows us to detect high-dimensional entanglement and even certify whether or not the system has reached the maximum level of entanglement.

So, we are now going to finally see “real” full-scale quantum computing. This changes everything.


RMIT quantum computing researchers have developed and demonstrated a method capable of efficiently detecting high-dimensional entanglement.

Entanglement in quantum physics is the ability of two or more particles to be related to each other in ways which are beyond what is possible in classical physics.
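To make “high-dimensional” concrete: two particles are entangled in high dimensions when their joint state correlates more than two levels per particle. The sketch below is not the RMIT method; it is just a standard Schmidt-decomposition check on an idealized, noise-free toy state.

```python
import numpy as np

def schmidt_number(state, dim_a, dim_b):
    """Number of non-negligible Schmidt coefficients of a pure bipartite state.

    A Schmidt number of 1 means no entanglement; anything above 2 means
    entanglement beyond what a pair of qubits can carry (high-dimensional).
    """
    coeffs = np.linalg.svd(state.reshape(dim_a, dim_b), compute_uv=False)
    return int(np.sum(coeffs > 1e-10))

d = 3  # two qutrits
# Maximally entangled two-qutrit state: (|00> + |11> + |22>) / sqrt(3)
state = np.zeros(d * d)
for k in range(d):
    state[k * d + k] = 1 / np.sqrt(d)

print(schmidt_number(state, d, d))  # -> 3, i.e. all three levels are entangled
```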

A walk down memory lane: I thought it would be fun to revisit an article from 1998 about Los Alamos’ announcement of its move into quantum computing. We found out later that the lab expanded this work to include a quantum network, whose successful launch it announced in 2009. Times certainly have changed.


LOS ALAMOS, N.M., March 17, 1998 — Researchers at the Department of Energy’s Los Alamos National Laboratory have answered several key questions required to construct powerful quantum computers fundamentally different from today’s computers, they announced today at the annual meeting of the American Physical Society.

“Based on these recent experiments and theoretical work, it appears the barriers to constructing a working quantum computer will be technical, rather than fundamental to the laws of physics,” said Richard Hughes of Los Alamos’ Neutron Science and Technology Group.

Hughes also said that a quantum computer like the one Los Alamos is building, in which single ionized atoms act like a computer memory, could be capable of performing small computations within three years.

Read more