
It seems like over the past few years, quantum science is being talked about more and more. We’re hearing words like qubits, entanglement, superposition, and quantum computing. But what does all of that mean … and is quantum science really that big of a deal? Yeah, it is.

That’s because quantum science has the potential to revolutionize our world, from processing data and predicting weather to picking stocks or even discovering new medical drugs. Quantum technology, and quantum computers in particular, could solve countless problems.

Dr. Heather Masson-Forsythe, an AAAS Science & Technology Fellow in NSF’s Directorate for Computer and Information Science and Engineering, hosts this future-forward episode.

Featured guests include (in order of appearance):
Dr. Spiros Michalakis, the manager of outreach and a staff researcher at Caltech’s Institute for Quantum Information and Matter, an NSF Physics Frontiers Center.

Dolev Bluvstein, a doctoral student at Harvard University, working in the Lukin Group at the Quantum Optics Laboratory.

Dr. Scott Aaronson, Schlumberger Chair of Computer Science at The University of Texas at Austin and director of its Quantum Information Center.

In an interview, Dr. Stephanie Simmons, Chief Quantum Officer of Photonic, explains the need to scale quantum computers and the company’s approach to tackling this challenge to pave the way for reliable, large-scale quantum computing.

For quantum computers to move from laboratory to commercialization, these devices will need to scale to millions of qubits.

“Scaling quantum computers is critical to unlocking exponential speed-ups to help solve some of the world’s biggest problems and unlock its greatest opportunities,” said Stephanie Simmons, CQO of Photonic, a company focused on using its photonically linked spin qubits in silicon to build a scalable, fault-tolerant and distributed quantum system.

The expansion of the universe has been a well-established fact of physics for almost a century. By the turn of the millennium the rate of this expansion, referred to as the Hubble constant (H₀), had converged to a value of around 70 km s⁻¹ Mpc⁻¹. However, more recent measurements have given rise to a tension: whereas those derived from the cosmic microwave background (CMB) cluster around a value of 67 km s⁻¹ Mpc⁻¹, direct measurements using a local distance ladder (such as those based on Cepheids) mostly prefer larger values around 73 km s⁻¹ Mpc⁻¹. This disagreement between early- and late-universe measurements stands at the 4–5σ level, thereby calling for novel measurements.
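To see where a 4–5σ figure like this comes from, here is a minimal sketch in Python. The uncertainties (roughly ±0.5 km s⁻¹ Mpc⁻¹ for the CMB-derived value and ±1.0 km s⁻¹ Mpc⁻¹ for the distance-ladder value) are illustrative assumptions, not numbers taken from the text; the two error bars are simply combined in quadrature.

```python
import math

# Illustrative values; the uncertainties are assumptions, not from the article.
h0_early, sigma_early = 67.4, 0.5   # CMB-derived, km/s/Mpc
h0_late, sigma_late = 73.0, 1.0     # local distance ladder, km/s/Mpc

# Combine the uncertainties in quadrature and express the
# difference between the two measurements in units of sigma.
combined_sigma = math.sqrt(sigma_early**2 + sigma_late**2)
tension = abs(h0_late - h0_early) / combined_sigma

print(f"Hubble tension: {tension:.1f} sigma")  # ~5 sigma for these inputs
```

Different choices of input measurements and error bars move this number around within the 4–5σ range quoted above.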

One such source of new information are large galaxy surveys, such as the one currently being performed by the Dark Energy Spectroscopic Instrument (DESI). This Arizona-based instrument uses 5,000 individual robots that optimise the focal plane of the detector to allow it to measure 5,000 galaxies at the same time. The goal of the survey is to provide a detailed 3D map, which can be used to study the evolution of the universe by focussing on the distance between galaxies. During its first year of observation, the results of which have now been released, DESI has provided a catalogue of millions of objects.
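As a rough illustration of how a spectroscopic survey turns redshifts into a 3D map, the sketch below converts toy sky positions and redshifts into comoving distances and then into galaxy–galaxy separations. The catalogue values and the use of astropy’s Planck18 fiducial cosmology are illustrative assumptions, not part of DESI’s actual pipeline.

```python
import numpy as np
from astropy.cosmology import Planck18  # assumed fiducial cosmology

# Toy catalogue: sky positions (degrees) and spectroscopic redshifts.
ra = np.array([150.1, 150.3, 149.8])   # right ascension
dec = np.array([2.2, 2.4, 2.1])        # declination
z = np.array([0.8, 1.1, 1.4])          # measured redshifts

# Convert each redshift to a comoving distance (Mpc), then to
# Cartesian coordinates so galaxy-galaxy separations can be measured.
d = Planck18.comoving_distance(z).value
x = d * np.cos(np.radians(dec)) * np.cos(np.radians(ra))
y = d * np.cos(np.radians(dec)) * np.sin(np.radians(ra))
zc = d * np.sin(np.radians(dec))

# Pairwise separations between the toy galaxies, in Mpc.
pos = np.column_stack([x, y, zc])
sep = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
print(np.round(sep, 1))
```

The real analysis is far more involved, but the basic step of mapping redshifts to distances under an assumed cosmology is the same.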

Small fluctuations in the density of the early universe resulted not only in signatures in the CMB, as measured for example by the Planck probe, but also left imprints in the distribution of baryonic matter. Each over-dense region is thought to contain dark matter, baryonic matter and photons. The gravitational force from dark matter on the baryons is countered by radiation pressure from the photons. From the small over-densities, baryons were dragged along by photon pressure until these two types of particles decoupled during the recombination era. The original location of the over-density is therefore surrounded by a sphere of baryonic matter, which typically sits at a distance referred to as the sound horizon. The sound horizon at the moment of decoupling, denoted r_d, leaves an imprint that has since evolved to produce the density fluctuations in the universe that seeded large-scale structures.
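The connection between r_d and today’s galaxy distribution can be pictured with a toy model (an illustration, not DESI’s actual analysis): the galaxy two-point correlation function shows a small excess, the “BAO bump”, near the sound-horizon scale, which serves as a standard ruler. The numbers below (r_d ≈ 150 Mpc, the bump amplitude and width) are representative values chosen only for the sketch.

```python
import numpy as np

# Toy model of the galaxy correlation function: a smooth power law
# plus a Gaussian "BAO bump" near the sound-horizon scale r_d.
r = np.linspace(50, 200, 151)           # separation in Mpc
r_d = 150.0                             # representative sound horizon (Mpc)
smooth = (r / 50.0) ** -2.0             # featureless background
bump = 0.02 * np.exp(-0.5 * ((r - r_d) / 10.0) ** 2)
xi = smooth + bump

# The location of the excess correlation acts as a standard ruler:
# comparing its apparent size at different redshifts constrains H_0.
print(f"Bump peaks near r = {r[np.argmax(xi - smooth)]:.0f} Mpc")
```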

Hydrogen storage, heat conduction, gas storage, CO2 and water sequestration—metal-organic frameworks (MOFs) have extraordinary properties due to their unique structure in the form of microporous crystals, which have a very large surface area despite their small size. This makes them extremely interesting for research and practical applications. However, MOFs are very complex systems that have so far required a great deal of time and computing power to simulate accurately.

Calcium oxide is a cheap, chalky chemical compound commonly used in the manufacturing of cement, plaster, paper, and steel. But the material may soon have a more high-tech application.

UChicago Pritzker School of Molecular Engineering researchers and their collaborator in Sweden have used theoretical and computational approaches to discover how tiny, lone atoms of bismuth embedded within solid calcium oxide can act as qubits — the building blocks of quantum computers and quantum communication devices.

These qubits are described in Nature Communications (“Discovery of atomic clock-like spin defects in simple oxides from first principles”).

(Nanowerk News) Advanced technologies enable the controlled release of medicine to specific cells in the body. Scientists argue these same technologies must be applied to agriculture if growers are to meet increasing global food demands.

In a new review paper in Nature Nanotechnology (“Towards realizing nano-enabled precision delivery in plants”), scientists from UC Riverside and Carnegie Mellon University highlight some of the best-known strategies for improving agriculture with nanotechnology.