
DNA writing is an aspect of our industry that I’ve been closely watching for several years because it is a critical component of so many groundbreaking capabilities, from cell and gene therapies to DNA data storage. At the SynBioBeta Conference in 2018, the co-founder of a new startup that was barely more than an idea gave a lightning talk on enzymatic DNA synthesis — and I was so struck by the technology the company was aiming to develop that I listed them as one of four synthetic biology startups to watch in 2019. I watched them, and I wasn’t disappointed.

Ansa Biotechnologies, Inc. — the Emeryville, California-based DNA synthesis startup using enzymes instead of chemicals to write DNA — announced in March the successful de novo synthesis of a 1005-mer, the world’s longest synthetic oligonucleotide, encoding a key part of the AAV vector used for developing gene therapies. And that’s just the beginning. Co-founder Dan Lin-Arlow will be giving another lightning talk at this year’s SynBioBeta Conference in just a few weeks. I caught up with him in the lead-up and was truly impressed by what Ansa Biotechnologies has accomplished in just five years.

Synthetic DNA is a key enabling technology for engineering biology. For nearly 40 years, synthetic DNA has been produced using phosphoramidite chemistry, which facilitates the sequential addition of new bases to a DNA chain in a simple cyclic reaction. While this process is incredibly efficient and has supported countless breakthroughs (a visit to Twist Bioscience’s website will quickly educate you on exciting advances in drug discovery, infectious disease research, cancer therapeutics, and even agriculture enabled by synthetic DNA), it suffers from two main drawbacks: its reliance on harsh chemicals and its inability to produce long (read: complex) DNA fragments.
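The length limit follows from simple arithmetic: every coupling cycle succeeds with probability just below one, so the fraction of strands that come out full-length shrinks exponentially with each added base. A quick sketch makes the point (the 99.5% per-cycle efficiency below is an illustrative, commonly cited figure, not a number from Ansa’s announcement):

```python
# Why long oligos are hard with cyclic chemical synthesis: if each
# coupling cycle succeeds with probability p, the fraction of strands
# that are full-length after n cycles is p**n.

def full_length_yield(n_bases: int, coupling_efficiency: float = 0.995) -> float:
    """Fraction of strands expected to be full-length after n_bases cycles."""
    return coupling_efficiency ** n_bases

for n in (20, 100, 200, 500, 1005):
    print(f"{n:>5}-mer: {full_length_yield(n):6.2%} full-length")

# ~90% of 20-mers are full-length, ~37% of 200-mers, but well under 1%
# of 1005-mers -- which is why a de novo 1005-mer is a milestone.
```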

In a development that could make quantum computers less prone to errors, a team of physicists from Quantinuum, the California Institute of Technology, and Harvard University has demonstrated a signature of non-Abelian anyons (nonabelions) in a special type of quantum computer. The team has published its results on the arXiv preprint server.

As scientists work to design and build a truly useful quantum computer, one of the difficulties is trying to account for errors that creep in. In this new effort, the researchers have looked to anyons for help.

Anyons are quasiparticles that exist in two dimensions. They are not true particles, but instead exist as vibrations that act like particles—certain groups of them are called nonabelions. Prior research has found that nonabelions have a unique and useful property—they remember some of their own history. This property makes them potentially useful for creating less error-prone quantum computers. But creating, manipulating and doing useful things with them in a quantum computer is challenging. In this new work, the team has come close by creating a physical simulation of nonabelions in action.
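The Quantinuum experiment realized a specific non-Abelian state whose details aren’t reproduced here, but the defining “memory” property — the final state depends on the order in which exchanges (braids) are performed — can be illustrated with the textbook braid matrices of Fibonacci anyons, a standard non-Abelian model. A minimal sketch:

```python
import numpy as np

# Illustrative only: these are the textbook Fibonacci-anyon braid
# matrices, not the state realized in the Quantinuum experiment.
phi = (1 + np.sqrt(5)) / 2  # golden ratio

# R: phases acquired when two anyons are exchanged, one per fusion channel
R = np.diag([np.exp(4j * np.pi / 5), np.exp(-3j * np.pi / 5)])

# F: change of fusion basis; for Fibonacci anyons F @ F = identity
F = np.array([[1 / phi, 1 / np.sqrt(phi)],
              [1 / np.sqrt(phi), -1 / phi]])

sigma1 = R          # braid anyons 1 and 2
sigma2 = F @ R @ F  # braid anyons 2 and 3 (same exchange, other basis)

# Abelian particles would give the same state in either order; here the
# two orderings differ, i.e. the system "remembers" the braid history.
print(np.allclose(sigma1 @ sigma2, sigma2 @ sigma1))  # False
```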

It simply does not make sense to keep the disks spinning.

Data storage on hard drives will soon become a thing of the past, according to Shawn Rosemarin, an expert who also owns a company selling solid-state storage solutions. Rosemarin predicts the last hard drive could be sold roughly five years from now, PC Gamer reports.

Most computer users have long migrated to cloud storage solutions when it comes to safely storing their data. With content being streamed on smartphones and tablets practically everywhere, there is little reason to own a hard drive these days.

The unification of general relativity and quantum theory is one of the great open problems of modern physics. One leading candidate solution is Loop Quantum Gravity (LQG). Simulating LQG may be important for providing predictions which can then be tested experimentally. However, such complex quantum simulations cannot run efficiently on classical computers, and quantum computers or simulators are needed. Here, we experimentally demonstrate quantum simulations of spinfoam amplitudes of LQG on an integrated photonics quantum processor. We simulate a basic transition of LQG and show that, despite experimental imperfections, the derived spinfoam vertex amplitude agrees with the theoretical prediction to within 4% error.

During its ongoing Think 2023 conference, IBM today announced an end-to-end solution to prepare organisations to adopt quantum-safe cryptography. Called Quantum Safe technology, it is a set of tools and capabilities that integrates IBM’s deep security expertise. Quantum-safe cryptography refers to algorithms that are resistant to attacks by both classical and quantum computers.

Under Quantum Safe technology, IBM is offering three capabilities. First is the Quantum Safe Explorer, which locates cryptographic assets, dependencies, and vulnerabilities and aggregates all potential risks in one central location. Next is the Quantum Safe Advisor, which allows the creation of a cryptographic inventory to prioritise risks. Lastly, the Quantum Safe Remediator lets organisations test quantum-safe remediation patterns and deploy quantum-safe solutions.
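IBM hasn’t published how the Explorer works internally, but conceptually a cryptographic inventory boils down to finding where quantum-vulnerable public-key primitives (RSA, elliptic-curve schemes, Diffie-Hellman) are referenced across a codebase. The hypothetical sketch below, which is not IBM’s tool, does a crude version of that with nothing but the Python standard library:

```python
import os
import re

# Hypothetical illustration of a "cryptographic inventory": scan source
# files for primitives that quantum computers are expected to break.
# This is NOT IBM's Quantum Safe Explorer, just a conceptual sketch.
QUANTUM_VULNERABLE = re.compile(r"\b(RSA|ECDSA|ECDH|DSA|DH)\b")

def build_inventory(root: str) -> dict[str, list[int]]:
    """Map each source file to the line numbers referencing vulnerable crypto."""
    inventory: dict[str, list[int]] = {}
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            if not name.endswith((".py", ".java", ".go", ".c", ".ts")):
                continue
            path = os.path.join(dirpath, name)
            with open(path, errors="ignore") as fh:
                hits = [i for i, line in enumerate(fh, 1)
                        if QUANTUM_VULNERABLE.search(line)]
            if hits:
                inventory[path] = hits
    return inventory

if __name__ == "__main__":
    for path, lines in build_inventory(".").items():
        print(f"{path}: lines {lines}")
```

A real inventory would also parse TLS configs, certificates, and key stores rather than just source text, which is presumably where dedicated tooling earns its keep.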

In addition, the company has announced the IBM Quantum Safe Roadmap, which will serve as a guide for industries adopting quantum-safe technology. It is the company’s first blueprint to help companies deal with anticipated cryptographic standards and requirements and protect their systems from vulnerabilities.

Wood is good for a lot of things. Building boxes, boats, and bookcases, for instance. Making tools, or campfires. Feeding termites. And beavers.

You’ll note powering functional electrical appliances isn’t among them.

Researchers at Linköping University and the KTH Royal Institute of Technology in Sweden clearly never paid much attention to lists of things wood is bad at, so they went ahead and made the world’s first wooden transistor.

A new experiment uses superconducting qubits to demonstrate that quantum mechanics violates what’s called local realism by allowing two objects to behave as a single quantum system no matter how large the separation between them. The experiment wasn’t the first to show that local realism isn’t how the Universe works—it’s not even the first to do so with qubits.

But it’s the first to separate the qubits by enough distance to ensure that light isn’t fast enough to travel between them while measurements are made. And it did so by cooling a 30-meter-long aluminum wire to just a few millikelvin. Because superconducting qubits are so easy to control, the experiment brings new precision to these sorts of measurements. And the hardware setup may be essential for future quantum computing efforts.
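The standard yardstick for such tests is the CHSH inequality: any local-realist theory caps the correlation value S at 2, while quantum mechanics allows entangled qubits to reach 2√2 ≈ 2.83. A minimal numerical check with an ideal singlet state (not the experiment’s actual analysis) shows the gap:

```python
import numpy as np

# CHSH check with an ideal two-qubit singlet state. Local realism bounds
# |S| <= 2; quantum mechanics allows up to 2*sqrt(2) ~ 2.828.
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def spin(theta):
    """Measurement of spin along angle theta in the x-z plane."""
    return np.cos(theta) * Z + np.sin(theta) * X

# Singlet state (|01> - |10>) / sqrt(2)
psi = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

def E(a, b):
    """Correlation <psi| A(a) (x) B(b) |psi> for measurement angles a, b."""
    return np.real(psi.conj() @ np.kron(spin(a), spin(b)) @ psi)

a1, a2, b1, b2 = 0, np.pi / 2, np.pi / 4, -np.pi / 4  # optimal settings
S = E(a1, b1) + E(a1, b2) + E(a2, b1) - E(a2, b2)
print(abs(S))  # ~2.828 > 2: local realism violated
```

In the real experiment the hard part isn’t the arithmetic but closing the locality loophole: finishing each measurement before light could carry information down the 30-meter link.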

The silicon microchips of future quantum computers will be packed with millions, if not billions, of qubits—the basic units of quantum information—to solve the greatest problems facing humanity. And with millions of qubits needing millions of wires in the microchip circuitry, it was always going to get cramped in there.

But now engineers at UNSW Sydney have taken an important step toward solving a long-standing problem of giving their qubits more breathing space—and it all revolves around jellybeans.

Not the kind we rely on for a sugar hit to get us past the 3pm slump. But jellybean quantum dots—elongated areas between qubit pairs that create more space for wiring without interrupting the way the paired qubits interact with each other.

In the United States, the first step on the road to exascale HPC systems began with a series of workshops in 2007. It wasn’t until a decade and a half later that the 1,686 petaflops “Frontier” system at Oak Ridge National Laboratory went online. This year, Argonne National Laboratory is preparing for the switch to be turned on for “Aurora,” which will be either the second or the third such exascale machine in the United States, depending on the timing of the “El Capitan” system at Lawrence Livermore National Laboratory.

There were delays and setbacks on the road to exascale for all of these machines, as well as technology changes, ongoing competition with China, and other challenges. But don’t expect the next leap to zettascale – or even quantum computing – to be any quicker, according to Rick Stevens, associate laboratory director of computing for environment and life sciences at Argonne. Both could take another 15 to 20 years or more.

Such is the nature of HPC.