
Biocomputing, living-circuit computing, and gene circuitry are the longer-term future beyond quantum computing. Here is another of the many building blocks.


The tiny molecule responsible for transmitting the genetic data of every living thing on Earth could be the answer to the IT industry’s quest for a more compact storage medium. In fact, researchers from Microsoft and the University of Washington recently succeeded in storing 200 MB of data on a few strands of DNA, occupying a small dot in a test tube many times smaller than the tip of a pencil.

The Internet in a Shoebox.

Despite the small space occupied by the DNA strands, the researchers were able to successfully store and retrieve high-definition digital video, the top 100 books from Project Gutenberg, and copies of the Universal Declaration of Human Rights in more than 100 languages.
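To give a rough sense of why DNA is such a dense medium, here is a minimal sketch of the basic idea behind DNA data storage: each of the four nucleotides (A, C, G, T) can carry two bits, so a byte maps to just four bases. This is only an illustration under that naive assumption; the Microsoft/University of Washington system uses far more sophisticated encodings with redundancy and error correction.

```python
# Naive sketch: map every 2 bits of a byte stream to one nucleotide.
# Illustration only — not the encoding used by Microsoft/UW.

BASES = "ACGT"  # 2 bits per base: 00->A, 01->C, 10->G, 11->T

def bytes_to_dna(data: bytes) -> str:
    """Encode a byte string as a DNA sequence, 4 bases per byte."""
    seq = []
    for byte in data:
        for shift in (6, 4, 2, 0):              # high bits first
            seq.append(BASES[(byte >> shift) & 0b11])
    return "".join(seq)

def dna_to_bytes(seq: str) -> bytes:
    """Decode a DNA sequence back into the original bytes."""
    out = bytearray()
    for i in range(0, len(seq), 4):
        byte = 0
        for base in seq[i:i + 4]:
            byte = (byte << 2) | BASES.index(base)
        out.append(byte)
    return bytes(out)

message = b"Universal Declaration of Human Rights"
encoded = bytes_to_dna(message)
assert dna_to_bytes(encoded) == message
print(encoded[:16])  # "Univ" encodes to CCCCCGTGCGGCCTCG
```

At this packing density, 200 MB is roughly 800 million bases before any redundancy, which is why the physical sample fits in a barely visible dot.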

Read more

The Defense Advanced Research Projects Agency has finished its work to integrate live data feeds from several sources into the U.S. Space Surveillance Network run by the Air Force, in an effort to help space-monitoring teams determine when satellites are at risk.

The SSN is a global network of 29 military radar and optical telescopes, and DARPA added seven space data providers to the network to help monitor the space environment under its OrbitOutlook program, the agency said Wednesday.

DARPA plans to test the automated algorithms developed to identify relevant data in the integrated feed in order to help space situational awareness (SSA) experts carry out their mission.

Read more

It is great that they didn’t have to use a supercomputer for their prescribed, lab-controlled experiments. However, limiting quantum computing (QC) to supercomputers and experimental computations only is a big mistake; I cannot stress this enough. QC is a new digital infrastructure that changes our communications and cyber security, and it will eventually (in the years to come) provide consumers, businesses, and governments with the performance they will need for AI, biocomputing, and the Singularity.


A group of physicists from the Skobeltsyn Institute of Nuclear Physics at Lomonosov Moscow State University has learned to use a personal computer for calculations of complex equations of quantum mechanics that are usually solved with the help of supercomputers, and the PC does the job much faster. An article about the results of the work has been published in the journal Computer Physics Communications.

Senior researchers Vladimir Pomerantcev and Olga Rubtsova, working under the guidance of Professor Vladimir Kukulin (SINP MSU), were able to use an ordinary desktop PC with a GPU to solve complicated integral equations of quantum mechanics that previously could be solved only on powerful, expensive supercomputers. According to Kukulin, the personal computer does the job much faster: in 15 minutes it does work that normally requires 2–3 days of supercomputer time.

The equations in question were formulated in the 1960s by the Russian mathematician Ludvig Faddeev. They describe the scattering of a few quantum particles, i.e., they represent a quantum-mechanical analog of the Newtonian theory of three-body systems. As a result, a whole field of quantum mechanics called the “physics of few-body systems” appeared soon afterward.
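The published solver itself is not reproduced here, but the general pattern it relies on can be sketched: discretize an integral equation on a quadrature grid so it becomes a large dense linear system, then hand the heavy linear algebra to the GPU. The sketch below assumes CuPy (and a CUDA-capable GPU) is available and falls back to NumPy on the CPU; the kernel and right-hand side are toy stand-ins, not the Faddeev equations.

```python
# Sketch: discretize a Fredholm-type integral equation x = b + K x into
# the dense linear system (I - K W) x = b and solve it on the GPU.
# This is NOT the SINP MSU code; it only illustrates the GPU pattern.
import numpy as np

try:
    import cupy as xp            # GPU path (assumes CuPy + CUDA are installed)
except ImportError:
    xp = np                      # CPU fallback so the sketch still runs

n = 4000                                         # quadrature points
s = xp.linspace(0.0, 1.0, n)
w = xp.full(n, 1.0 / n)                          # simple quadrature weights

# Toy kernel K(s, t) and inhomogeneous term b(s)
K = xp.exp(-xp.abs(s[:, None] - s[None, :]))     # n x n dense kernel matrix
b = xp.sin(xp.pi * s)

A = xp.eye(n) - K * w[None, :]                   # discretized operator (I - K W)
x = xp.linalg.solve(A, b)                        # dense solve, GPU-accelerated with CuPy

print(float(x[:5].sum()))
```

Dense solves like this scale roughly with the cube of the grid size, which is exactly the kind of workload where a consumer GPU can stand in for scarce supercomputer time.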

Read more

Very nice.


ARLINGTON, Va., 27 June 2016. U.S. military researchers are asking industry for new algorithms and protocols for large, mission-aware computer, communications, and battlefield network systems that are physically dispersed over large forward-deployed areas.

Officials of the U.S. Defense Advanced Research Projects Agency (DARPA) in Arlington, Va., issued a broad agency announcement on Friday (DARPA-BAA-16-41) for the Dispersed Computing project, which seeks to boost application and network performance of dispersed computing architectures by orders of magnitude with new algorithms and protocol stacks.

Examples of such architectures include network elements, radios, smart phones, or sensors with programmable execution environments; and portable micro-clouds of different form factors.

Read more

(Phys.org)—Inspired by natural selection and the concept of “survival of the fittest,” genetic algorithms are flexible optimization techniques that can find the best solution to a problem by repeatedly selecting for and breeding ever “fitter” generations of solutions.
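As a rough illustration of that loop (and not the QUTIS group’s code), here is a minimal genetic algorithm in Python: score a population, keep the fittest half, breed children by single-point crossover, mutate them, and repeat. The toy fitness function simply counts 1-bits in a bit string.

```python
# Minimal genetic-algorithm sketch of the loop described above.
import random

GENOME_LEN, POP_SIZE, GENERATIONS, MUTATION_RATE = 32, 40, 60, 0.02

def fitness(genome):
    return sum(genome)                      # toy objective: count of 1-bits

def crossover(a, b):
    cut = random.randrange(1, GENOME_LEN)   # single-point crossover
    return a[:cut] + b[cut:]

def mutate(genome):
    return [bit ^ 1 if random.random() < MUTATION_RATE else bit for bit in genome]

population = [[random.randint(0, 1) for _ in range(GENOME_LEN)] for _ in range(POP_SIZE)]

for gen in range(GENERATIONS):
    population.sort(key=fitness, reverse=True)
    parents = population[:POP_SIZE // 2]    # "survival of the fittest"
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP_SIZE - len(parents))]
    population = parents + children

best = max(population, key=fitness)
print(f"best fitness after {GENERATIONS} generations: {fitness(best)}/{GENOME_LEN}")
```

In the Bilbao work the "genome" is not a bit string but a candidate quantum-simulation protocol, and the fitness reflects how faithfully (and with how little error) it reproduces the target dynamics.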

Now for the first time, researchers Urtzi Las Heras et al. at the University of the Basque Country in Bilbao, Spain, have applied genetic algorithms to digital quantum simulations and shown that genetic algorithms can reduce quantum errors, and may even outperform existing optimization techniques. The research, which is published in a recent issue of Physical Review Letters, was led by Ikerbasque Prof. Enrique Solano and Dr. Mikel Sanz in the QUTIS group.

In general, quantum simulations can provide a clearer picture of the dynamics of systems that are impossible to understand using conventional computers due to their high degree of complexity. Whereas conventional computers numerically calculate the behavior of these systems, quantum simulations approximate or “simulate” that behavior.

Read more

A high-tech version of an old-fashioned balance scale at the National Institute of Standards and Technology (NIST) has just brought scientists a critical step closer toward a new and improved definition of the kilogram. The scale, called the NIST-4 watt balance, has conducted its first measurement of a fundamental physical quantity called Planck’s constant to within 34 parts per billion — demonstrating the scale is accurate enough to assist the international community with the redefinition of the kilogram, an event slated for 2018.

The redefinition, which is not intended to alter the value of the kilogram’s mass but rather to define it in terms of unchanging fundamental constants of nature, will have little noticeable effect on everyday life. But it will remove a nagging uncertainty in the official kilogram’s mass, owing to its potential to change slightly in value over time, such as when someone touches the metal artifact that currently defines it.

Planck’s constant lies at the heart of quantum mechanics, the theory that is used to describe physics at the scale of the atom and smaller. Quantum mechanics began in 1900 when Max Planck described how objects radiate energy in tiny packets known as “quanta.” The amount of energy is proportional to a very small quantity called h, known as Planck’s constant, which subsequently shows up in almost all equations in quantum mechanics. The value of h, according to NIST’s new measurement, is 6.62606983 × 10⁻³⁴ kg·m²/s, with an uncertainty of plus or minus 22 in the last two digits.
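As a quick illustration of the relation Planck introduced, the energy of a single quantum is E = h × f, where f is the radiation frequency. The snippet below plugs the NIST-4 value of h quoted above into that formula for green light; the particular frequency is just an example, not part of the NIST result.

```python
# Quick numerical check of E = h * f using the h value quoted above.
h = 6.62606983e-34            # Planck's constant, kg·m²/s (i.e., J·s)
f = 5.4e14                    # example frequency: green light, ~540 THz

E = h * f                     # energy of a single quantum (photon), in joules
print(f"E = {E:.3e} J per photon")   # ~3.6e-19 J
```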

Read more

Because of a plethora of data from sensor networks, Internet of Things devices, and big data resources, combined with a dearth of data scientists to effectively mold that data, we are leaving many important applications – from intelligence to science and workforce management – on the table.

It is a situation the researchers at DARPA want to remedy with a new program called Data-Driven Discovery of Models (D3M). The goal of D3M is to develop algorithms and software to help overcome the data-science expertise gap by facilitating non-experts to construct complex empirical models through automation of large parts of the model-creation process. If successful, researchers using D3M tools will effectively have access to an army of “virtual data scientists,” DARPA stated.
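As a rough sketch of the kind of automation D3M is aiming for (this is not DARPA’s software), a “virtual data scientist” might simply try a menu of candidate models on the user’s data, cross-validate each one, and report the best. The example below assumes scikit-learn is installed and uses a built-in dataset as a stand-in for the user’s data.

```python
# Toy "virtual data scientist": automated model selection by cross-validation.
from sklearn.datasets import load_iris
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)            # stand-in for the non-expert's dataset

candidates = {
    "logistic_regression": LogisticRegression(max_iter=1000),
    "random_forest": RandomForestClassifier(n_estimators=200),
    "gradient_boosting": GradientBoostingClassifier(),
    "k_nearest_neighbors": KNeighborsClassifier(),
}

# Score each candidate with 5-fold cross-validation and pick the winner.
scores = {name: cross_val_score(model, X, y, cv=5).mean()
          for name, model in candidates.items()}

best = max(scores, key=scores.get)
print(f"selected model: {best} (cv accuracy {scores[best]:.3f})")
```

Real D3M tooling would also have to automate the messier steps upstream of this loop, such as feature engineering and data cleaning, which is where most of the manual effort goes today.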


This army of virtual data scientists is needed because some experts project deficits of 140,000 to 190,000 data scientists worldwide in 2016 alone, with increasing shortfalls in coming years. Also, because the process of building empirical models is so manual, their relative sophistication and value are often limited, DARPA stated.

Read more