Richard Feynman suggested that it takes a quantum computer to simulate large quantum systems, but a new study shows that a classical computer can work when the system has loss and noise.

The field of quantum computing originated with a question posed by Richard Feynman. He asked whether it was feasible to simulate the behavior of quantum systems using a classical computer, suggesting that a quantum computer would be required instead [1]. Saleh Rahimi-Keshari from the University of Queensland, Australia, and colleagues [2] have now demonstrated that a quantum process believed to require an exponentially large number of steps to simulate on a classical computer can in fact be simulated efficiently if the system in which the process occurs has sufficiently large loss and noise.

The quantum process considered by Rahimi-Keshari et al. is known as boson sampling, in which the probability distribution of photons (bosons) that undergo a linear optical process [3] is measured or sampled. In experiments of this kind [4, 5], N single photons are sent into a large network of beam splitters (half-silvered mirrors) and combined before exiting through M possible output channels. The calculation of the probability distribution for finding the photons in each of the M output channels is equivalent to calculating the permanent of a matrix. The permanent is the same as the more familiar determinant but with all of the minus signs replaced with plus signs.
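To see why this is hard, here is a minimal sketch of computing a matrix permanent using Ryser's inclusion-exclusion formula, the best known exact classical approach. (This is an illustration of the general concept, not the simulation method of Rahimi-Keshari et al.; the function name is ours.) Even Ryser's trick still takes time exponential in the matrix size, which is exactly why boson sampling was thought to be classically intractable.

```python
from itertools import combinations

def permanent(matrix):
    """Permanent of an n x n matrix via Ryser's inclusion-exclusion
    formula. Runs in O(2^n * n^2) time -- exponential in n."""
    n = len(matrix)
    total = 0.0
    # Sum over all nonempty subsets of columns.
    for k in range(1, n + 1):
        for cols in combinations(range(n), k):
            prod = 1.0
            for row in matrix:
                prod *= sum(row[c] for c in cols)
            total += (-1) ** (n - k) * prod
    return total

# The permanent is the determinant formula with every minus sign
# flipped to plus: per([[a, b], [c, d]]) = a*d + b*c.
print(permanent([[1, 2], [3, 4]]))  # 1*4 + 2*3 = 10.0
```

Compare with the determinant of the same matrix, 1*4 - 2*3 = -2: the sign flips are the entire difference, yet they make the permanent vastly harder to compute.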

Because of a plethora of data from sensor networks, Internet of Things devices and big data resources combined with a dearth of data scientists to effectively mold that data, we are leaving many important applications – from intelligence to science and workforce management – on the table.

It is a situation the researchers at DARPA want to remedy with a new program called Data-Driven Discovery of Models (D3M). The goal of D3M is to develop algorithms and software to help overcome the data-science expertise gap by facilitating non-experts to construct complex empirical models through automation of large parts of the model-creation process. If successful, researchers using D3M tools will effectively have access to an army of “virtual data scientists,” DARPA stated.


This army of virtual data scientists is needed because some experts project shortfalls of 140,000 to 190,000 data scientists worldwide in 2016 alone, with the gap widening in coming years. In addition, because building empirical models is such a manual process, their sophistication and value are often limited, DARPA stated.

Read more

Computer chips have stopped getting faster. For the past 10 years, chips’ performance improvements have come from the addition of processing units known as cores.

In theory, a program on a 64-core machine would be 64 times as fast as it would be on a single-core machine. But it rarely works out that way. Most computer programs are sequential, and splitting them up so that chunks of them can run in parallel causes all kinds of complications.
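The gap between the ideal 64x speedup and reality is captured by Amdahl's law (not named in the article, but the standard way to quantify this point): any serial fraction of a program caps the benefit of adding cores. A small sketch:

```python
def amdahl_speedup(parallel_fraction, cores):
    """Amdahl's law: overall speedup when only `parallel_fraction`
    of a program's work can be spread across `cores` processors."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# Even if 95% of a program parallelizes perfectly, 64 cores
# deliver nowhere near a 64x speedup:
print(round(amdahl_speedup(0.95, 64), 1))  # 15.4
```

With just 5 percent of the work stuck running sequentially, the 64-core machine tops out around 15x, which is part of why designs like Swarm that extract more parallelism automatically are attractive.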

In the May/June issue of the Institute of Electrical and Electronics Engineers’ journal Micro, researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) will present a new chip design they call Swarm, which should make parallel programs not only much more efficient but easier to write, too.

Read more

Good news, alien hunters! A Kickstarter to fund a year-long investigation into KIC 8462852—the star voted most likely to harbor an advanced alien civilization—just got funded. Alien megastructure or not, we may finally get to the bottom of this bewildering, flickering star.

This crowdfunding campaign was set up in May by Yale astronomer Tabby Boyajian, and it managed to meet its $100,000 goal in just 30 days. A $10,000 surge in the last 100 minutes of the campaign put the project over the top. The next step is to figure out the logistics, but Boyajian, who has been leading the research into KIC 8462852, says observations could start as early as this summer.

The ultimate goal of the project will be to determine why this star’s light dims at such irregular intervals, and at times by as much as 20 percent. These huge dips in luminosity are way too large to be a passing planet, hence the suspicion the anomaly is being caused by swarms of comets, a distorted star, some unknown astronomical phenomenon—or an advanced alien civilization in the process of building a gigantic solar array around the star.
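To see why a 20 percent dip rules out an ordinary planet: the fractional dimming during a transit is roughly the ratio of the planet's disk area to the star's, (R_planet / R_star)^2. A quick back-of-the-envelope check (our own illustration, using standard radii, not a calculation from the article):

```python
def transit_depth(r_planet_km, r_star_km):
    """Fractional dimming when a body crosses a star's disk:
    depth ~ (R_planet / R_star)^2."""
    return (r_planet_km / r_star_km) ** 2

R_JUPITER = 69_911   # km, equatorial radius
R_SUN = 696_000      # km

# Even a Jupiter-sized planet crossing a Sun-like star blocks
# only about 1% of the light -- far short of 20%:
print(round(100 * transit_depth(R_JUPITER, R_SUN), 2))  # 1.01
```

Blocking 20 percent of the starlight would require an object with roughly 45 percent of the star's radius, which is why planets are off the table and more exotic explanations are in play.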

Read more