
Physicists at the National Institute of Standards and Technology (NIST) have added to their collection of ingredients for future quantum computers by performing logic operations—basic computing steps—with two atoms of different elements. This hybrid design could be an advantage in large computers and networks based on quantum physics.

The NIST experiment, described in the Dec. 17 issue of Nature, manipulated one magnesium ion and one beryllium ion (charged atom) confined in a custom trap. The scientists used two sets of laser beams to entangle the two ions—establishing a special quantum link between their properties—and to perform two types of logic operations: a controlled-NOT (CNOT) gate and a SWAP gate. The same issue of Nature describes similar work, performed at the University of Oxford, with two isotopes of calcium.
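For readers who haven't met these two gates, here is a minimal sketch of their standard textbook matrix definitions—this illustrates the abstract logic only, not NIST's laser-based implementation:

```python
import numpy as np

# Two-qubit basis states |00>, |01>, |10>, |11> as length-4 vectors.
basis = {f"|{i:02b}>": np.eye(4)[i] for i in range(4)}

# Controlled-NOT: flips the second (target) qubit
# when the first (control) qubit is 1.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

# SWAP: exchanges the states of the two qubits.
SWAP = np.array([[1, 0, 0, 0],
                 [0, 0, 1, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1]])

for name, state in basis.items():
    print(name, "-> CNOT:", CNOT @ state, " SWAP:", SWAP @ state)
```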

“Hybrid quantum computers allow the unique advantages of different types of quantum systems to be exploited together in a single platform,” said lead author Ting Rei Tan. “Many research groups are pursuing this general approach. Each ion species is unique, and certain ones are better suited for certain tasks such as memory storage, while others are more suited to provide interconnects for data transfer between remote systems.”

Read more

A model of one form of double-stranded DNA attached to two electrodes (credit: UC Davis)

What do you call a DNA molecule that changes between high and low electrical conductance (amount of current flow)?

Answer: a molecular switch (transistor) for nanoscale computing. That’s what a team of researchers from the University of California, Davis and the University of Washington have documented in a paper published in Nature Communications Dec. 9.
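To make the switch analogy concrete, here is a toy two-state conductance model; the numbers are illustrative placeholders, not the paper's measured values:

```python
# Toy model of a two-state molecular switch: current I = G * V, where
# conductance G jumps between a "high" and a "low" value as the molecule
# changes state. Values below are hypothetical, not measured data.

HIGH_G = 2.0e-9   # siemens, hypothetical high-conductance state
LOW_G = 0.1e-9    # siemens, hypothetical low-conductance state

def current(state: str, voltage: float) -> float:
    """Ohmic current through the molecule in a given state."""
    g = HIGH_G if state == "high" else LOW_G
    return g * voltage

for state in ("high", "low"):
    print(state, f"{current(state, 0.5):.2e} A")  # 0.5 V bias
```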

Read more

A researcher at Singapore’s Nanyang Technological University (NTU) has developed a new technology that provides real-time detection, analysis, and optimization data that could potentially save a company 10 percent on its energy bill and lessen its carbon footprint. The technology is an algorithm that relies primarily on data from ubiquitous devices to better analyze energy use. The software uses data from computers, servers, air conditioners, and industrial machinery to monitor temperature, data traffic, and processing workload. Data from these already-present appliances are then combined with information from externally placed sensors that primarily monitor ambient temperature, yielding an analysis of energy consumption and a more efficient way to save energy and cost.
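The article doesn't disclose Wen's actual algorithm, but the data-fusion idea it describes can be sketched as follows; every name and threshold here is a hypothetical placeholder:

```python
from statistics import mean

# Hypothetical telemetry from already-present devices: each entry is
# (cpu_utilization 0..1, device_temperature_c, power_draw_w).
device_readings = [
    (0.05, 41.0, 180.0),   # near-idle server still drawing real power
    (0.72, 58.0, 310.0),   # busy server
    (0.04, 39.5, 175.0),   # near-idle server
]

# Readings from externally placed ambient-temperature sensors.
ambient_c = mean([22.5, 23.0, 22.8])

def suggest_savings(readings, ambient_c, idle_util=0.10, setpoint_c=25.0):
    """Fuse device telemetry with ambient sensors: flag near-idle
    machines for consolidation, and suggest raising the cooling
    setpoint if the room is colder than needed. All thresholds are
    illustrative placeholders, not values from Wen's algorithm."""
    idle = [i for i, (util, _temp, watts) in enumerate(readings)
            if util < idle_util and watts > 100.0]
    return {
        "consolidate_devices": idle,
        "raise_cooling_setpoint": ambient_c < setpoint_c - 2.0,
    }

print(suggest_savings(device_readings, ambient_c))
# -> {'consolidate_devices': [0, 2], 'raise_cooling_setpoint': True}
```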

The energy-saving computer algorithm was developed by NTU’s Wen Yonggang, an assistant professor at the School of Computer Engineering’s Division of Networks & Distributed Systems. Wen specializes in machine-to-machine communication and computer networking, including social media networks, cloud-computing platforms, and big data systems.

Most data centers consume huge amounts of electrical power, leading to high levels of energy waste, according to Wen’s website. Part of his research involves finding ways to reduce energy waste and stabilize power systems by scaling energy levels temporally and spatially.

Read more

Before each Computerphile interview we asked guests and regular contributors about their first computer.

Professor Uwe Aickelin: Missing Data: https://youtu.be/oCQbC818KKU
Professor Ross Anderson: Chip & PIN Fraud: https://youtu.be/Ks0SOn8hjG8

Spencer Lamb: Inside a Data Centre: https://youtu.be/fd3kSdu4W7c
Tom Scott: Animated GIFs and Space vs Time: http://youtu.be/blSzwPcL5Dw

Horia Maior: Brain Scanner: COMING SOON!

In 2010, a Canadian company called D-Wave announced that it had begun production of what it called the world’s first commercial quantum computer, which was based on theoretical work done at MIT. Quantum computers promise to solve some problems significantly faster than classical computers—and in at least one case, exponentially faster. In 2013, a consortium including Google and NASA bought one of D-Wave’s machines.

Over the years, critics have argued that it’s unclear whether the D-Wave machine is actually harnessing quantum phenomena to perform its calculations, and if it is, whether it offers any advantages over classical computers. But this week, a group of Google researchers released a paper claiming that in their experiments, a quantum algorithm running on their D-Wave machine was 100 million times faster than a comparable classical algorithm.
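This excerpt doesn't detail the comparison, but the classical baseline in benchmarks of this kind is typically simulated annealing on an Ising-style cost function, the kind of optimization problem D-Wave's hardware is built to minimize. A minimal sketch of that baseline, on a toy random-coupling chain rather than the carefully constructed instances such benchmarks actually use:

```python
import math
import random

# Simulated annealing on a toy Ising chain: spins s_i = +/-1, with
# energy E = -sum_i J_i * s_i * s_{i+1}. Couplings are random here;
# real benchmarks use carefully constructed problem instances.
random.seed(0)
n = 32
J = [random.choice([-1.0, 1.0]) for _ in range(n - 1)]
s = [random.choice([-1, 1]) for _ in range(n)]

def energy(spins):
    return -sum(J[i] * spins[i] * spins[i + 1] for i in range(n - 1))

T = 2.0
for step in range(20000):
    i = random.randrange(n)
    # Energy change from flipping spin i (only its neighbors matter).
    dE = 0.0
    if i > 0:
        dE += 2 * J[i - 1] * s[i - 1] * s[i]
    if i < n - 1:
        dE += 2 * J[i] * s[i] * s[i + 1]
    # Accept downhill moves always, uphill moves with Boltzmann probability.
    if dE <= 0 or random.random() < math.exp(-dE / T):
        s[i] = -s[i]
    T *= 0.9995  # illustrative geometric cooling schedule

print("final energy:", energy(s))
```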

Scott Aaronson, an associate professor of electrical engineering and computer science at MIT, has been following the D-Wave story for years. MIT News asked him to help make sense of the Google researchers’ new paper.

Read more

Why send a message back in time, but lock it so that no one can ever read the contents? Because it may be the key to solving currently intractable problems. That’s the claim of an international collaboration that has just published a paper in npj Quantum Information.

It turns out that an unopened message can be exceedingly useful. This is true if the experimenter entangles the message with some other system in the laboratory before sending it. Entanglement, a strange effect only possible in the realm of quantum physics, creates correlations between the time-travelling message and the laboratory system. These correlations can fuel a quantum computation.
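The kind of correlation entanglement creates can be shown with a small linear-algebra example: a generic Bell pair standing in for the message and the laboratory system. This illustrates entanglement in general, not the paper's specific protocol:

```python
import numpy as np

# Build the Bell state |Phi+> = (|00> + |11>) / sqrt(2): a maximally
# entangled pair standing in for "message" and "laboratory" systems.
ket00 = np.kron([1, 0], [1, 0])
ket11 = np.kron([0, 1], [0, 1])
phi_plus = (ket00 + ket11) / np.sqrt(2)

# Measuring both qubits in the Z basis: the expectation of Z (x) Z is +1,
# so the two outcomes are perfectly correlated...
Z = np.array([[1, 0], [0, -1]])
ZZ = np.kron(Z, Z)
print("<Z x Z> =", phi_plus @ ZZ @ phi_plus)   # -> 1.0

# ...even though each qubit on its own is completely random:
ZI = np.kron(Z, np.eye(2))
print("<Z x I> =", phi_plus @ ZI @ phi_plus)   # -> 0.0
```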

Around ten years ago, researcher Dave Bacon, now at Google, showed that a time-travelling quantum computer could quickly solve a class of problems known as NP-complete, which computer scientists regard as computationally hard.

Read more

We have very good news for all fans of high-RAM phones. Samsung has started mass production of its new LPDDR4 DRAM, allowing for next-generation 6GB RAM phones in India. Samsung has produced the industry’s first 12-gigabit LPDDR4 DRAM, built on its 20nm manufacturing process.


The real advantage of these chips is their 50 percent higher density compared with previous 8-gigabit LPDDR4 chips, which brings increased capacity as well as reduced power usage. Both factors matter in small devices like phones and tablets, where every mm² and mW counts. Please note that this is gigabits, not gigabytes: 12 gigabits is 1.5GB of RAM. Most high-end smartphones use four memory dies, which means 1.5GB × 4 = 6GB RAM phones for us.
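The bits-to-bytes arithmetic in those last two sentences, spelled out:

```python
# Convert the chip's capacity from gigabits to gigabytes (8 bits per byte)
# and scale up to a typical four-die smartphone memory package.
GIGABITS_PER_DIE = 12
BITS_PER_BYTE = 8
DIES_PER_PHONE = 4   # typical high-end configuration, per the article

gb_per_die = GIGABITS_PER_DIE / BITS_PER_BYTE   # 1.5 GB
total_gb = gb_per_die * DIES_PER_PHONE          # 6.0 GB
print(f"{gb_per_die} GB per die, {total_gb} GB per phone")
```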

Read more