
Imagine Discovering That Your Teaching Assistant Really Is a Robot

I wonder if this would qualify as a Turing test.


Lalith Polepeddi, a (human) teaching assistant and researcher on the Jill Watson project at the Georgia Institute of Technology.

Photo: Lalith Polepeddi.

“I have been accused of being a computer,” says TA Lalith Polepeddi, a computer-science master’s student who was needled for responding to messages with lightning speed. “I don’t take it personally.”

Student Barric Reed, an analytics consultant at Accenture, is embarrassed he didn't pick up on the trick, and for good reason.

The Quantum Experience: Feynman’s vision comes into focus

Great article about a mad scientist whose vision made the world look cross-eyed. Many of us have been there at some point in our lives.

In 1981, Richard Feynman urged the world to build a quantum computer. In his own words:

“Nature isn’t classical, dammit, and if you want to make a simulation of nature, you’d better make it quantum mechanical, and by golly it’s a wonderful problem, because it doesn’t look so easy.”

Researchers Making Progress With Quantum Computing

I can personally confirm that QC is not being worked on and advanced by just a couple of groups such as D-Wave and IBM. The questions and bumps in the road that we will all face are threefold:

1) How do we standardize QC? Right now, like most innovation, the work is done in silos with limited cross-collaboration among government, labs and universities, and commercial companies.
2) Governance and compliance: how will these need to change across multiple areas?
3) Identify and mitigate all impacts before deployment instead of after (don't be reactive), because hackers will not give us that luxury.


There is a temptation to lump quantum computing in with technologies such as fusion power in the sense that both have been proposed for decades with the promise of tremendous leaps in performance.

Whilst fusion power continues to frustrate, there are signs of real progress being made in quantum computing. There is barely a tech giant in the world that doesn’t have dedicated teams working on the topic, and these teams are beginning to bring quantum computing out of the lab and into the real world.

At the forefront of this is IBM, which recently announced that it would connect a quantum computer to the web and let the public experiment with it. The project involves a five-qubit machine; a qubit can occupy the '0' and '1' states at the same time, which increases the machine's potential computational power enormously. An n-qubit machine spans 2^n basis states, so IBM's five qubits give 32, but once you get past about 300 qubits the count exceeds the number of atoms in the observable universe.
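To make that arithmetic concrete, here is a minimal Python sketch (my illustration, not from the article); the constant is a commonly cited rough estimate of about 10^80 atoms in the observable universe, and the function name is invented for this example.

```python
# Illustrative arithmetic for the qubit claim above.
# An n-qubit register spans 2**n classical basis states.

ATOMS_IN_OBSERVABLE_UNIVERSE = 10 ** 80  # commonly cited rough estimate

def basis_states(n_qubits: int) -> int:
    """Return the number of basis states an n-qubit register spans."""
    return 2 ** n_qubits

for n in (1, 5, 50, 300):
    print(f"{n:>3} qubits -> 2^{n} = {basis_states(n):.3e} basis states")

# 2**300 is roughly 2.0e90, well above the ~1e80 atom estimate.
print("300 qubits exceed the atom estimate:",
      basis_states(300) > ATOMS_IN_OBSERVABLE_UNIVERSE)
```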

Neutrons tap into magnetism in topological insulators at high temperatures

I know that I reported on this a few weeks ago; however, this article shares some additional insights on how this new method could enable smaller, more efficient devices, including better stability for quantum computing (QC)…


A multi-institutional team of researchers has discovered novel magnetic behavior on the surface of a specialized material that holds promise for smaller, more efficient devices and other advanced technology.

Researchers at the Department of Energy’s Oak Ridge National Laboratory, Massachusetts Institute of Technology and their collaborators used neutron scattering to reveal magnetic moments in hybrid topological insulator (TI) materials at room temperature, hundreds of degrees Fahrenheit warmer than the extreme sub-zero cold where the properties are expected to occur.

The discovery promises new opportunities for next-generation electronic and spintronic devices such as improved transistors and quantum computing technologies. Their research is discussed in a paper published in the journal Nature.

AI-On-A-Chip Soon Will Make Phones, Drones And More A Lot Smarter

Movidius’ Myriad 2 vision processing chip (Photo: Movidius)

The branch of artificial intelligence called deep learning has given us new wonders such as self-driving cars and instant language translation on our phones. Now it's about to inject smarts into every other object imaginable.

That’s because makers of silicon processors from giants such as Intel Corp. and Qualcomm Technologies Inc. as well as a raft of smaller companies are starting to embed deep learning software into their chips, particularly for mobile vision applications. In fairly short order, that’s likely to lead to much smarter phones, drones, robots, cameras, wearables and more.

Google is quietly making progress on one of its most jaw-dropping tech projects

Google’s Project Soli was one of the highlights of the company’s developer conference last year, but there’s been little news about it since then.

The technology uses radar sensors packed into a tiny chip to detect a person's physical movements (such as rubbing two fingers together), letting a person do things like turn up the volume on a radio without actually touching anything.

The recent news that Regina Dugan, the head of Google's Advanced Technology and Projects lab, which oversaw Soli, had jumped ship to rival Facebook did not seem like a good sign for Soli's future. And with Microsoft's recent unveiling of similar technology, Google's impressive demo from last year seemed like it might never make it out of the lab.

Harold Cohen, a Pioneer of Computer-Generated Art, Dies at 87

Harold Cohen, an abstract painter who developed Aaron, one of the first and eventually one of the most complex computer software programs for generating works of art, died on April 27 at his home in Encinitas, Calif. He was 87.

The cause was congestive heart failure, his son, Paul, said.

Mr. Cohen was a painter growing weary of the traditional practice of art in the late 1960s when he taught himself, out of curiosity, how to program a computer.

Garage Biotech: New drugs using only a computer, the internet and free online data

Garage startup (credit: Chase Dittmer)

By the Director of the UWA Centre for Software Practice, University of Western Australia

Pharmaceutical companies typically develop new drugs with thousands of staff and budgets that run into the billions of dollars. One estimate puts the cost of bringing a new drug to market at $2.6 billion, with others suggesting it could be nearly double that, at $5 billion.

One man, Professor Atul Butte, director of the University of California Institute of Computational Health Sciences, believes that, much as with other Silicon Valley startups, almost anyone can bring a drug to market from a garage with just a computer, the internet, and freely available data.

In a talk given at the Science on the Swan conference held in Perth this week, Professor Butte outlined the process for an audience of local and international scientists and medics.