Current experiments with brain-computer interfaces have allowed an amputee to “feel” with his prosthetic hand — what other wonders will we achieve with this technology?
The head of Google’s self-driving car division made headlines recently for asking federal regulators to permit a vehicle without human-facing features like a steering wheel. Now he’s made a very good case for why no autonomous vehicle should have these things at all.
In an interview with NPR that aired today, Google’s Chris Urmson hit home the point that it’s simply not a good idea to have any kind of human-oriented controls in self-driving cars:
You wouldn’t imagine that in the back of a taxi, we put an extra steering wheel or brake pedal there for the passenger to grab ahold of anytime. It would just be crazy to think about doing that. But at the same time, I could imagine that there are vehicles where most of the days you don’t really want to drive it, so let it take you to and from work in the morning, for example, but on the weekend when you get a chance to get out onto some open road, that you might enjoy driving in that location. But I think the idea that you want the person to jump in who hasn’t been paying attention or maybe had a couple of drinks with dinner and then jump in to override is probably not the right idea.
Liquid biopsies, AI therapy, in silico trials, precision surgery.
Negotiations and collaborations are launching now to decide which research trends and areas deserve the most support. Only disruptive innovations will be able to transform the status quo in cancer, leading patients to get more personalized and faster cancer care, while letting physicians do their job more effectively. Here are the technologies and trends that could help achieve the cancer moonshot.
Prevention and diagnosis
Cancer diagnosis must be early and accurate. Many cancer types cannot yet be detected early enough, while others are caught in time but treated more aggressively than necessary. Meeting this goal requires not only excellent healthcare facilities and new diagnostic technologies, but also proactive patients.
New Watson API Release
The lines are beginning to become blurred as machines gain artificial intelligence capabilities based on Watson’s popular API set.
IBM has just announced the beta release of three new APIs, which could help revolutionize the way we interact with machines. The APIs are called Tone Analyzer, Emotion Analysis and Visual Recognition.
When developers implement these APIs, machines can be trained to hear changes in a person’s voice, analyze a person’s emotional state, and even recognize objects presented to them in a picture or through a real-time image capture device.
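As a rough sketch of what calling one of these services might look like from a developer’s side: the endpoint URL, version string, and JSON field names below are illustrative assumptions, not IBM’s documented interface. The pattern is simply an authenticated HTTP POST carrying the text to analyze.

```python
import json
import urllib.request

# Hypothetical Tone Analyzer endpoint -- the URL, version parameter,
# and payload field names are assumptions for illustration only.
TONE_URL = "https://gateway.watsonplatform.net/tone-analyzer/api/v3/tone"


def build_tone_request(text, credentials, version="2016-05-19"):
    """Build an HTTP request submitting `text` for tone analysis.

    `credentials` is a pre-encoded Basic-auth token (assumed scheme).
    """
    body = json.dumps({"text": text}).encode("utf-8")
    return urllib.request.Request(
        f"{TONE_URL}?version={version}",
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": "Basic " + credentials,
        },
        method="POST",
    )


# Actually sending the request needs valid service credentials:
# with urllib.request.urlopen(build_tone_request("I am thrilled!", token)) as r:
#     tones = json.load(r)
```

Separating request construction from transmission, as above, makes the payload easy to inspect and test without touching the network.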