Cracking the elaborate code

Step inside the portal and everything is white, calm, silent: this is where researchers are helping craft the future of virtual reality. I speak out loud, and my voice echoes around the empty space. In place of the clutter on the outside, each panel is unadorned, save for a series of small black spots: cameras recording your every move. There are 480 VGA cameras and 30 HD cameras, as well as 10 RGB-D depth sensors borrowed from Xbox gaming consoles. The massive collection of recording apparatus is synced together, and its collective output is combined into a single digital file. One minute of recording amounts to 600GB of data.
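As a rough sanity check on that figure, a back-of-envelope calculation shows how quickly uncompressed multi-camera video adds up. The resolutions, frame rates, and bit depths below are illustrative assumptions, since the article doesn't give them:

```python
# Back-of-envelope estimate of raw data volume for a camera array like
# the Panoptic Studio's. Resolutions, frame rates, and bytes-per-pixel
# below are assumed for illustration, not published specs.

def raw_rate_bytes_per_sec(num_cams, width, height, bytes_per_pixel, fps):
    """Uncompressed video data rate for a bank of identical cameras."""
    return num_cams * width * height * bytes_per_pixel * fps

vga  = raw_rate_bytes_per_sec(480, 640, 480, 3, 25)     # 480 VGA cameras
hd   = raw_rate_bytes_per_sec(30, 1920, 1080, 3, 30)    # 30 HD cameras
rgbd = raw_rate_bytes_per_sec(10, 512, 424, 2 + 3, 30)  # 10 Kinect-style depth sensors

total_per_minute_gb = (vga + hd + rgbd) * 60 / 1e9
print(f"~{total_per_minute_gb:.0f} GB per minute, uncompressed")
```

Even with these modest assumed settings, the raw total comes out around 1,000 GB per minute, which suggests the 600GB figure the lab reports already reflects compression or reduced capture rates.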

The hundreds of cameras record people talking, bartering, and playing games. Imagine the motion-capture systems used by Hollywood filmmakers, but on steroids. The footage the studio records captures a stunningly accurate three-dimensional representation of people’s bodies in motion, from the bend in an elbow to a wrinkle in a brow. The lab is trying to map the language of our bodies, the signals and social cues we send one another with our hands, posture, and gaze. It is building a database that aims to decipher the constant, unspoken communication we all use without thinking, what the early 20th century anthropologist Edward Sapir once called an “elaborate code that is written nowhere, known to no one, and understood by all.”

The original goal of the Panoptic Studio was to use this understanding of body language to improve the way robots relate to human beings, to make them more natural partners at work or in play. But the research being done here has recently found another purpose. What works for making robots more lifelike and social could also be applied to virtual characters. That’s why this basement lab caught the attention of one of the biggest players in virtual reality: Facebook. In April 2015, the Silicon Valley giant hired Yaser Sheikh, an associate professor at Carnegie Mellon and director of the Panoptic Studio, to assist in research to improve social interaction in VR.

The Brain Tech to Merge Humans and AI Is Already Being Developed

With brain-machine interface (BMI) technology, cell circuitry, and the like, this is no surprise.


Are you scared of artificial intelligence (AI)?

Do you believe the warnings from folks like Prof. Stephen Hawking, Elon Musk and others?

Is AI the greatest tool humanity will ever create, or are we “summoning the demon”?

To quote the head of AI at Singularity University, Neil Jacobstein, “It’s not artificial intelligence I’m worried about, it’s human stupidity.”

Tern Tailsitter Drone: Pilot Not Included

One of the oddest military drones aborning reinvents a stillborn technology from 1951. That’s because the unmanned aircraft revolution is resurrecting configurations that were tried more than a half century ago but proved impractical with a human pilot inside. The case in point: Northrop Grumman’s new Tern, a drone designed to do everything armed MQ-1 Predators or MQ-9 Reapers can, but to do it flying from small ships or rugged scraps of land – i.e., no runway needed.

“No one has flown a large, unmanned tailsitter before,” Brad Tousley, director of the Tactical Technology Office at the Defense Advanced Research Projects Agency (DARPA), Tern’s primary funder, said in a news release. The key word there is “unmanned.”

Back in 1951, when all sorts of vertical takeoff and landing aircraft ideas were being tried, Convair and Lockheed built experimental manned tailsitters for the Navy. Convair’s XFY-1 and Lockheed’s XFV-1, nicknamed “Pogo” and “Pogo Stick,” each had two counter-rotating propellers on its nose and was to take off and land pointing straight up. Convair’s Pogo had a delta wing and, at right angles to the wing, large fins. Lockheed’s Pogo Stick had an X-shaped tail whose trailing tips, like Convair’s wing and fins, sported landing gear.

Pilotless planes may be landing at airports by 2020

In just three years. Can you imagine that?


Not a week goes by without an update regarding headway made by one automobile manufacturer or another testing out their self-driving prototypes. Some have even started testing the vehicles on-site, exciting all who want to embrace a future where self-driving vehicles are a common sight.

That future is not too far off, but imagine a future where airplanes fly without pilots.

In October, Geo.tv reported on ALIAS, a project funded by the Defense Advanced Research Projects Agency (commonly known as DARPA). The ALIAS project, run by Aurora Flight Sciences, tested pilotless flying using a Cessna Caravan in Manassas, VA: instead of a pilot, ALIAS, a robot with tubes, pipes, and claws, flew the plane, with instructions fed to it by a human pilot using a tablet PC.

Bot Memorial with Eugenia Kuyda

When a human passes away, we create a tombstone as a memorial. Friends and family visit a grave to remember the times they had with that person while they were still alive. Memorial bots are another way to celebrate the life of someone who has passed away. A memorial bot is created by taking the messages sent by a deceased person and passing them through a machine learning model in order to make a bot that replicates the deceased person.
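The article doesn't describe Luka's model, but the general recipe of building a bot from someone's message history can be sketched with a toy retrieval-based chatbot. This is a deliberately simple stand-in for the neural sequence models a production system would use: all the names and data below are hypothetical.

```python
# Toy "memorial bot": given a history of (prompt, reply) message pairs
# from one person, answer new prompts with the reply whose original
# prompt is most similar. A real system would train a neural sequence
# model on the messages; this retrieval sketch just illustrates the
# idea of replicating someone's speech patterns from their message log.
from collections import Counter
import math

def bag_of_words(text):
    """Word-count vector for a piece of text."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two word-count vectors."""
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

class MemorialBot:
    def __init__(self, message_pairs):
        # message_pairs: list of (what_was_said_to_them, what_they_replied)
        self.pairs = [(bag_of_words(p), r) for p, r in message_pairs]

    def reply(self, prompt):
        vec = bag_of_words(prompt)
        best = max(self.pairs, key=lambda pr: cosine(vec, pr[0]))
        return best[1]

# Hypothetical message history for illustration.
history = [
    ("how are you today", "Busy as always, but happy."),
    ("want to grab coffee", "Only if it's the place on 5th."),
]
bot = MemorialBot(history)
print(bot.reply("how are you doing"))  # echoes the closest remembered exchange
```

The design choice worth noting is that even this crude approach answers only in the person's own words; a neural model generalizes further by generating new sentences in their style rather than retrieving old ones.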

Eugenia Kuyda is the CEO of Luka, a company that builds AI products. When her friend Roman Mazurenko suddenly died, she worked with her team to make a bot that replicates his speech patterns. In our interview, we discussed memorial bots, deep learning, and the product Luka is working on: Replika, a personal AI friend for anyone.