
Inspired by the human eye, researchers at the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) have developed an adaptive metalens that is essentially a flat, electronically controlled artificial eye. The adaptive metalens simultaneously controls for three of the major contributors to blurry images: focus, astigmatism, and image shift.

The research is published in Science Advances.

“This research combines breakthroughs in artificial muscle technology with metalens technology to create a tunable metalens that can change its focus in real time, just like the human eye,” said Alan She, an SEAS graduate student at the Graduate School of Arts and Sciences, and first author of the paper. “We go one step further to build the capability of dynamically correcting for aberrations such as astigmatism and image shift, which the human eye cannot naturally do.”

When speaking about robots, people tend to imagine a wide range of different machines: Pepper, a social robot from SoftBank; Atlas, a humanoid from Boston Dynamics that can do backflips; the cyborg assassin from the Terminator movies; and the lifelike figures that populate the television series Westworld. People who are not familiar with the industry tend to hold polarized views: either they have unrealistically high estimates of robots’ ability to mimic human-level intelligence, or they underestimate the potential of new research and technologies.

Over the past year, my friends in the venture, tech, and startup scenes have asked me what’s “actually” going on in deep reinforcement learning and robotics. They wonder: how are AI-enabled robots different from traditional ones? Do they have the potential to revolutionize various industries? What are their capabilities and limitations? These questions tell me how surprisingly challenging it can be to understand the current technological progress and industry landscape, let alone make predictions for the future. I am writing this article as a humble attempt to demystify AI, and in particular robotics enabled by deep reinforcement learning: topics that we hear a lot about but understand superficially or not at all. To begin, I’ll answer a basic question: what are AI-enabled robots, and what makes them unique?

Harvard University researchers have developed a new powered exosuit that can make you feel as much as a dozen pounds lighter when walking or running. Scientific American reports that the 11-pound system, which is built around a pair of flexible shorts and a motor worn on the lower back, could benefit anyone who has to cover large distances on foot, including recreational hikers, military personnel, and rescue workers.

According to the researchers, who published their findings in the journal Science, the system differs from previous exosuits in that it assists both walking and running. The challenge, as shown in a video accompanying the research, is that your legs work very differently depending on whether you’re walking or running. When walking, the team says, your center of mass moves like an “inverted pendulum,” while running causes it to move like a “spring-mass system.” The system needs to accommodate both, and to sense when the wearer’s gait changes.
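The paper's actual gait-detection controller is not described here, but the underlying idea can be illustrated with a deliberately crude sketch: running, with its flight phase and foot-strike impacts, produces much larger vertical acceleration peaks than the smooth inverted-pendulum motion of walking. Everything below (the `classify_gait` function, the 1.8 g threshold, the synthetic signals) is a hypothetical illustration, not the Harvard team's algorithm.

```python
import numpy as np

def classify_gait(vert_accel, g=9.81, peak_threshold=1.8):
    """Crude gait classifier on a window of vertical acceleration (m/s^2,
    gravity removed). Running's impact peaks far exceed walking's gentle
    pendulum-like oscillation; the 1.8 g threshold is an arbitrary choice."""
    return "running" if np.max(np.abs(vert_accel)) > peak_threshold * g else "walking"

# Synthetic two-second windows sampled at 200 Hz:
t = np.linspace(0, 2, 400)
walking = 0.3 * 9.81 * np.sin(2 * np.pi * 2 * t)          # low-amplitude sway, ~2 Hz
running = 2.5 * 9.81 * np.abs(np.sin(2 * np.pi * 3 * t))  # sharp impact-like peaks

print(classify_gait(walking))  # walking
print(classify_gait(running))  # running
```

A real controller would need to be far more robust (stride timing, hysteresis, per-user calibration), but the sketch captures why the two gaits are separable from body-mounted sensing at all.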

Technology that translates cortical activity into speech would be transformative for people unable to communicate as a result of neurological impairment. Decoding speech from neural activity is challenging because speaking requires extremely precise and dynamic control of multiple vocal tract articulators on the order of milliseconds. Here, we designed a neural decoder that explicitly leverages the continuous kinematic and sound representations encoded in cortical activity to generate fluent and intelligible speech. A recurrent neural network first decoded direct cortical recordings into vocal tract movement representations, and then transformed those representations to acoustic speech output. Modeling the articulatory dynamics of speech significantly enhanced performance with limited data. Naïve listeners were able to accurately identify and transcribe decoded sentences. Additionally, speech decoding was not only effective for audibly produced speech, but also when participants silently mimed speech. These results advance the development of speech neuroprosthetic technology to restore spoken communication in patients with disabling neurological disorders.
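The pipeline described in the abstract is a two-stage decoder: a recurrent network first maps cortical recordings to vocal tract movement representations, and a second stage maps those kinematics to acoustic output. The sketch below shows only that data flow, using untrained minimal Elman RNNs; the dimensions (256 neural channels, 33 kinematic features, 32 acoustic features) and weight shapes are assumptions for illustration, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def rnn(x, W_in, W_rec, W_out):
    """Minimal Elman RNN: x is (T, d_in); returns (T, d_out)."""
    h = np.zeros(W_rec.shape[0])
    out = np.empty((x.shape[0], W_out.shape[1]))
    for t in range(x.shape[0]):
        h = np.tanh(x[t] @ W_in + h @ W_rec)
        out[t] = h @ W_out
    return out

# Hypothetical dimensions: 256 cortical channels, 33 articulatory
# kinematic features, 32 acoustic features, 64 hidden units.
N_NEURAL, N_KIN, N_ACOUSTIC, HIDDEN = 256, 33, 32, 64

# Stage 1: cortical activity -> vocal tract kinematics (random, untrained weights).
W1 = [rng.standard_normal(s) * 0.1 for s in
      [(N_NEURAL, HIDDEN), (HIDDEN, HIDDEN), (HIDDEN, N_KIN)]]
# Stage 2: kinematics -> acoustic speech features.
W2 = [rng.standard_normal(s) * 0.1 for s in
      [(N_KIN, HIDDEN), (HIDDEN, HIDDEN), (HIDDEN, N_ACOUSTIC)]]

neural = rng.standard_normal((100, N_NEURAL))  # 100 time steps of recordings
kinematics = rnn(neural, *W1)                  # intermediate articulatory trace
acoustics = rnn(kinematics, *W2)               # final acoustic representation
print(acoustics.shape)  # (100, 32)
```

The point of the intermediate kinematic stage, per the abstract, is that constraining the network through articulatory dynamics improves performance when training data is limited, compared with decoding acoustics directly.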

He sleeps two hours a night, plays guitar with a custom prosthesis, and has illegally implanted a microchip. When Evgeny Nekrasov was disfigured by an accident at 14, he decided to leverage future technology to build a new life.

Evgeny became more widely known to the Russian public in March, after becoming one of the first to implant a chip between his thumb and forefinger, even though such surgical procedures are forbidden in Russia.

Evgeny, now 21, has no recollection of “messing around” after school with his friends in his hometown of Vladivostok and picking up the gas canister that exploded in his hands and into his face.

But the days after he woke up without sight in hospital are hard-coded in his memory.

I’m excited to share my new one-hour interview with Steven Parton on Singularity University radio. Also, check out Singularity Hub and the write-up they did of the interview. We talk all things transhumanism, longevity, cyborgs, and the future:



Our bodies do a decent job of repairing themselves: they can patch up wounds, fight off infections, and even heal broken bones. But that only applies up to a point – lose a limb, for example, and it’s not coming back short of a prosthesis. Other creatures have mastered this skill, though, and now scientists at the University of California, Davis (UC Davis) and Harvard have sequenced the RNA transcripts of the immortal hydra and figured out how it manages to regrow lost body parts.