Last week, Honeywell’s Quantum Solutions division released its first commercial quantum computer: a trapped-ion system comprising 10 qubits. The H1, as it’s called, uses the same ion-trap chip the company debuted as a prototype, but with four additional ions. The company also revealed a roadmap that it says will rapidly lead to much more powerful quantum computers. Separately, a competitor in trapped-ion quantum computing, Maryland-based startup IonQ, unveiled a 32-qubit machine last month.

Human-Autonomy Interaction, Collaboration and Trust — Dr. Julie Marble, JHU Applied Physics Laboratory (APL)


Dr. Julie Marble is a senior scientist at the Johns Hopkins University Applied Physics Laboratory (JHUAPL) leading research in human-autonomy interaction, collaboration and trust.

Dr. Marble earned her PhD in Human Factors/Cognitive Psychology from Purdue University. After graduating, she joined the Idaho National Laboratory (INL), a United States Department of Energy national laboratory focused on nuclear research, working first in the Human Factors group and then in the Human and Robotic Systems group.

We can immediately supersede the Mojo Vision approach to retinal projection with an interim projection system built on metalenses. The Mojo Lens approach tries to put everything, including the display, the projection optics, and the energy source, onto a single contact lens. With recent breakthroughs in scaling up the size of metalenses, a combination of a contact metalens and a small pair of glasses can be used instead. This is emphatically not the Google Glass approach, which did not use modern metalenses. The system would work as follows:

1) Thin video cameras are mounted on both sides of a pair of wearable glasses.

2) The images from these cameras are projected by metalenses in the glasses as narrow beams aimed at the center of each pupil.

3) A contact lens with a tiny metalens mounted at its center, directly over the pupil, fans this beam out through the pupil onto the full width of the curved retina (a rough geometric sketch follows this list).
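To get a feel for what step 3 asks of the contact metalens, a back-of-the-envelope estimate helps. The Python sketch below is illustrative only: the beam radius, field half-angle, and simplified reduced-eye geometry are all assumptions, and the refraction of the cornea and crystalline lens is ignored, so treat it as a rough feasibility check rather than a design.

```python
# Rough feasibility estimate for the contact-metalens beam expander in
# step 3. Assumptions (not a real design): a collimated ~1 mm beam from
# the glasses, a desired +/-60 degree retinal field, and a simplified
# reduced-eye model that ignores the eye's own refraction.
import math

beam_radius_mm = 0.5     # assumed radius of the narrow beam from the glasses
half_field_deg = 60.0    # assumed half-angle needed to span a wide retinal field

# A diverging lens turns a collimated beam of radius r into a cone with
# half-angle theta, where tan(theta) = r / |f|, so |f| = r / tan(theta).
theta = math.radians(half_field_deg)
focal_length_mm = beam_radius_mm / math.tan(theta)

# The numerical aperture (in air) the metalens must supply for that cone.
numerical_aperture = math.sin(theta)

print(f"required focal length ~ {focal_length_mm:.2f} mm")        # ~0.29 mm
print(f"required numerical aperture ~ {numerical_aperture:.2f}")  # ~0.87
```

A sub-millimeter focal length and a numerical aperture near 0.9 are demanding, but metalenses have been demonstrated with numerical apertures in roughly this range at visible wavelengths, which is why a flat metalens, rather than a conventional refractive element, is the plausible choice at contact-lens scale.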

The end result would be a 360-degree, full-panorama image. This image could be a high-resolution, real-time view of the wearer’s surroundings, a projected movie, or augmented reality superimposed on the normal field of vision, and it can inherently be full-color 3D. Such a system would of course be complemented with earphones. Modern hearing aids are already so small they can barely be seen, and have batteries that last a week. A pair of earphones would allow full 3D sound and serve as the audible complement of augmented vision.