
Circa 2020


The present age of information technology – the transformation of daily life by laptop computers, smartphones, so-called artificial intelligence, etc – became possible thanks to the exponential increase in the processing power of microcircuits, which began in the 1970s and continues today.

This process is described empirically by the famous Moore’s law: the number of transistor elements that can be packed into an integrated circuit chip doubles about every two years.
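The two-year doubling can be stated as a simple formula: a count that doubles every two years grows by a factor of 2^((year − base_year)/2). A minimal sketch, using the often-cited figure of roughly 2,300 transistors on the 1971 Intel 4004 as an illustrative baseline:

```python
def transistors(year, base_year=1971, base_count=2300, doubling_years=2):
    """Projected transistor count under a strict two-year doubling."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

# After one doubling period the count has doubled...
print(round(transistors(1973)))  # → 4600
# ...and after 20 years (ten doublings) it is 1024x the baseline.
print(round(transistors(1991)))  # → 2355200
```

Real chips track this curve only approximately; the function above illustrates the exponential trend, not any specific product line.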

Few people are aware, however, that an analogous process has been taking place in laser technology. The intensities of the light pulses which lasers can deliver have been increasing exponentially since the first laser was built in 1960.

Leidos subsidiary Dynetics has won the $12.3m Phase 1 contract of the Air Combat Evolution (ACE) programme, Technical Area 3 (TA3).

The ACE TA3 (Alpha Mosaic) contract was awarded by the Defense Advanced Research Projects Agency’s (DARPA) Strategic Technology Office (STO).

As an initial challenge scenario, the programme uses aerial dogfighting to introduce artificial intelligence (AI) into high-intensity air combat, with the aim of increasing warfighters' trust in combat autonomy.

Living organisms are capable of sensing and responding to their environment through reflex‐driven pathways. The grand challenge for mimicking such natural intelligence in miniature robots lies in achieving highly integrated body functionality, actuation, and sensing mechanisms. Here, somatosensory light‐driven robots (SLiRs) based on a smart thin‐film composite tightly integrating actuation and multisensing are presented. The SLiR subsumes pyro/piezoelectric responses and piezoresistive strain sensation under a photoactuator transducer, enabling simultaneous yet non‐interfering perception of its body temperature and actuation deformation states. The compact thin film, when combined with kirigami, facilitates rapid customization of low‐profile structures for morphable, mobile, and multiple robotic functionality. For example, an SLiR walker can move forward on different surfaces, while providing feedback on its detailed locomotive gaits and subtle terrain textures, and an SLiR anthropomorphic hand shows bodily senses arising from concerted mechanoreception, thermoreception, proprioception, and photoreception. Untethered operation with an SLiR centipede is also demonstrated, which can execute distinct, localized body functions from directional motility, multisensing, to wireless human and environment interactions. This SLiR, which is capable of integrated perception and motility, offers new opportunities for developing diverse intelligent behaviors in soft robots.

A team of materials scientists at Lawrence Berkeley National Laboratory (Berkeley Lab) – scientists who normally spend their time researching things like high-performance materials for thermoelectrics or battery cathodes – has built a text-mining tool in record time to help the global scientific community synthesize the mountain of scientific literature on COVID-19 being generated every day.

The tool, live at covidscholar.org, uses natural language processing techniques to not only quickly scan and search tens of thousands of research papers, but also help draw insights and connections that may otherwise not be apparent. The hope is that the tool could eventually enable “automated science.”

“On Google and other search engines people search for what they think is relevant,” said Berkeley Lab scientist Gerbrand Ceder, one of the project leads. “Our objective is to do information extraction so that people can find nonobvious information and relationships. That’s the whole idea of machine learning and natural language processing that will be applied on these datasets.”
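The retrieval layer of a tool like this starts from something much simpler than machine learning: weighting a document's terms by how rare they are across the corpus (TF-IDF). The corpus, query, and scoring below are purely illustrative toys, not covidscholar.org's actual pipeline:

```python
import math
from collections import Counter

def tfidf_search(query, docs):
    """Rank documents by the summed TF-IDF weight of the query terms.

    TF-IDF = (term frequency in the document) x log(N / document frequency),
    so words common to every document contribute nothing.
    """
    tokenized = [d.lower().split() for d in docs]
    n = len(docs)
    # document frequency: how many docs contain each term at least once
    df = Counter(t for toks in tokenized for t in set(toks))
    scores = []
    for i, toks in enumerate(tokenized):
        tf = Counter(toks)
        score = sum(
            (tf[w] / len(toks)) * math.log(n / df[w])
            for w in query.lower().split() if w in tf
        )
        scores.append((score, i))
    # return matching document indices, best score first
    return [i for s, i in sorted(scores, reverse=True) if s > 0]

papers = [
    "coronavirus spike protein binding",
    "battery cathode materials for thermoelectric devices",
    "antibody response to the coronavirus spike protein",
]
print(tfidf_search("spike protein", papers))  # → [0, 2]
```

Production systems replace the bag-of-words scoring with learned embeddings, which is what lets them surface the "nonobvious information and relationships" Ceder describes, but the ranking skeleton is the same.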

Circa 2019


When you imagine an exoskeleton, chances are it might look a bit like the Guardian XO from Sarcos Robotics. The XO is literally a robot you wear (or maybe, it wears you). The suit’s powered limbs sense your movements and match their position to yours with little latency to give you effortless superstrength and endurance—lifting 200 pounds will feel like 10.

A vision of robots and humankind working together in harmony. Now, isn’t that nice?

South Korea is developing autonomous robots to recover the remains of soldiers killed in the Korean War.

The excavations will take place in Arrowhead Ridge, a former battlefield inside the demilitarized zone (DMZ) that bisects the Korean Peninsula.

The droids will use AI to scan underground for bodies of soldiers still missing from the war, which began in 1950 when North Korean communist forces invaded the capitalist south.

Many emerging technologies rely on high-quality lasers. Laser-based LiDAR sensors can provide highly accurate scans of three-dimensional spaces, and as such are crucial in applications ranging from autonomous vehicles to geological mapping technologies and emergency response systems. High-quality lasers are also a key part of the high-speed, high-volume data centers that are the backbone of the internet.

When assessing the quality of a laser, researchers look to the noise in the laser's frequency, or the number of times the laser's light wave toggles each second. Low-quality, "noisy" lasers have more random variations in those toggles, making them useless for systems that are meant to return or convey densely packed information.
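Frequency noise can be pictured as random jitter added to the wave's nominal frequency at each instant; the spread (standard deviation) of that jitter is one simple figure of merit. A toy sketch with made-up numbers, not a model of any real laser or measurement setup (real characterization uses heterodyne beat notes and phase-noise analyzers):

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 100e6        # sample rate, Hz (assumed)
f0 = 1e6          # nominal beat frequency, Hz (assumed)
n = 10_000        # number of samples

for label, jitter_hz in [("quiet laser", 10.0), ("noisy laser", 10e3)]:
    # instantaneous frequency = nominal value + random jitter
    f_inst = f0 + rng.normal(0.0, jitter_hz, n)
    # integrating frequency over time gives the phase of the light wave
    phase = 2 * np.pi * np.cumsum(f_inst) / fs
    wave = np.cos(phase)
    # the jitter's spread quantifies how "noisy" the laser is
    print(f"{label}: frequency std ~ {f_inst.std():.0f} Hz")
```

The "quiet" tone stays within tens of hertz of its nominal frequency while the "noisy" one wanders by kilohertz; a phase noise filter of the kind described below aims to move a laser from the second category toward the first.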

At present, lasers with adequately low frequency noise are bulky, expensive and an impractical choice for mass manufacturing. Penn Engineers have set out to solve this problem with a device called a “phase noise filter” that can turn low-cost, compact lasers into those suitable for LiDAR and more.