
As we enter our third decade in the 21st century, it seems appropriate to reflect on the ways technology developed and note the breakthroughs that were achieved in the last 10 years.

The 2010s saw IBM’s Watson win a game of Jeopardy, ushering in mainstream awareness of machine learning, along with DeepMind’s AlphaGo defeating the world’s best Go players. It was the decade that industrial tools like drones, 3D printers, genetic sequencing, and virtual reality (VR) all became consumer products. And it was a decade in which some alarming trends related to surveillance, targeted misinformation, and deepfakes came online.

For better or worse, the past decade was a breathtaking era in human history in which the idea of exponential growth in information technologies powered by computation became a mainstream concept.

Tiny biohybrid robots on the micrometer scale can swim through the body and deliver drugs to tumors or provide other cargo-carrying functions. The natural environmental sensing tendencies of bacteria mean they can navigate toward certain chemicals or be remotely controlled using magnetic or sound signals.

To be successful, these tiny biological robots must be made of materials that can evade clearance by the body’s immune system. They also have to be able to swim quickly through viscous environments and penetrate tissue to deliver their cargo.

In a paper published this week in APL Bioengineering, from AIP Publishing, researchers fabricated biohybrid bacterial microswimmers by combining a genetically engineered E. coli MG1655 substrain and nanoerythrosomes, small structures made from red blood cells.

As capable as robots are, the original animals after which they tend to be designed are always much, much better. That’s partly because it’s difficult to learn how to walk like a dog directly from a dog — but this research from Google’s AI labs makes it considerably easier.

The goal of this research, a collaboration with UC Berkeley, was to find a way to efficiently and automatically transfer “agile behaviors” like a light-footed trot or spin from their source (a good dog) to a quadrupedal robot. This sort of thing has been done before, but as the researchers’ blog post points out, the established training process can often “require a great deal of expert insight, and often involves a lengthy reward tuning process for each desired skill.”

That doesn’t scale well, naturally, but that manual tuning is necessary to make sure the animal’s movements are approximated well by the robot. Even a very doglike robot isn’t actually a dog, and the way a dog moves may not be exactly the way the robot should move, leading the latter to fall down, lock up, or otherwise fail.
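The idea of transferring an animal’s motion to a robot is usually framed as an imitation reward: the robot is scored on how closely its joint angles and velocities track a retargeted reference clip. The sketch below illustrates that shape of reward; the weights, scales, and exponential form are common choices in motion-imitation work, not the specific values from this research.

```python
import numpy as np

def imitation_reward(robot_pose, ref_pose, robot_vel, ref_vel,
                     w_pose=0.6, w_vel=0.4, scale_pose=5.0, scale_vel=0.1):
    """Score how well the robot tracks a retargeted animal motion clip.

    robot_pose / ref_pose: joint angles (radians) at the current timestep.
    robot_vel / ref_vel: joint velocities.
    Weights and scales here are illustrative, not the paper's values.
    """
    pose_err = np.sum((np.asarray(ref_pose) - np.asarray(robot_pose)) ** 2)
    vel_err = np.sum((np.asarray(ref_vel) - np.asarray(robot_vel)) ** 2)
    # Exponentiated squared errors keep each term in (0, 1],
    # so perfect tracking yields a reward of w_pose + w_vel = 1.0.
    return (w_pose * np.exp(-scale_pose * pose_err)
            + w_vel * np.exp(-scale_vel * vel_err))
```

The “lengthy reward tuning” the researchers mention corresponds to hand-adjusting weights and scales like these for each new skill, which is exactly the per-skill effort their approach tries to reduce.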

A new programme from the US Defense Advanced Research Projects Agency (DARPA) aims to address a key weakness of autonomous and semi-autonomous land systems: the need for active illumination to navigate in low-light conditions.

Unmanned systems rely on active illumination — anything that emits light or electromagnetic radiation, such as light detection and ranging (LIDAR) systems — to navigate at night or underground.

However, according to Joe Altepeter, programme manager in DARPA’s Defense Sciences Office, this approach creates significant security concerns, as such emissions could be detected by potential adversaries.

Tech entrepreneur Roya Mahboob founded the trail-blazing programme in the Afghan city of Herat, selecting young girls from high schools across the country, usually aged 14 or 15, for the programme.

It was a passion project for Ms Mahboob, a serial entrepreneur who became one of Afghanistan’s first female chief executives at 23, established a non-profit organisation to help young women to build digital literacy, and has since been named one of Time Magazine’s 100 most influential people.

Participants are selected for the Dreamers based on their entrance exam for the 9th and 10th grades, and the very best of them then get to join the national team – the Afghan Girls Robotics Team – for international competitions. There are about 50 participants in the Dreamers, and they stay in the programme for about two years.

Is a low-cost Israeli ventilator the key to saving coronavirus patients in Iran, Africa and more?


“We are not talking about a website for the general public, we are talking about engineers and other experts, and we know the groups who are working on it because they are in touch with us via WhatsApp and emails, to ask questions and understand how to proceed,” he said.

“AmboVent” is a device inspired by the bag-valve mask ventilators that paramedics use when they’re manually ventilating patients in an ambulance, which also offers controls for respiration rate, volume, and maximum peak pressure. Organizations involved in its development include the Magen David Adom, Israeli Air Force 108 Electronics Depot; physicians from Hadassah and Tel Aviv Sourasky medical centers; Microsoft; Rafael, an Israeli defense contractor; Israeli Aerospace Industries; and mentors and students from FIRST Israel, a student robotics organization.
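A bag-squeezing ventilator of this kind is, at its core, a timed motor stroke with a pressure safety cutoff. The toy control loop below shows how respiration rate, delivered volume, and a maximum peak pressure could interact; all names, numbers, and the linear pressure model are illustrative assumptions, not taken from the actual AmboVent firmware.

```python
class BVMController:
    """Toy control loop in the spirit of a bag-valve-mask ventilator.

    rate_bpm: breaths per minute; volume_frac: how far the arm
    compresses the bag (0..1); max_pressure_cmh2o: abort threshold
    for the squeeze. Purely illustrative, not a medical device.
    """

    def __init__(self, rate_bpm=14, volume_frac=0.6, max_pressure_cmh2o=40.0):
        self.period_s = 60.0 / rate_bpm      # time budget per breath cycle
        self.volume_frac = volume_frac
        self.max_pressure = max_pressure_cmh2o

    def breath_profile(self, steps=10, read_pressure=lambda pos: 50.0 * pos):
        """One inhale stroke: ramp the arm position, stop on a pressure spike.

        read_pressure stands in for a real airway-pressure sensor;
        the default linear model is a placeholder.
        """
        positions = []
        for i in range(1, steps + 1):
            pos = self.volume_frac * i / steps
            if read_pressure(pos) > self.max_pressure:
                break  # safety cutoff: never exceed peak pressure
            positions.append(pos)
        return positions
```

The point of the sketch is the safety check inside the stroke: the rate and volume settings shape the breath, but the pressure limit can interrupt it at any step.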

A key feature of the project is that not only is the technology open source, but its components can be built with limited tools and parts — for example 3D printers and car parts — making production much more accessible even in less developed countries.

Welcome to the twilight zone.


TOKYO (Reuters) — Spring graduation ceremonies in Japan have been cancelled because of the coronavirus pandemic, but students at one school were able to attend remotely by controlling avatar robots while logged on at home.

The robots, dubbed “Newme” by developer ANA Holdings, were dressed in graduation caps and gowns for the ceremony at the Business Breakthrough University in Tokyo.

The robots’ “faces” were tablets that displayed the faces of the graduates, who logged on at home and controlled the robots via their laptops.

AI has revealed that mice have a range of facial expressions that show how they feel — offering fresh clues about how emotional responses arise in human brains.

Scientists at the Max Planck Institute of Neurobiology in Germany made the discovery by recording the faces of lab mice when they were exposed to different stimuli, such as sweet flavors and electric shocks. The researchers then used machine learning algorithms to analyze how the rodents’ faces changed when they experienced different feelings.
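The core machine-learning step here is mapping per-frame facial features to an emotion label. As a minimal stand-in for that kind of classifier, the sketch below assigns each feature vector (imagine ear, nose, and whisker positions extracted from video frames) to the nearest class centroid; the feature encoding and labels are illustrative assumptions, and the study’s actual pipeline is more sophisticated.

```python
import numpy as np

def nearest_centroid_classify(train_feats, train_labels, test_feats):
    """Assign each test feature vector to the label whose training
    centroid is closest in Euclidean distance.

    train_feats / test_feats: lists of numeric feature vectors
    (e.g. facial landmark coordinates per video frame).
    train_labels: one emotion label per training vector.
    """
    labels = sorted(set(train_labels))
    # Mean feature vector per emotion label.
    centroids = {
        lab: np.mean([f for f, l in zip(train_feats, train_labels) if l == lab],
                     axis=0)
        for lab in labels
    }
    preds = []
    for f in test_feats:
        preds.append(min(labels,
                         key=lambda lab: np.linalg.norm(np.asarray(f) - centroids[lab])))
    return preds
```

Even this crude scheme captures the logic of the experiment: if expressions evoked by sweet flavors and electric shocks occupy distinct regions of feature space, a classifier can recover the stimulus from the face alone.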

Circa 2017


The era of merging our minds with technology has begun. Already, we can hack the brain to treat diseases such as Parkinson’s or help paralyzed people move again. But what if you could install a chip in your head that would not only fix any health issues, but could amp up your brainpower — would you remember every word said during a meeting, finish crossword puzzles faster, drive better thanks to enhanced senses, or pick up a new language before your next trip?

That’s the future envisioned by Elon Musk, the Tesla and SpaceX CEO who recently announced Neuralink, a new company dedicated to blending human brains with computers. In Musk’s view, we’ll have to keep pace with ever-smarter artificial intelligence by implanting a “neural lace,” a sci-fi-inspired brain-machine interface that will make us smarter.

“Under any rate of advancement in AI, we will be left behind by a lot,” Musk said last year. “The benign situation with ultra-intelligent AI is that we would be so far below in intelligence we’d be like a pet, or a house cat.”