The Future is Now | Life after Artificial Intelligence

When a computer beat a human at chess 20 years ago, it marked the dawn of artificial intelligence as we know it.
These days, neural networks, deep learning and all types of sensors allow AI to be used in healthcare, to operate self-driving cars and to tweak our photos on Instagram.
In the #future, the ability to learn, to emulate the creative process and to self-organize may give rise to previously unimagined opportunities and unprecedented threats.


Tech’s Biggest Leaps From the Last 10 Years, and Why They Matter

As we enter the third decade of the 21st century, it seems appropriate to reflect on how technology developed over the last 10 years and to note the breakthroughs that were achieved.

The 2010s saw IBM’s Watson win a game of Jeopardy!, ushering in mainstream awareness of machine learning, and DeepMind’s AlphaGo defeat the world’s top Go players. It was the decade in which industrial tools like drones, 3D printers, genetic sequencing, and virtual reality (VR) all became consumer products. And it was a decade in which some alarming trends related to surveillance, targeted misinformation, and deepfakes came online.

For better or worse, the past decade was a breathtaking era in human history in which the idea of exponential growth in information technologies powered by computation became a mainstream concept.

Personalized microrobots swim through biological barriers, deliver drugs to cells

Tiny biohybrid robots on the micrometer scale can swim through the body and deliver drugs to tumors or provide other cargo-carrying functions. The natural environmental sensing tendencies of bacteria mean they can navigate toward certain chemicals or be remotely controlled using magnetic or sound signals.
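That gradient-following behavior is easy to picture with a toy simulation. The sketch below is a minimal, hypothetical run-and-tumble model in Python: the swimmer keeps its heading while the chemoattractant concentration rises and tumbles to a random new heading when it falls. The concentration field, step size, and source location are illustrative assumptions, not the paper’s model.

```python
import numpy as np

rng = np.random.default_rng(0)
SOURCE = np.array([50.0, 50.0])  # hypothetical chemoattractant source (e.g., a tumor)

def concentration(pos):
    # Toy field: highest at the source, decaying with distance.
    return np.exp(-np.linalg.norm(pos - SOURCE) / 20.0)

pos = np.array([0.0, 0.0])
heading = rng.uniform(0, 2 * np.pi)

for _ in range(500):
    # Run-and-tumble: hold course while concentration rises,
    # tumble to a random heading when it falls.
    c_before = concentration(pos)
    pos = pos + np.array([np.cos(heading), np.sin(heading)])
    if concentration(pos) < c_before:
        heading = rng.uniform(0, 2 * np.pi)

print("final distance to source:", np.linalg.norm(pos - SOURCE))
```

Even this crude strategy reliably drifts the swimmer toward the source, which is essentially what bacterial chemotaxis buys the biohybrid design for free.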

To be successful, these tiny biological robots must be made of materials that can evade the body’s immune response. They also have to be able to swim quickly through viscous environments and penetrate tissue to deliver their cargo.

In a paper published this week in APL Bioengineering, from AIP Publishing, researchers fabricated biohybrid bacterial microswimmers by combining a genetically engineered E. coli MG1655 substrain and nanoerythrosomes, small structures made from red blood cells.

Google research makes for an effortless robotic dog trot

As capable as robots are, the animals they tend to be modeled on are always much, much better. That’s partly because it’s difficult to teach a robot to walk like a dog directly from a dog — but this research from Google’s AI labs makes it considerably easier.

The goal of this research, a collaboration with UC Berkeley, was to find a way to efficiently and automatically transfer “agile behaviors” like a light-footed trot or spin from their source (a good dog) to a quadrupedal robot. This sort of thing has been done before, but as the researchers’ blog post points out, the established training process can often “require a great deal of expert insight, and often involves a lengthy reward tuning process for each desired skill.”

That doesn’t scale well, naturally, but that manual tuning is necessary to make sure the animal’s movements are approximated well by the robot. Even a very doglike robot isn’t actually a dog, and the way a dog moves may not be exactly the way the robot should, leading the latter to fall down, lock up or otherwise fail.
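A concrete way to see what that reward tuning involves is a pose-tracking reward: at each timestep, the robot is scored on how closely its joint angles and velocities match a reference frame from the animal’s motion clip. The function below is a toy illustration in that spirit; the joint layout, weights, and error scales are made-up assumptions, not the paper’s actual formulation.

```python
import numpy as np

def imitation_reward(robot, reference, w_pose=0.7, w_vel=0.3):
    """Toy motion-imitation reward: exponentiated tracking errors,
    approaching 1 as the robot matches the reference animal frame."""
    pose_err = np.sum((robot["angles"] - reference["angles"]) ** 2)
    vel_err = np.sum((robot["velocities"] - reference["velocities"]) ** 2)
    return w_pose * np.exp(-2.0 * pose_err) + w_vel * np.exp(-0.1 * vel_err)

# Illustrative call with made-up states for 12 joints (3 per leg):
robot = {"angles": np.zeros(12), "velocities": np.zeros(12)}
reference = {"angles": np.full(12, 0.1), "velocities": np.zeros(12)}
print(imitation_reward(robot, reference))
```

The manual-tuning pain the researchers describe lives in those weights and scales: values that make a trot stable can make a spin fail, which is why automating the adaptation matters.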

DARPA seeks enhanced low-light navigation performance for unmanned systems

A new programme from the US Defense Advanced Research Projects Agency (DARPA) aims to address a key weakness of autonomous and semi-autonomous land systems: the need for active illumination to navigate in low-light conditions.

Unmanned systems rely on active illumination — anything that emits light or electromagnetic radiation, such as light detection and ranging (LIDAR) systems — to navigate at night or underground.

However, according to Joe Altepeter, programme manager in DARPA’s Defense Sciences Office, this approach creates significant security concerns, as such emissions could be detected by potential adversaries.

The all-female robotics team in Afghanistan who made a cheap ventilator out of Toyota parts

Tech entrepreneur Roya Mahboob founded the trail-blazing programme in the Afghan city of Herat, selecting young girls, usually aged 14 or 15, from high schools across the country.

It was a passion project for Ms Mahboob, a serial entrepreneur who became one of Afghanistan’s first female chief executives at 23, established a non-profit organisation to help young women to build digital literacy, and has since been named one of Time Magazine’s 100 most influential people.

Participants are selected for the Dreamers based on their entrance exam for the 9th and 10th grades, and the very best of them then get to join the national team – the Afghan Girls Robotics Team – for international competitions. There are about 50 participants in the Dreamers, and they stay in the programme for about two years.

Coronavirus: Israeli researchers design low-cost open-source ventilator

Is a low-cost Israeli #ventilator the key to saving #coronavirus patients in #Iran, Africa and beyond?


“We are not talking about a website for the general public, we are talking about engineers and other experts, and we know the groups who are working on it because they are in touch with us via WhatsApp and emails, to ask questions and understand how to proceed,” he said.

“AmboVent” is a device inspired by the bag-valve mask ventilators that paramedics use when they’re manually ventilating patients in an ambulance, which also offers controls for respiration rate, volume, and maximum peak pressure. Organizations involved in its development include the Magen David Adom, Israeli Air Force 108 Electronics Depot; physicians from Hadassah and Tel Aviv Sourasky medical centers; Microsoft; Rafael, an Israeli defense contractor; Israeli Aerospace Industries; and mentors and students from FIRST Israel, a student robotics organization.
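The three controls mentioned above map naturally onto a simple breathing loop. Here is a hedged, simulation-only sketch of one ventilation cycle; the parameter names, the 1:2 inhale-to-exhale split, and the linear pressure model are illustrative assumptions and are not taken from the actual AmboVent firmware.

```python
import time

RESPIRATION_RATE = 14     # breaths per minute (user-set)
TIDAL_VOLUME = 0.5        # liters per breath (user-set)
MAX_PEAK_PRESSURE = 30.0  # cmH2O safety cutoff (user-set)

def read_pressure(compression):
    # Stand-in for a real pressure sensor: assume airway pressure
    # rises roughly linearly as the bag is squeezed.
    return 50.0 * compression

def breathe_once():
    cycle = 60.0 / RESPIRATION_RATE  # seconds per full breath
    inhale = cycle / 3.0             # assumed 1:2 inhale/exhale split
    steps = 20
    for i in range(1, steps + 1):
        compression = (i / steps) * TIDAL_VOLUME
        if read_pressure(compression) >= MAX_PEAK_PRESSURE:
            break  # stop squeezing once the pressure limit is hit
        time.sleep(inhale / steps)
    time.sleep(cycle - inhale)       # release the bag; passive exhale

breathe_once()
```

The point of the sketch is how little logic a minimal ventilator needs, which is exactly what makes a design like this buildable from common parts.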

A key feature of the project is that not only is the technology open-source, but its components can also be built with limited tools and parts, such as 3D printers and car parts, making production far more accessible, even in less developed countries.

“We kept the design and every aspect of it very simple so it would be as easy as possible to be replicated from everywhere,” he said.

Robots replace Japanese students at graduation amid coronavirus

Welcome to the twilight zone.


TOKYO (Reuters) — Spring graduation ceremonies in Japan have been cancelled because of the coronavirus pandemic, but students at one school were able to attend remotely by controlling avatar robots while logged on at home.

The robots, dubbed “Newme” by developer ANA Holdings, were dressed in graduation caps and gowns for the ceremony at the Business Breakthrough University in Tokyo.

The robots’ “faces” were tablets that displayed the faces of the graduates, who logged on at home and controlled the robots via their laptops.

AI reveals that mice’s faces express a range of emotions — just like humans

AI has revealed that mice have a range of facial expressions that show what they feel — offering fresh clues about how emotional responses arise in human brains.

Scientists at the Max Planck Institute of Neurobiology in Germany made the discovery by recording the faces of lab mice when they were exposed to different stimuli, such as sweet flavors and electric shocks. The researchers then used machine learning algorithms to analyze how the rodents’ faces changed when they experienced different feelings.
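That pipeline (face frames in, emotion categories out) is straightforward to sketch. The snippet below trains a classifier on random stand-in data with scikit-learn; the 128-dimensional frame features, the stimulus labels, and the random-forest model are illustrative assumptions, not the study’s exact method.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Stand-in data: 600 video frames, each reduced to a 128-dim feature
# vector describing the face, labeled by the stimulus the mouse
# received when the frame was captured.
X = rng.normal(size=(600, 128))
y = rng.choice(["sweet", "bitter", "shock"], size=600)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

# With real features, held-out accuracy well above chance is the
# evidence that a mouse's face carries a readable emotional signature.
print("held-out accuracy:", clf.score(X_test, y_test))
```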