
Archive for the ‘robotics/AI’ category: Page 1826

Sep 1, 2019

Elon Musk: Humanity Is a Kind of ‘Biological Boot Loader’ for AI

Posted in categories: biological, Elon Musk, robotics/AI

On Wednesday, Tesla CEO Elon Musk and Alibaba cofounder Jack Ma took the stage at the World AI Conference in Shanghai to debate artificial intelligence and its implications for humanity. As expected, Ma took a far more optimistic stance than Musk. Ma encouraged people to have faith in humanity, our creativity, and the future. “I don’t think artificial intelligence is a threat,” he said, to which Musk replied, “I don’t know, man, that’s like, famous last words.” An edited transcript of the discussion follows.

Elon Musk: What are we supposed to say? Just things about AI perhaps? Yeah. Okay. Let’s see.

Jack Ma: The AI, right? Okay, great.

Aug 31, 2019

Psychosensory electronic skin technology for future AI and humanoid development

Posted in categories: biotech/medical, cyborgs, engineering, robotics/AI, space

Professor Jae Eun Jang’s team in the Department of Information and Communication Engineering has developed electronic skin technology that can detect “prick” and “hot” pain sensations like humans. This research result has applications in the development of humanoid robots and prosthetic hands in the future.

Scientists are continuously performing research to imitate the tactile, olfactory, and taste senses, which are expected to be the next mimetic technologies for various applications. Currently, most tactile sensor research focuses on physical mimetic technologies that measure the pressure a robot uses to grab an object, but psychosensory tactile research on mimicking human tactile responses, such as those caused by soft, smooth, or rough surfaces, still has a long way to go.

Professor Jae Eun Jang’s team has developed a tactile sensor that can feel pain and temperature like humans through a joint project with Professor Cheil Moon’s team in the Department of Brain and Cognitive Science, Professor Ji-woong Choi’s team in the Department of Information and Communication Engineering, and Professor Hongsoo Choi’s team in the Department of Robotics Engineering. Its key strengths are that it simplifies the sensor structure and can measure pressure and temperature at the same time. Furthermore, it can be applied to various tactile systems regardless of the sensor’s measurement principle.

Aug 31, 2019

Researchers develop process flow for high-res 3D printing of mini soft robotic actuators

Posted in categories: 3D printing, robotics/AI

Soft robots are a class of robotic systems made of compliant materials and capable of safely adapting to complex environments. They have seen rapid growth recently and come in a variety of designs spanning multiple length scales, from meters to submicrometers.

In particular, small soft robots at millimeter scale are of practical interest as they can be designed as a combination of miniature actuators simply driven by pneumatic pressure. They are also well suited for navigation in confined areas and manipulation of small objects.

However, scaling soft pneumatic robots down to the millimeter scale shrinks their features by more than an order of magnitude. The design complexity of such robots demands great delicacy when they are fabricated with traditional processes such as molding and soft lithography. Although emerging 3D printing technologies like digital light processing (DLP) offer high theoretical resolutions, printing microscale voids and channels without causing clogging remains challenging. Indeed, successful examples of 3D-printed miniature soft pneumatic robots are rare.

Aug 31, 2019

Meet Olli 2.0, a 3D-printed autonomous shuttle

Posted in categories: 3D printing, robotics/AI

From afar, Olli resembles many of the “future is now!” electric autonomous shuttles that have popped up in recent years.

The tall rectangular pod, with its wide-set headlights and expansive windows nestled between a rounded frame, gives the shuttle a friendly countenance that screams, ever so gently, “come along, take a ride.”


Aug 31, 2019

NASA Considers Robotic Lunar Pit Mission; Moon’s Subsurface Key To Exploration

Posted in categories: robotics/AI, space, sustainability

The Moon’s subsurface is the key to its long-term development and sustainability, says a NASA scientist.


A view of the Apollo 11 lunar module “Eagle” as it returned from the surface of the moon to dock with the command module “Columbia”. A smooth mare area is visible on the Moon below and a half-illuminated Earth hangs over the horizon. Command module pilot Michael Collins took this picture.

Aug 31, 2019

Robot pilot that can grab the flight controls gets its plane licence

Posted in categories: robotics/AI, transportation

By David Hambling

A robot pilot is learning to fly. It has passed its pilot’s test and flown its first plane, but it has also had its first mishap.

Unlike a traditional autopilot, the ROBOpilot Unmanned Aircraft Conversion System literally takes the controls, pressing on foot pedals and handling the yoke using robotic arms. It reads the dials and meters with a computer vision system.

Aug 30, 2019

Watch a Self-Driving Car Deftly Zoom Through a Heavy Rainstorm

Posted in categories: robotics/AI, transportation

The video uploaded by Logan LeGrand today shows a modified 2019 Toyota Corolla hatchback easily maintaining its 40 mph speed throughout a heavy downpour.

Aug 30, 2019

Robotic thread is designed to slip through the brain’s blood vessels

Posted in categories: biotech/medical, robotics/AI

Magnetically controlled device could deliver clot-reducing therapies in response to stroke or other brain blockages.

Aug 30, 2019

A deep learning technique for context-aware emotion recognition

Posted in categories: biotech/medical, robotics/AI

A team of researchers at Yonsei University and École Polytechnique Fédérale de Lausanne (EPFL) has recently developed a new technique that can recognize emotions by analyzing people’s faces in images along with contextual features. They presented and outlined their deep learning-based architecture, called CAER-Net, in a paper pre-published on arXiv.

For several years, researchers worldwide have been trying to develop tools for automatically detecting emotions by analyzing images, videos, or audio clips. These tools could have numerous applications, for instance, improving robot-human interactions or helping doctors to identify signs of mental or neural disorders (e.g., based on atypical speech patterns, facial features, etc.).

So far, the majority of techniques for recognizing emotions in images have been based on the analysis of people’s facial expressions, essentially assuming that these expressions best convey humans’ emotional responses. As a result, most datasets for training and evaluating emotion recognition tools (e.g., the AFEW and FER2013 datasets) only contain cropped images of human faces.
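The core idea behind a context-aware approach like CAER-Net is to run two feature streams, one over the cropped face and one over the surrounding scene, and then fuse them adaptively before classification. The toy sketch below illustrates that fusion pattern with plain NumPy; all dimensions, weight names, and the single-layer encoders are hypothetical simplifications for illustration, not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(x, w):
    """Toy encoder: one linear layer with ReLU (stands in for a CNN stream)."""
    return np.maximum(x @ w, 0.0)

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Hypothetical feature sizes, chosen only for this illustration.
FACE_DIM, CTX_DIM, HID, N_EMOTIONS = 64, 128, 32, 7

w_face = rng.normal(size=(FACE_DIM, HID))
w_ctx  = rng.normal(size=(CTX_DIM, HID))
w_attn = rng.normal(size=(2 * HID, 2))    # scores one fusion weight per stream
w_out  = rng.normal(size=(HID, N_EMOTIONS))

def predict(face_feats, context_feats):
    f = encode(face_feats, w_face)                    # face stream
    c = encode(context_feats, w_ctx)                  # context stream
    attn = softmax(np.concatenate([f, c]) @ w_attn)   # adaptive fusion weights
    fused = attn[0] * f + attn[1] * c                 # weighted sum of streams
    return softmax(fused @ w_out)                     # emotion distribution

probs = predict(rng.normal(size=FACE_DIM), rng.normal(size=CTX_DIM))
print(probs.shape)  # (7,) — one probability per emotion class
```

The learned fusion weights let the model lean on scene context when the face is small, occluded, or ambiguous, which is exactly the failure mode of face-only datasets described above.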

Aug 30, 2019

Artificial intelligence uncovers new details about Old Master paintings

Posted in categories: media & arts, robotics/AI

Artificial intelligence has been used to analyse high-resolution digital X-ray images of the world famous Ghent Altarpiece, as part of an investigative project led by UCL.

The finding is expected to improve our understanding of art masterpieces and provide new opportunities for art investigation, conservation and presentation.

Researchers from the National Gallery, Duke University and UCL worked with technical images acquired from the brothers Van Eyck’s Ghent Altarpiece, a large and complex 15th-century altarpiece in St Bavo’s Cathedral, Belgium.