Archive for the ‘robotics/AI’ category: Page 1466

Sep 19, 2020

Huang’s Law Is the New Moore’s Law, and Explains Why Nvidia Wants Arm

Posted in category: robotics/AI

The rule that the same dollar buys twice the computing power every 18 months is no longer true, but a new law—which we named for the CEO of Nvidia, the company now most emblematic of commercial AI—is in full effect.
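For concreteness, a fixed doubling period compounds as 2^(t/T). The short sketch below uses the 18-month figure quoted above; the doubling period is a parameter, so any other (hypothetical) cadence, such as a faster GPU-era one, can be plugged in for comparison.

```python
# Rough sketch of how a "doubling every T months" rule compounds.
# The 18-month default comes from the classic Moore's-law formulation above;
# any alternative cadence is a hypothetical input, not a figure from the article.

def relative_performance(months: float, doubling_period_months: float = 18.0) -> float:
    """Computing power per dollar, relative to today, after `months`."""
    return 2.0 ** (months / doubling_period_months)

if __name__ == "__main__":
    for years in (1, 5, 10):
        gain = relative_performance(years * 12)
        print(f"{years:>2} yr at 18-month doubling: {gain:8.1f}x")
```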

Sep 18, 2020

The brain’s memory abilities inspire AI experts in making neural networks less ‘forgetful’

Posted in categories: biotech/medical, robotics/AI

Artificial intelligence (AI) experts at the University of Massachusetts Amherst and the Baylor College of Medicine report that they have successfully addressed what they call a “major, long-standing obstacle to increasing AI capabilities” by drawing inspiration from a human brain memory mechanism known as “replay.”

First author and postdoctoral researcher Gido van de Ven and principal investigator Andreas Tolias at Baylor, with Hava Siegelmann at UMass Amherst, write in Nature Communications that they have developed a new method to protect neural networks, "surprisingly efficiently," from "catastrophic forgetting," in which, upon learning new lessons, the networks forget what they had learned before.

Siegelmann and colleagues point out that deep neural networks are the main drivers behind recent AI advances, but progress is held back by this forgetting.
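The paper's contribution is a brain-inspired form of replay; as a generic illustration of the replay idea only (not the authors' method), the sketch below keeps a small buffer of past examples and mixes them into each new-task batch so that gradients on new data do not simply overwrite earlier learning. All class names and hyperparameters here are hypothetical.

```python
# Minimal rehearsal-style replay sketch (generic, not the paper's approach):
# store a few past examples and interleave them with new-task training batches.
import random
import torch
import torch.nn.functional as F

class ReplayBuffer:
    def __init__(self, capacity=1000):
        self.capacity, self.data = capacity, []

    def add(self, x, y):
        if len(self.data) >= self.capacity:
            self.data.pop(random.randrange(len(self.data)))  # evict a random old example
        self.data.append((x, y))

    def sample(self, k):
        batch = random.sample(self.data, min(k, len(self.data)))
        xs, ys = zip(*batch)
        return torch.stack(xs), torch.stack(ys)

def train_step(model, opt, x_new, y_new, buffer, replay_k=32):
    opt.zero_grad()
    loss = F.cross_entropy(model(x_new), y_new)        # loss on the new task
    if buffer.data:                                    # replay: revisit old examples
        x_old, y_old = buffer.sample(replay_k)
        loss = loss + F.cross_entropy(model(x_old), y_old)
    loss.backward()
    opt.step()
    for x, y in zip(x_new, y_new):                     # remember some new examples
        buffer.add(x.detach(), y.detach())
    return loss.item()
```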

Sep 18, 2020

New data processing module makes deep neural networks smarter

Posted in category: robotics/AI

Artificial intelligence researchers at North Carolina State University have improved the performance of deep neural networks by combining feature normalization and feature attention modules into a single module that they call attentive normalization (AN). The hybrid module improves the accuracy of the system significantly, while using negligible extra computational power.

“Feature normalization is a crucial element of training deep neural networks, and feature attention is equally important for helping networks highlight which features learned from raw data are most important for accomplishing a given task,” says Tianfu Wu, corresponding author of a paper on the work and an assistant professor of electrical and computer engineering at NC State. “But they have mostly been treated separately. We found that combining them made them more efficient and effective.”

To test their AN module, the researchers plugged it into four of the most widely used neural architectures: ResNets, DenseNets, MobileNetV2 and AOGNets. They then tested the networks against two industry-standard benchmarks: the ImageNet-1000 classification benchmark and the MS-COCO 2017 object detection and instance segmentation benchmark.
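As a rough, hypothetical PyTorch sketch of what a single module combining normalization and attention can look like: features are batch-normalized without a fixed affine transform, and an attention branch predicts instance-specific weights over several candidate scale/shift vectors. This is an illustrative approximation of the general idea, not the NC State implementation; the class and parameter names are invented.

```python
# Illustrative "attentive normalization"-style module (approximation, not the paper's code).
import torch
import torch.nn as nn

class AttentiveNorm2d(nn.Module):
    def __init__(self, channels, k=5):
        super().__init__()
        self.norm = nn.BatchNorm2d(channels, affine=False)    # feature normalization
        self.gammas = nn.Parameter(torch.ones(k, channels))   # K candidate scale vectors
        self.betas = nn.Parameter(torch.zeros(k, channels))   # K candidate shift vectors
        self.attn = nn.Sequential(                            # instance-specific mixture weights
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(channels, k), nn.Softmax(dim=1),
        )

    def forward(self, x):
        z = self.norm(x)
        w = self.attn(x)                                  # (N, K) attention over affine transforms
        gamma = (w @ self.gammas)[:, :, None, None]       # (N, C, 1, 1)
        beta = (w @ self.betas)[:, :, None, None]
        return gamma * z + beta

# Usage: AttentiveNorm2d(64)(torch.randn(2, 64, 32, 32)).shape -> torch.Size([2, 64, 32, 32])
```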

Sep 18, 2020

Fully Automated Microchip Electrophoresis Analyzer for Potential Life Detection Missions

Posted in categories: robotics/AI, space

There are a variety of complementary observations that could be used in the search for life in extraterrestrial settings. At the molecular scale, patterns in the distribution of organics could provide powerful evidence of a biotic component. In order to observe these molecular biosignatures during spaceflight missions, it is necessary to perform separation science in situ. Microchip electrophoresis (ME) is ideally suited for this task. Although this technique is readily miniaturized and numerous instruments have been developed over the last 3 decades, to date, all lack the automation capabilities needed for future missions of exploration. We have developed a portable, automated, battery-powered, and remotely operated ME instrument coupled to laser-induced fluorescence detection. This system contains all the necessary hardware and software interfaces for end-to-end functionality. Here, we report the first application of the system for amino acid analysis coupled to an extraction unit in order to demonstrate automated sample-to-data operation. The system was remotely operated aboard a rover during a simulated Mars mission in the Atacama Desert, Chile. This is the first demonstration of a fully automated ME analysis of soil samples relevant to planetary exploration. This validation is a critical milestone in the advancement of this technology for future implementation on a spaceflight mission.

Sep 18, 2020

Quantum-inspired multimodal fusion for video sentiment analysis

Posted in categories: quantum physics, robotics/AI

We tackle the crucial challenge of fusing different modalities of features for multimodal sentiment analysis. Mainly based on neural networks, existing approaches largely model multimodal interactions in an implicit and hard-to-understand manner. We address this limitation with inspirations from quantum theory, which contains principled methods for modeling complicated interactions and correlations. In our quantum-inspired framework, the word interaction within a single modality and the interaction across modalities are formulated with superposition and entanglement respectively at different stages. The complex-valued neural network implementation of the framework achieves comparable results to state-of-the-art systems on two benchmarking video sentiment analysis datasets. In the meantime, we produce the unimodal and bimodal sentiment directly from the model to interpret the entangled decision.
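The "superposition" framing can be made concrete with a toy example: a unit-norm complex amplitude vector over a few sentiment basis states, turned into a density matrix and measured. The numpy sketch below only illustrates that quantum-inspired intuition; it is not the paper's complex-valued network, and the amplitudes are made up.

```python
# Toy illustration of the superposition/measurement intuition (not the paper's model).
import numpy as np

def superpose(amplitudes):
    """Normalize complex amplitudes into a unit state vector |psi>."""
    psi = np.asarray(amplitudes, dtype=complex)
    return psi / np.linalg.norm(psi)

def measure(psi, basis_index):
    """Probability of observing one basis state (e.g. a sentiment class)."""
    rho = np.outer(psi, psi.conj())            # density matrix |psi><psi|
    projector = np.zeros_like(rho)
    projector[basis_index, basis_index] = 1.0
    return float(np.real(np.trace(projector @ rho)))

# A toy "word state" leaning positive: amplitudes over [negative, neutral, positive]
psi = superpose([0.2, 0.3 + 0.1j, 0.9])
print([round(measure(psi, i), 3) for i in range(3)])   # probabilities sum to 1
```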

Sep 18, 2020

NASA to test precision automated landing system designed for the moon and Mars on upcoming Blue Origin mission

Posted in categories: information science, robotics/AI, space travel

NASA will test a new precision landing system, designed for the tough terrain of the moon and Mars, for the first time during an upcoming mission of Blue Origin’s New Shepard reusable suborbital rocket. The “Safe and Precise Landing – Integrated Capabilities Evolution” (SPLICE) system is made up of a number of lasers, an optical camera and a computer that takes the data collected by the sensors and processes it with advanced algorithms. It works by spotting potential hazards and adjusting landing parameters on the fly to ensure a safe touchdown.

SPLICE will get a real-world test of three of its four primary subsystems during a New Shepard mission to be flown relatively soon. The Jeff Bezos-founded company typically returns its first-stage booster to Earth after its trip to the very edge of space, but on this test, NASA’s automated landing technology will operate on board the vehicle the same way it would when approaching the surface of the moon or Mars. The elements tested will include “terrain relative navigation,” Doppler radar and SPLICE’s descent and landing computer, while a fourth major system, lidar-based hazard detection, will be tested on future planned flights.

NASA already uses automated landing for its robotic exploration craft on the surface of other planets, including the Perseverance rover headed to Mars. But a lot of work goes into selecting a landing zone with a large area of unobstructed ground, free of potential hazards, to ensure a safe touchdown. Existing systems can make some adjustments, but they are relatively limited in that regard.
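The article does not describe SPLICE's actual algorithms, but the basic "spot hazards, adjust the target" idea can be sketched with a hypothetical grid of hazard scores: pick the safe cell closest to the intended landing point. Everything below, including the threshold and the random hazard map, is invented for illustration.

```python
# Hypothetical retargeting sketch: not SPLICE's algorithm, just the general idea.
import numpy as np

def retarget(hazard_map: np.ndarray, target: tuple, threshold: float = 0.2):
    """Return the safe cell closest to `target`, or None if no cell qualifies."""
    safe_cells = np.argwhere(hazard_map < threshold)   # cells below the hazard threshold
    if safe_cells.size == 0:
        return None
    distances = np.linalg.norm(safe_cells - np.array(target), axis=1)
    return tuple(safe_cells[int(np.argmin(distances))])

rng = np.random.default_rng(0)
hazards = rng.random((20, 20))          # stand-in for sensor-derived hazard scores
print(retarget(hazards, target=(10, 10)))
```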

Sep 17, 2020

The Army wants to build a brand new exoskeleton to help soldiers ruck faster and harder

Posted in categories: cyborgs, robotics/AI

After years of experimentation and testing, the Army is formally moving ahead with the development and fielding of a powered exoskeleton to help soldiers move faster and carry more while reducing overall fatigue.

Officials with Army Futures Command are currently in the process of drafting formal requirements for an infantry exoskeleton ahead of a defense industry day sometime in November, said Ted Maciuba, deputy director of the robotic requirements division for Army Futures Command.

Breaking Defense first reported news of the fresh exoskeleton effort.

Sep 16, 2020

This artificial spiderweb mimics the elasticity

Posted in category: robotics/AI

This artificial spiderweb mimics the elasticity, adhesion, and tensile strength of spiderweb silk and, with the capacity to self-clean and sense objects, can even replicate some spiderweb features that rely on the behavior of spiders themselves.

Read more about the research in Science Robotics:
🕸https://fcld.ly/wsnulle
🕸https://fcld.ly/rvgs2ub

Sep 16, 2020

Walmart Now Piloting On-Demand Drone Delivery with Flytrex

Posted in categories: business, drones, robotics/AI

Sept. 9, 2020 By Tom Ward, Senior Vice President, Customer Product, Walmart.

Years ago, our founder Sam Walton famously said, “I have always been driven to buck the system, to innovate, to take things beyond where they’ve been.” It remains a guiding principle at Walmart to this day. From being an early pioneer of universal bar codes and electronic scanning cash registers to our work on autonomous vehicle delivery, we’re working to understand how these technologies can impact the future of our business and help us better serve our customers.

Our latest initiative has us exploring how drones can deliver items in a way that’s convenient, safe, and – you guessed it – fast. Today, we’re taking the next step in our exploration of on-demand delivery by announcing a new pilot with Flytrex, an end-to-end drone delivery company.

Sep 16, 2020

New bionics let us run, climb and dance | Hugh Herr

Posted in categories: biotech/medical, business, cyborgs, robotics/AI, transhumanism

Visit http://TED.com to get our entire library of TED Talks, transcripts, translations, personalized talk recommendations and more.

Hugh Herr is building the next generation of bionic limbs, robotic prosthetics inspired by nature’s own designs. Herr lost both legs in a climbing accident 30 years ago; now, as the head of the MIT Media Lab’s Biomechatronics group, he shows his incredible technology in a talk that’s both technical and deeply personal, with the help of ballroom dancer Adrianne Haslet-Davis, who lost her left leg in the 2013 Boston Marathon bombing and performs on the TED stage for the first time since then.
