Walking the show floor at Amazon re:MARS.
Researchers at Carnegie Mellon University and the University of Minnesota have created a way for people to control a robotic arm using a non-invasive brain-computer interface (BCI). Previously, electrode arrays implanted in the brain have been necessary to give severely disabled people the ability to manipulate an external robot, because implants sitting directly on the surface of the brain can gather far more actionable signal. Avoiding dangerously invasive brain surgery to place those implants, though, is a major goal in the BCI field.
The Carnegie Mellon team turned to newly developed sensing and machine learning methods to accurately read signals originating deep within the brain, relying only on an external electroencephalography (EEG) cap to gather them. The system quickly improves both its own performance and that of the person using it, achieving markedly better results than previous non-invasive approaches. Volunteers were put through a pursuit-tracking task and a training regimen to improve their engagement while the system analyzed their brain signals.
Here’s a video showing how the system allows accurate, smooth tracking of an on-screen cursor by a robotic arm that’s mind-controlled by a human:
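The article doesn't detail the decoding math, but the core idea of mapping EEG features to continuous cursor motion can be sketched as a regularized linear decoder. Everything below is illustrative: the channel count, the synthetic "band-power" features, and the ridge penalty are assumptions, not details from the CMU study.

```python
import numpy as np

# Hypothetical sketch: learn a linear map from EEG band-power features
# to 2D cursor velocity using closed-form ridge regression.
# All data here is simulated; this is not the CMU team's method.

rng = np.random.default_rng(0)

n_channels, n_samples = 8, 500
true_W = rng.normal(size=(n_channels, 2))      # unknown feature-to-velocity map
X = rng.normal(size=(n_samples, n_channels))   # simulated EEG band-power features
Y = X @ true_W + 0.1 * rng.normal(size=(n_samples, 2))  # noisy cursor velocities

# Closed-form ridge solution: W = (X^T X + lam*I)^-1 X^T Y
lam = 1e-2
W = np.linalg.solve(X.T @ X + lam * np.eye(n_channels), X.T @ Y)

pred = X @ W
err = float(np.mean((pred - Y) ** 2))  # should be near the injected noise level
```

In a real closed-loop BCI, the decoder would be refit as the user trains, which is one way a system can "improve both its performance and that of the person using it."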
A 3D-printed prosthetic hand controlled using a new AI-based approach could significantly lower the cost of bionic limbs for amputees.
Real need: There are approximately 540,000 upper-limb amputees in the United States, but sophisticated “myoelectric” prosthetics, controlled by muscle contractions, are still very expensive. Such devices cost between $25,000 and $75,000 (not including maintenance and repair), and they can be difficult to use because it is hard for software to distinguish between different muscle flexes.
Handy invention: Researchers in Japan came up with a cheaper, smarter myoelectric device. Their five-fingered, 3D-printed hand is controlled using a neural network trained to recognize combined signals—or, as they call them, “muscle synergies.” Details of the bionic hand are published today in the journal Science Robotics.
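A "muscle synergy" is a recurring pattern of co-activation across several muscles. As a toy illustration only (the templates, channel count, and cosine-similarity matcher below are invented, not the authors' neural network), classifying a grasp can be sketched as matching an EMG reading against synergy templates:

```python
import numpy as np

# Illustrative sketch: classify a grasp by comparing a 4-channel EMG
# reading against fixed synergy templates via cosine similarity.
# Templates and labels are hypothetical.

synergies = np.array([
    [0.9, 0.1, 0.1, 0.2],  # "power grip" co-activation pattern
    [0.1, 0.8, 0.3, 0.1],  # "pinch" pattern
    [0.2, 0.2, 0.9, 0.4],  # "open hand" pattern
])
labels = ["power", "pinch", "open"]

def classify(emg):
    """Return the grasp whose synergy template best matches the reading."""
    emg = np.asarray(emg, dtype=float)
    sims = synergies @ emg / (np.linalg.norm(synergies, axis=1) * np.linalg.norm(emg))
    return labels[int(np.argmax(sims))]
```

The appeal of learning combined signals rather than individual flexes is exactly the difficulty the article names: raw muscle contractions are hard for software to tell apart, while whole-pattern matching is more robust.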
Rutgers computer scientists used artificial intelligence to control a robotic arm that provides a more efficient way to pack boxes, saving businesses time and money.
“We can achieve low-cost, automated solutions that are easily deployable. The key is to make minimal but effective hardware choices and focus on robust algorithms and software,” said the study’s senior author Kostas Bekris, an associate professor in the Department of Computer Science in the School of Arts and Sciences at Rutgers University-New Brunswick.
Bekris, together with Abdeslam Boularias and Jingjin Yu, both assistant professors of computer science, formed a team to tackle multiple aspects of the robot packing problem in an integrated way, spanning hardware, 3D perception and robust motion planning.
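The article doesn't specify the Rutgers algorithm, but the flavor of the packing problem can be shown with one classic heuristic, first-fit decreasing, simplified here to item volumes and a single box capacity. The 3D perception and motion-planning parts of the real system are not modeled.

```python
# Illustrative sketch: first-fit decreasing bin packing on item volumes.
# This is a textbook heuristic, not the Rutgers team's method.

def first_fit_decreasing(volumes, capacity):
    """Place each volume in the first box with room; open a new box if none fits."""
    boxes = []       # remaining capacity of each open box
    assignment = []  # (volume, box index) pairs
    for v in sorted(volumes, reverse=True):
        for i, remaining in enumerate(boxes):
            if v <= remaining:
                boxes[i] -= v
                assignment.append((v, i))
                break
        else:
            boxes.append(capacity - v)
            assignment.append((v, len(boxes) - 1))
    return len(boxes), assignment

n_boxes, plan = first_fit_decreasing([4, 8, 1, 4, 2, 1], capacity=10)
```

Sorting largest-first before placing is what makes the heuristic effective: big items claim space early, and small items fill the gaps, which matches the intuition behind efficient box packing.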
Cerebral organoids are artificially grown, 3D tissue cultures that resemble the human brain. Now, researchers from Japan report functional neural networks derived from these organoids in a study publishing June 27 in the journal Stem Cell Reports. Although the organoids aren’t actually “thinking,” the researchers’ new tool—which detects neural activity using organoids—could provide a method for understanding human brain function.
“Because they can mimic cerebral development, cerebral organoids can be used as a substitute for the human brain to study complex developmental and neurological disorders,” says corresponding author Jun Takahashi, a professor at Kyoto University.
However, these studies are challenging, because current cerebral organoids lack desirable supporting structures, such as blood vessels and surrounding tissues, Takahashi says. Since researchers have a limited ability to assess the organoids’ neural activities, it has also been difficult to comprehensively evaluate the function of neuronal networks.
Circa 2017
A deep learning tool can identify the small mutations that make each person's genome unique more accurately than existing methods.
I recently discovered it’s possible for someone in their 20s to feel old: just mention Microsoft’s Clippy to anyone born after the late ’90s. Weirdly, there is an entire generation of people who never experienced that dancing, wide-eyed paperclip interrupting a Word document in progress.
For readers who never knew him, Clippy was an interactive virtual assistant, an animated paperclip designed to guide users through Microsoft Word. An iconic symbol of its decade, Clippy was also famously terrible. The worldwide consensus was that Clippy was annoying and intrusive; Time magazine even named it among the 50 worst inventions of all time, squeezed between New Coke and Agent Orange (not a fun list).
Though Clippy was intended to help users navigate their software lives, it may have been 20 or so years ahead of its time.
A huge acceleration in the use of robots will affect jobs around the world, Oxford Economics says.
For the first time, astrophysicists have used artificial intelligence techniques to generate complex 3D simulations of the universe. The results are so fast, accurate and robust that even the creators aren’t sure how it all works.