Archive for the ‘robotics/AI’ category: Page 1214
Mar 13, 2022
How GitHub Uses Machine Learning to Extend Vulnerability Code Scanning
Posted by Kelvin Dafiaghor in categories: robotics/AI, security
By applying machine learning techniques to its rule-based security code scanning capabilities, GitHub hopes to extend them to less common vulnerability patterns, automatically inferring new rules from the existing ones.
GitHub Code Scanning uses carefully defined CodeQL analysis rules to identify potential security vulnerabilities lurking in source code.
Mar 13, 2022
Jeff Dean Co-authors Guidelines for Resolving Instability and Quality Issues in the Design of Effective Sparse Expert Models
Posted by Kelvin Dafiaghor in category: robotics/AI
In the ongoing effort to scale AI systems without incurring prohibitively high training and compute costs, sparse mixture-of-experts (MoE) models have shown their potential for achieving impressive neural network pretraining speedups by dynamically selecting only the parameters relevant to each input. This enables such networks to vastly expand their parameter counts while keeping their FLOPs per token (compute) roughly constant. Advancing MoE models to state-of-the-art performance has, however, been hindered by training instabilities and uncertain quality during fine-tuning.
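The core routing idea — each token activates only a few experts, so capacity grows with the expert count while per-token compute stays roughly flat — can be illustrated with a minimal sketch. This is not the architecture from the Google paper; all names and shapes here are our own illustrative choices.

```python
# Minimal top-k expert routing sketch for a mixture-of-experts layer.
# Illustrative only; not the model described in the paper.
import numpy as np

rng = np.random.default_rng(0)

d_model, n_experts, k = 8, 4, 2                       # each token uses k of n_experts
router = rng.normal(size=(d_model, n_experts))        # learned gating weights
experts = [rng.normal(size=(d_model, d_model)) for _ in range(n_experts)]

def moe_layer(tokens):
    """Route each token to its top-k experts; only those experts run."""
    logits = tokens @ router                          # (n_tokens, n_experts)
    out = np.zeros_like(tokens)
    for i, tok in enumerate(tokens):
        top = np.argsort(logits[i])[-k:]              # indices of the top-k experts
        gates = np.exp(logits[i][top])
        gates /= gates.sum()                          # softmax over chosen experts only
        for g, e in zip(gates, top):
            out[i] += g * (tok @ experts[e])          # only k matmuls per token
    return out

tokens = rng.normal(size=(3, d_model))
y = moe_layer(tokens)
print(y.shape)  # (3, 8)
```

Doubling `n_experts` doubles the parameter count, but each token still touches only `k` experts — which is why FLOPs per token stay roughly constant as the model grows.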
To address these issues, a research team from Google AI and Google Brain has published a set of guidelines for designing more practical and reliable sparse expert models. The team tested their recommendations by pretraining a 269B sparse model, which it says is the first to achieve state-of-the-art results on natural language processing (NLP) benchmarks.
The team summarizes their main contributions as:
Mar 13, 2022
The Amazing Artificial Intelligences of 2030 — AI Predictions
Posted by Dan Breeden in categories: futurism, robotics/AI
Artificial intelligence has made tremendous progress these past few years, but what do the biggest AI researchers expect it to look like in the year 2030? They've made some striking futurism predictions about the human-level abilities and efficiencies these AIs will likely have. Human-level AI will likely be a reality, and other technology predictions, such as the metaverse, will likely turn out to be true as well.
–
TIMESTAMPS:
00:00 The Future of Artificial Intelligence
00:53 Artificial Intelligence Predictions for 2030
02:15 The Metaverse in 2030
04:00 Other Technology Predictions for 2030
06:41 Last Words
–
#future #ai #2030
Mar 13, 2022
‘Ultra-intelligence’ computer planned for 2024
Posted by Future Timeline in category: robotics/AI
UK-based AI chipmaker Graphcore has announced a project called The Good Computer. This will be capable of handling neural network models with 500 trillion parameters – large enough to enable what the company calls ‘ultra-intelligence’.
Mar 13, 2022
A Soft Thumb-Sized Vision-Based Touch Sensor
Posted by Shubham Ghosh Roy in category: robotics/AI
A team from the Max Planck Institute for Intelligent Systems in Germany has developed a novel thumb-shaped touch sensor capable of resolving both the force of a contact and its direction over the whole surface of the structure. Intended for dexterous manipulation systems, it is constructed from easily sourced components, so it should scale up to larger assemblies without breaking the bank. The first step is to place a soft, compliant outer skin over a rigid metallic skeleton, which is then illuminated internally using structured light techniques. From there, machine learning can estimate the shear and normal force components of a contact with the skin, over the entire surface, by observing how the internal envelope distorts the structured illumination.
The novelty here is the way they combine photometric stereo processing with other structured light techniques, using only a single camera. The camera image is fed straight into a pre-trained machine learning system (details on this part of the system are unfortunately a bit scarce) which directly outputs an estimate of the contact shape and force distribution, with spatial accuracy reportedly better than 1 mm and force resolution down to 30 millinewtons. By directly estimating normal and shear force components, the direction of the contact can be resolved to within 5 degrees. The system is so sensitive that it can reportedly detect its own posture by observing the deformation of the skin under its own weight alone!
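Once the learned model has produced normal and shear force estimates, recovering the contact direction is simple geometry. A minimal sketch of that last step, with entirely illustrative function names and values (the team's actual pipeline is not published in detail):

```python
# Sketch: recover contact direction from estimated force components.
# The force estimates themselves come from the (unpublished) learned model;
# values below are illustrative, not measurements from the paper.
import numpy as np

def contact_direction(f_normal, f_shear_x, f_shear_y):
    """Angle of the contact force away from the surface normal, in degrees."""
    shear = np.hypot(f_shear_x, f_shear_y)            # magnitude of in-plane shear
    return np.degrees(np.arctan2(shear, f_normal))

# e.g. 30 mN normal force (the reported resolution) plus a 5 mN shear component
angle = contact_direction(0.030, 0.005, 0.0)
print(round(angle, 1))  # 9.5
```

Because the shear and normal components are estimated independently at each point on the skin, a per-point angle like this is available over the whole surface.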
Continue reading “A Soft Thumb-Sized Vision-Based Touch Sensor” »
Mar 13, 2022
US Regulators No Longer Require Fully Autonomous Vehicles to Have Human Controls
Posted by Kelvin Dafiaghor in categories: robotics/AI, transportation
US regulators have devised a set of new rules specifically targeting automated driving system (ADS) vehicles that do not feature human controls.
Mar 13, 2022
The next generation of robots will be shape-shifters
Posted by Dan Breeden in categories: nanotechnology, physics, robotics/AI
Physicists have discovered a new way to coat soft robots in materials that allow them to move and function in a more purposeful way. The research, led by the UK’s University of Bath, is described today in Science Advances.
Authors of the study believe their breakthrough modeling on ‘active matter’ could mark a turning point in the design of robots. With further development of the concept, it may be possible to determine the shape, movement and behavior of a soft solid not by its natural elasticity but by human-controlled activity on its surface.
The surface of an ordinary soft material always shrinks into a sphere. Think of the way water beads into droplets: the beading occurs because the surfaces of liquids and other soft materials naturally contract into the smallest surface area possible—i.e. a sphere. But active matter can be designed to work against this tendency. An example of this in action would be a rubber ball wrapped in a layer of nano-robots, where the robots are programmed to work in unison to distort the ball into a new, pre-determined shape (say, a star).
Mar 13, 2022
Training robots with realistic pain expressions can reduce doctors’ risk of causing pain during physical exams
Posted by Dan Breeden in categories: biotech/medical, robotics/AI
A new approach to producing realistic expressions of pain on robotic patients could help to reduce error and bias during physical examination.
A team led by researchers at Imperial College London has developed a way to generate more accurate expressions of pain on the face of medical training robots during physical examination of painful areas.
Findings, published today in Scientific Reports, suggest this could help teach trainee doctors to use clues hidden in patient facial expressions to minimize the force necessary for physical examinations.
Mar 13, 2022
AI Overcomes Stumbling Block on Brain-Inspired Hardware
Posted by Dan Breeden in categories: information science, robotics/AI
Algorithms that use the brain’s communication signal can now work on analog neuromorphic chips, which closely mimic our energy-efficient brains.