

Scientists develop artificial intelligence system for high precision recognition of hand gestures
Scientists from Nanyang Technological University, Singapore (NTU Singapore) have developed an Artificial Intelligence (AI) system that recognizes hand gestures by combining skin-like electronics with computer vision.
The recognition of human hand gestures by AI systems has been a valuable development over the last decade and has been adopted in high-precision surgical robots, health monitoring equipment and in gaming systems.
AI gesture-recognition systems that were initially visual-only have been improved by integrating inputs from wearable sensors, an approach known as 'data fusion'. The wearable sensors recreate the skin's sensing ability, known as 'somatosensation'.
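The data-fusion idea can be sketched as combining a visual feature vector with a somatosensory one before classification. The code below is a minimal illustration of late fusion by feature concatenation; the feature extractors, shapes, and linear classifier are arbitrary stand-ins, not the NTU team's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def extract_visual_features(frame):
    # Stand-in for a vision model: reduce an image to a fixed-length vector
    # by averaging over spatial dimensions.
    return frame.mean(axis=(0, 1))                      # (channels,)

def extract_sensor_features(signals):
    # Stand-in for processing wearable stretch-sensor readings:
    # per-sensor mean and standard deviation over time.
    return np.stack([signals.mean(axis=1), signals.std(axis=1)]).ravel()

def fuse(visual, sensor):
    # 'Data fusion' here is simple feature concatenation (late fusion).
    return np.concatenate([visual, sensor])

frame = rng.random((64, 64, 3))                         # toy camera frame
signals = rng.random((5, 100))                          # 5 skin sensors x 100 samples

features = fuse(extract_visual_features(frame), extract_sensor_features(signals))
W = rng.random((10, features.size))                     # toy 10-gesture linear classifier
gesture = int(np.argmax(W @ features))
print(features.shape, gesture)
```

The fused vector has 3 visual plus 10 sensor features; richer fusion schemes (e.g. learned attention over modalities) follow the same pattern of merging the two streams before the final decision.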
NASA Robot Seamlessly Exits a Car In Mesmerizing Video
Watching NASA’s RoboSimian robot, a.k.a. “Clyde,” exit a vehicle by seamlessly becoming a creepy quadruped never gets old.
AI can predict quakes ‘unprecedentedly early’
Researchers have revealed a radical new use of AI — to predict earthquakes.
A team from Tokyo Metropolitan University has used machine-learning techniques to analyze tiny changes in geomagnetic fields.
These changes allow the system to predict natural disasters far earlier than current methods.
Topic: Application of AI to medical imaging. Guest: Dan Elton, PhD, Staff Scientist at the National Institutes of Health, USA.
New device delivers single cells in just one click
EPFL spin-off SEED Biosciences has developed a pipetting robot that can dispense individual cells one by one. Their innovation allows for enhanced reliability and traceability, and can save life-science researchers time and money.
The engineers at SEED Biosciences, an EPFL spin-off, have come up with a unique pipetting robot that can isolate single cells with the push of a button—without damaging the cells. Their device also records the cells’ electrical signature so that they can be traced. While this innovation may seem trivial, it can save researchers several weeks of precious time and speed up development work in pharmaceuticals, cancer treatments and personalized medicine. The company began marketing its device this year.

Deepfakes declared top AI threat, biometrics and content attribution scheme proposed to detect them
Biometrics may be the best way to protect society against the threat of deepfakes, and new detection solutions are being proposed by the Content Authenticity Initiative and the AI Foundation.
Deepfakes are the most serious criminal threat posed by artificial intelligence, according to a new report funded by the Dawes Centre for Future Crime at University College London (UCL), which identifies the top 20 ways AI could facilitate crime over the next 15 years.
The study, published in the journal Crime Science, ranks the 20 AI-enabled crimes by the harm they could cause.

Scientists find vision relates to movement
To get a better look at the world around them, animals are constantly in motion. Primates and people use complex eye movements to focus their vision (as humans do when reading, for instance); birds, insects, and rodents do the same by moving their heads, and can even estimate distances that way. Yet how these movements play out in the elaborate circuitry of neurons that the brain uses to "see" is largely unknown. That gap could become a problem as scientists build artificial neural networks that mimic biological vision for applications such as self-driving cars.
To better understand the relationship between movement and vision, a team of Harvard researchers looked at what happens in one of the brain's primary regions for analyzing imagery when animals are free to roam naturally. The results of the study, published Tuesday in the journal Neuron, suggest that image-processing circuits in the primary visual cortex are not only more active when animals move, but also receive signals from a movement-controlling region of the brain that is independent of the region that processes what the animal is looking at. In fact, the researchers describe two sets of movement-related patterns in the visual cortex that depend on head motion and on whether an animal is in the light or the dark.
The movement-related findings were unexpected, since vision tends to be thought of as a feed-forward computation system in which visual information enters through the retina and travels along neural circuits that operate on a one-way path, processing the information piece by piece. What the researchers saw here is further evidence that the visual system has far more feedback components, in which information can travel in the opposite direction, than previously thought.
