Watching NASA’s RoboSimian robot, a.k.a. “Clyde,” exit a vehicle by seamlessly becoming a creepy quadruped never gets old.
Researchers have revealed a radical new use of AI — to predict earthquakes.
A team from Tokyo Metropolitan University has used machine-learning techniques to analyze tiny changes in geomagnetic fields.
These signals allow the system to predict natural disasters far earlier than current methods.
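For readers curious what such a pipeline might look like, here is a minimal, hypothetical sketch: it trains an off-the-shelf classifier on simple summary features of windowed geomagnetic readings. The synthetic data, feature set, and model choice are all illustrative assumptions; the article does not describe the team's actual method.

```python
# Hypothetical sketch: classifying windows of geomagnetic readings as
# "background" vs. "precursor". Features, labels, and model choice are
# illustrative assumptions, not the Tokyo Metropolitan University method.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)

def features(windows):
    """Summarize each fixed-length window of a geomagnetic trace."""
    return np.column_stack([
        windows.mean(axis=1),                           # local field offset
        windows.std(axis=1),                            # short-term variability
        np.abs(np.diff(windows, axis=1)).mean(axis=1),  # roughness of the trace
    ])

# Synthetic stand-in data: pure noise vs. noise with a slight drift,
# pretending the drift marks a seismic precursor.
quiet = rng.normal(0.0, 1.0, size=(500, 256))
active = rng.normal(0.0, 1.0, size=(500, 256)) + np.linspace(0.0, 0.5, 256)

X = np.vstack([features(quiet), features(active)])
y = np.repeat([0, 1], 500)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))
```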
Topic: Application of AI to medical imaging. Guest: Dan Elton, PhD, Staff Scientist at the National Institutes of Health, USA.
EPFL spin-off SEED Biosciences has developed a pipetting robot that can dispense individual cells one by one. Their innovation allows for enhanced reliability and traceability, and can save life-science researchers time and money.
The engineers at SEED Biosciences, an EPFL spin-off, have come up with a unique pipetting robot that can isolate single cells with the push of a button—without damaging the cells. Their device also records the cells’ electrical signature so that they can be traced. While this innovation may seem trivial, it can save researchers several weeks of precious time and speed up development work in pharmaceuticals, cancer treatments and personalized medicine. The company began marketing its device this year.
Biometrics may be the best way to protect society against the threat of deepfakes, but new solutions are being proposed by the Content Authenticity Initiative and the AI Foundation.
Deepfakes are the most serious criminal threat posed by artificial intelligence, topping a list of 20 ways AI could facilitate crime over the next 15 years, according to a new report funded by the Dawes Centre for Future Crime at University College London (UCL).
The study, published in the journal Crime Science, ranks the 20 AI-enabled crimes by the harm they could cause.
To get a better look at the world around them, animals are constantly in motion. Primates and people use complex eye movements to focus their vision (as humans do when reading, for instance); birds, insects, and rodents do the same by moving their heads, and can even estimate distances that way. Yet how these movements play out in the elaborate circuitry of neurons that the brain uses to “see” is largely unknown. That gap could become a problem as scientists build artificial neural networks that mimic vision for systems such as self-driving cars.
To better understand the relationship between movement and vision, a team of Harvard researchers looked at what happens in one of the brain’s primary regions for analyzing imagery when animals are free to roam naturally. The results of the study, published Tuesday in the journal Neuron, suggest that image-processing circuits in the primary visual cortex are not only more active when animals move, but also receive signals from a movement-controlling region of the brain that is independent of the region that processes what the animal is looking at. In fact, the researchers describe two sets of movement-related patterns in the visual cortex that are based on head motion and whether an animal is in the light or the dark.
The movement-related findings were unexpected, since vision tends to be thought of as a feed-forward computation system in which visual information enters through the retina and travels on neural circuits that operate on a one-way path, processing the information piece by piece. What the researchers saw here is more evidence that the visual system has many more feedback components, in which information travels in the opposite direction, than previously thought.
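As a loose illustration of that distinction, a strictly feed-forward pass maps input to output in one direction, while a feedback pass lets a later stage's state re-enter an earlier computation. The toy network below is purely illustrative and is in no way a model of cortical circuitry; all weights and dimensions are arbitrary.

```python
# Toy contrast between a feed-forward pass and a feedback (recurrent) pass.
# Purely illustrative; not a model of the visual cortex.
import numpy as np

rng = np.random.default_rng(1)
W_in = rng.normal(size=(8, 4))    # "retina" -> early stage
W_out = rng.normal(size=(3, 8))   # early stage -> readout
W_back = rng.normal(size=(8, 3))  # readout -> early stage (feedback path)

def feedforward(x):
    """One-way path: information flows input -> hidden -> output, once."""
    hidden = np.tanh(W_in @ x)
    return np.tanh(W_out @ hidden)

def with_feedback(x, steps=5):
    """Feedback path: the readout re-enters the earlier stage each step."""
    out = np.zeros(3)
    for _ in range(steps):
        hidden = np.tanh(W_in @ x + W_back @ out)  # earlier stage sees later state
        out = np.tanh(W_out @ hidden)
    return out

x = rng.normal(size=4)
print("feed-forward:", feedforward(x))
print("with feedback:", with_feedback(x))
```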
AI is emerging as a driving technology behind the internet of things (IoT). Learn about the new AIoT and how it will shape the future.
On the higher end, they work to keep development open so that it runs on multiple cloud infrastructures, giving companies confidence that their workloads remain portable.
That openness is also why deep learning is not yet part of a solution: the DL layers still lack the transparency needed to earn the trust that privacy demands. Instead, these systems aim to help manage information privacy for machine-learning applications.
Artificial intelligence applications are not open and can put privacy at risk. Adding good tools to address privacy for the data used by AI systems is an important early step in building trust into the AI equation.
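One widely used building block for this kind of tooling is differential privacy, which adds calibrated noise to statistics before they are released or fed into a model. The sketch below shows the idea with the Laplace mechanism; the query, bounds, and epsilon value are illustrative choices, not the specific tooling the piece refers to.

```python
# Minimal sketch of the Laplace mechanism from differential privacy,
# one common building block for privacy tooling around ML pipelines.
# The query, bounds, and epsilon here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)

def private_mean(values, lower, upper, epsilon):
    """Return a differentially private mean of values with known bounds."""
    clipped = np.clip(values, lower, upper)       # enforce the assumed bounds
    sensitivity = (upper - lower) / len(clipped)  # max effect of one record
    noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon)
    return clipped.mean() + noise

ages = rng.integers(18, 90, size=1_000)
print("true mean:", ages.mean())
print("private mean (eps=0.5):", private_mean(ages, 18, 90, epsilon=0.5))
```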
Facebook is using cutting-edge artificial intelligence to extract valuable data from images. The company showed Digital Trends some of its ongoing work.
Robert Behnken and Chris Cassidy concluded their spacewalk at 12:41 p.m. EDT, after five hours and 29 minutes. The two NASA astronauts completed a number of tasks designed to upgrade International Space Station systems.
They began by installing a protective storage unit that includes two Robotic External Leak Locator (RELL) units the Canadian Space Agency’s Dextre robot can use to detect leaks of ammonia, which is used to operate the station’s cooling system.
Behnken and Cassidy then removed two lifting fixtures at the base of station solar arrays on the near port truss, or backbone, of the station. The “H-fixtures” were used for ground processing of the solar arrays prior to their launch.