
Archive for the ‘robotics/AI’ category: Page 1716

Apr 17, 2020

Using nano-scale spintronics, researchers aim to build novel artificial brain

Posted in categories: nanotechnology, particle physics, robotics/AI

The Department of Engineering at Aarhus University is coordinating a FET-Open-backed project to build an entirely new AI hardware technology using nano-scale spintronics that can radically change the way computers work. The project will develop a neuromorphic computing system with synaptic neurons implemented in spintronics: novel AI hardware that can provide a framework for AI software in a physical system built like a human brain, boosting computer performance by up to 100,000 times.
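
The blurb above stays at the project level, but the basic computational unit a neuromorphic system emulates is a spiking neuron driven by synaptic input. As a purely generic illustration (the model choice, parameter values, and software formulation below are assumptions for this sketch and say nothing about the project’s actual spintronic devices), here is a minimal leaky integrate-and-fire neuron:

```python
# Minimal leaky integrate-and-fire (LIF) neuron: a generic sketch of the kind of
# "synaptic neuron" a neuromorphic system emulates. All constants are
# illustrative assumptions, not values from the Aarhus project.
import numpy as np

def simulate_lif(input_current, dt=1e-3, tau=20e-3, v_rest=0.0,
                 v_threshold=1.0, v_reset=0.0):
    """Integrate an input-current trace; return membrane voltages and spike times."""
    v = v_rest
    voltages, spikes = [], []
    for step, i_in in enumerate(input_current):
        # Leaky integration: voltage decays toward rest and is driven by input.
        v += dt / tau * (v_rest - v + i_in)
        if v >= v_threshold:          # threshold crossing -> emit a spike
            spikes.append(step * dt)
            v = v_reset               # reset after firing
        voltages.append(v)
    return np.array(voltages), spikes

# A constant supra-threshold input produces regular spiking.
volts, spike_times = simulate_lif(np.full(1000, 1.5))
print(f"{len(spike_times)} spikes in 1 s of simulated time")
```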

Apr 17, 2020

Robot Deliveries Might End Up Being Common, Post-Coronavirus Pandemic

Posted in categories: biotech/medical, food, humor, robotics/AI

While the Chinese city of Wuhan was under quarantine, news surfaced of robots delivering food and, later, medical supplies. Meanwhile, in the United States, the French company NAVYA configured its autonomous passenger shuttles in Florida to transport COVID-19 tests to the Mayo Clinic from off-site test locations. As the weeks of stay-at-home orders and recommendations slip into months, the delivery robots once seen as a joke, fad, or nuisance have in some instances entered the public consciousness as important tools for combating the spread of coronavirus. The question is: will their usefulness extend post-lockdown?

Apr 17, 2020

Moscow’s Facial Recognition Tech Will Outlast the Coronavirus

Posted in categories: biotech/medical, robotics/AI, surveillance

Facial recognition and COVID-19 in Moscow, Russia.

Fyodor R.


Apr 17, 2020

Robots with insect brains

Posted in categories: genetics, habitats, neuroscience, robotics/AI

It is an engineer’s dream to build a robot as competent as an insect at locomotion, directed action, navigation, and survival in complex conditions. But as well as studying insects to improve robotics, in parallel, robot implementations have played a useful role in evaluating mechanistic explanations of insect behavior, testing hypotheses by embedding them in real-world machines. The wealth and depth of data coming from insect neuroscience hold the tantalizing possibility of building complete insect brain models. Robotics has a role to play in maintaining a focus on functional understanding—what do the neural circuits need to compute to support successful behavior?

Insect brains have been described as “minute structures controlling complex behaviors” (1): Compare the number of neurons in the fruit fly brain (~135,000) to that in the mouse (70 million) or human (86 billion). Insect brain structures and circuits evolved independently to solve many of the same problems faced by vertebrate brains (or a robot’s control program). Despite the vast range of insect body types, behaviors, habitats, and lifestyles, there are many surprising consistencies across species in brain organization, suggesting that these might be effective, efficient, and general-purpose solutions.

Unraveling these circuits combines many disciplines, including painstaking neuroanatomical and neurophysiological analysis of the components and their connectivity. An important recent advance is the development of neurogenetic methods that provide precise control over the activity of individual neurons in freely behaving animals. However, the ultimate test of mechanistic understanding is the ability to build a machine that replicates the function. Computer models let researchers copy the brain’s processes, and robots allow these models to be tested in real bodies interacting with real environments (2). The following examples illustrate how this approach is being used to explore increasingly sophisticated control problems, including predictive tracking, body coordination, navigation, and learning.
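
As a concrete example of what “the neural circuits need to compute,” consider path integration: the running home-vector estimate that desert ants and bees maintain while foraging, and one of the navigation behaviors that has been tested by embedding models in robots. The sketch below is a generic textbook formulation of that computation, not code from the article or from any particular insect brain model; the function name and step format are assumptions for illustration.

```python
# Illustrative sketch of path integration: accumulating (heading, distance)
# steps into a vector that points back to the start. A generic formulation,
# not the circuit-level models discussed in the article.
import math

def integrate_path(steps):
    """steps: iterable of (heading_radians, distance) pairs, e.g. from a compass
    sense plus odometry. Returns (distance, heading) of the home vector."""
    x = y = 0.0
    for heading, distance in steps:
        x += distance * math.cos(heading)
        y += distance * math.sin(heading)
    home_distance = math.hypot(x, y)
    home_heading = math.atan2(-y, -x)   # direction pointing back to the origin
    return home_distance, home_heading

# Outbound run: 10 m east, then 10 m north; home lies ~14.1 m to the south-west.
dist, heading = integrate_path([(0.0, 10.0), (math.pi / 2, 10.0)])
print(f"home vector: {dist:.1f} m at {math.degrees(heading):.0f} degrees")
```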

Apr 16, 2020

Kebotix raises $11.5 million to automate lab experiments with AI and robotics

Posted in categories: finance, robotics/AI

Kebotix, which is developing a lab automation platform that leverages AI and robotics, has raised $11.5 million in venture capital.

Apr 16, 2020

AT&T 4G LTE connects IoT robots to kill germs, keep shelves stocked

Posted in categories: biotech/medical, internet, robotics/AI

In new partnerships with Xenex and Brain Corp., AT&T is connecting IoT robots that aim to help hospitals and retail establishments such as grocery stores keep facilities clean, kill germs, and keep shelves stocked more efficiently.

Chris Penrose, SVP of Advanced Solutions at AT&T, told FierceWireless that the robots are riding on the carrier’s 4G LTE network, rather than narrowband IoT (NB-IoT) or LTE-M networks. That’s because of the large amounts of data they need to push, along with latency and speed requirements for these particular use cases.


Apr 16, 2020

Edge AI Is The Future, Intel And Udacity Are Teaming Up To Train Developers

Posted in categories: futurism, robotics/AI

On April 16, 2020, Intel and Udacity jointly announced their new Intel® Edge AI for IoT Developers Nanodegree program to train the developer community in deep learning and computer vision. If you are wondering where AI is headed, now you know: it’s headed to the edge. Edge computing is the concept of storing and processing data directly at the location where it is needed. The global edge computing market is forecast to reach $1.12 trillion by 2023.

There’s a real need for developers worldwide in this new market. Intel and Udacity aim to train 1 million developers.

AI Needs To Be On the Edge.

Apr 16, 2020

Robot painters

Posted in category: robotics/AI

These industrial robots are impeccable portrait painters.

Apr 16, 2020

Facebook is using bots to simulate what its users might do

Posted in category: robotics/AI

Facebook has developed a new method to play out the consequences of its code.

The context: Like any software company, the tech giant needs to test its product any time it pushes updates. But the sorts of debugging methods that normal-size companies use aren’t really enough when you’ve got 2.5 billion users. Such methods usually focus on checking how a single user might experience the platform and whether the software responds to those individual users’ actions as expected. In contrast, as many as 25% of Facebook’s major issues emerge only when users begin interacting with one another. It can be difficult to see how the introduction of a feature or updates to a privacy setting might play out across billions of user interactions.

SimCity: In response, Facebook built a scaled-down version of its platform to simulate user behavior. Called WW, it helps engineers identify and fix the undesired consequences of new updates before they’re deployed. It also automatically recommends changes that can be made to the platform to improve the community experience.
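
To make the idea concrete, here is a toy agent-based simulation in the same spirit: scripted bot “users” interact on a stripped-down platform, and an engineer compares an aggregate metric under two versions of a rule before shipping it. Every detail below (the spam-bot behavior, the blocking threshold, the metric) is a made-up illustration, not Facebook’s actual WW system.

```python
# Toy agent-based simulation: bots interact on a scaled-down "platform" so that
# the aggregate effect of a rule change can be observed before deployment.
# Entirely illustrative; not Facebook's WW.
import random

class Bot:
    def __init__(self, name, spammy=False):
        self.name = name
        self.spammy = spammy
        self.blocked = False

    def act(self, platform):
        if self.blocked:
            return
        # Spammy bots message everyone; normal bots message one random peer.
        targets = platform.bots if self.spammy else [random.choice(platform.bots)]
        for target in targets:
            if self.blocked:
                break
            if target is not self:
                platform.deliver(self, target)

class Platform:
    def __init__(self, bots, block_threshold):
        self.bots = bots
        self.block_threshold = block_threshold  # the "update" under test
        self.sent = {bot.name: 0 for bot in bots}

    def deliver(self, sender, receiver):
        self.sent[sender.name] += 1
        # Policy being evaluated: block senders who exceed the threshold.
        if self.sent[sender.name] > self.block_threshold:
            sender.blocked = True

    def run(self, rounds):
        for _ in range(rounds):
            for bot in self.bots:
                bot.act(self)
        # Metric: how many messages the spammy bots managed to deliver.
        return sum(n for name, n in self.sent.items() if name.startswith("spam"))

def make_bots():
    return ([Bot(f"user{i}") for i in range(20)]
            + [Bot(f"spam{i}", spammy=True) for i in range(2)])

random.seed(0)
for threshold in (5, 50):
    world = Platform(make_bots(), block_threshold=threshold)
    print(f"block threshold {threshold}: spam delivered = {world.run(rounds=10)}")
```

The point of such a sketch is the one the article makes: effects that never show up when testing a single account (here, how much spam slips through before a sender is blocked) only become visible once many agents interact.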

Apr 16, 2020

Google’s AI enables robots to make decisions on the fly

Posted in category: robotics/AI

In a new study, Google researchers describe a system that makes decisions continuously in response to changes in the environment.