Archive for the ‘robotics/AI’ category: Page 1523

Mar 25, 2021

Space Robotics with International Business Law & Space CEO Malak Trabelsi

Posted by in categories: business, law, robotics/AI, space

We invite you to join us for our talk with International Business Law & Space CEO Malak Trabelsi and Everette Philips.

Mar 25, 2021

Tiny robots can now smuggle drugs into brain tumors

Posted by in categories: biotech/medical, robotics/AI

Interesting.

Researchers have discovered a way to camouflage microrobots in the body using white blood cells to pass through the blood-brain barrier.

Mar 24, 2021

Tiny swimming robots reach their target faster thanks to AI nudges

Posted by in categories: information science, particle physics, robotics/AI

Swimming robots the size of bacteria can be knocked off course by particles in the fluid they are moving through, but an AI algorithm learns from feedback to get them to their target quickly.
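The article does not detail the algorithm, but "learns from feedback" suggests a reinforcement-learning controller. As a rough, hypothetical illustration, the sketch below uses tabular Q-learning to steer a one-dimensional "swimmer" toward a target while random noise occasionally flips its intended move, standing in for the particles that knock real microswimmers off course. All names and parameters here are invented for the example.

```python
import random

random.seed(0)

# Hypothetical 1-D "microswimmer": positions 0..10, target at 10.
# Each step the chosen thrust (-1 or +1) is flipped by noise 20% of
# the time, mimicking perturbations from particles in the fluid.
N, TARGET, NOISE = 10, 10, 0.2
ACTIONS = (-1, 1)

Q = {(s, a): 0.0 for s in range(N + 1) for a in ACTIONS}

def step(s, a):
    move = -a if random.random() < NOISE else a   # noisy actuation
    s2 = min(max(s + move, 0), N)
    return s2, float(s2 - s)                      # reward = progress made

for episode in range(2000):
    s = 0
    for _ in range(200):
        # epsilon-greedy: mostly exploit, occasionally explore
        a = random.choice(ACTIONS) if random.random() < 0.1 else \
            max(ACTIONS, key=lambda x: Q[(s, x)])
        s2, r = step(s, a)
        best = max(Q[(s2, x)] for x in ACTIONS)
        Q[(s, a)] += 0.1 * (r + 0.9 * best - Q[(s, a)])
        s = s2
        if s == TARGET:
            break

# After training, the greedy policy should push toward the target.
policy = {s: max(ACTIONS, key=lambda a: Q[(s, a)]) for s in range(N)}
```

Despite the noisy actuation, the learned policy compensates by consistently choosing the thrust that moves it toward the target, which is the essence of the feedback-driven approach the article describes.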

Mar 24, 2021

AI-controlled vertical farm produces 400 times more food per acre than a flat farm

Posted by in categories: food, life extension, robotics/AI, sustainability

Dedicated to those who argue that life extension is bad because it will create overpopulation problems. In addition to the fact that birth rates are falling dangerously in some developed countries, this is just one example of the changes likely to take place well before life extension could create such a problem, if it ever does.


Plenty, an ag-tech startup in San Francisco co-founded by Nate Storey, has been able to increase its productivity and production quality by using artificial intelligence and its new farming strategy. The company’s farms take up only 2 acres yet produce 720 acres’ worth of fruit and vegetables. Beyond its impressive output, the company also manages production with robots and artificial intelligence.

Continue reading “AI-controlled vertical farm produces 400 times more food per acre than a flat farm” »

Mar 24, 2021

More Than Words: Using AI to Map How the Brain Understands Sentences

Posted by in categories: biotech/medical, robotics/AI

Summary: Combining neuroimaging data with artificial intelligence technology, researchers have identified a complex network within the brain that comprehends the meaning of spoken sentences.

Source: University of Rochester Medical Center.

Have you ever wondered why you are able to hear a sentence and understand its meaning – given that the same words in a different order would have an entirely different meaning?

Mar 23, 2021

Expressing some doubts: Comparative analysis of human and android faces could lead to improvements

Posted by in category: robotics/AI

Researchers from the Graduate School of Engineering and Symbiotic Intelligent Systems Research Center at Osaka University used motion capture cameras to compare the expressions of android and human faces. They found that the mechanical facial movements of the robots, especially in the upper regions, did not fully reproduce the curved flow lines seen in the faces of actual people. This research may lead to more lifelike and expressive artificial faces.

The field of robotics has advanced a great deal in recent decades. However, while current androids can appear very humanlike at first, their active facial expressions are still unnatural and unsettling to people. The exact reasons for this effect have been difficult to pinpoint. Now, a research team at Osaka University has used motion capture technology to monitor the facial expressions of five android faces and compared the results with actual human facial expressions. This was accomplished with six infrared cameras that monitored reflection markers at 120 frames per second and allowed the motions to be represented as three-dimensional displacement vectors.
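The measurement described above, turning marker positions sampled at 120 frames per second into three-dimensional displacement vectors, is straightforward to sketch. The snippet below is a minimal illustration (the marker track is invented data, not from the study):

```python
# Marker positions (mm) sampled at 120 frames per second,
# as in the Osaka University motion-capture setup.
FPS = 120

def displacement_vectors(track):
    """Per-frame 3-D displacement of one reflective marker."""
    return [tuple(b[i] - a[i] for i in range(3))
            for a, b in zip(track, track[1:])]

def speed(track):
    """Instantaneous speed (mm/s): displacement magnitude times frame rate."""
    return [(dx*dx + dy*dy + dz*dz) ** 0.5 * FPS
            for dx, dy, dz in displacement_vectors(track)]

# A toy marker moving 1 mm per frame along the x axis:
track = [(i * 1.0, 0.0, 0.0) for i in range(5)]
```

Comparing such displacement fields across an android face and a human face is what lets the researchers quantify where the mechanical movements fall short of the curved flow lines of real skin.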

“Advanced artificial systems can be difficult to design because the numerous components interact with each other. The surface of an android face can undergo deformations that are hard to control,” study first author Hisashi Ishihara says. These deformations can be due to interactions between components such as the soft skin sheet and the skull-shaped structure, as well as the mechanical actuators.

Mar 22, 2021

Mars 360: 1.2 billion pixel panorama of Mars — Sol 3060 (360video 8K)

Posted by in categories: climatology, media & arts, robotics/AI, space

1.2 billion pixel panorama of Mars by the Curiosity rover at Sol 3060 (March 15, 2021)

🎬 360VR video 8K: 🔎 360VR photo 85K: http://bit.ly/sol3060

Continue reading “Mars 360: 1.2 billion pixel panorama of Mars — Sol 3060 (360video 8K)” »

Mar 22, 2021

Artificial neurons are smaller and more energy efficient

Posted by in category: robotics/AI

Researchers have developed a new device able to run neural network computations using 100 times less energy and area than existing CMOS-based hardware.

Mar 22, 2021

A Thousand Brains introduces a novel theory of intelligence

Posted by in categories: futurism, robotics/AI

A Thousand Brains provides a new theory of intelligence, how it can lead to the creation of truly intelligent AI, and implications for the future of humanity. “Brilliant…exhilarating” — from the foreword by Richard Dawkins.

Mar 22, 2021

Researchers’ algorithm designs soft robots that sense

Posted by in categories: information science, robotics/AI

There are some tasks that traditional robots — the rigid and metallic kind — simply aren’t cut out for. Soft-bodied robots, on the other hand, may be able to interact with people more safely or slip into tight spaces with ease. But for robots to reliably complete their programmed duties, they need to know the whereabouts of all their body parts. That’s a tall task for a soft robot that can deform in a virtually infinite number of ways.

MIT researchers have developed an algorithm to help engineers design soft robots that collect more useful information about their surroundings. The deep-learning algorithm suggests an optimized placement of sensors within the robot’s body, allowing it to better interact with its environment and complete assigned tasks. The advance is a step toward the automation of robot design. “The system not only learns a given task, but also how to best design the robot to solve that task,” says Alexander Amini. “Sensor placement is a very difficult problem to solve. So, having this solution is extremely exciting.”
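The MIT work uses a deep-learning model to score placements; as a loose stand-in for that learned objective, the toy sketch below illustrates the underlying combinatorial problem: greedily picking sensor sites so that different body deformations produce distinguishable readings. The states, sites, and scoring function are all invented for this example and are not the paper's method.

```python
# Toy "deformation states": each tuple is the reading a sensor would
# report at 4 candidate sites on the robot body for that deformation.
STATES = [
    (0, 0, 1, 0),
    (1, 0, 1, 0),
    (1, 1, 0, 0),
    (0, 1, 0, 1),
]

def score(sites):
    """How many deformation states the chosen sites can tell apart."""
    signatures = {tuple(state[i] for i in sites) for state in STATES}
    return len(signatures)

def greedy_placement(n_sensors, n_sites=4):
    """Greedily add the site that most improves discrimination."""
    chosen = []
    for _ in range(n_sensors):
        best = max((i for i in range(n_sites) if i not in chosen),
                   key=lambda i: score(chosen + [i]))
        chosen.append(best)
    return sorted(chosen)
```

With two sensors this toy objective already distinguishes all four states; the real system replaces the hand-written `score` with a learned model that also accounts for the task the robot must perform.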

The research will be presented during April’s IEEE International Conference on Soft Robotics and will be published in the journal IEEE Robotics and Automation Letters. Co-lead authors are Amini and Andrew Spielberg, both PhD students in the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL). Other co-authors include MIT PhD student Lillian Chin, and professors Wojciech Matusik and Daniela Rus.