
Dedicated to those who argue that life extension is bad because it will create overpopulation problems. Beyond the fact that birth rates are falling sharply in some developed countries, this is just one example of the changes that may take place well before life extension creates a problem of that kind, if it ever does.


Plenty, an ag-tech startup in San Francisco co-founded by Nate Storey, has increased its productivity and production quality by combining artificial intelligence with a new farming strategy. The company’s farm takes up only 2 acres yet produces 720 acres’ worth of fruit and vegetables. Beyond its impressive output, the farm is also managed with robots and artificial intelligence.

The company says their farm produces about 400 times more food per acre than a traditional farm. It uses robots and AI to monitor water consumption, light, and the ambient temperature of the environment where plants grow. Over time, the AI learns how to grow crops faster with better quality.
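The details of Plenty’s control stack are not public, but the general idea of sensor-driven environmental control can be sketched in a few lines of Python. Everything below (sensor names, target ranges, simulated readings) is illustrative, not the company’s actual system.

```python
# Illustrative sketch only -- Plenty's internal systems are not public.
# A minimal grow-room control loop: read (simulated) sensors, compare
# against hypothetical target ranges, and report the adjustments a
# learning scheduler might later refine.

import random
import time

# Hypothetical target ranges for a leafy-green grow tower
TARGETS = {
    "temperature_c": (20.0, 24.0),
    "light_umol": (250.0, 400.0),   # photosynthetic photon flux density
    "water_l_per_h": (1.0, 2.5),
}

def read_sensors():
    """Stand-in for real sensor drivers; returns simulated readings."""
    return {
        "temperature_c": random.uniform(18.0, 26.0),
        "light_umol": random.uniform(200.0, 450.0),
        "water_l_per_h": random.uniform(0.5, 3.0),
    }

def control_step(readings):
    """Return simple corrective actions when a reading leaves its range."""
    actions = {}
    for key, value in readings.items():
        low, high = TARGETS[key]
        if value < low:
            actions[key] = "increase"
        elif value > high:
            actions[key] = "decrease"
    return actions

if __name__ == "__main__":
    for _ in range(3):              # a few iterations for demonstration
        readings = read_sensors()
        actions = control_step(readings)
        print(readings, "->", actions or "within range")
        time.sleep(0.1)
```

In a real deployment the fixed target ranges would be what the AI adjusts over time, based on observed crop growth and quality.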

Summary: Combining neuroimaging data with artificial intelligence technology, researchers have identified a complex network within the brain that comprehends the meaning of spoken sentences.

Source: University of Rochester Medical Center.

Have you ever wondered why you are able to hear a sentence and understand its meaning – given that the same words in a different order would have an entirely different meaning?
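A toy Python example makes the point concrete: an order-insensitive representation such as a bag of words cannot tell two sentences apart when they use the same words in a different order, even though the meanings differ. The sentences below are invented for illustration and are not from the study.

```python
# Toy illustration (not the researchers' model): the same words in a
# different order carry a different meaning, which an order-insensitive
# bag-of-words representation cannot distinguish.

from collections import Counter

sentence_a = "the dog chased the cat"
sentence_b = "the cat chased the dog"

bag_a = Counter(sentence_a.split())
bag_b = Counter(sentence_b.split())

print(bag_a == bag_b)                            # True: identical word counts
print(sentence_a.split() == sentence_b.split())  # False: word order differs
```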

Researchers from the Graduate School of Engineering and Symbiotic Intelligent Systems Research Center at Osaka University used motion capture cameras to compare the expressions of android and human faces. They found that the mechanical facial movements of the robots, especially in the upper regions, did not fully reproduce the curved flow lines seen in the faces of actual people. This research may lead to more lifelike and expressive artificial faces.

The field of robotics has advanced a great deal in recent decades. However, while current androids can appear very humanlike at first, their active facial expressions are still unnatural and unsettling to people. The exact reasons for this effect have been difficult to pinpoint. Now, a research team at Osaka University has used motion capture technology to monitor the facial expressions of five android faces and compared the results with actual human facial expressions. This was accomplished with six infrared cameras that monitored reflection markers at 120 frames per second and allowed the motions to be represented as three-dimensional displacement vectors.
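As a rough illustration of that last step, the snippet below shows how marker trajectories recorded at 120 frames per second might be converted into three-dimensional displacement vectors relative to a neutral pose. The marker data are simulated and the processing is a minimal sketch, not the Osaka team’s actual pipeline.

```python
# Hedged sketch: turning optical motion-capture marker trajectories
# (e.g. from 120 fps infrared cameras tracking reflective markers) into
# 3D displacement vectors relative to a neutral-face reference frame.
# All data here are simulated for illustration.

import numpy as np

rng = np.random.default_rng(0)

# Simulated trajectories: (frames, markers, xyz) in millimetres
n_frames, n_markers = 120, 5          # one second at 120 fps, 5 markers
neutral = rng.uniform(-50, 50, size=(n_markers, 3))          # neutral pose
trajectories = neutral + rng.normal(0, 2, size=(n_frames, n_markers, 3))

# Displacement of each marker at each frame relative to the neutral pose
displacements = trajectories - neutral        # shape (120, 5, 3)

# Average motion magnitude per marker over the recording
mean_magnitude = np.linalg.norm(displacements, axis=2).mean(axis=0)
print(mean_magnitude)   # one average displacement (mm) per marker
```

Comparing such displacement fields between android and human faces is one way differences in the flow of facial motion, like the flattened upper-face movements the study describes, can be quantified.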

“Advanced artificial systems can be difficult to design because the numerous components interact with each other. The appearance of an android face can experience surface deformations that are hard to control,” study first author Hisashi Ishihara says. These deformations can be due to interactions between components such as the soft skin sheet and the skull-shaped structure, as well as the mechanical actuators.

1.2-billion-pixel panorama of Mars by the Curiosity rover at Sol 3060 (March 15, 2021)

🎬 360VR video 8K:
🔎 360VR photo 85K: http://bit.ly/sol3060

NASA’s Mars Exploration Program
Source images credit: NASA / JPL-Caltech / MSSS
Stitching and retouching: Andrew Bodrov / 360pano.eu

Music in video
Song: Gates Of Orion
Artist: Dreamstate Logic (http://www.dreamstatelogic.com)