
Join us on Patreon!
https://www.patreon.com/MichaelLustgartenPhD

Papers referenced in the video:
Life-Span Extension in Mice by Preweaning Food Restriction and by Methionine Restriction in Middle Age.
https://pubmed.ncbi.nlm.nih.gov/19414512/

Low methionine ingestion by rats extends life span.
https://pubmed.ncbi.nlm.nih.gov/8429371/

Fasting glucose level and all-cause or cause-specific mortality in Korean adults: a nationwide cohort study.
https://pubmed.ncbi.nlm.nih.gov/32623847/

Total plasma homocysteine and cardiovascular risk profile. The Hordaland Homocysteine Study.
https://pubmed.ncbi.nlm.nih.gov/7474221/

Predicting Age by Mining Electronic Medical Records with Deep Learning Characterizes Differences between Chronological and Physiological Age.

The same goes for artificial intelligence (AI) and machine learning (ML) models.

And just as the human brain created AI and ML models that grow increasingly sophisticated by the day, these systems are now being applied to study the human brain itself. Specifically, such studies are seeking to enhance the capabilities of AI systems and more closely model them after brain functions so that they can operate in increasingly autonomous ways.

Researchers at Meta AI have embarked on one such initiative. The research arm of Facebook’s parent company today announced a long-term study to better understand how the human brain processes language. Researchers are looking at how the brain and AI language models respond to the same spoken or written sentences.
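One common way to compare how a brain and a language model respond to the same sentences is representational similarity analysis: build a sentence-by-sentence dissimilarity matrix for each system and correlate the two. The sketch below illustrates the idea on purely synthetic data (the "brain" array is a random stand-in, not real recordings, and the analysis choice is an assumption, not Meta AI's published method):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins: per-sentence "brain responses" (e.g., averaged fMRI
# features) and language-model embeddings for the same 50 sentences. The
# embeddings are partly driven by the same underlying signal plus noise.
n_sentences = 50
brain = rng.normal(size=(n_sentences, 200))
model = (brain @ rng.normal(size=(200, 64))) * 0.5
model += rng.normal(size=(n_sentences, 64))

def rdm(x):
    """Representational dissimilarity matrix: 1 - correlation between rows."""
    return 1.0 - np.corrcoef(x)

# Compare the two spaces by correlating the upper triangles of their RDMs.
iu = np.triu_indices(n_sentences, k=1)
similarity = np.corrcoef(rdm(brain)[iu], rdm(model)[iu])[0, 1]
print(f"brain-model representational similarity: {similarity:.2f}")
```

A high correlation would suggest the two systems organize the sentences similarly, even though their raw feature spaces differ.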

The West Japan Rail Company (or JR West) unveiled its Gundam-style heavy equipment robot for carrying out repairs.

The robot takes the form of a humanoid upper body mounted on the end of a hydraulic crane arm, which rides around the rail system atop a specially braced rail car. The rail car can deploy stabilizing legs when it arrives at its destination along the line, allowing the robot to handle heavy equipment around the rail system in place of human workers “to improve productivity and safety.”

NASA has an ambitious plan to bring a piece of Mars back to Earth for study. Called the Mars Sample Return mission, the idea is to send a robotic team consisting of a lander, rover, and an ascent vehicle to the red planet to pick up samples being collected and sealed in tubes by the Perseverance rover. These samples will then be launched off the Martian surface and into orbit, where they’ll be collected and brought back to Earth.

If that sounds complicated, it is. NASA is working on some of the hardware required for this ambitious long-term mission, and recently the agency tested out a new design for the Earth Entry System vehicle which will carry the sample through our planet’s atmosphere and to the surface. And its test was a dramatic one — dropping a model of the vehicle from 1,200 feet and seeing if it survived.

The test focused on the vehicle’s aeroshell, trying out one possible design for the shell that has to protect the delicate electronics and sample inside from the heat and forces of passing through Earth’s atmosphere. To do this, the test was performed at the Utah Test and Training Range, where a helicopter ascended with a model of the vehicle and aeroshell, called a Manufacturing Demonstration Unit (MDU), that was covered in sensors and measures 1.25 meters across. The MDU was then dropped by the helicopter and its descent was recorded. Falling from an altitude of 1,200 feet, the MDU reached speeds that engineers think are equivalent to those of an actual sample-return landing.

Our brain is constantly working to make sense of the world around us and to find patterns in it; even when we are asleep, the brain is storing patterns. Making sense of the brain itself, however, has remained an intricate pursuit.

Christof Koch, a well-known neuroscientist, famously called the human brain the “most complex object in our observable universe” [1]. Aristotle, on the other hand, thought it was the heart that gave rise to consciousness and that the brain functioned as a cooling system, both practically and philosophically [2]. Theories of the brain have evolved since then, generally shaped by knowledge gathered over centuries. Historically, to analyze the brain, we had to either extract the brain from deceased people or perform invasive surgery. Progress over the past decades has led to inventions that allow us to study the brain without invasive surgery. Examples of such imaging techniques include macroscopic approaches such as functional magnetic resonance imaging (fMRI) and approaches with high temporal resolution such as electroencephalography (EEG). Advances in treatments, such as closed-loop electrical stimulation systems, have enabled the treatment of disorders like epilepsy and, more recently, depression [3, 4]. Existing neuroimaging approaches can produce a considerable amount of data about a very complex organ that we still do not fully understand, which has led to interest in non-linear modeling approaches and algorithms equipped to learn meaningful features.

This article provides an informal introduction to unique aspects of neuroimaging data and how we can leverage these aspects with deep learning algorithms. Specifically, this overview will first explain some common neuroimaging modalities more in-depth and then discuss applications of deep learning in conjunction with some of the unique characteristics of neuroimaging data. These unique characteristics tie into a broader movement in deep learning, namely that data understanding should be a goal in itself to maximize the impact of applied deep learning.
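Before reaching for deep networks, neuroimaging studies typically establish a simple learned baseline on extracted features. The toy sketch below trains a minimal logistic-regression classifier on synthetic "EEG band-power" features; the data, labels, and feature names are all invented for illustration, not drawn from the article:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in for a neuroimaging classification task: each "subject" is a
# vector of EEG band-power features, and the label is a synthetic condition.
n, d = 200, 16
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = (X @ w_true + 0.5 * rng.normal(size=n) > 0).astype(float)

# Minimal logistic-regression trainer (plain gradient descent) -- the kind of
# linear baseline a deep model on raw neuroimaging data would be compared to.
w = np.zeros(d)
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w)))   # predicted probabilities
    w -= 0.1 * (X.T @ (p - y)) / n       # gradient step on log-loss

accuracy = np.mean(((1.0 / (1.0 + np.exp(-(X @ w)))) > 0.5) == (y == 1))
print(f"training accuracy: {accuracy:.2f}")
```

Deep models earn their keep when they beat such baselines by learning features directly from the raw fMRI or EEG signal rather than from hand-picked summaries.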

Waves break once they swell to a critical height, cresting and crashing into a spray of droplets and bubbles. These waves can be as large as a surfer’s point break or as small as a gentle ripple rolling to shore. For decades, the dynamics of how and when a wave breaks have been too complex to predict.

Now, MIT engineers have found a new way to model how waves break. The team used machine learning along with data from wave-tank experiments to tweak equations that have traditionally been used to predict wave behavior. Engineers typically rely on such equations to help them design resilient offshore platforms and structures. But until now, the equations have not been able to capture the complexity of breaking waves.

The updated model made more accurate predictions of how and when waves break, the researchers found. For instance, the model estimated a wave’s steepness just before breaking, and its energy and frequency after breaking, more accurately than the conventional wave equations.
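The general recipe described here, keeping a physics-based equation and fitting a data-driven correction term from experiment, can be sketched in miniature. Everything below is synthetic: the "baseline equation," the feature choice, and the wave-tank records are invented stand-ins, not the MIT team's actual model:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical wave-tank records: steepness and frequency before breaking,
# plus a "true" post-breaking energy that the baseline equation misses part of.
n = 300
steepness = rng.uniform(0.1, 0.4, n)
frequency = rng.uniform(0.5, 2.0, n)
baseline = 0.5 * steepness**2                           # stand-in physics model
true_energy = baseline + 0.3 * steepness * frequency    # effect baseline misses

# Fit a data-driven correction to the baseline's residual (least squares here,
# where the real work used machine learning on wave-tank data).
features = np.column_stack([steepness * frequency, np.ones(n)])
coef, *_ = np.linalg.lstsq(features, true_energy - baseline, rcond=None)
corrected = baseline + features @ coef

err_base = np.mean((true_energy - baseline) ** 2)
err_corr = np.mean((true_energy - corrected) ** 2)
print(f"baseline MSE: {err_base:.5f}, corrected MSE: {err_corr:.5f}")
```

The appeal of this hybrid approach is that the correction only has to learn what the physics gets wrong, so far less data is needed than for a model trained from scratch.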

The West Japan Rail Company has released video of its new humanoid heavy equipment robot. Mounted on the end of a crane, this Gundam-style robot torso mimics the arm and head motions of a human pilot, who sees through the robot’s eyes via VR goggles.

The key objectives here, according to the company, are “to improve productivity and safety,” enabling workers to lift and naturally manipulate heavy equipment around the rail system without exposing them to the risk of electric shocks or falling.

The robot’s large torso is mounted to a hydraulic crane arm, which rides around the rail system on a specially braced rail car, putting down stabilizing legs when it’s time to get to work.

ECHO, the robot, belongs to the Woods Hole Oceanographic Institution and rolls around the tundra collecting data used to study marine ecosystems.

The small robot takes readings and collects data much like a human researcher would, but it allows scientists to gather real-time information year-round while minimizing the impact their presence could have on the animals’ lives.

Researchers say the penguins seem to be getting along swimmingly with the robot.