
AI age teller: Using chest X-ray to tell your age

Researchers in Japan have created an uncanny AI model that can estimate your age from the appearance of your chest X-ray. It could help doctors detect chronic disorders early.

Have you ever wondered why some people look much older than their chronological age? A new study from Japan’s Osaka Metropolitan University (OMU) suggests this could be a sign of a disease they don’t yet know they have.

The study authors developed an AI program that accurately estimates an individual’s age from their chest X-ray. Unlike various previously reported AI programs, which examine radiographs to detect lung anomalies, this model estimates age; the researchers then use the gap between estimated and chronological age to help predict disease.

AI And The Issues with Data Scraping


Many artificial intelligence tools use public data to train their large language models, but now large social media sites are looking for ways to defend against data scraping. The problem is that scraping isn’t currently illegal.

According to Cybernews, data scraping refers to a computer program extracting data from the output generated by another program, and it is becoming a big problem for large social media sites like Twitter and Reddit.
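To make the definition above concrete, here is a minimal sketch of what scraping looks like in practice: one program (the scraper) pulls structured data out of output meant for another consumer (a browser). The tag and class names, and the canned HTML page, are illustrative assumptions, not any real site’s markup.

```python
from html.parser import HTMLParser

class PostTitleScraper(HTMLParser):
    """Collects the text of <h2 class="post-title"> elements.

    The "post-title" class is a hypothetical example, not taken
    from any real site's markup.
    """
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.titles = []

    def handle_starttag(self, tag, attrs):
        if tag == "h2" and ("class", "post-title") in attrs:
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "h2":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.titles.append(data.strip())

# A canned page stands in for the HTTP response a real scraper would fetch.
page = """
<html><body>
  <h2 class="post-title">First post</h2>
  <h2 class="post-title">Second post</h2>
</body></html>
"""

scraper = PostTitleScraper()
scraper.feed(page)
print(scraper.titles)  # ['First post', 'Second post']
```

Because the scraper only needs the publicly served HTML, a site cannot easily distinguish it from an ordinary visitor, which is why platforms struggle to defend against scraping technically rather than legally.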

How old are you, really? AI can tell your true age by looking at your chest

What if “looking your age” refers not to your face, but to your chest? Osaka Metropolitan University scientists have developed an advanced artificial intelligence (AI) model that utilizes chest radiographs to accurately estimate a patient’s chronological age. More importantly, when there is a disparity, it can signal a correlation with chronic disease.

These findings mark a leap forward, paving the way for improved early disease detection and intervention. The results are published in The Lancet Healthy Longevity.

The research team, led by graduate student Yasuhito Mitsuyama and Dr. Daiju Ueda from the Department of Diagnostic and Interventional Radiology at the Graduate School of Medicine, Osaka Metropolitan University, first constructed a deep learning-based AI model to estimate age from chest radiographs of healthy individuals.
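The screening idea described above reduces to comparing the model’s estimated age against the patient’s chronological age and treating a large disparity as a warning sign. A minimal sketch of that comparison step follows; the numbers and the 5-year threshold are illustrative assumptions, not values from the study.

```python
def age_gap_flag(estimated_age, chronological_age, threshold=5.0):
    """Return the age gap and whether it exceeds a screening threshold.

    The 5-year default threshold is an illustrative assumption,
    not a value reported by the OMU study.
    """
    gap = estimated_age - chronological_age
    return gap, abs(gap) >= threshold

# Made-up example: the model reads the X-ray of a 50-year-old as 58.
gap, flagged = age_gap_flag(58.0, 50.0)
print(gap, flagged)  # 8.0 True
```

The interesting signal is the gap itself, not the estimate: a model that is accurate on healthy individuals makes its “errors” on unhealthy ones informative.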

Google reportedly building A.I. that offers life advice

One of Google’s AI units is using generative AI to develop at least 21 different tools for life advice, planning and tutoring, The New York Times reported Wednesday.

Google’s DeepMind has become the “nimble, fast-paced” standard-bearer for the company’s AI efforts, as CNBC previously reported, and is behind the development of the tools, the Times reported.

News of the tools’ development comes after Google’s own AI safety experts had reportedly presented a slide deck to executives in December that said users taking life advice from AI tools could experience “diminished health and well-being” and a “loss of agency,” per the Times.

Neuroscientists Re-create Pink Floyd Song from Listeners’ Brain Activity

Neuroscientists have worked for decades to decode what people are seeing, hearing or thinking from brain activity alone. In 2012 a team that included the new study’s senior author—cognitive neuroscientist Robert Knight of the University of California, Berkeley—became the first to successfully reconstruct audio recordings of words participants heard while wearing implanted electrodes. Others have since used similar techniques to reproduce recently viewed or imagined pictures from participants’ brain scans, including human faces and landscape photographs. But the recent PLOS Biology paper by Knight and his colleagues is the first to suggest that scientists can eavesdrop on the brain to synthesize music.

“These exciting findings build on previous work to reconstruct plain speech from brain activity,” says Shailee Jain, a neuroscientist at the University of California, San Francisco, who was not involved in the new study. “Now we’re able to really dig into the brain to unearth the sustenance of sound.”

To turn brain activity data into musical sound in the study, the researchers trained an artificial intelligence model to decipher data captured from thousands of electrodes that were attached to the participants as they listened to the Pink Floyd song while undergoing surgery.
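The decoding step described above can be caricatured as a regression problem: learn a map from electrode-derived features to an audio representation, then apply it to reconstruct sound. The toy sketch below uses synthetic data and a plain least-squares fit; the array sizes and the linear model are assumptions for illustration and do not reflect the study’s actual data or architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins: 200 time windows, 16 "electrode" features,
# 8 spectrogram bins of "audio" per window (all sizes are made up).
n_windows, n_electrodes, n_bins = 200, 16, 8
true_weights = rng.normal(size=(n_electrodes, n_bins))
neural = rng.normal(size=(n_windows, n_electrodes))
audio = neural @ true_weights + 0.01 * rng.normal(size=(n_windows, n_bins))

# Fit a linear decoder: least-squares map from neural features to audio bins.
weights, *_ = np.linalg.lstsq(neural, audio, rcond=None)

# Reconstruct the "audio" and measure the fit.
reconstructed = neural @ weights
error = float(np.mean((reconstructed - audio) ** 2))
print(error)  # small residual error, on the order of the injected noise
```

Real decoding pipelines work on high-gamma band power from intracranial electrodes and invert a spectrogram back to a waveform, but the core idea, a learned mapping from neural features to an acoustic representation, is the same.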

The humanoid robot that can pilot an airplane better than a human

The robot’s memory is so large that it can memorise all Jeppesen navigation charts, a task that is impossible for human pilots.

Both artificial intelligence (AI) and robotics have made significant strides in recent years, meaning many human jobs could soon be taken over by technology — on the ground and even in the skies above us.

A team of engineers and researchers from the Korea Advanced Institute of Science & Technology (KAIST) is currently developing a humanoid robot that can fly aircraft without needing to modify the cockpit.

Mimicking the Mind: Quantum Material Exhibits Brain-Like “Non-Local” Behavior

UC San Diego’s Q-MEEN-C is developing brain-like computers through mimicking neurons and synapses in quantum materials. Recent discoveries in non-local interactions represent a critical step towards more efficient AI hardware that could revolutionize artificial intelligence technology.

We often believe that computers are more efficient than humans. After all, computers can solve complex math equations in an instant and recall names that we might forget. However, human brains can process intricate layers of information rapidly, accurately, and with almost no energy input. Recognizing a face after seeing it only once or distinguishing a mountain from an ocean are examples of such tasks. These seemingly simple human functions require considerable processing and energy from computers, and even then, the results may vary in accuracy.
