
Elon Musk on Neuralink: Solving Brain Diseases & Reducing the Risk of AI

Elon Musk delves into the groundbreaking potential of Neuralink, a revolutionary venture aimed at interfacing with the human brain to tackle an array of brain-related disorders. Musk envisions a future where Neuralink’s advancements lead to the resolution of conditions like autism, schizophrenia, memory loss, and even spinal cord injuries.

Elon Musk discusses the transformative power of Neuralink, highlighting its role in restoring motor control after spinal cord injuries, revitalizing brain function post-stroke, and combating genetically or trauma-induced brain diseases. Musk’s compelling insights reveal how interfacing with neurons at an intricate level can pave the way for repairing and enhancing brain circuits using cutting-edge technology.

Discover the three-layer framework Musk envisions: the primary layer akin to the limbic system, the more intelligent cortex as the secondary layer, and the potential tertiary layer where digital superintelligence might exist. Musk’s thought-provoking perspective raises optimism about the coexistence of a digital superintelligence with the human brain, fostering a harmonious relationship between these layers of consciousness.

Elon Musk emphasises the urgency of Neuralink’s mission, stressing the importance of developing a human brain interface before the advent of digital superintelligence and the elusive singularity. By doing so, he believes we can mitigate existential risks and ensure a stable future for humanity and consciousness as we navigate the uncharted territories of technological evolution.

For more insights, visit EM360tech.com:
https://em360tech.com/tech-news.

#AI #superintelligence #machinelearning.

An energy-efficient analog chip for AI inference

We’re just at the beginning of an AI revolution that will redefine how we live and work. In particular, deep neural networks (DNNs) have revolutionized the field of AI and are increasingly gaining prominence with the advent of foundation models and generative AI. But running these models on traditional digital computing architectures limits their achievable performance and energy efficiency. There has been progress in developing hardware specifically for AI inference, but many of these architectures physically split the memory and processing units. This means the AI models are typically stored in a discrete memory location, and computational tasks require constantly shuffling data between the memory and processing units. This process slows down computation and limits the maximum achievable energy efficiency.
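To get a rough sense of why that shuffling matters, here is a back-of-the-envelope Python sketch. The per-MAC and per-byte energy figures are order-of-magnitude assumptions for illustration, not measurements from this chip, but they show how quickly weight movement can dominate a layer’s energy budget when the weights live off-chip.

```python
# Back-of-the-envelope sketch of why data movement dominates in a split
# memory/compute design. The energy figures are rough, illustrative
# assumptions (order of magnitude only), not measurements from the chip
# described above.

MAC_ENERGY_PJ = 1.0               # assumed energy per multiply-accumulate (picojoules)
DRAM_ACCESS_PJ_PER_BYTE = 100.0   # assumed energy to fetch one byte from off-chip memory

def layer_energy(in_features: int, out_features: int, bytes_per_weight: int = 1):
    """Estimate compute vs. weight-fetch energy for one fully connected layer."""
    macs = in_features * out_features
    weight_bytes = macs * bytes_per_weight
    compute_pj = macs * MAC_ENERGY_PJ
    movement_pj = weight_bytes * DRAM_ACCESS_PJ_PER_BYTE
    return compute_pj, movement_pj

if __name__ == "__main__":
    compute, movement = layer_energy(in_features=4096, out_features=4096)
    total = compute + movement
    print(f"compute: {compute / 1e6:.1f} uJ, weight movement: {movement / 1e6:.1f} uJ")
    print(f"data movement share of total energy: {movement / total:.0%}")
    # Keeping the weights inside (analog) memory removes most of the movement
    # term, which is the efficiency argument for in-memory computing.
```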



The chip showcases critical building blocks of a scalable mixed-signal architecture.

Scientists Recreate Pink Floyd Song by Reading Brain Signals of Listeners

The researchers also found a spot in the brain’s temporal lobe that reacted when volunteers heard the 16th notes of the song’s guitar groove. They proposed that this particular area might be involved in our perception of rhythm.

The findings offer a first step toward creating more expressive devices to assist people who can’t speak. Over the past few years, scientists have made major breakthroughs in extracting words from the electrical signals produced by the brains of people with muscle paralysis when they attempt to speak.

But a significant amount of the information conveyed through speech comes from what linguists call “prosodic” elements, like tone — “the things that make us a lively speaker and not a robot,” Dr. Schalk said.

“AI Town” lets you build your own GPT-based AI civilization

Venture capital firm a16z has rebuilt a research paper’s simulation as “AI Town” and released the code. AI Town uses a language model to simulate a Sims-like virtual world in which all characters can flexibly pursue their motives and make decisions based on prompts.

In April, a team of researchers from Google and Stanford published a research paper in which OpenAI’s GPT-3.5 simulates AI agents in a small digital town, “Smallville,” based solely on prompts.

Each character has an occupation, personality, and relationships with other characters, which are specified in an initial description. With further prompts, the AI agents begin to observe, plan, and make decisions.
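That loop is easy to picture in code. Below is a minimal Python sketch of the observe-plan-act cycle the article describes, assuming a generic chat-completion backend; `call_llm`, the `Agent` class, and the prompt wording are illustrative placeholders, not the actual AI Town or Smallville implementation.

```python
# Minimal sketch of a generative-agent loop: each agent gets an identity
# (occupation, personality, relationships) as an initial description, then
# repeatedly observes, plans, and acts via LLM prompts.

from dataclasses import dataclass, field

def call_llm(prompt: str) -> str:
    """Hypothetical LLM call; wire this to a real chat-completion client."""
    raise NotImplementedError

@dataclass
class Agent:
    name: str
    description: str                      # occupation, personality, relationships
    memories: list[str] = field(default_factory=list)

    def observe(self, event: str) -> None:
        # Store raw observations; real systems also score importance
        # and periodically summarize ("reflect") over them.
        self.memories.append(event)

    def plan_and_act(self, situation: str) -> str:
        recent = "\n".join(self.memories[-10:])
        prompt = (
            f"You are {self.name}. {self.description}\n"
            f"Recent memories:\n{recent}\n"
            f"Current situation: {situation}\n"
            "Decide what you do next, in one short sentence."
        )
        action = call_llm(prompt)
        self.memories.append(f"I decided to: {action}")
        return action

# Usage (once call_llm is connected to a real model):
# alice = Agent("Alice", "A cafe owner who is friendly and knows Bob well.")
# alice.observe("Bob walked into the cafe looking tired.")
# print(alice.plan_and_act("It is 9am and the cafe just opened."))
```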

Nvidia’s earnings will be the AI hype cycle’s biggest test

Nvidia (NVDA) will report its second quarter earnings after the closing bell next Wednesday, setting up what will be the AI hype cycle’s biggest test yet. During this AI gold rush, companies around the world looking to profit have turned to Nvidia’s graphics processors to power new AI software and platforms.

Currently, tech firms of all sizes are doing everything they can to get their hands on Nvidia chips. During Tesla’s (TSLA) Q2 earnings call, CEO Elon Musk told analysts that the automaker will take as many Nvidia graphics processors as the company can produce.


Nvidia is widely expected to have a blowout earnings report. A miss could derail the AI hype train.

Meet Pibot: Korea’s LLM-powered smart robotic pilot

Called “Pibot,” this humanoid robot integrates large language models to help it fly any aircraft as well as, if not better than, a human pilot.

Researchers at the Korea Advanced Institute of Science & Technology (KAIST) are working to develop a humanoid pilot that can fly an aircraft without modifying the cockpit. The robot has articulated arms and fingers that interact with flight controls with great precision and dexterity, and camera “eyes” that help it monitor the internal and external conditions of the aircraft while in control.


Korea Herald.

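As a purely hypothetical illustration of the perceive-reason-actuate loop such a robot pilot might run, here is a short Python sketch. Every function name is a placeholder; KAIST has not published Pibot’s software, so this only shows where a language model could sit between camera perception and the robot’s arms.

```python
# Hypothetical sketch of a perceive -> reason -> actuate loop for a robot
# pilot. None of these function names come from KAIST's actual software;
# they are placeholders for illustration only.

def read_cockpit_state(camera_frame) -> str:
    """Placeholder: turn camera input into a textual instrument readout."""
    raise NotImplementedError

def call_llm(prompt: str) -> str:
    """Placeholder for a large language model query."""
    raise NotImplementedError

def execute_with_arms(action: str) -> None:
    """Placeholder: map a described action to robot arm/finger motions."""
    raise NotImplementedError

def pilot_step(camera_frame, checklist_excerpt: str) -> str:
    """One decision cycle: read instruments, ask the LLM, move the controls."""
    state = read_cockpit_state(camera_frame)
    prompt = (
        "You are assisting with flying an aircraft.\n"
        f"Relevant checklist section:\n{checklist_excerpt}\n"
        f"Current instrument readings:\n{state}\n"
        "Name the single next control input to make."
    )
    action = call_llm(prompt)
    execute_with_arms(action)
    return action
```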

AI age teller: Using chest X-ray to tell your age

Japanese researchers have created an uncanny AI model that can estimate your true age from your chest X-ray. It could help doctors with the early detection of chronic disorders.

Have you ever wondered why some people look much older than their chronological age? A new study from Japan’s Osaka Metropolitan University (OMU) suggests this could be a sign of a disease they don’t yet know they have.

The study authors developed an AI program that can accurately estimate an individual’s age from their chest X-ray. Unlike various previously reported AI programs that examine radiographs to detect lung anomalies, this model estimates age, and the researchers then use that estimate to flag possible ailments.

AI And The Issues with Data Scraping


Many artificial intelligence tools use public data to train their large language models, but now large social media sites are looking for ways to defend against data scraping. The problem is that scraping isn’t currently illegal.

According to Cybernews, data scraping refers to a computer program extracting data from the output generated by another program, and it is becoming a big problem for large social media sites like Twitter or Reddit.
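For readers unfamiliar with the term, the sketch below shows the basic mechanics: one program fetches the output another program generates (a web page, in this case) and pulls structured data out of it. The URL and the choice of elements to extract are placeholders, and real sites layer terms of service and rate limits on top of this.

```python
# Minimal illustration of data scraping: fetch a page another program
# serves and extract structured data from its HTML. The URL and the
# selector are placeholders for illustration.

import requests
from bs4 import BeautifulSoup

def scrape_titles(url: str) -> list[str]:
    response = requests.get(url, timeout=10, headers={"User-Agent": "example-scraper"})
    response.raise_for_status()
    soup = BeautifulSoup(response.text, "html.parser")
    # Pull the text of every <h2> element; real selectors vary by site.
    return [h2.get_text(strip=True) for h2 in soup.find_all("h2")]

# Usage (hypothetical URL):
# print(scrape_titles("https://example.com/posts"))
```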

How old are you, really? AI can tell your true age by looking at your chest

What if “looking your age” refers not to your face, but to your chest? Osaka Metropolitan University scientists have developed an advanced artificial intelligence (AI) model that utilizes chest radiographs to accurately estimate a patient’s chronological age. More importantly, when there is a disparity, it can signal a correlation with chronic disease.

These findings mark a significant leap forward, paving the way for improved early disease detection and intervention. The results are published in The Lancet Healthy Longevity.

The research team, led by graduate student Yasuhito Mitsuyama and Dr. Daiju Ueda from the Department of Diagnostic and Interventional Radiology at the Graduate School of Medicine, Osaka Metropolitan University, first constructed a deep learning-based AI model to estimate age from chest radiographs of healthy individuals.
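For readers curious what such a model might look like, here is a rough PyTorch sketch of age regression from radiographs, plus the predicted-minus-actual “age gap” the article describes as a potential disease signal. The backbone, preprocessing, and training details are assumptions for illustration, not the authors’ published pipeline.

```python
# Rough sketch of a deep learning model that regresses chronological age
# from chest radiographs. Architecture and data handling are assumptions,
# not the study's actual pipeline.

import torch
import torch.nn as nn
from torchvision import models

class AgeRegressor(nn.Module):
    def __init__(self):
        super().__init__()
        backbone = models.resnet18(weights=None)
        # Chest X-rays are single-channel, so swap the first conv layer.
        backbone.conv1 = nn.Conv2d(1, 64, kernel_size=7, stride=2, padding=3, bias=False)
        # Replace the classification head with a single age output.
        backbone.fc = nn.Linear(backbone.fc.in_features, 1)
        self.backbone = backbone

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.backbone(x).squeeze(1)

def age_gap(model: AgeRegressor, xray: torch.Tensor, true_age: float) -> float:
    """Predicted minus chronological age; a large gap may warrant follow-up."""
    model.eval()
    with torch.no_grad():
        predicted = model(xray.unsqueeze(0)).item()  # xray: (1, H, W) grayscale
    return predicted - true_age

# Training would minimize an L1/L2 loss between predictions and known ages
# on radiographs of healthy individuals, as the article describes.
```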