Dec 25, 2024
Dynamical Modeling of Extracted Connectomes
Posted by Dan Breeden in category: futurism
The world of artificial intelligence (AI) has made remarkable strides in recent years, particularly in understanding human language. At the heart of this revolution is the Transformer model, a core innovation that allows large language models (LLMs) to process and understand language with an efficiency that previous models could only dream of. But how do Transformers work? To explain this, let’s take a journey through their inner workings, using stories and analogies to make the complex concepts easier to grasp.
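The mechanism at the heart of the Transformer is scaled dot-product attention: each token's query is scored against every other token's key, and those scores weight a mixture of value vectors. A minimal sketch over random toy vectors (my own illustration, not any particular model's code):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: shift by the row max before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    # Scores are scaled by sqrt(d) and normalized so each row of
    # weights sums to 1; the output mixes value vectors accordingly.
    d = Q.shape[-1]
    weights = softmax(Q @ K.T / np.sqrt(d))
    return weights @ V

rng = np.random.default_rng(0)
seq_len, d = 4, 8  # 4 toy tokens, 8-dimensional embeddings
Q = rng.standard_normal((seq_len, d))
K = rng.standard_normal((seq_len, d))
V = rng.standard_normal((seq_len, d))
out = attention(Q, K, V)
print(out.shape)  # (4, 8): one context-aware vector per token
```

Because every token attends to every other token in one matrix multiply, the whole sequence is processed in parallel, which is the efficiency gain over earlier sequential models.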
Researchers at the California Institute of Technology have unveiled a startling revelation about the human mind: our thoughts move at a mere 10 bits per second, a rate that pales in comparison to the staggering billion bits per second at which our sensory systems gather environmental data. This discovery, published in the journal Neuron, is challenging long-held assumptions about human cognition.
The research, conducted in the laboratory of Markus Meister, the Anne P. and Benjamin F. Biaggini Professor of Biological Sciences at Caltech, and spearheaded by graduate student Jieyu Zheng, applied information-theoretic techniques to an extensive collection of scientific literature. By analyzing human behaviors such as reading, writing, video gaming, and Rubik's Cube solving, the team arrived at the 10 bits per second figure, a rate that Meister describes as "extremely low."
To put this in perspective, a typical Wi-Fi connection processes about 50 million bits per second, making our thought processes seem glacial by comparison. This stark contrast raises a paradox that Meister and his team are eager to explore further: “What is the brain doing to filter all of this information?”
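A back-of-envelope calculation shows how a figure like 10 bits per second can arise. Shannon's classic estimate puts the entropy of English text near 1 bit per character, and a fast typist manages roughly 120 words per minute; these specific numbers are my assumptions for illustration, not values from the Neuron paper:

```python
# Back-of-envelope check (assumed numbers, not from the Neuron paper):
bits_per_char = 1.0      # Shannon-style entropy estimate for English text
words_per_minute = 120   # assumed fast typing speed
chars_per_word = 5       # conventional "word" length

chars_per_second = words_per_minute * chars_per_word / 60  # 10 chars/s
thought_rate = chars_per_second * bits_per_char            # ~10 bits/s

wifi_rate = 50e6  # bits/s, the article's Wi-Fi figure
print(thought_rate)              # 10.0
print(wifi_rate / thought_rate)  # 5000000.0
```

On these assumptions, the Wi-Fi connection in the article's comparison moves information five million times faster than deliberate human output.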
Apple's latest machine learning research could make the models behind Apple Intelligence faster, thanks to a technique that almost triples the rate of token generation when using Nvidia GPUs.
One of the problems in creating large language models (LLMs) for tools and apps that offer AI-based functionality, such as Apple Intelligence, is inefficiencies in producing the LLMs in the first place. Training models for machine learning is a resource-intensive and slow process, which is often countered by buying more hardware and taking on increased energy costs.
Earlier in 2024, Apple published and open-sourced Recurrent Drafter, known as ReDrafter, a speculative decoding method for improving generation performance. It combines an RNN (recurrent neural network) draft model with beam search and dynamic tree attention to predict and verify draft tokens along multiple paths.
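The general draft-then-verify idea behind speculative decoding can be sketched with toy models: a cheap draft model proposes a few tokens ahead, and the expensive target model verifies them, keeping the agreeing prefix. This is not Apple's ReDrafter code, only an illustration of the underlying technique:

```python
import random

# Toy stand-ins for the two models. The cheap draft model guesses the
# next token; the expensive target model is authoritative.
def target_model(context):
    return (sum(context) + 1) % 7

def draft_model(context):
    guess = target_model(context)
    # 20% of the time the draft is wrong, simulating a weaker model.
    return guess if random.random() < 0.8 else (guess + 1) % 7

def speculative_decode(context, steps, k=4):
    context = list(context)
    while len(context) < steps:
        # 1. Draft k tokens cheaply with the small model.
        drafted, ctx = [], list(context)
        for _ in range(k):
            tok = draft_model(ctx)
            drafted.append(tok)
            ctx.append(tok)
        # 2. Verify drafts with the target model: keep the agreeing
        #    prefix, substitute the correct token at the first mismatch.
        for tok in drafted:
            correct = target_model(context)
            context.append(correct)
            if tok != correct or len(context) >= steps:
                break
    return context

print(speculative_decode([1], steps=10))  # [1, 2, 4, 1, 2, 4, 1, 2, 4, 1]
```

The output matches plain one-token-at-a-time decoding with the target model; the speedup comes from verifying several drafted tokens per expensive model call instead of generating one.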
An AI agent helped drive a memecoin to a billion-dollar market cap this year, but the real crypto x AI innovations are coming in 2025.
Researchers at Australia’s Monash University are using a common medicine cabinet antiseptic in unique battery chemistry that could soon power drones and other electric aircraft, according to a school news release.
The team is tapping Betadine, a brand of povidone-iodine antiseptic commonly used to treat cuts and other wounds, in research that is yielding surprising results.
What if robots could work together like ants to move objects, clear blockages, and guide living creatures?
Scientists at Hanyang University in Seoul, South Korea, have developed small magnetic robots that work together in swarms to perform complex tasks, such as moving and lifting objects much larger than themselves. These microrobot swarms, controlled by a rotating magnetic field, can operate in challenging environments, offering solutions for tasks like minimally invasive treatment of clogged arteries and guiding small organisms.
The researchers tested how microrobot swarms with different configurations performed various tasks. They discovered that swarms with a high aspect ratio could climb obstacles five times higher than a single robot’s body length and throw themselves over them. In another demonstration, a swarm of 1,000 microrobots formed a raft on water, surrounding a pill 2,000 times heavier than a single robot, allowing the swarm to transport the drug through the liquid. On land, a swarm moved cargo 350 times heavier than each robot, while another swarm unclogged tubes resembling blocked blood vessels. Using spinning and orbital dragging motions, the team also developed a system where robot swarms could guide the movements of small organisms.
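Some rough arithmetic on the article's figures, assuming the load is shared evenly across the swarm (my idealization, not a claim from the study):

```python
# Rough arithmetic on the article's figures, assuming even load
# sharing across the swarm (an idealization, not from the study).
raft_robots = 1000        # robots in the raft demonstration
pill_mass_ratio = 2000    # pill mass / single-robot mass
cargo_mass_ratio = 350    # land cargo mass / single-robot mass

# Each raft robot supports about twice its own mass.
per_robot_raft_load = pill_mass_ratio / raft_robots
print(per_robot_raft_load)  # 2.0

# The land swarm's size isn't given; this is the per-robot pull an
# evenly loaded swarm of N robots would need for the 350x cargo.
for n in (100, 350, 1000):
    print(n, cargo_mass_ratio / n)
```

Under even sharing, each raft robot carries only about twice its own mass, which is why a swarm can move cargo that dwarfs any individual member.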