AI eating its own tail: the risk of model collapse in generative systems.
BHP’s (ASX, NYSE: BHP) Spence copper mine in Chile has marked three months as the company’s first fully autonomous operation, a status reached in April after a two-year journey that included converting its truck fleet and drilling rigs.
Spence, which produced 249,000 tonnes of copper last year, is BHP’s second-largest copper mine behind Escondida, the world’s biggest copper operation. In the three months to July 29, the operation moved 80 million tonnes of material without any safety incidents, exceeding the production plan to date, BHP said.
For the first time, researchers have demonstrated that not just individual bits but entire bit sequences can be stored in cylindrical domains: tiny magnetic regions measuring only around 100 nanometers across. As the team reports in the journal Advanced Electronic Materials, these findings could pave the way for novel types of data storage and sensors, including even magnetic variants of neural networks.
Groundbreaking Magnetic Storage
“A cylindrical domain, which we physicists also call a bubble domain, is a tiny, cylindrical area in a thin magnetic layer. Its spins, the electrons’ intrinsic angular momentum that generates the magnetic moment in the material, point in a specific direction. This creates a magnetization that differs from the rest of the environment. Imagine a small, cylinder-shaped magnetic bubble floating in a sea of opposite magnetization,” says Prof. Olav Hellwig from Helmholtz-Zentrum Dresden-Rossendorf’s Institute of Ion Beam Physics and Materials Research, describing the subject of his research. He and his team are confident that such magnetic structures possess a great potential for spintronic applications.
An AI model developed by scientists at King’s College London, in close collaboration with University College London, has produced three-dimensional, synthetic images of the human brain that are realistic and accurate enough to use in medical research.
The model and images have helped scientists better understand what the human brain looks like, supporting research to predict, diagnose and treat brain diseases such as dementia, stroke, and multiple sclerosis.
The algorithm was created using the NVIDIA Cambridge-1, the UK’s most powerful supercomputer. One of the fastest supercomputers in the world, the Cambridge-1 allowed researchers to train the AI in weeks rather than months and produce images of far higher quality.
Official trailer for HUXLEY: THE ORACLE, the next prequel story in Ben Mauro’s post-apocalyptic sci-fi universe! The Oracle Empire is at the height of its power, and Max is a young recruit in the Ronin army, sent with his team on an important mission deep into the wasteland. What they discover could change the course of history forever. The AI wars have begun. Directed by Syama Pedersen of ‘ASTARTES’ Warhammer 40k fame and the renowned UNIT IMAGE animation studio, this exciting new story dives deeper into the world of HUXLEY.
If you would like to know more, read the original graphic novel and the new Oracle prequel book that tells the story glimpsed in the trailer. Available worldwide for pre-order from Thames & Hudson. https://vol.co/collections/the-oracle.
Official site: https://www.huxleysaga.com
Shiver me timbers: Security researchers have demonstrated that it’s possible to spy on what’s visible on your screen by intercepting electromagnetic radiation from video cables with great accuracy, thanks to artificial intelligence. The team from Uruguay’s University of the Republic says their AI-powered cable-tapping method is good enough that these attacks are likely already happening.
Back in the analog video era, it was relatively straightforward for hackers to reconstruct what was on a screen by detecting the leakage from video cables. But once digital protocols like HDMI took over, that became much trickier. The data zipping through HDMI is much more complex than old analog signals.
However, those digital signals still leak some electromagnetic radiation as they transmit between your computer and display. By training an AI model on samples of matching original and intercepted HDMI signals, the researchers were able to decode those leaks into readable screen captures.
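The pairing idea behind the attack can be illustrated with a toy supervised-learning sketch: a synthetic "leak" distorts a clean signal, and a decoder is fit on matched clean/intercepted pairs to invert the distortion. Everything here is a simplifying assumption for illustration (a linear signal model and a least-squares decoder); the researchers' actual HDMI reconstruction uses a far more complex learned model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the core idea: learn a mapping from a degraded
# "intercepted" signal back to the clean original, using paired samples.
n_samples, n_features = 500, 16

clean = rng.uniform(-1, 1, size=(n_samples, n_features))       # "original" signals
mix = np.eye(n_features) + 0.3 * rng.normal(size=(n_features, n_features))
intercepted = clean @ mix + rng.normal(scale=0.05, size=clean.shape)  # leaked copies

# Fit a linear decoder on the paired samples via least squares.
decoder, *_ = np.linalg.lstsq(intercepted, clean, rcond=None)

# Apply it to a freshly intercepted signal the decoder has never seen.
test_clean = rng.uniform(-1, 1, size=(1, n_features))
test_leak = test_clean @ mix + rng.normal(scale=0.05, size=test_clean.shape)
recovered = test_leak @ decoder

print(np.abs(recovered - test_clean).mean())
```

The same train-on-pairs recipe scales up in the real attack: instead of a linear map over 16 features, a deep network maps captured radio-frequency traces to pixel data.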
“These spots are a big surprise,” said Dr. David Flannery. “On Earth, these types of features in rocks are often associated with the fossilized record of microbes living in the subsurface.”
Did Mars once host life billions of years ago? That is what NASA’s Perseverance (Percy) rover hopes to figure out, and scientists might be one step closer to an answer after a recent discovery by the car-sized robotic explorer: a unique rock with “leopard spots.” Some in the scientific community claim the spots indicate that past life might once have existed on the now cold and dry Red Planet; others have just as quickly cautioned that further evidence is required before jumping to conclusions.
Upon analyzing the rock with Percy’s intricate suite of scientific instruments, scientists determined that it contains chemical signatures consistent with life possibly having existed billions of years ago, when liquid water flowed across the surface. The science team is also weighing other explanations for the rock’s unusual appearance and plans further research to determine whether the findings are consistent with potential ancient life.
The rock’s unique features include calcium sulfate veins with reddish material between them that indicates the presence of hematite, the mineral responsible for the Red Planet’s rusty color. On further inspection of the reddish material, Percy identified dozens of millimeter-scale, off-white splotches ringed by black material, hence the name “leopard spots.”
AI can revolutionize healthcare and boost economic outcomes. But it comes with a lot of risks, too.
Imagine a crew of astronauts headed to Mars. About 140 million miles away from Earth, they discover their spacecraft has a cracked O-ring. But instead of relying on a dwindling cache of spare parts, what if they could simply fabricate any part they needed on demand?
A team of Berkeley researchers, led by Ph.D. student Taylor Waddell, may have taken a giant leap toward making this option a reality. On June 8, they sent their 3D printing technology to space for the first time as part of the Virgin Galactic 7 mission.
Their next-generation microgravity printer—dubbed SpaceCAL—spent 140 seconds in suborbital space aboard the VSS Unity space plane. In that short span, it autonomously printed and post-processed four test parts, including space shuttles and Benchy figurines, from a liquid plastic called PEGDA.
NVIDIA workflows connect real and synthetic data
Training foundation models for humanoid and other robots typically requires large amounts of data, noted NVIDIA. Teleoperation is one way to capture human demonstration data, but it can be expensive and time-consuming, it said.
NVIDIA announced a workflow that uses AI and Omniverse to enable developers to train robots with smaller amounts of data than previously required. First, developers use Apple Vision Pro to capture a relatively small number of teleoperated demonstrations.
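The general idea of stretching a handful of teleoperated demonstrations into a larger training set can be sketched in miniature. This is not NVIDIA’s actual Omniverse pipeline; the function name, trajectory format, and jitter-based augmentation below are hypothetical illustrations of the concept.

```python
import numpy as np

rng = np.random.default_rng(42)

def augment_demos(demos, copies_per_demo=50, noise_scale=0.01):
    """Expand a few demonstrations into many synthetic variants.

    demos: list of (T, dof) arrays of joint positions over T timesteps.
    Each copy perturbs the trajectory slightly, a simple stand-in for
    the richer variation a simulator can generate.
    """
    synthetic = []
    for traj in demos:
        for _ in range(copies_per_demo):
            jitter = rng.normal(scale=noise_scale, size=traj.shape)
            synthetic.append(traj + jitter)
    return synthetic

# Three short "demonstrations" of a 7-DoF arm stand in for captured data.
demos = [rng.uniform(-1, 1, size=(20, 7)) for _ in range(3)]
dataset = augment_demos(demos)
print(len(dataset))  # 150 synthetic trajectories from 3 demonstrations
```

In practice the multiplication happens in simulation, where lighting, object poses, and scene layout can be randomized far beyond simple trajectory noise.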