
RIKEN and Fujitsu unveil world-leading 256-qubit quantum computer

RIKEN and Fujitsu Limited have developed a 256-qubit superconducting quantum computer that will significantly expand their joint quantum computing capabilities. The system, housed at the RIKEN RQC-FUJITSU Collaboration Center on the RIKEN Wako campus, builds upon the advanced technology of the 64-qubit iteration, launched with the support of the Japanese Ministry of Education, Culture, Sports, Science and Technology (MEXT) in October 2023, and incorporates newly developed high-density implementation techniques. The new system overcomes key technical challenges, including appropriate cooling within the dilution refrigerator, achieved through that high-density implementation and cutting-edge thermal design.

This announcement marks a new step toward the practical application of superconducting quantum computers and unlocking their potential to grapple with some of the world’s most complex issues, such as the analysis of larger molecules and the implementation and demonstration of sophisticated error correction algorithms.

The organizations plan to integrate the 256-qubit superconducting quantum computer into their platform for hybrid quantum computing and offer it to companies and research institutions globally starting in the first quarter of fiscal 2025. Looking further ahead, Fujitsu and RIKEN will continue R&D efforts toward a 1,000-qubit computer scheduled for launch in 2026. For more information, see the longer press release on Fujitsu’s website.

A big data approach for next-generation battery electrolytes

Discovering new, powerful electrolytes is one of the major bottlenecks in designing next-generation batteries for electric vehicles, phones, laptops and grid-scale energy storage.

The most stable electrolytes are not always the most conductive. The most efficient batteries are not always the most stable. And so on.

“The electrolytes have to satisfy very different properties at the same time. They always conflict with each other,” said Ritesh Kumar, an Eric and Wendy Schmidt AI in Science Postdoctoral Fellow working in the Amanchukwu Lab at the University of Chicago Pritzker School of Molecular Engineering (UChicago PME).
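Finding candidates that balance these conflicting properties is a multi-objective screening problem. As a rough illustration of the trade-off Kumar describes (a minimal sketch with invented candidates and numbers, not the lab’s actual pipeline), one can filter a candidate pool down to its Pareto front: the electrolytes that no rival beats on both conductivity and stability at once.

```python
# Toy sketch, not the Amanchukwu Lab's actual pipeline: hypothetical
# electrolyte candidates scored on two conflicting objectives.
candidates = {
    # name: (ionic conductivity in mS/cm, oxidative stability in V) -- invented values
    "carbonate_blend": (10.0, 4.3),
    "ether_based": (12.0, 4.0),
    "fluorinated": (6.0, 5.6),
    "ionic_liquid": (3.0, 5.8),
    "sulfone": (5.0, 5.2),
}

def pareto_front(pool):
    """Keep candidates that no rival beats on both objectives at once."""
    front = []
    for name, (cond, stab) in pool.items():
        dominated = any(
            c >= cond and s >= stab and (c, s) != (cond, stab)
            for c, s in pool.values()
        )
        if not dominated:
            front.append(name)
    return front

print(pareto_front(candidates))  # sulfone drops out; the rest are trade-offs
```

Every survivor represents a different compromise; the sulfone entry is eliminated because the fluorinated one is better on both axes.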

Math shaken: 200-year-old algebra rule falls to Geode number discovery

A mathematician has solved a 200-year-old maths problem after figuring out a way to crack higher-degree polynomial equations without using radicals or irrational numbers.

The method developed by Norman Wildberger, PhD, an honorary professor at the School of Mathematics and Statistics at UNSW Sydney, solves one of algebra’s oldest challenges by finding a general solution to equations where the variable is raised to the fifth power or higher.

“Unbreakable Armor for Tomorrow’s Nuclear Powerhouses” as Next-Gen Reactors Boast Cutting-Edge Shielding Design to Revolutionize Safety Standards

IN A NUTSHELL
🔬 Scientists at the University of South China have developed innovative algorithms to optimize radiation shielding for next-generation nuclear reactors.
💡 The newly created algorithms, RP-NSGA and RP-MOABC, significantly improve performance by integrating a reference-point-selection strategy with established optimization techniques.
📈 Experiments demonstrated that these algorithms achieve substantial reductions in volume and …
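The “RP” in both algorithm names is that reference-point-selection strategy: among trade-off candidates, steer the search toward designs closest to a decision-maker’s preferred operating point. Below is a minimal sketch of just the selection step, with hypothetical objectives and numbers; the paper’s actual algorithms build this into NSGA- and artificial-bee-colony-style optimizers.

```python
import math

# Sketch of reference-point selection with hypothetical objectives:
# shield volume (cm^3) and transmitted dose (arbitrary units), both
# minimized. The candidate designs and the reference point are invented.
designs = [(120.0, 0.8), (150.0, 0.5), (200.0, 0.3), (300.0, 0.2)]
reference_point = (140.0, 0.4)  # decision-maker's preferred trade-off

def rp_select(pop, ref, k=2):
    """Return the k designs closest to the reference point after
    min-max normalizing each objective to [0, 1]."""
    lo = [min(d[i] for d in pop) for i in range(2)]
    hi = [max(d[i] for d in pop) for i in range(2)]
    norm = lambda d: [(d[i] - lo[i]) / (hi[i] - lo[i]) for i in range(2)]
    return sorted(pop, key=lambda d: math.dist(norm(d), norm(ref)))[:k]

print(rp_select(designs, reference_point))  # [(150.0, 0.5), (200.0, 0.3)]
```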

Non-invasive imaging modalities for diagnosing pulsatile tinnitus: a comprehensive review and recommended imaging algorithm

Pulsatile tinnitus (PT) is a challenging diagnostic condition arising from various vascular, neoplastic, and systemic disorders. Non-invasive imaging is essential for identifying underlying causes while minimizing the risks of invasive diagnostic angiography. Although no consensus exists on the primary imaging modality for PT, and CT, ultrasound, and MRI are all currently used in the diagnostic pathway, MRI is increasingly preferred as the first-line screening test for its diagnostic efficacy and safety. MRI protocols such as time-of-flight magnetic resonance angiography, diffusion-weighted imaging, and arterial spin labeling can identify serious causes, including vascular shunting lesions, venous sinus stenosis, and tumors.
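The review’s recommended imaging algorithm is a clinical flowchart rather than code, but its broad logic can be sketched schematically. The function below is illustrative only: it is not the paper’s exact algorithm and not clinical guidance.

```python
# Schematic sketch only -- not the paper's exact algorithm and not clinical
# guidance. It encodes the broad pathway described above, with MRI-based
# protocols first and other modalities for residual questions.
def next_imaging_step(mri_done: bool, mri_finding: str | None) -> str:
    if not mri_done:
        return "MRI with TOF-MRA, DWI, and ASL as first-line screening"
    if mri_finding in {"vascular shunt", "venous sinus stenosis", "tumor"}:
        return "targeted work-up of the identified cause"
    return "consider CT or ultrasound for residual questions"

print(next_imaging_step(False, None))
print(next_imaging_step(True, "venous sinus stenosis"))
```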

AI Designs Optics Hardware

A machine-learning algorithm rapidly generates designs that can be simpler than those developed by humans.

Researchers in optics and photonics rely on devices that interact with light in order to transport it, amplify it, or change its frequency, and designing these devices can be painstaking work requiring human ingenuity. Now a research team has demonstrated that the discovery of the core design concepts can be automated using machine learning, which can rapidly provide efficient designs for a wide range of uses [1]. The team hopes the approach will streamline research and development for scientists and engineers who work with optical, mechanical, or electrical waves, or with combinations of these wave types.

When a researcher needs a transducer, an amplifier, or a similar element in their experimental setup, they draw on design concepts tested and proven in earlier experiments. “There are literally hundreds of articles that describe ideas for the design of devices,” says Florian Marquardt of the University of Erlangen-Nuremberg in Germany. Researchers often adapt an existing design to their specific needs. But there is no standard procedure to find the best design, and researchers could miss out on simpler designs that would be easier to implement.
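As a rough illustration of the workflow being automated (a toy sketch, not the team’s actual machine-learning method), the design loop can be framed as: parametrize a candidate device, score it against a target response, and improve it iteratively. Here the “device” is just an invented 2×2 coupling matrix and the optimizer is finite-difference gradient descent.

```python
import numpy as np

# Illustrative sketch only, not the team's actual method: parametrize a
# device as a toy coupling matrix, score it against a target response,
# and improve it by numerical gradient descent. All numbers are made up.
target = np.array([[0.0, 1.0],
                   [1.0, 0.0]])  # hypothetical target: swap two modes

def response(c):
    """Toy 'device response': the symmetric part of the coupling matrix."""
    return (c + c.T) / 2

def loss(c):
    return np.sum((response(c) - target) ** 2)

rng = np.random.default_rng(0)
c = rng.normal(size=(2, 2))      # random initial design
eps, lr = 1e-6, 0.4

for _ in range(200):
    grad = np.zeros_like(c)
    for i in range(2):           # finite-difference gradient estimate
        for j in range(2):
            cp = c.copy()
            cp[i, j] += eps
            grad[i, j] = (loss(cp) - loss(c)) / eps
    c -= lr * grad

print(np.round(response(c), 3))  # approaches the target pattern
```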

AI model based on neural oscillations delivers stable, efficient long-sequence predictions

Researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) have developed a novel artificial intelligence (AI) model inspired by neural oscillations in the brain, with the goal of significantly advancing how machine learning algorithms handle long sequences of data.

AI often struggles with analyzing complex information that unfolds over long periods of time, such as climate trends, biological signals, or financial data. A new type of AI model, called a “state-space model,” has been designed specifically to understand these sequential patterns more effectively. However, existing state-space models often face challenges—they can become unstable or require a significant amount of computational resources when processing long data sequences.

To address these issues, CSAIL researchers T. Konstantin Rusch and Daniela Rus have developed what they call “linear oscillatory state-space models” (LinOSS), which leverage principles of forced harmonic oscillators—a concept deeply rooted in physics and observed in biological neural networks.
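The published LinOSS layers are more involved, but the core idea can be sketched in a few lines: treat each hidden unit as a forced harmonic oscillator and step it forward with a discretization that stays stable over long sequences. The toy version below is an illustration, not the paper’s implementation; the frequencies, input weights, input signal, and the choice of a symplectic Euler step are all assumptions.

```python
import numpy as np

# Toy oscillatory state-space layer: each hidden unit is a forced harmonic
# oscillator x'' = -omega^2 * x + b * u(t). The frequencies, input weights,
# input signal, and step size are all invented for illustration.
n, steps, dt = 4, 1000, 0.1
rng = np.random.default_rng(1)
omega = rng.uniform(0.5, 2.0, size=n)  # per-unit natural frequencies
b = rng.normal(size=n)                 # input weights

x = np.zeros(n)  # positions (the hidden state a readout layer would use)
v = np.zeros(n)  # velocities

for t in range(steps):
    u = np.sin(0.05 * t)                  # toy scalar input sequence
    v = v + dt * (-omega**2 * x + b * u)  # symplectic Euler: velocity first...
    x = x + dt * v                        # ...then position with the new velocity

print(np.round(x, 3))  # state stays bounded even over long rollouts
```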

Robotics researchers develop algorithms that make mobile navigation more efficient

Delivery robots made by companies such as Starship Technologies and Kiwibot autonomously make their way along city streets and through neighborhoods.

Under the hood, these robots—like most in use today—use a variety of sensors and software-based algorithms to navigate these environments.

Lidar sensors—which send out pulses of light to help calculate the distances of objects—have become a mainstay, enabling these robots to conduct simultaneous localization and mapping, otherwise known as SLAM.
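In rough outline, the mapping half of SLAM converts each lidar return, a bearing and a range, into a world-frame point and marks the corresponding cell of an occupancy grid. The sketch below assumes the robot’s pose is already known, which is exactly the part a real SLAM system must estimate simultaneously; all numbers are invented.

```python
import math

# Simplified mapping step: lidar returns are (bearing, range) pairs measured
# from an assumed-known robot pose; a real SLAM system estimates that pose
# and the map simultaneously. All numbers here are invented.
robot_x, robot_y, robot_heading = 2.0, 2.0, math.radians(30)
scan = [(math.radians(a), r) for a, r in [(0, 1.5), (45, 2.0), (90, 0.8)]]

CELL = 0.5        # occupancy-grid resolution in meters
occupied = set()  # grid cells containing an obstacle

for bearing, dist in scan:
    # convert the return to a point in world coordinates
    wx = robot_x + dist * math.cos(robot_heading + bearing)
    wy = robot_y + dist * math.sin(robot_heading + bearing)
    occupied.add((int(wx // CELL), int(wy // CELL)))

print(sorted(occupied))
```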

Mathematician solves algebra’s oldest problem

Most people’s experiences with polynomial equations don’t extend much further than high school algebra and the quadratic formula. Still, these numeric puzzles remain a foundational component of everything from calculating planetary orbits to computer programming. Although solving lower-order polynomials—where the x in an equation is raised to at most the fourth power—is often a simple task, things get complicated once you start seeing powers of five or greater. For centuries, mathematicians accepted this as simply an inherent challenge to their work, but not Norman Wildberger. According to his new approach detailed in The American Mathematical Monthly, there’s a much more elegant approach to higher-order polynomials—all you need to do is get rid of pesky notions like irrational numbers.

Babylonians first conceived of two-degree polynomials around 1800 BCE, but it took until the 16th century for mathematicians to extend the concept to three- and four-degree variables using root numbers, also known as radicals. Polynomials remained there for another two centuries, with larger examples stumping experts until 1832. That year, French mathematician Évariste Galois finally illustrated why this was such a problem—the underlying mathematical symmetry in the established methods for lower-order polynomials simply became too complicated for degree five or higher. For Galois, this meant there just wasn’t a general formula available for them.

Mathematicians have since developed approximate solutions, but they require integrating concepts like irrational numbers into the classical formula.
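To make the distinction concrete, the sketch below approximates the real root of a quintic using only rational arithmetic, so no radical or irrational number appears at any step. It uses plain Newton iteration rather than Wildberger’s series construction, purely to illustrate the “rational operations only” flavor of the result.

```python
from fractions import Fraction

# Illustration of radical-free arithmetic, not Wildberger's actual series:
# Newton iteration on x^5 - x - 1, a quintic that is not solvable by
# radicals. Starting from a rational guess, every iterate is an exact
# fraction -- no roots are ever taken.
def p(x):
    return x**5 - x - 1

def dp(x):
    return 5 * x**4 - 1

x = Fraction(6, 5)  # rational starting guess
for _ in range(5):
    x = x - p(x) / dp(x)  # rational in, rational out

print(float(x))  # ~1.1673039783, the quintic's real root
```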

How researchers discovered specific brain cells that enable intelligent behavior

For decades, neuroscientists have developed mathematical frameworks to explain how brain activity drives behavior in predictable, repetitive scenarios, such as while playing a game. These algorithms have not only described brain cell activity with remarkable precision but also helped develop artificial intelligence with superhuman achievements in specific tasks, such as playing Atari or Go.
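Those frameworks are largely reinforcement-learning models. The toy temporal-difference update below illustrates the general class, not the study’s actual model: it shows the kind of value prediction such algorithms make in a repetitive task, here an invented five-state chain.

```python
# Toy temporal-difference (TD) learning on a five-state chain, illustrating
# the class of algorithms described above, not the study's actual model.
values = {s: 0.0 for s in range(5)}  # value estimate per state
alpha, gamma = 0.1, 0.9              # learning rate, discount factor

for episode in range(1000):
    s = 0
    while s < 4:                     # walk right until the terminal state 4
        s_next = s + 1
        reward = 1.0 if s_next == 4 else 0.0
        # TD(0): nudge V(s) toward reward + discounted value of the next state
        values[s] += alpha * (reward + gamma * values[s_next] - values[s])
        s = s_next

print({s: round(v, 2) for s, v in values.items()})  # values rise toward the goal
```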

Yet these frameworks fall short of capturing the essence of human and animal behavior: our extraordinary ability to generalize, infer and adapt. Our study, published in Nature late last year, provides insights into how specific brain cells in mice enable this more complex, intelligent behavior.

Unlike machines, humans and animals can flexibly navigate new challenges. Every day, we solve new problems by generalizing from our knowledge or drawing from our experiences. We cook new recipes, meet new people, take a new path—and we can imagine the aftermath of entirely novel choices.