With a grim prognosis hanging overhead, doctors and scientists at universities and institutions across the U.S. worked tirelessly to develop the world’s first custom gene-editing therapy to save the life of a newborn.
Get ready for GENIUS… I’m with STARGATE Genius! Join me and Dominic Dirkx, aerospace engineer and assistant professor at @tudelft, as we discuss distant m…
🔮 Step into a world of infinite possibilities with “The Maker of Universes” by Philip José Farmer—a groundbreaking science fiction novel that explores the boundaries of reality, mythology, and the human psyche across multiple universes. 🚀🌌

In this visionary tale, we follow the story of Robert Wolff, an Earthling who finds himself transported to the mysterious world of Tiers, a place where gods and mortals coexist, and where the boundaries between myth and reality blur. As Robert navigates this surreal landscape, encountering enigmatic beings, ancient prophecies, and cosmic mysteries, he discovers his role as a pivotal figure in the fabric of Tiers’ existence.

Join us as we delve into the rich tapestry of “The Maker of Universes.” From the awe-inspiring landscapes and mythological creatures to the philosophical questions about the nature of creation and existence, Farmer’s novel offers a mind-bending exploration of alternate realities and the power of imagination.

Our video immerses you in the fantastical realms of “The Maker of Universes,” delving into the themes of identity, destiny, and the interplay between myth and reality. We explore the intricate world-building, the diverse cultures, and the profound insights into human nature that make this novel a timeless gem in the science fiction and fantasy genre.

Whether you’re a fan of epic adventures, mythic storytelling, or tales that challenge your perceptions of reality, “The Maker of Universes” promises an enthralling and thought-provoking reading experience that will transport you to realms beyond imagination. Prepare to unlock the secrets of the multiverse and embark on a journey of cosmic discovery with Philip José Farmer’s visionary masterpiece! 🌌📘

#PhilipJoséFarmer #TheMakerOfUniverses #ScienceFantasy #MultiverseAdventure #MythologicalExploration
Toothed whales use sound to find their way around, detect objects, and catch fish. They can investigate their environment by making clicking sounds, and then decoding the “echoic return signal” created when the clicking sounds bounce off objects and return to their ears. This “biosonar,” called echolocation, is rare in the animal kingdom.
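The physics behind that return signal is simple round-trip timing: the distance to a target is half the echo delay multiplied by the speed of sound in seawater (roughly 1,500 m/s). Here is a minimal Python sketch of that calculation, using assumed example numbers rather than figures from the study:

```python
# Illustrative sketch of the range calculation behind echolocation:
# a click travels to a target and back, so distance = (speed * delay) / 2.
# The speed of sound and the delay below are assumed example values.

SPEED_OF_SOUND_SEAWATER = 1500.0  # meters per second, a typical value

def target_range(echo_delay_s: float) -> float:
    """Distance to a target given the round-trip echo delay in seconds."""
    return SPEED_OF_SOUND_SEAWATER * echo_delay_s / 2.0

# A click whose echo returns after 40 ms implies a target about 30 m away.
print(f"{target_range(0.040):.1f} m")  # -> 30.0 m
```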
Now, a new study by researchers at the Woods Hole Oceanographic Institution, New College of Florida, UC Berkeley, and Oxford University, published in PLOS One, brings us closer to understanding how dolphin brains have evolved to support echolocation.
The research team applied new network-mapping techniques to the excised brains of dead, stranded cetaceans to compare the auditory pathways of echolocating dolphins with those of a non-echolocating baleen whale, the sei whale. A partnership with the International Fund for Animal Welfare (IFAW) and others is critical to advancing this work.
IBM has just unveiled its boldest quantum computing roadmap yet: Starling, the first large-scale, fault-tolerant quantum computer—coming in 2029. Capable of running 20,000X more operations than today’s quantum machines, Starling could unlock breakthroughs in chemistry, materials science, and optimization.
According to IBM, this is not just a pie-in-the-sky roadmap: they actually have the ability to make Starling happen.
In this exclusive conversation, I speak with Jerry Chow, IBM Fellow and Director of Quantum Systems, about the engineering breakthroughs that are making this possible… especially a radically more efficient error correction code and new multi-layered qubit architectures.
We cover:
- The shift from millions of physical qubits to manageable logical qubits.
- Why IBM is using quantum low-density parity check (qLDPC) codes (a toy classical sketch follows this list).
- How modular quantum systems (like Kookaburra and Cockatoo) will scale the technology.
- Real-world quantum-classical hybrid applications already happening today.
- Why now is the time for developers to start building quantum-native algorithms.
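To give a flavor of the parity-check idea behind that second bullet, here is a tiny classical analogue in Python. IBM’s qLDPC codes are quantum generalizations of this idea; the sketch below is an intuition-building toy, not their actual code:

```python
import numpy as np

# A tiny *classical* low-density parity-check example: each row of H is
# a parity constraint that touches only a few bits ("low density").
# This is NOT IBM's qLDPC code, just a classical analogue for intuition.
H = np.array([
    [1, 1, 0, 1, 0, 0],
    [0, 1, 1, 0, 1, 0],
    [1, 0, 1, 0, 0, 1],
])

def syndrome(word: np.ndarray) -> np.ndarray:
    """All-zero syndrome means every parity check is satisfied."""
    return H @ word % 2

codeword = np.array([1, 0, 1, 1, 1, 0])
print(syndrome(codeword))   # -> [0 0 0]: a valid codeword

corrupted = codeword.copy()
corrupted[2] ^= 1           # flip one bit
print(syndrome(corrupted))  # -> [0 1 1]: the failing checks locate the error
```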
00:00 Introduction to the Future of Computing.
01:04 IBM’s Jerry Chow.
01:49 Quantum Supremacy.
02:47 IBM’s Quantum Roadmap.
04:03 Technological Innovations in Quantum Computing.
05:59 Challenges and Solutions in Quantum Computing.
09:40 Quantum Processor Development.
14:04 Quantum Computing Applications and Future Prospects.
20:41 Personal Journey in Quantum Computing.
24:03 Conclusion and Final Thoughts.
A groundbreaking recent development by scientists from the U.S. National Science Foundation (NSF) National Solar Observatory (NSO) and the New Jersey Institute of Technology (NJIT) is changing that by using adaptive optics to remove the blur introduced by Earth’s turbulent atmosphere.
From smartphones and TVs to credit cards, technologies that manipulate light, many of them based on holography, are deeply embedded in our daily lives. However, conventional holographic technologies have faced limitations, particularly in displaying multiple images on a single screen and in maintaining high-resolution image quality.
Recently, a research team led by Professor Junsuk Rho at POSTECH (Pohang University of Science and Technology) has developed a groundbreaking metasurface technology that can display up to 36 high-resolution images on a surface thinner than a human hair. This research has been published in Advanced Science.
This achievement is driven by a special nanostructure known as a metasurface. Hundreds of times thinner than a human hair, the metasurface is capable of precisely manipulating light as it passes through. The team fabricated nanometer-scale pillars using silicon nitride, a material known for its robustness and excellent optical transparency. These pillars, referred to as meta-atoms, allow for fine control of light on the metasurface.
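For intuition about how a flat phase profile can encode an image at all, the classic Gerchberg-Saxton algorithm iterates between the hologram plane and the image plane, keeping the phase while re-imposing the known amplitudes. The numpy sketch below is a generic illustration of phase-only hologram computation, not the POSTECH team’s actual multiplexing method; the function and parameters are assumptions:

```python
import numpy as np

# Minimal Gerchberg-Saxton sketch for computing a phase-only hologram.
# Generic illustration only; not the POSTECH team's design method.

def gerchberg_saxton(target_amplitude: np.ndarray, iters: int = 50) -> np.ndarray:
    """Return a phase map whose far-field intensity approximates the target."""
    phase = 2 * np.pi * np.random.rand(*target_amplitude.shape)
    for _ in range(iters):
        # Propagate a unit-amplitude field from hologram to image plane.
        field_img = np.fft.fft2(np.exp(1j * phase))
        # Keep the phase, impose the target amplitude, propagate back.
        field_holo = np.fft.ifft2(target_amplitude * np.exp(1j * np.angle(field_img)))
        # Phase-only constraint at the hologram (metasurface) plane.
        phase = np.angle(field_holo)
    return phase

# Example: a 64x64 target image with a bright square in the center.
target = np.zeros((64, 64))
target[24:40, 24:40] = 1.0
holo_phase = gerchberg_saxton(target)
```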
Mitigating climate change is prompting all manner of changes, from the rapid transition to EVs to an explosion in renewables capacity. But these changes must be underpinned by a transformation of electricity grids to accommodate an energy sector that will look very different from today’s.
The current conventional wisdom on deep neural networks (DNNs) is that, in most cases, simply scaling up a model’s parameters and adopting computationally intensive architectures will result in large performance improvements. Although this scaling strategy has proven successful in research labs, real-world industrial deployments introduce a number of complications, as developers often need to repeatedly train a DNN, transmit it to different devices, and ensure it can perform under various hardware constraints with minimal accuracy loss.
The research community has thus become increasingly interested in reducing such models’ on-device storage size while also improving their runtime. Explorations in this area have tended to follow one of two avenues: reducing model size via compression techniques, or using model pruning to reduce the computational burden.
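The two avenues can be illustrated with a toy example: quantization shrinks how many bits each weight needs (compression), while magnitude pruning zeroes weights so sparse kernels can skip work. A minimal numpy sketch of both ideas in their simplest form; real systems use far more sophisticated variants:

```python
import numpy as np

# Toy illustrations of the two avenues: compression and pruning.
weights = np.random.randn(256, 256).astype(np.float32)

# 1) Compression via uniform 8-bit quantization: store small integers
#    plus one scale factor instead of 32-bit floats (~4x smaller on disk).
scale = np.abs(weights).max() / 127.0
quantized = np.round(weights / scale).astype(np.int8)
dequantized = quantized.astype(np.float32) * scale

# 2) Pruning via magnitude thresholding: zero out the smallest 90% of
#    weights so the corresponding multiplications can be skipped.
threshold = np.quantile(np.abs(weights), 0.9)
pruned = np.where(np.abs(weights) >= threshold, weights, 0.0)

print(f"quantization error: {np.abs(weights - dequantized).max():.4f}")
print(f"sparsity after pruning: {np.mean(pruned == 0):.2%}")
```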
In the new paper LilNetX: Lightweight Networks with EXtreme Model Compression and Structured Sparsification, a team from the University of Maryland and Google Research proposes a way to “bridge the gap” between the two approaches with LilNetX, an end-to-end trainable technique for neural networks that jointly optimizes model parameters for accuracy, model size on disk, and computation on any given task.
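In spirit, jointly optimizing accuracy, disk size, and computation means training against a single objective that adds a penalty for each. The sketch below uses stand-in penalties (an L1 term as a rate proxy and a per-channel group norm for structured sparsity) and illustrative coefficients; it conveys the flavor of such joint objectives, not LilNetX’s exact formulation:

```python
import torch
import torch.nn.functional as F

# Sketch of a joint objective in the spirit of LilNetX. The penalty
# terms and coefficients are illustrative stand-ins, not the paper's
# actual rate/sparsity losses.

def joint_loss(logits, targets, params, lam_rate=1e-5, lam_group=1e-5):
    task = F.cross_entropy(logits, targets)  # accuracy term
    # Rate proxy: smaller-magnitude weights compress better on disk
    # (the paper instead penalizes the entropy of quantized,
    # reparameterized weights under a learned prior).
    rate = sum(p.abs().sum() for p in params)
    # Group sparsity: push whole output channels to exactly zero so
    # they can be skipped at inference time, reducing computation.
    group = sum(p.flatten(1).norm(dim=1).sum() for p in params if p.dim() > 1)
    return task + lam_rate * rate + lam_group * group
```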