
A new miracle drug could increase the human lifespan by up to 200 years. Dr. Andrew Steele, a British computational biologist, recently published a new book on human longevity. In it, he argues that it is entirely feasible for humans to live well beyond 100 years thanks to a new type of drug.

Intel (INTC) is the bearer of additional bad news.

The chip giant is delivering another blow to consumers and businesses already worried about the health of the economy. For several weeks, consumers have watched their bills for groceries and other products rise, and the price of gasoline has jumped every time they fill up at the pump.

And the situation is not getting any better: inflation remains at a forty-year high, which is expected to push the Federal Reserve to raise rates even more aggressively. Economists, however, have already warned that this monetary policy would plunge the economy into recession.

In the 2016 sci-fi movie “Arrival,” a linguist and a theoretical physicist race against time to communicate with endangered extraterrestrial heptapods wishing to share their wisdom and technologies with the human race so it will survive and one day return the favor.

At the University of California, Berkeley, a real and more down-to-earth mission to decode an unknown form of communication is underway. Linguist Gasper Begus and computer scientist Shafi Goldwasser are part of an international team of researchers attempting interspecies communication with sperm whales by deciphering the animals’ deafening, 200-plus-decibel clicking sounds, or codas.

They are among the key members of the Cetacean Translation Initiative (CETI), a newly launched, five-year multidisciplinary project aimed at cracking whales’ Morse code-like communications off the Caribbean island of Dominica, to gain a deeper knowledge of the ocean’s brainiest predators and to preserve their habitat from further human disruption.
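The raw material for that effort is the rhythmic timing of clicks within each coda. As a purely illustrative sketch (not the CETI team’s actual pipeline, and with invented click timestamps), here is one simple way codas can be compared by their inter-click intervals:

```python
import numpy as np

# Illustrative only: characterize sperm whale codas by the gaps between clicks.
# The timestamps below are invented; real pipelines start from hydrophone audio.

def inter_click_intervals(click_times_s: list[float]) -> np.ndarray:
    """Gaps (in seconds) between successive clicks of one coda."""
    return np.diff(np.asarray(click_times_s))

# Two made-up codas: one with evenly spaced clicks, one that speeds up.
coda_a = [0.00, 0.20, 0.40, 0.60, 0.80]
coda_b = [0.00, 0.30, 0.50, 0.62, 0.70]

for name, coda in [("A", coda_a), ("B", coda_b)]:
    icis = inter_click_intervals(coda)
    rhythm = "regular" if np.std(icis) < 0.02 else "changing"
    print(name, np.round(icis, 2), rhythm)
```

Grouping codas by such interval patterns is one conventional way researchers describe their rhythms; the CETI effort itself works from real recordings and far more sophisticated analysis.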

Before quantum computers and quantum networks can fulfil their huge potential, scientists have several difficult problems to overcome – and a new study outlines a potential solution to one of them.

As we’ve seen in recent research, the silicon material that our existing classical computing components are made out of has shown potential for storing quantum bits, too.

These quantum bits – or qubits – are key to next-level quantum computing performance, and they come in a variety of types.
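To make that concrete, here is a minimal sketch in plain Python with NumPy (independent of any particular qubit hardware) of the standard way a single qubit’s state is written down: two complex amplitudes whose squared magnitudes give the measurement probabilities.

```python
import numpy as np

# A qubit state is a normalized pair of complex amplitudes:
# |psi> = a|0> + b|1>, with |a|^2 + |b|^2 = 1.
ket0 = np.array([1.0, 0.0], dtype=complex)                  # the |0> state
hadamard = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

psi = hadamard @ ket0                    # equal superposition of |0> and |1>
probabilities = np.abs(psi) ** 2         # Born rule: outcome probabilities

print(psi)            # [0.70710678+0.j 0.70710678+0.j]
print(probabilities)  # [0.5 0.5] -> a 50/50 chance of reading 0 or 1
```

Whatever the physical implementation, whether a silicon spin, a superconducting circuit or a trapped ion, the mathematics of the state is the same; the hardware differences lie in how reliably those amplitudes can be stored and manipulated.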

Moore’s law has driven the semiconductor industry to continue downscaling the critical size of transistors to improve device density. At the beginning of this century, traditional scaling started to encounter bottlenecks. The industry has successively developed strained Si/Ge, high-k/metal gate, and FinFET technologies, enabling Moore’s law to continue.

Now the critical size of FETs is down to 7 nm, which means almost 7 billion transistors per square centimeter on a single chip and brings huge challenges for fin-type structures and nanomanufacturing methods. Extreme ultraviolet lithography is already used in some critical steps, but it faces alignment-precision issues and high costs in high-volume manufacturing.
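For a sense of the numbers, Moore’s law can be written as a simple doubling formula. The short sketch below is purely illustrative: the starting density and the two-year doubling period are assumptions for the example, not figures taken from the article.

```python
# Moore's law as an exponential: transistor density doubles roughly every two years.
# Starting density and doubling period are illustrative assumptions.

def transistor_density(initial_per_cm2: float, years: float,
                       doubling_period_years: float = 2.0) -> float:
    """Projected transistors per square centimeter after `years` of scaling."""
    return initial_per_cm2 * 2 ** (years / doubling_period_years)

start = 1e8  # assume ~100 million transistors per cm^2 as a starting point
for year in range(0, 13, 2):
    print(f"year {year:2d}: {transistor_density(start, year):.2e} per cm^2")
```

The point of the exercise is only that exponential density growth quickly collides with physical limits such as lithography precision, which is exactly the bottleneck the paragraph describes.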

Meanwhile, the introduction of new materials and 3D complex structures brings serious challenges for top-down methods. Newly developed bottom-up manufacturing serves as a good complementary method and provides technical driving force for nanomanufacturing.

NVIDIA introduces QODA, a new platform for hybrid quantum-classical computing, enabling easy programming of integrated CPU, GPU, and QPU systems.


The past decade has seen quantum computing leap out of academic labs into the mainstream. Efforts to build better quantum computers are proliferating at both startups and large companies. And while it is still unclear how far away we are from achieving quantum advantage on common problems, it is clear that now is the time to build the tools needed to deliver valuable quantum applications.

To start, we need to make progress in our understanding of quantum algorithms. Last year, NVIDIA announced cuQuantum, a software development kit (SDK) for accelerating simulations of quantum computing. Simulating quantum circuits using cuQuantum on GPUs enables algorithms research with performance and scale far beyond what can be achieved on quantum processing units (QPUs) today. This is paving the way for breakthroughs in understanding how to make the most of quantum computers.
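To show what that computation looks like, the sketch below is a bare-bones statevector simulation of a two-qubit circuit in plain NumPy. It is not the cuQuantum API; GPU-accelerated simulators carry out the same linear algebra for far larger circuits, but the example illustrates what simulating a quantum circuit means.

```python
import numpy as np

# Minimal statevector simulation: H on the first qubit, then a CNOT.
# A state of n qubits is a vector of 2**n complex amplitudes; gates are
# matrices applied to that vector. This is what GPU simulators scale up.

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
I = np.eye(2, dtype=complex)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

state = np.zeros(4, dtype=complex)
state[0] = 1.0                         # start in |00>

state = np.kron(H, I) @ state          # Hadamard on the first qubit
state = CNOT @ state                   # entangle: (|00> + |11>) / sqrt(2)

print(np.abs(state) ** 2)              # [0.5 0.  0.  0.5]
```

Each additional qubit doubles the size of the state vector, which is why simulating even a few dozen qubits demands the memory and throughput of large GPU systems.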

In addition to improving quantum algorithms, we also need to use QPUs to their fullest potential alongside classical computing resources: CPUs and GPUs. Today, NVIDIA is announcing the launch of Quantum Optimized Device Architecture (QODA), a platform for hybrid quantum-classical computing with the mission of enabling this utility.
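The basic shape of such a hybrid workload is a classical optimization loop wrapped around quantum circuit evaluations. The toy sketch below stands in a simulated "QPU" expectation value and a naive gradient-descent step; it is not the QODA programming model, only an illustration of how CPU-side optimization and QPU-side evaluation interleave.

```python
import numpy as np

# Toy hybrid loop: a classical optimizer tunes a circuit parameter while a
# (here simulated) quantum device returns an expectation value. The structure
# mimics variational algorithms; it is not the QODA API.

def simulated_qpu_expectation(theta: float) -> float:
    """<Z> after RY(theta) applied to |0>; stands in for a call to a QPU."""
    return float(np.cos(theta))

def classical_step(theta: float, lr: float = 0.2, eps: float = 1e-3) -> float:
    """One finite-difference gradient-descent step, run on the CPU."""
    grad = (simulated_qpu_expectation(theta + eps) -
            simulated_qpu_expectation(theta - eps)) / (2 * eps)
    return theta - lr * grad

theta = 0.1                     # initial circuit parameter
for _ in range(50):             # classical loop driving quantum evaluations
    theta = classical_step(theta)

print(theta, simulated_qpu_expectation(theta))  # approaches theta ~ pi, <Z> ~ -1
```

In a real hybrid system the expectation value would come from running a parameterized circuit on a QPU (or a GPU simulator), while the optimizer and the rest of the application logic stay on CPUs and GPUs.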

The paradox startled scientists at the U.S. Department of Energy’s (DOE) Princeton Plasma Physics Laboratory (PPPL) more than a dozen years ago. The more heat they beamed into a spherical tokamak, a magnetic facility designed to reproduce the fusion energy that powers the sun and stars, the less the central temperature increased.

Big mystery

“Normally, the more beam power you put in, the higher the temperature gets,” said Stephen Jardin, head of the theory and computational science group that performed the calculations, and lead author of a proposed explanation published in Physical Review Letters. “So this was a big mystery: Why does this happen?”