
A nanotechnology material called graphene has captured attention worldwide, with many scientists dubbing it the latest “wonder material” with the potential to have an enormous human impact.

Graphene’s structure, a sheet of carbon atoms just one atom thick, gives it properties that make it a strong contender to revolutionize many industries.

It’s often regarded as the thinnest and strongest material discovered so far, showing flexibility that few other materials can match. Its potential uses range from improving electronic devices to creating better ways to clean water.

Cellular death is a fundamental concept in biological sciences. Despite its importance, its definition varies depending on the context in which it occurs and lacks a general mathematical definition.

Researchers from the University of Tokyo propose a new mathematical definition of death based on whether a potentially dead cell can return to a predefined “representative state of living,” that is, a state we can confidently call “alive.” The work could prove useful both to biologists and to future medical research.
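The reachability idea can be sketched with a toy dynamical system. Everything below is purely illustrative, not the researchers’ actual formalism: a cell’s state is reduced to a single number with two attractors, and “dead” means the dynamics can no longer carry the state back to the living attractor.

```python
# Toy sketch (not the paper's actual formalism): model a cell's state as a
# scalar x with bistable dynamics. x = 1 plays the role of the
# "representative state of living"; x = 0 is an absorbing "dead" state.
def simulate(x0, steps=5000, dt=0.01):
    """Integrate dx/dt = -x(x - 0.5)(x - 1), which has attractors at 0 and 1."""
    x = x0
    for _ in range(steps):
        x += dt * (-x * (x - 0.5) * (x - 1.0))
    return x

def is_dead(x0, tol=1e-2):
    """Dead = the state does not return to the living state x = 1."""
    return abs(simulate(x0) - 1.0) > tol

print(is_dead(0.8))  # recovers to the living attractor -> False
print(is_dead(0.2))  # falls to the dead attractor -> True
```

The threshold at x = 0.5 is the toy analogue of a point of no return: states on one side can still reach the “representative state of living,” states on the other side cannot.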

While it’s not something we like to think about, death eventually comes for every organism, whether animal, plant, or even a single cell. And even though we can all tell the difference between something alive and something dead, it may be surprising that death at the cellular level lacks a widely recognized mathematical definition.

Google on Monday announced Willow, its latest, greatest quantum computing chip. Google’s claims about the chip’s speed and reliability were newsworthy in themselves, but what really caught the tech industry’s attention was an even wilder claim tucked into the blog post announcing it.

Google Quantum AI founder Hartmut Neven wrote in his blog post that the chip was so mind-bogglingly fast that it must have borrowed computational power from other universes.

Ergo, the chip’s performance indicates that parallel universes exist and that “we live in a multiverse.”

While we have established, using rank-based methods, that the simulated annealing algorithm outperforms other randomization techniques in preserving the empirical network’s strength sequence, we have not quantified how well the different models preserve the strength distribution. The degree to which the empirical strength distribution is preserved in a null network is crucial, because it ensures an accurate representation of influential graph features, such as hubs, whose importance is intricately tied to characteristics of the distribution.

To assess the goodness of fit between the strength distributions of the empirical and the randomized structural networks, we superimpose their cumulative distribution functions (Fig. 2b and Supplementary Fig. 8). Across all datasets, the curves produced via simulated annealing show the best match to the empirical strength cumulative distribution function with almost perfect superposition. Furthermore, the curves obtained using the Rubinov–Sporns and the Maslov–Sneppen algorithms show considerably more variability across null networks as shown by their wider spread, recapitulating previously observed patterns of underestimation and overestimation across datasets (see ‘Null model calibration’ section in Supplementary Information). To confirm these observations quantitatively, we compute Kolmogorov–Smirnov test statistics between the cumulative strength distributions of the empirical and each randomized network, measuring the maximum distance between them (Fig. 2b and Supplementary Fig. 8). Across all datasets, the simulated annealing algorithm outperforms the other two null models with significantly lower Kolmogorov–Smirnov statistics (P ≈ 0, CLES of 100% for all two-tailed, Wilcoxon–Mann–Whitney two-sample rank-sum tests). Furthermore, in the HCP dataset and the higher resolution Lausanne network, the Rubinov–Sporns algorithm generated cumulative strength distributions with slightly worse correspondence to the empirical distribution than the cumulative strength distributions yielded by the Maslov–Sneppen algorithm (LAU, high resolution: P < 10−176, CLES of 61.58%; HCP: P ≈ 0, CLES of 100% for all empirical networks, two-tailed, Wilcoxon–Mann–Whitney two-sample rank-sum test).
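The Kolmogorov–Smirnov statistic used here is simply the maximum vertical distance between two empirical cumulative distribution functions. A minimal sketch of that computation, on synthetic node-strength data (a generic implementation for illustration, not the authors’ code):

```python
import numpy as np

def ks_statistic(x, y):
    """Maximum vertical distance between the empirical CDFs of x and y."""
    grid = np.sort(np.concatenate([x, y]))
    cdf_x = np.searchsorted(np.sort(x), grid, side="right") / len(x)
    cdf_y = np.searchsorted(np.sort(y), grid, side="right") / len(y)
    return np.max(np.abs(cdf_x - cdf_y))

# Hypothetical node strengths for an empirical network and one null network
rng = np.random.default_rng(0)
empirical = rng.lognormal(mean=0.0, sigma=1.0, size=200)
null_net = rng.lognormal(mean=0.1, sigma=0.9, size=200)

print(ks_statistic(empirical, null_net))  # smaller = better preserved
```

A null model that preserves the strength distribution well drives this statistic toward zero, which is the pattern the paper reports for simulated annealing.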

As an illustration, we consider whether the nulls generated by different algorithms recapitulate fundamental characteristics associated with the empirical strength distribution. Namely, we focus on the heavy tailedness of the strength distribution (that is, does the null network also have a heavy-tailed strength distribution, suggesting the presence of hubs?) and the spatial location of high-strength hub nodes. We assess heavy tailedness and identify hubs using the nonparametric procedure outlined in refs. 73,74 (see Methods for more details).
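For intuition only, hub nodes can be flagged from the strength distribution with a simple outlier rule. The paper itself identifies hubs with the nonparametric procedure of refs. 73,74; the interquartile-range cutoff below is a stand-in assumption, as is the synthetic network:

```python
import numpy as np

def node_strengths(W):
    """Node strength = total weight of all edges incident to the node."""
    return W.sum(axis=1)

def simple_hubs(W, k=1.5):
    """Flag nodes whose strength exceeds Q3 + k * IQR.
    NB: a generic outlier rule used as a stand-in; the paper identifies
    hubs with the nonparametric procedure of refs. 73,74."""
    s = node_strengths(W)
    q1, q3 = np.percentile(s, [25, 75])
    return np.flatnonzero(s > q3 + k * (q3 - q1))

# Synthetic weighted network: 50 nodes, three of them inflated into hubs
rng = np.random.default_rng(1)
W = rng.random((50, 50))
W[:3] *= 5                # boost rows 0-2 so they become high-strength nodes
W = np.maximum(W, W.T)    # symmetrize (undirected network)
np.fill_diagonal(W, 0)    # no self-loops
print(simple_hubs(W))     # the three boosted nodes stand out
```

A null network "recapitulates heavy tailedness" in this sketch's terms if it also produces a small set of nodes far out in the right tail of the strength distribution.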

Researchers have created nearly freestanding nanostructured two-dimensional (2D) gold monolayers, an impressive feat of nanomaterial engineering that could open up new avenues in catalysis, electronics, and energy conversion.

The research has been published in Nature Communications.

Gold is an inert metal which typically forms a solid three-dimensional (3D) structure. However, in its 2D form, it can unlock extraordinary properties, such as unique electronic behaviors, enhanced surface reactivity, and immense potential for revolutionary applications in catalysis, electronics, and energy conversion.

Quantum computers differ fundamentally from classical ones. Instead of using bits (0s and 1s), they employ “qubits,” which can exist in multiple states simultaneously due to quantum phenomena like superposition and entanglement.
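Superposition can be made concrete with a little linear algebra: a qubit is a pair of complex amplitudes whose squared magnitudes give measurement probabilities, and the standard Hadamard gate turns |0⟩ into an equal superposition. A minimal sketch using numpy matrices:

```python
import numpy as np

# A qubit state is a pair of complex amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1; measuring yields 0 or 1 with those probabilities.
ket0 = np.array([1, 0], dtype=complex)

# The Hadamard gate puts |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

state = H @ ket0
probs = np.abs(state) ** 2
print(probs)  # each outcome has probability 0.5
```

Unlike a classical bit, the state carries both amplitudes at once until it is measured, which is the property the article's "multiple states simultaneously" phrasing refers to.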

For a quantum computer to simulate dynamic processes or process data, among other essential tasks, it must translate complex input data into “quantum data” that it can understand. This process is known as quantum compilation.

Essentially, quantum compilation “programs” the quantum computer by converting a particular goal into an executable sequence. Just as a GPS app converts your desired destination into a sequence of actionable steps you can follow, quantum compilation translates a high-level goal into a precise sequence of quantum operations that the quantum computer can execute.
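Continuing the GPS analogy, a compiled quantum program can be pictured as an ordered list of gate operations. The toy sketch below “compiles” the high-level goal “entangle two qubits” into a hypothetical two-gate sequence and checks the result by direct matrix multiplication; it is an illustration of the idea, not a real compiler:

```python
import numpy as np

# A "compiled program" here is just an ordered list of gate matrices;
# executing it means applying them in sequence to the input state.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
I = np.eye(2, dtype=complex)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# High-level goal: entangle two qubits into a Bell state.
# Hypothetical compiler output: Hadamard on qubit 0, then CNOT(0 -> 1).
program = [np.kron(H, I), CNOT]

state = np.array([1, 0, 0, 0], dtype=complex)  # start in |00>
for gate in program:
    state = gate @ state

print(np.round(state, 3))  # equal amplitudes on |00> and |11>
```

Real compilers face the same task at scale: decomposing an abstract target operation into sequences drawn from the limited gate set the hardware can actually execute.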