
Quantum teleportation enables the transfer of quantum information to distant locations through the use of quantum entanglement and classical communication. This concept has been realized in various quantum light systems, ranging from laboratory-based experiments to practical real-world tests. Notably, using the low-Earth-orbit Micius satellite, scientists have successfully teleported quantum information over distances exceeding 1,200 km. However, no quantum teleportation system has yet reached a rate on the order of hertz, which hinders future applications of the quantum internet.

In a paper published in Light: Science & Applications, a team of scientists led by Prof. Guangcan Guo and Prof. Qiang Zhou from the University of Electronic Science and Technology of China (UESTC), in cooperation with Prof. Lixing You from the Shanghai Institute of Microsystem and Information Technology of the Chinese Academy of Sciences, has raised the teleportation rate to 7.1 qubits per second for the first time, based on the “No. 1 Metropolitan Quantum Internet of UESTC”.

This sets a new record for quantum teleportation systems over metropolitan ranges.
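The protocol behind these experiments (share an entangled pair, make a Bell-basis measurement, send two classical bits, apply a correction) can be sketched as a small state-vector simulation. This is a generic textbook illustration, not the UESTC system; all names and values below are ours:

```python
import numpy as np

# Single-qubit gates
I = np.eye(2)
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])

def teleport(psi):
    """Return Bob's corrected qubit for each of the four measurement outcomes."""
    bell = np.array([1, 0, 0, 1]) / np.sqrt(2)       # (|00> + |11>)/sqrt(2), qubits 1 and 2
    state = np.kron(psi, bell)                        # qubit 0 carries the message

    # Alice's Bell-basis measurement, decomposed as CNOT then Hadamard
    cnot = np.array([[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 0, 1], [0, 0, 1, 0]])
    state = np.kron(cnot, I) @ state                  # qubit 0 controls qubit 1
    state = np.kron(H, np.kron(I, I)) @ state         # Hadamard on qubit 0

    recovered = {}
    amps = state.reshape(2, 2, 2)                     # index [q0, q1, q2]
    for m0 in (0, 1):
        for m1 in (0, 1):
            bob = amps[m0, m1, :].copy()
            bob = bob / np.linalg.norm(bob)           # renormalize after "measurement"
            if m1:                                    # corrections driven by the two
                bob = X @ bob                         # classical bits Alice sends
            if m0:
                bob = Z @ bob
            recovered[(m0, m1)] = bob
    return recovered
```

For every one of the four measurement outcomes, Bob's corrected qubit matches the original message state with fidelity 1, which is why only two classical bits need to travel.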

Quantum mechanics (QM) is derived on the basis of a universe composed solely of events, for example, outcomes of observables. Such an event universe is represented by a dendrogram (a finite tree) and, in the limit of infinitely many events, by a p-adic tree. The trees are endowed with an ultrametric expressing hierarchical relationships between events, and all events are coupled through the tree structure. This holistic picture of event-processes was formalized within Dendrographic Hologram Theory (DHT). The present paper is devoted to the emergence of QM from DHT. We used a generalization of the QM-emergence scheme developed by Smolin: following this scheme, we quantized not the events themselves but the differences between them, and through analytic derivation arrived at Bohmian mechanics.
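For readers unfamiliar with the term, the ultrametric mentioned in the abstract is a distance satisfying the strong triangle inequality, which is what makes tree-like hierarchies possible; the standard p-adic example is:

```latex
% Ultrametric: a metric d with the strong triangle inequality
d(x,z) \le \max\bigl(d(x,y),\, d(y,z)\bigr) \quad \forall\, x,y,z .

% Standard example: the p-adic distance on the rationals,
% where v_p(x) is the exponent of the prime p in x
|x|_p = p^{-v_p(x)}, \qquad d_p(x,y) = |x-y|_p .
```

Under such a distance, every triangle is isosceles with the two longest sides equal, so points cluster into nested balls, exactly the hierarchical structure a dendrogram encodes.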

Neurotechnology will improve our lives in many ways. However, to sustain a world where our neurobiological data (in some cases perhaps including our innermost thoughts and feelings) remains properly secure, we must invest in both policy and technology that prevent bad actors from stealing private information or even directly manipulating people’s brains. We don’t want the very real possibility of ‘telepathy’ and ‘mind control’ to harm people and society. So, let’s start laying the groundwork now to ensure the best possible neurotech future!


We provide a Perspective highlighting the significant ethical implications of the use of fast-developing neurotechnologies in humans, as well as the regulatory frameworks and guidelines needed to protect neurodata and mental privacy.

Neurons, the main cells that make up our brain and spinal cord, are among the slowest cells to regenerate after an injury, and many neurons fail to regenerate entirely. While scientists have made progress in understanding neuronal regeneration, it remains unknown why some neurons regenerate and others do not.

Using single-cell RNA sequencing, a method that determines which genes are activated in individual cells, researchers from University of California San Diego School of Medicine have identified a new biomarker that can be used to predict whether or not neurons will regenerate after an injury. Testing their discovery in mice, they found that the biomarker was consistently reliable in predicting the regeneration potential of neurons.


Researchers from University of California San Diego have identified a new biomarker that can predict whether or not neurons will regenerate after an injury. The findings could help scientists develop regenerative therapies for spinal cord injuries and other neurological conditions.

Before epilepsy was understood to be a neurological condition, people believed it was caused by the moon, or by phlegm in the brain. They condemned seizures as evidence of witchcraft or demonic possession, and killed or castrated sufferers to prevent them from passing tainted blood to a new generation.

Today we know epilepsy is a disease. By and large, it’s accepted that a person who causes a fatal traffic accident while in the grip of a seizure should not be charged with murder.

That’s good, says Stanford University neurobiologist Robert Sapolsky. That’s progress. But there’s still a long way to go.

After more than 40 years studying humans and other primates, Sapolsky has reached the…


Autonomous shopping carts that follow grocery store customers and robots that pick ripe cucumbers faster than humans may grab headlines, but the most compelling applications of AI and ML technology are behind the scenes. Increasingly, organizations are finding substantial efficiency gains by applying AI- and ML-powered tools to back-office procedures such as document processing, data entry, employee onboarding, and workflow automation.

The power of automation to augment productivity in the back office has been clear for decades, but the recent emergence of advanced AI and ML tools offers a step change in what automation can accomplish, including in highly regulated industries such as health care.

https://informatech.co/3Fv2


State-sponsored threat actors from Russia and China continue to exploit the remote code execution (RCE) WinRAR vulnerability in unpatched systems to deliver malware to targets.

Researchers at Google’s Threat Analysis Group (TAG) have been tracking attacks in recent weeks that exploit CVE-2023-38831 to deliver infostealers and backdoor malware, particularly to organizations in Ukraine and Papua New Guinea. The flaw is a known and patched vulnerability in RarLab’s popular WinRAR file archiver tool for Windows, but systems that haven’t been updated remain vulnerable.

“TAG has observed government-backed actors from a number of countries exploiting the WinRAR vulnerability as part of their operations,” Kate Morgan from Google TAG wrote in a blog post.
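Since the flaw was fixed in WinRAR 6.23, a defender's first step is an inventory check of installed versions. A minimal sketch, assuming you already collect version strings from your fleet (the hosts and versions below are made up for illustration):

```python
# Flag WinRAR installs older than 6.23, the release that patched
# CVE-2023-38831. Version strings would come from your own inventory
# tooling; the host names and versions here are illustrative only.

PATCHED = (6, 23)

def is_vulnerable(version: str) -> bool:
    """Return True if this WinRAR version predates the 6.23 fix."""
    major, minor = (int(part) for part in version.split(".")[:2])
    return (major, minor) < PATCHED

inventory = {"host-a": "6.22", "host-b": "6.24", "host-c": "5.90"}
for host, ver in inventory.items():
    status = "VULNERABLE" if is_vulnerable(ver) else "patched"
    print(f"{host}: WinRAR {ver} -> {status}")
```

The tuple comparison keeps the check correct across major-version boundaries (e.g. 5.90 sorts below 6.23), which a naive string comparison would get wrong.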

Researchers from University of California San Diego School of Medicine have used single-cell RNA sequencing (scRNA-seq) to identify a pattern of gene expression that can be used to predict whether or not neurons will regenerate after an injury. Tests in mice showed that this “Regeneration Classifier” was consistently reliable in predicting the regeneration potential of neurons across the nervous system and at different developmental stages. Conditional gene deletion then validated a role for NFE2L2 (or NRF2), a master regulator of antioxidant response, in corticospinal tract regeneration.

“Single-cell sequencing technology is helping us look at the biology of neurons in much more detail than has ever been possible, and this study really demonstrates that capability,” said senior author Binhai Zheng, PhD, professor in the Department of Neurosciences at UC San Diego School of Medicine. “What we’ve discovered here could be just the beginning of a new generation of sophisticated biomarkers based on single-cell data.” Zheng and colleagues reported on their findings in Neuron, in a paper titled “Deep scRNA sequencing reveals a broadly applicable Regeneration Classifier and implicates antioxidant response in corticospinal axon regeneration.” In their paper the team concluded, “Our data demonstrate a universal transcriptomic signature underlying the regenerative potential of vastly different neuronal populations and illustrate that deep sequencing of only hundreds of phenotypically identified neurons has the power to advance regenerative biology.”
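As a toy illustration of the general idea behind such a transcriptomic classifier (not the authors' actual pipeline, and with synthetic data standing in for real scRNA-seq profiles), a linear model can be fit over per-cell gene-expression vectors to predict a regenerate/fail label:

```python
import numpy as np

# Synthetic stand-in for an expression matrix: 200 cells x 50 genes.
# "Regenerating" cells (label 1) get elevated expression of the first
# 5 "signature" genes; everything here is invented for illustration.
rng = np.random.default_rng(0)
n_cells, n_genes = 200, 50
X = rng.normal(size=(n_cells, n_genes))
y = rng.integers(0, 2, size=n_cells)
X[y == 1, :5] += 2.0

# Plain logistic regression trained by gradient descent
w = np.zeros(n_genes)
b = 0.0
for _ in range(500):
    p = 1 / (1 + np.exp(-(X @ w + b)))     # predicted P(cell regenerates)
    w -= 0.5 * (X.T @ (p - y) / n_cells)   # gradient of mean log-loss
    b -= 0.5 * float(np.mean(p - y))

p = 1 / (1 + np.exp(-(X @ w + b)))
accuracy = float(np.mean((p > 0.5) == y))
```

The learned weights concentrate on the signature genes, which is the sense in which a gene-expression pattern, rather than any single gene, acts as the biomarker.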

Neurons are among the slowest cells to regenerate after an injury. While scientists have made progress in understanding neuronal regeneration, it remains unknown why some neurons regenerate and others do not.

Transformers are machine learning models designed to uncover and track patterns in sequential data, such as text sequences. In recent years, these models have become increasingly sophisticated, forming the backbone of popular conversational platforms, such as ChatGPT.

While existing transformers have achieved good results in a variety of tasks, their performance often declines significantly when processing longer sequences. This is due to their limited storage capacity, that is, the small amount of data they can store and analyze at once.

Researchers at Sungkyunkwan University in South Korea recently developed a new memory system that could help improve the performance of transformers on tasks characterized by longer data sequences. This system, introduced in a paper published on the arXiv preprint server, is inspired by a prominent theory of human memory known as Hebbian theory.
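Hebbian theory holds that connections strengthen when units co-activate ("cells that fire together wire together"). A minimal associative-memory sketch in that spirit, storing key-value pairs as outer products, gives the flavor; this is a generic illustration, not the architecture from the paper:

```python
import numpy as np

d, n_pairs = 64, 8
rng = np.random.default_rng(1)

# 8 orthonormal keys (rows) plus random value vectors to associate with them
keys = np.linalg.qr(rng.normal(size=(d, d)))[0][:, :n_pairs].T
values = rng.normal(size=(n_pairs, d))

# Hebbian storage: each co-activation of (key, value) strengthens the
# corresponding connections via an outer-product update to one matrix M.
M = np.zeros((d, d))
for k, v in zip(keys, values):
    M += np.outer(v, k)

# Retrieval is a single matrix-vector product with the query key
recalled = M @ keys[3]
```

Because the keys are orthonormal, the cross-terms cancel and `recalled` reproduces `values[3]` exactly; with merely near-orthogonal keys retrieval degrades gracefully, which is the capacity limit such memory systems aim to manage.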