Fast and cheap for AI inference (responding to chat prompts with very low latency at very high speeds).

Discussing how it works, benchmarks, how it compares to other AI accelerators, and the future outlook!

Support me at Patreon ➜ / anastasiintech

Sign up for my Deep In Tech Newsletter for free! ➜ https://anastasiintech.substack.com

https://anastasiintech.com

A process that leverages capillary interactions between oligomers in an elastomeric polydimethylsiloxane substrate and deposited Ga enables the formation of Ga nanodroplets with nanoscale gaps in a single step. Gap-plasmon resonances excited within the nanogaps give rise to structural colours that can be tuned by changing the oligomer content in the substrate or by mechanical stretching.

Understanding Neuromorphic Engineering.

Neuromorphic Engineering draws inspiration from the human brain’s architecture and functioning, aiming to create electronic systems that mimic the brain’s ability to process information in a parallel, energy-efficient, and adaptable manner. Unlike traditional computing, which relies on largely sequential processing, neuromorphic systems are built from networks of artificial neurons and synapses, typically spiking and event-driven, which enables faster and more energy-efficient computation for suitable workloads.
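
To make the "parallel, event-driven" idea concrete, the sketch below simulates a single leaky integrate-and-fire neuron, one of the simplest spiking-neuron models used in neuromorphic systems. It is only an illustration: the function name and all parameter values (time constant, threshold, input current) are assumptions, not taken from any particular chip.

```python
import numpy as np

def simulate_lif(input_current, dt=1e-3, tau=20e-3,
                 v_rest=0.0, v_threshold=1.0, v_reset=0.0):
    """Simulate a leaky integrate-and-fire neuron.

    The membrane potential leaks toward its resting value and is driven by
    the input current; whenever it crosses the threshold, the neuron emits
    a spike and the potential is reset. All parameters are illustrative.
    """
    v = v_rest
    spikes = []
    for i_t in input_current:
        # Discretized leaky-integration step.
        v += (dt / tau) * (-(v - v_rest) + i_t)
        if v >= v_threshold:
            spikes.append(1)
            v = v_reset
        else:
            spikes.append(0)
    return np.array(spikes)

# Drive the neuron with a constant input; the firing rate tracks the input strength.
current = np.full(200, 1.5)
spike_train = simulate_lif(current)
print(int(spike_train.sum()), "spikes in 200 time steps")
```

Because information is carried by sparse spike events rather than dense numerical activations, large arrays of such neurons can sit idle most of the time, which is where much of the claimed energy efficiency of neuromorphic hardware comes from.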

Mimicking the Human Brain.

Popular Summary.

Unequivocally demonstrating that a quantum computer can significantly outperform any existing classical computer will be a milestone in quantum science and technology. Recently, groups at Google and at the University of Science and Technology of China (USTC) announced that they have achieved such quantum computational advantages. The central quantity of interest behind their claims is the linear cross-entropy benchmark (XEB), which has been claimed to approximate the fidelity of their quantum experiments and has been used to certify the correctness of their computation results. However, such claims rely on several assumptions, some of which are only implicit. Hence, it is critical to understand when and how XEB can be used for quantum advantage experiments. By combining various tools from computer science, statistical physics, and quantum information, we critically examine the properties of XEB and show that XEB bears several intrinsic vulnerabilities, limiting its utility as a benchmark for quantum advantage.
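
For context, the linear XEB at the center of these claims is a simple statistic of the sampled bitstrings: F_XEB = 2^n ⟨p_ideal(x_i)⟩ − 1, which is close to 1 for a noiseless device sampling a typical random circuit and close to 0 for uniformly random guesses. The sketch below is a minimal illustration, not code from the paper; the function name and the toy probabilities are assumptions, and obtaining p_ideal for a large circuit is itself the classically expensive step.

```python
import numpy as np

# Linear cross-entropy benchmark (XEB), as used in the random-circuit-sampling
# experiments:
#   F_XEB = 2^n * <p_ideal(x_i)>_i - 1,
# where the average runs over the bitstrings x_i returned by the device and
# p_ideal(x) is the ideal (noiseless) output probability of the circuit.

def linear_xeb(samples, ideal_probs, n_qubits):
    """Estimate F_XEB from device samples and ideal output probabilities."""
    p = np.array([ideal_probs[x] for x in samples])
    return (2 ** n_qubits) * p.mean() - 1.0

# Toy 2-qubit example; these probabilities are made up for illustration.
ideal_probs = {"00": 0.40, "01": 0.30, "10": 0.20, "11": 0.10}
samples = ["00", "00", "01", "10", "00", "01"]   # hypothetical device output
print(linear_xeb(samples, ideal_probs, n_qubits=2))   # ~0.33 for these toy samples
```

As the following summary paragraph explains, this statistic can be pushed up by classical "shortcuts" that do not actually simulate the circuit.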

Concretely, we introduce a novel framework to identify and exploit several vulnerabilities of XEB, which leads to an efficient classical algorithm that attains XEB values comparable to those of Google’s and USTC’s quantum devices (2%–12% of theirs) with just one GPU within 2 s. Furthermore, its performance scales better with system size than that of a noisy quantum device. We observe that this is made possible because XEB can greatly overestimate the fidelity, which implies the existence of “shortcuts” to achieving high XEB values without simulating the system. This contrasts with the intuition that achieving high XEB values should be hard for any classical algorithm.

Nanoparticles (NPs) administered into the human body undergo rapid surface modification upon contact with biological fluids, driven by their interfacial interactions with a diverse range of biomolecules. Such spontaneous self-assembly and adsorption of proteins and other biomolecules onto the NP surface constitute what is commonly known as the protein or biomolecule corona. This surface biotransformation of the NPs modulates their biological interactions and their impact on physiological systems, and can influence their overall pharmacological profile. Here, we comment on how in vivo corona formation, initially considered a ‘nuisance’, can now be regarded as a nanoparticle engineering tool for biomedical use, such as in endogenous tissue targeting, personalized biomarker discovery and immunomodulation.