
Page 458

Jul 24, 2024

Quantum Advantage Challenged: IBM And IonQ Develop Faster Classical Simulation Algorithm

Posted by in categories: computing, information science, quantum physics

The quantum advantage, a key goal in quantum computation, is achieved when a quantum computer’s computational capability surpasses classical means. A recent study introduced a type of Instantaneous Quantum Polynomial-Time (IQP) computation, which was challenged by IBM Quantum and IonQ researchers who developed a faster classical simulation algorithm. IQP circuits are attractive for their simplicity and moderate hardware requirements, but that same structure also makes them amenable to classical simulation. The IQP circuit in question, known as the Harvard/QuEra circuit, is built over n inputs (48 logical qubits in the reported demonstration). Two types of simulation are relevant for such quantum computations: noiseless (weak/direct) and noisy.

The quantum advantage is a key goal for the quantum computation community. It is achieved when a quantum computer performs a computation so complex that it cannot be reproduced by classical means. This ongoing negotiation between classical simulations and quantum computational experiments is a significant focus in the field.

A recent publication by Bluvstein et al. introduced a type of Instantaneous Quantum Polynomial-Time (IQP) computation, complemented by a 48-qubit logical experimental demonstration using quantum hardware. The authors projected the simulation time to grow rapidly with the number of CNOT layers added. However, researchers from IBM Quantum and IonQ reported a classical simulation algorithm that computes an amplitude for the 48-qubit computation in only 0.00257947 seconds, roughly 10³ times faster than that reported by the original authors. This algorithm does not suffer a significant performance decline as additional CNOT layers are added.
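An IQP circuit sandwiches a diagonal phase layer between two walls of Hadamards, so its zero-to-zero amplitude reduces to an average of phases over all bitstrings. The sketch below computes such an amplitude by brute force; it is an O(2^n) illustration of what "computing an amplitude" means here, not the specialized algorithm the IBM/IonQ researchers report, and the function names and pairwise-coupling phase are illustrative assumptions.

```python
import numpy as np

def iqp_amplitude(n, theta):
    """Zero-to-zero amplitude <0^n| H^n D H^n |0^n> of an IQP circuit.

    D is diagonal, applying phase exp(i*theta(x)) to basis state |x>.
    Since H^n |0^n> is the uniform superposition, the amplitude reduces
    to the mean of exp(i*theta(x)) over all 2^n bitstrings; brute force,
    so only viable for small n.
    """
    return sum(np.exp(1j * theta(x)) for x in range(2 ** n)) / 2 ** n

def pairwise_theta(couplings):
    """theta(x) = sum over pairs (i, j) of J_ij * x_i * x_j,
    a typical IQP-style diagonal built from ZZ-type interactions."""
    def theta(x):
        bits = [(x >> k) & 1 for k in range(8)]  # enough bits for small demos
        return sum(J * bits[i] * bits[j] for (i, j), J in couplings.items())
    return theta

# With no diagonal phases the circuit is the identity (H followed by H),
# so the amplitude is exactly 1.
trivial = iqp_amplitude(4, lambda x: 0.0)

# A 3-qubit IQP circuit with two ZZ-type couplings of angle pi/2.
amp = iqp_amplitude(3, pairwise_theta({(0, 1): np.pi / 2, (1, 2): np.pi / 2}))
```

Real simulation algorithms exploit the circuit's structure to avoid this exponential sum entirely, which is why the reported runtimes can be so short.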

Jul 24, 2024

A carbon-nanotube-based tensor processing unit

Posted by in categories: nanotechnology, robotics/AI

Carbon nanotube networks made with high purity and ultraclean interfaces can be used to build a tensor processing unit containing 3,000 transistors in a systolic array architecture, improving energy efficiency when accelerating neural network tasks.
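In a systolic array, operands stream through a grid of multiply-accumulate cells so that each cell reuses data passed from its neighbours. The toy simulation below shows the dataflow of an output-stationary array computing a matrix product; it is a minimal sketch of the architecture named above, not the paper's hardware, and the function name and skew scheme are assumptions for illustration.

```python
import numpy as np

def systolic_matmul(A, B):
    """Cycle-by-cycle simulation of an output-stationary systolic array.

    Cell (i, j) holds a running sum for output element C[i, j]. Rows of A
    stream in from the left and columns of B from the top, skewed in time
    so that the matching operands of each dot product meet at the right
    cell on the right cycle. A hardware TPU fixes the grid size; here the
    grid simply matches the output shape.
    """
    n, k = A.shape
    k2, m = B.shape
    assert k == k2, "inner dimensions must agree"
    acc = np.zeros((n, m))
    # On cycle t, cell (i, j) receives A[i, s] and B[s, j] with s = t - i - j.
    for t in range(n + m + k - 2):
        for i in range(n):
            for j in range(m):
                s = t - i - j  # which element of the dot product arrives now
                if 0 <= s < k:
                    acc[i, j] += A[i, s] * B[s, j]
    return acc
```

Every cell performs one multiply-accumulate per cycle with only local data movement, which is the source of the energy-efficiency claim for this architecture.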

Jul 24, 2024

Nonlinear encoding in diffractive information processing using linear optical materials

Posted by in categories: biotech/medical, materials

Furthermore, many experimental factors, such as fabrication errors and physical misalignments, can affect the performance of diffractive processors during the experimental deployment stage. Investigating the inherent robustness of different nonlinear encoding strategies to such imperfections, as well as their integration with vaccination-based training strategies [39] or in situ training methods [40], would provide more comprehensive guidance on the implementation and limitations of these approaches. These considerations would be crucial for future research and practical implementations of diffractive optical processors.

Throughout the manuscript, our analyses assumed that diffractive optical processors consist of several stacked diffractive layers interconnected through free-space light propagation, as commonly used in the literature [10,13,41,42]. Our forward model employs the angular spectrum method for light propagation, a broadly applicable technique known for its accuracy, covering all the propagating modes in free space. While our forward model does not account for multiple reflections between the diffractive layers, it is important to note that such cascaded reflections are much weaker than the transmitted light and, thus, have a negligible impact on the optimization process. This simplification does not compromise the model’s experimental validity since a given diffractive model also acts as a 3D filter for such undesired secondary sources that were ignored in the optimization process; stated differently, a by-product of the entire optimization process is that the resulting diffractive layers collectively filter out some of these undesired sources of secondary reflections, scattering them outside the output FOV. The foundation of our model has been extensively validated through various experiments [10,11,16,18,43], providing a good match to the corresponding numerical model in each case, further supporting the accuracy of our forward model and diffractive processor design scheme.
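The angular spectrum method mentioned above propagates a complex field by decomposing it into plane waves, applying a per-frequency phase delay, and recombining. A minimal sketch of one free-space hop between layers is below; the function name and parameters are illustrative, and this is a generic textbook implementation rather than the authors' exact forward model.

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, dx, z):
    """Propagate a complex 2-D field a distance z via the angular spectrum method.

    field      : complex array (n x n), sampled on a grid with pitch dx
    wavelength : illumination wavelength (same units as dx and z)

    Only propagating modes are kept; evanescent components (spatial
    frequencies beyond 1/wavelength) are zeroed, matching the statement
    that the method covers all propagating modes in free space.
    """
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)           # spatial frequencies of the FFT grid
    FX, FY = np.meshgrid(fx, fx)
    arg = 1.0 / wavelength ** 2 - FX ** 2 - FY ** 2
    kz = 2 * np.pi * np.sqrt(np.maximum(arg, 0.0))  # axial wavenumber per mode
    H = np.exp(1j * kz * z) * (arg > 0)    # transfer function, propagating modes only
    return np.fft.ifft2(np.fft.fft2(field) * H)
```

Stacking several such hops, each followed by multiplication with a layer's complex transmittance, reproduces the forward model structure described in the text.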

Finally, our numerical analyses were conducted using coherent monochromatic light, which has many practical, real-world applications such as holographic microscopy and sensing, laser-based imaging systems, optical communications, and biomedical imaging. These applications, and many others, benefit from the precise control of the wave information carried by coherent light. In addition to coherent illumination, diffractive optical processors can also be designed to accommodate temporally and spatially incoherent illumination. By optimizing the layers for multiple wavelengths of illumination, a diffractive processor can be effectively designed to operate under broadband illumination conditions [18,19,29,43,44,45,46,47]. Similarly, by incorporating spatial incoherence into the forward model simulations, we can design diffractive processors that function effectively with spatially incoherent illumination [30,48]. Without loss of generality, our current study focuses on coherent monochromatic light to establish a foundational understanding of nonlinear encoding strategies in diffractive information processing using linear optical materials by leveraging the precise control that coherent processors offer. Future work could explore the extension of these principles to spatially or temporally incoherent illumination scenarios, further broadening the applicability of diffractive optical processors in practical settings.

Jul 24, 2024

SAQFT: Algebraic quantum field theory for elementary and composite particles

Posted by in categories: cosmology, information science, particle physics, quantum physics

Quantum field theory (QFT) was a crucial step in our understanding of the fundamental nature of the Universe. In its current form, however, it is poorly suited for describing composite particles, made up of multiple interacting elementary particles. Today, QFT for hadrons has been largely replaced with quantum chromodynamics, but this new framework still leaves many gaps in our understanding, particularly surrounding the nature of the strong nuclear force and the origins of dark matter and dark energy. Through a new algebraic formulation of QFT, Dr Abdulaziz Alhaidari at the Saudi Center for Theoretical Physics hopes that these issues could finally be addressed.

The emergence of quantum field theory (QFT) was one of the most important developments in modern physics. By combining the theories of special relativity, quantum mechanics, and the interaction of matter via classical field equations, it provides robust explanations for many fundamental phenomena, including interactions between charged particles via the exchange of photons.

Still, QFT in its current form is far from flawless. Among its limitations is its inability to produce a precise description of composite particles such as hadrons, which are made up of multiple interacting elementary particles that are confined (cannot be observed in isolation). Since these particles possess an internal structure, the nature of these interactions becomes far more difficult to define mathematically, stretching the descriptive abilities of QFT beyond its limits.

Jul 24, 2024

Automated construction of cognitive maps with visual predictive coding

Posted by in categories: mapping, robotics/AI, space

Constructing spatial maps from sensory inputs is challenging in both neuroscience and artificial intelligence. Gornet and Thomson show that as an agent navigates an environment, a self-attention neural network using predictive coding can recover the environment’s map in its latent space.
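Predictive coding trains a network to predict its next sensory input from the current one; the error between prediction and observation drives learning, and spatial structure can emerge in the learned latent codes. The toy below illustrates just that training signal with linear maps and a synthetic "visual" stream; the data generator, dimensions, and variable names are all hypothetical, and this is far simpler than the self-attention network in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for an agent's visual stream: a hidden 1-D position drifts
# along a track, and each observation is a 2-D function of that position.
positions = np.cumsum(rng.normal(0.0, 0.1, size=500))
obs = np.stack([np.sin(positions), np.cos(positions)], axis=1)

W_enc = rng.normal(0.0, 0.1, size=(2, 8))   # encoder: observation -> latent code
W_pred = rng.normal(0.0, 0.1, size=(8, 2))  # predictor: latent -> next observation
lr, losses = 0.05, []

for _ in range(500):
    z = obs[:-1] @ W_enc      # latent codes for each time step
    pred = z @ W_pred         # predicted next observations
    err = pred - obs[1:]      # the predictive-coding error signal
    losses.append(np.mean(err ** 2))
    # Gradient descent on the mean squared prediction error (linear maps)
    W_pred -= lr * z.T @ err / len(err)
    W_enc -= lr * obs[:-1].T @ (err @ W_pred.T) / len(err)
```

As the prediction error falls, the latent codes `z` come to encode the hidden position, which is the sense in which a map of the environment can be recovered from the latent space.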

Jul 24, 2024

Supermassive black holes provide ‘hearts and lungs’ that help galaxies live longer

Posted by in category: cosmology

Can black hole jets suffer from cosmic hypertension?

Jul 24, 2024

Next-Gen Brain Implant Uses a Graphene Chip

Posted by in categories: biotech/medical, computing, neuroscience

A brain-computer interface from the startup Inbrain could be used to treat Parkinson’s disease.

Jul 24, 2024

Kawasaki Does World’s First Public Demo Of Hydrogen-Fueled Motorcycle

Posted by in category: transportation

Kawasaki performed the world’s first public demonstration of its hydrogen-fueled prototype motorcycle this past weekend in Japan.

Jul 24, 2024

Finding novel treatments for Angelman syndrome: Could Syn3 be the answer?

Posted by in category: neuroscience

Dr John Marshall is a leading neuroscientist and a pioneer in the signalling and synaptic trafficking fields, and has made major contributions to understanding brain injury and neurodevelopmental disorders. He received his MSc from the University of Toronto and completed his PhD training in Neurobiology at the MRC at Cambridge University, England. He worked with Professor Len Kaczmarek at Yale University. Marshall assumed his position at Brown University in 1995 and has continued to produce cutting-edge research. His lab focuses on memory and behaviour in rodent models of Angelman syndrome.

Contact Details

Jul 24, 2024

Big Tech is suddenly obsessed with the ‘NPU.’ Here’s what that is and why it matters

Posted by in categories: mobile phones, robotics/AI

The “neural processing unit” is being pushed as the next big thing for “AI PCs” and “AI smartphones,” but they won’t eliminate the need for cloud-based AI.

Page 458 of 11,949