
Researchers have developed tunable waveguide arrays that introduce synthetic modal dimensions, improving the control of light within photonic systems. This innovation has potential applications ranging from mode lasing to quantum optics and data transmission.

Synthetic dimensions (SDs) have emerged as a cutting-edge research frontier in physics, providing a means to investigate phenomena in higher-dimensional spaces beyond our conventional 3D geometry. The concept has attracted substantial attention, particularly in topological photonics, because it can reveal physics that is inaccessible in conventional dimensions.

Researchers have proposed various theoretical frameworks to study and implement SDs, aiming to harness phenomena such as synthetic gauge fields, quantum Hall physics, discrete solitons, and topological phase transitions in four or more dimensions. These proposals could lead to new fundamental understandings in physics.
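The idea behind a synthetic modal dimension can be illustrated with a standard coupled-mode (tight-binding) model of a waveguide array, in which the waveguide/mode index plays the role of an extra lattice dimension along which light "hops" during propagation. The sketch below is a generic textbook model with illustrative parameter values, not the specific system from the paper.

```python
import numpy as np

# Toy model: N coupled single-mode waveguides as a 1D tight-binding lattice.
# The waveguide index m acts as a synthetic dimension; light tunnels between
# neighbors with coupling constant c as it propagates a distance z.
# All parameter values are illustrative assumptions.

N = 21          # number of waveguides
c = 1.0         # nearest-neighbor coupling (per unit propagation length)
z = 3.0         # propagation distance

# Coupled-mode Hamiltonian: H[m, m+1] = H[m+1, m] = c
H = np.zeros((N, N))
for m in range(N - 1):
    H[m, m + 1] = H[m + 1, m] = c

# Launch light into the central waveguide and evolve: a(z) = exp(-i H z) a(0)
a0 = np.zeros(N, dtype=complex)
a0[N // 2] = 1.0

eigvals, eigvecs = np.linalg.eigh(H)
a_z = eigvecs @ (np.exp(-1j * eigvals * z) * (eigvecs.conj().T @ a0))

intensity = np.abs(a_z) ** 2
print("total power (conserved):", intensity.sum())
```

Because the evolution is unitary, the total power stays fixed at 1 while the light undergoes discrete diffraction, spreading across the synthetic (mode-index) dimension.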

A project led by Professor Zhiqin Chu from the Department of Electrical and Electronic Engineering at the University of Hong Kong (HKU), and Professor Qiang Wei from Sichuan University, used label-free quantum sensing technology to measure cellular forces at the nanoscale. The advance surpasses the limitations of traditional cellular force measurement tools and provides new insight into cellular mechanics, particularly how cellular adhesion forces affect cancer cell spreading.

The research team developed a new Quantum-Enhanced Diamond Molecular Tension Microscopy (QDMTM) that offers an effective approach for studying cell adhesion forces. Compared with cell force measurement methods that rely on fluorescent probes, QDMTM has the potential to overcome challenges such as photobleaching, limited sensitivity, and ambiguity in data interpretation. Furthermore, QDMTM sensors can be cleaned and reused, improving the accuracy of comparisons of cell adhesion forces across samples.

Researchers have created a quantum tornado in superfluid helium to simulate black hole conditions, advancing our understanding of black hole physics and the behavior of quantum fields in curved spacetimes, culminating in a unique art and science exhibition.

Scientists have, for the first time, created a giant quantum vortex in superfluid helium to mimic a black hole. This breakthrough has enabled them to observe in greater detail how analog black holes behave and interact with their surroundings.

Research led by the University of Nottingham, in collaboration with King’s College London and Newcastle University, has created a novel experimental platform: a quantum tornado. The team generated a giant swirling vortex within superfluid helium cooled to temperatures near absolute zero. By observing minute wave dynamics on the superfluid’s surface, the researchers showed that these quantum tornadoes mimic gravitational conditions near rotating black holes. The research has been published today in Nature.

Machine learning (ML) is one of the most important subfields of AI and underpins most modern AI systems.

Within ML, deep learning is the subfield devoted to neural networks. Deep learning is what makes systems like ChatGPT and many other AI models possible; in other words, ChatGPT is essentially one giant system built from neural networks.

However, deep learning has a big problem: computational efficiency. Training large, capable AI systems with neural networks requires enormous amounts of computation, and therefore energy, which is expensive.
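The scale of that cost can be estimated with the widely used rule of thumb that training a dense transformer takes roughly 6 · N · D floating-point operations, where N is the parameter count and D the number of training tokens. The model size, token count, and accelerator throughput below are illustrative assumptions, not figures from the text.

```python
# Back-of-the-envelope training cost via the ~6 * N * D FLOPs rule of thumb.
# All three input numbers are illustrative assumptions.

params = 70e9          # 70B-parameter model (assumed)
tokens = 1.4e12        # 1.4 trillion training tokens (assumed)
flops = 6 * params * tokens

gpu_flops = 3e14       # ~300 TFLOP/s sustained per accelerator (assumed)
gpu_seconds = flops / gpu_flops
gpu_hours = gpu_seconds / 3600

print(f"training FLOPs: {flops:.2e}")       # ~5.88e23
print(f"GPU-hours at 300 TFLOP/s: {gpu_hours:.2e}")
```

Even under these rough assumptions the total lands at hundreds of thousands of GPU-hours, which is why energy cost dominates the economics of large models.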


A lot of big banks are banking on quantum computing because they think it’ll give them an edge in trading. Though I have on previous occasions noted my doubt that we’ll see any useful quantum computers within the next ten years, two new papers detailing new methods of scaling quantum computers have shifted my perspective. Let’s have a look.

Paper 1: https://www.nature.com/articles/s4158
Paper 2: https://arxiv.org/abs/2404.


The quantum advantage, a key goal in quantum computation, is achieved when a quantum computer’s computational capability surpasses classical means. A recent study introduced a type of Instantaneous Quantum Polynomial-Time (IQP) computation, which was challenged by IBM Quantum and IonQ researchers who developed a faster classical simulation algorithm. IQP circuits are attractive for their simplicity and moderate hardware requirements, but they are also amenable to classical simulation. The IQP circuit in question, known as the Harvard/QuEra circuit, is built over n = 3m + 32k inputs. Two types of simulation apply to such quantum computations: noiseless (weak or direct) simulation and noisy simulation.

The quantum advantage is a key goal for the quantum computation community. It is achieved when a quantum computer’s computational capability surpasses what can be reproduced by classical means. This ongoing contest between classical simulations and quantum computational experiments is a significant focus in the field.

A recent publication by Bluvstein et al. introduced a type of Instantaneous Quantum Polynomial-Time (IQP) computation, complemented by a 48-qubit logical experimental demonstration on quantum hardware. The authors projected the classical simulation time to grow rapidly as CNOT layers are added. However, researchers from IBM Quantum and IonQ reported a classical simulation algorithm that computes an amplitude for the 48-qubit computation in only 0.00257947 seconds, roughly 10³ times faster than the original authors’ estimate. Moreover, the algorithm’s performance does not degrade significantly as additional CNOT layers are added.
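To make the notion of computing "an amplitude" concrete: an IQP circuit has the form H⊗n · D · H⊗n acting on |0…0⟩, with D diagonal in the computational basis, so the output amplitude ⟨0…0|C|0…0⟩ is simply the average of the diagonal phases. The sketch below brute-forces a tiny random instance and cross-checks against the full unitary; it illustrates direct simulation on a toy scale only and is not the IBM/IonQ algorithm, which is what makes the 48-qubit case tractable.

```python
import numpy as np

# Toy direct simulation of an n-qubit IQP circuit C = H^(x)n · D · H^(x)n,
# where D = diag(e^{i*theta(x)}). For the all-zeros input and output,
# <0...0|C|0...0> = (1/2^n) * sum_x e^{i*theta(x)}  -- the mean phase.
# The random phases here are an arbitrary illustrative choice.

n = 4
rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, size=2**n)

amp = np.exp(1j * theta).mean()   # closed-form amplitude
prob = abs(amp) ** 2              # probability of measuring |0...0>

# Cross-check against the explicit 2^n x 2^n unitary.
H1 = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
Hn = H1
for _ in range(n - 1):
    Hn = np.kron(Hn, H1)
C = Hn @ np.diag(np.exp(1j * theta)) @ Hn
assert np.isclose(C[0, 0], amp)

print("amplitude:", amp)
print("probability of |0...0>:", prob)
```

The explicit-matrix check costs memory exponential in n, which is exactly why specialized classical algorithms, like the one reported by the IBM Quantum and IonQ researchers, are needed at 48 qubits.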