The purpose of this work is to investigate how several inflationary and bouncing scenarios can be realized by imperfect fluids. We shall use two theoretical frameworks, namely classical cosmology and Loop Quantum Cosmology (LQC) (see the work in which the LQC Hamiltonian was first derived, yielding the modified Friedmann equation, and also a recent derivation of the effective LQC Hamiltonian obtained by demanding repulsive gravity, as in Loop Quantum Gravity). In both cases we shall investigate which imperfect fluid can realize the various inflationary and bouncing cosmology scenarios. Inflationary and bouncing cosmologies are two alternative scenarios for the evolution of our Universe: in inflation, the Universe starts from an initial singularity and accelerates at early times, whereas in a bouncing cosmology the Universe initially contracts until it reaches a minimum radius and then expands again. With regard to inflation, we shall be interested in four inflationary scenarios, namely intermediate inflation, Starobinsky inflation, and two constant-roll inflation scenarios. With regard to bouncing cosmologies, we shall be interested in realizing several well-studied scenarios, in particular the matter bounce, the superbounce, and the singular bounce.
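For orientation, the modified Friedmann equation that the effective LQC Hamiltonian yields has the standard holonomy-corrected form quoted below; this is the textbook expression rather than one taken from this work, and the notation (critical density ρ_c, Barbero-Immirzi parameter γ, polymerization scale λ) is ours.

```latex
% Effective (holonomy-corrected) Friedmann equation in LQC for a flat FRW background;
% the classical Friedmann equation is recovered in the limit \rho \ll \rho_c.
H^{2} \;=\; \frac{8\pi G}{3}\,\rho\!\left(1-\frac{\rho}{\rho_{c}}\right),
\qquad
\rho_{c} \;=\; \frac{3}{8\pi G\,\gamma^{2}\lambda^{2}} .
```

The quadratic correction makes H vanish when ρ reaches ρ_c, which is why bouncing scenarios arise naturally in this framework.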
As already mentioned, we shall use two theoretical frameworks, that of classical cosmology and that of LQC. After presenting the reconstruction methods for realizing the various cosmologies with imperfect fluids, we proceed to the explicit realizations. In the case of classical cosmology, we calculate the power spectrum of primordial curvature perturbations, the tensor-to-scalar ratio and the running of the spectral index for all the aforementioned cosmologies, and we compare the results with the recent Planck data. The main outcome of our work is that, although the cosmological scenarios we study in this paper are viable in other modified gravity frameworks, they are not necessarily viable in all alternative modified gravity descriptions. As we demonstrate, in some cases the resulting imperfect fluid cosmologies are not compatible with the observational data at all, while in other cases there is only partial compatibility.
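For reference, the observables confronted with the Planck data have the standard definitions below in terms of the scalar and tensor power spectra P_ζ(k) and P_T(k); the notation is the conventional one, not taken from this text.

```latex
% Standard definitions of the inflationary observables compared against Planck:
n_{s}-1 \;=\; \frac{d\ln\mathcal{P}_{\zeta}}{d\ln k}\,, \qquad
r \;=\; \frac{\mathcal{P}_{T}(k)}{\mathcal{P}_{\zeta}(k)}\,, \qquad
\alpha_{s} \;=\; \frac{d n_{s}}{d\ln k}\,,
% all evaluated at a pivot scale k_{*} (Planck typically quotes k_{*} = 0.05\,\mathrm{Mpc}^{-1}).
```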
We need to note that the perturbation aspects of LQC are not sufficiently transparent, and it is usually assumed that no non-trivial quantum gravitational modifications arise due to the presence of inhomogeneities. As has been shown, a consistent Hamiltonian framework does not allow this assumption to hold. The perturbation issues that may arise in the context of the present work are therefore more closely related to some early works in LQC, so any calculation of the primordial power spectrum should be read with this caveat in mind.
Japan’s ambitious moonshot to develop fault-tolerant quantum computers by 2050 has a clear goal, but it remains uncertain which technology will win out.
Levitating magnets at sub-zero temperatures could lead to revolutionary cosmic insights.
For years, niobium was considered an underperformer when it came to superconducting qubits. Now scientists supported by Q-NEXT have found a way to engineer a high-performing niobium-based qubit and so take advantage of niobium’s superior qualities.
When it comes to quantum technology, niobium is making a comeback.
For the past 15 years, niobium has been sitting on the bench after experiencing a few mediocre at-bats as a core qubit material.
Enhancing quantum features can compensate for environmental losses, amplifying particle interactions and enabling entanglement at larger scales.
One of the oldest topics of contemporary science is where to draw the line between classical and quantum physics.
Abstract
The ability to engineer cavity-mediated interactions has emerged as a powerful tool for the generation of non-local correlations and the investigation of non-equilibrium phenomena in many-body systems. Levitated optomechanical systems have recently entered the multi-particle regime, with promise for using arrays of massive strongly coupled oscillators for exploring complex interacting systems and sensing. Here, by combining advances in multi-particle optical levitation and cavity-based quantum control, we demonstrate, for the first time, programmable cavity-mediated interactions between nanoparticles in a vacuum. The interaction is mediated by photons scattered by spatially separated particles in a cavity, resulting in strong coupling (G_zz/Ω_z = 0.238 ± 0.005) that does not decay with distance within the cavity mode volume. We investigate the scaling of the interaction strength with cavity detuning and inter-particle separation and demonstrate the tunability of interactions between different mechanical modes. Our work paves the way towards exploring many-body effects in nanoparticle arrays with programmable cavity-mediated interactions, generating entanglement of motion, and using interacting particle arrays for optomechanical sensing.
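As a rough illustration of what a coupling ratio like G_zz/Ω_z ≈ 0.24 implies, the sketch below diagonalizes the dynamical matrix of two identical oscillators with a bilinear coupling and prints the resulting normal-mode splitting. All numbers and the parameterization are illustrative assumptions, not values from the experiment; the cavity-mediated interaction reported above is more general, since it is tunable and does not decay with separation.

```python
import numpy as np

# Illustrative only: two identical mechanical modes with bare angular frequency
# omega_z, bilinearly coupled with strength g (also in angular-frequency units).
# The square-rooted eigenvalues of the dynamical matrix are the normal-mode
# frequencies; their splitting (~ g for g << omega_z) is the usual signature
# of strong coupling when it exceeds the mode linewidths.
omega_z = 2 * np.pi * 100e3          # hypothetical 100 kHz trap frequency
g = 0.24 * omega_z                   # coupling of the order of the reported ratio

dyn = np.array([[omega_z**2, g * omega_z],
                [g * omega_z, omega_z**2]])

normal_modes_hz = np.sqrt(np.linalg.eigvalsh(dyn)) / (2 * np.pi)
print("normal-mode frequencies (kHz):", normal_modes_hz / 1e3)
print("splitting (kHz):", (normal_modes_hz[1] - normal_modes_hz[0]) / 1e3)
```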
Thin films, thanks to their compatibility with plastic substrates, could serve modern high-frequency technology applications effectively. Bismuth thin films display a non-linear Hall effect, potentially enabling controlled use of terahertz signals on electronic chips.
Researchers in Imperial College London’s Department of Materials have developed a new portable maser the size of a shoebox.
Imperial College London pioneered room-temperature solid-state masers in 2012, highlighting their ability to amplify extremely faint electrical signals and their high frequency stability. This was a significant discovery because microwave signals pass through the Earth’s atmosphere more easily than other wavelengths of light. Additionally, microwaves can penetrate the human body, a feat not achievable by lasers.
Masers have extensive applications in telecommunications systems, from mobile phone networks to satellite navigation. They also have a key role in advancing quantum computing and improving medical imaging techniques such as MRI machines. However, they have typically been large, bulky, stationary pieces of equipment found only in research laboratories.
Popular Summary.
Unequivocally demonstrating that a quantum computer can significantly outperform any existing classical computer will be a milestone in quantum science and technology. Recently, groups at Google and at the University of Science and Technology of China (USTC) announced that they have achieved such quantum computational advantages. The central quantity of interest behind their claims is the linear cross-entropy benchmark (XEB), which has been claimed to approximate the fidelity of their quantum experiments and used to certify the correctness of their computation results. However, such claims rely on several assumptions, some of which are only implicit. Hence, it is critical to understand when and how XEB can be used for quantum advantage experiments. By combining various tools from computer science, statistical physics, and quantum information, we critically examine the properties of XEB and show that it bears several intrinsic vulnerabilities, limiting its utility as a benchmark for quantum advantage.
Concretely, we introduce a novel framework to identify and exploit several vulnerabilities of XEB, which leads to an efficient classical algorithm that attains XEB values comparable to those of Google’s and USTC’s quantum devices (2%–12% of theirs) with just one GPU within 2 seconds. Furthermore, its performance scales better with the system size than that of a noisy quantum device. We observe that this is made possible because the XEB can greatly overestimate the fidelity, which implies the existence of “shortcuts” to achieving high XEB values without simulating the system. This contrasts with the intuition that achieving high XEB values is hard for all possible classical algorithms.
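For concreteness, the linear XEB referred to above is the standard estimator used in the Google and USTC experiments: given sampled bitstrings x_i from an n-qubit circuit and the ideal output probabilities p(x), it is F_XEB = 2^n ⟨p(x_i)⟩ − 1, which is 0 for uniformly random samples and close to 1 for ideal sampling from a Porter-Thomas-like output distribution. Below is a minimal sketch of the estimator, assuming the ideal probabilities are available; the spoofing algorithm summarized above is not reproduced here.

```python
import numpy as np

def linear_xeb(bitstrings, ideal_probs, n_qubits):
    """Linear cross-entropy benchmark: F_XEB = 2^n * <p_ideal(x_i)> - 1.

    bitstrings  -- sampled outcomes, as integers in [0, 2^n)
    ideal_probs -- array of ideal output probabilities p(x), length 2^n
                   (in practice obtaining these requires classically
                   simulating the circuit, which is the expensive step)
    """
    sampled_probs = np.asarray(ideal_probs)[np.asarray(bitstrings)]
    return (2 ** n_qubits) * sampled_probs.mean() - 1.0

# Illustrative check: samples drawn from an exponential (Porter-Thomas-like)
# toy distribution give XEB near 1, uniformly random bitstrings give XEB near 0.
rng = np.random.default_rng(0)
n = 10
p = rng.exponential(size=2 ** n)
p /= p.sum()                                         # toy "ideal" distribution
good = rng.choice(2 ** n, size=50_000, p=p)          # samples from the ideal distribution
noise = rng.integers(0, 2 ** n, size=50_000)         # uniformly random samples
print(linear_xeb(good, p, n), linear_xeb(noise, p, n))
```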