
A new study makes a compelling case for the development of “NEMO”—a new observatory in Australia that could deliver on some of the most exciting gravitational-wave science next-generation detectors have to offer, but at a fraction of the cost.

The study, co-authored by the ARC Centre of Excellence for Gravitational Wave Discovery (OzGrav), coincides with a mid-term review of the Astronomy Decadal Plan by the Australian Academy of Science, in which “NEMO” is identified as a priority goal.

“Gravitational-wave astronomy is reshaping our understanding of the Universe,” said one of the study’s lead authors, OzGrav Chief Investigator Paul Lasky of Monash University.

For the first time ever, scientists have witnessed the interaction of a new phase of matter known as “time crystals”.

The discovery, published in Nature Materials, may lead to applications in quantum information processing because time crystals automatically remain intact—coherent—in varying conditions. Protecting coherence is the main difficulty hindering the development of powerful quantum computers.

Dr. Samuli Autti, lead author from Lancaster University, said: “Controlling the interaction of two time crystals is a major achievement. Before this, nobody had observed two time crystals in the same system, let alone seen them interact.”

Quantum computing requires meticulously prepared hardware and big budgets, but cloud-based solutions could make the technology available to broader business audiences.

Several tech giants are racing to achieve “quantum supremacy”, but reliability and consistency in quantum output is no simple trick.

Covid-19 has prompted some researchers to look at how quantum computing could mitigate future pandemics with scientific precision and speed.

Quantum computing (QC) has been theorized for decades and has evolved rapidly over the last few years. An escalation in spending and development has seen powerhouses IBM, Microsoft, and Google race for ‘quantum supremacy’, the point at which quantum machines reliably and consistently outperform existing computers. But do quantum computers remain a sort of elitist vision of the future, or are we on course for more financially and infrastructurally viable applications across industries?

Getting to grips with qubits

How much do you know? Ordinary computers (even supercomputers) deploy bits, and these bits carry traditional binary code: computer processes, like code, are made up of countless combinations of 0s and 1s. Quantum computers, however, are built from qubits. Qubits are capable of ‘superposition’: effectively adopting both 1 and 0 simultaneously, or any point on the spectrum between those two formerly binary values. The key to a powerful, robust, and reliable quantum computer is more qubits, and every qubit added increases the processing capacity of the machine exponentially, since each new qubit doubles the number of states the register can represent at once.
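To make that state-space arithmetic concrete, here is a minimal Python sketch (not from the article; the numbers are purely illustrative) that builds a single-qubit superposition as a state vector and shows why each added qubit doubles the memory a classical simulator would need:

```python
import numpy as np

# A single qubit is a 2-component complex state vector.
# An equal superposition of |0> and |1>:
qubit = np.array([1, 1], dtype=complex) / np.sqrt(2)

# Measurement probabilities are the squared amplitudes.
print(np.abs(qubit) ** 2)  # [0.5 0.5]: a 50/50 chance of reading 0 or 1

# An n-qubit register needs 2**n complex amplitudes, so every
# added qubit doubles what a classical simulator must store.
for n in (10, 20, 30):
    print(f"{n} qubits -> {2 ** n:,} amplitudes")
```

This is only the bookkeeping view, of course; the exponential state space is exactly what a real quantum machine holds natively rather than simulates.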

Qubits and superposition give quantum computers the ability to process large datasets within seconds, doing work that would take humans decades. They can decode and deconstruct, hypothesize and validate, tackling problems of absurd complexity and dizzying magnitude, and they can do so across many different industries.

Quantum computing for everybody?

Wherein lies the issue, then? We’re still some way off: the general consensus is that it will be at least five years before this next big wave of computing is seen widely across industries and use cases, unless your business is bustling with the budget of a tech giant like Google or IBM. But expense isn’t the only challenge.

Frail and demanding: the quantum hardware

Quantum computers are immensely intricate machines, and it doesn’t take much at all to knock a qubit out of the delicate state of superposition. They’re powerful, but not reliable. The slightest interference or hardware frailty leads to high error rates in quantum processing, slowing the opportunity for more widespread use and rendering ‘quantum supremacy’ a touch dubious.
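A rough back-of-the-envelope sketch (assumed error rates, not figures from the article) shows why even tiny per-gate error rates are crippling: if each gate succeeds with probability 1 − p, an n-gate circuit runs cleanly with probability (1 − p)^n, which decays exponentially with circuit depth:

```python
# Assumed per-gate error rate of 0.1% (illustrative only).
p = 0.001
for n_gates in (100, 1_000, 10_000):
    # Chance that every gate in the circuit executes without error.
    survival = (1 - p) ** n_gates
    print(f"{n_gates:>6} gates -> {survival:.1%} error-free runs")
```

At 10,000 gates the error-free fraction is effectively zero, which is why error correction and decoherence control dominate the engineering agenda.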

https://www.liebertpub.com/doi/10.1089/phage.2019.


Background: Pantoea is a genus within the Enterobacterales whose members encompass free-living and host-associated lifestyles. Despite our growing understanding of the role of mobile genetic elements in the biology, ecology, and evolution of this bacterial group, few Pantoea bacteriophages have been identified and characterized.

Materials and Methods: A bacteriophage that could infect Pantoea agglomerans was isolated from barnyard soil. We used electron microscopy and complete genome sequencing to identify the viral family, and evaluated its host range across 10 different Pantoea species groups using both bacterial lawn and phage lawn assays. The latter assays were carried out using a scalable microplate assay to increase throughput and enable spectrophotometric quantitation. We also performed a phylogenetic analysis to determine the closest relatives of our phage.

#CyberneticSingularity


About 542 million years ago, something weird and profoundly remarkable happened on Earth. Quite suddenly, life became wildly inventive, proliferating from simple, rudimentary single-celled organisms into myriad multicellular forms. Evolution discovered more sophisticated and specialized cells, and most of the basic body plans we know today. Biologists call it the Cambrian explosion.

Today, we are on the verge of yet another event of astronomical significance, akin to a kind of Intelligence Supernova, which I refer to as the Cybernetic Singularity, or the Syntellect Emergence. In the scientific community, this upcoming intelligence explosion is also known as the Technological Singularity. Surprisingly, most people are still oblivious to this rapidly approaching “galactic event” that so many of us are about to witness in our lifetimes.

Objective reality is merely a pattern that a mind constructs because it provides a useful, simplified explanatory scaffolding for the long series of subjectively perceived moments stored in its memory. Needless to say, the Wigner’s Friend experiment mentioned in the article is not the only experimental evidence that objectivity is a myth.

Think about it the next time you come across the overloaded terms ‘objective reality’ and ‘objectivity’: to be precise, they mean ‘intersubjectivity’ instead. Termites would never comprehend chess, for example; this human abstraction lies beyond their species-specific intersubjective mind-network. Beyond inter-species levels of abstraction, we should also consider the psychological, cultural, and linguistic differences between individuals of the same species, which make objectivity simply non-existent.

Conclusion: we can still use ‘objective reality’, ‘objectivity’, or ‘objectively’ colloquially, but we should bear in mind that in a deeper sense these terms are no more than colorful misnomers. https://medium.com/@alexvikoulov/the-objectivity-myth-what-w…b697a5179d

#ObjectivityMyth #ObjectiveReality #ConsensusReality #intersubjectivity #UniversalMind #UniversalConsciousness


“It will remain remarkable, in whatever way our future concepts may develop, that the very study of the external world led to the conclusion that the content of consciousness is the ultimate reality.” –Eugene Wigner