
In the search for a unifying quantum gravity theory that would reconcile general relativity with quantum theory, it turns out that quantum theory is the more fundamental of the two, after all. Quantum mechanical principles, some physicists argue, apply to all of reality (not only the realm of the ultra-tiny), and numerous experiments support that assumption. After a century in which Einsteinian relativistic physics has gone largely unchallenged, a new kid on the block, Computational Physics, one of the frontrunners for quantum gravity, holds that spacetime is a flat-out illusion and that what we call physical reality is actually a construct of information within [quantum neural] networks of conscious agents. In light of the physics of information, computational physicists eye a new theory as an “It from Qubit” offspring, necessarily incorporating consciousness into the new theoretical models and deeming spacetime, mass-energy, and gravity emergent from information processing.

In fact, I expand on the foundations of such a new physics of information, also referred to as [Quantum] Computational Physics, Quantum Informatics, Digital Physics, and Pancomputationalism, in my recent book The Syntellect Hypothesis: Five Paradigms of the Mind’s Evolution. The Cybernetic Theory of Mind I’m currently developing is based on reversible quantum computing and projective geometry at large. This ontological model, a “theory of everything” of mine, agrees with certain quantum gravity contenders, such as M-Theory on fractal dimensionality and Emergence Theory on its code-theoretic ontology, but admittedly goes beyond all current models by treating spacetime, mass-energy, and gravity as emergent from information processing within a holographic, multidimensional matrix, with the Omega Singularity as the source.

There are plenty of recent cosmological anomalies that make us question the traditional interpretation of relativity. First off, the rate of the expansion of our Universe, or the Hubble constant (linked to the cosmological constant that Albert Einstein (1879–1955) reportedly called “the biggest blunder” of his scientific career), is the subject of a very important discrepancy: its value changes based on how scientists try to measure it. New results from the Hubble Space Telescope have now “raised the discrepancy beyond a plausible level of chance,” according to one of the latest papers published in the Astrophysical Journal. We are stumbling ever more often on all kinds of discrepancies in relativistic physics and the standard cosmological model. Not only is the Hubble constant “constantly” called into question, but even the speed of light, on which Einsteinian theories are based, shows such discrepancies when measured by different methods and turns out not to be truly “constant.”
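To put the tension in concrete numbers: Hubble’s law relates a galaxy’s recession velocity to its distance as v = H0 × d. A minimal sketch in Python, assuming two commonly cited published figures (the Planck early-universe fit of about 67.4 km/s/Mpc and the local distance-ladder value of about 74.0 km/s/Mpc from Riess et al.), shows how much the prediction shifts depending on which value one adopts:

    # Illustrative sketch, not from the article: Hubble's law v = H0 * d
    # evaluated with two published H0 estimates to show the "Hubble tension".
    H0_CMB = 67.4      # km/s/Mpc, early-universe (Planck CMB) estimate
    H0_LADDER = 74.0   # km/s/Mpc, local Cepheid/supernova distance-ladder estimate

    def recession_velocity(distance_mpc: float, h0: float) -> float:
        """Recession velocity in km/s predicted by Hubble's law."""
        return h0 * distance_mpc

    d = 100.0  # distance in megaparsecs
    v_cmb = recession_velocity(d, H0_CMB)
    v_ladder = recession_velocity(d, H0_LADDER)
    print(f"At {d:.0f} Mpc: {v_cmb:.0f} km/s vs {v_ladder:.0f} km/s "
          f"({100 * (v_ladder - v_cmb) / v_cmb:.1f}% disagreement)")

The roughly ten-percent gap between the two predictions is far larger than the stated uncertainties of either measurement, which is what makes the discrepancy statistically hard to dismiss.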

A giant star called VVV-WIT-08 exhibited a smooth, eclipse-like drop in brightness to a depth of 97% in 2012; minimum brightness occurred in April 2012, and the total event lasted a few hundred days, according to an analysis of data from the VISTA Variables in the Via Lactea survey (VVV), a project using the British-built VISTA telescope in Chile, operated by ESO.

It may belong to a new class of ‘blinking’ binary system, where a giant star — 100 times larger than the Sun — is eclipsed once every few decades by an as-yet unseen orbital companion.
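For a sense of scale, here is a hypothetical back-of-envelope conversion (not from the article) of that 97% dip into astronomical magnitudes, the logarithmic brightness scale astronomers use; only the 3% residual-flux figure follows from the quoted depth:

    import math

    # A 97% drop in brightness means only 3% of the star's flux reaches us
    # at minimum. The magnitude scale defines dimming as -2.5 * log10(f / f0).
    flux_fraction = 0.03
    delta_m = -2.5 * math.log10(flux_fraction)
    print(f"A 97% dip corresponds to ~{delta_m:.1f} magnitudes of dimming")

That works out to about 3.8 magnitudes, an enormous dip for a giant star, which is part of why the system looks like a new class of eclipsing binary.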

The five-year survey, conducted across a section of the cosmos known as the nearby universe because of its proximity to our own galaxy, used the Atacama Large Millimeter/submillimeter Array (ALMA) radio telescope located in Chile’s Atacama Desert. By conducting their survey in the radio part of the electromagnetic spectrum, rather than the optical part, the astronomers could focus on the faint glow from the dust and gas of the dark and dense molecular clouds, as opposed to the visible light from the young stars birthed by them.

This allowed the researchers to study how a star’s home cloud shapes its formation.

NASA is already so impressed by the Starship that it has contracted SpaceX to build a lunar-landing version of it to return astronauts to the moon as early as 2024. The selection has enraged Musk’s rivals, such as Blue Origin’s Jeff Bezos, and has perturbed some members of Congress. Both have only themselves to blame: Blue Origin for offering an inferior design and Congress for underfunding the Human Landing System project.

Military technology development has often been defined by the advent of new ways to transport people and cargo. The racing galleon of the 16th century became the frigates and ships of the line that defined naval warfare in the 18th and early 19th centuries. The steam engine and iron and steel armor led to the dreadnoughts of the early 20th century. Modern warships incorporate nuclear power. Air travel has caused the same sort of evolution, from the motorized kites of World War I to modern jets that can deliver destruction and death from thousands of miles away.

Now, space transportation technology is poised to cause a similar revolution in the military’s ability to defend the United States and its allies and to inflict mayhem and death on any enemy that would propose to make war on America. The great irony is that the Starship will be used by a branch of the military that Musk once compared to Starfleet, the fictional service depicted in the “Star Trek” television shows and movies. The thought would likely bring a smile to the face of the franchise’s creator, Gene Roddenberry, in whatever afterlife one envisions him inhabiting.