
Jan 3, 2023

The Planck Density: The Density of the Early Universe

Posted by in categories: cosmology, materials

I’ve looked at quite a few of the Planck base units, and I’ve even worked them out mathematically, but today I’m going to look at one of the derived units and compare it to some other things to see how big or small it is: the Planck density. Let’s find out more.
Before we start, we need to know what density is. Density is a measure of how tightly packed a material is. In other words, how much stuff is packed into a certain volume of space.

To work out density we need a formula and units. Density, denoted by the Greek letter rho (ρ), equals mass divided by volume: ρ = m/V. The SI unit of density is kilograms per cubic metre (kg/m³). Now that we know what density is and we have our units, it’s time to see how dense different materials are and then compare them to the Planck density, which is very dense indeed. At the end I’ll show you where the numbers come from. We’ll start off by looking at some things that aren’t very dense and work our way up.
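As a quick sanity check (not from the original post), the Planck density can be computed directly from the density formula above, using the Planck mass and Planck length built from approximate CODATA-style constants:

```python
# Sketch: computing the Planck density as mass / volume at the Planck scale.
# Constant values are rounded approximations, not exact.
c = 2.998e8        # speed of light, m/s
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
hbar = 1.055e-34   # reduced Planck constant, J s

planck_mass = (hbar * c / G) ** 0.5          # ~2.18e-8 kg
planck_length = (hbar * G / c ** 3) ** 0.5   # ~1.62e-35 m

# Density = mass / volume, applied to a Planck-length cube:
planck_density = planck_mass / planck_length ** 3
# Algebraically this reduces to the closed form c^5 / (hbar * G^2).
print(f"{planck_density:.2e} kg/m^3")  # roughly 5.2e96 kg/m^3
```

This is the number the post builds up to: about 10^96 kg/m³, vastly denser than any everyday material.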


Jan 3, 2023

Boltzmann Brains: A Cosmological Horror Story

Posted by in categories: cosmology, neuroscience, physics

Boltzmann brains are perhaps one of the spookiest ideas in physics. A Boltzmann brain is a single, isolated human brain complete with false memories that spontaneously fluctuates into existence from the void. They’re the kind of thing you’d find in a campfire horror story. The big problem, however, is that a range of plausible cosmological models (including our current cosmology) predict that Boltzmann brains will exist. Even worse, these brains should massively outnumber “ordinary” conscious observers like ourselves. At every moment of your existence, it is more likely that you are an isolated Boltzmann brain, falsely remembering your past, than a human being on a rocky planet in a low-entropy universe.

In this video I explain where the idea of Boltzmann brains originated, and why they haunt modern cosmology.


Jan 3, 2023

The Light Clock: How Moving Clocks Run Slow

Posted by in categories: mathematics, physics

If you know anything about special relativity then you probably know that how fast you’re moving has an impact on how quickly time passes for you. What physics gives rise to this effect? Do you need to know some complicated mathematics in order to understand it?

It turns out that this effect, known as “time dilation”, can be very easily derived for a special kind of clock: a light clock. In this video, I consider a light clock moving through space and show how the postulates of special relativity entail that this moving clock runs slow.
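The light-clock argument the video describes can be checked numerically. A clock of height L ticks in t₀ = 2L/c at rest; seen from the ground, the light pulse traces a longer diagonal, and solving (ct/2)² = L² + (vt/2)² for t yields the time-dilation factor. The numbers below are illustrative, not from the video:

```python
import math

c = 3.0e8    # speed of light, m/s (rounded)
L = 1.0      # height of the light clock, m (illustrative)
v = 0.8 * c  # clock's speed relative to the observer (illustrative)

t_rest = 2 * L / c  # one tick for an observer riding with the clock

# For a stationary observer, each leg of the pulse's path is the
# hypotenuse of a triangle with sides L and v*t/2, giving:
gamma = 1 / math.sqrt(1 - (v / c) ** 2)
t_moving = gamma * t_rest

print(gamma)  # 5/3 at v = 0.8c: the moving clock's tick is stretched by gamma
```

At 80% of light speed the moving clock ticks 1.67 times slower, exactly as the postulates of special relativity require.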


Jan 3, 2023

What Is Matter (and Why Does It Matter)?

Posted by in categories: particle physics, quantum physics

Quantum Hylomorphism

What is most original in Koons’s book is his argument that quantum mechanics is best interpreted as vindicating the Aristotelian hylomorphist’s view of nature. To be sure, there have been others who have made such claims, not the least of them being Werner Heisenberg, one of the fathers of modern quantum physics. But Koons is the first prominent philosopher to make the case at book-length, in a way that combines expertise in the relevant philosophical ideas and literature with serious and detailed engagement with the scientific concepts. Future work on hylomorphism and the philosophy of quantum mechanics will have to take account of his arguments.

As Koons notes, there are several aspects of quantum mechanics that lend themselves to an Aristotelian interpretation. For example, there is Heisenberg’s famous principle that the position and momentum of a particle are indeterminate apart from interaction with a system at the middle-range level of everyday objects (such as an observer). There is physicist Richard Feynman’s “sum over histories” method, in which predictions must take account of every possible path a particle might take, not just its actual path. There are “entanglement” phenomena, in which the properties of a system of particles are irreducible to the particles considered individually or their spatial relations and relative velocity. There is quantum statistics, in which particles of the same kind are treated as fused and losing their individuality within a larger system.

Jan 3, 2023

Can a Powerful Enough Computer Work Out a Theory of Everything?

Posted by in categories: computing, physics

Year 2020


The rigorously proven No Free Lunch theorem shows that physicists will always be needed to determine the correct questions.

Jan 3, 2023

AI-ming for a Theory of Everything

Posted by in categories: particle physics, quantum physics, robotics/AI

Year 2020


Explorations into the nature of reality have been undertaken across the ages, and in the contemporary world disparate tools, from gedanken experiments [1–4] and experimental consistency checks [5,6] to machine learning and artificial intelligence, are being used to illuminate the fundamental layers of reality [7]. A theory of everything, a grand unified theory of physics and nature, has been elusive. The unification of various forces and interactions, from electricity and magnetism in James Clerk Maxwell’s seminal A Treatise on Electricity and Magnetism [8], through the electroweak unification of Weinberg, Salam and Glashow [9–11], to the establishment of the Standard Model including the QCD sector by Murray Gell-Mann and Richard Feynman [12,13], has progressed in a slow but surefooted manner, and we now have a few candidate theories of everything, primary among which is String Theory [14]. Unfortunately, we are still some way off from establishing various areas of the theory empirically. Chief among these is the concept of supersymmetry [15], an important part of String Theory. No evidence of supersymmetry was found in the first run of the Large Hadron Collider [16]. When the Large Hadron Collider discovered the Higgs boson in 2011–12 [17–19], some results were problematic for the Minimal Supersymmetric Standard Model (MSSM): the Higgs boson mass of 125 GeV is relatively large for the model and could only be attained with large radiative loop corrections from top squarks that many theoreticians considered “unnatural” [20]. In the absence of experiments that can test certain frontiers of physics, particularly due to energy constraints at the smallest scales, the importance of simulations and computational research cannot be underplayed.
Gone are the days when Isaac Newton could purportedly sit below an apple tree and infer classical gravity from an apple falling on his head. Today, increasing levels of computational power factor into new avenues of research in physics. For instance, M-Theory, introduced by Edward Witten in 1995 [21], is a promising approach to a unified model of physics that includes quantum gravity; it extends the formalism of String Theory. Machine-learning tools have lately been used to solve M-Theory geometries [22]: TensorFlow, a computing platform normally used for machine learning, helped find 194 equilibrium solutions for one particular type of M-Theory spacetime geometry [23–25].
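Searching for equilibrium solutions in this way boils down to finding stationary points of a scalar potential by gradient descent, which is exactly what an optimization framework like TensorFlow automates. As a toy illustration (with a made-up double-well potential, not the actual M-theory computation):

```python
# Toy sketch of the optimization pattern: finding an equilibrium of a
# scalar potential V(x) = (x^2 - 1)^2 by plain gradient descent.
# The potential and parameters are illustrative, not from the paper.
def grad_V(x):
    # dV/dx for V(x) = (x^2 - 1)^2
    return 4 * x * (x ** 2 - 1)

x = 0.5    # arbitrary starting guess
lr = 0.01  # learning rate
for _ in range(5000):
    x -= lr * grad_V(x)

print(x)  # converges to the equilibrium at x = 1 from this starting point
```

In the real computation the potential depends on hundreds of fields rather than one variable, but the descent-to-a-stationary-point idea is the same.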

Artificial intelligence has been one of the primary areas of interest in computational physics research. In 2020, Matsubara Takashi (Osaka University) and Yaguchi Takaharu (Kobe University), along with their research group, succeeded in developing technology that uses artificial intelligence to simulate phenomena for which we lack a detailed formula or mechanism [26]. The underlying step is the creation of a model from observational data, constrained to be consistent and faithful to the laws of physics. In this pursuit, the researchers used digital calculus as well as geometrical approaches such as those of Riemannian geometry and symplectic geometry.
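The point of the symplectic machinery is that simulations built on it respect conservation laws instead of drifting. A minimal sketch (a hand-written symplectic Euler integrator for a harmonic oscillator, not the authors’ learned model) shows the effect:

```python
# Minimal sketch of a symplectic (semi-implicit) Euler integrator for a
# harmonic oscillator, illustrating geometry-respecting numerics.
# All parameters are illustrative.
m, k, dt = 1.0, 1.0, 0.01
q, p = 1.0, 0.0  # initial position and momentum

def energy(q, p):
    return p * p / (2 * m) + k * q * q / 2

E0 = energy(q, p)
for _ in range(100_000):
    p -= dt * k * q  # update momentum first...
    q += dt * p / m  # ...then position with the new momentum

# Unlike naive Euler, the energy error stays bounded over long runs
# instead of growing without limit.
print(abs(energy(q, p) - E0))
```

A model trained with this kind of structure built in inherits the same long-term stability, which is why the geometric framing matters for physics simulation.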

Jan 3, 2023

AI Is Discovering Its Own ‘Fundamental’ Physics And Scientists Are Baffled

Posted by in categories: physics, robotics/AI

Year 2022


AI observed videos of lava lamps and inflatable air dancers and identified dozens of physics variables that scientists don’t yet understand.

Jan 3, 2023

Automated discovery of fundamental variables hidden in experimental data

Posted by in categories: physics, robotics/AI

Year 2022. What they find is a new set of physical variables identified by their artificial intelligence.


The determination of state variables to describe physical systems is a challenging task. A data-driven approach is proposed to automatically identify state variables for unknown systems from high-dimensional observational data.
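The core idea, recovering how many hidden state variables a system has from high-dimensional observations, can be loosely illustrated with a linear toy version (the paper uses neural networks on video; this sketch, with invented dimensions, just counts significant singular values):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: a system with 2 true state variables (say, angle and
# angular velocity), observed through a random 50-dimensional measurement.
n_samples, latent_dim, obs_dim = 500, 2, 50
latent = rng.normal(size=(n_samples, latent_dim))
mixing = rng.normal(size=(latent_dim, obs_dim))
observations = latent @ mixing

# The singular values of the centered data reveal its intrinsic dimension:
# only as many are non-negligible as there are underlying state variables.
s = np.linalg.svd(observations - observations.mean(0), compute_uv=False)
estimated_dim = int(np.sum(s > 1e-8 * s[0]))
print(estimated_dim)  # -> 2
```

Real systems mix their variables nonlinearly, which is why the authors need learned models rather than an SVD, but the question being answered is the same.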

Jan 3, 2023

Towards artificial general intelligence via a multimodal foundation model | Nature Communications

Posted by in category: robotics/AI

Year 2022


Artificial intelligence approaches inspired by human cognitive function usually have a single learned ability. The authors propose a multimodal foundation model that demonstrates cross-domain learning and adaptation for a broad range of downstream cognitive tasks.

Jan 3, 2023

Pre-exposure cognitive performance variability is associated with severity of respiratory infection

Posted by in category: neuroscience

While shedding and symptoms may not be closely linked in general, we found total shedding and symptom severity to be highly correlated (Pearson 0.81, Supplementary Fig. S1). Furthermore, with one exception, low shedding implied low symptom severity and vice versa. Thus, associations found between shedding and pre-inoculation biomarkers like the CPV are also present in symptom severity, although to a lesser degree. Therefore, in the rest of this section we report associations for the less noisy shedding measurements. The total variance explained (R²) by a linear model relating CPV score to shedding titers is R² = 0.77 (one minus the ratio of the residual variance of the linear regression to the variance of the titers). Furthermore, a logistic regression of total shedding onto the CPV score yielded a perfect discriminant between high and low shedders, defined respectively as those whose total shedding is above versus below the population median.
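The two analyses reported here, R² from a linear fit and a median split into high and low shedders, can be sketched on synthetic stand-in data (invented for illustration; not the study’s measurements):

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-ins: 18 participants, as in the study, with shedding
# roughly linear in CPV score plus noise (relationship invented here).
cpv_score = rng.normal(size=18)
shedding = 2.0 * cpv_score + 0.3 * rng.normal(size=18)

# Linear model; R^2 = 1 - SS_res / SS_tot, the "variance explained".
slope, intercept = np.polyfit(cpv_score, shedding, 1)
predicted = slope * cpv_score + intercept
ss_res = np.sum((shedding - predicted) ** 2)
ss_tot = np.sum((shedding - shedding.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot
print(round(r_squared, 2))

# Median split into high vs low shedders, as in the logistic analysis.
high_shedder = shedding > np.median(shedding)
```

A "perfect discriminant" in the paper’s sense means the fitted logistic boundary on CPV score separates the two median-split groups without error.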

The correlation between shedding titers and CPV scores is robust to reductions in the number of NCPT variables composing the score. In fact, the correlation between shedding and CPV increases to greater than 0.9 when only 6 NCPT measures are incorporated: digSym-time, digSym-correct, reaction-time, posner-tutorialTime, trail-time and trail-tutorialTime. Furthermore, the CPV score incorporating only the three basic NCPT measures digSym-time, digSym-correct and trail-time achieves a correlation level of approximately 0.7 (Fig. S2). We find that adding a fourth basic NCPT variable, reaction-time, to the CPV score computation does not appreciably affect this level of correlation. On the other hand, replacing either digSym-time or digSym-correct with posner-tutorialTime produces an increase in correlation to a level greater than 0.85.

To illustrate the role of the 18 individual NCPT variables in the CPV, we plot in Fig. 1e the univariate CPV scores for the two lowest shedding and the two highest shedding participants. This figure is extracted from Fig. S3 in the Supplementary that shows the sequence of univariate CPV scores for all 18 study participants. Superimposed on the plot of these variables is a boxplot indicating score sensitivity to session perturbation, determined by leave-one-out analysis where the univariate CPV was recomputed after successively leaving a single NCPT session out of each participant’s sequence (sans screening session). Figure 1e clearly shows that certain NCPT variables have significantly higher variability for the high shedders (lower two panels) than for the low shedders (top two panels). Note that the NCPT variable with highest variability (variable achieving peak score in each panel of Fig. 1e) differs across study participants.