
Exotic Pentaquark Particle Discovery & CERN’s Massive Data Center


July 2015, as you know, was all systems go for CERN's Large Hadron Collider (LHC). On a Saturday evening, proton collisions resumed at the LHC and the experiments began collecting data once again. With the observation of the Higgs already in our back pocket, it was time to turn up the dial and push the LHC into double-digit (TeV) energy levels. From a personal standpoint, I didn't blink an eye hearing that large amounts of data were being collected at every turn. But I was quite surprised to learn the amount being collected and processed each day: about one petabyte.

Approximately 600 million times per second, particles collide within the LHC. Each digitized summary is recorded as a "collision event". Physicists must then sift through the 30 petabytes or so of data produced annually to determine whether the collisions have thrown up any interesting physics. Needless to say — The Hunt is On!

The Data Center processes about one Petabyte of data every day — the equivalent of around 210,000 DVDs. The center hosts 11,000 servers with 100,000 processor cores. Some 6000 changes in the database are performed every second.
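The DVD comparison is easy to sanity-check. A minimal sketch, assuming a standard 4.7 GB single-layer DVD and 1 PB = 10^15 bytes:

```python
# Back-of-the-envelope check: one petabyte per day expressed in DVDs.
# Assumes 1 PB = 10**15 bytes and a 4.7 GB single-layer DVD.
petabyte = 10**15           # bytes processed per day
dvd_capacity = 4.7 * 10**9  # bytes on a single-layer DVD

dvds_per_day = petabyte / dvd_capacity
print(f"{dvds_per_day:,.0f} DVDs per day")  # about 212,766 — in line with ~210,000
```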

With experiments at CERN generating such colossal amounts of data, the Data Center stores it and then sends it around the world for analysis. CERN simply does not have the computing or financial resources to crunch all of the data on site, so in 2002 it turned to grid computing to share the burden with computer centres around the world. The Worldwide LHC Computing Grid (WLCG) – a distributed computing infrastructure arranged in tiers – gives a community of over 8000 physicists near real-time access to LHC data. The Grid runs more than two million jobs per day. At peak rates, 10 gigabytes of data may be transferred from its servers every second.

By early 2013 CERN had increased the power capacity of the centre from 2.9 MW to 3.5 MW, allowing the installation of more computers. In parallel, improvements in energy-efficiency implemented in 2011 have led to an estimated energy saving of 4.5 GWh per year.

Image: CERN

PROCESSING THE DATA (processing info via CERN): Subsequently, hundreds of thousands of computers from around the world come into action: harnessed in a distributed computing service, they form the Worldwide LHC Computing Grid (WLCG), which provides the resources to store, distribute, and process the LHC data. WLCG combines the power of more than 170 collaborating centres in 36 countries around the world, which are linked to CERN. Every day WLCG processes more than 1.5 million ‘jobs’, corresponding to a single computer running for more than 600 years.
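Those two figures together imply an average job length, which is worth working out. A rough check, assuming the "600 years" refers to total CPU time accumulated per day:

```python
# Rough check of the WLCG claim: 1.5 million jobs per day corresponding
# to a single computer running for more than 600 years.
SECONDS_PER_YEAR = 365.25 * 86400
jobs_per_day = 1.5e6
equivalent_years = 600

total_cpu_seconds = equivalent_years * SECONDS_PER_YEAR
avg_job_seconds = total_cpu_seconds / jobs_per_day
print(f"average job length: {avg_job_seconds / 3600:.1f} hours")  # about 3.5 hours
```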

Racks of servers at the CERN Data Centre (Image: CERN)

CERN DATA CENTER: The server farm in the 1450 m² main room of the DC (pictured) forms Tier 0, the first point of contact between experimental data from the LHC and the Grid. As well as servers and data storage systems for Tier 0 and further physics analysis, the DC houses systems critical to the daily functioning of the laboratory. (Image: CERN)

The data flow from all four experiments for Run 2 is anticipated to be about 25 GB/s (gigabytes per second):

  • ALICE: 4 GB/s (Pb-Pb running)
  • ATLAS: 800 MB/s – 1 GB/s
  • CMS: 600 MB/s
  • LHCb: 750 MB/s
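To put those per-experiment rates in perspective, each can be converted into a daily data volume. A quick sketch, taking the upper end where a range is quoted and using decimal units (1 GB = 10^9 bytes):

```python
# Daily data volume implied by each experiment's quoted Run 2 rate
# (upper ends used where a range is given; decimal GB/TB).
rates_gb_per_s = {
    "ALICE": 4.0,   # Pb-Pb running
    "ATLAS": 1.0,   # upper end of 800 MB/s - 1 GB/s
    "CMS": 0.6,
    "LHCb": 0.75,
}
SECONDS_PER_DAY = 86400

for name, rate in rates_gb_per_s.items():
    daily_tb = rate * SECONDS_PER_DAY / 1000  # terabytes per day
    print(f"{name}: {daily_tb:.0f} TB/day")
```

At these rates ALICE alone accumulates roughly a third of a petabyte per day during Pb-Pb running.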

In July, the LHCb experiment reported the observation of an entirely new class of particles:
Exotic Pentaquark Particles (Image: CERN)

Possible layout of the quarks in a pentaquark particle. The five quarks might be tightly bound (left). They might also be assembled into a meson (one quark and one antiquark) and a baryon (three quarks), weakly bound together (right).

The LHCb experiment at CERN’s LHC has reported the discovery of a class of particles known as pentaquarks. In short, “The pentaquark is not just any new particle,” said LHCb spokesperson Guy Wilkinson. “It represents a way to aggregate quarks, namely the fundamental constituents of ordinary protons and neutrons, in a pattern that has never been observed before in over 50 years of experimental searches. Studying its properties may allow us to understand better how ordinary matter, the protons and neutrons from which we’re all made, is constituted.”

Our understanding of the structure of matter was revolutionized in 1964 when American physicist Murray Gell-Mann proposed that a category of particles known as baryons, which includes protons and neutrons, are composed of three fractionally charged objects called quarks, and that another category, mesons, are formed of quark-antiquark pairs. This quark model also allows for the existence of other quark composite states, such as pentaquarks composed of four quarks and an antiquark.

Until now, however, no conclusive evidence for pentaquarks had been seen.
Earlier experiments that have searched for pentaquarks have proved inconclusive. The next step in the analysis will be to study how the quarks are bound together within the pentaquarks.

“The quarks could be tightly bound,” said LHCb physicist Liming Zhang of Tsinghua University, “or they could be loosely bound in a sort of meson-baryon molecule, in which the meson and baryon feel a residual strong force similar to the one binding protons and neutrons to form nuclei.” More studies will be needed to distinguish between these possibilities, and to see what else pentaquarks can teach us!

August 18th, 2015
CERN Experiment Confirms Matter-Antimatter CPT Symmetry for Light Nuclei, Antinuclei (Image: CERN)

Days after scientists at CERN’s Baryon-Antibaryon Symmetry Experiment (BASE) measured the mass-to-charge ratio of a proton and its antimatter particle, the antiproton, the ALICE experiment at the European organization reported similar measurements for light nuclei and antinuclei.

The measurements, made with unprecedented precision, add to growing scientific data confirming that matter and antimatter are true mirror images.

Antimatter shares the same mass as its matter counterpart, but has opposite electric charge. The electron, for instance, has a positively charged antimatter equivalent called the positron. Scientists believe that the Big Bang created equal quantities of matter and antimatter 13.8 billion years ago. However, for reasons yet unknown, matter prevailed, creating everything we see around us today — from the smallest microbe on Earth to the largest galaxy in the universe.

Last week, in a paper published in the journal Nature, researchers reported a significant step toward solving this long-standing mystery of the universe. According to the study, 13,000 measurements over a 35-day period show, with unparalleled precision, that protons and antiprotons have identical mass-to-charge ratios.

The experiment tested a central tenet of the Standard Model of particle physics, known as Charge, Parity, and Time Reversal (CPT) symmetry. If CPT symmetry holds, a system remains unchanged when three fundamental properties are reversed: charge, parity (a mirror-image flip in spatial configuration), and the direction of time.

The latest study takes the research on this symmetry further. The ALICE measurements show that CPT symmetry holds true for light nuclei such as deuterons — a hydrogen nucleus with an additional neutron — and antideuterons, as well as for helium-3 nuclei — two protons plus a neutron — and antihelium-3 nuclei. The experiment, which also analyzed the curvature of these particles’ tracks in the ALICE detector’s magnetic field and their time of flight, improves on the existing measurements by a factor of up to 100.

IN CLOSING..

A violation of CPT would not only hint at the existence of physics beyond the Standard Model — which isn’t complete yet — it would also help us understand why the universe, as we know it, is completely devoid of antimatter.

UNTIL THEN…

ORIGINAL ARTICLE POSTING via Michael Phillips LinkedIn Pulse @

Slow death of Universe confirmed with precision

  • The universe radiates only half as much energy as 2 billion years ago
  • New findings establish cosmos’ decline with unprecedented precision


From CNN
—The universe came in with the biggest bang ever. But now, with a drooping fizzle, it is in its swan song. The conclusion of a new astronomical study pulls no punches on this: “The Universe is slowly dying,” it reads.

Astronomers have believed as much for years, but the new findings establish the cosmos’ decline with unprecedented precision. An international team of 100 scientists used data from the world’s most powerful telescopes — based on land and in space — to study energy coming from more than 200,000 galaxies in a large sliver of the observable universe. [Full story below or at CNN.com]…

Based on those observations, they have confirmed the cosmos is radiating only half as much energy as it was 2 billion years ago. The astronomers published their study on Monday on the website of the European Southern Observatory.

The team checked the energy across a broad spectrum of lightwaves and other electromagnetic radiation and says it is fading through all wavelengths, from ultraviolet to far infrared.

Analysis across many wavelengths shows the universe’s electromagnetic energy output is dropping.

‘A cold, dark and desolate place’

At the ripe old age of nearly 13.8 billion years, the universe has arrived in its sunset years.

“The universe has basically sat down on the sofa, pulled up a blanket and is about to nod off for an eternal doze,” said astronomer Simon Driver, who led the team.

Death does not mean the universe will go away. It will still be there, but its stars and all else that produces light and stellar fire will fizzle out.

“It will just grow old forever, slowly converting less and less mass into energy as billions of years pass by until eventually, it will become a cold, dark and desolate place, where all of the lights go out,” said astronomer Luke Davies.

But don’t cry for the universe anytime soon. Astrophysicists say this will take trillions of years.

Bursting with energy

Go all the way back to its birth, and you find a vast contrast. In an infinitesimal fraction of a second, our entire cosmos blasted into existence in the Big Bang.

And the totality of the energy and mass in the universe originates from that moment, astronomers say.

Since that natal explosion, the cosmos has generated other sources of brilliant radiation — most notably stars — by converting some of the mass into energy when extreme gravity causes matter to burst into nuclear fusion.

But the universe is speckled by radiance from seething gas clouds, supernovas and, most spectacularly, the discs of hot matter that rotate around black holes to form quasars, which can be as bright as whole galaxies.

“While most of the energy sloshing around in the universe arose in the aftermath of the Big Bang, additional energy is constantly being generated by stars as they fuse elements like hydrogen and helium together,” Driver said.

Fizzling into space

The size and number of those sources of radiation so boggle the mind that it might be hard to imagine that the entirety of that vividness appears to be fading, as its energy flies off through space.

“This new energy is either absorbed by dust as it travels through the host galaxy, or escapes into intergalactic space and travels until it hits something, such as another star, a planet, or, very occasionally, a telescope mirror,” Driver said.

His team observed it from seven of the world’s mammoth telescopes spread across Australia, the United States, Chile and Earth’s orbit. Many of the instruments specialize in receiving certain wavelengths of light and other electromagnetic waves.

Compiling the data from the collective wavelengths gives the scientists a more complete picture from across a broad spectrum of energy.

Their findings on the universe’s energy slump were part of the larger Galaxy And Mass Assembly, or GAMA, project to study how galaxies are formed. It has mapped out the position of 4 million galaxies so far.

From cameras to computers, new material could change how we work and play

Serendipity has as much a place in science as in love. That’s what Northeastern physicists Swastik Kar and Srinivas Sridhar found during their four-year project to modify graphene, a stronger-than-steel infinitesimally thin lattice of tightly packed carbon atoms. Primarily funded by the Army Research Laboratory and Defense Advanced Research Projects Agency, or DARPA, the researchers were charged with imbuing the decade-old material with thermal sensitivity for use in infrared imaging devices such as night-vision goggles for the military.

What they unearthed, published Friday in the journal Science Advances, was so much more: an entirely new material spun out of boron, nitrogen, carbon, and oxygen that shows evidence of magnetic, optical, and electrical properties as well as DARPA’s sought-after thermal ones. Its potential applications run the gamut: from 20-megapixel arrays for cellphone cameras to photo detectors to atomically thin transistors that when multiplied by the billions could fuel computers.

Read more

Universe might contain millions of black holes

[from Engadget]

Black holes are, by definition, impossible to see by conventional methods and are often further obscured by thick blankets of dust or gas. But that’s not an issue for NASA’s Nuclear Spectroscopic Telescope Array (NuSTAR). It can peek through the obscuring layers and monitor the black holes via the high-energy X-rays that they emit. And, after a recent survey that spotted five previously unknown supermassive black holes in the centers of various galaxies, NASA researchers now think there could be millions of them dotting the Universe like the holes of an intergalactic colander.

“Thanks to NuSTAR, for the first time, we have been able to clearly identify these hidden monsters that are predicted to be there, but have previously been elusive because of their surrounding cocoons of material,” said George Lansbury of Durham University in a statement. “Although we have only detected five of these hidden supermassive black holes, when we extrapolate our results across the whole universe, then the predicted numbers are huge and in agreement with what we would expect to see.” The team’s research has been accepted for publication in The Astrophysical Journal.

Strings Are Dead

In 2014, I submitted my paper “A Universal Approach to Forces” to the journal Foundations of Physics. The 1999 Nobel Laureate, Prof. Gerardus ‘t Hooft, editor of this journal, had suggested that I submit this paper to the journal Physics Essays.

My previous 2009 submission, “Gravitational acceleration without mass and noninertia fields,” to Physics Essays had taken 1.5 years to review and be accepted. Therefore, I decided against Prof. Gerardus ‘t Hooft’s recommendation, as I estimated that all six papers (now published as Super Physics for Super Technologies) would take up to 10 years and/or $20,000 to publish in peer-reviewed journals.

Prof. Gerardus ‘t Hooft had brought up something interesting in his 2008 paper “A locally finite model for gravity” that “… absence of matter now no longer guarantees local flatness…” meaning that accelerations can be present in spacetime without the presence of mass. Wow! Isn’t this a precursor to propulsion physics, or the ability to modify spacetime without the use of mass?

As far as I could determine, he didn’t pursue this from the perspective of propulsion physics. A year earlier, in 2007, I had discovered the massless formula for gravitational acceleration, g=τc^2, published in the Physics Essays paper referred to above. In effect, g=τc^2 was the mathematical solution to Prof. Gerardus ‘t Hooft’s “… absence of matter now no longer guarantees local flatness…”
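For scale, the formula can be rearranged as τ = g/c² and evaluated with Earth-surface gravity. This is a numerical illustration only; the physical interpretation of τ is the author's:

```python
# Numerical illustration of the massless relation g = tau * c**2,
# rearranged as tau = g / c**2 and evaluated at the Earth's surface.
# (Illustrative arithmetic only; the meaning of tau is the author's.)
c = 299_792_458.0   # speed of light, m/s
g = 9.80665         # standard gravity, m/s**2

tau = g / c**2      # units: 1/m
print(f"tau = {tau:.3e} per metre")  # about 1.091e-16
```

Dimensionally, τ must carry units of inverse metres for the relation to yield an acceleration.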

Prof. Gerardus ‘t Hooft used string theory to arrive at his inference. Could he empirically prove it? No, not with strings. It took a different approach, numerical modeling within the context of Einstein’s Special Theory of Relativity (STR), to derive a mathematical solution to Prof. Gerardus ‘t Hooft’s inference.

In 2013, I attended Dr. Brian Greene’s Gamow Memorial Lecture, held at the University of Colorado Boulder. If I heard him correctly, the number of strings or string states being discovered has been increasing, and was now in the 10^500 range.

I find these two encounters telling. While not rigorously proved, I infer that (i) string theories are unable to take us down a path that can be empirically proven, and (ii) they are open-ended, i.e. they can be used to propose any specific set of outcomes from any specific set of inputs. The problem with this is that you then have to find a theory for why that specific set of inputs. I would have thought that this would be heartbreaking for theoretical physicists.

In 2013, I presented the paper “Empirical Evidence Suggest A Need For A Different Gravitational Theory” at the American Physical Society’s April conference held in Denver, CO. There I met some young physicists and asked them about working on gravity modification. One of them summarized it very well: “Do you want me to commit career suicide?” This explains why many of our young physicists continue to seek employment in the field of string theories, where, unfortunately, the hope of empirically testable findings (i.e. winning the Nobel Prize) is next to nothing.

I think string theories are wrong.

Two transformations or contractions are present with motion, Lorentz-FitzGerald Transformation (LFT) in linear motion and Newtonian Gravitational Transformations (NGT) in gravitational fields.

The fundamental assumption or axiom of strings is that they expand when their energy (velocity) increases. This axiom (let’s name it the Tidal Axiom) appears to have its origins in tidal gravity, attributed to Prof. Roger Penrose: macro bodies elongate as they fall into a gravitational field. To be consistent with NGT, the atoms and elementary particles would contract in the direction of this fall. However, to be consistent with tidal gravity’s elongation, the distances between atoms in this macro body would increase at a rate consistent with the accelerations and velocities experienced by the various parts of the body. That is, as the atoms get flatter, the distances apart get longer. Therefore, for a string to be consistent with LFT and NGT, it would have to contract, not expand. One suspects that this Tidal Axiom’s inconsistency with LFT and NGT has led to an explosion of string theories, each trying to explain Nature with no joy. See my peer-reviewed 2013 paper New Evidence, Conditions, Instruments & Experiments for Gravitational Theories, published in the Journal of Modern Physics, for more.

The vindication of this contraction is the discovery of the massless formula for gravitational acceleration g=τc^2 using Newtonian Gravitational Transformations (NGT) to contract an elementary particle in a gravitational field. Neither quantum nor string theories have been able to achieve this, as quantum theories require point-like inelastic particles, while strings expand.

What worries me is that it takes about 70 to 100 years for a theory to evolve into commercially viable consumer products. Lasers are good examples. So, if we are tying up our brightest scientific minds with theories that cannot lead to empirical validation, can we be the primary technological superpower 100 years from now?

The massless formula for gravitational acceleration, g=τc^2, shows us that new theories on gravity and force fields will be similar to General Relativity, which is only a gravity theory. The mass source in these new theories will be replaced by field and particle motions, not mass or momentum exchange. See my Journal of Modern Physics paper referred to above on how to approach this, and Super Physics for Super Technologies on how to accomplish this.

Therefore, given that the primary axiom of string theories, the Tidal Axiom, is incorrect, it is vital that we recognize that any mathematical work derived from string theories is invalidated. And given that string theories are particle-based theories, this mathematical work is not transferable to the new relativity-type force field theories.

I forecast that both string and quantum gravity theories will be dead by 2017.

When I was seeking funding for my work, I looked at the Broad Agency Announcements (BAAs) for a category that includes gravity modification or interstellar propulsion. To my surprise, I could not find this category in any of our research organizations, including DARPA, NASA, National Science Foundation (NSF), Air Force Research Lab, Naval Research Lab, Sandia National Lab or the Missile Defense Agency.

So what are we going to do when our young graduates do not want to or cannot be employed in string theory disciplines?

(Originally published in the Huffington Post)
