
Nov 18, 2012

The Kline Directive: Technological Feasibility (2d)

Posted in categories: cosmology, defense, education, engineering, general relativity, particle physics, philosophy, physics, policy, space

To achieve interstellar travel, the Kline Directive instructs us to be bold, to explore what others have not, to seek what others will not, to change what others dare not. To extend the boundaries of our knowledge, to advocate new methods, techniques and research, and to sponsor change, not the status quo, on five fronts: Legal Standing, Safety Awareness, Economic Viability, Theoretical-Empirical Relationships, and Technological Feasibility.

In this post on technological feasibility, I point out further mistakes in physics so that we are aware of the kinds of mistakes we are making. I hope this will facilitate the changes required in our understanding of the physics of the Universe and thereby speed up the discovery of the new physics required for interstellar travel.

The scientific community recognizes two alternative models for force. Note that I use the term recognizes deliberately, because that is how science progresses. This is necessarily different from how Nature actually operates, or Nature's method of operation. Nature has a method of operating that is consistent with all of Nature's phenomena, known and unknown.

If we are willing to admit that we don't know all of Nature's phenomena, that our knowledge is incomplete, then it is only logical that our recognition of Nature's method of operation is always incomplete. Therefore, scientists propose theories about Nature's methods, and as science progresses we revise those theories. This leads to the inference that our theories can never be an exact representation of Nature's methods, because our knowledge is incomplete. We can come close, but we can never be sure we 'got it'.

With this understanding that our knowledge is incomplete, we can now proceed. The scientific community recognizes two alternative models for force: Einstein's spacetime continuum, and quantum mechanics' exchange of virtual particles. String theory borrows from quantum mechanics and therefore requires that force be carried by some form of particle.

Einstein's spacetime continuum requires only 4 dimensions, though other physicists have added more in attempts to unify the forces. String theories have required up to 26 dimensions to solve their equations.

However, the discovery of the empirically validated g = τc² proves once and for all that gravity and gravitational acceleration are a 4-dimensional problem. Therefore, any hypothesis or theory that requires more than 4 dimensions to explain gravitational force is wrong.
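The post does not define τ here, but if g = τc² is read in the weak-field sense, with τ as the radial gradient of gravitational time dilation, the arithmetic does reproduce ordinary surface gravity. A minimal sketch under that assumption (the reading of τ is mine, not stated in the post):

```python
# Weak-field sketch, assuming tau is the radial gradient of
# gravitational time dilation. Under that reading, g = c^2 * dtau/dr
# reduces to the familiar Newtonian surface gravity GM/r^2.
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M = 5.972e24       # Earth mass, kg
R = 6.371e6        # Earth mean radius, m
c = 2.998e8        # speed of light, m/s

# Weak-field dilation factor relative to a distant clock:
#   tau(r) = 1 - G*M / (r * c^2)
# Its radial gradient, taken analytically:
dtau_dr = G * M / (R**2 * c**2)

g = c**2 * dtau_dr          # the c^2 factors cancel, leaving GM/R^2
print(f"g = {g:.2f} m/s^2")  # close to the measured 9.8 m/s^2
```

Note that the c² cancels, which is the point of the check: in the weak field this reading of g = τc² is numerically indistinguishable from Newtonian gravity.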

Further, I have been able to do a priori what no other theory has been able to do: unify gravity and electromagnetism. Again working with only 4 dimensions, the spacetime continuum-like, empirically verified Non Inertia (Ni) Fields show that non-nuclear forces are not carried by the exchange of virtual particles. And if non-nuclear forces are not carried by the exchange of virtual particles, why should Nature suddenly change her method of operation and be different for nuclear forces? Virtual particles are mathematical conjectures that were a convenient mathematical approach in the context of the Standard Model.

Sure, there is always that 'smart' theoretical physicist who will convert a continuum-like field into a particle-based field, but a particle-continuum duality does not answer the question: what is Nature's method? So we come back to a previous question: is the particle-continuum duality a mathematical conjecture or a mathematical construction? Also note that, now that we know of g = τc², it is not a discovery on the part of other hypotheses or theories if they claim to be able to show or reconstruct it a posteriori; this is known as back fitting.

Our theoretical physicists have to ask themselves many questions. Are they trying to show how smart they are? Or are they trying to figure out Nature’s methods? How much back fitting can they keep doing before they acknowledge that enough is enough? Could there be a different theoretical effort that could be more fruitful?

The other problem with string theories is that they do not converge on a single description of the Universe; they diverge. The more they are studied, the more variations and versions are discovered. The reason for this is very clear: string theories are based on incorrect axioms. The primary incorrect axiom is that particles expand when their energy is increased.

The empirical Lorentz-FitzGerald transformations require that length contract as velocity increases. However, the eminent Roger Penrose showed in the 1950s that macro objects elongate as they fall into a gravitational field. The portion of the macro body closer to the gravitational source falls slightly faster than the portion further away, and therefore the macro body elongates. This effect is termed tidal gravity.
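The tidal-elongation argument above is easy to check numerically: the near end of a falling body accelerates slightly more than the far end, and the difference across a body of length L at distance r from a mass M is approximately 2GML/r³. A minimal sketch with illustrative Earth values (the 10 m body length is my example, not from the post):

```python
# Tidal-gravity sketch: the end of a falling body nearer the source
# accelerates slightly more than the far end, stretching the body.
G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
M = 5.972e24           # Earth mass, kg
r = 6.371e6            # distance of body's centre from Earth's centre, m
L = 10.0               # body length along the fall direction, m (illustrative)

def g_at(d):
    """Newtonian gravitational acceleration at distance d."""
    return G * M / d**2

# Exact acceleration difference across the body (near end minus far end)
delta_g = g_at(r - L / 2) - g_at(r + L / 2)

# Leading-order tidal approximation: 2*G*M*L / r^3
approx = 2 * G * M * L / r**3

print(f"exact:  {delta_g:.3e} m/s^2")
print(f"approx: {approx:.3e} m/s^2")
```

The two numbers agree to well under a percent at this distance, confirming that the stretch is a differential-acceleration effect, tiny near Earth but growing as r shrinks.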

In reality, as particles contract in length per Lorentz-FitzGerald, the distance between these particles elongates due to tidal gravity. This macro-scale expansion has been carried into theoretical physics at the elementary level of string particles, as the claim that the particles themselves elongate, which is incorrect. That is, even theoretical physicists make mistakes.
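The contraction side of the argument above is the standard Lorentz-FitzGerald factor √(1 − v²/c²). A short sketch showing how a body's lab-frame length shrinks with speed (the metre rule and the sample speeds are my illustration):

```python
# Lorentz-FitzGerald contraction sketch: a body's length measured in
# the lab frame shrinks by sqrt(1 - v^2/c^2) as its speed rises.
import math

c = 2.998e8            # speed of light, m/s
rest_length = 1.0      # rest length of the body, m (illustrative)

def contracted(v):
    """Lab-frame length of the body moving at speed v."""
    return rest_length * math.sqrt(1.0 - (v / c)**2)

for frac in (0.1, 0.5, 0.9, 0.99):
    print(f"v = {frac:.2f}c -> length = {contracted(frac * c):.4f} m")
```

At 0.5c the rule is already down to about 87% of its rest length, and the contraction steepens sharply as v approaches c; the tidal stretch of the previous paragraphs acts on the distances between such bodies, not on the bodies' own contracted lengths.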

Expect string theories to be dead by 2017.

Previous post in the Kline Directive series.

Next post in the Kline Directive series.

—————————————————————————————————

Benjamin T Solomon is the author & principal investigator of the 12-year study into the theoretical & technological feasibility of gravity modification, titled An Introduction to Gravity Modification, to achieve interstellar travel in our lifetimes. For more information visit iSETI LLC, Interstellar Space Exploration Technology Initiative.

Solomon is inviting all serious participants to his LinkedIn Group Interstellar Travel & Gravity Modification.