
Zero-point field

From Wikipedia, the free encyclopedia
This is an old revision of this page, as edited by PowerUserPCDude (talk | contribs) at 01:03, 22 July 2009 (Added {{verylong}} and {{wikify}} tags to article. using Friendly). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.

In quantum field theory, the zero-point field (or zpf) is the lowest energy state of a field, i.e. its ground state, whose energy is nonzero.[1] This phenomenon gives the quantum vacuum a complex structure, which can be probed experimentally; see, for example, the Casimir effect. The term "zero-point field" is sometimes used as a synonym for the vacuum state of an individual quantized field. The electromagnetic zero-point field is loosely pictured as a sea of background electromagnetic energy that fills the vacuum of space. It is often regarded as merely a curious consequence of the Heisenberg uncertainty principle, which implies that the lowest allowable "average" energy level in a harmonic oscillator mode is not zero but ħω/2, where ω is the characteristic angular frequency of the oscillator. However, some researchers argue that the quantized electromagnetic field exists independently of the statistical uncertainty involved in the non-commutative act of measurement, and that it is also fully consistent with changes in the field that coincide with the act of measurement.

Overview

It is believed that an electromagnetic field exists in a vacuum even when the temperature of the surrounding material is reduced towards absolute zero.[2] The existence of such a zero-point field has been confirmed experimentally by the Casimir experiment, i.e. the measurement of the attractive force between two parallel plates in an evacuated, near-zero-temperature enclosure.[2] That force is found to be proportional to the inverse fourth power of the separation of the plates; it has been shown that such a result can only be produced by a zero-point field whose spectral energy density has a frequency dependence of ρ(ν) = Kν³, where K is a constant of proportionality.[2] It has been assumed until recently, though without any experimental evidence, that there are zero-point energies for the strong and weak forces as well as the electromagnetic force. More recently it has been argued that the electromagnetic zero-point field and the electromagnetic force carrier (the photon) are probably fundamental to all three forces, because the electromagnetic force (expressed by the Lorentz force equation) does not require mass.
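The inverse-fourth-power law just described can be made concrete with the standard ideal-plate Casimir pressure formula, P = π²ħc/(240 d⁴). The following is a minimal sketch, assuming perfectly conducting plates at zero temperature (the idealized case, not the exact experimental configuration cited above):

```python
import math

HBAR = 1.054571817e-34  # reduced Planck constant, J*s
C = 2.99792458e8        # speed of light, m/s

def casimir_pressure(d):
    """Attractive Casimir pressure (N/m^2) between ideal parallel
    plates separated by d metres: P = pi^2 * hbar * c / (240 * d^4)."""
    return math.pi**2 * HBAR * C / (240 * d**4)

# At a 1 micrometre gap the pressure is roughly 1.3 mPa, and halving
# the gap multiplies the pressure by 2^4 = 16, the inverse-fourth-power law.
print(f"{casimir_pressure(1e-6):.2e} Pa")                 # ~1.30e-03 Pa
print(casimir_pressure(0.5e-6) / casimir_pressure(1e-6))  # 16.0
```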

History

Quantum mechanics predicts the existence of what are usually called zero-point energies for the strong, the weak and the electromagnetic interactions, where zero-point refers to the energy of the system at temperature T = 0, or the lowest quantized energy level of a quantum mechanical system. Specifically, in 1900, Max Planck derived the formula for the average energy of a single "energy radiator", i.e. a vibrating atomic unit, as:

    ε = hν / (e^(hν/kT) − 1)

Here, h is Planck's constant, ν is the frequency, k is Boltzmann's constant, and T is the temperature.

In 1913, using this formula as a basis, Albert Einstein and Otto Stern published a paper of great significance in which they suggested for the first time the existence of a residual energy that all oscillators have at absolute zero. They called this residual energy Nullpunktsenergie (German), later translated as zero-point energy. They carried out an analysis of the specific heat of hydrogen gas at low temperature, and concluded that the data are best represented if the vibrational energy is taken to have the form:[3]

    ε = hν / (e^(hν/kT) − 1) + hν/2

Thus, according to this expression, even at absolute zero the energy of an atomic system has the value ½hν.[4] Although the term zero-point energy applies to all three of these interactions in nature, customarily it is used in reference only to the electromagnetic case.[5] Because the zero-point field has the property of being Lorentz invariant, the zero-point field becomes detectable only when a body is accelerated through space.[5]
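The two expressions above can be compared numerically. The following sketch (the frequency chosen is an illustrative molecular-vibration value, not one from the Einstein-Stern analysis) shows that Planck's 1900 form vanishes as T → 0 while the Einstein-Stern form approaches the residual hν/2:

```python
import math

H = 6.62607015e-34   # Planck constant, J*s
KB = 1.380649e-23    # Boltzmann constant, J/K

def planck_energy(nu, T):
    """Planck's 1900 mean oscillator energy at frequency nu and temperature T."""
    return H * nu / math.expm1(H * nu / (KB * T))

def einstein_stern_energy(nu, T):
    """Einstein-Stern 1913 form: Planck's expression plus the zero-point term h*nu/2."""
    return planck_energy(nu, T) + H * nu / 2

nu = 1e13  # Hz, an illustrative vibrational frequency
for T in (300.0, 30.0, 3.0):
    print(T, planck_energy(nu, T), einstein_stern_energy(nu, T))
# As T falls, the Planck energy tends to zero while the Einstein-Stern
# energy levels off at the zero-point value h*nu/2 ~ 3.3e-21 J.
```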

Experiments in 1992 demonstrated that the familiar spontaneous emission process in atoms may be regarded as stimulated emission by zero-point field radiation.[6] In recent years, it has been suggested that the electromagnetic zero-point field is not merely an artifact of quantum mechanics, but a real entity with major implications for gravity, astrophysics and technology. This view is shared by a number of researchers, including Boyer (1980), McCrea (1986), Puthoff (1987), and Rueda and Haisch.[7][8][9]

Zero-point energy and conservation of energy

Traditionally it has been assumed that the electromagnetic zpf energy density for each quantum space equals the sum of the quantum oscillatory energy at all possible non-interference-producing wavelengths in each of the three spatial dimensions, down to a shortest wavelength of the Planck length. Using this historical measure of energy density, it has been estimated that one cubic metre of space contains enough zero-point energy to boil all of the oceans of the world.
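The boil-the-oceans claim is an order-of-magnitude statement; its arithmetic can be sketched as follows, where every input figure (ocean mass, heat capacity, latent heat, and the Planck-cutoff zpf density of ~10^113 J/m³) is an assumed round number, not a sourced value:

```python
import math

# Rough arithmetic behind the claim above. Every figure here is an
# order-of-magnitude assumption, not a sourced value.
OCEAN_MASS = 1.4e21      # kg of seawater, approximate
HEAT_CAPACITY = 4.2e3    # J/(kg*K), treating seawater as water
DELTA_T = 85.0           # K, warming from ~288 K to the boiling point
LATENT_HEAT = 2.26e6     # J/kg, heat of vaporization of water

energy_to_boil = OCEAN_MASS * (HEAT_CAPACITY * DELTA_T + LATENT_HEAT)
print(f"energy to boil the oceans ~ {energy_to_boil:.1e} J")  # ~3.7e27 J

# Planck-cutoff estimates of the zpf energy density are of order 1e113 J/m^3,
# so one cubic metre would exceed the requirement by ~85 orders of magnitude.
ZPF_DENSITY = 1e113  # J/m^3, assumed Planck-cutoff estimate
print(f"excess ~ 1e{int(math.log10(ZPF_DENSITY / energy_to_boil))}")
```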

However, the historical analysis of the zpf energy density just described appears to contradict the first law of thermodynamics and our understanding of the cosmology of the universe. There is extensive physical evidence that the universe has expanded from an origin containing essentially no space and infinite energy density, in the event called the Big Bang. If our universe is defined as all that has the potential of being known to us and interacted with, then it is a "closed system". A closed system retains causality because its total energy is finite and always conserved. This does not contradict the state of the universe at time t = 0, because infinite energy density in zero occupied space does not imply infinite energy for the universe. The historical analysis of the zpf energy density used in the cubic-metre example does not account for the expansion of the universe: it simply increments this expanding space over time and assigns each new quantum space the maximum energy density.

There is a major drive in physics to create a more realistic zpf energy density model that still allows for causality and conservation of energy in the universe. There is substantial evidence in quantum physics, via the de Broglie relations, the Casimir effect, and the Zitterbewegung action of electrons that this field of energy acts as an energy intermediary in the dynamic actions of all particles. Electrons orbiting a nucleus, as one specific example, may use this energy source to move up in an orbit, and then contribute back to this energy source when they relax back into a lower orbit around the nucleus. The de Broglie relations show that the wavelength is inversely proportional to the momentum of an electron and that the frequency is directly proportional to the electron's kinetic energy. As long as the electron does not increase its average kinetic energy over time through acceleration or heating of the atom as a whole, then this wave-like movement of electrons can be seen as a direct interaction of electrons with the zpf.

A potentially promising area for research is the observation that if particles become more energetic as they are heated or accelerated, their gravitational field increases. Changes in gravity can perhaps be attributed to a change in a spherical zpf energy density gradient surrounding an accelerated or decelerated massive particle. This dynamic action is just an extension of the static orbiting-electron wave model to a dynamic model in which the "average" kinetic energy of a particle no longer remains constant over time and energy is drawn in from the quantum vacuum but not returned. If a massive particle's ground state is defined as its reference frame at the instant of its creation, then when a particle or body returns to this reference frame, or ground state, from an accelerated state, that energy is returned to the quantum vacuum as a decrease in gravity surrounding the particle. This would be in accord with the rules of gravity on accelerated bodies as we know them, and most importantly, it maintains conservation of the combined energy of both particles and the zpf, while still allowing for dynamic interaction between the two.

Quantum fluctuations versus quantum pathways

The essential character of the zpf was originally described by John Archibald Wheeler[10] as a foamy sea of constantly emerging virtual particles and anti-particles of immense intrinsic energy, which would come into existence spontaneously and then annihilate themselves. This description originated because it was the only way to consolidate the enormous projected energy density of the quantum vacuum. Because the mathematics of oscillators was the origin of our understanding of the zpf it has been described as "fluctuating" in the absence of any outside force. But there seems to be a fundamental confusion about this from which subsequent errors in logic arise. An individual oscillator can fluctuate, just as an electron's orbital wave motion can fluctuate between two different orbits, however the "rate" of fluctuation, i.e. a quantum oscillator's characteristic frequency, should never change unless a photon, the electromagnetic force carrier, is either absorbed or emitted by the quantum oscillator. If a point in space could actually be brought to a temperature of zero kelvins the flow of photonic energy between quantum oscillators would stop but each quantum oscillator would still "fluctuate" at a rate that never changes. The energy from one quantum oscillator would not propagate to other quantum spaces. Similarly, at that temperature an electron would maintain its fluctuation between the same two orbits about the nucleus without any change. However from an outside viewpoint the situation would be seen as a static and non-fluctuating one because photons do not escape or enter the quantum space between the two electron orbits.

A static, but plastic quantum foam is a better analogy to the character of the quantum vacuum, and not the kind of "spontaneously" changing foam Wheeler described. It should be remembered that each quantum space represents the sum of quantum harmonic oscillator energy in each of the three spatial degrees of freedom, and that each of these three degrees of freedom can act separately, but in coordination with the other two to provide a total energy for that quantum space which is always conserved. If a photon with a specific energy and direction enters and then exits a quantum space these three degrees of freedom allow for a change in the magnitude of the energy in each of the three spatial directions while conserving the total magnitude of harmonic energy in that space - an increase in magnitude of harmonic energy in one dimension can be compensated by a decrease in the other two dimensions. The orientation and size of these changes will correlate to a change in direction of any particle passing through that quantum space, and to the magnitude, or state of the existing harmonic energy in the three dimensional degrees of freedom for that quantum space at the time the photon enters. Similarly, any particle encountering a quantum space which has recently had a particle pass through it will be affected by that previous particle, even though they are not coincident in time.

For example, in the cold reaches of space between galaxies photons from distant galaxies arrive rarely. If by chance a rare photon passes through one of these cold dark spaces it will leave a quantum signature on the oscillatory energy of each quantum space it passes through. If a quantum space has experienced a photon passing through it and no further photons pass through, then that area of space will retain a memory of the last photon it experienced. That space will then exhibit a residual force which has a magnitude and a direction that resides in the memory of that space. This force can be observed in the Zitterbewegung, or jittery action, of electrons, which in principle can be extended to all particles. In effect, a pathway has been produced by this single photon. The fact that this pathway cannot be maintained in its unaltered form after measuring it, as the Heisenberg uncertainty principle predicts, does not alter the fact that this pathway is retained in space until the next photon passing through creates an interference with it. Acts of measurement represent an exchange of photons between the "observer" and the "observed" and are synonymous with local changes in energy. The "observed" can be an actual particle or just the pathway in space created by the particle. The appearance of fluctuation is actually the transformation of energy and information as it travels through space, but the total energy and information of the universe are always maintained. So it is seen that this quantum foam is not the kind of foam that springs back, but is more like milk foam on top of a cup of cappuccino - a straw can be pushed through the foam and a hole in the foam will remain for a time after the straw is withdrawn. But in the case of quantum foam the impression left behind is not of a hole, but rather the impression of the photon or particle with mass that passed through it.[11]

Curvature of space and the physical vacuum

Zero-point field theory originated from the application of thermodynamics to the problem of black-body radiation. This knowledge was later used by Albert Einstein to calculate the electromagnetic residual energy of the vacuum surrounding the electron in the hydrogen atom that is required to keep it from collapsing into the nucleus. Much later the energy density of empty space was calculated to have a spectral density of

    ρ(ν) dν = (8πν²/c³)(hν/2) dν

This energy density is an enormous figure, approximately 10^120 times higher than the cosmological constant predicts if, as is traditionally done, the Planck length is used to set the upper bound for the frequency. The total energy of the universe does not seem to be conserved unless laws of physics are invoked that cannot be understood at the classical level. In other words, the observed expansion of the universe leads to a discrepancy of the order of 10^120 from the vacuum energy derived from quantum physics. This was originally not seen as a problem because the cosmological constant itself was seen to be on mathematically unfirm ground. It is a positive number very close to zero, but not zero, and it was assumed for many years that this was a mistake and that in actuality it was zero. This assumption came from quantum mechanics, which said that virtual particle and anti-particle production and annihilation fluctuations accounted for the large density of the vacuum. In this view an unknown field, or a supersymmetric partner system to all known particles, acted as a negative energy source that completely cancelled this energy, just as the cosmological constant should have predicted.
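The ~10^120 figure can be reproduced approximately by integrating the ν³ spectral density up to a Planck-scale cutoff. A sketch, assuming a cutoff frequency of c/l_P and an observed vacuum density of ~6 × 10⁻¹⁰ J/m³ (both are conventional choices, not values given in this article; the exact exponent of the discrepancy depends on which cutoff convention is used):

```python
import math

H = 6.62607015e-34       # Planck constant, J*s
C = 2.99792458e8         # speed of light, m/s
L_PLANCK = 1.616255e-35  # Planck length, m

# Integrate rho(nu) = (4*pi*h/c^3) * nu^3 from 0 up to the
# assumed Planck cutoff nu_max = c / l_P:
# u = (pi*h/c^3) * nu_max^4
nu_max = C / L_PLANCK
u_zpf = (math.pi * H / C**3) * nu_max**4
print(f"Planck-cutoff zpf energy density ~ {u_zpf:.1e} J/m^3")  # ~9.1e114 J/m^3

# Observed vacuum (dark-energy) density, assumed here as ~6e-10 J/m^3:
u_obs = 6e-10
print(f"discrepancy ~ 1e{int(math.log10(u_zpf / u_obs))}")
```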

However, recent redshift observations of Type Ia supernovae have shown decisively that the expansion of the universe is accelerating.[12] This proves that the universe really does have a non-zero "observed" vacuum energy. It calls into question the usefulness of the present application of the creation and annihilation operators used in quantum mechanics to the current state of the universe. If there really is production and annihilation of virtual particles, then in today's universe it must occur on a vastly lower energy scale than is proposed in the standard model of quantum physics. And any decrease in the rate of virtual particle production and annihilation, and/or lengthening of the virtual particle cross section for the same energy scale, might allow for a lower vacuum energy density that would not need to be cancelled. If the vacuum has a finite positive energy then one must find reasons for seeing that reality in the sum of what we observe. One must assume that there are answers to these questions in the observable universe and that we do not yet understand everything. A quantum mechanical fluctuation should occur for a reason, and at a rate that corresponds to the vacuum energy density we currently observe in the universe. One cannot invoke another world or universe to justify unjustifiable fluctuations just when it is useful to solve an inconvenient problem, as in the many-worlds interpretation of quantum mechanics.

Geometrodynamics represents the macroscopic curvature of spacetime, but it appears there are more complex forces at work that represent the granularity of space-time around massive particles and bodies. Thus, the famous Russian physicist and political dissident Andrei Sakharov said, "Geometrodynamics is neither as important nor as simple as it looks. Do not make it the point of departure in searching for underlying simplicity. Look deeper, at elementary particle physics." Einstein's geometrodynamics, which looks simple, is interpreted by Sakharov as a correction term in particle physics. In this view the cutoff at the Planck length arises purely out of the physics of fields and particles, and this governs the value of the Newtonian constant of gravity, G. And the value of the vacuum energy density of the universe as a function of its volume, ρ(V), is a constant governed by the value of the gravitational constant, G.[13] This energy density does not vary with changes in the volume of the universe because the gravitational constant does not change with the volume of the universe.

Energy in its classical description is a scalar quantity that is always positive. Energy in its particulate form is the photon, and there is no anti-photon; the energy released in the annihilation of matter with anti-matter is in the form of photons. So it may be said that all energy is positive, and only in the statistical counterpoint of opposite spins of matter and anti-matter in the Dirac sea can one energy be considered negative relative to another. In addition, because the cosmological constant now has a confirmed small positive energy density, the negative-energy concept no longer serves a useful purpose in creating a balance of zero observed energy for the universe. Therefore, an effort should be made in physics to retire the idea of negative energy, because it violates the first law of thermodynamics and adds a layer of needless confusion. However, removing the concept of negative energy will force a modification of Sakharov's model of energy density, and will even change the idea that the quantum-scale energy density of the universe is constant with changes in the volume of the universe. This agrees with the observed temperature of space being lowered as the universe expands, as reflected in the radiation temperature of the cosmic microwave background. One must now try to find a way to include the mathematical "appearance" of the large constant Planck-length vacuum energy density used to support the existence of massive particles, while finding a suitable candidate quantity to subtract from it to produce a final answer that agrees with the cosmological constant.

The Planck length

It can be noted that the volume of the universe, V, is expanding. If the total energy of the universe is taken as a positive constant based on first principles, then the energy density ρ = E/V is not a constant throughout the life of the universe: the density gets lower as the volume of the universe expands. However it will be a "changing" constant in the sense that it is an "average" density that applies to the entire universe at any given moment in the life of the universe. It can safely be assumed that if Sakharov was correct in his analysis regarding the defining nature of the gravitational constant, something that is only tentatively accepted as of now, then there is something incorrect in our current mathematical assumptions in physics. If Sakharov's analysis is tentatively taken as true, then certain results must follow from it and other results, long assumed, must be incorrect. The final decision on whether it is worthwhile to accept his analysis rests solely on its utility, or lack of utility, in finding answers to problems that have long plagued physics.

The Planck length defines the shortest-wavelength quantum oscillator that is possible. The summation of the energy of all possible non-interfering wavelengths for each of the three dimensional degrees of freedom down to this limit has historically defined the energy of each quantum space. Each quantum space is in turn defined as the Planck volume, l_P³.

The Planck length is:

    l_P = √(ħG/c³) ≈ 1.616 × 10⁻³⁵ m

where:

    ħ is the reduced Planck constant,
    G is the gravitational constant, and
    c is the speed of light in vacuum.
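As a quick numerical check, the Planck length and the corresponding Planck volume (used in this article as the size of a single "quantum space") can be evaluated from CODATA constants; a minimal sketch:

```python
import math

HBAR = 1.054571817e-34  # reduced Planck constant, J*s
G = 6.67430e-11         # gravitational constant, m^3/(kg*s^2)
C = 2.99792458e8        # speed of light, m/s

# l_P = sqrt(hbar * G / c^3)
l_planck = math.sqrt(HBAR * G / C**3)
print(f"Planck length = {l_planck:.4e} m")   # ~1.6163e-35 m

# The Planck volume, l_P^3:
v_planck = l_planck**3
print(f"Planck volume = {v_planck:.4e} m^3")  # ~4.22e-105 m^3
```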

Could it be that the Planck length is not a constant, but stretches out as the universe expands? If the zpf energy density decreases as the volume of the universe expands then, by definition, the changing Planck length would define both the Planck volume and the total energy within each Planck volume. However, this quantum energy/volume relationship should be considered only an average value for the universe at any moment in its life. There is a precedent for this in the stretching out of light in the cosmic microwave background radiation. When electrons were initially captured in hydrogen atoms in the early universe, high-energy, short-wavelength photons were released. In today's universe those wavelengths appear as much, much longer microwave-wavelength photons, with a distribution and inferred spatial temperature that varies slightly throughout the universe about one unique average value. There are three constants used to create the Planck length, as shown above. Is it possible that the gravitational constant, always assumed to be constant throughout the expansion of the universe, is not a constant? This seems plausible, in view of structural changes that would occur in the universe as the fabric of space becomes less dense as it expands. Of the three constants included in the Planck length, the gravitational constant seems most directly correlated with the expansion of this primordial field.

If one considers fundamentally altering the status of one of the three constants, then altering the gravitational constant would be preferable to altering the constancy of the speed of light or changing Planck's constant. Planck's constant and the speed of light fundamentally underlie all current calculations of physical properties. Albert Einstein's quantum derivation for the packet of energy in a single photon is:

    E = hν

or, for the wavelength λ:

    E = hc/λ
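Both forms of the photon-energy relation can be evaluated directly. The example wavelengths below (the hydrogen Lyman-alpha line at 121.6 nm and a typical CMB microwave wavelength of ~1 mm) are illustrative choices, not figures from this article:

```python
H = 6.62607015e-34  # Planck constant, J*s
C = 2.99792458e8    # speed of light, m/s

def photon_energy_from_frequency(nu):
    """E = h * nu"""
    return H * nu

def photon_energy_from_wavelength(lam):
    """E = h * c / lambda"""
    return H * C / lam

# An ultraviolet photon from hydrogen versus a CMB microwave photon:
# recombination-era light has been stretched to far lower energy.
e_uv = photon_energy_from_wavelength(121.6e-9)   # ~1.63e-18 J (~10.2 eV)
e_cmb = photon_energy_from_wavelength(1.06e-3)   # ~1.87e-22 J
print(e_uv, e_cmb, e_uv / e_cmb)
```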

Planck's constant is the constant of proportionality between the frequency of a photon and its energy. When the frequency of a photon changes, the energy of the photon changes via the constant h. Today the photons in the cosmic microwave background radiation have both lower frequency and correspondingly lower energy than when they were originally emitted from hydrogen atoms. Just as h is a constant of proportionality between two variables that change together, frequency and energy, c can also be considered a constant of proportionality between two variables that change together, time and length. For a specific object being measured, the direct correlation of the change in the measurement of distance with the change in the measurement of time creates the constancy of the speed of light. This in turn determines the measured wavelength, and thus the temperature of a given object, through Planck's constant.

The only constant remaining in the Planck length that is not a constant of proportionality between two variables that change together is the gravitational constant, G. This means the only constant that remains available for modification, if the average total energy density of the universe does change as the universe expands, is the gravitational constant. Temperature, frequency-wavelength, and especially length are all time-dependent measurements. Time slows down within gravitational fields, and it would also slow down within the universe as a whole if the energy density is reduced during its expansion as G increases. As the flow of energy and information slows down, as the zpf density is reduced within gravitational fields and as the universe expands, these time-dependent measurements are all locked together and change together. Thus, the gravitational constant, G, is probably not a constant quantity throughout the expansion of the universe.
The gravitational constant underlies many of the fundamental properties of physics and the properties affected by the gravitational constant, i.e. the measurement of the dimensions of space and time and temperature, seem to be directly tied to contradictions that already exist between gravitational physics and quantum physics.

Only future experience will tell if more problems are "solved" or if more problems are "created" by allowing for changes in the gravitational constant during the evolution of the universe. One thing is certain though: If the Planck length stretches out as the universe expands then the zpf energy density is not even close to what it is currently assumed to be.

Limits on causality and quantum tunneling

Residual forces can be observed whenever a charge, be it a photon or a massive composite particle, encounters a recent, or not so recent, pathway of a previous particle trajectory. The force it exerts on any charged particle will be the Lorentz force:

    F = q(E + v × B)

where q is the charge, E the electric field, B the magnetic field, and v the velocity of the charge.

The force exerted by these pathways on any test charge will be indistinguishable from a magnetic force because the overwhelming probability is that these paths, relative to the test charge, will be moving in a different inertial reference frame from the test charge. In other words, there will be a relative velocity difference between the test charge and the "ghost particles" that represent the sum of past histories of particles in that area of space. But in open space the statistical probability is that these B field zitterbewegung forces will be equal in all directions. Only when the test charge is accelerated from its inertial reference frame will these forces shift to a maximum in the plane normal to the acceleration as the test charge cuts across many more lines of force in that one direction. This counterforce is experienced as inertia to the test charge. The resultant direction imposed on any particle in open space will average out to the initial direction of the force that created the acceleration. However it will be a path combined from connecting many lateral movements that combine to form a helical motion.[14][15]
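The magnetic part of the Lorentz force described above is easy to evaluate numerically; a minimal sketch, using an electron as an illustrative test charge (the field values are arbitrary examples):

```python
def cross(a, b):
    """Cross product of two 3-vectors given as tuples."""
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def lorentz_force(q, E, v, B):
    """F = q * (E + v x B), all quantities in SI units."""
    vxB = cross(v, B)
    return tuple(q * (E[i] + vxB[i]) for i in range(3))

# An electron moving along +x through a magnetic field along +z
# feels a force along +y (the negative charge flips v x B):
q_e = -1.602176634e-19  # electron charge, C
F = lorentz_force(q_e, (0.0, 0.0, 0.0), (1e6, 0.0, 0.0), (0.0, 0.0, 1.0))
print(F)  # force purely in the +y direction
```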

Just as these forces will impinge on any test charge, this test charge will also affect the oscillatory energy within each quantum space and will modify them according to the magnitude and velocity of the test charge. There will be a balancing of the energy of angular velocity imparted to the particle trajectory with an equal and opposite change in oscillatory energy for the three spatial dimensions. If the trajectory of a charge entering a space does not match the direction of a force currently existing within a quantum space then both the exit trajectory of the test charge and the direction of the force within the quantum space will be altered, as its path integral predicts.[16][17] The appearance of inertia seems to come from the interference between incompatible oscillatory wavelengths within a quantum space as the entering charge changes the oscillatory energy in the three dimensions. That inertial reaction from the interference creates an induced force with the qualities of both magnitude and direction. The physics of the mechanism creating these induced forces still needs further mathematical modeling under varying conditions. It may be a result of just a change in the ratios of oscillatory energy between the three dimensions in a quantum space, while maintaining the same total energy within that quantum space. It may also be a result of a change in total energy within that quantum space in addition to a change in ratios of oscillatory energy in the three dimensions. The problem in determining this would lie in whether or not there is more energy in the charge entering the quantum space than there is total oscillatory energy in all three dimensions in that quantum space before the event occurs. That resultant three dimensional force vector can be viewed approximately as the net difference in energy levels between the three dimensions in a quantum space and also as a barrier between quantum spaces. 
Any space can be considered a single quantum space if it contains a single induced force that is continuous in one dimension.

The interesting question to ask is whether accelerations can occur that overwhelm the ability of the residual force within a quantum space to modify the trajectory of a particle. It appears possible for a very high energy charge to tunnel through that force and leave the force, or barrier, parallel or nearly parallel with the original trajectory of the charge. In this way the oscillatory energy in one direction is increased to a level that the energy in the other two dimensions is unable to compensate for. This can be understood by realizing that the oscillatory energy in the other two dimensions is brought to a value of zero through the interference of oscillatory energy in the third dimension. The totality of space after the tunneling occurs can be considered a larger volume than originally existed if one equates the length of that induced force in the same way one equated the Planck length cubed to the Planck volume. And the seeming non-conservation of energy within this stretched-out quantum space is equalized by the same energy dividing into a now larger volume of "combined" spaces. This can be understood because while the oscillatory energy within one quantum space may be increased through the charge-tunneling process, the energy from the charge that originally creates that tunneling is simultaneously subtracted from a separate corresponding quantum space. This reduction is reflected in a larger gravitational constant, G, as the expansion proceeds. It is speculated that this is how the universe may have been created in the Big Bang.

There is a thermodynamic price to pay for this local violation of conservation of energy within a quantum space though. Time is measured by the flow of energy and information as it moves through space. Inertia is caused by the resistance of a test charge to the acceleration as it cuts across more of these lines of force in one direction. Time can also be seen as resistance to the flow of information between distant points in space created by that inertia. But the space between any two distant points between which a charge of overwhelming energy has passed will have a resultant induced force that is parallel with that path. There will be no inertia to information flow between those two points because there will be no lines of force that have a force component normal to the path between those two points. As the energy in one dimension in a space is increased by the charge tunneling effect it is normally damped out to some degree as that increased oscillatory energy interferes with the existing oscillatory energy in the other two dimensions. That damping out effect is directly related to the magnitude of oscillatory energy in the other two dimensions that exists prior to the event. But if a charge enters a quantum space with energy in only one dimension there will be no damping effect on the angular velocity of that force. If a charge now enters a quantum space from any direction that is not parallel with that path it will immediately result in equal and opposite forces imposed on the other end of the path. This is what results in angular momentum in particles with mass. The onset of angular momentum is what creates the division between matter and non-matter, and it also creates the separation between the quantum space from which the photon was taken and the quantum space to which the photon was added. Mass is the energy encapsulated within these rotating pathways that creates the non-local link in composite massive particles.

In reality this tunneling after-effect, or quantum entanglement, is seldom perfect, and neither is it evenly distributed in the universe, as can be seen in the WMAP surveys. The lines of force are seldom aligned perfectly parallel to distant points in space. In other words, there can be vast differences in zero-point field energy density, both in spatial distribution and in absolute level, within any pre-specified average-sized volume of the universe, depending on the scale one is looking at. At the quantum scale, within fundamental massive particles, the lines of force will be nearly aligned, depending on how long-lived the particles are before they decay. At the cosmic scale it appears increasingly likely that there is a similar, but much smaller, energy-density difference between galaxies and the vast open spaces between them, because if one mathematically subtracts the gravitational effect of the visible matter within each galaxy, the outer reaches of galaxies carry more angular momentum than they should.
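The excess angular momentum mentioned above is the standard galaxy rotation-curve discrepancy: visible mass alone predicts Keplerian orbital speeds falling as 1/√r, while measured curves stay roughly flat. A minimal numeric sketch, with an assumed (made-up) enclosed visible mass, shows the size of the predicted fall-off:

```python
import math

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
M_VISIBLE = 1.0e41     # assumed enclosed visible mass, kg (~5e10 solar masses; illustrative)
KPC = 3.086e19         # metres per kiloparsec

def keplerian_speed(r_m):
    """Circular speed predicted from visible mass alone: v = sqrt(G*M/r)."""
    return math.sqrt(G * M_VISIBLE / r_m)

# Beyond the visible disk, Keplerian speeds should fall as 1/sqrt(r):
v_10 = keplerian_speed(10 * KPC)
v_40 = keplerian_speed(40 * KPC)
print(v_10 / v_40)  # ratio is sqrt(40/10), i.e. about 2: speed should halve
# Measured rotation curves instead stay nearly flat over this range,
# which is the "more angular momentum than there should be" of the text.
```

The exact mass value is irrelevant to the ratio; only the 1/√r scaling matters.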

Andrei Sakharov and the elasticity of space

Albert Einstein's field equation for the geometrodynamics of space in general relativity is widely regarded as one of the most beautiful equations in physics. Unfortunately, its beauty has resulted in a fixation in the West on the mathematics of geometry, including the geometry of hyper-dimensional space. This focus has led Western mainstream physics to ignore the need for a mathematical definition connecting the classical 4-dimensional general relativity and 5-dimensional Kaluza-Klein theories of the geometry of space with quantum-mechanically derived ideas that would better correlate with them. Until an explicit mathematical equation robustly links these two disparate interpretations of physics, the first law of thermodynamics for our universe appears to be conceptually violated.

Andrei Sakharov's conception of the elasticity of space, though incomplete, seems to point towards an ultimate resolution of this problem. In his cosmological model, space is elastic like the surface of a balloon and thins out as the volume of the universe expands over time. If the 3-dimensional volume of the universe is represented as the surface of a balloon, then in its collapsed state at the beginning of time the zero-point field energy density of the universe is greatest. In approximately the first 10⁻²⁴ second in the life of the universe the zero-point field energy density would be extremely high: high enough to create the proton in the hydrogen atom. The proton represents approximately 99.9 percent of the mass of the entire hydrogen atom. The estimated energy density of the physical vacuum at the instant of proton creation would then be approximately the energy density recently calculated for the hydrogen atom using stochastic electrodynamics.[18] In this conceptual framework, however, this zero-point energy density would exist only at precisely this one moment in the life of the universe. Sakharov conceived that, at the instant of the creation of fundamental massive particles, the elastic zero-point energy required for their mass was transformed into inelastic energy.
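The "approximately 99.9 percent" figure follows directly from the rest masses of the proton and electron (CODATA values; atomic binding energy is negligible at this precision). A quick check:

```python
m_p = 1.67262e-27  # proton rest mass, kg
m_e = 9.10938e-31  # electron rest mass, kg
# The hydrogen binding energy (~13.6 eV, ~2.4e-35 kg equivalent) is
# negligible at this level of precision and is ignored here.
fraction = m_p / (m_p + m_e)
print(round(fraction, 4))  # 0.9995, i.e. roughly 99.9 percent of the atom's mass
```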

(Though Sakharov did not know this when he conceptualized the idea, experimental knowledge today indicates that the proton, itself made up of three quarks, is the only massive composite particle that does not undergo transformation through radioactive decay, no matter how long the observation or how large the quantity of protons observed. Neutrinos, for their part, simply transform into other kinds of neutrinos. So his idea seems to fit the notion of a type of "absolute" inelasticity that the Standard Model cannot account for.)

So a question similar to the one Einstein posed concerning the non-collapse of the electron into the hydrogen nucleus at zero kelvin can be posed for the proton: why doesn't the proton radioactively decay if the energy density of the physical vacuum is now a tiny fraction of what it was when the proton was created? The principle of asking why is the same even though the questions apply to different processes occurring under different conditions; the relationship of the process to its environment is similar in both cases. If the zero-point field energy density decreases over time, the statistical probability of the proton decaying should increase. And since the proton composes 99.9 percent of the energy of the hydrogen atom and is itself a composite particle, the question of why it does not decay becomes even more pertinent.

Sakharov imagined that spin and its associated angular momentum provided this inelasticity. Spin, and specifically inelastic spin, is part of a 5-dimensional definition that directly correlates with gravity in Kaluza-Klein theory. Maintaining the balloon metaphor, we sew a round piece of inelastic material into our balloon at 10⁻²⁴ seconds into the expansion from its collapsed state, carefully selecting inelastic material with exactly the same density as the elastic material surrounding it at that point in the expansion. As we continue to blow up the balloon, its elastic material thins out symmetrically except where the inelastic patch is located. Because the area of the inelastic patch does not stretch and thin out, the elastic material surrounding it must compensate by stretching and thinning even more than it otherwise would. The greatest thinning will occur right at the periphery of the patch, because this is the part of the elastic material with the greatest concentration of stress. Translating this idea to 3-dimensional vacuum-energy physics, the greatest zero-point field stress, or thinning out, will be right at the periphery of any massive body, and this is where gravity is greatest.

One now returns to the question of where the missing energy went that is required to support the proton in today's greatly expanded universe. By pure logic, it appears that the proton still has it, in the form of energy pulled in from the surrounding physical vacuum. Though the inelasticity between a proton's quarks can by no means account for all the gravity in the universe, it can certainly account for nearly all the gravity of unbound nucleons, i.e., individual protons and neutrons. A neutron can be included in this approximation because it has very slightly more energy than a proton and, if separated from other nucleons, decays into a proton with a mean lifetime of about fifteen minutes. It should be remembered that the quarks in protons were bound together through the quantum-tunneling mechanism during the overwhelming acceleration of the big bang, and that quantum tunneling in the earliest inflationary phase of the expansion originated from high-energy photons overpowering the quantum oscillation energy in a quantum space. The expenditure of energy in the tunneling process resulted in the small bare masses of quarks we see today and the huge binding energy between the three quarks. This is the most significant of all the results of the inflationary phase of the expansion of the universe.
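The "very slightly more energy" and "fifteen minutes" claims can be checked against the measured neutron and proton rest energies and the measured free-neutron mean lifetime:

```python
m_n_MeV = 939.565  # neutron rest energy, MeV (measured)
m_p_MeV = 938.272  # proton rest energy, MeV (measured)

delta = m_n_MeV - m_p_MeV
print(round(delta, 3))            # 1.293 MeV: the "very slightly more energy"
print(round(delta / m_n_MeV, 5))  # about 0.00138, i.e. a ~0.14 percent difference

tau_n = 879.4                      # measured free-neutron mean lifetime, seconds
print(round(tau_n / 60, 1))        # 14.7 minutes, the "about fifteen minutes" of the text
```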

However, any photon creating a tunneling effect must be accounted for in the quantum space it inhabited before the tunneling took place. Besides creating more space in the universe, the tunneling action also created a deficit, or subtraction of energy, in the quantum space each photon inhabited before the tunneling occurred. Gravity seems to be created by that subtraction. This is not gravity as we are used to understanding it, but gravity as a gradient of lowered energy density in the vacuum, whose energy density rises smoothly with the square of the distance from any massive object. The reason the proton does not decay in today's universe, then, is that energy is pulled in from the surrounding physical vacuum; we experience that reduction in energy as gravity.
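The density profile described above can be sketched as a far-field value minus a deficit falling off as the inverse square of the distance. This is purely an illustration of the text's speculative picture; the function, its name, and both constants are invented here, not taken from any established model:

```python
def vacuum_density(r, rho_inf=1.0, k=1.0):
    """Hypothetical profile from the text: rho(r) = rho_inf - k/r**2.
    rho_inf (far-field density) and k (deficit strength) are arbitrary
    illustrative constants, not physical values."""
    return rho_inf - k / r**2

# The deficit relative to the far-field value shrinks as 1/r^2:
d2 = vacuum_density(2.0)
d4 = vacuum_density(4.0)
print((1.0 - d4) / (1.0 - d2))  # 0.25: doubling the distance quarters the deficit
```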

Non-Locality

The question of non-local effects in the universe has become a predominant topic in the first decade of the twenty-first century. The classical and quantum versions of physics both contain non-local phenomena, such as wormholes and the EPR paradox, respectively. Typically, the barrier to integrating these ideas into mainstream physics has been the imperfection with which the effects can be reproduced. There has been an "all or nothing" view of non-locality, in which any effect that was not permanent and decayed rapidly engendered the feeling that, being non-intuitive, it was not a trustworthy physical structure. Gradually this view is being eroded as it is realized that a whole spectrum of non-local effects can exist. Science may never be able to artificially create a permanent, self-sustaining non-local effect, but that does not mean non-local effects don't exist. For example, photon entanglement can be observed, and the duration of entanglement can be lengthened artificially by use of fiber-optic cables that isolate the photons from the decohering effect of the outside environment. In the natural world, the asymptotic freedom of the quarks within the proton, which prevents the proton's collapse, can be understood as a form of permanent natural quantum entanglement.

This new view of a spectrum of entanglement effects at various energies, durations, and spatial distances allows a better integration of the idea of non-local entanglement with Sakharov's idea of varying levels of elasticity, or inelasticity, between remote points in space. The energy of the universe is finite. Energy, in all its multifaceted forms, represents information; thus the information of the universe is also finite. This means there will be changing limits on causality as the universe expands. Knots of highly dense information exist within protons and other massive particles, while other regions of the universe must compensate with much less information spread over large volumes. This meshes with the idea that as the universe expands, space becomes more inelastic and non-local. The changing volume of the universe as it expands represents an ever-growing average difference between the oscillatory energies of the three spatial dimensions within each quantum space. This increasing ratio represents an elongating physical space that is mapped onto the three-dimensional structure of the universe, just as (Planck length)³ = Planck volume. As the universe expands, more non-local effects will be observed within distances and time durations that do not otherwise seem unusual; in other words, identical information can be smeared over larger and larger volumes as the expansion progresses. Coincidences appear to be part of the structure of the uneven distribution of information throughout the universe. It should be emphasized that this unevenness is a localization of the bulk effect, and coincidences will usually appear locally and then vanish after a short time. There are exceptions, though, when large volumes of the universe can be observed: in those circumstances even a small variation in energy density represents a very large amount of energy, and that energy and information will persist. This situation can be recognized in the greater-than-expected angular momentum beyond a critical radius from the centers of galaxies.
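The identity (Planck length)³ = Planck volume used above is just the cube of the standard Planck length, l_P = √(ħG/c³). A quick computation from the defining constants:

```python
import math

hbar = 1.054572e-34  # reduced Planck constant, J*s
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.99792458e8     # speed of light, m/s

l_planck = math.sqrt(hbar * G / c**3)  # Planck length, ~1.616e-35 m
v_planck = l_planck**3                 # Planck volume, ~4.2e-105 m^3

print(f"Planck length: {l_planck:.3e} m")
print(f"Planck volume: {v_planck:.3e} m^3")
```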

The bulk temperature of the universe decreases as it expands, and this indicates that the gravitational constant, G, must change as the universe expands, as previously described. But if the gravitational constant changed significantly after the creation of nucleons, it would create enormous conflicts with other phenomena that depend on the gravitational constant having its present value. The idea of a bulk index of elasticity, or, as preferred here, an index of inelasticity, seems to hold merit. This idea would incorporate the new view of non-local effects occurring within a spectrum of time durations, energies, and spatial distances that changes with the expansion of the universe. Perhaps this index of inelasticity could cancel out any observable changes that an increasing gravitational constant after big bang nucleosynthesis would normally create.

An increase in the gravitational constant with the expansion of the universe is consistent with a decrease in the universe's bulk temperature. It might therefore be better to borrow from solid-state physics and name the gravitational constant's opposing counterpart the index of bulk superconductivity of the universe, or Universal Superconductivity Index for short. Just as an electrical conductor such as copper becomes superconductive near absolute zero, our universe can presumably be described in a similar fashion, with each quantum space of oscillatory energy standing in for the atomic description in solid-state physics. This index of superconductivity would not affect our perceived temperature of the universe in the bulk, nor would it affect other perceived time-dependent measurements; its major impact must be a mathematical one that offsets an increasing gravitational constant. Only future experience will tell whether an index of universal superconductivity, or non-localness, can serve as a counterbalance to an increasing gravitational constant, or whether it can be made consistent with phenomena observed in today's universe.

In recent years a number of New Age books have appeared propounding the view that the zero-point field of physics is the secret force of the universe, invoked to explain such phenomena as intention, remote viewing, and paranormal ability.[19][20] One of the main purveyors of this view is Stanford-trained physicist Harold Puthoff, who has spent more than thirty years examining the zero-point field.[21] Books that promote this view include:

  • Lynne McTaggart's 2001 The Field: The Quest for the Secret Force of the Universe.
  • Ervin Laszlo's 2004 Science and the Akashic Field: An Integral Theory of Everything.
  • Brenda Anderson's 2006 Playing the Quantum Field: How Changing Your Choices Can Change Your Life.
  • Masaru Emoto's 2005 The Hidden Messages in Water.

Such views are not without controversy; some see such discussion as pseudoscience.[22] However, physicist David Bohm and other respected scientists have found some utility in examining the relationship of the zero-point field to matter. Bohm posited, for example, that the field might be the force from which all life unfolds. He stated that the "nonlocality" of quantum physics, which could also be described as varying levels of inelasticity between remote points in space, might be explained through interconnections allowable via the zero-point field.

Though seldom used in fiction, the most notable reference to the zero-point field is the use of ZPMs in the Stargate universe, devices which extract huge amounts of energy from a zero-point field. In the video game Half-Life 2 there is a weapon called the Zero-Point Energy Field Manipulator, more commonly known as the "Gravity Gun". In their 1996 novel Encounter With Tiber, Buzz Aldrin and John Barnes have Alpha Centaurians visit Earth in 7200 BC using laser-based zero-point-field propulsion to achieve near-light-speed travel. In the 2004 animated film The Incredibles, Syndrome's basic weapon is a zero-point energy field. Saint, a 2006 mystery novel by Ted Dekker, portrays characters who are able to manipulate the zero-point field. In Australian author Matthew Reilly's novels The Six Sacred Stones (2007) and The Five Greatest Warriors (2009), a zero-point field (referred to as the 'Dark Sun') is the threat from which the world must be saved through the use of ancient technology and long-lost knowledge.

References

  1. ^ Gribbin, John (1998). Q is for Quantum - An Encyclopedia of Particle Physics. Touchstone Books. ISBN 0-684-86315-4.
  2. ^ a b c Dodd, John H. (1991). Atoms and Light: Interactions. Springer. p. 217. ISBN 0306437414.
  3. ^ Laidler, Keith J. (2001). The World of Physical Chemistry. Oxford University Press. ISBN 0198559194.
  4. ^ Introduction to Zero-Point Energy - Calphysics Institute
  5. ^ a b Zero-point Energy and Zero-point Field – Calphysics Institute
  6. ^ S. Haroche and J.-M. Raimond, “Cavity Quantum Electrodynamics,” Sci. Am., pp. 54-62 (April 1993). Also H. Yokoyama, “Physics and Device Applications,” Science 256, pp. 66-70 (1992).
  7. ^ Boyer, T.H. (1980). In Foundations of Radiation Theory and Quantum Electrodynamics (ed. Barut, A.O.), Plenum, New York, 49.
  8. ^ McCrea, W.H. (1986). Quart. J. Roy. Astr. Soc. 27, 137.
  9. ^ Puthoff, H.E. (1987). Phys. Rev. D. 35, 3266.
  10. ^ Wheeler, John (1998). Geons, Black Holes, and Quantum Foam: A Life In Physics. Norton & Company. ISBN 0393319911.
  11. ^ Habegger, E.J., Quantum Vacuum Pathway Theory, Space Technology and Applications International Forum, (STAIF 2005), AIP Conference Proceedings 746 , 1379.
  12. ^ Adam G. Riess et al. (Supernova Search Team) (1998). "Observational evidence from supernovae for an accelerating universe and a cosmological constant" (subscription required). Astronomical J. 116: 1009–38. doi:10.1086/300499.
  13. ^ Misner, Charles W.; Kip S. Thorne, John Archibald Wheeler (September 1973). Gravitation. San Francisco: W. H. Freeman. ISBN 0-7167-0344-0.
  14. ^ B. Haisch, A. Rueda & H.E. Puthoff, (1994). "Inertia as a zero-point-field Lorentz force". Physical Review A, Vol. 49, No. 2, pp. 678-694.
  15. ^ Haisch, Bernard; and Rueda, Alfonso (1998). "Contribution to inertial mass by reaction of the vacuum to accelerated motion". Found. Phys. 28: 1057–1108.
  16. ^ Feynman, R. P. (1948). "The Space-Time Formulation of Nonrelativistic Quantum Mechanics". Reviews of Modern Physics 20: 367–387. doi:10.1103/RevModPhys.20.367
  17. ^ Feynman, R. P., and Hibbs, A. R., Quantum Mechanics and Path Integrals, New York: McGraw-Hill, 1965 [ISBN 0-07-020650-3]. The historical reference, written by the inventor of the path integral formulation himself and one of his students.
  18. ^ Daniel C. Cole & Yi Zou, "Quantum Mechanical Ground State of Hydrogen Obtained from Classical Electrodynamics", Physics Letters A, Vol. 317, No. 1-2, pp. 14-20 (13 October 2003), quant-ph/0307154 (2003).
  19. ^ Brilliant Disguise: Light, Matter and the Zero-Point Field, Bernard Haisch, Science & Spirit Magazine, 2001.
  20. ^ The Field: The Quest For The Secret Force Of The Universe, Lynne McTaggart, book synopsis.
  21. ^ McTaggart, Lynne (2007). The Intention Experiment. Free Press. p. (13). ISBN 0743276957.
  22. ^ Exploiting Zero-point Energy, Philip Yam, Scientific American, December 1997, pp. 82-85.