For centuries scientists have been trying to create lower and lower temperatures and pressures, initially by evacuating gas from containers with mechanical pumps.
But today more refined technologies, such as diffusion, ionisation and chemisorption pumping, are used to produce ‘high’ partial vacuums for commercial and experimental use, and it is possible to achieve, momentarily, extremely low pressures, termed ultra-high vacuums (UHV), approaching absolute zero of pressure and temperature.
“Ultra-high vacuum is the vacuum regime characterised by pressures lower than about 10⁻⁷ pascal or 100 nanopascals (10⁻⁹ mbar, ~10⁻⁹ torr).” (Wikipedia)
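For reference, the quoted thresholds are simply unit conversions of one another; a minimal check (in Python, using the standard conversion factors) is sketched below:

```python
# Unit check for the quoted UHV threshold of 1e-7 Pa.
# Conversion factors: 1 mbar = 100 Pa, 1 torr ≈ 133.322 Pa.
uhv_threshold_pa = 1e-7                      # pascals

in_nanopascals = uhv_threshold_pa * 1e9      # -> 100 nPa
in_mbar        = uhv_threshold_pa / 100.0    # -> 1e-9 mbar
in_torr        = uhv_threshold_pa / 133.322  # -> ~7.5e-10 torr

print(f"{in_nanopascals:.0f} nPa, {in_mbar:.0e} mbar, {in_torr:.1e} torr")
```

The torr figure comes out at roughly 7.5 × 10⁻¹⁰, i.e. the ‘~10⁻⁹ torr’ of the quotation to within an order of magnitude.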
But there is no single vacuum pump that can operate all the way from atmospheric pressure to ultra-high vacuum; instead, a series of different pumps is used, each over its appropriate pressure range. High pumping speeds are necessary, and multiple vacuum pumps are run in series and/or in parallel.
Pumps commonly used in combination to achieve UHV include the following (indicative operating ranges are sketched after this list):-
1) Turbomolecular pumps (especially compound and/or magnetic bearing types)
2) Ion pumps
3) Titanium sublimation pumps
4) Non-evaporable getter (NEG) pumps
5) Cryopumps
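The sketch below gives an indicative picture of how such pumps are staged; the pressure ranges are rough, order-of-magnitude figures assumed purely for illustration, not manufacturers’ specifications, and a roughing pump is included only to cover the initial pump-down from atmospheric pressure:

```python
# Rough, assumed operating ranges (in Pa) for pumps commonly staged to reach UHV.
# These are order-of-magnitude illustrations only, not specifications.
PUMP_RANGES_PA = {
    "rotary / roughing pump":    (1e5,  1e0),
    "turbomolecular pump":       (1e1,  1e-8),
    "ion pump":                  (1e-2, 1e-9),
    "titanium sublimation pump": (1e-4, 1e-9),
    "NEG pump":                  (1e-4, 1e-10),
    "cryopump":                  (1e-1, 1e-8),
}

def pumps_covering(pressure_pa):
    """Return the pumps whose assumed range spans the given pressure."""
    return [name for name, (high, low) in PUMP_RANGES_PA.items()
            if low <= pressure_pa <= high]

# Example: which pumps nominally operate at the 1e-7 Pa UHV threshold?
print(pumps_covering(1e-7))
```

Each pump covers only part of the path from atmosphere down to UHV, which is why they are used in combination rather than singly.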
But the UHVs produced cannot be sustained for any length of time; this is due to contamination of the sample resulting from effects such as ‘out-gassing’.
Out-gassing includes sublimation and evaporation, the phase transitions of a solid or liquid substance into a gas; in other words, at these extremely low pressures, atoms either contained within, or vaporised from the surfaces of, the solid matter of the apparatus are drawn into the volume under examination.
“Out-gassing is a significant problem for UHV systems. Out-gassing can occur from two sources: surfaces and bulk materials. Out-gassing from bulk materials is minimized by careful selection of materials with low vapor pressures (such as glass, stainless steel, and ceramics) for everything inside the system. Hydrogen and carbon monoxide are the most common background gases in a well-designed UHV system. Both Hydrogen and CO diffuse out from the grain boundaries in stainless steel and Helium can diffuse through steel and glass from the outside air.” (Wikipedia)
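In conventional vacuum-engineering terms, this out-gassing load is what sets the floor on the achievable pressure: at equilibrium the ultimate pressure is roughly the total gas load divided by the effective pumping speed. A minimal sketch, using assumed, purely illustrative figures for a well-baked stainless-steel chamber:

```python
# Conventional estimate: ultimate pressure ≈ total out-gassing load / pumping speed.
# All figures below are assumed, illustrative values, not measurements.
specific_outgassing = 1e-12   # mbar·L/(s·cm²), assumed for well-baked stainless steel
surface_area_cm2    = 1e4     # 1 m² of internal chamber surface
pumping_speed_l_s   = 500.0   # effective pumping speed at the chamber, L/s

gas_load = specific_outgassing * surface_area_cm2        # mbar·L/s
ultimate_pressure_mbar = gas_load / pumping_speed_l_s    # mbar

print(f"gas load ≈ {gas_load:.0e} mbar·L/s, "
      f"ultimate pressure ≈ {ultimate_pressure_mbar:.0e} mbar")
```

With these figures the estimate is about 2 × 10⁻¹¹ mbar; and because the out-gassing load never falls to zero, the estimated pressure never does either.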
And extraordinary preparatory steps are required to reduce these effects; these include the following:-
1) Baking the system (for one to two days at up to 400°C while the pumps are running) to remove water or hydrocarbons adsorbed to the walls.
2) Minimizing the surface area in the chamber.
3) High conductance tubing to pumps — short and fat, without obstruction.
4) Low out-gassing materials such as certain stainless steels.
5) Avoiding creating pits of trapped gas behind bolts, welding voids, etc.
6) Electro-polishing all metal parts after machining or welding.
7) Low vapor pressure materials (ceramics, glass, metals, and teflon if unbaked).
8) Chilling chamber walls to cryogenic temperatures during use.
9) Avoiding all traces of hydrocarbons, including skin oils in a fingerprint.
These preparatory requirements, together with the actual pumping processes, consume an enormous amount of energy and, as it is not technically possible to completely eliminate out-gassing or other contaminating efflux, these very low pressures cannot be sustained for any length of time. It is therefore clear that there is a progressively increasing force of resistance to the decompression of a gas, and a very strong resistance to the maintenance of such low pressures.
Why should this be the case when the only external resistance is that generated by atmospheric pressure, which in theory should be easily overcome by modern machinery?
In the opposite direction there are sophisticated machines in regular use today that compress materials to upwards of 200,000 times atmospheric pressure, for example to produce industrial diamond from carbon.
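For scale, that figure converts directly into SI units (a simple check, taking one standard atmosphere as 101,325 Pa):

```python
# Converting the quoted compression figure into SI units.
ATM_PA = 101_325                  # 1 standard atmosphere in pascals
pressure_pa = 200_000 * ATM_PA    # ≈ 2.0e10 Pa
print(f"{pressure_pa:.1e} Pa ≈ {pressure_pa / 1e9:.0f} GPa")
```

That is roughly 20 gigapascals of compression, set against the single atmosphere (about 10⁵ Pa) that resists decompression from outside.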
This high level of resistance requires an explanation.
It is a core premise of currently accepted physics theory that a perfect vacuum cannot influence matter in any way, and accordingly nor can any of its hypothetical, aetherial constituents.
This being the case, the question is:-
What forces are operating in these circumstances to prevent the extraction of all matter from within the compartment, and what is the source of this resistance?
The simple diagram below illustrates this situation with a perfectly sealed piston cylinder apparatus, and a single atom within the cylinder.
The hypothetical, non-material empty space which is believed to occupy virtually all of the chamber can, by definition, have no influence upon matter itself; and, as matter is undeniably present within the compartmental space under investigation (and in the surrounding structure), it can only be the single atom, and/or the atomic structure of the apparatus, which generates this exponentially increasing resistance.
In other words it is matter, and matter alone, that is the cause of this resistance.
As mentioned earlier, today it is said that the vacuum is not empty but is permeated with waves of energy, etc.; but again, such a medium must offer no resistance to the free motion of atoms and molecules within it (i.e. it is a zero-inertia medium) and so could not generate any resistance.
But, in terms of the kinetic atomic theory of gases, where the only force allowed is a positive one generated by the collisions of atoms, the generation of such a resistive or negative force is inexplicable. And if the matter of our experience is almost entirely composed of a non-material ‘empty space’ (of any speculative description) then, technically speaking, it should be very easy to remove all atoms from within it.
It is an undeniable fact that current physics theory has no answer to this question and, as these numerous empirical results are a direct falsification of the current kinetic atomic theory of gases, it would accordingly be necessary to conclude that this, the base theory of the science of physics, is invalid.
So to summarise, there exists no proof of the existence of the state of a vacuum in any circumstance; on the contrary, to objective observers at least, electron microscopy images show that in the solid state atoms are in contact and are therefore continuous.
The assumption by theoretical physicists that this state ‘exists’, and is by far the largest component of macroscopic matter, is a fundamental problem for science in general, one that was clearly articulated by Isaac Newton in a letter to Richard Bentley over 300 years ago:-
“That one body may act upon another at a distance through a vacuum, without the mediation of anything else, by which their action and force may be conveyed from one to another, is to me so great an absurdity, that I believe no man, who has in philosophical matters a competent faculty of thinking, can ever fall into it.”
What this brilliant mathematician and inventor/artisan/technician was saying is that it is both conceptually and mathematically impossible to describe the transmission of a force in these circumstances, as such a void cannot sustain the necessary process of ‘action and reaction’.
As Archimedes said over 2,200 years ago, ‘give me a point on which to place a lever and I will move the world’; in other words there has to be a ‘something’ for a force to act upon, and in the case of two atoms, or of two massive bodies, separated by a vacuum, as this space by definition has no qualities, a force emanating from one mass has no base from which to act upon the other.
Applied to atomic matter, this means that, if no continuous contact is assumed between two atoms in any state of matter, there is no possible way to describe how any force acting on one is transmitted to, and acts upon, the other.
In the middle of the 20th century, however, eminent physicists such as Bohr ultimately came to the realisation that a vacuum with no characteristics that could affect atomic matter was an insurmountable obstacle to progress. Accordingly the vacuum subsequently began to be attributed with hypothetical characteristics, and concepts such as ‘vacuum fluctuations’ and ‘vacuum polarisation’ were introduced; more recently it has been suggested that it has such qualities as ‘an infinite energy density’ or ‘quantum potential’, etc.
Hypothetical vehicles for the transmission of forces and light through outer space were also proposed, such as those represented by ‘super-string’ and ‘loop’ theories.
And, with the realisation that this universal structure would mean that only an infinitesimally small proportion of the mass of the universe could be identified as matter, the concepts of ‘dark matter’ and ‘dark energy’ occupying the vacua of outer space have been floated.
All these completely unproven, and unprovable, concepts are simply attempts to endow the vacuum with hypothetical qualities that, amongst other things, can transmit or transfer a force and so influence atomic matter. In other words physicists have tacitly accepted that the definition of the supposedly all-pervading ‘empty space’ as a vacuum is ‘superfluous’.
Thus the concept of an all-pervading non-material medium, effectively the aether that was ridiculed in the first half of the 20th century, has been subtly, and surreptitiously, reintroduced by theoretical physicists in attempts to deal with the present complete impasse in atomic-level physics.
A hundred years ago the then strongly disputed vacuum was set by Einstein into scientific consciousness, and fifty years ago scientists belatedly began to patch up the already unsatisfactory base of accepted atomic theory, the kinetic atomic theory, by investing its essential ‘empty space’ component with numerous hypothetical qualities.
And today it is known that atoms are the ultimate natural division of matter; in other words, it is effectively proven that this is the case.
“Atoms are the basic units of matter and the defining structure of elements.”
“Atoms are the basis of chemistry and they are the basis for everything in the Universe.” (Textbook quotes)
But up until around 35 years ago the atom was still a hypothetical entity. And, while for most of the last century its existence was almost a certainty, a definitive proof had to wait until the technology of electron microscopy was perfected in the early 1980s.
Since then many thousands of images of atoms in solid matter have been produced and published for all to see, and individual atoms have even been manipulated into positions on surfaces to create company logos, rings and other shapes, as in the image below.
At the beginning of the 1800s Dalton introduced his solid, spherical, indestructible atoms and, if we ignore the belated acceptance of Avogadro’s multi-atomic molecular structures and J J Thomson’s ‘plum pudding’ model atom around the turn of that century, the next significant change to the internal structure of atoms was Rutherford’s model of 1911.
Since this time theoretical physicists have focused their attention on examining this structure and today have arrived at a hypothetical structure described, broadly speaking, as the Standard Model.
So the hypothetical atomic structure has changed dramatically from an indestructible solid sphere to what could be termed, essentially, a ‘vacuum’ atom. If this model is put into a comprehensible perspective, with the nucleus of a hydrogen atom presented as having a diameter of 1mm (the dot below on the left represents such a nucleus), the atom’s single electron would be orbiting at an altitude from it of over 2 metres.
Nucleus ▪ <————– 2.3 metres ————> · Electron
Note that on this scale the electron, the dot on the right, would not be visible on this page as it would be less than one pixel in diameter.
This 1mm diameter nucleus of such an atom would exert influence over a nominally spherical, sub-atomic ‘empty space’, as defined by its electron, having a diameter of 4.6 metres, and two such atoms are presented below at the point of a ‘kinetic’ collision. The nuclei are not included as obviously on this scale they would be invisible, while the dashed circles represent the extent of the nominal orbits of their single electrons.
This projected collision, at a combined velocity of up to 3,600 metres per second, is, in terms of the kinetic atomic theory of gases, required to be one of perfect elasticity, i.e. with no distortion of their spherical forms, no loss of energy, and no reduction in the average motion of either atom.
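For context, the ‘3,600 metres per second’ figure is consistent with what the kinetic theory itself predicts for the lightest gas: the mean thermal speed of a molecule of mass m at temperature T is √(8kT/πm). A minimal check for hydrogen at room temperature, using assumed round-number values:

```python
import math

k_B  = 1.380649e-23   # Boltzmann constant, J/K
T    = 300.0          # assumed room temperature, K
m_H2 = 3.35e-27       # approximate mass of a hydrogen molecule, kg

mean_speed = math.sqrt(8 * k_B * T / (math.pi * m_H2))   # kinetic-theory mean speed
print(f"mean speed ≈ {mean_speed:.0f} m/s, "
      f"head-on closing speed ≈ {2 * mean_speed:.0f} m/s")
```

This gives roughly 1,800 m/s per molecule, or about 3,500 m/s for a head-on encounter, of the same order as the figure above.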
But it is rather difficult to imagine how a collision of such ‘vacuum’ atoms could result in such a ‘perfect’ collision.
However this picture is a simple one; the tiny material structure of the nuclei of atoms, as postulated today by particle physicists, is one of extreme complexity, said to be composed of around 300 separate particles.
This hypothetical structure is the result of a huge investment by governments (i.e. taxpayers) around the world over the last 70-80 years, exemplified by the cost of the Large Hadron Collider at CERN, which has cost over $13 billion to date and has an annual budget of $1 billion.
But for all this effort a commentator has said “There have been tremendous advances in most areas of physics, such as materials science and hydrodynamics, which remain tied to experiment, but since the development of QED in 1928-1930 there have been no major gains in our understanding of the underlying structure of matter” 1
1 ‘The Big Bang Never Happened’, Eric J. Lerner, p. 358
It could be said that these advances are due to the fact that in these disciplines solid and liquid matter are today analysed by applied physicists using Huygens’ concept of a continuum of atoms; in other words atoms in these states are treated as forming a continuous structure.
But today, for theoretical physicists, the atom is in essence composed of a nucleus, with the extent of the atom’s influence defined by a ‘cloud’ of particles – electrons. The nucleus and the surrounding electrons are said to be separated by a “perfect vacuum”* (*The Void, Frank Close, Oxford University Press, 2007), a vacuum which occupies almost all of the volume of an atom, while the proportion of matter represented by these sub-atomic particles is one trillionth of its total volume; and atomic interactions are held to be based upon the atoms’ ‘kinetic’ motion within an extra-atomic vacuum.
The reason for physicists’ focus on the atom’s internal structure is that, since there is no possibility of the transmission of forces through the vacuum separating such discontinuous atoms, they live in hope that somehow the answers could lie in the sub-atomic structure of the atom itself.
But if the atom is itself almost entirely a perfect vacuum and its mass is overwhelmingly concentrated in the nucleus, then again there is no possibility of a sensible explanation for the transfer of a force from the mass of the nucleus outward to, and beyond, its outer periphery.
Clearly this Standard Model ‘vacuum’ atom is an absurdity; the suggestion that its mass is concentrated at its central core, however, is not. But this does not mean that the remaining volume is empty of matter: no one, and certainly no physicist, knows what matter ultimately is, and so they cannot say with any certainty that matter is confined to the nucleus and electrons and so is ‘here’ and not ‘there’.
But if gravity is a function of mass, then perhaps it is time, after decades of failure, to consider that there is something fundamentally wrong with the theory on which all of theoretical physics is based, and which needs a relatively vast inter-atomic, non-material ‘empty space’ to function, i.e. the kinetic atomic theory of gases.
If the atom is the ultimate natural repository of matter, then surely it is also, both collectively and individually, the ultimate natural source of all forces and the ultimate natural vehicle for their transmission.