Survey of Anthropic Principle
The Anthropic Principle is one of the most intriguing arguments connecting God’s creation (as studied by science) with probability. Those familiar with AppliedTruth know that we enjoy testing data and the probabilities of events in order to see the possible outcomes and how likely they are to occur. We also love Modal Logic, which encompasses Epistemic Probability.
Epistemic Probability (sometimes called subjective probability) is the model used to assess the probability of a statement within our current state of knowledge (i.e., given what we know). The model allows us to judge whether something is actually possible or not.
The Anthropic Principle was proposed in Poland in 1973, during a special two-week series of symposia commemorating the 500th anniversary of Copernicus’s birth. It was proposed by Brandon Carter, who, on Copernicus’s birthday, had the audacity to proclaim that humanity did indeed hold a special place in the Universe, an assertion that runs exactly opposite to Copernicus’s now universally accepted principle.
The following is the official definition of the WAP:
“Weak Anthropic Principle (WAP): the observed values of all physical and cosmological quantities are not equally probable but they take on the values restricted by the requirement that there exist sites where carbon-based life can evolve and by the requirement that the Universe be old enough for it to have already done so.” (The Anthropic Cosmological Principle by John Barrow and Frank Tipler, p. 16)
The definition of the SAP is as follows:
“Strong Anthropic Principle (SAP): the Universe must have those properties which allow life to develop within it at some stage in its history.” (The Anthropic Cosmological Principle, p. 21)
In addition to the WAP and SAP, there are the Participatory and Final Anthropic Principles. The Participatory Anthropic Principle states not only that the Universe had to develop humanity (or some other intelligent, information-gathering life form) but that we are necessary to its existence, as it takes an intelligent observer to collapse the Universe’s waves and probabilities from superposition into relatively concrete reality. The Final Anthropic Principle states that once the Universe has brought intelligence into being, it will never die out. These two are also very speculative.
Some examples of the Anthropic Principle:
Explosive force of the Big-bang
The explosive force of the big bang had to be fine-tuned to match the strength of gravity to one part in 100000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000.
This is one part in 10^60. The number 10^60 is a 1 followed by 60 zeros.
This precision is the same as the odds of a random shot (bullet from a gun) hitting a one-inch target from a distance of 20 billion light-years.
Epistemic probability: 0.00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00001
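The “one part in 10^N” figures used throughout can be expanded mechanically. A minimal Python sketch (the function name is mine, purely illustrative):

```python
from decimal import Decimal

def one_part_in(exp):
    """Write 10**-exp as a plain decimal string, e.g. '0.00...01'."""
    return format(Decimal(1).scaleb(-exp), 'f')

p = one_part_in(60)
# "0." followed by 59 zeros and a final 1: exactly 60 digits after the point.
assert p == '0.' + '0' * 59 + '1'
```

The same function reproduces the decimal expansions given for the other exponents (10^-50, 10^-40, 10^-37, and so on).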
Density-of-matter in the Big-bang
In the big bang, the density of matter in the universe after the Planck time (a tiny fraction of a second after the big bang) had to match the critical density to better than one part in 100000 00000 00000 00000 00000 00000 00000 00000 00000 00000.
This is one part in 10^50, which is a 1 followed by 50 zeros.
Epistemic probability: 0.00000 00000 00000 00000 00000 00000 00000 00000 00000 00001
The inflationary Big-bang
In the inflationary big bang, the cosmological constant and a particular force need to be fine-tuned for galaxies and planets to form.
The net result is a situation with an epistemic probability of one part in 10^81, which is a 1 followed by 81 zeros.
Epistemic probability: 0.00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 1
Lambda in the inflationary Big-bang
In the inflationary big bang, bare lambda and quantum lambda (two components of the cosmological constant) had to be fine-tuned to cancel each other to better than one part in 100000 00000 00000 00000 00000 00000 00000 00000 00000 00000, for galaxies and planets to form.
This is one part in 10^50, which is a 1 followed by 50 zeros.
Epistemic probability: 0.00000 00000 00000 00000 00000 00000 00000 00000 00000 00001
The Strong Force
The strong force (which binds particles in atomic nuclei) had to be balanced with the weak nuclear force to about one part in 100000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000.
This is one part in 10^60, which is a 1 followed by 60 zeros.
Epistemic probability: 0.00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00001
Gravity
The force of gravity had to be tuned to one part in 100000 00000 00000 00000 00000 00000 00000 00000, for stars capable of supporting life to exist (based on the balance between electromagnetic and gravitational forces).
This is one part in 10^40, which is a 1 followed by 40 zeros.
Epistemic probability: 0.00000 00000 00000 00000 00000 00000 00000 00001
Electrons & Protons
The number of electrons had to match the number of protons to one part in 100000 00000 00000 00000 00000 00000 00000 00, for the formation of stars and planets.
This is one part in 10^37, which is a 1 followed by 37 zeros.
Epistemic probability: 0.00000 00000 00000 00000 00000 00000 00000 01
Carbon Resonance
A nuclear resonance had to exist for the formation of carbon (via alpha-particle collision with beryllium-8) and had to be tuned close to a specific energy, enabling a brief window of opportunity for carbon to form.
Without this, there would be negligible carbon in the universe.
Carbon is the only element designed to be capable of forming the long molecular chains necessary for the complexity required by life (silicon, for instance, forms much shorter and less versatile chains that lack the specified complexity life requires).
Oxygen Resonance
A nuclear resonance for formation of oxygen had to be tuned to prevent complete cannibalization of carbon (via alpha-particle collision with carbon, resulting in oxygen).
If the oxygen resonance were half a percent higher, there would be negligible carbon in the universe and on earth, and hence no element capable of forming the long molecular chains necessary for the complexity required by life.
Particle masses
Proton, neutron and electron masses had to be fine-tuned to enable life.
For instance, free neutrons decay to form protons. If the proton mass were slightly higher, the opposite would happen, resulting in a universe full of neutronium.
There would be no elements (no hydrogen, oxygen, carbon) and no way to create the molecular-complexity required for life.
Weak Nuclear Force
The weak-nuclear force had to be fine-tuned to enable life.
Slightly stronger, and no helium or heavier elements would form. And there would be no means to create the molecular-complexity required for life.
Slightly weaker, and no hydrogen would remain (to provide fuel for steady-burning stars needed as sources of energy for life).
Also, supernova explosions would not be able to disperse the medium-to-heavy elements created in stars.
Elements such as carbon (for molecular chains basic to life), iron (for hemoglobin), copper and other elements used in life-forms were originally created in stars, then dispersed by supernova explosions, to finally reach/coalesce into earth…
Dimensions
The number of dimensions in our universe had to be fine-tuned to enable life.
The topological and physical laws of the universe require more than two spatial dimensions, and fewer than five extended dimensions, for the stability and complexity required by life…
This requirement is met in our universe, with 3 extended spatial-dimensions and 1 temporal dimension.
Carbon chemistry
Lee Smolin (a world-class physicist and a leader in quantum gravity) estimates that if the physical constants of the universe were chosen randomly, the epistemic-probability of ending up with a world with carbon chemistry is less than one part in 10^220.
This epistemic-probability is one part in: 10000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 0.
Epistemic Probability: 0.0000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 1
Cosmological Flatness
Lee Smolin (physicist) estimates the epistemic-probability for the "equivalent-temperature" of the universe being such as to enable cosmological flatness, to be one part in 10^32.
Epistemic Probability: 0.00000 00000 00000 00000 00000 00000 01
Quantum Gravity & Cosmological Flatness
Looking at Quantum Gravity and what it would take to obtain flat Euclidean 3D space up to cosmological scales (as observed in our universe)…
We can calculate the epistemic probability of this occurring by random chance, using spin networks from Roger Penrose, as applied to quantum gravity by Lee Smolin and his co-workers. The number of predicted spin-network nodes in our universe would be at least 10^180. Allowing a 10% deviation from cosmological flatness, we end up with an epistemic probability of less than one part in 10^(10^180).
This is one part in 10^(10^180), which is a 1 followed by 10^180 zeros.
Epistemic Probability: 0.0000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 … … … … … 00001
If I were to write this number out, as 0.0000 0000 …, with all of its zeros, we would need a computer hard-drive much larger than the size of our entire universe, just to hold all of the zeros that I would have to write out.
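The jump to one part in 10^(10^180) follows if each of the ~10^180 spin-network nodes is treated as independently needing to land inside the 10% window. That independence assumption is mine, added for illustration; the text does not spell out Smolin’s actual derivation. Working in base-10 logarithms keeps the number representable:

```python
import math

nodes = 1e180    # predicted spin-network nodes (from the text)
window = 0.10    # allowed 10% deviation from flatness (from the text)

# Joint probability if each node independently falls in the window:
# p = window**nodes; its base-10 logarithm is nodes * log10(window).
log10_p = nodes * math.log10(window)
# log10_p is about -1e180, i.e. p is about one part in 10**(10**180).
```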
The big-bang (reprise)
The big-bang had to result in a universe with relatively low-entropy (a high degree of thermodynamic-order), which could then proceed to increase in entropy with time, thus enabling formation of galaxies, stars, planets and ultimately enabling life to function once it was created.
In 1989, Roger Penrose (a world-class mathematician) calculated the precision required to create our universe with the necessary thermodynamic order and to send it on its way (to develop in a manner compatible with life). His calculated precision was one part in 10^(10^123).
This is one part in 10^(10^123), which is a 1 followed by 10^123 zeros.
Epistemic Probability: 0.0000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 … … … … … 00001
If I were to write this number out, as 0.0000 0000 …, with all of its zeros, we would need a computer hard-drive much larger than the size of our entire universe, just to hold all of the zeros that I would have to write out.
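The hard-drive remark can be checked with simple arithmetic, using the commonly cited order-of-magnitude estimate of about 10^80 atoms in the observable universe (an estimate I am supplying; it is not in the text):

```python
digits_needed = 10**123      # zeros in Penrose's one part in 10**(10**123)
atoms_in_universe = 10**80   # common order-of-magnitude estimate (assumption)

# Even storing one digit per atom, storage falls short by a factor of 10**43.
shortfall = digits_needed // atoms_in_universe
assert shortfall == 10**43
```

So even a universe-sized storage device, one digit per atom, could hold only a minuscule fraction of the zeros.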
So, what does all this mean? It means it is reasonable to conclude that our universe did not get here by accident. The epistemic probabilities are far too low for the universe to have arisen by random chance. These numerous observations of extremely low epistemic probabilities point to an Intelligent Designer (God) having designed, created and fine-tuned the universe.
It should be noted that there are strong objections to epistemic probability, so these figures should not be used as a frontal argument; rather, let the Anthropic Principle do its own work.