Late last year, Wall Street Journal columnist Eric Metaxas published an interesting piece titled "Science Increasingly Makes the Case for God". The article argues that recent scientific discoveries about the universe increase the likelihood that the cosmos is the result of an intelligent designer. Though the article is brief, we recommend a studious reading of it, as it brings to the fore several essential problems with purely materialist theories of the origin of the universe. In this essay, we will examine probability and the concept of "statistical impossibility," and how the latter undermines the materialist assumption that, given enough time, anything is possible.
The Metaxas article cannot be read without a subscription to the Wall Street Journal, but we have included a PDF copy of it here for your review. A perusal of this brief article will be helpful in understanding our essay.
I. Parameters Necessary for Supporting Life
The Metaxas article centers on the question of the conditions necessary for life to emerge. One of the more traditional arguments against God's existence states that God was only invoked by human beings as a means of explaining phenomena whose causes remained obscure. As the age of science progresses and we learn more about the natural world, it becomes less and less necessary to invoke a creator to explain physical phenomena. Hence, one day, theist cosmologies will wither away altogether.
This is the so-called "God of the gaps" argument. This argument is severely flawed in that it misunderstands the fundamental religious impulse of man (see here). But the argument can also be flipped around on the atheist: if it is alleged that God is invoked to explain natural phenomena whose causes remain unknown, it can equally be argued that naturalistic explanations are invoked to explain developments that are simply not possible in a materialistic framework. The two greatest of these developmental leaps are the jump from non-life to life and, more fundamentally, the shift from non-being to being.
We have already written on the absurd attempts of science to find naturalistic ways to explain the coming into existence of matter from nothing (see, "Lawrence Krauss's 'Nothing' is not Nothing"). Metaxas' article deals with the second issue - the transition from non-living matter to living beings.
It was assumed back in the 1960s and 1970s that life would naturally emerge when the right material "ingredients" were mixed together. In 1966, agnostic cosmologist Carl Sagan opined that only two essential ingredients were needed to create conditions conducive to the emergence of life: the right kind of star, and a planet located the proper distance from that star. Given what was then known about the universe, these criteria should have made the eventual discovery of other life in the universe quite probable. Quoting Metaxas:
"Given the roughly octillion—1 followed by 27 zeros—planets in the universe, there should have been about septillion—1 followed by 24 zeros—planets capable of supporting life. With such spectacular odds, the Search for Extraterrestrial Intelligence [SETI], a large, expensive collection of private and publicly funded projects launched in the 1960s, was sure to turn up something soon." 
But, as we came to learn more about the universe and about the miracle of life, science realized that the necessary components of life were not that simple. Sagan's original estimate of two necessary conditions was eventually raised to 10; a later revision in light of further discoveries put the number at 20. By the turn of the 21st century, scientific advances had set the number of necessary conditions for life at around 50, which of course drastically reduces the number of potentially life-supporting planets in the universe. The one septillion potentially life-supporting planets of 1966 had, by 2000, been reduced to only a few thousand. And with each new discovery about the universe or about biological life, these figures are revised again, further raising the number of conditions and reducing the number of planets capable of supporting life.
By 2006, the estimates of the necessary conditions for life had risen so high that a representative of the SETI project wrote an article in the Skeptical Inquirer abjectly admitting that all earlier estimates of the statistical likelihood of finding extraterrestrial life "may no longer be tenable" and that expectations of finding a planet that could support life should be put to rest.
In fact, as further conditions for supporting life continued to be discovered, the mathematical probability of there being any planets able to support life dropped to effectively zero. The odds were against any planet supporting life - even this one. Today the list of parameters necessary to support life stands at around 200 and will probably keep growing in the near future. As Metaxas says, "The odds against life in the universe are simply astonishing."
II. Probability and Statistical Impossibility
Statistics is the study of data, usually for the purpose of determining the probability of an event or condition. A probability is the likelihood that something will occur, expressed in mathematical terms, and calculating probabilities is governed by certain mathematical principles. If there are three cups and only one has a ball beneath it, the probability of choosing the cup with the ball on the first pick is one out of three, or 1:3. This is because there is only one "right" or desired outcome, out of three equally likely possibilities. This is relatively simple.
Probability gets more complicated when the outcome you are studying requires multiple factors all arranged in a particular way. For example, suppose we have six blocks, each bearing a different number. What is the likelihood of randomly choosing any one number? That's easy: one in six (1:6). But suppose we want to know the likelihood of throwing the six blocks and getting a particular sequence, say 6,1,2,5,4,3. How do we determine that probability?
To calculate this, we use a mathematical function called the factorial. The factorial function computes the number of permutations that can be constructed from a set of objects. In the problem posed above, we need to know the probability of getting one particular sequence; but to know this, we first need to know how many possible sequences there are. To find that, we multiply 6 x 5 x 4 x 3 x 2 x 1 (in shorthand, the symbol for factorial is !, so six factorial is written 6!). The answer is 720, which means there is a 1:720 likelihood of randomly throwing the sequence 6,1,2,5,4,3 on the first try. Or, to say it another way, you would expect, on average, to throw the blocks 720 times before getting the desired sequence.
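The arithmetic above is easy to check directly. Here is a minimal sketch in Python (the numbered-blocks scenario is just the hypothetical example from the text):

```python
import math

# Number of distinct orderings (permutations) of six numbered blocks:
# 6! = 6 x 5 x 4 x 3 x 2 x 1
orderings = math.factorial(6)
print(orderings)  # 720

# Any one particular sequence, such as 6,1,2,5,4,3, is one of those
# 720 equally likely orderings, so its probability is 1 in 720.
print(1 / orderings)  # about 0.00139
```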
The odds against something obviously rise dramatically as the number of variables required to come together grows. Even small increases in the number of factors affect the probability drastically. The factorial of 6 is 720, but the factorial of 10 is 3,628,800. A particular arrangement of 10 blocks is over 5,000 times more unlikely than a particular arrangement of 6 (3,628,800 ÷ 720 = 5,040).
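To see how quickly the factorial outruns the number of items, compare a few values (a quick sketch):

```python
import math

# Factorials grow explosively as the count of items rises.
for n in (6, 10, 20):
    print(n, math.factorial(n))

# Going from 6 blocks to 10 multiplies the number of possible
# orderings by 10!/6! = 7 * 8 * 9 * 10 = 5,040.
print(math.factorial(10) // math.factorial(6))  # 5040
```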
Now, the fact that something has a low probability does not mean you will definitely have to exhaust the entire realm of possibility to get the desired outcome. The odds of winning the Mega Millions jackpot are 1:258,890,850, but in fact somebody eventually wins it nearly every time. The odds of being struck by lightning twice in your lifetime and surviving are extremely low, but it has happened to people.
The fact that a probability is remote does not mean something could not happen, only that it is less likely to happen. In most cases, as long as any probability exists, the chance of the event occurring cannot be ruled out.
We say "in most cases" because there are certain situations in statistics where this does not hold - so-called cases of "statistical impossibility." Statistical impossibility refers to a situation where, although there remains a mathematical "chance" that something could occur, its probability is so low - and the odds against it so enormously high - that it cannot be admitted as a possibility in any rational argument. Practically speaking, it is impossible.
And it does not take much to get there. Suppose we have a 50-piece children's puzzle. For the puzzle to come together correctly, each piece needs to be in its proper place. What is the probability that we can dump the pieces of this puzzle on the floor and they will all just fall together in exactly the right locations? We'd need to figure out the factorial of 50. The factorial of 50 (expressed as 50!) comes out to about 3.0414093 x 10^64. That's some seriously long odds. How long? For comparison, the number of atoms in the entire observable universe is commonly estimated at around 10^80. Now, some might object that because there is still a chance (1:3.0414093 x 10^64) we cannot call this "impossible"; that, per our examples of the lightning strike and the lottery, even remote odds can be realized.
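Python's arbitrary-precision integers compute 50! exactly, which makes the scale of the number easy to verify (a sketch):

```python
import math

# Possible arrangements of a 50-piece puzzle: 50!
orderings = math.factorial(50)

# Exact integer, displayed in scientific notation for readability.
print(f"{orderings:.7e}")   # 3.0414093e+64
print(len(str(orderings)))  # 65 -- a 65-digit number
```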
But we really need to understand the massive distinction between, say, 1:258,890,850 (the odds of winning the Mega Millions jackpot) and 1:3.0414093 x 10^64. As we move into probabilities where the odds against something run into exponents of this size, we come up against a "curve of impossibility" that it becomes logically meaningless to argue against. Even though there are remote mathematical odds that I could pass through a solid brick wall by virtue of quantum tunneling, we know it will never happen. I can walk into that brick wall for all eternity and never go through it. It is impossible. Such conditions are what is meant by "statistical impossibility."
III. The Complexity and Order of the Universe
Atheists often quip that the Christian defense of God's design in the universe boils down to "the universe is way too complicated" and therefore must have a creator. This is over-simplified; complexity alone does not necessitate intelligence, but something called specified complexity does. Whereas simple complexity denotes mere intricacy, specified complexity occurs when a configuration can be described by a pattern that displays a large amount of independently specified information and is also complex - that is, has a low probability of occurring by chance. In other words, it is not enough that the configuration be complex; it must also conform to a very specific pattern.
A classic example of this is in genetics, where individual genes must be arranged in very specific patterns; every gene needs to be located in a particular place for the organism to function properly. The configuration is not just complex but also specified.
Remember our 50-piece puzzle and the odds against dumping it out and finding all the pieces in the correct locations? One of the very simplest organisms known to man is Mycoplasma genitalium, a tiny parasitic bacterium that lives on cells within the human urinary tract. Forget single-celled organisms; this organism is so tiny it lives on cells. It is the smallest known free-living organism. Mycoplasma genitalium has 525 genes, and all 525 must be arranged in a particular genetic sequence. What are the odds of this happening by chance?
As a matter of fact, according to an online factorial calculator, it is around 1:6.890802624 x 10^1201 - that is, the odds against are a number with more than 1,200 digits. The odds against 525 genes coming together "on their own" in this specified pattern are so absurdly gigantic that this is a clear case of statistical impossibility.
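This figure does not require trusting an online calculator; 525! can be computed exactly (a sketch - note the number is far too large for ordinary floating-point arithmetic, so we inspect its digits instead):

```python
import math

# Possible orderings of 525 genes: 525!
odds = math.factorial(525)

digits = str(odds)
print(len(digits))  # 1202 -- a number with 1,202 digits
print(digits[:4])   # leading digits, consistent with 6.89... x 10^1201
```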
The odds of this are so remote as to constitute a practical impossibility. And this is just the arrangement of components necessary for the simplest free-living genomic organism known to man. One can imagine the statistical difficulties we would be immersed in trying to explain the materialist, unguided development of even the most basic light-receptive eye, let alone the simplest brain.
The odds become even more remote when we realize that there is no cumulative increase in probability from one attempt to the next. We are all taught in basic math that when flipping a coin, getting heads two or three times in a row does not make heads any more likely on the next flip. Even if you should flip heads a million times in a row, the probability of getting heads on the million-and-first flip is still 50/50. This means that if you shake that puzzle box once every second for a hundred thousand years in the vain hope of getting the pieces to come together on their own, your probability of success on any given attempt is just as remote as it was when you first started.
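The independence of successive trials is easy to demonstrate with a short simulation (a sketch using Python's random module; the fair coin is the standard textbook example):

```python
import random

random.seed(42)  # fixed seed so the sketch is reproducible

# Simulate a long run of fair coin flips (True = heads).
flips = [random.random() < 0.5 for _ in range(100_000)]

# Among flips that immediately follow three heads in a row, count how
# often the next flip is also heads. Independence predicts about 0.5,
# no matter what streak preceded the flip.
after_streak = [flips[i + 3] for i in range(len(flips) - 3)
                if flips[i] and flips[i + 1] and flips[i + 2]]
print(sum(after_streak) / len(after_streak))  # close to 0.5
```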
This is important because there is a materialist fallacy that anything is possible given enough time; that an occurrence which logic and mathematics tell us is impossible at any one time suddenly becomes possible if we stretch the time frame out millions or billions of years. But since there is no cumulative increase in the probability of any individual attempt over time, there is no reason why this should be so. Odds of 1:6.890802624 x 10^1201 remain 1:6.890802624 x 10^1201 whether we are on Attempt 1 or Attempt Ten Million.
Biologically speaking, this strays into other issues, such as irreducible complexity. We will discuss this, as well as the concept of specified complexity, in further articles.
The point is, as Metaxas points out in his article, when we get to the point where a planet must satisfy all 200 known parameters for supporting life, the probability that such a planet could ever exist anywhere approaches statistical impossibility - and this includes our own planet. The existence of life on this planet is itself the greatest miracle, something that science increasingly confirms could not have "just happened."
Eric Metaxas, "Science Increasingly Makes the Case for God," Wall Street Journal, Dec. 24, 2014, available online at http://www.unamsanctamcatholicam.com/images/metaxas.pdf