The second round presented by Dennis McCarthy was very interesting for what it revealed about how much, and how little, of evolution and its various mechanisms is understood by even its more intelligent defenders. I will not bother responding to the first part of the post, as it is a very good example of what I consider to be a Wistarian response: attempting to address a mathematical challenge with an orthogonal appeal to tangential logic. The distinctions between the various levels of speciation, the similarity of birds and dinosaurs, and the various familial clades are no more relevant to any argument I have presented than who wins the Super Bowl this coming weekend.
Now, here’s where the storytelling gets substantive.
“Ahh,” says the evolution-skeptic, “I don’t care about fossils or biogeography or stories about salamanders or moths. Vox Day has proved mathematically that it can’t happen, so I don’t even have to think about any of this.”
First, Vox Day’s central argument in Probability Zero concerns neutral mutation fixation rates, which says nothing about natural selection and is largely orthogonal to most of what we have been discussing. Even if Motoo Kimura’s neutral theory—and the equations Vox Day disputes—were entirely mistaken, that would not overturn Darwinian evolution, nor would it undermine any of the empirical facts or conclusions considered so far. Vox Day himself effectively concedes as much in his response:
And in the interest of perfect clarity, note this: Dennis McCarthy’s critique of Probability Zero is not, in any way, a defense of evolution by natural selection. Nor can it be cited as a defense of speciation or Darwinism at all, because neutral theory has about as much to do with Darwin as the Book of Genesis.
Actually, my full post response (like this one) did indeed defend evolution by natural selection. And the only reason I veered from the subject of Darwinism at all was to address Vox Day’s main mathematical arguments—and it is Day’s main arguments that are not relevant to Darwinism or evolution by natural selection. And this is true despite what Day frequently implies, what his readers persistently infer, and what the subtitle of Probability Zero plainly states.
Unfortunately, it’s at this point that it becomes clear that McCarthy either hasn’t actually read Probability Zero or somehow managed to miss the central point of the book despite it being right in the subtitle. He has confused my secondary mathematical argument for the primary one. The secondary argument addresses Kimura’s neutral theory, which is a) not part of the Modern Synthesis, b) does not involve natural selection, c) algebraically doomed to be mathematically incorrect, and d) the non-Darwinian ground to which professional biologists have retreated upon recognizing that natural selection is incapable of accounting for the observed genetic divergence between any two distinct but related species.
It’s a little hard to understand how McCarthy manages to focus on the 1,600 generations-per-fixation rate measured in the lab with E. coli bacteria while somehow missing the entire mathematical argument that started this whole discussion back in 2019 with MITTENS, which is entirely and only concerned with natural selection. This is the core equation that is integral to MITTENS, which McCarthy still has not addressed (a worked sketch follows the variable definitions below):
F_max = (t_div × d) / (g_len × G_f)
- F_max = maximum achievable fixations
- t_div = divergence time (in years)
- g_len = generation length (in years)
- d = Selective Turnover Coefficient
- G_f = generations per fixation
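To make the arithmetic concrete, here is a minimal Python sketch of the F_max calculation. Every parameter value in it is an illustrative placeholder rather than a figure taken from the MITTENS papers (only the 1,600 generations-per-fixation number is the lab figure cited above); substitute taxon-appropriate numbers when you run it yourself.

```python
# Minimal sketch of the MITTENS F_max calculation.
# All parameter values below are illustrative placeholders
# (d in particular is a stand-in, not a published value).

def f_max(t_div_years, d, g_len_years, g_f):
    """F_max = (t_div × d) / (g_len × G_f), the maximum achievable fixations."""
    return (t_div_years * d) / (g_len_years * g_f)

t_div = 9_000_000   # divergence time in years (illustrative)
d     = 1.0         # Selective Turnover Coefficient (placeholder)
g_len = 20          # generation length in years (illustrative)
g_f   = 1_600       # generations per fixation (lab-measured figure cited above)

print(f"Maximum achievable fixations: {f_max(t_div, d, g_len, g_f):,.1f}")
```

Whatever values one plugs in, the comparison of interest is between this F_max and the number of fixed differences actually observed for the species pair in question.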
As you can easily verify for yourself, the MITTENS formula not only disproves any possibility of natural selection accounting for the observed post-CHLCA human-chimpanzee divergence, but has also been confirmed to disprove any possibility of natural selection accounting for every single divergence between two species for which we have the necessary data to check, including Human–Chimp, Human–Gorilla, Human–Orangutan, Bonobo–Chimp, W–E Gorilla, S–B Orangutan, Chimp subspecies, D. mel–D. sim, Mus–M. spretus, Chicken–Turkey, Horse–Donkey, Afr–Asian Elephant, Sav–Forest Elephant, L. Victoria cichlids, L. Malawi cichlids, and Stickleback M–F.
All of the detail concerning this can be found in “The Universal Failure of Fixation: MITTENS Applied Across the Tree of Life” paper which I have posted publicly in an open repository. Run the numbers for yourself if you are skeptical. And, by the way, you should probably note that the argument does not rely upon G_f, the 1,600 generations per fixation number, because running the calculation the opposite way shows that the selection coefficients required to account for the fixations are not possible either, as demonstrated in “Minimum Selection Coefficients Required for Speciation: A Cross-Taxa Quantitative Analysis” which appears in The Frozen Gene.
However, even though McCarthy hasn’t yet addressed my actual case against natural selection, he did touch usefully on it when he appealed to a study on adaptive variation.
Day starts discussing beneficial mutations, and in regional mammalian populations, beneficial mutations can, under severe selective pressures, sweep to fixation across the entire population in only tens or hundreds of generations. Numerous lab and field studies confirm this (e.g., Steiner, C. C., Weber, J. N., & Hoekstra, H. E. (2007). Adaptive variation in beach mice produced by two interacting pigmentation genes. PLoS Biology 5(9): e219.)
As before, McCarthy’s critique proved to be very beneficial to the larger case against natural selection I’ve been making, as a close look at the study revealed that what I had originally intended as nothing more than a disproof also has application as a predictive model. The study allows us to demonstrate how MITTENS is capable of distinguishing between an adaptive mutational fixation that is possible within the observed time limits and the large quantity of speciating mutational fixations that are not. Remember, MITTENS never claimed that no fixations are possible over time, only that the possible number is much, much smaller than the observed speciating divergences require.
In fact, a review of the beach mice study led to another paper, this one entitled “The Scope of Natural Selection: MITTENS-Validated Case Studies in Local Adaptation” which demonstrates that MITTENS can be usefully applied to calculating the quantity of fixations that are possible in a given period of time, thereby converting it into an effective predictive model.
The MITTENS framework (Mathematical Impossibility of The Theory of Evolution by Natural Selection) establishes quantitative constraints on achievable fixations based on generation time, the selective turnover coefficient (d), and empirically observed fixation rates. While MITTENS demonstrates a 158,000-fold shortfall for macro-evolutionary divergence (e.g., human-chimpanzee), critics might argue that local adaptation represents an intermediate test case. Here we examine four well-documented examples of local adaptation: beach mouse pigmentation, stickleback armor reduction, peppered moth melanism, and warfarin resistance in rats. In every case, the required genetic changes involve 1–3 fixations—precisely the scale MITTENS predicts natural selection can accomplish. Using taxon-appropriate parameters and, where available, empirically measured selection coefficients, we show that all four cases pass MITTENS constraints. The peppered moth case is particularly instructive: MITTENS predicts 0.66 achievable fixations, implying the allele should reach high frequency but not fix—exactly what was observed before selection reversed. These results confirm that natural selection operates effectively within its proper scope while remaining incapable of the million-fold extrapolation required for macro-divergence.
Now let’s look at the next element of his critique:
So in the first generation after the chimpanzee/human split, there were 1,000,000 new mutations—1/20,000 of which may be expected to reach fixation—or 50 fixed mutations per generation. But we should not expect these 50 mutations to fix immediately, but after 40,000 generations. 50 more mutations from the 2nd generation should fix around the 40,001st generation. And so on.
Since hominids have had 450,000 generations, all mutations would have had time to fix except for those mutations occurring in the last 40,000 generations. What is more, the human race has been widely dispersed for tens of thousands of years, with some populations living in constant isolation, necessarily preventing them from sharing mutations with the rest of the world.
So let’s subtract out the last 50,000 generations, which leaves us with 400,000 generations. 400,000 generations × 50 fixed mutations per generation = 20 million fixed mutations. We can also calculate this another way: 400,000 generations × 1,000,000 new mutations per generation = 400 billion new mutations altogether. Each new mutation has a probability of 1/20,000 of becoming fixed, so: 400 billion mutations × 1/20,000 = 20 million fixed mutations.
And that equals the 20 million fixed mutations that have been observed. Change the assumptions, and the estimate moves. But under Vox’s own assumptions, the result is the opposite of “probability zero”: it’s what you’d predict.
What McCarthy has focused on here is neutral theory, and specifically the Kimura fixation model, which inevitably produces mathematically incorrect results because its derivation is algebraically incorrect. The correct equation is the one presented in The Frozen Gene, which, to be fair, McCarthy has not read. Nor would he have any reason whatsoever to suspect that the Kimura model he’s using is incorrectly derived and that its results are hopelessly wrong. Just to be clear, the correct equation is this one:
k = 2Nμ × 1/(2Nₑ) = μ(N/Nₑ)
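For illustration, here is a minimal sketch of that substitution-rate arithmetic. The values of N, Nₑ, and μ are figures that appear elsewhere in this post, used here purely as illustrative inputs; the standard textbook neutral result, k = 2Nμ × 1/(2N) = μ, is computed alongside the equation above for comparison.

```python
# Substitution rate per generation under two formulations.
# N, N_e, and mu are illustrative inputs only.

N   = 8_000_000_000   # census population size (illustrative)
N_e = 3_300           # effective population size (illustrative)
mu  = 1.5e-8          # per-site mutation rate per generation

k_textbook = 2 * N * mu * (1 / (2 * N))    # standard neutral result; reduces to mu
k_above    = 2 * N * mu * (1 / (2 * N_e))  # the equation above: mu * (N / N_e)

print(f"k (textbook, equals mu):      {k_textbook:.3e}")
print(f"k (equation above, mu*N/N_e): {k_above:.3e}")
```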
But we’ll set that aside; given that the entire field of population genetics missed it for 58 years, we certainly cannot expect McCarthy to have noticed. It might, however, behoove my critics to be aware that this is the sort of thing I am capable of doing before they attempt to claim that I didn’t understand something about the standard theories. So what we’ll focus on instead is the fact that even if we grant the validity of the neutral theory model, McCarthy’s application of it is completely incorrect on multiple levels.
McCarthy’s version of the model requires that at any given moment, millions of mutations are simultaneously “in progress” toward fixation—each at a different frequency stage, all drifting upward in parallel. That’s as per Kimura’s theory. However, this runs right into the Bernoulli Barrier. For 1,000,000 mutations “in the pipeline” simultaneously, the probability that they all successfully navigate drift to fixation—rather than being lost—is astronomically small. Each neutral mutation has only a 1/(2Nₑ) chance of fixing rather than being lost. The vast majority of mutations that enter the population are lost, not fixed. McCarthy’s model treats the expected value as though it were guaranteed. “50 mutations per generation should fix, therefore 50 mutations per generation do fix.”
But that’s not how probability works. The expected value is the average over infinite trials; it is not a guaranteed throughput. And in the real world, you don’t get infinite trials; you get precisely one run.
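To see the gap between an expected value and a realized outcome, here is a minimal Wright-Fisher drift sketch. The population size is deliberately tiny and purely illustrative (a new neutral mutation that fixes takes on the order of 4N generations, so a small N keeps the run fast); the point is simply that most new mutations are lost, only a fraction near 1/(2N) fixes, and the realized count varies from run to run rather than arriving as a guaranteed throughput.

```python
import random

# Wright-Fisher drift for single new neutral mutations.
# N is a small, illustrative diploid population size; the principle is
# the same at any N, but large N makes the simulation slow.

def fate_of_new_mutation(n_diploid, rng):
    """Track one new neutral mutation (1 copy among 2N) until loss or fixation."""
    copies, total = 1, 2 * n_diploid
    while 0 < copies < total:
        p = copies / total
        # Binomial resampling of allele copies each generation.
        copies = sum(1 for _ in range(total) if rng.random() < p)
    return copies == total  # True if fixed, False if lost

rng = random.Random(42)
N = 100           # illustrative diploid population size
trials = 5_000    # one new mutation per trial

fixed = sum(fate_of_new_mutation(N, rng) for _ in range(trials))
print(f"Expected fixation probability 1/(2N): {1 / (2 * N):.4f}")
print(f"Realized fixation fraction:           {fixed / trials:.4f}")
print(f"Lost: {trials - fixed:,} of {trials:,} mutations")
```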
Unlike many evolutionary biologists, McCarthy does correctly grasp that geographic dispersion renders fixation impossible.
Evolutionary theory predicts that no mutation, whether neutral or beneficial, that has arisen in the last 50,000 years or so can reach and spread throughout all populations on the planet. The reason is that over that time—and especially over the last 10,000 years—human populations have become fragmented and geographically isolated in places such as New Guinea, Australia, Tasmania, the Andaman Islands, the Pacific islands, and the Americas. Most people of these regions have had effectively zero genetic contact with the rest of the world until very recently, if at all.
Under such conditions, it has been impossible for any single mutation—whether neutral or beneficial—to reach fixation across the entire human species. The genes that helped some Europeans survive the Black Death in the 1300s, for example, could never have also raced across the Americas (neither group even knew each other existed at this time), let alone reach the Hewa people of New Guinea, who would not see a white person until 1975. Instead, the roughly 20 million fixed genetic differences between humans and chimpanzees accumulated during the millions of years when ancestral hominid populations were relatively small, geographically concentrated, and tightly interconnected by gene flow.
That’s a logical conclusion. But here’s the problem with it: genetic drift doesn’t stop, and once fixed does not mean always fixed. So let’s run the numbers and assume that the 20 million fixed differences between humans and the proto-chimp – not chimpanzees, because with chimpanzees we need to account for a total of 40 million – mostly happened very early in the process. In a population with effective size Nₑ = 10,000 (the value McCarthy uses), the expected time until a new mutation arises at any specific fixed site is:
Time to new mutation = 1 / (2Nₑ × μ)
Plugging in the values: 1 / (2 × 10,000 × 1.5 × 10⁻⁸) = 1 / (3 × 10⁻⁴) ≈ 3,333 generations
So on average, every fixed site will experience a new mutation within approximately 3,333 generations. Most of these new mutations will not fix themselves; the gene will become polymorphic, and the original fixation will be lost. So over long timescales, the cumulative probability that a site remains unchanged becomes vanishingly small.
The probability that a fixed site has experienced no new mutations over T generations is: P(unchanged) = e^(−2NₑμT)
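Here is the same arithmetic in executable form, using the Nₑ = 10,000 and μ = 1.5 × 10⁻⁸ values from above; the choices of T are illustrative horizons, nothing more.

```python
import math

N_e = 10_000    # effective population size (McCarthy's value)
mu  = 1.5e-8    # per-site mutation rate per generation

# Expected waiting time until a new mutation arises at a specific fixed site.
t_new = 1 / (2 * N_e * mu)
print(f"Expected generations until a new mutation at a fixed site: {t_new:,.0f}")

# Probability that a fixed site remains untouched after T generations.
for T in (10_000, 100_000, 400_000):  # illustrative horizons
    p_unchanged = math.exp(-2 * N_e * mu * T)
    print(f"P(unchanged) after {T:>7,} generations: {p_unchanged:.2e}")
```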
McCarthy’s model, necessarily corrected for continued mutation, predicts roughly 150,000 fixed differences between humans and the CHLCA. We observe approximately 20 million. The McCarthy model is therefore short by a factor of 133× even when we grant him a) an ancient Nₑ of 10,000, b) Kimura’s invariance, and c) 20 million free ancestral mutations.
The real number based on actual values is even worse for his attempted rebuttal. Using the math he presented, we adjust for a) the correct ancient effective population of 3,300 and b) Kimura’s corrected algebra, incorporating the actual current population of 8 billion.
See what a difference it makes (a quick numerical sketch of both calculations follows the two lists below):
McCarthy’s Original Calculation:
- Population: Nₑ = 10,000 (theoretical effective population size AND theoretical current population)
- Mutations per generation: 1,000,000
- Fixation probability: 1/2N = 1/20,000
- Generations: 400,000
- Total mutations: 400 billion
- Expected fixations: 400 billion × 1/20,000 = 20 million fixations
Corrected Calculation Using Actual Values:
- Nₑ = 3,300 (actual ancient effective population size)
- N = 8,000,000,000 (actual population)
- Mutations per generation: 100 per individual × 3,300 individuals = 330,000 mutations per generation
- Fixation probability: 1/2N = 1/16,000,000,000
- Expected fixations per generation: 330,000 × 1/16,000,000,000 = 0.0000206
- Over 400,000 generations: 0.0000206 × 400,000 = 8.25 fixations
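Both sets of figures above can be checked with a few lines of Python; nothing here is new, it is simply the listed arithmetic made executable.

```python
# McCarthy's original calculation (his stated assumptions).
generations    = 400_000
mutations_orig = 1_000_000           # new mutations per generation
p_fix_orig     = 1 / 20_000          # 1/(2N) with N = 10,000
fixations_orig = generations * mutations_orig * p_fix_orig   # 20,000,000

# Corrected calculation using the actual values listed above.
N_e            = 3_300               # ancient effective population size
N_current      = 8_000_000_000       # actual current population
mutations_corr = 100 * N_e           # 330,000 new mutations per generation
p_fix_corr     = 1 / (2 * N_current)
fixations_corr = generations * mutations_corr * p_fix_corr    # ~8.25

observed = 20_000_000
print(f"Original model:  {fixations_orig:,.0f} fixations")
print(f"Corrected model: {fixations_corr:,.2f} fixations")
print(f"Shortfall vs. observed {observed:,}: {observed / fixations_corr:,.0f}x")
```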
Thus the shortfall increases from 133× to 2,424,242× when we go from the theoretical to the actual.
There is absolutely nothing Dennis McCarthy or anyone else can do at this point to salvage either natural selection or neutral theory as an adequate engine for evolution and the origin of species. The one is far too weak to account for the empirically observed genetic changes and the second is flat-out wrong. Neither of them, alone or in combination with every other suggested mechanism for evolution, can come within several orders of magnitude of the quantity required to account for the genetic divergence in humans and chimpanzees, in mice and rats, in chickens and turkeys, or in savannah and forest elephants.