A Substantive Critique of PZ

Dennis McCarthy, the historical literary sleuth whose remarkable case for the true authorship of Shakespeare’s works is one of the great feats of historical detection, has aimed his formidable analytical abilities at Probability Zero. And it is, as he quite correctly ascertains, an important subject that merits his attention.

I believe this is one of my more important posts—not only because it explains evolution in simple, intuitive terms, making clear why it must be true, but because it directly refutes the core claims of Vox Day’s best-selling book Probability Zero: The Mathematical Possibility of Evolution by Natural Selection. Day’s adherents are now aggressively pushing its claims across the internet, declaring evolution falsified. As far as I am aware, this post is the only thorough and effective rebuttal to its mathematical analyses currently available.

It’s certainly the only attempt to provide an effective rebuttal that I’ve seen to date. Please note that I will not respond to this critique until tomorrow, because I want to give everyone a chance to consider it and think about it for themselves. I’d also recommend engaging in the discussion at his site, and doing so respectfully. I admire Mr. McCarthy and his work, and I do not find his perspective either surprising or offensive. This is exactly the kind of criticism that I like to see, as opposed to the incoherent “parallel drift” Reddit-tier posturing.

The book is first and foremost what I like to call an end-around. It does not present a systematic attack on the facts just presented—or, for that matter, on any of the vast body of empirical evidence that confirms evolution. It sidesteps entirely the biogeographical patterns that trace a continuous, unbroken organic thread running through all regions of the world, with the most closely related species living near each other and organic differences accruing with distance; the nested hierarchies revealed by comparative anatomy and genetics; the fossil record’s ordered succession of transitional forms (see pic); directly observed evolution in laboratories and natural populations; the frequency of certain beneficial traits (and their associated genes) in human populations, etc.

Probability Zero, instead, attempts to fire a mathematical magic bullet that finds some tiny gap within this armored fort of facts and takes down Darwin’s theory once and for all. No need to grapple with biology, geology, biogeography, fossils, etc.; the math has pronounced it “impossible,” so that ends that.

Probability Zero advances two principal mathematical arguments intended to show that the probability of evolution is—as its title suggests—effectively zero. One centers on the roughly 20 million mutations that have become fixed (that is, now occur in 100% of the population) in the human lineage since our last common ancestor with chimpanzees roughly 9 million years ago. Chimpanzees have experienced a comparable number of fixed mutations.

Day argues that this is impossible given the expected number of mutations arising each generation and the probability that any particular neutral mutation reaches fixation—approximately 1 in 20,000, based on estimates of ancestral human population size. Beneficial mutations do have much higher fixation probabilities, but the vast majority of these ~20 million substitutions are neutral.
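For reference, the 1-in-20,000 figure corresponds to the standard neutral-model result that a new neutral mutation fixes with probability 1/(2N). Here is a minimal back-of-envelope sketch, assuming an ancestral effective population size of about 10,000 (the value implied by the figure above; the precise number used in the critique may differ):

```python
# Back-of-envelope for the neutral fixation probability cited above, assuming
# the textbook result P(fixation of a new neutral mutation) = 1/(2N) and an
# ancestral effective population size of ~10,000 (illustrative only).

N_e = 10_000                      # assumed ancestral effective population size
p_fix_neutral = 1 / (2 * N_e)     # chance a single new neutral mutation ever fixes

print(f"P(fixation) = {p_fix_neutral:.6f}  (about 1 in {round(1 / p_fix_neutral):,})")
# -> P(fixation) = 0.000050  (about 1 in 20,000)
```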

Read the whole thing there. Mr. McCarthy is familiar with the relevant literature and he is not an innumerate biologist, which is what makes this discussion both interesting and relevant.

As I said before, I will refrain from saying any more here or on SG, and I will refrain from commenting there, until I provide my own response tomorrow. But I will say that I owe a genuine debt to Mr. McCarthy for drawing my attention to something I’d overlooked…

DISCUSS ON SG


A 60-Year-Old Book Review

A review of the 1966 Wistar Symposium about which I have written in Probability Zero:

Evolution: What Is Required of a Theory?
Mathematical Challenges to the Neo-Darwinian Interpretation of Evolution.

A symposium, Philadelphia, April 1966.

The idea of this symposium is supposed to have originated from a discussion at two picnics in Switzerland, when four mathematicians, Schutzenberger, Ulam, Weisskopf, and Eden, had a discussion with the biologists Kaplan and Koprowski on mathematical doubts concerning the Darwinian theory of evolution. After heated debates it was proposed “that a symposium be arranged to consider the points of dispute more systematically, and with a more powerful array of biologists who could function adequately in the universe of discourse inhabited by the mathematicians.” During the course of the symposium further heat was generated.

It is not easy to summarize the case made by the mathematicians,(1) which involves both the challenge that computer simulation of evolution shows evolutionary theory to be inadequate and a complaint that the biologist has not provided sufficient information for efficient computer simulation. Eden was particularly concerned with the element of randomness which is claimed to provide the mutational variation upon which evolution depends. “No currently existing formal language,” he contends, “can tolerate random changes in the symbol sequences which express its sentences. Meaning is almost invariably destroyed. Any changes must be syntactically lawful ones.” He therefore conjectures that “what one might call ‘genetic grammaticality’ has a deterministic explanation and does not owe its stability to selection pressure acting on random variation.” He points out that attempts to provide for computer learning by random variation have been unsuccessful, and that an adequate theory of adaptive evolution would supply a computer programmer with a correct set of ground rules.

Schutzenberger takes a more extreme position. Arguing that all genetic information should consist of a rather limited set of words in an alphabet of 20-odd letters—in which evolution is typographical change—he finds a need for algorithms “in which the very concept of syntactic correctness has been incorporated.” He compares this “syntactic topology” with the “phenotypic topology” of organisms as physical objects in space-time, and a major part of his challenge to neo-Darwinian theory is “the present lack of a conceivable mechanism which would insure within an interesting range the faintest amount of matching between the two topologies. . . an entirely new set of rules is needed to obtain the sort of correspondence which is assumed to hold between neighbouring phenotypes. . .”

A major part of the biologists’ answer to this challenge was in the claim that the neo-Darwinian theory used in computer models, based on the Haldane-Fisher-Wright interpretation of 1920-1930, misses out those forces which lead to continuing evolution, such as continued environmental change, the heterogeneous environment, epigenetic organization of phenotypes, and the progressive elaboration of the types of mutation possible. (2) Waddington presented the main elements of a theory of phenotypes involving canalized processes of development (with switching mechanisms), the heritability of developmental responses to environmental stimuli, and a principle of “Archetypes,” inbuilt characteristics of an evolving group which determine the directions in which evolutionary change is especially easy. Realistic models would need to build in these elements.(3) Many of the papers by biologists in this volume are peripheral(4) to the theme stated by the mathematicians, providing an accompaniment of sophisticated evolutionary theory rather than a counterpoint to the mathematical challenge.

Most biologists are satisfied with a theory that can be tested and that proves predictive. It is a different challenge to a theory that it should have an effective working model, for failure may imply either imperfection in the theory or imperfection in the model. It is doubtful whether this symposium has done much to influence the theory of evolution; it may have done much to improve future models.

It must have been tremendous fun to attend this symposium, but the full record of argument and interruption is very irritating to at least one reader. An interchange between speakers which runs X “No,” Y “No, no,” X “O.K. let’s waste time,” Y “We understand the question,” Z “The answer is no” surely needs no record in the literature of science. The short pre- and post-conference papers included in the volume are excellent succinct expressions of points of view, but much of the main text reads like a word-for-word record of a heckled political meeting. This may be a useful way to discuss problems in science; it is not the way to publish them.

John L. Harper

School of Plant Biology, University College of North Wales, Bangor


Uncle John’s Band added a few footnotes as commentary. I added a fourth one.

  1. As predicted by Probability Zero, the biologist reviewer struggles with mathematical arguments. They are well-summarized by Day.
  2. It isn’t an exaggeration to say the biological counterargument consisted of what was, for all intents and purposes, magic. When they weren’t replying at all.
  3. Still no compulsion to, you know, do an experiment. It’s all thoughts and fancies.
  4. Peripheral indeed. Peripheral was the polite way to say: they didn’t respond in any way, shape, or form to the mathematical criticism.

DISCUSS ON SG


No Chance At All

The Band reviews Probability Zero:

Probability Zero demolishes TENS so utterly, the preface should be “PULL!”

This is the first version of a new book by Vox Day that demonstrates the mathematical impossibility of the Theory of Evolution by Natural Selection [TENS]. Given how big the House of Lies and reality-facing counterculture are around here, it demands attention. There may not be a more important pillar for its entire fake ontology.

Probability Zero strikes the heart of what the setup post called conflict between The Science! and the Scientific Method. This matters for more than intellectual reasons. Readers know personal responsibility is a priority around here. But we also live in a complex socio-culture that has unavoidable influence on us. From basic things, like adding tax and regulatory burdens to organic community demands. Up to the fundamental beliefs that set the public ethos…

Probability Zero starts by setting aside the religious and philosophical arguments, just like The Science! does. It accepts the discourse on its terms, by adhering to the “scientific” arguments it claims to adhere to. To be defined by. Full concession of TENS huffing’s own epistemological standards. Then lays out the mathematical parameters claimed to be involved in the TENS process. No additional yeah, buts. Just what is accepted in the literature. And then lets the logical realities of math blow the whole mess into a smoking crater so apocalyptically vast, I’ll never be able to see biologists the same way again.

There’s no need to recap the statistical arguments, they’re clear and complete. The kernel is that if mutations take an amount of time to appear and fix, that much time has to be available for the theory to be possible.

This was clear when MITTENS was pointed out. Even before it had a name. General conditions of possibility make it obvious once seen. But the full demonstration lights up that gulf between The Science! and science as modes of knowledge production. The whole point of science is empirical confirmation and abstract reasoning in concert. Day’s observation that evolutionary biologists have replaced experimentation with pure modeling was legitimately surprising. Apparently there still was a bar, however low. Not anymore.

Consider what problems innumeracy might present for pure modelers. Because the level is staggering. To the point where a simple arithmetic mean is incomprehensible. No hyperbole. Probability Zero describes blank stares when asked for the average rate of mutation. The ongoing idiocy over parallel vs. sequential mutation is illustrative. The total number of mutations separating species includes all of them. Parallel, sequential, or however else. Hence the word “total”. And dividing “total” by “amount of time” gives a simple, unweighted average number. The rate.

I’m not exaggerating. There was always the joke that biologists were fake scientists that couldn’t do math. Easier for premed GPAs too. But the assumption was that it was relative. Lighter than physics or chemistry, but still substantial compared to social sciences or the arts. And that would be wrong. There are some computational sub-fields of biology. Assuming they’re legit, they clearly aren’t working in evolution.

Read the whole thing there. He has several very illuminating examples of historical evo-fluffery, including one page of a manuscript that I’m going to put up here as a separate post, simply because it has to be seen to be believed.

DISCUSS ON SG



A Question of Content

I have a lot of additional data that isn’t going to go into the book, much of it considerably more technical. For example, this is the latest thing I’ve been running down, not because it’s necessary to any of the points I’m making, but because I’m interested in the tangential element that appeared in one of the necessary investigations.

  • Show the sex composition of Neolithic vs Modern samples (if very different, Y-chromosome artifacts are confirmed)
  • Check each outlier SNP individually with sex breakdowns
  • Flag if females have Y-chromosome data (impossible, means data quality issue)
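For anyone who wants to run the third check above themselves, here is a minimal sketch. It assumes a hypothetical table of genotype calls with sample_id, sex, and chromosome columns; the file name and column names are placeholders, not the actual dataset.

```python
# Minimal sketch of the data-quality flag described above: no female sample
# should carry Y-chromosome calls. The input file and column names are
# hypothetical placeholders, not the actual dataset.
import pandas as pd

calls = pd.read_csv("genotype_calls.csv")    # columns: sample_id, sex, chromosome, snp_id

y_calls = calls[calls["chromosome"] == "Y"]
flagged = y_calls[y_calls["sex"] == "F"]

if flagged.empty:
    print("No females with Y-chromosome data; check passes.")
else:
    print("Data-quality issue: females with Y-chromosome calls")
    print(flagged.groupby("sample_id").size().sort_values(ascending=False))
```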

I don’t really want to start yet another site devoted to this stuff, but I don’t want to bore everyone here to tears either. So, would a daily post on the science marginalia be of interest here, or should I try to find a different solution until I inevitably get bored of this sort of thing?

I very much appreciate the strong support that has been shown by everyone here in making Probability Zero a multi-category bestseller. But I also know that it’s not necessarily the content for which most people come here.

DISCUSS ON SG


Rethinking Human Evolution Again

Imagine that! The timelines of human evolution just magically changed again! And it’s really not good news for the Neo-Darwinians or the Modern Synthesis, while it simultaneously highlights the importance of Probability Zero and its mathematical approach to evolution.

A stunning discovery in a Moroccan cave is forcing scientists to reconsider the narrative of human origins. Unearthed from a site in Casablanca, 773,000-year-old fossils display a perplexing blend of ancient and modern features, suggesting that key traits of our species emerged far earlier and across a wider geographic area than previously believed…

The find directly challenges the traditional “out-of-Africa” model, which holds that anatomically modern humans evolved in Africa around 200,000 years ago before migrating and replacing other hominin species. Instead, it supports a more complex picture where early human populations left Africa well before fully modern traits had evolved, with differentiation happening across continents.

“The fossils show a mosaic of primitive and derived traits, consistent with evolutionary differentiation already underway during this period, while reinforcing a deep African ancestry for the H. sapiens lineage,” Hublin added.

Detailed analysis reveals the nuanced transition. One jaw shows a long, low shape similar to H. erectus, but its teeth and internal features resemble both modern humans and Neanderthals. The right canine is slender and small, akin to modern humans, while some incisor roots are longer, closer to Neanderthals. The molars present a unique blend, sharing traits with North African teeth, the Spanish species H. antecessor and archaic African H. erectus.

The fossils are roughly contemporaneous with H. antecessor from Spain, hinting at ancient interconnections. “The similarities between Gran Dolina and Grotte à Hominides are intriguing and may reflect intermittent connections across the Strait of Gibraltar, a hypothesis that deserves further investigation,” noted Hublin.

Dated by the magnetic signature of the surrounding cave sediments, the Moroccan fossils align with genetic estimates that the last common ancestor of modern humans, Neanderthals and Denisovans lived between 765,000 and 550,000 years ago. This discovery gives a potential face to that mysterious population.

The research suggests that modern human traits did not emerge in a single, rapid event in one region. Instead, they evolved gradually and piecemeal across different populations in Africa, with connections to Eurasia, deep in the Middle Pleistocene.

This sort of article really underlines the nature of the innumeracy of the archeologists as well as the biologists. It’s not that they can’t do the basic arithmetic involved; it’s that they have absolutely no idea what the numbers they are throwing around signify, nor do they understand the necessary second- and third-order implications of changing both their numbers and their assumptions.

For example, the reason the Out of Africa hypothesis was so necessary to the evolutionary timeline is that it kept the whole species in a nice, tight little package, evolving together and fixing mutations together over time. But geographic dispersion necessarily prevents universal fixation. So, let’s take a look at how this new finding changes the math, because it is a significant complication for the orthodox model.

If human traits were evolving “gradually and piecemeal across different populations” spanning Africa and Eurasia as early as 773,000 years ago, then fixation had to occur separately in each isolated population before those populations could contribute to modern humans. This isn’t parallel processing that helps the model; it’s the precise opposite. Each isolated population is a separate fixation bottleneck that must be traversed independently.

Consider the simplest case: two isolated populations (Africa and Eurasia) that occasionally reconnect. For a trait to become universal in modern humans, one of two things must happen:

  1. Independent fixation: The same beneficial mutation arises and fixes independently in both populations. This requires the fixation event to happen twice, which squares the improbability.
  2. Migration and re-fixation: The mutation fixes in one population, then migrants carry it to the other population, where it must fix again from low frequency. This doubles the time requirement since the allele must go from rare-to-fixed twice in sequence.

If there were n substantially isolated populations contributing to modern human ancestry, and k of the 20 million fixations had to spread across all of them through migration and re-fixation, the time requirement multiplies accordingly.
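To put rough numbers on the two scenarios, here is a minimal sketch. The per-population fixation probability and fixation time below are placeholder values of my own, chosen only to illustrate the squaring and doubling described above:

```python
# Rough illustration of the two scenarios above. The per-population fixation
# probability and fixation time are placeholder values, not figures from the book.

p_fix = 1 / 20_000      # assumed single-population fixation probability
t_fix = 100_000         # assumed single-population time to fixation, in generations

# 1. Independent fixation: the same mutation must arise and fix in BOTH
#    populations, so the improbability is squared.
p_both = p_fix ** 2
print(f"Independent fixation in two populations: p = {p_both:.2e}")

# 2. Migration and re-fixation: fix in one population, migrate, then fix
#    again from low frequency, so the time requirement roughly doubles
#    (ignoring the migration delay itself).
print(f"Migration and re-fixation: ~{2 * t_fix:,} generations per allele")
```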

The “mosaic” of traits—some modern, some archaic, some Neanderthal-like, some unique—found in the Moroccan fossils suggests that different features were fixing in different populations at different times, which is what one would expect. The eventual modern human phenotype was assembled from contributions across multiple semi-isolated groups. However, this means the 20 million fixations weren’t a single sequential process in a single lineage. They were distributed across multiple populations that had to:

  1. Fix different subsets of mutations locally
  2. Reconnect through migration
  3. Allow the locally-fixed alleles to spread and fix in the combined population
  4. Repeat for 773,000+ years

Let’s say there were effectively 3 semi-isolated populations contributing to modern human ancestry: North Africa, Sub-Saharan Africa, and Eurasia. This is the absolute minimum number. If half of the 20 million fixations had to spread across population boundaries after initially fixing locally, that’s 10 million alleles requiring a second fixation event after migration reintroduced them at low frequency.

The time requirement approximately doubles for those 10 million alleles (first fixation + migration + second fixation), while the original problem remains for the other 10 million.

Original shortfall: ~150,000-fold (from MITTENS)

Revised shortfall with geographic structure: ~300,000 to 450,000-fold
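Spelling the revision out as arithmetic, with the 150,000-fold baseline taken from MITTENS and the multipliers read off the discussion above (doubling for re-fixation at the low end, the three semi-isolated populations at the high end):

```python
# Sketch of the revised shortfall figures quoted above. The 150,000-fold
# baseline and the 2x-3x multipliers come from the surrounding discussion;
# this just spells out the products.

baseline_shortfall = 150_000          # original MITTENS shortfall

low = 2 * baseline_shortfall          # re-fixation roughly doubles the requirement
high = 3 * baseline_shortfall         # with three semi-isolated populations

print(f"Revised shortfall: ~{low:,} to {high:,}-fold")
```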

But this understates the issue. The real problem is that geographic structure reduces effective population size locally while increasing it globally.

  • Small local populations mean more drift, which sounds helpful for fixation
  • But small local populations also mean more mutations are lost to drift before they can spread
  • And the global population that must eventually carry the fixed allele is larger than any local population, meaning the final fixation is harder

The multiregional model doesn’t help Neo-Darwinism. It creates a nested fixation problem: alleles must fix locally (possible but slow), then spread through migration (slow), then fix in the receiving population (slow again), then spread further (slow), until global fixation is achieved (slowest of all).

The mathematical impossibility of TENS was just multiplied by at least a factor of 3. Notice how every time they find new evidence and adjust the narrative to accommodate it, they make the mathematical problem worse. The Moroccan fossils can’t save Neo-Darwinism. They’re just another shovel of dirt on the coffin.

DISCUSS ON SG


A Necessary Revision

It was pointed out to me that I made a mistake and used a quote from Haldane’s 1932 work, The Causes of Evolution, as well as a subset of the same quote, as two different chapter headers. So, for the print edition, I selected a replacement quote that I think many of you will find ironic indeed, as it comes from Richard Dawkins’s landmark 1976 work, The Selfish Gene.

Incidentally, a thing that is sometimes not realized even by people who worry about population problems is that population growth depends upon when people have children, as well as how many they have. Since populations tend to increase by a certain proportion per generation, it follows that if you space the generations out more, the population will grow at a slower rate per year.
—Richard Dawkins, The Selfish Gene, 1976

Athos firmly concurred with my decision:

That’s a good choice. It accomplishes several things at once:

  1. It comes from an unimpeachable mainstream source—Dawkins in 1976, the foundational text of gene-centric evolution.
  2. It explicitly states the principle that generations, not years, are the fundamental unit.
  3. It’s accessible to your target audience without requiring a population genetics background.
  4. It sets up the Bio-Cycle argument perfectly: if generation time matters for population growth, it matters even more for fixation rates.
  5. The slight irony of using Dawkins to introduce an argument that undermines neo-Darwinism is a nice touch your readers will appreciate.

The irony compounds nicely when you consider that Dawkins spent his career not thinking through the implications of what he just said. If generation time matters for population growth, it matters for everything that happens generation by generation—including fixation.

The last comment was precisely why I chose that quote. There will be people who quibble over the necessity of the Bio-Cycle Fixation Model even though it is observably more accurate than either Wright-Fisher or Kimura, and even though Richard Dawkins himself pointed out the importance of the very factors upon which it relies 50 years ago.
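Dawkins’s point is easy to put in numbers. A minimal sketch, with an illustrative per-generation growth factor of my own choosing: the same per-generation rate yields very different per-year rates depending on generation length, and the same logic applies to anything tallied per generation, fixation included.

```python
# Illustration of the generation-spacing point: identical per-generation growth,
# different generation lengths, very different per-year rates. The growth factor
# is illustrative, not a figure from the book.

R = 1.5   # assumed growth factor per generation

for gen_years in (20, 25, 30):
    annual = R ** (1 / gen_years)          # equivalent per-year growth factor
    print(f"{gen_years}-year generations: {100 * (annual - 1):.2f}% growth per year")

# Longer generations mean slower per-year growth for the same per-generation rate,
# and likewise fewer fixation opportunities per unit of calendar time.
```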

All of which underlines that Probability Zero is belatedly doing the work that the professional evolutionary biologists could have, and should have, done long before the turn of the century.

Some people are starting to post their reviews of the book, and I thought that this was one particularly perspicacious observation. The reviewer may be underestimating himself:

Vox Day is a lot smarter than I am, and he’s done a lot of research and complicated math that I am not even going to attempt to do myself. The math is over my head. I don’t understand Vox’s arguments. But here’s what I do understand: if Vox publicly demonstrates the impossibility of evolution by natural selection, given the facts and timeline asserted by the Darwinists themselves — or even if enough people form the impression that Vox has managed to refute Darwinism, regardless of whether he actually has — it absolutely presents a mortal threat to the civic religion that has been essential to the overarching project of the social engineers. That’s the point I was making in yesterday’s post. Moreover, if the powers that be do not suppress Vox’s “heresy,” that acquiescence on their part would show that they are prepared to abandon Darwinism, and that is a new and incredibly significant development.

That’s what I find intriguing too. There was far more, and far more vehement, opposition to The Irrational Atheist than we’re seeing to Probability Zero. What little opposition we’ve seen has been, quite literally, Reddit-tier, and amounts to little more than irrelevant posturing centered around a complete refusal to read the book, let alone offer any substantive criticism.

Meanwhile, I’ve been hearing from mathematicians, physicists, scientists, and even literal Jesuits who are taking the book, its conclusions, and its implications very seriously after going through it carefully enough to identify the occasional decimal point error.

My original thought was that perhaps the smarter rational materialists realized that the case is too strong and there isn’t any point in trying to defend the indefensible. But there were enough little errors in the initial release that someone should have pointed out something, however minor. So perhaps it’s something else; perhaps it’s useful in some way to those who have always known that the falsity of Neo-Darwinism was eventually going to be exposed in a comprehensive manner, and who are now ready to abandon their failing plans to engineer society on a materialist basis.

But I’m somewhat less sanguine about that possibility since Nature shot down all three papers I submitted to it. Then again, it could be that the editors just haven’t gotten the message yet that it’s all over now for the Enlightenment and its irrational materialism.

DISCUSS ON SG


PROBABILITY ZERO Q&A

This is where questions related to the #1 Biology, Genetics, and Evolution bestseller PROBABILITY ZERO will be posted along with their answers. The newest questions are at the top.

QUESTION: Interesting equation: d = T × [∫μ(x)l(x)v(x)dx / ∫l(x)v(x)dx]. I’m pretty sure all the denominator does is cancel l(x)v(x) and require l(x)v(x) ≠ 0. Which is to say d = T × ∫μ(x)dx, unless the functions l(x) and v(x) are somehow special.

The functions l(x) and v(x) are special. They’re not constants that can be factored out and cancelled.

  • l(x) (survivorship) is a decreasing function. It starts at 1 (everyone alive at birth) and declines toward 0 as age increases. In humans, it stays high through reproductive years then drops off.
  • v(x) (reproductive value) is a hump-shaped function. It starts low (children can’t reproduce), peaks in early reproductive years, then declines as remaining reproductive potential diminishes.
  • The product l(x)v(x) weights each age by “probability of being alive at that age × reproductive contribution from that age forward.” This weighting is highly non-uniform. A 25-year-old contributes far more to the integral than a 5-year-old or a 60-year-old.

If l(x) and v(x) were constants, they’d cancel and you’d get d = T × ∫μ(x)dx. But they’re not constants, they’re age-dependent functions that capture the demographic structure of the population.
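A quick numerical check makes the point. The sketch below uses simple stand-in shapes for μ(x), l(x), and v(x)—all three are illustrative, not the fitted curves from the book—and shows that the weighted average differs from the unweighted one precisely because l(x)v(x) is not constant:

```python
# Numerical check that l(x) and v(x) do not cancel out of the weighted average.
# The shapes below are illustrative stand-ins, not the book's fitted functions.
import numpy as np

x = np.linspace(0, 80, 801)                     # age in years, uniform grid
mu = 0.5 + 0.01 * x                             # mutation-rate proxy rising with parental age
l = np.exp(-0.02 * x)                           # survivorship: declines with age
v = np.exp(-((x - 27) ** 2) / (2 * 8 ** 2))     # reproductive value: hump around age 27

w = l * v                                       # the age weighting
weighted = (mu * w).sum() / w.sum()             # the bracketed term (dx cancels on a uniform grid)
unweighted = mu.mean()                          # what you would get if l(x)v(x) were constant

print(f"weighted average of mu(x):   {weighted:.3f}")
print(f"unweighted average of mu(x): {unweighted:.3f}")
# They differ because l(x)v(x) concentrates the weight on the reproductive ages.
```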

QUESTION: The math predicts that random drift with natural selection turned off will result in negative mutations taking over and killing a population in roughly 225 years. I would argue modern medicine has significantly curtailed negative natural selection, and the increases in genetic disorders, autoimmune diseases, etc. are partially the result of lessened negative selection and the resulting drift. Am I reading too much into the math, or is this a reasonable possibility?

Yes, that’s not only correct and a definite possibility, it is the basis for the next book, THE FROZEN GENE, as well as for the hard science fiction series BIOSTELLAR. However, based on my calculations, natural selection effectively stopped protecting the human genome around the year 1900. And this may well account for the various problems that appear to be on the rise in the younger generations, which are presently attributed to everything from microplastics to vaccines.

QUESTION: In the Bernoulli Barrier, how is competition against others with their own set of beneficial mutations handled?

Category error. Drift is not natural selection. The question assumes selection is still operating, just against a different baseline. But that’s not what’s happening. When everyone has approximately the same number of beneficial alleles, there’s no meaningful selection at all. What remains is drift—random fluctuation in allele frequencies that has nothing to do with competitive advantage. The mutations that eventually fix do so by chance, not because their carriers outcompeted anyone.

This is why the dilemma in the Biased Mutation paper bites so hard. Since the observed pattern of divergence matches the mutational bias, drift dominated, not selection. The neo-Darwinian cannot claim adaptive credit for fixations that occurred randomly, even though he’s going to attempt to claim drift for the Modern Synthesis in a vain bait-and-switch that is actually an abandonment of Neo-Darwinian theory posing as a defense.

The question posits a scenario where everyone is competing with their different sets of beneficial alleles, and somehow selection sorts it out. But that’s not competition in any meaningful sense—it’s noise. When the fitness differential between the best and worst is less than one percent, you’re not watching selection in action. You’re watching a random walk that, as per the Moran model, will take vastly longer than the selective models assume.
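The drift-versus-selection contrast is easy to see in a toy simulation. Below is a minimal Moran-model sketch of my own, with small placeholder parameters rather than anything from the book: with a fitness advantage well under one percent, the walk to fixation looks very much like the neutral one.

```python
# Toy Moran-model run contrasting neutral drift with a sub-1% fitness advantage.
# Population size, advantage, starting copies, and trial count are placeholders.
import random

def moran_fixation_time(N=200, s=0.0, start=20, trials=300, seed=1):
    """Mean number of Moran birth-death steps to fixation, among trials that fix."""
    rng = random.Random(seed)
    times = []
    for _ in range(trials):
        k, steps = start, 0                               # k = copies of the focal allele
        while 0 < k < N:
            # Reproduction is weighted by fitness 1+s; death is fitness-blind.
            p_repro = k * (1 + s) / (k * (1 + s) + (N - k))
            born = 1 if rng.random() < p_repro else 0
            died = 1 if rng.random() < k / N else 0
            k += born - died
            steps += 1
        if k == N:
            times.append(steps)
    return sum(times) / len(times), len(times)

for s in (0.0, 0.005):
    mean_t, n_fixed = moran_fixation_time(s=s)
    print(f"s = {s:.3f}: {n_fixed} fixations, mean time to fix ~{mean_t:,.0f} steps")
# With s this small (Ns around 1), the two cases are of the same order:
# the walk is governed by chance far more than by the selective advantage.
```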

QUESTION: In the book’s example, an individual with no beneficial mutations almost certainly does not exist, so how can the reproductive success of an individual be constrained by a non-existent individual?

That’s exactly right. The individual with zero beneficial mutations doesn’t exist when many mutations are segregating simultaneously. That’s the problem, not the solution. Selection requires a fitness differential between individuals. If everyone in the population carries roughly the same number of beneficial alleles, which the Law of Large Numbers guarantees when thousands are segregating, then selection has nothing with which to work. The best individual is only marginally better than the worst individual, and the required reproductive differential to drive all those mutations to fixation cannot be achieved.

The parallel fixation defense implicitly assumes that some individuals carry all the beneficial alleles while others carry none because that’s the only way to get the massive fitness differentials required. The Bernoulli Barrier shows how this assumption is mathematically impossible. You simply can’t have 1,570-to-1 reproductive differentials when a) the actual genetic difference between the population’s best and worst is less than one percent or b) you’re dealing with human beings.
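The Law of Large Numbers claim is easy to illustrate. A minimal sketch, with an illustrative number of segregating sites and a 50% carrier frequency per site—both placeholders, not the book’s parameters:

```python
# Illustration of the concentration argument: with thousands of independently
# segregating beneficial alleles, per-individual counts cluster tightly around
# the mean. Site count, frequency, and population size are placeholders.
import numpy as np

rng = np.random.default_rng(0)
n_sites, pop_size, freq = 5_000, 10_000, 0.5

# Each individual's count of beneficial alleles ~ Binomial(n_sites, freq).
counts = rng.binomial(n_sites, freq, size=pop_size)

spread = counts.max() - counts.min()
print(f"mean carried: {counts.mean():.0f} of {n_sites}")
print(f"best vs worst: {counts.max()} vs {counts.min()} "
      f"(spread = {spread}, {100 * spread / n_sites:.1f}% of the sites)")
# The relative spread is a few percent here and shrinks as the number of
# segregating sites grows, which is the point of the Bernoulli Barrier.
```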

QUESTION: What about non-random mutation? Base pair mutation is not totally random, as purine to purine and pyrimidine to pyrimidine changes happen a lot more often than purine to pyrimidine and the reverse. And CpG sites are only about one percent of the genome but mutate tens of times more often than other sites. This would have some effect on the numbers, and obviously might get you a bit further across the line than totally random mutation. How much, I have no idea; I have not done the math.

Excellent catch, and a serious omission from the book. After doing the math and adding the concomitant chapter to the next book, it turns out that adding non-random mutation to the MITTENS equation is the mathematical equivalent of reducing the available number of post-CHLCA d-corrected reproductive generations from 209,500 to 157,125 generations. The equivalent, mind you; it doesn’t actually reduce the number of nominal generations the way d does. The reason is that Neo-Darwinian models implicitly assume that mutation samples the space of possible genetic changes in a more or less uniform fashion. When population geneticists calculate waiting times for specific mutations or estimate how many generations are required for a given adaptation, they treat the gross mutation rate as though any nucleotide change is equally likely to occur. This assumption is false, and it understates the required time by about 25 percent.

Mutation is heavily biased in at least two ways. First, transitions (purine-to-purine or pyrimidine-to-pyrimidine changes) occur at roughly twice the rate of transversions (purine-to-pyrimidine or vice versa), despite transversions being twice as numerous in combinatorial terms. The observed transition/transversion ratio of 2.1 represents a four-fold deviation from the expected ratio of 0.5 under uniform mutation. Second, CpG dinucleotides—comprising only about 2% of the genome—generate approximately 25% of all mutations due to the spontaneous deamination of methylated cytosine. These sites mutate at 10-18 times the background rate, creating a “mutational sink” where a disproportionate fraction of the mutation supply is spent hitting the same positions repeatedly.

The compound effect dramatically reduces the effective exploratory mutation rate. Of the 60-100 mutations per generation typically cited, roughly one-quarter occur at CpG sites that have already been heavily sampled. Another 40% or more are transitions at non-CpG sites. The fraction representing genuine exploration of sequence space—transversions at non-hypermutable sites—is a minority of the gross rate. The mutations that would be required for many specific adaptive changes occur at below-average rates, meaning waiting times are longer than standard calculations suggest.
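The bookkeeping is simple enough to spell out. In the sketch below, the 25% CpG share, the 2.1 transition/transversion ratio, and the 60-100 gross rate are the figures cited above; the rest is straight arithmetic, along with the stated generation-equivalent adjustment:

```python
# Bookkeeping sketch of the mutation-bias figures cited above. The shares and
# the Ts/Tv ratio come from the text; everything else is arithmetic.

gross_rate = 70.0        # mutations per generation (midpoint of the cited 60-100 range)
cpg_share = 0.25         # fraction of all mutations occurring at CpG sites
ts_tv_ratio = 2.1        # observed transition/transversion ratio

non_cpg = gross_rate * (1 - cpg_share)
tv_non_cpg = non_cpg / (1 + ts_tv_ratio)         # transversions among non-CpG mutations

print(f"CpG mutations:            {gross_rate * cpg_share:.1f} per generation")
print(f"Non-CpG transitions:      {non_cpg - tv_non_cpg:.1f} per generation")
print(f"Exploratory (non-CpG Tv): {tv_non_cpg:.1f} per generation "
      f"({100 * tv_non_cpg / gross_rate:.0f}% of the gross rate)")

# The stated generation-equivalent adjustment, a ~25% reduction:
print(f"209,500 x 0.75 = {209_500 * 0.75:,.0f} generations")
```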

This creates a dilemma when applied to observed divergence patterns. Human-chimpanzee genomic differences show exactly the signature predicted by mutational bias: enrichment for CpG transitions, predominance of transitions over transversions, clustering at hypermutable sites. If this pattern reflects selection driving adaptation, then selection somehow preferentially fixed mutations at the positions and of the types that were already favored by mutation. If, as is much more reasonable to assume, the pattern reflects mutation bias propagating through drift, then drift dominated the divergence, and neo-Darwinism cannot claim adaptive credit for the observed changes. Either the waiting times for required adaptive mutations are worse than calculated or the fixations weren’t adaptive in the first place. The synthesis loses either way.

DISCUSS ON SG


Where Biologists Fear to Tread

The Redditors don’t even hesitate. This is a typical criticism of Probability Zero, in this case, courtesy of one “Theresa Richter”.

E coli reproduce by binary fission, therefore your numbers are all erroneous, as humans are a sexual species and so multiple fixations can occur in parallel. Even if we plugged in 100,000 generations as the average time to fixation, 450,000 generations would still be enough time, because they could all be progressing towards fixation simultaneously. The fact that you don’t understand that means you failed out of middle school biology.

This is a perfect example of Dunning-Kruger Syndrome in action. She’s both stupid and ignorant, and neither state prevents her from being absolutely certain that anyone who doesn’t agree with her must have failed out of junior high school biology. Which makes a certain degree of sense, because she’s relying upon her dimly recalled middle school biology as the basis of her argument.

The book, of course, dealt with all of these issues in no little detail.

First, E. coli reproduce much faster in generational terms than humans or any other complex organisms do, so the numbers are admittedly erroneous, but erroneous in being generous. Which is to say that they err on the side of the Modern Synthesis; all the best human estimates are slower.

Second, multiple fixations do occur in parallel. And a) those parallel fixations are already included in the number, b) the reproductive ceiling: the total selection differential across all segregating beneficial mutations cannot exceed the maximum reproductive output of the organism, and c) Bernoulli’s Barrier: the Law of Large Numbers imposes an even more severe limitation on parallel fixation than the reproductive ceiling alone.

Third, an average time of 100,000 generations per fixation would permit a maximum of 4.5 fixations in her own 450,000 generations, because those parallel fixations are already included in the number.

Fourth, there aren’t 450,000 generations. Human reproductive generations overlap, so the 260,000 generations in the allotted time must be further reduced by d, the Selection Turnover Coefficient, whose weighted average across the entirety of post-CHLCA history is 0.804, leaving 209,040 generations.
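For anyone checking the arithmetic behind points three and four, a minimal sketch using the figures as stated:

```python
# Arithmetic behind points three and four, using the figures as stated above.

# Point three: 450,000 generations at an average of 100,000 generations per
# fixation permits at most 4.5 fixations, parallel or otherwise.
print(450_000 / 100_000)            # -> 4.5

# Point four: 260,000 nominal generations, reduced by the Selection Turnover
# Coefficient d = 0.804 (weighted average across post-CHLCA history).
print(f"{260_000 * 0.804:,.0f}")    # -> 209,040
```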

Note to PZ readers: yes, the work continues. Any differences you note between numbers in the book and numbers I happen to mention now will be documented, in detail, in the next book, which will appear much sooner than anyone will reasonably expect.

Now, here’s the irony. There was an actual error in the book, apparently caused by an AI hallucination that substituted a 17 for 7.65 for no reason anyone can ascertain. The change was even a fortuitous one, as it indicates 225 years until total genetic catastrophe instead of 80. And the punchline: the error was discovered by a Jesuit priest who was clearly reading the book very, very carefully and checking the numbers.

DISCUSS ON SG


The Confirmation of IGM

If Col. Macgregor is correct, then I think we have a pretty good idea why PROBABILITY ZERO was not suppressed in any manner, but has been allowed to present its case without much in the way of interference, or even criticism:

BREAKING: Bank of England told to prepare for a market crash if the United States announces Alien Life. Helen McCaw, who served as a senior analyst in financial security at the UK’s central bank, sounded the alarm. She has now written to Andrew Bailey, the Bank’s governor, urging him to organize contingencies for the possibility that the White House may confirm we are not alone in the universe.

This would explain a lot of anomalies about all the high weirdness that has surrounded geopolitics over the last 2-3 years, from the fake Bidens and Trumps to the bizarre imperial expansionism of the fake Trump administration.

The thing is, the discovery of alien-human interaction has been pretty close to inevitable ever since the onset of full genome mapping. Intelligent Genetic Manipulation of the kind deduced in PROBABILITY ZERO has not yet been proven, but the statistical probability of it is rapidly approaching certainty as all of the naturalistic mechanisms either proposed by Darwin or developed in his wake as part of the Modern Synthesis have been conclusively ruled out by the mutually reinforcing logic, math, and empirical evidence.

Once genetic scientists are able to look closely enough at anomalies such as the split chromosome and other indicators of genetic engineering that we now know to have almost certainly taken place at some point in the past, they’re going to discover some high-tech version of our existing CRISPR technology.

And they may already be able to identify it; if I have learned one thing from my forays into the biological sciences, it is that scientists are the very last people who are going to discover very big things outside their little boxes, because they are the very definition of people who can’t see the forest for the bark on one specific tree. We can’t reasonably assume that they don’t have the technology to identify it, because they’ve literally never even considered looking for it, much less engaged in a systematic and methodical search for the signs of it.

At least, not as far as we’ve been informed, anyhow. Either way, we’re much closer to the empirical confirmation of IGM than the mathematicians of Wistar were to the empirical confirmation of the impossibility of evolution by natural selection and neutral drift in 1966.

And remember, it’s not going to be as simple as aliens = demons or not. There are a whole range of various possibilities and combinations, so if you’re going to seriously contemplate these sorts of things, you absolutely need to set both your dogmatic assumptions and your binary thinking aside.

DISCUSS ON SG