US-Iran Talks Cancelled

It’s being reported that the talks were cancelled because it was very, very important to the USA to expand the talks to include ballistic missiles. That lends a degree of credence to the belief that the last exchange of ballistic missiles between Israel and Iran hurt the former considerably more than the media indicated.

I understand that Israel is considerably more eager for war than the media tends to indicate, which suggests that there will be another round soon whether the USA elects to participate or not.

DISCUSS ON SG


Response to Dennis McCarthy, Round 2

The second round presented by Dennis McCarthy was very interesting for what it revealed concerning how much, and how little, about evolution and its various mechanisms is understood by even its more intelligent defenders. I will not bother responding to the first part of the post, as it is a very good example of what I consider to be a Wistarian response: an attempt to address a mathematical challenge with an orthogonal appeal to tangential logic. The distinction between the various levels of speciation, the similarity of birds and dinosaurs, and the various familial clades are no more relevant to any argument I have presented than who wins the Super Bowl this coming weekend.

Now, here’s where the storytelling gets substantive.

“Ahh,” says the evolution-skeptic, “I don’t care about fossils or biogeography or stories about salamanders or moths. Vox Day has proved mathematically that it can’t happen, so I don’t even have to think about any of this.”

First, Vox Day’s central argument in Probability Zero concerns neutral mutation fixation rates, which says nothing about natural selection and is largely orthogonal to most of what we have been discussing. Even if Motoo Kimura’s neutral theory—and the equations Vox Day disputes—were entirely mistaken, that would not overturn Darwinian evolution, nor would it undermine any of the empirical facts or conclusions considered so far. Vox Day himself effectively concedes as much in his response:

And in the interest of perfect clarity, note this: Dennis McCarthy’s critique of Probability Zero is not, in any way, a defense of evolution by natural selection. Nor can it be cited as a defense of speciation or Darwinism at all, because neutral theory has about as much to do with Darwin as the Book of Genesis

Actually, my full post response (like this one) did indeed defend evolution by natural selection. And the only reason I veered from the subject of Darwinism at all was to address Vox Day’s main mathematical arguments—and it is Day’s main arguments that are not relevant to Darwinism or evolution by natural selection. And this is true despite what Day frequently implies, what his readers persistently infer, and what the subtitle of Probability Zero plainly states.

Unfortunately, it’s at this point that it becomes clear that McCarthy either hasn’t actually read Probability Zero or somehow managed to miss the central point of the book despite it being right in the subtitle. He has confused my secondary mathematical argument, which addressed Kimura’s neutral theory, for the primary one. Neutral theory is a) not part of the Modern Synthesis, b) does not involve natural selection, c) is algebraically doomed to be mathematically incorrect, and d) is the non-Darwinian ground to which professional biologists have retreated due to their recognition that natural selection is incapable of accounting for the observed genetic divergence between any two distinct, but related, species.

It’s a little hard to understand how McCarthy manages to focus on the 1,600-generations-per-fixation rate measured in the lab with E. coli bacteria while somehow missing the entire mathematical argument that started this whole discussion back in 2019 with MITTENS, which is entirely and only concerned with natural selection. This is the core equation that is integral to MITTENS, which McCarthy still has not addressed:

F_max = (t_div × d) / (g_len × G_f)

  • F_max = maximum achievable fixations
  • t_div = divergence time (in years)
  • g_len = generation length (in years)
  • d = Selective Turnover Coefficient
  • G_f = generations per fixation
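Given those definitions, the formula can be sketched in a few lines of Python. The parameter values below are illustrative assumptions only: the divergence time and generation length echo figures McCarthy himself uses later in this discussion (roughly nine million years, 450,000 generations), G_f is the 1,600 figure, and d is set to 1.0 as a neutral placeholder rather than the actual Selective Turnover Coefficient from the paper.

```python
# Sketch of the MITTENS formula: F_max = (t_div * d) / (g_len * G_f).
# All inputs here are placeholder assumptions for illustration, not the
# paper's actual parameter values.

def mittens_f_max(t_div: float, d: float, g_len: float, g_f: float) -> float:
    """Maximum achievable fixations under the MITTENS constraint."""
    return (t_div * d) / (g_len * g_f)

# ~9 million years of divergence, 20-year generations, G_f = 1,600,
# and a placeholder d of 1.0:
f_max = mittens_f_max(t_div=9_000_000, d=1.0, g_len=20, g_f=1_600)
print(f"F_max = {f_max:,.2f} fixations")  # ~281 with these placeholder inputs
```

Even with a generous placeholder coefficient, the achievable fixation count sits in the hundreds, which is the point of contrast with the millions of fixed differences discussed below.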

As you can easily verify for yourself, the MITTENS formula not only disproves any possibility of natural selection accounting for the observed post-CHLCA human-chimpanzee divergence, it has also been confirmed to rule out natural selection as the explanation for every single divergence between two species for which we have the necessary data to check, including Human–Chimp, Human–Gorilla, Human–Orangutan, Bonobo–Chimp, W–E Gorilla, S–B Orangutan, Chimp subspecies, D. mel–D. sim, Mus–M. spretus, Chicken–Turkey, Horse–Donkey, Afr–Asian Elephant, Sav–Forest Elephant, L. Victoria cichlids, L. Malawi cichlids, and Stickleback M–F.

All of the detail concerning this can be found in “The Universal Failure of Fixation: MITTENS Applied Across the Tree of Life”, a paper I have posted publicly in an open repository. Run the numbers for yourself if you are skeptical. And, by the way, you should note that the argument does not rely upon G_f, the 1,600-generations-per-fixation number, because running the calculation the opposite way shows that the selection coefficients required to account for the fixations are not possible either, as demonstrated in “Minimum Selection Coefficients Required for Speciation: A Cross-Taxa Quantitative Analysis”, which is not in the repository but appears in Chapter Two of The Frozen Gene. Even fruit flies would require a selection coefficient of 245% to account for the observed genetic gap, while observed selection coefficients fall between 0.1% and 1%. No matter which way you look at the problem, it’s clear that evolution by natural selection is totally impossible.

However, even though McCarthy hasn’t yet addressed my actual case against natural selection, he did touch usefully on it when he appealed to a study on adaptive variation.

Day starts discussing beneficial mutations, and in regional mammalian populations, beneficial mutations can, under severe selective pressures, sweep to fixation across the entire population in only tens or hundreds of generations. Numerous lab and field studies confirm this (e.g., Steiner, C. C., Weber, J. N., & Hoekstra, H. E. (2007). Adaptive variation in beach mice produced by two interacting pigmentation genes. PLoS Biology 5(9): e219.)

As before, McCarthy’s critique proved to be very beneficial to the larger case against natural selection I’ve been making, as a close look at the study revealed that what I had originally intended as nothing more than a disproof also has application as a predictive model. The study allows us to demonstrate how MITTENS is capable of distinguishing between an adaptive mutational fixation that is possible within the observed time limits and the large quantity of speciating mutational fixations that are not. Remember, MITTENS never claimed that no fixations are possible over time, only that the possible number is much, much smaller than the observed speciating divergences require.

In fact, a review of the beach mice study led to another paper, this one entitled “The Scope of Natural Selection: MITTENS-Validated Case Studies in Local Adaptation” which demonstrates that MITTENS can be usefully applied to calculating the quantity of fixations that are possible in a given period of time, thereby converting it into an effective predictive model.

The MITTENS framework (Mathematical Impossibility of The Theory of Evolution by Natural Selection) establishes quantitative constraints on achievable fixations based on generation time, the selective turnover coefficient (d), and empirically observed fixation rates. While MITTENS demonstrates a 158,000-fold shortfall for macro-evolutionary divergence (e.g., human-chimpanzee), critics might argue that local adaptation represents an intermediate test case. Here we examine four well-documented examples of local adaptation: beach mouse pigmentation, stickleback armor reduction, peppered moth melanism, and warfarin resistance in rats. In every case, the required genetic changes involve 1–3 fixations—precisely the scale MITTENS predicts natural selection can accomplish. Using taxon-appropriate parameters and, where available, empirically measured selection coefficients, we show that all four cases pass MITTENS constraints. The peppered moth case is particularly instructive: MITTENS predicts 0.66 achievable fixations, implying the allele should reach high frequency but not fix—exactly what was observed before selection reversed. These results confirm that natural selection operates effectively within its proper scope while remaining incapable of the million-fold extrapolation required for macro-divergence.

Now let’s look at the next element of his critique:

So in the first generation after the chimpanzee/human split, there were 1,000,000 new mutations—1/20,000 of which may be expected to reach fixation—or 50 fixed mutations per generation. But we should not expect these 50 mutations to fix immediately, but after 40,000 generations. 50 more mutations from the 2nd generation should fix around the 40,001st generation. And so on.

Since hominids have had 450,000 generations, all mutations would have had time to fix except for those mutations occurring in the last 40,000 generations. What is more, the human race has been widely dispersed for tens of thousands of years, with some populations living in constant isolation, necessarily preventing them from sharing mutations with the rest of the world.

So let’s subtract out the last 50,000 generations, which leaves us with 400,000 generations. 400,000 generations x 50 fixed mutations per generation = 20 million fixed mutations. We can also calculate this another way: 400,000 generations x 1,000,000 new mutations per generation = 400 billion new mutations altogether. Each new mutation has a probability of 1/20,000 of becoming fixed, so: 400 billion mutations x 1/20,000 = 20 million fixed mutations.

And that equals the 20 million fixed mutations that have been observed. Change the assumptions, and the estimate moves. But under Vox’s own assumptions, the result is the opposite of “probability zero”: it’s what you’d predict.

What McCarthy has focused on here is neutral theory, and specifically the Kimura fixation model, which inevitably produces incorrect results because its derivation is algebraically flawed. The correct equation is the one presented in The Frozen Gene, which, to be fair, McCarthy has not read. Nor would he have any reason whatsoever to suspect that the Kimura model he’s using is incorrectly derived and that its results are hopelessly wrong. Just to be clear, the correct equation is this one:

k = 2Nμ × 1/(2Nₑ) = μ(N/Nₑ)
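A minimal numerical sketch of the difference this makes, assuming illustrative inputs: μ = 1.5 × 10⁻⁸ (the per-site mutation rate used later in this post) and a census-to-effective ratio of 20, which sits within the 19- to 46-fold range cited for mammals. These are assumptions for demonstration, not measured values for any particular species.

```python
# Kimura's result (k = mu) versus the corrected substitution rate
# k = 2*N*mu * 1/(2*Ne) = mu * (N/Ne). Inputs are illustrative.

def k_kimura(mu: float) -> float:
    """Kimura's claim: the substitution rate equals the mutation rate."""
    return mu

def k_corrected(mu: float, n_census: float, n_e: float) -> float:
    """Mutation supply 2*N*mu times the fixation probability 1/(2*Ne)."""
    return (2 * n_census * mu) * (1 / (2 * n_e))

mu = 1.5e-8
print(k_kimura(mu))                                    # just mu
print(k_corrected(mu, n_census=200_000, n_e=10_000))   # 20x larger than mu
```

When census N equals Nₑ the two agree, which is exactly the special case the cancellation silently assumed; for any N/Nₑ ratio above 1, the rates diverge.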

But we’ll set that aside; given that the entire field of population genetics missed the error for 58 years, we certainly cannot expect McCarthy to have noticed it. But it might behoove my critics to be aware that this is the sort of thing I am capable of doing before attempting to claim that I didn’t understand something about the standard theories. So what we’ll focus on instead is the fact that even if we grant the validity of the neutral theory model, McCarthy’s application of it is completely incorrect on multiple levels.

McCarthy’s version of the model requires that at any given moment, millions of mutations are simultaneously “in progress” toward fixation—each at a different frequency stage, all drifting upward in parallel. That’s as per Kimura’s theory. However, this runs right into the Bernoulli Barrier. For 1,000,000 mutations “in the pipeline” simultaneously, the probability that they all successfully navigate drift to fixation—rather than being lost—is astronomically small. Each neutral mutation has only a 1/(2Nₑ) chance of fixing rather than being lost. The vast majority of mutations that enter the population are lost, not fixed. McCarthy’s model treats the expected value as though it were guaranteed. “50 mutations per generation should fix, therefore 50 mutations per generation do fix.”

But that’s not how probability works. The expected value is the average over infinite trials; it is not a guaranteed throughput. And in the real world, you don’t get infinite trials; you get precisely one run.
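The distinction between an expectation and a single run can be illustrated with a quick simulation, using the figures already on the table: 1,000,000 new mutations per generation, each with a 1/20,000 chance of eventually fixing. The expectation is 50, but no individual run is obligated to deliver it, and the overwhelming majority of mutations in any cohort are simply lost.

```python
import random

# Illustration: expected value vs. what one run actually delivers.
# Each of 1,000,000 mutations fixes with probability 1/(2*Ne) = 1/20,000,
# so the *expectation* is 50 fixations per cohort.

random.seed(0)  # fixed seed so the sketch is reproducible

def fixations_one_run(n_mutations: int = 1_000_000,
                      p_fix: float = 1 / 20_000) -> int:
    """Count how many mutations in one cohort happen to fix in one run."""
    return sum(1 for _ in range(n_mutations) if random.random() < p_fix)

runs = [fixations_one_run() for _ in range(5)]
print(runs)  # five different counts scattered around the expectation of 50
```

Note what the sketch shows and what it doesn’t: the counts vary from run to run rather than being a guaranteed 50, and in every run more than 99.99% of the cohort is lost.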

Unlike many evolutionary biologists, McCarthy does correctly grasp that geographic dispersion renders fixation impossible.

Evolutionary theory predicts that no mutation, whether neutral or beneficial, that has arisen in the last 50,000 years or so can reach and spread throughout all populations on the planet. The reason is that over that time—and especially over the last 10,000 years—human populations have become fragmented and geographically isolated in places such as New Guinea, Australia, Tasmania, the Andaman Islands, the Pacific islands, and the Americas. Most people of these regions have had effectively zero genetic contact with the rest of the world until very recently, if at all.

Under such conditions, it has been impossible for any single mutation—whether neutral or beneficial—to reach fixation across the entire human species. The genes that helped some Europeans survive the Black Death in the 1300s, for example, could never have also raced across the Americas (neither group even knew each other existed at this time), let alone reach the Hewa people of New Guinea, who would not see a white person until 1975. Instead, the roughly 20 million fixed genetic differences between humans and chimpanzees accumulated during the millions of years when ancestral hominid populations were relatively small, geographically concentrated, and tightly interconnected by gene flow.

That’s a logical conclusion. But here’s the problem with it. Genetic drift doesn’t stop. Once fixed does not mean always fixed. So let’s run the numbers and assume that the 20 million fixed differences between humans and the proto-chimp – not chimpanzees, because with chimpanzees we need to account for a total of 40 million – mostly happened very early in the process. In a population with effective size Nₑ = 10,000 (the value McCarthy uses), the expected time until a new mutation arises at any specific fixed site is:

Time to new mutation = 1 / (2Nₑ × μ)

Plugging in the values: 1 / (2 × 10,000 × 1.5 × 10⁻⁸) = 1 / (3 × 10⁻⁴) ≈ 3,333 generations

So on average, every fixed site will experience a new mutation within approximately 3,333 generations. Most of these new mutations will not themselves fix; the gene will become polymorphic, and the original fixation will be lost. So over long timescales, the cumulative probability that a site remains unchanged becomes vanishingly small.

The probability that a fixed site has experienced no new mutations over T generations is: P(unchanged) = e^(−2NₑμT)
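Both quantities can be checked directly with the values already used in this post: Nₑ = 10,000, μ = 1.5 × 10⁻⁸ per site per generation, and T = 400,000 generations.

```python
import math

# Sketch of the two site-level quantities above, using the post's values.

def time_to_new_mutation(n_e: float, mu: float) -> float:
    """Expected generations until a new mutation arises at a fixed site:
    1 / (2 * Ne * mu)."""
    return 1 / (2 * n_e * mu)

def p_unchanged(n_e: float, mu: float, t: float) -> float:
    """Probability a fixed site sees no new mutation over T generations:
    exp(-2 * Ne * mu * T)."""
    return math.exp(-2 * n_e * mu * t)

print(time_to_new_mutation(10_000, 1.5e-8))      # ~3,333 generations
print(p_unchanged(10_000, 1.5e-8, 400_000))      # exp(-120): vanishingly small
```

The exponent for T = 400,000 generations is 120, so the persistence probability is on the order of 10⁻⁵³, which is the “vanishingly small” figure in question.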

McCarthy’s model, necessarily corrected for continued mutation, predicts roughly 150,000 fixed differences between humans and the CHLCA. We observe approximately 20 million. The McCarthy model is therefore short by a factor of 133× even when we grant him a) an ancient Nₑ of 10,000, b) Kimura’s invariance, and c) 20 million free ancestral mutations.

The real number based on actual values is even worse for his attempted rebuttal. Using the math he presented, we a) adjust for the correct ancient effective population of 3,300, and b) correct Kimura’s algebra and incorporate the actual current population of 8 billion.

See what a difference it makes:

McCarthy’s Original Calculation:

  • Population: Nₑ = 10,000 (theoretical effective population size AND theoretical current population)
  • Mutations per generation: 100 per individual × 10,000 individuals = 1,000,000
  • Fixation probability: 1/2N = 1/20,000
  • Generations: 400,000
  • Total mutations: 400 billion
  • Expected fixations: 400 billion × 1/20,000 = 20 million fixations

Corrected Calculation Using Actual Values:

  • Nₑ = 3,300 (actual ancient effective population size)
  • N = 8,000,000,000 (actual population)
  • Mutations per generation: 100 per individual × 3,300 individuals = 330,000
  • Fixation probability: 1/2N = 1/16,000,000,000
  • Generations: 400,000
  • Total mutations: 330,000 × 400,000 = 132 billion
  • Expected fixations: 132 billion × 1/16,000,000,000 = 8.25 fixations
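The two calculations tabulated above can be run side by side; every figure below comes from the post itself, and the only moving part is which N goes into the supply term and which into the fixation probability.

```python
# McCarthy's original calculation vs. the corrected one, as tabulated above.

def expected_fixations(muts_per_gen: float, generations: int,
                       n_for_fixation: float) -> float:
    """Total mutations times the per-mutation fixation probability 1/(2N)."""
    total_mutations = muts_per_gen * generations
    return total_mutations * (1 / (2 * n_for_fixation))

# Original: Ne = 10,000 serves as both supply population and fixation N.
original = expected_fixations(muts_per_gen=1_000_000, generations=400_000,
                              n_for_fixation=10_000)

# Corrected: supply from the ancient Ne of 3,300 (100 mutations x 3,300
# individuals), fixation against the actual current population of 8 billion.
corrected = expected_fixations(muts_per_gen=330_000, generations=400_000,
                               n_for_fixation=8_000_000_000)

print(f"Original:  {original:,.0f} fixations")   # 20,000,000
print(f"Corrected: {corrected:,.2f} fixations")  # 8.25
print(f"Shortfall vs. 20 million observed: {20_000_000 / corrected:,.0f}x")
```

The same function produces both results; only the equivocation over which population size belongs in the fixation probability separates 20 million from 8.25.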

Thus the shortfall increases from 133× to 2,424,242× when we go from the theoretical to the actual.

There is absolutely nothing Dennis McCarthy or anyone else can do at this point to salvage either natural selection or neutral theory as an adequate engine for evolution and the origin of species. The one is far too weak to account for the empirically observed genetic changes and the second is flat-out wrong. Neither of them, alone or in combination with every other suggested mechanism for evolution, can come within several orders of magnitude of the quantity required to account for the genetic divergence in humans and chimpanzees, in mice and rats, in chickens and turkeys, or in savannah and forest elephants.

DISCUSS ON SG


Veriphysics: The Treatise 003

IV. The Inversion of Rights

No concept is more central to the Enlightenment’s self-understanding than the idea of natural right, the inherent entitlements that belong to every human being by virtue of reason and nature, prior to and independent of any government. Life, liberty, property, the pursuit of happiness: these were to be the inviolable foundations upon which a just, rational, and enlightened society would be built.

The subsequent history of human rights demonstrates something the Enlightenment philosophers clearly did not anticipate and never discussed: a right without a sound basis is a right that can be redefined, expanded, contracted, and ultimately inverted by whatever government is deemed capable of granting and defining it.

Consider the fate of intellectual freedom, that most cherished of Enlightenment values. J.B. Bury, in his 1913 History of the Freedom of Thought, offered a confident chronicle of humanity’s liberation from the shackles of religious and political censorship. The trajectory seemed clear: from the persecution of Socrates, through the medieval suppression of heresy, to the hard-won victories of the modern age, mankind was progressing toward ever-greater liberty of mind. A great scholar and the editor of The Cambridge Medieval History series, Bury wrote as a true believer in Enlightenment ideals, and his beliefs were representative of educated opinion in his time.

The subsequent century has not vindicated his optimism.

The progression—or rather, regression—is traceable through the very language of the freedom Bury celebrated. The original concept was freedom of thought. But this, upon examination, is a tautology. No external power has ever been able to reach into a man’s mind and compel his thoughts. The Inquisition could burn a heretic; it could not make him believe. Thought is already free by its very nature—it is private, inaccessible, beyond the reach of any tyrant. To proclaim “freedom of thought” as a right is to proclaim a right to what no one can take away.

The tautology was resolved by externalizing the freedom. Freedom of thought became freedom of speech: the liberty not merely to think but to express, to articulate, to attempt persuasion. The fact that freedom of speech was always fundamentally flawed and utilized primarily to defuse the blasphemy laws in Christian societies never seemed to trouble its champions, even as people were punished for perjury, slander, and other speech-related crimes.

But the expansion of the right did not stop there. Freedom of speech was soon expanded into freedom of expression: not merely words but conduct, symbols, art, gesture—the full range of human communicative action. This expansion seemed natural, even inevitable. If speech is protected, why not the t-shirt with a slogan, the armband, the flag, the dance, the photograph, or the pornographic video? Expression is simply speech by other means, after all.

Even as the scope of the freedom was expanded, the Enlightenment tradition also expanded the domain of speech regulation. Once expression is the category, expression can be parsed, distinguished, and classified. Some expressions are protected; others are not. And who determines the boundaries? Those with the power to enforce them.

The terminus of this progression is now visible. In the nations most committed to Enlightenment values, the ones that pride themselves on their liberal traditions and constitutional protections, speech is criminalized today to a degree that would have astonished Bury. In Britain, in Germany, in France, in Canada, and increasingly in the United States, one may not express, and in some cases may not be permitted to hold, certain prohibited thoughts. “Hate speech” codes, “anti-discrimination” requirements, “anti-extremism” measures: the vocabulary varies, but the effect is consistent. The freedom of thought that Bury celebrated has become the regulation of expression that his heirs enforce.

The right, unmoored from any transcendent ground and no longer endowed by Man’s creator, transmogrified into anything those in power declared it to be or not to be. The freedom to think became the freedom to speak, became the freedom to express, and became the freedom to express only what is permitted, which is to say, no freedom at all. The Enlightenment’s signature achievement consumed itself through its own warped logic, and those who enforced the final inversion did so in the name of the very values they were negating.

Nor was this the only right that was modified over time. The right of free association transformed into the crime of racism. The right to worship the Christian God was reduced to the right to pray in silence so long as no one noticed. The right of self-defense was inverted into an obligation to retreat. Even the marital rights of a man to his wife and children, honored throughout the centuries, were reduced to nothing more than financial obligations.

This is not a corruption of Enlightenment principles by its enemies. To the contrary, it is the application and extension of those principles by their truest believers.

DISCUSS ON SG


Preface to The Frozen Gene

I’m very pleased to announce that the world’s greatest living economist, Steven Keen, graciously agreed to write the preface to The Frozen Gene, which will appear in the print edition. The ebook and the audiobook will be updated once the print edition is ready in a few weeks.

Evolution is a fact, as attested by the fossil record, and modern DNA research. The assertion that evolution is the product of a random process is a hypothesis, which has proven inadequate, but which continues to be the dominant paradigm promulgated by prominent evolutionary theorists.

The reason it fails, as Vox Day and Claude Athos show in this book, is time. The time that it would take for a truly random mutation process, subject only to environmental selection of those random mutations, to generate and lock in mutations that are manifest in the evolutionary complexity we see about us today, is orders of magnitude greater than the age of the Universe, let alone the age of the Earth. The gap between the hypothesis and reality is unthinkably vast…

The savage demolition that Day and Athos undertake in this book of the statistical implications of the “Blind Watchmaker” hypothesis will, I hope, finally push evolutionary biologists to abandon the random mutation hypothesis and accept that Nature does in fact make leaps.

Read the whole thing there. There is no question that Nature makes leaps. The question, of course, is who or what is the ringmaster?

It definitely isn’t natural selection.

DISCUSS ON SG


The Reproducibility Crisis in Action

Now, I could not care less about the catastrophic state of professional science. Most scientists are midwits who are wholly incapable of ever doing anything more than chasing credentials, and the scientific literature ranges from about 50 percent to 100 percent garbage, depending upon the field. But I do feel sufficient moral duty to the great archive of human knowledge to bring it to the attention of the professionals when the very foundation upon which they’re basing a fairly significant proportion of their work is obviously, observably, and provably false.

So I submitted a paper calling attention to the fact that Kimura’s fixation model, upon which all neutral theory is based, is algebraically incorrect due to an erroneous cancellation in its derivation. In short, Kimura fucked up massively by assigning two different values to the same variable. In order to make it easy to understand, let me make an analogy about Democrats and Republicans in the US political system.

T = D + R, where D = 1-R.

This looks reasonable at first glance. But in fact, D stands for two different things here: it stands for Democrats, and it also stands for Not Republicans. These two numbers are always going to be different, because Democrats (47%) are not the same as Not Republicans, which includes both Democrats and Independents (62%). So any derivation that cancels out D as part of an equation is always going to produce incorrect results. Even for the simplest calculation of the percentage of the US electorate that is divided into Democrats and Republicans, instead of getting the correct answer of 85, the equation will produce an incorrectly inflated answer of 100.
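The analogy runs numerically in a few lines, using the post’s own figures: Democrats 47%, Not-Republicans (Democrats plus Independents) 62%, and therefore Republicans 38%.

```python
# The Democrat/Republican equivocation from the text, run numerically.
# Figures are the post's own illustrative percentages.

DEMOCRATS = 47                        # D as "Democrats"
NOT_REPUBLICANS = 62                  # D as "1 - R": Democrats + Independents
REPUBLICANS = 100 - NOT_REPUBLICANS   # 38

# Naive cancellation treats both D's as the same quantity, so
# (1 - R) + R collapses to 100 no matter what the real numbers are:
naive_total = NOT_REPUBLICANS + REPUBLICANS

# Using the value D actually stands for gives the real total:
actual_total = DEMOCRATS + REPUBLICANS

print(naive_total, actual_total)  # 100 85
```

The cancellation guarantees 100 regardless of the inputs, which is exactly the structure of the Nₑ equivocation: the answer is baked into the algebra, not derived from the data.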

So you can’t just use D and D to represent both values. You would do well to use D and Di, which would make it obvious that they can’t cancel each other out. Kimura would have been much less likely to make his mistake, and it wouldn’t have taken 57 years for someone to notice it, if instead of Ne and Ne he had used Ne and Nc.

So I wrote up a paper with Athos and submitted it to a journal that regularly devotes itself to such matters. The title was “Falsifying the Kimura Fixation Model: The Ne Equivocation and the Empirical Failure of Neutral Theory”, and you can read the whole thing and replicate the math if you don’t want to simply take my word for it.

Kimura’s 1968 derivation that the neutral substitution rate equals the mutation rate (k = μ) has been foundational to molecular evolution for over fifty years. We demonstrate that this derivation contains a previously unrecognized equivocation: the population size N in the mutation supply term (2Nμ) represents census individuals replicating DNA, while the N in the fixation probability (1/2N) was derived under Wright-Fisher assumptions where N means effective population size. For the cancellation yielding k = μ to hold, census N must equal Ne. In mammals, census populations exceed diversity-derived Ne by 19- to 46-fold. If census N governs mutation supply while Ne governs fixation probability, then k = (N/Ne)μ, not k = μ. This fundamental error, present in both the original 1968 Nature paper and Kimura’s 1983 monograph, undermines the theoretical foundation of molecular clock calculations and coalescent-based demographic inference. Empirical validation using ancient DNA time series confirms that the Kimura model systematically mispredicts allele frequency dynamics, with an alternative model reducing prediction error by 69%.

This is a pretty big problem. You’d think that scientists would like to know that any results using that equation are guaranteed to be wrong and want to avoid that happening in the future, right? I mean, science is all about correcting its errors, right? That’s why we can trust it, right?

Ms. No.: [redacted]
Title: Falsifying the Kimura Fixation Model: The Ne Equivocation and the Empirical Failure of Neutral Theory
Corresponding Author: Mr Vox Day
All Authors: Vox Day; Claude Athos

Dear Mr Day,

Thank you for your submission to [redacted]. Unfortunately, the Editors feel that your paper is inappropriate to the current interests of the journal and we regret that we are unable to accept your paper. We suggest you consider submitting the paper to another more appropriate journal.

If there are any editor comments, they are shown below.

As our journal’s acceptance rate averages less than half of the manuscripts submitted, regretfully, many otherwise good papers cannot be published by [redacted].

Thank you for your interest in [redacted].

Sincerely,

Professor [redacted]
Co-Chief Editor
[redacted]

Apparently showing them that their math is guaranteed to be wrong is somehow inappropriate to their current interests. Which is certainly an informative perspective. Consider that after being wrong for fifty straight years, they’re just going to maintain that erroneous course for who knows how many more.

Now, I don’t care at all about what they choose to publish or not publish. I wouldn’t be protecting the identities of the journal and the editor if I did. It’s their journal, it’s their field, and if they want to be reliably wrong, that’s not my problem. I simply fulfilled what I believe to be my moral duty by bringing the matter to the attention of the appropriate authorities. Having done that, I can focus on doing what I do, which is writing books and blog posts.

That being said, this is an illustrative example of why you really cannot trust one single thing coming out of the professional peer-reviewed and published scientific literature.

DISCUSS ON SG


Dennis McCarthy’s Round 2

Dennis McCarthy has responded to my response to his initial critique:

“Ahh,” says the evolution-skeptic, “I don’t care about fossils or biogeography or stories about salamanders or moths. Vox Day has proved mathematically that it can’t happen, so I don’t even have to think about any of this.”

First, Vox Day’s central argument in Probability Zero concerns neutral mutation fixation rates, which says nothing about natural selection and is largely orthogonal to most of what we have been discussing. Even if Motoo Kimura’s neutral theory—and the equations Vox Day disputes—were entirely mistaken, that would not overturn Darwinian evolution, nor would it undermine any of the empirical facts or conclusions considered so far. Vox Day himself effectively concedes as much in his response:

And in the interest of perfect clarity, note this: Dennis McCarthy’s critique of Probability Zero is not, in any way, a defense of evolution by natural selection. Nor can it be cited as a defense of speciation or Darwinism at all, because neutral theory has about as much to do with Darwin as the Book of Genesis

Actually, my full post response (like this one) did indeed defend evolution by natural selection. And the only reason I veered from the subject of Darwinism at all was to address Vox Day’s main mathematical arguments—and it is Day’s main arguments that are not relevant to Darwinism or evolution by natural selection. And this is true despite what Day frequently implies, what his readers persistently infer, and what the subtitle of Probability Zero plainly states.

Secondly, as I showed, his two main analyses were both flawed. He contended that it was essentially impossible, given the circumstances of mutation rates, population size, fixation-probability, etc., for the human lineage to have acquired 20 million fixed mutations in the nine million years since humans and chimpanzees last shared an ancestor.

As before, I invite those who are interested in participating in the discussion to read the whole thing there and comment on his site. I will refrain from responding to it until tomorrow. I will note that this is why it is important to read The Frozen Gene as well as Probability Zero, as the former completes the comprehensive case begun in the latter. I have, however, taken the liberty of correcting his cartoon.

By the way, I would be remiss if I did not mention that the print edition of TFG will feature a preface from a somewhat surprising source. The ebook will be updated accordingly, of course.

DISCUSS ON SG


Veriphysics: The Treatise 002

III. The Political Failures

The Enlightenment promised to place politics on a rational foundation. In place of the divine right of kings, the accidents of inheritance, and the weight of tradition, the people would be ruled more justly by a government grounded in reason and consent. The results of this centuries-long experiment are now in, and they do not vindicate those who advocated for it.

Jean-Jacques Rousseau’s Social Contract, published in 1762, proposed that legitimate political authority rests upon an agreement among free individuals to submit to the general will. The concept was elegant and has proven remarkably durable as a legitimating fiction. But it was never anything more than a fiction. No actual contract was ever signed. No one has ever been consulted about its terms, nor has anyone ever been permitted to negotiate them. The consent of the governed is presumed from the mere fact of residence, which is to say, it is not consent at all but submission enforced by the impracticality of any alternative. The man who may freely leave a country provided he abandons his home, his family, his language, his livelihood, and everything he knows, is not free in any meaningful sense. He is merely presented with a choice between submission and exile, and given the universal jurisdiction claimed by some countries, he may not even have that.

This abstraction at the heart of social contract theory, the idea that rational individuals in some imaginary past are assumed to have agreed to certain specific terms, does precisely the work that rational argument can never do: it manufactures consent that was never given by anyone. And this manufactured consent has proven useful for its ability to justify anything. Just thirty years after the publication of the Social Contract, Robespierre was sending men to their deaths on the guillotine in the name of the general will. The Jacobins were not betraying Rousseau’s principles; to the contrary, they were applying them. If the general will is supreme, and if some enlightened vanguard is able to discern that will more clearly than the confused masses, then terror in the service of the general will is not tyranny, but liberation. The Revolution did not reject the social contract. It followed its logical premises exactly where they led.

Representative democracy was meant to solve the problem of scale: direct democracy being impractical for large nations in the Eighteenth Century. Therefore, the people would elect representatives to deliberate on their behalf. The representatives would be constrained by accountability to their constituents, and the result would approximate the will of the people as closely as circumstances allowed.

Three centuries of practice have demonstrated the gap between theory and reality. The representatives are accountable not to the people but to the interests that fund their campaigns and the parties that control their advancement. The people are consulted every few years, presented with choices they did not make, between candidates selected by processes they do not control, on platforms that will be abandoned the moment they become inconvenient. Between elections, the permanent bureaucracy—elected by no one, accountable to no one—governs according to its own institutional logic. The people’s will, to the extent it can be determined, is an obstacle to be managed through media, education, and when necessary, simple disregard.

And direct democracy, which is now tenable due to technological advancement, is opposed everywhere by the representatives who claim to speak for the people. Referendums that consult the people directly are opposed by politicians and overturned by judges. The genuine will of the people is systematically thwarted by the Enlightenment’s parody of itself.

This anti-popular representative democracy is not a deviation from the democratic ideal; it is its mature expression. The Enlightenment theorists imagined that rational voters would deliberate on the common good and select wise representatives to enact sound policy. They refused to contemplate the way in which the structures of representative democracy would inevitably be captured by those with the strongest motivations to do so and sufficient resources to control them. The will of the people is not expressed by modern democracy; it is manufactured by the elite, distributed by the media, channeled through one or another of the ruling party’s factions, and then imposed by the government.

The separation of powers was designed to prevent tyranny by dividing authority among competing branches. The executive, legislative, and judicial were supposed to check each other’s oversteps, ensuring that no single faction could dominate. This mechanism has proven altogether inadequate to its stated purpose. The branches have not remained in productive tension; they have merged into a single ruling apparatus with superficial divisions. The legislature delegates its authority to executive agencies and abdicates its responsibility to make difficult decisions. The judiciary legislates from the bench, discovering in ancient documents various rights, requirements, and limitations that none of their authors could ever have imagined. The executive acts unilaterally whenever the legislature proves inconvenient. The separation of powers has not contained government overreach; it has instead provided a complex machinery for diffusing responsibility and eliminating accountability while concentrating effective control.

What the Enlightenment theorists failed to take into consideration is that political structures do not operate upon rational principles, but upon incentives, interests, and the will to power. Parchment barriers, however cleverly designed, constrain only those who choose to be constrained. The Constitution of the United States has not prevented the emergence of a surveillance state, a long series of undeclared wars, the demographic adulteration of the nation, or the periodic disenfranchisement of half the citizenry through the two-party system. It has merely required that all of the developments that materially harm the very Posterity whose rights the Constitution was written to safeguard be dressed in constitutional language.

DISCUSS ON SG


Jeffrey Epstein is Still Alive

I’m not even remotely surprised by this news. The picture of the “corpse” they showed obviously wasn’t the same man. The nose wasn’t even close.

An anonymous 4chan poster said that Epstein was wheeled out of prison hours before his reported death

Subpoenas show that man was Roberto Grijalva, who was a lieutenant at the prison at the time

It appears Epstein really did get broken out of prison and flown to Israel

Remember, if the mainstream media reports it, then it isn’t true. It never is.

DISCUSS ON SG


The Collapse of the Liberal Order

The inevitable failure of the post-WWI liberal world order is increasingly obvious to everyone now, but they’re still not connecting it to the even more inevitable failure of the Enlightenment and its false ideals.

What was the clearest early sign for you that the unipolar order was beginning to fracture?

The theorists such as Huntington, Faye, and Pat Buchanan were all writing about the inevitable fracture in the early 1990s. But for me, there were three events that conclusively indicated that the unipolar world was cracking.

The 2014 annexation of Crimea marked the first real irreversible breach. This wasn’t merely territorial – it was civilizational. President Putin invoked the baptism of Kievan Rus in 988, positioning Russia as the Third Rome inheriting Byzantium’s mantle. While Western elites dismissed this as nothing more than manipulative propaganda, they missed the core signal: a major power was reorganizing its legitimacy around its own territorial hegemony based on religious-historical continuity rather than liberal democratic norms.

The second sign was China’s 2015 declaration of cyber sovereignty. When Beijing asserted that nations have an absolute right to regulate internet activities within their borders, it wasn’t fundamentally about censorship – it was about civilizational control over cyberspace. The split internet wasn’t a bug; it was the architecture of civilizational spheres reawakening through technology.

The third indicator was the 2016 Brexit vote paired with Trump’s election. Brexit represented the first time a globalist institution like the EU actually contracted. And Trump ran on a political platform that promised to dismantle the liberal international order. These weren’t isolated populist spasms but the first mass democratic repudiations of Francis Fukuyama’s “end of history” thesis, as he himself has admitted. The liberal order’s legitimacy collapsed not from external attack but from internal hollowing – its own populations voting against its continuance.

This is the deeper point that a lot of observers are missing. They’re still trying to figure out how the Enlightenment ideals in which they still believe can be implemented in whatever replaces the failing world order, but this is a fundamentally flawed perspective because it is the failure of the ideals that is causing the failure of the world order.

However, simply attempting to return to traditional ideals won’t work, not because the ideals are false, but because the knowledge upon which they are based and their practical applications are at least 300 years out of date. Hence the need for a new post-Enlightenment philosophy that is capable of serving as the intellectual foundation for humanity’s eventual post-crash recovery.

DISCUSS ON SG


Diversity Will Hunt You Down

This is what is making the coming demographic repairs both inevitable and unavoidable. Because Clown World is aggressively attempting to eliminate not only the right to free association, but every last vestige of the ability of white people to live amongst themselves in the style they prefer:

The British countryside is in the midst of a diversity drive after a government-commissioned report found it was too ‘white’ and ‘middle-class’.

Officials charged with managing some of the country’s best known beauty spots have laid out a series of proposals aimed at attracting minorities.

The plans follow a review, ordered by the Department for Environment, Food & Rural Affairs (Defra), which warned the countryside was seen as ‘very much a white environment’ and risked becoming ‘irrelevant’ in a multicultural society.

In the wake of the report, officials representing National Landscapes – including the Cotswolds and Chilterns – have now published a series of management plans that detail their proposals to attract more minority communities. 

The Chiltern National Landscape will launch an outreach programme in Luton and High Wycombe targeted at Muslims.

One factor stopping ethnic minorities visiting was said to be ‘anxiety over unleashed dogs’.

Translation: they’re eventually going to try to ban dogs. I had no idea that leash laws were actually about imposing diversity on people.

The extremists of the younger generations are going to be a bit much for those of us who grew up comfortably indoctrinated in It’s a Small World Disney propaganda. For Generation X, our role is largely going to be to shut up, stay out of their way, and let them get on with fixing the problem as they see fit. Because we all know who their patron saint is going to be, and it isn’t St. George.

DISCUSS ON SG