Plague for profit

All right, so this Blind Item at Crazy Days and Nights sounds way too far-fetched to be true; it’s more akin to an SF-horror novel plot than actual news, right?

Apparently The Church is finding it more difficult to bring in children the way they have in the past. That elusive head of the Church has donated tens of millions of dollars to research against diseases, many of which adversely affect third world countries. It was during this process of trying to eradicate a disease that one of the scientists created a pathogen which can kill swiftly and effectively. When the head of The Church heard about it, he agreed to test it on a village in a country that was friendly to bribes. It worked really well. It killed an astonishing number of people, who were mainly adults. The children of the adults were 30-40 miles distant at a boarding school. Now, with no parents, they needed to be adopted. The Church, along with more bribes to the government, had a great way to get large numbers of children quickly.

With that success, they decided to try it again, but this time, the villagers didn’t stay in place as they had before and some traveled to a neighboring village. The next thing you know, it has now started spreading to different countries and killing people faster than they can create cover stories. Look for them to spread the rumor it is an Ebola outbreak to give themselves a chance to destroy the evidence of what they did.

It wouldn’t shock me if they come forward with a cure and make hundreds of millions of dollars. That might bring too much publicity for them though. Even they would have a tough time watching thousands of people die, though, wouldn’t they?

I mean, lethal artificial diseases being tested in third world countries is just a crazy conspiracy theory, right? Right?

Fears are growing of a major health crisis in East Africa as a girl died of a suspected fever which could be more deadly than the Black Death. A nine-year-old girl died in central Uganda with the symptoms of an eye-bleeding disease which it is thought could kill up to 40 per cent of those infected by it.

The feared outbreak comes only months after hundreds of people were killed by the plague in Madagascar in what was described as the worst bout for 50 years. The symptoms of the new disease include headaches, bleeding, vomiting, diarrhoea and muscle pains.

This timeline just keeps getting weirder. If the Vikings win the Super Bowl, we’ll know all bets are officially off and literally anything can happen.


Narcissist vs psychopath

Anonymous Conservative explains how similar behaviors stem from very different sources:

My view of the Narcissist is that their amygdala is too painful when triggered, and their brain is not able to handle the stimulation of it. The narcissists I have observed would actually see their brains melt down when triggered, and it would manifest in what looked like incredibly unpleasant physical symptoms, almost combining a seizure with the gastrointestinal upset and sickness of a major illness.

The psychopath is the opposite. Their amygdala is not there, so they don’t really feel fear. I am reminded of the character Hannibal Lecter in the book Hannibal. At a critical moment, man-eating hogs are released, and rush toward Hannibal, who is holding FBI agent Clarice Starling in his arms. But the pigs move around Hannibal, because he feels no fear, and the pigs detect it. Although the scene is fictional, that is how psychopath brains operate.

Now narcissists, out of necessity, eventually hack their brains by using a false reality to shut off that amygdala-pain. They develop the ability to force their brain to believe something untrue, just so their amygdalae will feel relief and will not turn on. I am quite certain it begins in childhood. As children, however, I am not sure if they force themselves to believe an untruth, and that eventually becomes more and more common as their brain finds it relieves angst, or if the untruth, when contemplated, is so relieving their brain cannot tell it from the truth. From their amygdala’s perspective, that would feel the same as when we find believing a falsehood irritating, and as a result we seek relief when we default to truth.

But once a narcissist develops this hack, now their amygdala’s influence on the brain and behavior is very similar to how a psychopath’s amygdala influences the brain and behavior. It is as if the amygdala is not there. The psychopath just feels nothing, while the narcissist alters their beliefs until they feel nothing.

Interestingly enough, he concludes that narcissists are more dangerous than psychopaths, because psychopaths are too clueless to be able to conceal themselves or their deeds very effectively.


Genetically inferior

More scientific evidence is accumulating in support of my original hypothesis that atheism is a form of mental abnormality that results in spiritual insensitivity:

Left-handed people are more likely to be atheists, a study has found, as it says belief is passed on genetically.  The study suggests that religious people have fewer genetic mutations and are therefore less likely to be left handed or have conditions such as autism or schizophrenia.

British academic Edward Dutton, a professor at Oulu University, Finland, said that in pre-industrial times religiosity was passed on like other genetic attributes because it was associated with greater stability, mental health and better social behaviour. But modern science means many people who would not previously have survived are making it to adulthood and reproducing – leading to a greater incidence of atheism.

Lack of belief in God is connected to genetic mutations which cause attributes such as left-handedness or autism, the paper argues.

This would also put Bruce Charlton’s Mouse Utopia observations into context, as atheism appears to be one aspect of the nihilistic despair that is a consequence of the increased prevalence of genetic inferiority that results from easier circumstances.



The Roman invasion of Britain

Caesar’s invasion site is believed to have been found:

The first Roman invasion of Britain by Julius Caesar in 55BC is a historical fact, with vivid accounts passed down by Tacitus, Cicero and Caesar himself. Yet, despite a huge landing force of legionaries from 800 ships, no archaeological evidence for the attack or any physical remains of encampments have ever been found.

But now a chance excavation carried out ahead of a road building project in Kent has uncovered what is thought to be the first solid proof for the invasion. Archaeologists from the University of Leicester and Kent County Council have found a defensive ditch and javelin spear at Ebbsfleet, a hamlet on the Isle of Thanet. The shape of the ditch at Ebbsfleet is similar to Roman defences at Alésia in France, where a decisive battle in the Gallic War took place in 52 BC.

Experts also discovered that nearby Pegwell Bay is one of the only bays in the vicinity which could have provided harbour for such a huge fleet of ships. And its topography echoes Caesar’s own observations of the landing site.

Dr Andrew Fitzpatrick, Research Associate from the University of Leicester’s School of Archaeology and Ancient History, said: “Caesar describes how the ships were left at anchor at an even and open shore and how they were damaged by a great storm. This description is consistent with Pegwell Bay, which today is the largest bay on the east Kent coast and is open and flat.

“The bay is big enough for the whole Roman army to have landed in the single day that Caesar describes. The 800 ships, even if they landed in waves, would still have needed a landing front 1-2 km wide. Caesar also describes how the Britons had assembled to oppose the landing but, taken aback by the size of the fleet, they concealed themselves on the higher ground. This is consistent with the higher ground of the Isle of Thanet around Ramsgate.”

Thanet has never been considered as a possible landing site before because it was separated from the mainland until the Middle Ages by the Wantsum Channel. Most historians had speculated that the landing happened at Deal, which lies to the south of Pegwell Bay.

This is, of course, absolutely fascinating in its own right. It would be intriguing to compare the layout of the land with the historical descriptions. But it is also an important lesson in the difference between history and archeology, and how there is very little physical evidence of many events that are widely accepted as having taken place.

Science is simply not a viable metric for the past, due to the intrinsic limits of scientody.


Not a good start

After reading Tom Wolfe’s unstinting praise of EO Wilson, I decided I needed to read the man’s work. Who could fail to be interested after this sort of billing?

He could be stuck anywhere on God’s green earth and he would always be the smartest person in his class. That remained true after he graduated with a bachelor’s degree and a master’s in biology from the University of Alabama and became a doctoral candidate and then a teacher of biology at Harvard for the next half century. He remained the best in his class every inch of the way. Seething Harvard savant after seething Harvard savant, including one Nobel laureate, has seen his reputation eclipsed by this terribly reserved, terribly polite Alabamian, Edward O. Wilson.

Fantastic. But as I am insufficiently learned to read his scientific work critically, I elected to begin with his philosophical work, specifically, The Meaning of Human Existence. And I was unexpectedly disappointed on only the second page. To say that it does not begin well for a man of supposedly superlative intelligence would be an understatement.

In ordinary usage the word “meaning” implies intention, intention implies design, and design implies a designer. Any entity, any process, or definition of any word itself is put into play as a result of an intended consequence in the mind of the designer. This is the heart of the philosophical worldview of organized religions, and in particular their creation stories. Humanity, it assumes, exists for a purpose. Individuals have a purpose in being on Earth. Both humanity and individuals have meaning.

There is a second, broader way the word “meaning” is used and a very different worldview implied. It is that the accidents of history, not the intentions of a designer, are the source of meaning. There is no advance design, but instead overlapping networks of physical cause and effect. The unfolding of history is obedient only to the general laws of the Universe. Each event is random yet alters the probability of later events. During organic evolution, for example, the origin of one adaptation by natural selection makes the origin of certain other adaptations more likely. This concept of meaning, insofar as it illuminates humanity and the rest of life, is the worldview of science.

What? All right, hold on just one sociobiologically-constructed minute. No one, literally no one, ever uses the word “meaning” that way. Even less so can this usage be excused in the case of an author who is writing in the intrinsically philosophical context of attempting to explain the significance of Man’s existence. Let’s reference the dictionary.

MEANING, noun

  1. what is intended to be, or actually is, expressed or indicated; signification; import
  2. the end, purpose, or significance of something

Hmmm. He has at least a superficial excuse. It appears that Wilson is playing a little fast-and-loose with the definition of “meaning” here. He is clearly using it in the sense of “what actually is”. That is (unexpectedly) fair enough, except for the fact that by selecting that specific meaning of the word,(1) he reduces both his statement and the thesis of his book to basic tautologies.

Consider the title: The Meaning of Human Existence. Now let’s incorporate this second, broader way the word meaning is used, according to Wilson: The Actual Is of Human Existence. What, one wonders, can we derive from Wilson’s bold statement that humans actually exist? Are we to assume it is a catalog of facts about humanity rather than a statement about the significance of humanity’s existence? It’s more akin to a bad comedy routine than a genuine philosophical statement.

“What do you mean by that?”
“What it is. What it actually is.”
“I know what you said. But what do you mean?”
“What I said. What else could I mean?”
“Don’t you mean what else could I actually is?”
“Don’t you?”

In fact, I even suspect Wilson of cherry-picking this definition in order to beg the question he appears to be feigning to propose, given that it does not appear in other dictionaries, such as the Oxford online dictionary.

MEANING, noun

  1. What is meant by a word, text, concept, or action.
  2. Implied or explicit significance.
  3. Important or worthwhile quality; purpose.

But the definition provided is even worse than the self-parody it appears to be. Remember, Wilson didn’t directly state that meaning is that which actually is; he declared that the second way the word is used is that the accidents of history “are the source of meaning”. So, he’s actually using the word meaning in his own definition of the word meaning. This is either intellectual incompetence or intellectual shadiness, and while I cannot say which is the case yet, I am now on high alert to the probability of either… or both.

Given this shaky – or shady – foundation, I do not have very high hopes for the philosophy that Mr. Wilson has constructed upon it. I completely understand why some find my intellectual arrogance to be unseemly and off-putting, but honestly, can you not in turn understand how I come by it, given how often this sort of thing happens?


(1) One can legitimately groan at that one. It does nicely underline my point, though.


The handicap of high IQ

This recent finding on intelligence and leadership will not surprise anyone at this blog:

Although intelligence is positively correlated with inspiring and capable leadership, there’s a point where a leader’s IQ offers diminishing returns or can actually lead to detrimental leadership.

The findings were made by psychologists at the University of Lausanne, Switzerland, who assessed 379 mid-level leaders employed by private companies in 30 mainly European countries. The average age of the participants was 38 and 27 percent of them were women.

Each participant was asked to complete the Wonderlic Personnel Test, a cognitive ability test widely used by employers and educational institutions around the world. The average IQ of the participants was 111, which is well above the average IQ score of 100 for the general population….

As previous studies showed, the Swiss researchers found that there was a linear relationship between intelligence and effective leadership — but only up to a point. This association plateaued and then reversed at IQ 120. Leaders who scored above this threshold scored lower on transformational and instrumental leadership than less intelligent leaders, as rated by standardized tests. Over an IQ score of 128, the poorer leadership style was plainer and statistically significant, as reported in the Journal of Applied Psychology.

It’s important to note at this point that these ‘very smart’ leaders didn’t employ detrimental leadership styles but rather just scored lower than their ‘less smart’ peers on useful leadership styles.

You’ll notice that these findings are perfectly consistent with both the observed exclusion of the cognitive elite from the professional elite as well as my distinction between VHIQ and UHIQ. It may also help you understand why I consistently refuse the various leadership positions I am regularly offered as well as why I am so careful about the volunteers I accept.

I intensely dislike explaining things in unnecessary detail, much less justifying things to anyone, especially subordinates. I simply cannot work with people who insist on both a) having the obvious spelled out to them and b) taking umbrage at having things explained step-by-step for them from the beginning as if they were stupid. (Their words, not mine.) Here is the problem with that conceptual dichotomy: if you have to have the obvious spelled out to you, if you can’t immediately grasp the whole chain of reasoning from start to finish, then it is necessary to spell everything out from the beginning because the other person cannot possibly know at what point your ability to go from A to Z broke down.

Another problem is the way in which many, if not most, people are unable to recognize that for every effect, there must be a cause. If I ask a question, then I want the answer to it. I don’t care if you’ve told me the answer 40 times before. I don’t care if you think I should already know the answer. I don’t care if you think there is a different question that I should have asked. Just answer the damned question; I guarantee doing so will take considerably less time than engaging in a debate over any of the various possible permutations of a discussion exploring the reasons why you should not be under any obligation to answer the aforementioned question. What is more likely, the probability that I have forgotten what you have said or the probability that I derive some sort of strange pleasure from forcing you to answer the same question again? Just answer the question that was asked. If that causes any questions to arise on your part, that’s fine, but ask them after you answer mine first.

I have also noticed that many people seem to rather enjoy playing dumb, ignoring the most likely context, and insisting on having everything explained to them instead of using their common sense to assume the probable. For example, if I say “wash the car” to my friend, is it reasonable for him to say, “whatever car do you mean? There are millions, tens of millions of cars in the world? How can I possibly take action when I have no idea what car you could possibly be referring to?”

To which my response is: “There is one car in the driveway. It is mine. It is dirty. You borrowed it yesterday. Do you really think I am referring to the presidential limo – no, wait, let’s not confuse you and be too general, do you really think I am referring to the U.S. presidential limo?”

Now, the most likely context may or may not be the correct one. But it is surely the correct assumption, which one can either listen and wait to see confirmed by subsequent details or, in the absence of those, confirm with a simple question. But to pretend that no actionable information has been presented and that one is operating in a complete absence of data is false, disingenuous, and may even be reasonably considered dishonest. Whether this behavior is the result of looking to excuse inaction, to avoid thinking, or to avoid any responsibility for decision-making, I do not know. Regardless, a highly intelligent person is likely to find this sort of pedantic pseudo-ignorance to be aggravating, and thereby, right from the start, find himself behind the leadership eight-ball in the eyes of his subordinates.

In my opinion, an important aspect of good leadership is a collection of good followers who actively want to be led. I don’t think it is a coincidence that the “poorer leadership” line of demarcation observed happens to almost perfectly line up with the so-called 2SD “communications gap”; with a standard deviation of 15 points, an IQ of 128 is nearly two standard deviations above the general population average of 100. Unfortunately, I don’t have any useful advice for the 2SD+ crowd, other than “find smarter subordinates” and “never be surprised by any failure to understand what you think to be obvious.”


Plagiarism is plagiarism

Toddy-Cat isn’t quite sure that the Zman is a plagiarist.

“I’m not sure that not citing a source in a response to a blog comment actually rises to the dignity of ‘plagiarism’”.

That degree of uncertainty is fair, especially if you haven’t actually read the source yourself, as I have not. But, as Tublecane demonstrates, once you look at Stove’s actual words and compare them to the Zman’s words, you are forced to conclude there is nothing to be uncertain about:

I thought of the paraphrasing defense, but that doesn’t hold up. It’s not that Z-man comes off sounding like Stove because he uses the same general form of argument, borrowing a phrase or two…. I believe it was deliberate. Compare:

thezman: “Much more is known now about the natural world, than was known fifty years ago…”

Scientific Irrationalism by David Stove, (p.1) “Much more is known now than was known fifty years ago…”

thezman: “…and much more was known then than in 1580.”

Stove: “…and much more was known then than in 1580.”

thezman: “So there has been a great accumulation or growth of knowledge in the last four hundred years.”

Stove: “So there has been a great accumulation or growth of knowledge in the last four hundred years.”

thezman: “This is an extremely well-known fact.”

Stove: “This is an extremely well-known fact…”

thezman: “Let’s call this (A).”

Stove: “…which I will refer to as (A).”

thezman: “A person, who did not know (A), would be uncommonly ignorant.”

Stove: “A philosopher, in particular, who did not know it, would be uncommonly ignorant.”

The remainder of the post veers away from Stove’s text, though I wouldn’t be surprised if it were stolen from somewhere else. Now, whether such a thing as plagiarism exists in internet comment sections, that’s a different matter. I say yes, because it’s publicly passing off someone else’s writing as your own.

Tublecane is correct. The Zman clearly attempted to pass off David Stove’s writing and ideas as his own in order to try to place himself in an intellectually superior position from which he could then pass judgment. It’s not merely a question of what he did, but why he did it in that particular manner. He is observably a plagiarist. This observation is further supported by the fact that the Zman didn’t understand the argument that Stove was making about Karl Popper, nor does he understand Popper’s positions, nor does he even understand the fundamental differences between a) logic, b) math, and c) science, let alone the current need for the etymological division of “science” into its three aspects of scientody, scientage, and scientistry.

Ogre agrees. “It’s absolutely plagiarism in the sense of ‘presenting the words of another as your own.’ And that’s really the only kind of plagiarism we care about here. Whether it could be considered academic plagiarism (I don’t know) or copyright infringement (it’s not), it’s still a dishonest and unethical thing to do. Especially given the context in which it was presented. It’s just more evidence of his posturing–passing off another’s arguments and expressions as his own in order to bolster his perceived intelligence.”

As has been the case every single time I have exposed the pretenses and posturings of someone who has fans, some of those fans are attempting to change the subject away from the failings of that particular individual to my theoretical motivations in destroying that individual’s intellectual reputation. To those fans, I will simply point out that my motivations are irrelevant, the facts are readily observable to everyone, and that this is what I do every time anyone comes at me, be they friend or foe.

The Zman and his would-be defenders can dance and defend and distract and theorize all they like. It won’t make any difference. The point is that he’s not particularly smart, he’s not very well-read – it wouldn’t surprise me to learn he hasn’t actually read much of the Stove book past the first page since he clearly didn’t understand it – and most importantly, he’s not very honest. And his moral and intellectual failings have nothing to do with me, as I am merely one of the many people who has happened to observe them.

The main difference between me and most of those who wish to somehow minimize my influence or discredit me is not that I am at least a standard deviation more intelligent than they are, although that is often true. The main difference is that for 16 years I have had tens of thousands of opponents poring over my every word written in column, blog post, comment, tweet, and book, looking for every possible mistake they can exploit, and most of my critics have not.

So, even if I lacked both confidence in my own words and personal integrity, I know better than to ever make the sort of stupid, obvious, dishonest, and self-discrediting mistake that the Zman did in plagiarizing David Stove’s words and attempting to pass off Stove’s ideas as his own. At the end of the day, a man must decide whether he values his integrity or he values the opinions of others. My decision should be obvious from my mantra: MPAI.


Mailvox: posturing and plagiarism

Tublecane accuses the Zman of plagiarizing David Stove:

If those paragraphs you quoted in your update are supposed to be Z-man’s words, uttered without reference to their source, oh boy. I thought they sounded familiar, so I checked my copy of David Stove’s Scientific Irrationalism and Z-man copies verbiage found on page one. Right down to the year 1580, the letter “A,” and the phrase “uncommonly ignorant.”

Stove, being much brighter than the Z-men of the world, wasn’t making an “everything scientists say is factual, so shut up” argument. He doesn’t even share Z-man’s opinion on Popperian falsifiability, though he lays into Popper and finds him guilty of launching a line of irrationalism (or a “postmodern cult,” as the subtitle has it) in the philosophy and historiography of science. A line which isn’t so bad with Popper but gets worse and worse as you go through Kuhn, Lakatos, and Feyerabend.

The point about accumulation of knowledge, which is robust in Stove’s book, is neither here nor there regarding the subject at hand. Z-man thinks he’s dealing with nihilists, and nihilists would have trouble with facts accumulating. But of course that has nothing to do with how you characterize varieties of “science” in the 16 Points. Science since 1580 could have simultaneously been more wrong than right and still served to advance human knowledge.

Upon closer inspection, Z-man explicitly mentions David Stove’s Popper and After, but in a separate post from the one in which he steals from it.

I call plagiarism!

Moreover, plagiarism that would be insulting to Stove, RIP, since he wouldn’t be caught making an argument as silly as Z-man’s.

I haven’t read any of David Stove’s books, so I can’t testify to the accuracy of the accusation of plagiarism. But it’s not particularly surprising to be informed that the argument the Zman was making is not his own, as 8 hours before Tublecane posted his comment, I had made this observation: “One definitely has the impression that the Zman has not read Popper, or even Kuhn, himself, but rather, has read what people have written about Popper.”

In any event, this demonstrates why it is important not to feign knowledge you do not possess, not to pass off the arguments of others as your own, and not to express opinions on subjects you do not know very much about. Especially on the Internet, someone is bound to eventually notice that you are an intellectual fraud.


In over his head

Now, I generally enjoy reading both John Derbyshire and Zman, and I think they’re both intelligent iconoclasts, but every now and then I am surprised to discover how conceptually limited various would-be critics can be when they attempt to criticize me. They preen, they posture, and they pontificate even as they demonstrate that they neither understand me nor know whereof they speak. That may sound a little arrogant, but bear with me a moment and you’ll see what I mean.

The Zman mentioned this in his interesting summary of attending the Mencken Club:

John was first up and he used Vox Day’s 16-points blog post as the framework for his talk. He made the point that Vox is by no means the leader of the alt-right or the voice of it, but a representative sample that is useful for analyzing the movement. His comments about item number eight were laugh out loud funny, to the empirically minded. What John was doing was introducing the general ideas of the alt-right to a crowd that is not spending their evenings in the meme war. He did a good job presenting the broad strokes.

This is all very well, but it led to the following string of comments which revealed some unexpected conceptual limitations on the part of the Zman. His failure either to grasp the obvious linguistics involved or to understand the basic nature of science is, to put it mildly, surprising. I find myself wondering if these failures are a logical consequence of his atheistic philosophical incoherence running headlong into its own conclusions, a kneejerk reaction to displeasure with something I have said, or simply an indication of his cognitive limitations.

Toddy+Cat
Personally, I’d be very interested to hear what John Derbyshire had to say about Vox’s point number eight. Derbyshire is a great intellect, a fantastic writer, and has enough moral courage for several men, but he (like all of us) is a product of his Time, and sometimes has way too much respect for “science” and the “scientific community”. He sometimes does not seem to realize just how politicized “science” has become in our day. As much as both he and I might regret it, this ain’t 1955.

thezman
He comically analyzed the possible entomology of the words, “scientodific” and “scientody”. He also pointed out the absurdity of the claim that scientific conclusions are liable to future revision. For instance, Mars is closer to the sun than Jupiter, a conclusion of science that will never be liable to revision. John correctly pointed out that number eight is gibberish.

I’ve written a little about this topic. I’ll be revisiting it frequently. I think there may even be a book in it, if I can manage to squeeze more than 24 hours from each day. Suffice it to say that I don’t think science and technology can be jammed into the moral philosophy of the 17th and 18th century. Therefore, we either kill all the scientists or create a new moral philosophy.

Heywood
Mars is closer to the sun than Jupiter… Anyway, that’s a stupid argument. While facts may be immutable – for a time, which may be long, like in the case above – our interpretations in the form of theories are always placeholders. Good till something better comes along, which, btw, must have the same predictive power as the old theory had where it was applicable, while extending the range of predictability. See classic Newtonian mechanics vs. theory of relativity. And as it happens, neither addresses Vox’s point about the many modern “scientists” who employ the trappings and outer forms of science while gleefully ignoring everything that makes it actually useful. Nothing of this strikes me as overly difficult to either comprehend or establish, given the state of sciences today.

thezman
I’ve heard every iteration of factual nihilism and I have no interest in taking it seriously. It’s just another way of putting the goal posts on roller skates.

Man of the West
Zman said: “He also pointed out the absurdity of the claim that scientific conclusions are liable to future revision” I am assuming and hoping that what John meant is that SOME scientific conclusions are not open to revision, as his example — though questionable as science — shows. Because if not, and if the above statement is what he actually meant, then it is foolishness of a high order given the provisional nature of science and the fact that we know that there are unknown errors in scientific conclusions (as the replication crisis is presently showing us) that may be discovered at a later time.

Of course, John may have a unique definition of ‘science’ or of the word ‘conclusions’ which gives him some wiggle room, but then he is just playing semantic games.

thezman
Think of it this way. There is a set of things that have to be true or nothing is true. There is a set of things that are most likely true, but have yet to be conclusively proven. There are a set of things that may be true, but there’s either no way to test them or the efforts to prove them have fallen short. Finally, there is the set of things that are unknown.

Scientific conclusions are the first set. The second and third sets are open to revision and challenge. It’s not a matter of semantics. It is about definitions. People tend not to grasp the definitions of science, because they have had little exposure to math or science.

Byzantine_General
Your first category encompasses logic and mathematics. Your second includes theories of gravitation, where Einstein’s superseded Newton’s. But your words seem to place today’s scientific “conclusions” in the first category, rather than the second. Can you explain without casting nasturtiums?

“Factual nihilism”. Hmph.

thezman
I wrote, “Scientific conclusions are the first set.” That seems to cover it. Science, like mathematics, is about the accumulation of axioms, things that are assumed to be true by their nature. Put another way, if everything is open to revision, there is no truth.

Byzantine_General
I am gobsmacked, and I say this lovingly, by your wrong-headedness.

Science deals in theories, not conclusions. Theories are always contingent; we strenuously fail to falsify them, accumulate partial belief in them, build on and with them, but can never reach axiomatic mathematical certainty.

Not long ago, the best available theory was that neutrinos had zero mass. It could have been loosely said that science had concluded that this was the case. But the progress of science demands that it does not make conclusions.

A pesky observation (that the wrong kind of neutrinos were arriving on Earth from the Sun) forced that theory to be abandoned. A better theory accounted for the observation (and all previous observations), and entailed a non-zero mass. This is a perfect example of scientific progress.

Axioms are not subject to abandonment. Had the zero mass of the neutrino been treated as an axiom, we would have been forced to reject the observation and the better theory, because the contradiction of an axiom is automatically false.

Karl Popper’s criterion for deciding whether a theory is scientific is whether it could by some conceivable observation be falsified. Theories that can withstand any contrary evidence are called “religions”.

thezman
Karl Popper was wrong. If everything can be falsified, then there is no truth. That’s nihilism.

This is one of those moments when you suddenly realize that someone simply isn’t as intelligent as you had previously thought they were. One does not need to be a Popperian to recognize the obvious fact that many, if not most, scientific conclusions are intrinsically provisional, if not based entirely on false foundations. The Zman is making a common error in confusing the scientific method with a means of determining absolute truth; scientody is actually nothing more than a tool for determining what is not true from a material perspective and therefore can only ever be a means of narrowing the possible scope of the truth of things that remain firmly within the temporally accessible aspects of the material realm.

History, for example, is a matter of firmly established fact, and yet remains largely outside the realm of science and its conclusions.

In claiming that a correct understanding of science is nihilism, he confuses the subset of observable facts with the much larger set of scientific conclusions. And in asserting that science is about the accumulation of axioms, with the alternative being nihilism, he demonstrates that he understands neither science nor mathematics nor the philosophy of science. Or, for that matter, nihilism.

Speaking of etymology, it seems we’re going to need to coin a new term for this sort of high midwittery.

UPDATE: I don’t think the Zman properly grasped what John Derbyshire was saying at all. But I will post my response to Derb’s comments in a separate post.

UPDATE: Yeah, he’s just not very bright. He didn’t even hesitate to double down in response to this post.

Much more is known now about the natural world, than was known fifty years ago, and much more was known then than in 1580. So there has been a great accumulation or growth of knowledge in the last four hundred years.

This is an extremely well-known fact. Let’s call this (A). A person, who did not know (A), would be uncommonly ignorant. To assert that all scientific conclusions are open to revision, as Vox Day has done, is to deny the existence of (A). I see he is now pushing around the goal post on wheels to try and obscure the fact he made a ridiculous statement, but that changes nothing.