The handicap of high IQ

This recent finding on intelligence and leadership will not surprise anyone at this blog:

Although intelligence is positively correlated with inspiring and capable leadership, there’s a point where a leader’s IQ offers diminishing returns or can actually lead to detrimental leadership.

The findings were made by psychologists at the University of Lausanne, Switzerland, who assessed 379 mid-level leaders employed by private companies in 30 mainly European countries. The average age of the participants was 38 and 27 percent of them were women.

Each participant was asked to complete the Wonderlic Personnel Test, a cognitive ability test widely used by employers and educational institutions around the world. The average IQ of the participants was 111, which is well above the average IQ score of 100 for the general population….

As previous studies showed, the Swiss researchers found that there was a linear relationship between intelligence and effective leadership — but only up to a point. This association plateaued and then reversed at IQ 120. Leaders who scored above this threshold scored lower on transformational and instrumental leadership than less intelligent leaders, as rated by standardized tests. Over an IQ score of 128, the poorer leadership style was plainer and statistically significant, as reported in the Journal of Applied Psychology.

It’s important to note at this point that these ‘very smart’ leaders didn’t employ detrimental leadership styles but rather just scored lower than their ‘less smart’ peers on useful leadership styles.
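
To make the reported shape concrete, here is a minimal illustrative sketch, entirely my own and using invented numbers, of the kind of curvilinear (inverted-U) model that captures a benefit which levels off and then turns negative:

```python
# Illustrative only: synthetic data with an inverted-U relation between IQ and a
# leadership rating, fit with a quadratic. The numbers are invented; only the
# general shape (rise, plateau, reversal around a turning point) mirrors the finding.
import numpy as np

rng = np.random.default_rng(0)
iq = rng.normal(111, 12, size=379)          # sample loosely modeled on the study's mean IQ of 111
peak = 120                                  # assumed turning point, for illustration only
rating = -0.002 * (iq - peak) ** 2 + rng.normal(0, 0.05, size=iq.size)

b2, b1, b0 = np.polyfit(iq, rating, deg=2)  # rating ~ b2*IQ^2 + b1*IQ + b0
print(f"estimated turning point: IQ ~ {-b1 / (2 * b2):.0f}")
```

The only point of the quadratic term is that a straight line cannot plateau or reverse; where the real turning point lies, 120 or 128, is the study’s empirical question, not the model’s.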

You’ll notice that these findings are perfectly consistent with both the observed exclusion of the cognitive elite from the professional elite as well as my distinction between VHIQ and UHIQ. It may also help you understand why I consistently refuse the various leadership positions I am regularly offered as well as why I am so careful about the volunteers I accept.

I intensely dislike explaining things in unnecessary detail, much less justifying things to anyone, especially subordinates. I simply cannot work with people who insist on both a) having the obvious spelled out to them and b) taking umbrage at having things explained step-by-step for them from the beginning as if they were stupid. (Their words, not mine.) Here is the problem with that conceptual dichotomy: if you have to have the obvious spelled out to you, if you can’t immediately grasp the whole chain of reasoning from start to finish, then it is necessary to spell everything out from the beginning because the other person cannot possibly know at what point your ability to go from A to Z broke down.

Another problem is the way in which many, if not most, people are unable to recognize that for every effect, there must be a cause. If I ask a question, then I want the answer to it. I don’t care if you’ve told me the answer 40 times before. I don’t care if you think I should already know the answer. I don’t care if you think there is a different question that I should have asked. Just answer the damned question; I guarantee doing so will take considerably less time than engaging in a debate over any of the various possible permutations of a discussion exploring the reasons why you should not be under any obligation to answer the aforementioned question. What is more likely, the probability that I have forgotten what you have said or the probability that I derive some sort of strange pleasure from forcing you to answer the same question again? Just answer the question that was asked. If that causes any questions to arise on your part, that’s fine, but ask them after you answer mine first.

I have also noticed that many people seem to rather enjoy playing dumb, ignoring the most likely context, and insisting on having everything explained to them instead of using their common sense to assume the probable. For example, if I say “wash the car” to my friend, is it reasonable for him to say, “whatever car do you mean? There are millions, tens of millions of cars in the world? How can I possibly take action when I have no idea what car you could possibly be referring to?”

To which my response is: “There is one car in the driveway. It is mine. It is dirty. You borrowed it yesterday. Do you really think I am referring to the presidential limo – no, wait, let’s not confuse you and be too general, do you really think I am referring to the U.S. presidential limo?”

Now, the most likely context may or may not be the correct one. But it is surely the correct assumption to make, one that can be confirmed either by listening and waiting for subsequent details or, in their absence, by asking a simple question. But to pretend that no actionable information has been presented and that one is operating in a complete absence of data is false, disingenuous, and may even be reasonably considered dishonest. Whether this behavior is the result of looking to excuse inaction, to avoid thinking, or to avoid any responsibility for decision-making, I do not know. Regardless, a highly intelligent person is likely to find this sort of pedantic pseudo-ignorance to be aggravating, and thereby, right from the start, find himself behind the leadership eight-ball in the eyes of his subordinates.

In my opinion, an important aspect of good leadership is a collection of good followers who actively want to be led. I don’t think it is a coincidence that the “poorer leadership” line of demarcation observed happens to almost perfectly line up with the so-called 2SD “communications gap”. Unfortunately, I don’t have any useful advice for the 2SD+ crowd, other than “find smarter subordinates” and “never be surprised by any failure to understand what you think to be obvious.”
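
The arithmetic behind that alignment is simple enough: on the usual scale where IQ has a standard deviation of 15, a leader two standard deviations above the general population mean of 100 sits at 100 + 2 × 15 = 130, which is essentially the 128 mark at which the study found the statistically significant drop-off.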


Plagiarism is plagiarism

Toddy-Cat isn’t quite sure that the Zman is a plagiarist.

“I’m not sure that not citing a source in a response to a blog comment actually rises to the dignity of ‘plagiarism’”.

That degree of uncertainty is fair, especially if you haven’t actually read the source yourself, as I have not. But, as Tublecane demonstrates, once you look at Stove’s actual words and compare them to the Zman’s words, you are forced to conclude there is nothing to be uncertain about:

I thought of the paraphrasing defense, but that doesn’t hold up. It’s not that Z-man comes off sounding like Stove because he uses the same general form of argument, borrowing a phrase or two…. I believe it was deliberate. Compare:

thezman: “Much more is known now about the natural world, than was known fifty years ago…”

Scientific Irrationalism by David Stove (p. 1): “Much more is known now than was known fifty years ago…”

thezman: “…and much more was known then than in 1580.”

Stove: “…and much more was known then than in 1580.”

thezman: “So there has been a great accumulation or growth of knowledge in the last four hundred years.”

Stove: “So there has been a great accumulation or growth of knowledge in the last four hundred years.”

thezman: “This is an extremely well-known fact.”

Stove: “This is an extremely well-known fact…”

thezman: “Let’s call this (A).”

Stove: “…which I will refer to as (A).”

thezman: “A person, who did not know (A), would be uncommonly ignorant.”

Stove: “A philosopher, in particular, who did not know it, would be uncommonly ignorant.”

The remainder of the post veers away from Stove’s text, though I wouldn’t be surprised if it were stolen from somewhere else. Now, whether such a thing as plagiarism exists in internet comment sections, that’s a different matter. I say yes, because it’s publicly passing off someone else’s writing as your own.

Tublecane is correct. The Zman clearly attempted to pass off David Stove’s writing and ideas as his own in order to try to place himself in an intellectually superior position from which he could then pass judgment. It’s not merely a question of what he did, but why he did it in that particular manner. He is observably a plagiarist. This observation is further supported by the fact that the Zman didn’t understand the argument that Stove was making about Karl Popper, nor does he understand Popper’s positions, nor does he even understand the fundamental differences between a) logic, b) math, and c) science, let alone the current need for the etymological division of “science” into its three aspects of scientody, scientage, and scientistry.

Ogre agrees. “It’s absolutely plagiarism in the sense of ‘presenting the words of another as your own.’ And that’s really the only kind of plagiarism we care about here. Whether it could be considered academic plagiarism (I don’t know) or copyright infringement (it’s not), it’s still a dishonest and unethical thing to do. Especially given the context in which it was presented. It’s just more evidence of his posturing – passing off another’s arguments and expressions as his own in order to bolster his perceived intelligence.”

As has been the case every single time I have exposed the pretenses and posturings of someone who has fans, some of those fans are attempting to change the subject away from the failings of that particular individual to my theoretical motivations in destroying that individual’s intellectual reputation. To those fans, I will simply point out that my motivations are irrelevant, the facts are readily observable to everyone, and that this is what I do every time anyone comes at me, be they friend or foe.

The Zman and his would-be defenders can dance and defend and distract and theorize all they like. It won’t make any difference. The point is that he’s not particularly smart, he’s not very well-read – it wouldn’t surprise me to learn he hasn’t actually read much of the Stove book past the first page since he clearly didn’t understand it – and most importantly, he’s not very honest. And his moral and intellectual failings have nothing to do with me, as I am merely one of the many people who has happened to observe them.

The main difference between me and most of those who wish to somehow minimize my influence or discredit me is not that I am at least a standard deviation more intelligent than they are, although that is often true. The main difference is that for 16 years I have had tens of thousands of opponents poring over my every word written in column, blog post, comment, tweet, and book, looking for every possible mistake they can exploit, and most of my critics have not.

So, even if I lacked both confidence in my own words and personal integrity, I know better than to ever make the sort of stupid, obvious, dishonest, and self-discrediting mistake that the Zman did in plagiarizing David Stove’s words and attempting to pass off Stove’s ideas as his own. At the end of the day, a man must decide whether he values his integrity or he values the opinions of others. My decision should be obvious from my mantra: MPAI.


Mailvox: posturing and plagiarism

Tublecane accuses the Zman of plagiarizing David Stove:

If those paragraphs you quoted in your update are supposed to be Z-man’s words, uttered without reference to their source, oh boy. I thought they sounded familiar, so I checked my copy of David Stove’s Scientific Irrationalism and Z-man copies verbiage found on page one. Right down to the year 1580, the letter “A,” and the phrase “uncommonly ignorant.”

Stove, being much brighter than the Z-men of the world, wasn’t making an “everything scientists say is factual, so shut up” argument. He doesn’t even share Z-man’s opinion on Popperian falsifiability, though he lays into Popper and finds him guilty of launching a line of irrationalism (or a “postmodern cult,” as the subtitle has it) in the philosophy and historiography of science. A line which isn’t so bad with Popper but gets worse and worse as you go through Kuhn, Lakatos, and Feyerabend.

The point about accumulation of knowledge, which is robust in Stove’s book, is neither here nor there regarding the subject at hand. Z-man thinks he’s dealing with nihilists, and nihilists would have trouble with facts accumulating. But of course that has nothing to do with how you characterize varieties of “science” in the 16 Points. Science since 1580 could have simultaneously been more wrong than right and still served to advance human knowledge.

Upon closer inspection, Z-man explicitly mentions David Stove’s Popper and After, but in a separate post from the one in which he steals from it.

I call plagiarism!

Moreover, plagiarism that would be insulting to Stove, RIP, since he wouldn’t be caught making an argument as silly as Z-man’s.

I haven’t read any of David Stove’s books, so I can’t testify to the accuracy of the accusation of plagiarism. But it’s not particularly surprising to be informed that the argument the Zman was making is not his own, as 8 hours before Tublecane posted his comment, I had made this observation: “One definitely has the impression that the Zman has not read Popper, or even Kuhn, himself, but rather, has read what people have written about Popper.”

In any event, this demonstrates why it is important not to feign knowledge you do not possess, not to pass off the arguments of others as your own, and not to express opinions on subjects you do not know very much about. Especially on the Internet, someone is bound to eventually notice that you are an intellectual fraud.


In over his head

Now, I generally enjoy reading both John Derbyshire and Zman, and I think they’re both intelligent iconoclasts, but every now and then I am surprised to discover how conceptually limited various would-be critics can be when they attempt to criticize me. They preen, they posture, and they pontificate even as they demonstrate that they neither understand me nor know whereof they speak. That may sound a little arrogant, but bear with me a moment and you’ll see what I mean.

The Zman mentioned this in his interesting summary of attending the Mencken Club:

John was first up and he used Vox Day’s 16-points blog post as the framework for his talk. He made the point that Vox is by no means the leader of the alt-right or the voice of it, but a representative sample that is useful for analyzing the movement. His comments about item number eight were laugh out loud funny, to the empirically minded. What John was doing was introducing the general ideas of the alt-right to a crowd that is not spending their evenings in the meme war. He did a good job presenting the broad strokes.

This is all very well, but it led to the following string of comments which revealed some unexpected conceptual limitations on the part of the Zman. His failure to grasp either the obvious linguistics involved or to understand the basic nature of science is, to put it mildly, surprising. I find myself wondering if these failures are a logical consequence of his atheistic philosophical incoherence running headlong into its own conclusions, a kneejerk reaction to displeasure with something I have said, or simply an indication of his cognitive limitations.

Toddy+Cat
Personally, I’d be very interested to hear what John Derbyshire had to say about Vox’s point number eight. Derbyshire is a great intellect, a fantastic writer, and has enough moral courage for several men, but he (like all of us) is a product of his Time, and sometimes has way too much respect for “science” and the “scientific community”. He sometimes does not seem to realize just how politicized “science” has become in our day. As much as both he and I might regret it, this ain’t 1955.

thezman
He comically analyzed the possible entomology of the words, “scientodific” and “scientody”. He also pointed out the absurdity of the claim that scientific conclusions are liable to future revision. For instance, Mars is closer to the sun than Jupiter, a conclusion of science that will never be liable to revision. John correctly pointed out that number eight is gibberish.

I’ve written a little about this topic. I’ll be revisiting it frequently. I think there may even be a book in it, if I can manage to squeeze more than 24 hours from each day. Suffice it to say that I don’t think science and technology can be jammed into the moral philosophy of the 17th and 18th century. Therefore, we either kill all the scientists or create a new moral philosophy.

Heywood
Mars is closer to the sun than Jupiter… Anyway, that’s a stupid argument. While facts may be immutable – for a time, which may be long, like in the case above – our interpretations in the form of theories are always placeholders. Good till something better comes along, which, btw, must have the same predictive power as the old theory had where it was applicable, while extending the range of predictability. See classic Newtonian mechanics vs. theory of relativity. And as it happens, neither addresses Vox’s point about the many modern “scientists” who employ the trappings and outer forms of science while gleefully ignoring everything that makes it actually useful. None of this strikes me as overly difficult to either comprehend or establish, given the state of sciences today.

thezman
I’ve heard every iteration of factual nihilism and I have no interest in taking it seriously. It’s just another way of putting the goal posts on roller skates.

Man of the West
Zman said: “He also pointed out the absurdity of the claim that scientific conclusions are liable to future revision” I am assuming and hoping that what John meant is that SOME scientific conclusions are not open to revision, as his example — though questionable as science — shows. Because if not, and if the above statement is what he actually meant, then it is foolishness of a high order given the provisional nature of science and the fact that we know that there are unknown errors in scientific conclusions (as the replication crisis is presently showing us) that may be discovered at a later time.

Of course, John may have a unique definition of ‘science’ or of the word ‘conclusions’ which gives him some wiggle room, but then he is just playing semantic games.

thezman
Think of it this way. There is a set of things that have to be true or nothing is true. There is a set of things that are most likely true, but have yet to be conclusively proven. There is a set of things that may be true, but there’s either no way to test them or the efforts to prove them have fallen short. Finally, there is the set of things that are unknown.

Scientific conclusions are the first set. The second and third sets are open to revision and challenge. It’s not a matter of semantics. It is about definitions. People tend not to grasp the definitions of science, because they have had little exposure to math or science.

Byzantine_General
Your first category encompasses logic and mathematics. Your second includes theories of gravitation, where Einstein’s superseded Newton’s. But your words seem to place today’s scientific “conclusions” in the first category, rather than the second. Can you explain without casting nasturtiums?

“Factual nihilism”. Hmph.

thezman
I wrote, “Scientific conclusions are the first set.” That seems to cover it. Science, like mathematics, is about the accumulation of axioms, things that are assumed to be true by their nature. Put another way, if everything is open to revision, there is no truth.

Byzantine_General
I am gobsmacked, and I say this lovingly, by your wrong-headedness.

Science deals in theories, not conclusions. Theories are always contingent; we strenuously fail to falsify them, accumulate partial belief in them, build on and with them, but can never reach axiomatic mathematical certainty.

Not long ago, the best available theory was that neutrinos had zero mass. It could have been loosely said that science had concluded that this was the case. But the progress of science demands that it does not make conclusions.

A pesky observation (that the wrong kind of neutrinos were arriving on Earth from the Sun) forced that theory to be abandoned. A better theory accounted for the observation (and all previous observations), and entailed a non-zero mass. This is a perfect example of scientific progress.

Axioms are not subject to abandonment. Had the zero mass of the neutrino been treated as an axiom, we would have been forced to reject the observation and the better theory, because the contradiction of an axiom is automatically false.

Karl Popper’s criterion for deciding whether a theory is scientific is whether it could by some conceivable observation be falsified. Theories that can withstand any contrary evidence are called “religions”.

thezman
Karl Popper was wrong. If everything can be falsified, then there is no truth. That’s nihilism.

This is one of those moments when you suddenly realize that someone simply isn’t as intelligent as you had previously thought they were. One does not need to be a Popperian to recognize the obvious fact that many, if not most, scientific conclusions are intrinsically provisional, if not based entirely on false foundations. The Zman is making a common error in confusing the scientific method with a means of determining absolute truth; scientody is actually nothing more than a tool for determining what is not true from a material perspective and therefore can only ever be a means of narrowing the possible scope of the truth of things that remain firmly within the temporally accessible aspects of the material realm.

History, for example, is a matter of firmly established fact, and yet remains largely outside the realm of science and its conclusions.

In claiming that a correct understanding of science is nihilism, he confuses the subset of observable facts with the much larger set of scientific conclusions. And in asserting that science is about the accumulation of axioms, with the alternative being nihilism, he demonstrates that he understands neither science nor mathematics nor the philosophy of science. Or, for that matter, nihilism.

Speaking of etymology, it seems we’re going to need to coin a new term for this sort of high midwittery.

UPDATE: I don’t think the Zman properly grasped what John Derbyshire was saying at all. But I will post my response to Derb’s comments in a separate post.

UPDATE: Yeah, he’s just not very bright. He didn’t even hesitate to double down in response to this post.

Much more is known now about the natural world, than was known fifty years ago, and much more was known then than in 1580. So there has been a great accumulation or growth of knowledge in the last four hundred years.

This is an extremely well-known fact. Let’s call this (A). A person, who did not know (A), would be uncommonly ignorant. To assert that all scientific conclusions are open to revision, as Vox Day has done, is to deny the existence of (A). I see he is now pushing around the goal post on wheels to try and obscure the fact he made a ridiculous statement, but that changes nothing.


Book Review: SAPIENS by Yuval Harari IV

Review of Yuval Harari’s Sapiens: A Brief History of Humankind
by C.R.Hallpike
The complete PDF

Part IV of IV

Harari’s next major turning point in world history he refers to, reasonably enough, as ‘The Scientific Revolution’. Around AD 1500 ‘It began in western Europe, a large peninsula on the western tip of Afro-Asia, which up till then played no important role in history.’ (p. 272) This is an unconvincing assessment of a region that had been the seat of the Roman Empire, the Christian Church, and Greek science, which was one of the essential foundations of the Scientific Revolution. Harari’s opinions about how this got started are even less persuasive:

The Scientific Revolution has not been a revolution of knowledge. It has above all been a revolution of ignorance. The great discovery that launched the Scientific Revolution was the discovery that humans do not know the answers to their most important questions. (p. 279).

This is a statement whose truth is not immediately obvious, and he justifies it as follows:

Premodern traditions of knowledge such as Islam, Christianity, Buddhism and Confucianism asserted that everything that is important to know about the world was already known. The great gods, or the one almighty God, or the wise people of the past possessed all-encompassing wisdom, which they revealed to us in scriptures and oral traditions (pp. 279-80).

These traditions may have claimed to know all that was essential to salvation and peace of mind, but that kind of knowledge had nothing whatsoever to do with pre-modern traditions of science. In Europe this meant Aristotle and Greek natural philosophy, about which, astonishingly, Harari has nothing at all to say anywhere in his book. Apart from a willingness to admit ignorance and embrace new knowledge, science

…has a common core of research methods, which are all based on collecting empirical observations – those we can observe with at least one of our senses – and putting them together with the help of mathematical tools (p. 283).

This is a nineteenth-century view of what science does, whereas the really distinctive feature of modern science is that it tests theory by experiment, and does not simply collect empirical observations. On why modern science developed specifically in Europe Harari has the following explanation:

The key factor was that the plant-seeking botanist and the colony-seeking naval officer shared a similar mindset. Both scientist and conqueror began by admitting ignorance – they both said ‘I don’t know what’s out there.’ They both felt compelled to go out and make new discoveries. And they both hoped that the new knowledge would make them masters of the world (pp. 316-17).

Botany was actually of quite minor importance in the early stages of modern science, which was dominated by studies of terrestrial and celestial motion (Copernicus, Galileo, Kepler, and Newton), and by chemistry, which involved the revival of Greek atomism. And Columbus, to take a useful example of ‘a colony-seeking naval officer’, knew quite well what was out there. He knew that the earth is round, and concluded that if he sailed west for long enough he would find a new route to the East Indies. So when he reached the islands of the Caribbean he was convinced that their inhabitants were ‘Indians’ and never changed his mind. I think we can perhaps do a little better than Harari in explaining the European origin of modern science.

Greek science was dominated by the belief that reason, and particularly mathematics, was the true path to knowledge, and its role was to be the tutor of the senses, not to be taught by them. The idea of performing an experiment did not really exist, and the great Alexandrian engineer Hero, for example, believed that water pressure does not increase with depth. He defended this belief with an ingenious theory from Archimedes, but ignored the practical experiment of taking a glass down to the bottom of a pool, where it could easily have been seen that the water rises higher inside the glass the deeper it is taken. Aristotle’s theories of terrestrial and celestial motion, and Ptolemy’s elaborate geometrical model of the heavens, for example, were seen as triumphs of reason, and were inherited by the medieval European universities, which began a critical study of them. The importance of Greek science, however, was not that it was right – it contained fundamental errors – but that it presented a coherent theoretical model of how the world worked that stimulated thought and could be tested.

The Islamic world had transmitted much of Greek science to medieval Europe, and Aristotle in particular was greatly admired by Muslim scholars as ‘The Philosopher’. But under the influence of the clerics Islam eventually turned against reason and science as dangerous to religion, and this renaissance died out. In rather similar fashion, the Byzantine Emperor Justinian closed the philosophy schools of Athens in 529 AD because he considered them dangerous to Christianity. But while in the thirteenth century several Popes, for the same reason, tried to forbid the study of Aristotle in the universities, they were ignored and in fact by the end of the century Aquinas had been able to publish his synthesis of Aristotelian philosophy and Christian theology in the Summa Theologica.

This illustrates a vital difference between Europe and the other imperial civilisations. Whereas the Caliph and the Byzantine Emperor had the authority to impose intellectual orthodoxy, in Europe the Popes could not enforce their will on society, and neither could the secular authorities, because there were too many competing jurisdictions – of the Holy Roman Emperor, of kings, of free cities, of universities, and between church and state themselves. Another vital difference was that in the other imperial civilisations there was that basic gulf between scholars and artisans and between merchants and the rest of the upper classes to which I referred earlier. Medieval European towns and cities, however, were run by merchants, together with the artisans and their guilds, so that the social status of artisans in particular was very much higher than in other cultures, and it was possible for them to interact socially with learned scholars. This interaction with scholars occurred in the context of a wide range of interests that combined book-learning with practical skills: alchemy, astrology, medicine, painting, printing, clock-making, the magnetic compass, gunpowder and gunnery, lens-grinding for spectacles, and so on. These skills were also intimately involved in the making of money in a commercially dynamic society.

It is highly significant that this interaction between scholars and artisans also occurred in the intellectual atmosphere of ‘natural magic’, the belief that the entire universe is a vast system of interrelated correspondences, a hierarchy in which everything acts upon everything else. Alchemy and astrology were the most important components of this tradition, but by the thirteenth century Roger Bacon, for example, was arguing that by applying philosophy and mathematics to the study of nature it would be possible to produce all sorts of technological marvels such as horseless vehicles, flying machines, and glasses for seeing great distances. It was not therefore the admission of ignorance that was truly revolutionary, but  the idea that science could be useful in mastering nature for the benefit of Man.

By the time of Galileo, whom Harari does not even mention, the idea that science should be useful had become a dominant idea of Western science. Galileo was very much in the natural magic tradition and was a prime example of a man of learning who was equally at home in the workshop as in the library – as is well-known, when he heard of the Dutch invention of the telescope he constructed one himself and ground his own lenses to do so. But Galileo was also enormously important in showing the crucial part that experiment had in the advancement of science. He was keenly interested in Aristotle’s theory of terrestrial motion and is said to have tested the theory that heavier bodies fall faster than light ones by dropping them from the leaning tower of Pisa. This is somewhat mythical, but he certainly carried out detailed experiments with metal balls by rolling them down sloping planks to discover the basic laws of acceleration. He did not simply observe, but designed specific experiments to test theories. This is the hall-mark of modern science, and it emerged in the circumstances that I have just described so that reason and the evidence of the senses were thus harmonized in the modern form of natural science. (On the origins of science see Hallpike 2008:288-353; 396-428).

Science, then, is not exactly Harari’s strong point, so we need spend little time on the concluding part of his book, which is taken up with speculation about where science and technology are likely to take the human race in the next hundred years. He concludes, however, with some plaintive remarks about our inability to plan our future: ‘we remain unsure of our goals’, ‘nobody knows where we are going’, ‘we are more powerful than ever before, but have very little idea what to do with all that power’ (pp. 465-66). He has just written a book showing that mankind’s social and cultural evolution has been a process over which no-one could have had any control. So why does he suddenly seize upon the extraordinary fiction that there ought to be some ‘we’ who could now decide where we all go next? Even if such a ‘we’ existed, let us say in the form of the United Nations, how could it know what to do anyway? 

Throughout the book there is also a strange vacillation between hard-nosed Darwinism and egalitarian sentiment. On one hand Harari quite justifiably mocks the humanists’ naive belief in human rights, for not realising that these rights are based on Christianity, and that a huge gulf has actually opened up between the findings of science and modern liberal ideals. But on the other hand it is rather bewildering to find him also indulging in long poetic laments about the thousands of years of injustice, inequality and suffering imposed on the masses by the great states and empires of history, and our cruelty to our animal ‘slaves’ whom we have slaughtered and exterminated in such vast numbers, so that he concludes ‘The Sapiens reign on earth has so far produced little that we can be proud of’. But a consistent Darwinist should surely rejoice to see such a fine demonstration of the survival of the fittest, with other species either decimated or subjected to human rule, and the poor regularly ground under foot in the struggle for survival. Indeed, the future looks even better for Darwinism, with nation states themselves about to be submerged by a mono-cultural world order, in which we ourselves are destined to be replaced by a superhuman race of robots.

It has been rightly said that:

Harari’s view of culture and of ethical norms as fundamentally fictional makes impossible any coherent moral framework for thinking about and shaping our future. And it asks us to pretend that we are not what we know ourselves to be – thinking and feeling subjects, moral agents with free will, and social beings whose culture builds upon the facts of the physical world but is not limited to them (Sexton 2015:120).

Summing up the book as a whole, one has often had to point out how surprisingly little he seems to have read on quite a number of essential topics. It would be fair to say that whenever his facts are broadly correct they are not new, and whenever he tries to strike out on his own he often gets things wrong, sometimes seriously. So we should not judge Sapiens as a serious contribution to knowledge but as ‘infotainment’, a publishing event to titillate its readers by a wild intellectual ride across the  landscape of history, dotted with sensational displays of speculation, and ending with blood-curdling predictions about human destiny. By these criteria it is a most successful book. 


Book Review: SAPIENS by Yuval Harari III

Review of Yuval Harari’s Sapiens: A Brief History of Humankind
by C.R.Hallpike

Part III of IV

Anyway, what was needed here to control these much larger populations were networks of mass co-operation, under the control of kings, and Harari takes us almost immediately into the world of the ancient empires of Egypt, Mesopotamia, Persia, and China. But how were these networks of mass co-operation created?

He recognises, quite rightly, the importance of writing and mathematics in human history, and claims they were crucial in the emergence of the state:

…in order to maintain a large kingdom, mathematical data was vital. It was never enough to legislate laws and tell stories about guardian gods. One also had to collect taxes. In order to tax hundreds of thousands of people, it was imperative to collect data about people’s incomes and possessions; data about payments made; data about arrears, debts and fines; data about discounts and exemptions. This added up to millions of data bits, which had to be stored and processed (p. 137).

This was beyond the power of the human brain, however.

This mental limitation severely constrained the size and complexity of human collectives. When the amount of people in  a particular society crossed a critical threshold, it became necessary to store and process large amounts of mathematical data. Since the human brain could not do it, the system collapsed. For thousands of years after the Agricultural Revolution, human social networks remained relatively small and simple (p. 137).

But it is simply not true that kingdoms need to collect vast quantities of financial data in order to tax their subjects, or that social systems beyond a certain size collapsed until they had invented writing and a numerical system for recording this data. If Harari were right it would not have been possible for any kingdoms at all to have developed in Sub-Saharan Africa, for example, because there were no writing systems in this region until quite late, when a few developed under European or Islamic influence (Ethiopia was a special case). Nevertheless, pre-colonial Africa was actually littered with states and even empires that functioned perfectly well without writing.

They were able to do this because of the undemanding administrative conditions of early kingdoms. These are based on subsistence agriculture without money and have primitive modes of transport, unless they have easy access to river transport like Egypt, Mesopotamia or China. They also have a simple administrative structure based on a hierarchy of local chiefs or officials who play a prominent part in the organization of tribute. The actual expenses of government, apart from the royal court, are therefore relatively small, and the king may have large herds of cattle or other stock, and large estates and labourers to work them to provide food and beer for guests. The primary duty of a ruler is generosity to his nobles and guests, and to his subjects in distress, not to construct vast public works like pyramids. The basic needs of a ruler, besides food supplies, would be prestige articles as gifts of honour, craft products, livestock, and above all men as soldiers and labourers. In Baganda, one of the largest African states, with a population of around two million, tax messengers were sent out when palace resources were running low:

The goods collected were of various kinds –  livestock, cowry shells, iron hoe-blades, and the cloths made from the bark of a fig-tree beaten out thin [for clothing and bedding]…Cattle were required of superior chiefs, goats and hoes of lesser ones, and the peasants contributed the cowry shells and barkcloths….the tax-gatherers did not take a proportion of every herd but required a fixed number of cattle from each chief. Of course the hoes and barkcloths had to be new, and they were not made and stored up in anticipation of the tax-collection. It took some little time to produce the required number, and the tax-gatherers had to wait for this and then supervise the transport of the goods and cattle, first to the saza [district] headquarters and then to the capital. The amount due was calculated in consultation with the subordinates of the saza chiefs who were supposed to know the exact number of men under their authority, and they were responsible for seeing that it was delivered (Mair 1962:163). (Manpower was recruited in basically the same way, and in Africa generally was made up of slaves and corvée labour.)

Nor do early states require written law codes in the style of Hammurabi, and most cases can be settled orally by traditional local courts. No doubt, the demands of administering early states made writing and mathematical notation very useful, and eventually indispensable, but the kinds of financial data that Harari deems essential for a tax system could only have been available in very advanced societies. As we have just seen, very much simpler systems were quite viable. (Since the Sumerian system of mathematical notation is the example that Harari chooses to illustrate the link between taxation, writing, and mathematics, it is a pity that he gets it wrong. The Sumerians did not, as he supposes, use ‘a combination of base 6 and base 10 numeral systems’. As is well-known, they actually used base 60, with sub-base 10 to count from 1 – 59, 61 – 119, and so on. [Chrisomalis 2010:241-45])
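
To make the distinction concrete, here is a small sketch of my own, not from the review, showing what base 60 with a sub-base of 10 means in practice: a number is broken into base-60 places, and each place value from 1 to 59 is itself written with tens-signs and ones-signs.

```python
# Illustrative sketch: Sumerian-style sexagesimal notation, i.e. base-60 place
# values in which each digit (1-59) is itself composed of tens and ones.

def to_sexagesimal(n: int) -> list:
    """Return the base-60 digits of n, most significant first."""
    if n == 0:
        return [0]
    digits = []
    while n > 0:
        digits.append(n % 60)
        n //= 60
    return digits[::-1]

def describe_digit(d: int) -> str:
    """Each base-60 digit was written with tens-signs and ones-signs (the sub-base 10)."""
    return f"{d // 10} ten(s) and {d % 10} one(s)"

digits = to_sexagesimal(4000)               # 4000 = 1*3600 + 6*60 + 40 -> [1, 6, 40]
print(digits)
print([describe_digit(d) for d in digits])
```

That is quite different from Harari’s supposed ‘combination of base 6 and base 10 numeral systems’.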

When the Agricultural Revolution opened opportunities for the creation of crowded cities and mighty empires, people invented stories about great gods, motherlands and joint-stock companies to provide the needed social links. (p. 115)  

The idea of people ‘inventing’ religious beliefs to ‘provide the needed social links’ comes out of the same rationalist stable as the claim that kings invented religious beliefs to justify their oppression of their subjects and that capitalists did the same to justify their exploitation of their workers. Religious belief simply doesn’t work like that. It is true, however, that what he calls universal and missionary religions started appearing in the first millennium BC.

Their emergence was one of the most important revolutions in history, and made a vital contribution to the unification of humankind, much like the emergence of universal empires and universal money. (p. 235)

But his chapter on the rise of the universal religions is extremely weak, and his explanation  of monotheism, for example, goes as follows:

With time some followers of polytheist gods became so fond of their particular patron that they drifted away from the basic polytheist insight. They began to believe that their god was the only god, and that He was in fact the supreme power of the universe. Yet at the same time they continued to view Him as possessing interests and biases, and believed that they could strike deals with Him. Thus were born monotheist religions, whose followers beseech the supreme power of the universe to help them recover from illness, win the lottery and gain victory in war. (p. 242)

This is amateurish speculation, and Harari does not even seem to have heard of the Axial Age. This is the term applied by historians to the period of social turmoil that occurred during the first millennium BC across Eurasia, of political instability, warfare, increased commerce and the appearance of coinage, and urbanization, that in various ways eroded traditional social values and social bonds. The search for meaning led to a new breed of thinkers, prophets and philosophers who searched for a more transcendent and universal authority on how we should live and gain tranquillity of mind, that went beyond the limits of their own society and traditions, and beyond purely material prosperity. People developed a much more articulate awareness of the mind and the self than hitherto, and also rejected the old pagan values of worldly success and materialism. As one authority has put it:

‘Everywhere one notices attempts to introduce greater purity, greater justice, greater perfection, and a more universal explanation of things’ (Momigliano 1975:8-9; see also Hallpike 2008:236-65).

One of the consequences of this new cultural order was a fundamental rethinking of religion, so that the old pagan gods began to seem morally and intellectually contemptible. Instead of this naively human image of the gods, said the Greek Xenophanes, ‘One God there is…in no way like mortal creatures either in bodily form or in the thought of his mind… effectively, he wields all things by the thought of his mind.’ So we find all across the Old World the idea developing of a rational cosmic order, a divine universal law, known to the Greeks as Logos, to the Indians as Brahman, to the Jews as Hokhma, and to the Chinese as Tao. This also involved the very important idea that the essential and distinctive mental element in man is akin to the creative and ordering element in the cosmos, of Man as microcosm in relation to the macrocosm.

Intellectually, the idea that the universe makes sense at some deep level, that it is governed by a unified body of rational laws given by a divine Creator, became an essential belief for the development of science, not only among the Greeks, but in the Middle Ages and the Renaissance. As Joseph Needham has said, ‘…historically the question remains whether natural science could ever have reached its present stage of development without passing through a “theological stage” ‘ (Needham 1956:582).

Against this new intellectual background it also became much easier to think of Man not as a citizen of a particular state, but in universal terms as a moral being. There is the growth of the idea of a common humanity which transcends the boundaries of nation and culture and social distinctions of rank, such as slavery, so that all good men are brothers, and the ideal condition of Man would be universal peace (Hallpike 2016:167-218).

Harari tries to create a distinction between ‘monotheistic’ religions such as Judaism, Christianity, and Islam, and ‘natural law religions’ without gods, in which he includes Buddhism, Taoism, Confucianism, Stoicism, and the Epicureans. From what I have said about the concepts of Logos, Hokhma, Brahman, and Tao it should be clear that his two types of religion actually had much in common. In Christianity, for example, Jesus was almost immediately identified with the Logos. The Epicureans, however, do not belong in this group at all as they were ancient materialist atheists who did not believe in natural law of any kind.

One of the most obvious facts about states in history is that they all were hierarchical, dividing people into different classes with kings and nobles at the top enjoying wealth and luxury, and peasants or slaves at the bottom in poverty, men privileged over women, some ethnic groups privileged over others, and so on. Harari attributes all this to the invention of writing, and to the ‘imagined orders’ that sustained the large networks involved in state organization.

The imagined orders sustaining these networks were neither neutral nor fair. They divided people into make-believe groups, arranged in a hierarchy. The upper levels enjoyed privileges and power, while the lower ones suffered from discrimination. Hammurabi’s Code, for example, established a pecking order of superiors, commoners and slaves. Superiors got all the good things in life. Commoners got what was left. Slaves got a beating if they complained. (p. 149)

 But since these sorts of hierarchies in state societies are universal in what sense can they have simply been ‘make-believe’? Doesn’t this universality suggest that there were actually laws of social and economic development at work here which require sociological analysis? Simply saying that ‘there is no justice in history’ is hardly good enough. In particular, he fails to notice two very significant types of inequality, that of merchants in relation to the upper classes, and of craftsmen in relation to scholars, which had major implications for the development of civilisation, but to which I shall return later.

Harari says that religion and empires have been two of the three great unifiers of the human race, along with money: 

Empires were one of the main reasons for the drastic reduction in human diversity. The imperial steamroller gradually obliterated the unique characteristics of numerous peoples…forging out of them new and much larger groups (p. 213)

These claims have a good deal of truth but they are also quite familiar, so I shall not go into Harari’s discussion of this theme, except for his strange notion of ‘Afro-Asia’, which he describes not only as an ecological system but also as having some sort of cultural unity, e.g. ‘During the first millennium BC, religions of an altogether new kind began to spread through Afro-Asia’ (p. 249). 

Culturally, however, sub-Saharan Africa was entirely cut off from developments in Europe and Asia until Islamic influence began spreading into West Africa in the eighth century AD, and has been largely irrelevant to world history except as a source of slaves and raw materials. And as Diamond pointed out in Guns, Germs and Steel, Africa is an entirely distinct ecological system because it is oriented north/south, so that it is divided by its climatic zones, whereas Eurasia is oriented east/west, so that the same climatic zones extend all across it, and wheat and horses for example are found all the way from Ireland to Japan.

Harari says that at the beginning of the sixteenth century, 90% of humans still lived in ‘the single mega-world of Afro-Asia’, while the rest lived in the Meso-American, Andean, and Oceanic worlds. ‘Over the next 300 years the Afro-Asian giant swallowed up all the other worlds’, by which he actually means the expanding colonial empires of the Spanish, Portuguese, Dutch, French and British.

But to refer to these nations as ‘Afro-Asian’  is conspicuously absurd, and the whole concept of Afro-Asia is actually meaningless from every point of view. The general idea of Eurasia, however, does make a good deal of cultural as well as ecological sense, not only because it recognises the obvious importance of Europe, but because of the cultural links that went to and fro across it, so that the early navigators of the fifteenth century were using the Chinese inventions of magnetic compasses, stern-post rudders, paper for their charts, and gunpowder, and were making their voyages to find sea-routes from Europe to China and the East Indies rather than relying on overland trade.

Part IV will be posted tomorrow.


Book Review: SAPIENS by Yuval Harari II

Review of Yuval Harari’s Sapiens: A Brief History of Humankind
by C.R.Hallpike

Part II of IV

Harari’s belief that the Cognitive Revolution provided the modes of thought and reasoning that are the basis of our scientific civilisation could not therefore be further from the truth. We may accept that people became able to speak in sentences at this time, and language is certainly essential to human culture, but anthropologists and developmental psychologists, in their studies of primitive societies, have found that their language development and their modes of thought about space, time, classification, causality and the self have much more resemblance to those of the Piraha than to those of members of modern industrial societies. The Piraha are an extreme case, but the Tauade of Papua New Guinea, for example, with whom I lived, only had the idea of single and pair, and no form of calendar or time-reckoning. Harari clearly has no knowledge at all of cross-cultural developmental psychology, and of how modes of thought develop in relation to the natural and socio-cultural environments. The people who carved the Stadel lion-man around 30,000 years ago and the Piraha had the same ability to learn as we do, which is why Piraha children can learn to count, but these cognitive skills have to be learnt: we are not born with them all ready to go. Cross-cultural developmental psychology has shown that the development of the cognitive skills of modern humans actually requires literacy and schooling, large-scale bureaucratic societies and complex urban life, the experience of cultural differences, and familiarity with modern technology, to name some of the more important requirements (see Hallpike 1979).

While Harari recognises that we know almost nothing about the beliefs and social organization of ancient foragers, he agrees that the constraints of their mode of life would have limited them to small-scale groups based on the family without permanent settlements (unless they could fish), and with no domestic animals. But then he launches into some remarkable speculations about what they might nevertheless have achieved in the tens of thousands of years between the Cognitive Revolution and the beginning of agriculture.

These long millennia may have witnessed wars and revolutions, ecstatic religious movements, profound philosophical theories, incomparable artistic masterpieces…The foragers may have had their all-conquering Napoleons who ruled empires half the size of Luxembourg; gifted Beethovens who lacked symphony orchestras but brought people to tears with the sound of their bamboo flutes…’ and so on (pp. 68-9).

Er, no. They couldn’t. All these imagined triumphs of the hunter-gatherers would actually have required a basis of large populations, centralized political control and probably literate civilisation, which in turn would have required the development of agriculture.

This is normally regarded as, after language, the innovation that made possible the extraordinary flowering of human abilities. As Harari correctly points out, agriculture developed independently in a number of parts of the world, and tribal societies based on farming became extremely common, many of them surviving into modern times. But he describes the Agricultural Revolution as ‘history’s biggest fraud’ because individuals in fully developed farming societies generally had an inferior diet and harder work than foragers, and their food supply depended on a limited range of crops that were vulnerable to drought, pests, and invaders, unlike the more varied food resources of hunter-gatherers.

These criticisms of agriculture are, of course, quite familiar, and up to a point legitimate. But if agriculture was really such a bad deal why would humans ever have gone along with it? Harari begins by suggesting that wheat and other crops actually domesticated us, and made us work for them, rather than the other way round, but this doesn’t get him very far in explaining the persistence of agriculture, and instead he argues that wheat offered nothing to individuals, but only to the species by enabling the growth of larger populations. But since it is actually individuals who have to do all the hard work of sowing and reaping this won’t do either, so finally he says that people persisted in the agricultural way of life because they were in search of an easier life, and couldn’t anticipate the full consequences of agriculture.

Whenever they decided to do a bit of extra work – say, to hoe the fields instead of scattering the seeds on the surface – people thought, “Yes, we will have to work harder, but the harvest will be so bountiful! We won’t have to worry any more about lean years. Our children will never go to sleep hungry.” It made sense. If you worked harder, you would have a better life. That was the plan. (p. 97)

It didn’t work out that way, however, because people didn’t foresee population growth, poor diet and disease. Since it would have taken many generations to realise all the disadvantages of agriculture, by that time the population would have grown so large that it would have been impossible to go back to foraging, so the agricultural trap closed on Man for evermore.

The change from foraging to agriculture as principal mode of subsistence would have actually taken hundreds of years in many cases, and there are many important advantages of agriculture which he ignores. It is likely that one of the primary attractions of planting crops was that it allowed people to live in fixed settlements for some or all of the year, for a variety of reasons. Some favoured locations would have provided access to a plentiful supply of food or water; a whole series of craft activities are all more conveniently carried out in permanent or semi-permanent settlements; and these are also very convenient for holding ceremonies such as initiations and feasts. We also know that the food surplus from agriculture can be used in systems of exchange and competitive feasting, for trading with different groups, and for feeding domestic animals. A larger population also has many attractions in itself: it permits a much richer social life than is possible for small foraging bands, with more impressive ceremonies, a larger labour force for social projects such as irrigation and communal buildings, and more effective defence against local enemies. Agriculture would therefore have had many attractions which would have been obvious to the people concerned (see Hallpike 2008:52-65).

Agriculture with the domestication of animals, then, was the essential foundation for the growth of really large populations which are in turn essential for the development of complex cultures and social systems in a new ‘tribal’ form of social organization. Land ownership became closely related to kin groups of clans and lineages, which were in turn the basis of formal systems of political authority based on elders or chiefs who could mediate in disputes and sometimes assume priestly functions. A whole variety of groups sprang up based not only on kinship but on residence, work, voluntary association, age, and gender and these group structures and hierarchical organization made it much easier to co-ordinate the larger populations that developed (see Hallpike 2008:66-121). This tribal organization was the essential precursor of the state, particularly through the development of political authority which was always legitimated by descent and religious status. By the state I mean centralised political authority, usually a king, supported by tribute and taxes, and with a monopoly of armed force. Although it has been estimated that only about 20% of tribal societies in Africa, the Americas, Polynesia, New Guinea, and many parts of Asia actually developed the state, the state was almost as important a revolution in human history as agriculture itself, because of all the further developments it made possible, and a large literature on the process of state formation has developed (e.g. Claessen & Skalnik 1978, Hallpike 1986, 2008, Trigger 2003).

Unfortunately, Harari not only knows very little about tribal societies but seems to have read almost none of the literature on state formation either, which he tries to explain as follows:

The stress of farming [worrying about the weather, drought, floods, bandits, next year’s famine and so on] had far reaching consequences. It was the foundation of large-scale political and social systems. Sadly, the diligent peasants almost never achieved the future economic security they so craved through their hard work in the present. Everywhere, rulers and elites sprang up, living off the peasants’ surplus food. (p. 114) 

The reader might well wonder how peasants worrying about next year’s possible famine could possibly have been the foundation of any major political developments, and why in any case they would have meekly allowed their crops to be plundered, as well as where these rulers and elites suddenly sprang from. If Harari knew more about tribal societies he would have realised that the notion of a leader imposing his will on his followers misses the whole point of leadership in pre-state societies, which is that the leader has to attract people by having something to offer them, not by threatening them, because he has no means of doing this. To have power over people one must control something they want: food, land, personal security, status, wealth, the favour of the gods, knowledge, and so on.

In other words, there must be dependency, and leaders must be seen as benefactors. In tribal societies, where people are not self-sufficient in defence, or in access to resources or to the supernatural, they will therefore be willing to accept inequality of power because they obviously get something out of war-leaders, or clan heads, or priests. Political authority in tribal society develops in particular through the kinship system, with hereditary clan heads, who are also believed to have the mystical power to bless their dependents. When states develop we always find that the legitimacy of kings is based on two factors: descent and religion. It is only after the advent of the state that power can be riveted on to people by force whether they like it or not, when it is too late for them to do anything about it except by violent rebellion.

Part III of Dr. Hallpike’s review will be posted tomorrow.


Average global IQ = 86

The blank slatists and civic nationalists are in for a nightmarish disappointment if these global IQ figures are even remotely correct.

The World’s IQ = 86. Test results of 550,492 individuals in 123 countries.

Every test, whether "school near" like those designed for PISA or "school far" like those designed for intelligence testing, is subject to the same concerns about sampling, measurement invariance, individual item analysis, and the appropriateness of summary statistics. Why the difference in public response to these two different points on the assessment spectrum? Perhaps it is as simple as noting that in scholastic attainment there is always room to do better (or to blame the quality of schooling), whereas in intelligence testing there is an implication of an immutable restriction, unfairly revealed by tricky questions of doubtful validity.

Perhaps it is a matter of publicity. PISA has the money for brochures, briefing papers, press conferences, meetings with government officials. Richard Lynn put his list together in his study, and came up with results that many were happy to bury.

106.02 CHINA
100.46 GERMANY
 96.99 USA
 92.79 ISRAEL
 88.51 MEXICO
 83.04 JORDAN

I suspect the 8-point decline I calculated in the USA hasn't fully shown up in these tests yet, although its beginning is visible in the way the US score is falling faster than the European norms. Anyhow, as incomplete as the data presently is, it suffices to put the lie to more than one dubious IQ-related narrative.
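
For those wondering how a single figure like 86 is derived from national data of this kind, here is a minimal sketch of a population-weighted mean over the six national estimates listed above. The population figures are rough placeholders of my own, not numbers from the study, which combines many samples per country using its own weighting; with only six countries this obviously won't reproduce the 86, it only illustrates the aggregation step.

# A minimal sketch (my own, not from the study) of aggregating a single
# "world IQ" figure as a population-weighted mean of national means.
# National estimates are the ones listed above; the populations (in
# millions) are rough, illustrative placeholders.

national_iq = {
    "China": 106.02,
    "Germany": 100.46,
    "USA": 96.99,
    "Israel": 92.79,
    "Mexico": 88.51,
    "Jordan": 83.04,
}

population_millions = {
    "China": 1400,
    "Germany": 83,
    "USA": 330,
    "Israel": 9,
    "Mexico": 128,
    "Jordan": 10,
}

def weighted_mean(scores, weights):
    # Weight each national mean by that country's population share.
    total = sum(weights[c] for c in scores)
    return sum(scores[c] * weights[c] for c in scores) / total

unweighted = sum(national_iq.values()) / len(national_iq)
print(f"Unweighted mean of these six countries: {unweighted:.1f}")
print(f"Population-weighted mean:               {weighted_mean(national_iq, population_millions):.1f}")

The weighting is the whole point: populous, higher-scoring countries such as China pull the figure one way while small countries barely move it, which is why a published world average computed across all 123 countries need not resemble a simple average of the country scores.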


Smart people are crazy

In fairness, it may be that it is only having to be around stupid people that makes us that way:

The stereotype of a tortured genius may have a basis in reality after a new study found that people with higher IQs are more at risk of developing mental illness.

A team of US researchers surveyed 3,715 members of American Mensa with an IQ higher than 130. An “average IQ score” or “normal IQ score” can be defined as a score between 85 and 115.

The team asked the Mensa members to report whether they had been diagnosed with mental illnesses, including autism spectrum disorder (ASD) and attention deficit hyperactivity disorder (ADHD).

They were also asked to report mood and anxiety disorders, or whether they suspected they suffered from any mental illnesses that had yet to be diagnosed, as well as physiological diseases, like food allergies and asthma. After comparing this with the statistical national average for each illness, they found that those in the Mensa community had considerably higher rates of varying disorders.
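
To put the quoted thresholds in context: on the conventional scale, with scores roughly normal around a mean of 100 and a standard deviation of 15, the 85-115 "average" band covers about two-thirds of the population and an IQ above 130 is roughly the top 2 percent. The short sketch below checks those figures and shows the kind of prevalence-ratio comparison the researchers describe; the two prevalence numbers in it are hypothetical placeholders, not figures from the study.

# A minimal sketch (my own, not from the article) of what the quoted IQ
# thresholds mean, assuming an approximately normal distribution with
# mean 100 and standard deviation 15.
from scipy.stats import norm

MEAN, SD = 100, 15

average_band = norm.cdf(115, MEAN, SD) - norm.cdf(85, MEAN, SD)  # ~0.68 of the population
above_130 = norm.sf(130, MEAN, SD)                               # ~0.023, roughly the top 2%

print(f"Share scoring 85-115:    {average_band:.1%}")
print(f"Share scoring above 130: {above_130:.1%}")

# The study's comparison amounts to a prevalence ratio: the self-reported
# rate among respondents divided by the national rate. These two numbers
# are hypothetical placeholders, not figures from the study.
mensa_rate = 0.20
national_rate = 0.10
print(f"Prevalence ratio: {mensa_rate / national_rate:.1f}x the national rate")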

In my experience, the more highly intelligent people are, the more unstable they tend to be, particularly if they are women. The problem, I suspect, is that smart people are more able to rationalize their aberrant behaviors and justify their departures from observable reality, thereby leading to a vicious circle where their disorders are reinforced.

I find narcissists of the non-malignant variety to be some of the most stable smart people I know, presumably because their lack of interest in other people tends to render their social interactions less confusing and angst-inducing. The more sensitive and insecure an intelligent person is, the more likely that he is going to find himself regularly buffeted by social pressures that he simply cannot understand.


Scientists can’t do science

One of the signs of a society in decline is the way in which its institutions are increasingly incapable of performing their primary functions. SJW convergence is one reason for decline, but declining intelligence and capability is another one. I suspect the latter may be the root cause of the latest scientific debacle.

Researchers warn that large parts of biomedical science could be invalid due to a cascading history of flawed data in a systemic failure going back decades. A new investigation reveals more than 30,000 published scientific studies could be compromised by their use of misidentified cell lines, owing to so-called immortal cells contaminating other research cultures in the lab.

The problem is as serious as it is simple: researchers studying lung cancer publish a new paper, only it turns out the tissue they were actually using in the lab was liver cells. Or what they thought were human cells were mouse cells, or vice versa, or something else entirely.

If you think that sounds bad, you’re right, as it means the findings of each piece of affected research may be flawed, and could even be completely unreliable.

“Most scientists don’t intentionally publish findings on the wrong cells,” explains one of the researchers, Serge Horbach from Radboud University in the Netherlands.

“It’s an honest mistake. The more concerning problem is that the research data is potentially invalid and impossible to reproduce.”

Science is not, and should never be, considered any sort of truth-metric. It can only be judged by its actual real-world results, which is to say, science that has not advanced to the state of being transformed into engineering can NEVER be relied upon.

This also demonstrates why it is so vital to construct a solid and reliable foundation, because building upon intellectual sand means the entire edifice is eventually bound to collapse.