Agnosticon presents his argument for his Singularitarian faith, or as I prefer to think of it, the techno-apocalypse:
In response to questions about whether exponential technological growth will continue, whether immortality is feasible, and whether transhumanism is compatible with Christianity:
The Technological Singularity doesn't rely solely on the continuous exponential growth of separate technologies. If you look at the history of technology, there has never been a single exponential curve carrying each technology forward. For instance, vacuum tube technology gave way to transistor technology, which gave way to integrated circuits of shrinking scale and increasing speed.
The Kurzweilian Singularity is composed of a series of S-shaped curves, each with a gradual initiation phase, a middle phase of exponential growth, and a leveling-out phase, as technologies come to fruition and then lapse into obsolescence. The combined effect of technological paradigms appearing and then shifting to new ones is observed as Kurzweil's Law of Accelerating Returns (LAR), of which Moore's Law is just a special case. The LAR posits that complexity leverages itself to create more complexity.
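The claim that stacked S-curves add up to smooth exponential growth is easy to check numerically. Here is a minimal sketch in Python, with midpoints and capability gains made up purely for illustration: each paradigm is modeled as a logistic curve, and each successive paradigm arrives a decade later but delivers ten times the capability of the last.

```python
import numpy as np

# Toy model: successive technology paradigms as logistic (S-shaped) curves.
# Each paradigm starts a decade after the last but delivers ten times the
# capability, so the envelope of the series grows roughly exponentially.

def logistic(t, midpoint, scale, height):
    """Capability of a single paradigm over time t."""
    return height / (1.0 + np.exp(-(t - midpoint) / scale))

t = np.linspace(0, 50, 501)
paradigms = [logistic(t, midpoint=10.0 * k, scale=1.5, height=10.0 ** k)
             for k in range(1, 5)]
total = np.sum(paradigms, axis=0)  # combined capability across all paradigms

for year in (5, 15, 25, 35, 45):
    i = np.searchsorted(t, year)
    print(f"t={year:2d}  capability={total[i]:10.1f}")
```

Sampled at ten-year intervals, the combined capability climbs by roughly a factor of ten per decade, even though no single curve stays exponential for long.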
The exponential nature of technological advance, particularly in anything that becomes an information science, leads to what is now a common pessimistic fallacy across a number of fields. The example Kurzweil gives is the Human Genome Project, which began in 1990 as a fifteen-year project to sequence all of human DNA. Halfway into the project, only a tiny portion of the genome had been completed, yet by the year 2000 nearly all of it had been finished. What researchers hadn't realized, owing to our inborn tendency to think linearly, is that gene sequencing had become an automated information science, amenable to exponential increases in efficiency.
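The arithmetic behind the fallacy is worth making explicit. Using hypothetical figures for illustration (not the project's actual numbers): suppose 1% of the genome is sequenced after seven years, and sequencing capacity doubles annually thereafter. A linear extrapolation predicts centuries of work; the exponential finishes within the original deadline.

```python
# Hypothetical figures for illustration only (not the HGP's actual numbers):
# suppose 1% of the genome is sequenced after seven years of a fifteen-year
# project, and that sequencing capacity doubles every year thereafter.

fraction_done = 0.01
year = 7

while fraction_done < 1.0:
    year += 1
    fraction_done *= 2  # exponential: capacity doubles annually

print(f"Linear extrapolation from year 7: ~{7 / 0.01:.0f} years total.")
print(f"With annual doubling, the project finishes by year {year}.")
```

Under these toy assumptions the linear forecast is about 700 years, while the doubling process completes in year 14, just inside the fifteen-year plan.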
If we consider the prospects for material immortality today, a similar distortion clouds our perception: you cannot extrapolate linearly into the future and expect to come anywhere close to a realistic target. This is not only because biology is now an information science, but also because the sophistication and intelligence of our computational tools will themselves grow exponentially.
The single greatest stumbling block for the Singularity is the poor performance of software and artificial intelligence over the last half century. While Kurzweil can confidently claim that the most powerful supercomputers today are roughly equivalent in computational power to the human brain, and that by 2020 personal computers will share the same distinction, he cannot project a similar track for AI, which is crucially important. Most people interested in the Singularity don't believe it can happen without I.J. Good's predicted Intelligence Explosion, in which intelligent machines become able to parse their own code and are smart enough to improve themselves recursively. It is possible that from that point onward, machine intelligence will explode in a positive feedback loop, giving rise to intellects many orders of magnitude beyond ours. The complex interdependencies of biological networks may be beyond our ape brains, but very likely they won't be beyond the superintelligences that arise from the Intelligence Explosion.
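What makes the Intelligence Explosion qualitatively different from ordinary engineering progress is the feedback term: capability feeds back into the rate of improvement. A toy sketch, with the growth constant k an arbitrary assumption rather than anything empirically grounded:

```python
# Toy model of I.J. Good's Intelligence Explosion: each generation's
# intelligence sets the rate at which it can improve the next generation.
# The growth constant k is an arbitrary assumption, chosen for illustration.

k = 0.5             # fraction of current capability turned into improvement
intelligence = 1.0  # human baseline

for generation in range(1, 11):
    # The feedback step: improvement is proportional to current intelligence,
    # so growth compounds as (1 + k)**n instead of advancing linearly.
    intelligence += k * intelligence
    print(f"generation {generation:2d}: {intelligence:7.1f}x human baseline")
```

Because each increment is proportional to current capability, the series grows geometrically; externally driven, linear improvement has no such compounding.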
The relatively poor performance of today's AIs, and the inability of narrow AIs to generalize on their own to other domains, is somewhat disheartening. However, there is cause to be hopeful that things will change in the coming decade, mostly because research is now focusing more on general AI, and it is now known that narrow AI does not lead to insights into general AI. No matter how well DARPA gets a Hummer to cross the desert, that skill is not transferable to other domains.
Along with investigating general AI, the Singularity Institute is investigating means to ensure that superintelligent machines will not destroy us. Friendly AI is a new field that seeks to use decision theory and ideas about mind architecture to create minds that share our values and retain those values perpetually throughout the intelligence explosion. The overall principle is summarized in the statement: “Gandhi does not want to commit murder, and does not want to modify himself to commit murder.” If you grabbed a mind at random out of all of “mind space,” the chance of picking a benevolent one would be very low. But by guiding the process onto favorable paths as the Singularity initiates and unfolds, the theory goes, we will be able to avoid minds that are indifferent, or even hostile, to our existence.
Summarizing and putting all the pieces together: the hardware Singularity is already in progress; the software Singularity has been less spectacular, though there have been significant flashes of brilliance. Software systems in general have steadily increased in complexity. Showcase systems like IBM's chess player Deep Blue and its Jeopardy! player Watson have impressively beaten human champions but, like the DARPA challenge vehicles, are still hampered by being narrow intelligences. This may seem like cause for pessimism, but remember 1998, midway through the genome project. Remember that we humans suffer the myopia of linear thinking.
The prospect of material immortality? I, for one, am doubtful we will ever get there alone. If there is one thing that we know for sure, it’s that human intelligence is not part of the exponential explosion. Humans are pretty much as smart, and as dumb, as we were thousands of years ago (give or take a Flynn Effect). But imagine, if you will, an intelligence a thousand times greater than ours working on the problem, or a hundred thousand, or a million. Imagine something as far beyond us as we are beyond a gnat.
Is transhumanism incompatible with Christianity? That depends on how you interpret the Singularity. If you recast the quest for material immortality merely as an attempt to extend lifespan, I don't see why you can't regard it as another medical procedure, albeit an unusual one. Much about the Singularity can be regarded as merely methodologically materialistic rather than purely materialist. However, it would be disingenuous not to recognize that most Singularitarians are probably strict materialists. Practices like mind uploading, which contradict doctrines about the human soul, are probably Christian heresies; however, I don't see much problem with cryonics, nanotechnological resuscitation, and a very, very long life.
There is some question about what and who will be allowed into the post-Singularity “heaven.” If our AIs are made to be Friendly, it might be presumed that evil human intentions won't be allowed into the Singularity either, at least not into merged or uploaded minds. On the other hand, since vintage, unaugmented minds will probably be quite innocuous next to the superpowers that inhabit the Singularity, they may be relegated to a quiet, pastoral existence on a preserve of some type, should they choose to remain human. But even that kind of human existence will probably differ from our lives today; or perhaps it will cater to nostalgia. You may be able to return to childhood and relive your life as many times as you want. By then, human qualia will be understood as neural/cognitive processes; the capacity to feel happiness, reward, or erotic pleasure will extend beyond the crass boundaries evolution has provided us. Conversely, the ability to inflict arbitrary horror, anxiety, and pain on a cognitive agent could conceivably be without bound. The post-Singularity Hell could make Christianity's look like Disneyland.
If our minds are to populate the post-Singularity world on equal status with the potencies around us, whether merged with them or as individual identities, are we ready and willing to relinquish those aspects of ourselves that are inimical to a collective existence? A similar question could be asked of the Christian afterlife. How much of “you” can you afford to lose before you become “not you”?
Said Aleksandr Solzhenitsyn: “If only there were evil people somewhere insidiously committing evil deeds, and it were necessary only to separate them from the rest of us and destroy them. But the line dividing good and evil cuts through the heart of every human being. And who is willing to destroy a piece of his own heart?”
If you desire to live in full post-Singularity status, you might face a similar quandary, and this may be the final answer to the question of immortality. To steal a thought from Buddhism, it is change that defines the central aspect of our lives. It is unclear whether anyone ever lives beyond ten years in any real sense, because after that interval we have changed beyond equivalent identity.
If we met our ten-year-ago selves, would we share any intimate empathy with them at all? We are engaged in a continual process of birth and becoming, death and dissolution. What we feel as nostalgia is the dim remembrance and mourning of a deceased relative who was ourselves. To achieve true immortality, we may need to reselect from “mind space,” this time choosing a mind capable of perceiving an integrated experience throughout time. For human beings, immortality may be pure illusion.