Singularity is a favorite topic among geeks… I find it unsettling.
The Singularity in the film’s title refers to a point in time when the planet’s non-biological intelligence will be one billion times more powerful than the sum of all human intelligence existing today. At that point, the non-biological intelligence will have begun to analyze and improve itself in increasingly rapid redesign cycles. Technical progress will be so fast that un-enhanced humans would be unable to follow it. Kurzweil posits that this will occur in 2045 and concedes the extreme difficulty in making predictions past that point.
Kurzweil makes some really compelling predictions in a thesis broadly called the singularity. At the core of the notion is the suggestion that machines will be developed with a capacity for creativity and invention that mimics that of humans, but as they take on this new dimension they will achieve something dramatic: the ability to self-iterate and evolve at a rate far greater than if that development were solely in the hands of humans.
I have to admit that I find myself thinking “who am I to question RAY KURZWEIL? The guy is a legitimate legend,” but it’s not the predictions he makes that bother me; rather, it’s the glee that his cult-like followers bubble over with. I get that the realization of the singularity could result in some wonderful advances that diminish the frailties of being human, and I really would look forward to a world without serious physical disability and totally powered by clean renewable energy. If we could accomplish just those two things, we would have done something that pays dividends into future generations.
However, there is something rather creepy about the singularity movement that goes beyond curing diseases and creating energy; it often comes across as if the singularity is a way to better organize society and substitute for real person-to-person contact (think Second Life at a whole new level). I find this troubling because the thing that makes us human is our imperfection, both in our form and our organization, and the emotional stimulus that comes from connecting and interacting with our fellow humans cannot be replaced or replicated by machines.
There are also legitimate concerns about machines becoming autonomous, all Terminator jokes aside. The proliferation of nanotechnology could also result in some serious and irreversible consequences, in effect becoming an infection that mimics behaviors of a virus by mutating to avoid detection and eradication.
Kurzweil, for all his genius, could also just be wrong. While the march of technology continues unabated, it may not be accelerating after all but slowing down: just as the early phase of the industrial revolution saw explosive innovation, it later became held down by inertia that dramatically slowed innovation and produced counterproductive behaviors that still plague us (such as labor unions that are inflexible to technological change).
Complexity theory rears its ugly head here as well, as technological innovation leads to self-limiting returns because of the technical complexity it imposes on the system as a whole. We see this manifest today in something as trivial as a mistyped entry in a routing table bringing down an entire network, or an errant line of code spawning a billion processes that crash a complex server environment. In other words, just having more technology does not mean we are closer to Kurzweil’s vision; it just means we have more technology with more points of failure.
I can’t help but think of the cryonics proponents from the 1980s… you remember them: they wanted to cut off your head when you die and freeze it so that when science achieves the ability to create life (it already has, btw) they can put your head on a new body and you are good to go. After I got over the irrepressible giggling at the notion of freezing my head for future generations, I wondered why I would want to put my head on a new body in the pursuit of immortality when all the people I care about and the life I know are rooted in the here and now. Singularity is the same way for me: I’m not sure I want a world where technology is autonomous and I’m just another cog in a machine, literally.

A “singularity event” is essentially any advance or technology whose outcome introduces such drastic change that it defies any possibility of prediction beyond that point. That’s what the ‘singularity’ part of the name means.
Anyone who makes predictions about what might happen after a singularity event is, by definition, talking through their hat.
Tateru,
You are certainly correct, but at the same time you must recognize that there is a “movement” around Kurzweil and his well-formed singularity predictions.
So what if there is a “movement” that Kurzweil has inadvertently created? And I say ‘inadvertently’ because the guy is not a cult leader; he is a futurist, and a good one at that. Better a movement that focuses on improving the human condition than our age-old, more familiar, yet infinitely more debilitating death cults, which result in nut jobs crashing planes into skyscrapers – just to give one juicy example from a library-large litany of them. I’ll take Kurzweil’s “nerdy” movement any day over the various death cults on this planet, which offer hatred, strife, prejudice, violent misogynistic rhetoric, and backward, Bronze Age ‘thinking’.
Your whole “creepy” angle on the so-called movement lacks analysis. How are the people who buy into his predictions “creepy”? Some may be ‘gleeful’ because he describes a future where anything is possible in terms of our creative potential. That inspires some people. Others in his movement can be both admiring of his ideas and yet reserved, while others are just having fun toying with these ideas. But especially to those people who view ignorance and death as unpalatable things in human life – ‘imperfections’, as you say – his ideas give hope for a better, more enlightened tomorrow: a future where human intelligence is vastly upgraded – in a sense, idiocy will be eliminated. Imagine that: it is easy to manipulate gullible, low-IQ masses, but what if we are all much stronger intellectually? Will the Pat Buchanans, imams, and scumbag politicians of this world so easily manipulate someone with 10x the mental strength of Stephen Hawking? The majority of our worst problems on this planet come from ignorance, or more bluntly put, human stupidity: religion, superstition, unwarranted prejudice, psychological frailties (depression, anger, sociopathic tendencies), emotional instability, etc. Surely, some of us look forward to an end to those kinds of ‘imperfections’. Saying they make us human simply because we have them is circular logic. Our ability to ‘upgrade’ ourselves is also human; our ability to change and adapt and aspire to become better is ALSO human, isn’t it? So why should we let our weaknesses define our nature? Let us make our strengths as defining of who we are.
Now, that may ‘scare’ the ‘dinosaurs’/traditionalists who clutch onto the past like some lifebelt but aside from your inner psychological conflicts, I don’t find anything “creepy” about real human progress. I find it refreshing and bold and humane, yes, emphasis on human not machine. If you are so concerned about ‘machine-think’ check out the various death cults on this planet, aka, religions.
Wait – did you even read the man’s book? If you had, you would understand that ‘real person-to-person contact’ is still something that is perceived through our organic substrate. There is always a medium that filters and outputs our so-called perceptions; even our brain is a filter of sorts. The point is that you make melodrama out of something trivial – the human body is, in effect, a sort of “filter”. There is no true “contact”; you are never “One” with that flower you smell; that is an illusion. You filter all of it in, and your brain outputs a ‘flower image’ and a particular scent. What Kurzweil says is that in the future some of the common substrates will change but the essence of the contact will not. Reality will still be reality even if your ‘filters’ have been upgraded (brain, CNS, etc.).
You find Kurzweil’s fresh ideas “unsettling”: let me guess, because you’re close to middle-aged and find security in your life ‘as-is’ and as it ‘has-been’. Well, things change. Deal with it. Or not; you risk becoming, intellectually speaking, extinct. As for me, as Kurzweil said, if we had listened to the traditionalists we’d still be swinging from tree branches. Fortunately, humanity didn’t. Instead it listened to the likes of Copernicus, Galileo, Newton, Darwin, Einstein, and Hawking. And it listened to the various scientists and engineers, which is why, by the way, you have a car, and a house, and this PC on which you can type up little pieces about how someone’s ooh-so-scary nerd movement is “creepy”. Enjoy your “creepy” technology; may it serve you well.
You make a lot of valid and important points, thanks for commenting.
PS- headlines drive traffic, you will notice that I only used the word “creepy” once in the actual text 🙂
The more pressing concern is the ideology of the Singularitarians. Whether or not their wish for transhuman evolution comes to pass, they will continue to act as if its arrival is inevitable. Keep in mind that promises of youth restoration and immortality are persistently compelling across human cultures. The Singularitarians have placed themselves at the forefront of various life extension and human augmentation projects… Every successful advancement (except perhaps the last one) will be highly rewarded, further encouraging their work. Their power and influence is more likely to wax than wane, so it’s critically important to understand the kinds of values and assumptions that propel their worldview. For example, a typical Singularitarian’s conception of free will strikes me as highly problematic, and that carries serious implications for the future of democracy.
Have a look at my paper “Deriving Common Interests from Animal Origins: The Generative Constraints of Global Polity” at http://www.rkey.com/essays/Simon_DCI_02.pdf