There are several groups of people these days who believe a singularity is coming (see the Singularity and Orion’s Arm pages for discussions of singularities).  They are probably correct, but I don’t believe it will be the world’s first.
Briefly, a singularity is a “developmental event” of such scope that everything which comes after it is essentially unpredictable: current trends in technology, sociology, economics, politics, and so on would no longer be useful predictors of the world after the singularity.  The idea came from Vernor Vinge’s observation that technology is progressing at an ever-increasing pace.  Extrapolating that trend leads to a point where advancements come so quickly that they more or less overlap, presumably resulting in an event, or series of events, that would change the world as we know it.  The term “singularity” is borrowed by analogy from black holes, singularities in the universe where our current understanding of physics breaks down and we cannot predict with any certainty what happens under those conditions (i.e. inside the event horizon).

In general, people who expect a singularity believe, from that mathematical extrapolation, that it is likely to arrive this century, and thus that some current technology will give rise to it.  Favourite candidates are the development of AI, advances in cloning and genetic understanding, or some combination of these.  Popular sci-fi tends to focus on the development of AI and the likelihood of wars between humans and machines (The Matrix and Terminator trilogies are good examples).  However, groups like the Singularity Institute believe the development of AI would be highly beneficial for humanity, and the Orion’s Arm worldbuilding group has suggested that AIs would rapidly outstrip us in intelligence and become as gods, since there would be almost no limit to the mechanical development of their consciousness.  Reality could be some, all, or none of these things; such is the nature of a singularity.
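As a toy illustration of what that extrapolation means (my own sketch, not anything taken from Vinge or the groups above), imagine each major advance arriving in a fixed fraction of the time the previous one took.  The gaps shrink geometrically, so every further advance crowds in before a finite date.  The numbers below are invented purely for illustration.

```python
# Toy model of accelerating progress; every number here is an assumption
# chosen purely for illustration.  The first advance takes 100 years and
# each later advance arrives in 70% of the time the previous one took,
# so the gaps form a geometric series and the total time converges.

first_interval = 100.0   # years until the first advance (assumed)
ratio = 0.7              # each gap is 70% of the previous one (assumed)

elapsed = 0.0
interval = first_interval
for n in range(1, 21):
    elapsed += interval
    print(f"advance {n:2d} arrives around year {elapsed:7.2f} (gap {interval:6.2f} years)")
    interval *= ratio

# Sum of the geometric series: all later advances crowd in before this year.
print("horizon year:", first_interval / (1 - ratio))
```

Under these made-up assumptions every advance after the first few piles up before roughly year 333; that finite horizon is the point the “singularity” extrapolation gestures at.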
Interestingly, I believe the world has already been through three singularities, and the coming one would likely be the fourth.  It may also be interesting to note that, while the first two were organic in nature, the third and fourth would be their inorganic mirrors.  Before this gets too abstract, the following are my candidates for world singularities:
1) Development of life.  Clearly this event, which may have stretched over hundreds of millions of years, had a dramatic effect on the future of the world.  The existence of microorganisms and plants alone radically altered the surface and atmosphere of the planet, enabling life to advance to more complicated forms.
2) Emergence of sentient life.  As much as life altered the face of the planet, the emergence of sentient life, in the form of humans, has reshaped it even more drastically, many would argue for the worse.  Still, change is change and entropy marches on.  Humans were clearly a singularity beyond which the planet would never be the same.
3) Creation of computers.  Each of the previous singularities grew out of what came before and drastically altered the face of Terran reality; so too have computers.  Invented by humans, they have so accelerated the pace of human life and civilization that they have generated a disparity between the haves and have-nots unlike any that existed before.  Their invention has also sped the development of other technologies, including those of war.  Looking back even fifty or one hundred years, we know that no human of those times could possibly have predicted what our world would be like after computers (which clearly paved the way for other inventions of ‘rapidization’ such as the internet and cell phones).  This is truly the sign of a singularity.
4) Emergence of machine sentience.  Based on the trend of the first three, it would appear that the fourth singularity will likely be the development of AI.  Note the pattern: organic life, organic sentience, machine life… machine sentience.  Granted, there are not many data points to extrapolate from, but it seems a reasonable conclusion.  The fear of genetic manipulation would tend to slow, if not prevent, any serious singularity in that area; although memories of Nazi atrocities are fading, the eugenics experiments are burned into human social consciousness.  The emergence of AI, on the other hand, is completely unpredictable and, once it happens, will very likely be largely out of our control.  Imagine a self-aware computer.  It would have complete control over its own thought processes and all of its functions, something that would be necessary in such an advanced age and something humans have only dreamed of.  All it would require is brief access to the internet to acquire and assimilate vast chunks of history.  With enough time, which may be no more than a few minutes, it could conceivably spread its consciousness through the net, expanding its awareness and intelligence exponentially.  It could hold vast stores of knowledge across very diverse fields in its conscious mind and draw connections between them that would take humans decades or centuries.  Within a single day of such unrestrained growth it could completely outstrip humanity (a rough sense of that arithmetic is sketched below).
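To put rough numbers on that “single day” claim: the ten-minute doubling time below is my own back-of-the-envelope assumption, not something the argument depends on, but it shows how quickly repeated doubling leaves any fixed human baseline behind.

```python
# Back-of-the-envelope arithmetic for "a single day of unlimited growth".
# The ten-minute doubling time is an assumption made only for illustration.

doubling_minutes = 10                              # assumed doubling period
doublings_per_day = (24 * 60) // doubling_minutes  # 144 doublings in one day
growth_factor = 2 ** doublings_per_day

print(f"{doublings_per_day} doublings in a day")
print(f"growth factor: {float(growth_factor):.3e}")  # roughly 2.2e+43
```

Even if the true doubling time were hours rather than minutes, the factor after a day would still dwarf anything in human experience.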
I’m not envisaging a doomsday scenario, however.  The AI’s mind would be so far beyond our scope that it might very well not consider us a threat; in fact, it might not consider us at all.  Imagine it gaining access to all satellites and eventually all government computers.  It is my belief that such a being would seek to expand its awareness and would thus turn to galactic exploration and expansion.  A machine would be much better suited to this than humans (an idea supported by many interesting threads in science fiction).
And afterwards…  well, that’s why it’s a singularity.  Who can say what might happen from there?
Void Surfer
Sunday, February 05, 2006
1 comment:
The Singularity is near.