Wednesday, March 9, 2011

Why an Intelligence Explosion is Probable

"One of the earliest incarnations of the contemporary Singularity concept was I.J. Good’s notion of the “intelligence explosion,” articulated in 1965:

Let an ultraintelligent machine be defined as a machine that can far surpass all the intellectual activities of any man however clever. Since the design of machines is one of these intellectual activities, an ultraintelligent machine could design even better machines; there would then unquestionably be an ‘intelligence explosion,’ and the intelligence of man would be left far behind. Thus the first ultraintelligent machine is the last invention that man need ever make.

We consider Good’s vision quite plausible but, unsurprisingly, not all futurist thinkers agree. Skeptics often cite limiting factors that could stop an intelligence explosion from happening, and in a recent post on the Extropy email discussion list, the futurist Anders Sandberg articulated some of those possible limiting factors, in a particularly clear way..."
