Is Skynet Inevitable? - Reason.com

The emergence of super-intelligent machines has been dubbed the technological Singularity. Once machines take over, the argument goes, scientific and technological progress will turn exponential, thus making predictions about the shape of the future impossible.
Barrat believes the Singularity will spell the end of humanity, since the ASI, like Skynet, is liable to conclude that it is vulnerable to being harmed by people. And even if the ASI feels safe, it might well decide that humans constitute a resource that could be put to better use. "The AI does not hate you, nor does it love you," remarks the AI researcher Eliezer Yudkowsky, "but you are made out of atoms which it can use for something else."