Artificial intelligence and the singularity could mean the demise of human control: Oxford University philosopher Nick Bostrom points out in his new book Superintelligence that it is imperative we "understand the challenge presented by the prospect of superintelligence, and how we might best respond."
"This is quite possibly the most important and most daunting challenge humanity has ever faced. And – whether we succeed or fail – it is probably the last challenge we will ever face."