Wednesday, April 20, 2011

Mitigating the Risks of Artificial Superintelligence

Mitigating the Risks of Artificial Superintelligence: "“Existential risk” refers to the risk that the human race as a whole might be annihilated. In other words: human extinction risk, or species-level genocide. This is an important concept because, as terrible as it would be if 90% of the human race were annihilated, wiping out 100% is a whole different matter."
