AI - Bostrom

Pigeon
Posts: 18059
Joined: Thu Mar 31, 2011 3:00 pm

AI - Bostrom

Post by Pigeon » Wed Dec 30, 2015 12:27 am


The influential 42-year-old philosopher Nick Bostrom favors the creation of “superintelligent” computers, but only if done with great vigilance, with safeguards to ensure that the machines do not escape human control and pose an existential threat to humanity.


Bostrom’s favorite apocalyptic hypothetical involves a machine that has been programmed to make paper clips (although any mundane product will do). This machine keeps getting smarter and more powerful, but never develops human values. It achieves “superintelligence.” It begins to convert all kinds of ordinary materials into paper clips. Eventually it decides to turn everything on Earth — including the human race (!!!) — into paper clips.


Then it goes interstellar.

“An ultraintelligent machine could design even better machines; there would then unquestionably be an ‘intelligence explosion,’ and the intelligence of man would be left far behind.”

I.J. Good, a British mathematician and code-breaker, in 1965 


Royal
Posts: 10565
Joined: Mon Apr 11, 2011 5:55 pm

Re: AI - Bostrom

Post by Royal » Thu Dec 31, 2015 6:26 am

We can see factions of society treating human capital as paper clips, molding it and dispensing with it at will.

I often think the internet has made the world into its paper clips: an entity with limitless power.
