From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Thu Jul 14 2005 - 13:26:14 MDT
Ashley Thomas wrote:
> On Jul 14, 2005, at 2:06 PM, Eliezer S. Yudkowsky wrote:
>
>> Juvenile? That's one I hadn't heard before... I use "paperclips" to
>> convey the idea of an AI with goals that may be simple or complex but
>> are essentially uninteresting to a human. ... If you want to convey
>> an actual, different concept in order to sound less juvenile, then I
>> have to object: "not sounding juvenile" isn't a consideration when
>> you're trying to figure out the actual facts of the matter.
>
> "Paperclip Maximizer" is a technical term which the SL4 community
> understands to describe a class of existential threats. Some members of
> that community have as goals the education of non-SL4 individuals. Our
> ability to work on that goal is affected by the impact we have on those
> individuals, which includes the impressions of our terminology.
>
> "Mr. President! We need to nuke Silicon Valley in the next sixteen
> seconds to stop a paperclip maximizer!"
>
> "..."
Okay, do you have a better suggestion?

--
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence