From: Robin Lee Powell (rlpowell@digitalkingdom.org)
Date: Thu Jul 14 2005 - 13:17:00 MDT
On Thu, Jul 14, 2005 at 03:02:53PM -0400, Ben Goertzel wrote:
>
> > > I'd love to hear well-reasoned thoughts on what and whose
> > > motivation would end up being a bigger or more likely danger
> > > to us.
> >
> > I think that all utility functions containing no explicit
> > mention of humanity (example: paperclips) are equally dangerous.
>
> Eli, this clearly isn't true, and I think it's a
> poorly-thought-out statement on your part.
>
> For instance, consider
>
> Goal A: Maximize the entropy of the universe, as rapidly as
> possible.
>
> Goal B: Maximize the joy, freedom and growth potential of all
> sentient beings in the universe.
Saying "sentient beings" instead of "humanity" is a cop-out, Ben.
For our purposes, they are identical.
-Robin
--
http://www.digitalkingdom.org/~rlpowell/ *** http://www.lojban.org/
Reason #237 To Learn Lojban: "Homonyms: Their Grate!"
Proud Supporter of the Singularity Institute - http://intelligence.org/