From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Mon Oct 06 1997 - 10:17:16 MDT
Anders Sandberg wrote:
>
> > -- I anticipate that Anders will suggest that superintelligences will
> > make entertainment out of shaping human organizations into
> > entities that perform intelligent tasks, just as we have fun by
> > watching circus animals behave. :-)
>
> It seems like you have a quite good simulation of my thought processes.
>
> About Eliezer's view that this would be immoral: I think it can be
> ethical to do, as long as the humans voluntarily agree to form the
> organisation. "Wow! Look at this: that PostAnders entity has a really
> great idea about a selforganized democracy. Let's try it!"
Well, of course it's moral with informed consent. *Anything* is moral
with informed consent. That's not the same as the "circus" theory.
I still maintain that it would be pointless to form human organizations into
Turing machines. Besides, this doesn't take a posthuman entity at all. I bet
Bill Gates could make Microsoft simulate a small Turing machine with a few
simple mandatory self-propagating memos. Maybe he's doing it already.
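
To make the "memos" picture concrete, here is a minimal sketch in Python (purely
illustrative, assuming each memo is just one entry in a Turing-machine transition
table; the rule names and the run() helper are mine, not anything from the original
exchange). An employee holding memo SCAN who sees a 1 writes a 1, passes the tape
one desk to the right, and consults SCAN again; on seeing a blank, he writes a 1 and
stops. This particular table just increments a unary number, but any finite rule set
distributed this way is, formally, a Turing machine.

from collections import defaultdict

# (memo in hand, symbol seen) -> (symbol to write, direction to pass tape, next memo)
MEMOS = {
    ("SCAN", 1): (1, +1, "SCAN"),   # still inside the block of 1s: keep walking right
    ("SCAN", 0): (1, +1, "DONE"),   # reached the blank: append a 1 and finish
}

def run(memos, tape_ones, start="SCAN", halt="DONE", max_steps=10_000):
    # Unary input: tape_ones consecutive 1s on an otherwise blank tape.
    tape = defaultdict(int, {i: 1 for i in range(tape_ones)})
    head, memo = 0, start
    for _ in range(max_steps):
        if memo == halt:
            break
        write, move, memo = memos[(memo, tape[head])]
        tape[head] = write
        head += move
    return sum(tape.values())   # count of 1s left on the tape

if __name__ == "__main__":
    print(run(MEMOS, tape_ones=3))   # -> 4: the "organization" computed 3 + 1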
--
sentience@pobox.com          Eliezer S. Yudkowsky
http://tezcat.com/~eliezer/singularity.html
http://tezcat.com/~eliezer/algernon.html
Disclaimer: Unless otherwise specified, I'm not telling you
everything I think I know.