Re: How To Live In A Simulation

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Mon Mar 19 2001 - 18:08:36 MST


Dave Sill wrote:
>
> [I find this ve/ver/vim/vis stuff silly. What's wrong with it/its?]

It becomes awkward in long sentences where "it" is also needed as a general
anaphor (i.e., like "this" and "that").

> How do you detect consciousness?

I don't know. If there isn't an obvious objective definition of
citizenship, I'll settle for whatever the Friendly AI comes up with.

> > The act of creation gives no moral right whatsoever to command or coerce.
>
> You sure do make a lot of commandments. Are you running this reality?

Some people would say yes - while substituting an outcome-dependent
variable for "Eliezer Yudkowsky", of course - but I think they're wrong.
I think morality (Friendliness) can be grounded in panhuman affectors and
created using a convergent method that is insensitive to details of the
creating programmer's personality. I do consider myself one of the better
sources of theoretical knowledge about how that type of morality might
work. Hence the "commandments", as you call my best guesses.

> > It is simply a historical fact about the causal origins of a new
> > intelligent entity. Creators are not entirely powerless; they have some
> > control over *what* is created; but once created, the creation is a
> > citizen, and independent.
>
> Oh, so it's not a rule you're laying down, but a fact. So if you're wrong,
> and simulated entities *can* be controlled, then you don't have a problem
> with that?

No, they're rules. They become facts, within a region, if implemented
within that region by an OS-type superintelligence capable of doing so. I
would like a friendlier Universe to be implemented in as wide a region as
possible, this solar system at the least. Such a superintelligence would
regard your creation of another citizen as a historical fact which does
not morally impinge on further volitional interactions between you and
that citizen; that's my best guess, at least.

-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence
