From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Thu Mar 15 2001 - 14:38:13 MST
"Robert J. Bradbury" wrote:
>
> Reliberion's Prince of Darkness (aka Eliezer) wrote:
...why?
> > If this is a simulation, I don't expect that I have much free will.
>
> Of course you have free will; what would be the point of a simulation
> if you didn't? [Now of course, if you are a zombie in the simulation
> in which *I* have the free will, then you don't...]
It's a historical simulation of Transition celebrity Eliezer Yudkowsky, of
course. A lot of post-Singularity still-human individuals ran it, all the
more so because Eliezer is one of the few (historical) personalities who
can be run in a totally immersive, unknowing simulation without violating
citizenship rights... though it does have a certain effect on your
personality... Not that *I'm* volunteering, but I can easily see myself
reaching that conclusion (even pre-Singularity), and that might be enough
under the rules.
You can watch the 24-hour Buffy channel; you can live in an immersive
Buffyverse Matrix filled with totally realistic personalities standing in
for all the other sentients; you can adopt a Buffy outlook and
personality; you can
assimilate Buffy's historical memories (as revealed by Joss Whedon); you
can shove your memories of the real world into the back of your mind; you
can damp down the cognitive processes whereby your knowledge of the
unreality of your world interferes with the emotional impact - but you
still need enough memory of the external world to give your continuous
informed consent, because a pure vanilla Buffy wouldn't consent to being
stuck in her horrible world... or choose to load your "real" memories back
in after leaving it.
You can only undergo a totally unknowing sim experience if the sim
personality is one who would (a) consent and (b) be willing to load your
true memories back in afterwards. So it's more than usually plausible
that *I'm* a simulation, not just because of my potential celebrity, but
because Eliezer Yudkowsky is one of the few historical personalities who
might be willing to do that - exist and hurt and be effectively
annihilated at the end - *if* there were a really strong reason, something
to be gained under my/his morality. Right now, at least, I assign my own
life the same value I would assign anyone else's, and I'd need a good
reason before I'd allow myself to be hurt. In the apparent/original
world, the Singularity can take precedence over my welfare; in a world
where there's no Singularity to be gained, I/he/we would need a pretty
good reason to consent to the recreation of negative experiences, even if
it's consent-in-potentia.
But again, in that case, all you guys out there are zombies, and I'm
writing this email because it was in the historical record - so why dwell
on it?
> How do you even know or enforce what someone else does with their
> computronium?
You own and control your own computronium, but you control it through the
Sysop API, and trying to violate citizenship rights will result in an API
error. If we're in a sim with no citizenship rights, then we are so
totally, utterly screwed that there is really very little that I can think
of to do about it.
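(A toy sketch of the idea, purely illustrative - the class names and the
rights check below are things I'm inventing for the example, not a spec of
any actual Sysop API:)

    # Illustrative sketch only: every operation on computronium goes through
    # a mediating layer, and requests that would violate another citizen's
    # rights come back as errors instead of executing.

    class CitizenshipViolation(Exception):
        """Raised when a requested operation would violate someone's rights."""

    class SysopAPI:
        def __init__(self, owners):
            # owners: dict mapping a computronium region to its owning citizen
            self.owners = owners

        def execute(self, citizen, region, operation):
            # You control your own computronium...
            if self.owners.get(region) != citizen:
                raise CitizenshipViolation(f"{citizen} does not own {region}")
            # ...but only through the API, which vets every request.
            if self._violates_rights(operation):
                raise CitizenshipViolation(
                    f"'{operation}' would violate citizenship rights")
            return f"{operation} executed on {region}"

        def _violates_rights(self, operation):
            # Stand-in check; the real criterion is whatever the Sysop enforces.
            return "nonconsensual" in operation

    # Owning the computronium is not enough to get a rights-violating
    # operation executed:
    api = SysopAPI({"region-42": "alice"})
    print(api.execute("alice", "region-42", "run consensual sim"))
    try:
        api.execute("alice", "region-42", "run nonconsensual sim")
    except CitizenshipViolation as err:
        print("API error:", err)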
-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence