From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Wed Mar 14 2001 - 18:46:58 MST
Nick Bostrom wrote:
>
> Eliezer wrote:
> >
> >Ah, yes, but it doesn't have negligible moral weight. Suppose that
> >there's one original Eliezer and a billion imitators. The chance that I'm
> >the original is only one-billionth; however, the actions of that original
> >would carry a billion times as much weight.
>
> Why do you think that the basement-level Eliezer's actions carry a billion
> times more weight?
Because my - his - we need new pronouns - sensory information is
duplicated a billion times over.
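Spelling out the arithmetic (a rough sketch only; N and w are just my
names for the assumed copy count and an action's per-stream moral weight):

    # Being one of N copies makes the chance of being the original 1/N,
    # but the original's actions echo across all N sensory streams, so
    # the two factors cancel exactly.
    from fractions import Fraction

    N = 10**9                    # assumed number of duplicated streams
    p_original = Fraction(1, N)  # chance this stream is the basement one
    weight_if_original = N       # the original's act counts N times (in units of w)
    expected_weight = p_original * weight_if_original
    assert expected_weight == 1  # net result: act at full, undiscounted weight

Which is why the one-in-a-billion chance of being the original doesn't
diminish the expected weight of acting as though I am.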
> Indeed, in some respects Eliezer seems more like one of those slightly
> implausible characters that might have been added for dramatic effect ;-)
I'm fairly sure that years 14-17 were in *someone's* real life - I really
can't see someone putting that into an educational simulation except for
the sake of historical accuracy. If this is a simulation, I don't expect
that I have much free will.
I tend to discount the "amoral posthumans simulating whole civilizations"
hypothesis: (a) I don't think it's true; (b) our civilization currently
seems to be on track for either extermination or Friendliness, and neither
future allows the amoral simulation of whole civilizations; and (c) if it
is true, there's not much I can do about it.
The Sysop Citizenship Rules, I expect, disallow simulation of single
unconsenting individuals even as personally experienced virtual realities
- never mind the simulation and extermination of entire civilizations! I
know (or rather, I recall) that during most of my life I would not have
consented to remaining unknowing in the simulation - I'm not sure I would
consent to it now - but it's possible that The Rules work differently than
I expect, and interim nonconsent turns out to be okay as long as there's a
near-certainty of approval in retrospect by a future self. Since
my/his/Eliezer's life is certainly educational, it's possible to imagine
that I consented to the reliving in advance, or will come to approve it in
retrospect with near certainty. Again, this is not how I expect
Friendliness to work, but I could be wrong.
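Stated as a rule, that weaker reading would look something like this (a
sketch only; the function name, predicate names, and threshold are all
invented for illustration, not drawn from any actual specification of The
Rules):

    def reliving_permitted(consents_now: bool,
                           p_retrospective_approval: float,
                           threshold: float = 0.999) -> bool:
        # Strict reading: current consent is required, full stop.
        if consents_now:
            return True
        # Weaker reading entertained above: interim nonconsent is
        # acceptable when a future self is near-certain to approve
        # the reliving in retrospect.
        return p_retrospective_approval >= threshold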
But, again, I think this *is* the real world.
-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence