From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Fri Nov 22 2002 - 14:20:39 MST
Rafal Smigrodzki wrote:
> Eliezer wrote:
>
>>Or in simpler terms, when I say that X is MORALLY WRONG, I mean that X
>>seems to me to be wrong regardless of what I think about it, and the
>>fact that an alternate Eliezer might be warped to think X was right
>>makes no difference in that. Similarly, it seems to me that 2 + 2 =
>>4 whether I believe that or not, and the idea of being hypnotized to
>>believe 2 + 2 = 5 doesn't change that, nor does the fact that "2 + 2
>>= 4" is a cognitive representation somewhere in my brain.
>
> ### In other words, you deny the murderous Eliezer any claim to personal
> identity with you. Although this is quite similar to the approach I would
> most likely take with an analogously changed version of myself, I do see a
> philosophical caveat of a very general nature - our perceptions and beliefs
> can be wrong, no matter how strongly we feel about them. It is not
> impossible that a type IV deity could actually make 2 + 2 = 5, and make
> murder the right thing to do, but our limited minds are incapable of seeing
> the rationale. Maybe with a few dozen more working-memory slots we'd be
> able to glimpse the inevitability of genocide. Would you repudiate the
> ultra-smart Eliezer (as measured by extensions of current cognitive tests)?
> After all, if you tried to explain the idea of object constancy to the very
> early, one-year-old forms of Eliezer, you'd fail, yet you do accept as valid
> the transformative process which brought you from that level to today's
> brilliance. There is no telling where our paths will take us.
Sure, I can accept that 2 + 2 might equal 5. I just don't model it as
having anything to do with warped-Eliezer *thinking* that 2 + 2 = 5. Same
goes for murder. I can model the possibility that murder might be right,
or even that an unFriendly type IV deity might *make* murder be right
under my own current moral standards (now *there* is a Greg Egan
fuseblower), but in both cases I don't model that as having anything to do
with the case of a warped-Eliezer *thinking* murder is right.
--
Eliezer S. Yudkowsky                          http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence