From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Mon Jun 24 2002 - 04:24:06 MDT

Eugen Leitl wrote:
> On Sun, 23 Jun 2002, Eliezer S. Yudkowsky wrote:
>
>>How can it be moral for you to sympathize with unmodified humans now, yet
>>immoral after you transcend? Which one of you is being irrational?
>
> I do not understand this sentence. Do you somehow imply that morality is
> viewpoint-invariant?

I try my best.  That's not what I'm asking, however.  I am asking *you*,
specifically, one individual with one viewpoint, whether you consider your
present sympathy with humans to be irrational, and if so, why you keep on
doing it; or, if it is not irrational, why you expect that you would lose
this sympathy as an eventual Power. I am not asking you whether you believe
that all sentient beings must think the same way, or all humanborn Powers; I
am asking whether *you personally* think that you are presently being
immoral by the correct standards of your future self, or whether your future
self is being immoral by the correct standards of your present self.

Correct from whose viewpoint?  Why, yours, of course.  You can talk about
the observer-dependence of morality all you like, but I assume that you
personally have a morality of some kind. Is your future self being immoral?
Or are you being immoral? You can't both be moral under any single moral
standard. So, under your single moral standard as a single individual,
who's wrong?

> What is the mechanism asserting conservation of specific frame of morality
> over subjective geological time scale in face of speciation, radiation
> driven by Lamarckian/Darwinian evolution?

Darwinian evolution, speciation, and radiation apply only to populations; I
was asking about *you* as an individual.

--
Eliezer S. Yudkowsky                          http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence

This archive was generated by hypermail 2.1.5 : Sat Nov 02 2002 - 09:14:59 MST