From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Sat Nov 09 2002 - 23:33:54 MST
Slawomir Paliwoda wrote:
>
> Okay, I see now what you were trying to say here. Unfortunately, it
> looks like you completely missed the point of the whole discussion,
> which was to make FAI research famous in order to get funding, not to
> make you famous. Getting personal fame for the right reasons would
> not be that easy either.
You're placing too much trust in your moral intuitions. Generally, if a
moral compromise instinctively seems like a good idea, it's because in the
ancestral environment that moral compromise would have promoted your
*personal* reproductive fitness. It is not a coincidence that the moral
compromises that seemed to Stalin to promise the greatest good for the
greatest number ended up with Stalin as tribal chief and all of the
supposed beneficiaries miserable. I don't mean to imply that this
evolutionary motivation is explicitly represented in cognition either
consciously or subconsciously; explicitly thinking "I am riding this issue
for the sake of personal fame" would tend to interfere with riding the
issue for the sake of personal fame.
To your moral intuitions, it seems like compromising the message at the
heart of the Singularity is a good idea, something that would work
to promote the Singularity, and certainly not anything that you are doing
for the sake of personal fame. Why does it seem like a good idea? Is it
an empirical generalization from the history of postagricultural
societies? Are you modeling the detailed effect of your moral compromise
on millions of interacting people in order to predict the outcome of a
complex social and memetic system? Of course not. It seems like a good
idea because fifty thousand years ago, people who thought it was a good
idea tended to end up as tribal chiefs. In the domain of politics, a
means to an end intuitively seems like a good idea to the extent that
carrying out that means would have served the purposes of your genes in a
hunter-gatherer tribe, not to the extent that the means would achieve its
supposed end in our far more complex culture.
It *is* a famous empirical generalization from the history of
postagricultural societies that people who start out by making moral
compromises in the service of their ideals usually end up not
accomplishing anything toward those ideals, although their adaptations may
(or may not) operate in accordance with ancestral function to place them
in positions of personal benefit.
--
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence