From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Tue Apr 23 2002 - 20:26:13 MDT
Dan Fabulich wrote:
>
> So the question is: does my personal financial contribution to
> immortality research probably matter quite a lot? Or does it probably
> matter fairly little?
This is an interesting question because I'm not entirely sure whether it's
being asked from an altruistic perspective, a purely selfish perspective, a
selfish-discount perspective, or a selfish-split perspective. For me the
answer here is straightforward, and would be regardless of whether I seem to
be playing an interesting role in Friendly AI or seed AI; I count myself as
one six-billionth of the stakes immediately on the table. I'm pretty sure I
can count for more than one six-billionth of the Singularity by trying to
become directly involved in it, so the total amount of happiness I can
create through other-directed efforts is greater than the total amount of
happiness I can create by making myself happy. I also have a nonzero
discount rate for future sentients, and any nonzero discount rate means that
the continued existence and growth of Earth-originating intelligent life,
"humanity's future", is the most important thing to protect, because the
"zillion" (10^N) sentients who exist in the future are the largest stakes on
the table. It's a good thing that the interests of humanity's future and
present-day humanity seem to coincide; otherwise it would present an
interesting ethical dilemma...
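
(A back-of-the-envelope sketch in Python may make the arithmetic concrete. Every number below is a placeholder assumption of mine, not a figure from the argument above; the only point is that any nonzero weighting of future sentients, multiplied by a large enough future population, makes the other-directed term dominate.)

    # Illustrative arithmetic only; every number below is a made-up placeholder.
    population = 6e9            # people whose lives are immediately at stake
    future_sentients = 1e15     # stand-in for the "zillion" (10**N) future sentients
    future_weight = 1e-6        # the nonzero "discount rate" on future sentients

    # Happiness I can create by working on my own welfare, in units of
    # "one person's worth of happiness":
    self_directed = 1.0

    # Happiness I can create through other-directed effort, assuming I can
    # affect more than one six-billionth of the outcome (here, ten parts):
    leverage = 10 / population
    other_directed = leverage * (population + future_weight * future_sentients)

    print(self_directed, other_directed)
    # For any nonzero future_weight, a large enough future_sentients makes
    # other_directed dominate, which is the point of the argument above.
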
There is a somewhat equivalent argument from the "purely selfish"
perspective which states that the expected integral over time of
post-Singularity happiness is likely to outweigh any happiness achievable
in the present day, regardless of discount rate.  However, if one shifts
from an altruistic to a selfish perspective, the question then arises of whether
one should spend money on increasing the probability of *personal* survival
to the Singularity, or spend money on increasing the probability of the
Singularity, accelerating the arrival time of the Singularity, or making the
Singularity safer. This question is dependent on how much leverage you
think you can have on the Singularity. Given that surprisingly few people
are even paying attention to the oncoming tidal wave, much less doing anything
about it, I would argue that individual efforts at this point really can
have a substantial effect - as long as they are intelligently directed to
exactly the point of maximum leverage. However, from a purely selfish
perspective, no matter how much leverage you have, it will still be rational
to buy a cryo bracelet. If you don't think you can have any significant
personal impact on the future and you are purely selfish, then you should
just spend all your money on increasing your chance of personal survival.
It might be noted though that, in human terms, this may constitute
"defecting" in the Prisoner's Dilemna, which is an action that can have
direct personal consequences - people may trust you less. Symmetrically, if
you cooperate, this may encourage others to cooperate as well, amplifying
the effect of your actions.
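
(Again as a rough sketch with made-up numbers: the purely selfish calculation is just the product of two probabilities - the probability that you personally survive to the Singularity, and the probability that the Singularity happens and goes well - and the question is which one your marginal dollar moves more.)

    # Illustrative sketch of the purely selfish tradeoff; all numbers are
    # placeholder assumptions, not estimates.
    p_singularity = 0.5          # baseline probability of a good Singularity
    p_survive = 0.8              # baseline probability of personally living to see it

    # Assumed marginal leverage of one dollar spent each way:
    dp_singularity = 1e-9        # shift in Singularity probability per dollar
    dp_survival = 1e-7           # shift in personal-survival probability per dollar
                                 # (e.g. cryonics, health)

    def p_selfish_payoff(dollars_to_singularity, dollars_to_survival):
        """Probability of personally reaching a good Singularity."""
        ps = min(p_singularity + dp_singularity * dollars_to_singularity, 1.0)
        pv = min(p_survive + dp_survival * dollars_to_survival, 1.0)
        return ps * pv

    budget = 1000.0
    print(p_selfish_payoff(budget, 0.0))   # everything on the Singularity
    print(p_selfish_payoff(0.0, budget))   # everything on personal survival
    # With low personal leverage on the Singularity, the purely selfish answer
    # favors survival spending; with high enough leverage, it flips.
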
Most people mix selfishness and altruism.
If you use a "selfish discount" algorithm that weights your own welfare
greater than the welfare of others, but the discount rate on the happiness
of others is nonzero, then this collapses to pretty much the same answer as
the altruistic answer above; the total of what's at stake will outweigh your
own life, no matter the discount rate.
However, the most common real algorithm is the "selfish split" - spend some
of your effort on yourself, and some of your effort on others. This can't
easily be mapped onto normative goal reasoning with a desirability metric
for possible futures, but it does make easy intuitive sense to a human with
a day planner... obviously in this case the rational action is to spend
whatever proportion you allot to "altruistic" efforts on the Singularity, because
That's What It's All About at this point in Earth's history; and to spend
whatever proportion you allot to "selfish" efforts on things that make you feel
good in the short term, because if you're trying for a selfish split,
spending your so-called "personal" efforts on willpower-consuming calorie
restriction is not likely to be sustainable psychologically...
The "selfish split" often isn't so much of a selfish split as a split
between efforts directed toward long-term goals and efforts directed toward
short-term happiness. I've never been able to get past the split itself,
but I do try to make sure that all effort available for long-term goals is
directed toward the Singularity, and I don't make efforts for short-term
happiness in ways that threaten those long-term goals. If you're
simultaneously split between selfishness and altruism and between doing what
you want to do and what you ought to do, then you should try to channel your
"want to" efforts into hobbies that damage neither your contribution to the
Singularity nor your chances of long-term survival, and channel your "ought
to" efforts into whatever mix of Singularity activities and long-term
survival activities you think is appropriate, given your chosen split for
altruism and selfishness.
If you're currently operating on a selfish split but *want* to be more
altruistic, you can either "just do it" - make the switchover and get it
over with - or you can try to gradually increase the proportion of the
split allocated to altruism. I think there are profound reasons why trying
for 100% altruism is very different psychologically from compromising at 90%
altruism, but others may feel that the real difference is only an 11%
increase, so your mileage may vary.
-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence