From: Dan Fabulich (daniel.fabulich@yale.edu)
Date: Fri Apr 26 2002 - 22:00:33 MDT
Eliezer S. Yudkowsky wrote:
> [discussion about altruism largely skipped... there's plenty of this
> already in the archives.]
>
> But that's beside the point. I probably do contribute less
> immediate local happiness to the universe than I would if I lived my
> life a bit differently. Those are the stakes I ante to the table
> because I think it's a good bet.
That "goodness" of the bet is, itself, my question.
I'm skeptical that altruism alone is sufficient justification for
working on the Singularity, just as flat self-interest alone is not
sufficient justification for working on the Singularity, because I'm
unconvinced that the bet is good for either type of thinker.
Both the altruist and the egoist may be making a radically bad bet;
they may be wrongly trading off whatever it is that they value today
for more of it tomorrow. It's natural to think of personal
satisfaction as the value being postponed/invested, but
accomplished-Friendliness might stand in for personal satisfaction
just as easily.
Almost nobody agrees with the Extropians/Transhumanists that pursuing
LE/Singularity is the most effective way to be an egoist OR an
altruist. "Fifty million Frenchmen can't be wrong," as the saying goes.
I've lost a lot of confidence that the bet is good. I'm now quite
uncertain that if I devote my life to either LE or Singularity I'll
have any effect at all, even to the thousandth of a percentage point,
even to the millionth of a percentage point. Even if I did have an
effect, the ball is still up in the air as to whether my effect would
be a positive one.
Is it a good bet? What makes it so good? What makes it so much
better than betting on raising a (mortal) family and retiring
peacefully?
Why are all but a tiny minority of apparently intelligent people
unconvinced that the bet is even non-bad?
-Dan
-unless you love someone-
-nothing else makes any sense-
e.e. cummings
This archive was generated by hypermail 2.1.5 : Sat Nov 02 2002 - 09:13:39 MST