Re: Immortality and Personal Finance

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Fri May 03 2002 - 15:38:57 MDT


Dan Fabulich wrote:
>
> Both the altruist and the egoist may be making a radically bad bet;
> they may be wrongly trading off whatever it is that they value today
> for more of it tomorrow.

Yes, they may. They may also be making a radically good bet. Trying to
avoid all bets that have any possibility of resulting in emotional
frustration is a non-normative way of approaching the problem.

> It's natural to think of personal
> satisfaction as the value being postponed/invested, but
> accomplished-Friendliness might stand in for personal satisfaction
> just as easily.

Stipulated.

> Almost nobody agrees with the Extropians/Transhumanists that pursuing
> LE/Singularity is the most effective way to be an egoist OR an
> altruist. "Fifty million Frenchmen," as the saying goes.

Call me an iconoclast, but after seeing the ways that millions of people
have been wrong in the past, and learning about the forces that tend to make
millions of people go wrong, the herd instinct has been burned out of me
somewhere along the line. If fifty million people believe X, they may be
wrong or they may be right. As Bayesian evidence goes, popular opinion is
usually weak enough that the "Bayesian prior" for X, which in this context
means the estimate you get by looking directly at the rational evidence for
X, tends to outweigh fifty million people's opinion any day of the week.
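
To make that concrete, here is a toy Bayes calculation in Python. Every
number in it is invented purely for illustration; the point is only the
shape of the arithmetic, a strong prior swamping a weak likelihood ratio:

    # Toy numbers, invented for illustration; nothing here is measured.

    def posterior_odds(prior_odds, likelihood_ratio):
        # Bayes' theorem in odds form: posterior = prior * likelihood ratio.
        return prior_odds * likelihood_ratio

    # Suppose looking directly at the rational evidence puts the odds
    # for X at 20:1 in favor.
    direct_evidence_odds = 20.0

    # Suppose fifty million people disbelieving X counts as only 2:1
    # evidence against it, because mass opinion is weakly correlated
    # with truth on questions like this one.
    opinion_likelihood_ratio = 1.0 / 2.0

    print(posterior_odds(direct_evidence_odds, opinion_likelihood_ratio))
    # -> 10.0: the direct evidence still dominates.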

Do you believe that if the Singularity meme is correct, it should be
spreading faster than it has so far? I don't see any dissonance here. That
an idea spreads slowly does not prove the idea correct by analogy with past
good ideas that spread slowly; bad ideas can spread slowly too. But it is
also true that a lot of good ideas have spread much more slowly than the
Singularity meme has.

> I've lost a lot of confidence that the bet is good. I'm now quite
> uncertain that if I devote my life to either LE or Singularity I'll
> have any effect at all, even to the thousandth of a percentage point,
> even to the millionth of a percentage point.

Projecting your current self forward in time to after the Singularity, would
you enjoy saying: "Yes, I was born before the Singularity, and I was even
one of the relative handful of people who saw it coming, but I didn't even
try to do anything about it"?

> Even if I did have an
> effect, the ball is still up in the air as to whether my effect would
> be a positive one.

This is absolutely, totally, completely true. But deciding to do nothing is
also a choice that can have unintended consequences. And deciding not to
think about the issue doesn't mean that your acts have no consequences, just
that you aren't visualizing those consequences.

> Is it a good bet? What makes it so good? What makes it so much
> better than betting on raising a (mortal) family and retiring
> peacefully?

Asked on those terms, it's a good bet because of the insane amount of
leverage that comes from being one of the infinitesimal fraction of
sentients living *before* the Singularity. It's a good bet that the
incidental effect exerted on the Singularity by a janitor in Peoria, whether
that effect is positive or negative, will outweigh by orders of magnitude
the *direct* value of raising a mortal family. Or at least that's how I see
it. Now, as facts go, this fact is very unsuited to direct comprehension by
the thinly modified primate brains we're trying to run our minds on, and as
I see it, the best translation that can be easily represented by
contemporary emotional systems is: "Whatever percentage of your resources
you allocate to altruism and trying to have a long-term positive effect,
allocate it to the Singularity."

> Why is such a tiny minority of apparently intelligent people
> unconvinced that the bet is even non-bad?

Because being rationally intelligent is much harder than being relatively
intelligent.

-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence


