From: Dave Sill (extropians@dave.sill.org)
Date: Wed Apr 24 2002 - 20:46:59 MDT
"Eliezer S. Yudkowsky" <sentience@pobox.com> wrote:
>
> Well, from an altruistic perspective, I don't think that the absolute
> confidence matters. What matters is not whether a Singularity is 95% likely
> or 5% likely, but whether your personal efforts can bump that up to 95.001%
> or 5.001%. From an altruistic perspective this is still playing for stakes
> overwhelmingly higher than your own life; 0.001% ~ 60,000 lives. (1 day ~
> 150,000 lives.)
If altruism wasn't one of J R Molloy's useless hypotheses, it should
have been. All motivation is ultimately selfish. Caring more about the
welfare of others than about your own is irrational, if not insane.
> In practice, this means that I try to minimize my attachment to personal
> gain. I try to, whenever I imagine a conflict of interest, imagine myself
> doing the altruistic thing. It doesn't matter whether this hurts
> productivity; it's necessary to maintain cognitive integrity.
So your motivation is maintaining your cognitive integrity, not being
altruistic.
> I don't think I'm immediately miserable, and I certainly don't try to make
> myself miserable, but I also pass up on certain kinds of happiness - forms
> of fun that I think involve unnecessary risk or consume more in time than
> they pay back in energy.
Avoiding unnecessary risk is selfish, as is avoiding time-wasting
activities.
> I think that other people can and should sculpt themselves into thinking
> like this, but I wouldn't want people to burn themselves out trying to do it
> too fast. (Under a year is probably too fast.)
Right, a burnt-out altruist is an ineffective altruist. So you have to
take care of yourself in order to be able to help others.
> I think it's more important *to the Singularity* for me to try and get this
> modified primate brain to support altruism than for me to try and squeeze
> the maximum possible amount of work out of it in the short term. Maybe
> that's just an excuse that shows that I am still thinking selfishly despite
> everything,
That'd be my guess. :-)
> and that I value a certain kind of moral comfort more than I
> value Earth. There is always that uncertainty. But in the end I would
> distrust the compromise more.
Which compromise?
> *After* the Singularity, when opportunities for major altruism are
> comparatively rare because there won't be as many major threats, then I
> expect that your happiness and the happiness of your immediate companion
> network will be the flower that most needs tending. Today the flower that
> most needs tending is the Singularity.
Replace Singularity with Second Coming and you sound like a Christian
zealot. As is, you sound like a Singularity zealot. I'm sure you're
convinced--as all zealots are--that your cause is worthy of zealotry,
but I just don't think it's healthy for either the cause or the
zealot. Most people--even those dropping like flies today--are happy
with the way things are. If they get dramatically better some day,
then that's great, but they don't really want you sacrificing your
life to save them.
And I don't relish the thought of a singularity brought on by an AI
created by someone so caught up in the effort that he never got around
to experiencing the joys of life. Quality of life matters more to me
than quantity. I'd rather see more organic progress toward the
singularity, progress that incorporates a broader perspective on the
experience of life.
-Dave