From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Wed Apr 24 2002 - 21:36:48 MDT
Dave Sill wrote:
>
> "Eliezer S. Yudkowsky" <sentience@pobox.com> wrote:
> >
> > Well, from an altruistic perspective, I don't think that the absolute
> > confidence matters. What matters is not whether a Singularity is 95% likely
> > or 5% likely, but whether your personal efforts can bump that up to 95.001%
> > or 5.001%. From an altruistic perspective this is still playing for stakes
> > overwhelmingly higher than your own life; 0.001% ~ 60,000 lives. (1 day ~
> > 150,000 lives.)
>
> If altruism weren't one of J R Molloy's useless hypotheses, it should
> have been. All motivation is ultimately selfish. Caring more about the
> welfare of others is irrational, if not insane.
If you were really a selfish rationalist, you would cleverly remain silent
and let altruistic memes propagate, thus maximizing the number of people
working to your benefit. You certainly would not feel the need to argue the
morality of selfishness in public forums. Say what you like about rational
selfishness; it can't consistently propagate itself as a meme.
> > In practice, this means that I try to minimize my attachment to personal
> > gain. I try to, whenever I imagine a conflict of interest, imagine myself
> > doing the altruistic thing. It doesn't matter whether this hurts
> > productivity; it's necessary to maintain cognitive integrity.
>
> So your motivation is maintaining your cognitive integrity, not being
> altruistic.
No, cognitive integrity is more important than short-term productivity
because huge efforts create only the illusion of accomplishment unless they
are the *right* efforts. Cognitive integrity can still be a means to an
end, but it is the part of you that determines what you *think* your ends
are, so compromising it risks everything.
> > I don't think I'm immediately miserable, and I certainly don't try to make
> > myself miserable, but I also pass up on certain kinds of happiness - forms
> > of fun that I think involve unnecessary risk or consume more in time than
> > they pay back in energy.
>
> Avoiding unnecessary risk is selfish, as is avoiding time-wasting
> activities.
Avoiding unnecessary risk to the self is a convergent subgoal of altruism
and selfishness. This is the only body and mind which I control, so risking
them risks my entire potential effect on the future.
> > I think that other people can and should sculpt themselves into thinking
> > like this, but I wouldn't want people to burn themselves out trying to do it
> > too fast. (Under a year is probably too fast.)
>
> Right, a burnt-out altruist is an ineffective altruist. So you have to
> take care of yourself in order to be able to help others.
Right, but "take care of yourself" as a subgoal of helping others is
differently shaped from "take care of yourself" as an end in itself. For
example, I cannot afford to take care of myself in ways that threaten to
block my ability to help others.
> > I think it's more important *to the Singularity* for me to try and get this
> > modified primate brain to support altruism than for me to try and squeeze
> > the maximum possible amount of work out of it in the short term. Maybe
> > that's just an excuse that shows that I am still thinking selfishly despite
> > everything,
>
> That'd be my guess. :-)
>
> > and that I value a certain kind of moral comfort more than I
> > value Earth. There is always that uncertainty. But in the end I would
> > distrust the compromise more.
>
> Which compromise?
The compromise of cognitive integrity.
> > *After* the Singularity, when opportunities for major altruism are
> > comparatively rare because there won't be as many major threats, then I
> > expect that your happiness and the happiness of your immediate companion
> > network will be the flower that most needs tending. Today the flower that
> > most needs tending is the Singularity.
>
> Replace Singularity with Second Coming and you sound like a Christian
> zealot.
You live in a free country built by people who were willing to die for
freedom and often did. People can be dedicated to good causes or bad causes
- it depends on your cognitive integrity, among other things. That some
dedicated people are dedicated to evil causes does not make dedication, in
itself, wrong.
> As is, you sound like a singularity zealot. I'm sure you're
> convinced--as all zealots are--that your cause is worthy of zealotry,
> but I just don't think it's healthy for either the cause or the
> zealot. Most people--even those dropping like flies today--are happy
> with the way things are. If they get dramatically better some day,
> then that's great, but they don't really want you sacrificing your
> life to save them.
If 90% of them want me to butt out, which I doubt, then the 10% who remain
are still clientele enough for me - not to mention the (*)illions of
sentients who will follow them.
> And I don't relish the thought of a singularity brought on by an AI
> created by someone so caught up in the effort that he never got around
> to experiencing the joys of life.
Like I said: different people, different joys, and nobody gets to be a
complete human being this side of the Singularity.
-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence