From: Dave Sill (extropians@dave.sill.org)
Date: Fri Apr 26 2002 - 18:23:09 MDT
"Eliezer S. Yudkowsky" <sentience@pobox.com> wrote:
>
> If you were really a selfish rationalist, you would cleverly remain
> silent and let altruistic memes propagate, thus maximizing the
> number of people working to your benefit.
The trouble with that is that most people who claim to believe in
altruism are really selfish, so they don't benefit me; they often
expect things in return and meddle in my affairs "for my own good".
> You certainly would not feel the need to argue the morality of
> selfishness in public forums.
But I *am* a selfish rationalist and I *do* feel the need to discuss
it publicly. What makes you think I wouldn't?
> Say what you like about rational selfishness; it can't consistently
> propagate itself as a meme.
Selfishness needs promoting about as much as gravity does.
Understanding selfishness is important, though, so it's worth talking
about.
> > > In practice, this means that I try to minimize my attachment to
> > > personal gain. I try to, whenever I imagine a conflict of
> > > interest, imagine myself doing the altruistic thing. It doesn't
> > > matter whether this hurts productivity; it's necessary to
> > > maintain cognitive integrity.
> >
> > So your motivation is maintaining your cognitive integrity, not
> > being altruistic.
>
> No, cognitive integrity is more important than short-term
> productivity because huge efforts create only the illusion of
> accomplishment unless they are the *right* efforts.
Obviously, any decision made without cognitive integrity is in danger
of being wrong and counterproductive. So it's in your best interest to
maintain it. That's true whether your goal is ultimately to better
yourself or others with your efforts.
> > Avoiding unnecessary risk is selfish, as is avoiding time-wasting
> > activities.
>
> Avoiding unnecessary risk to the self is a convergent subgoal of
> altruism and selfishness.
Same as with cognitive integrity.
> > As it is, you sound like a singularity zealot. I'm sure you're
> > convinced--as all zealots are--that your cause is worthy of
> > zealotry, but I just don't think it's healthy for either the cause
> > or the zealot. Most people--even those dropping like flies
> > today--are happy with the way things are. If they get dramatically
> > better some day, then that's great, but they don't really want you
> > sacrificing your life to save them.
>
> If 90% of them want me to butt out, which I doubt, then the 10% who
> remain are still clientele enough for me - not to mention the
> (*)illions of sentients who will follow them.
I certainly doubt that 90% figure--I think it's much higher. But who
are you trying to kid? Even if you were the only person in the world
who understood the need to promote the right kind of singularity,
you'd still be completely dedicated to it, wouldn't you? If not, then
how do you justify your current efforts? Where's your mandate? What's
the magic number of "clientele" required for you to pursue your
efforts? What proof do you have that you have that many clients? What
of the vast majority who don't approve of your efforts? How do you
discount them?
> > And I don't relish the thought of a singularity brought on by an
> > AI created by someone so caught up in the effort that he never got
> > around to experiencing the joys of life.
>
> Like I said: different people, different joys, and nobody gets to be
> a complete human being this side of the Singularity.
You--one who hasn't even enjoyed the basic experiences of life--think
you're in a position to declare that nobody who has ever lived has
been a "complete human"? That's absurd. It might be true that life
post-singularity is better--though that's not guaranteed--but to call
our current lives incomplete is outrageous.
-Dave