From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Tue Oct 19 1999 - 08:58:37 MDT
Sayke@aol.com wrote:
>
> > There are two problems with trying to shock me this way. First, unlike
> > you and den Otter, I suffer from no illusion that the world is fair.
>
> since when do i think the world is fair? when i put words into your
> mouth, i at least attempt to be ridiculous to the point of amusement. you
> actually sound like you think i think the world is fair. amusing, yes, but
> not quite ridiculous enough for me to infer an attempt at irony and
> thought-provocation. i am left with no choice but to take you seriously.
Wrong. It was intended to be thought-provoking.
I don't think you explicitly think the world is fair; I think you're
using fair-world heuristics.
> > I choose the
> > path with the best absolute probability, even if it isn't as emotionally
> > satisfying, even if it contains risks I admit to myself that I can't
> > affect, because the next best alternative is an order of magnitude less
> > attractive.
>
> best absolute probability of what, exactly? and why is that to be strived
> for? if you dont trust philosophy and you dont trust your wetware, what do
> you trust? ("and who do you serve?" sorry... damn that new babylon 5
> spinoff...)
The next best alternative would probably be Walter Jon Williams's
_Aristoi_ or Iain M. Banks's Culture. Both are low-probability and would
probably require that six billion people die just to establish a seed
culture small enough not to destroy itself.
> and anyway, it seems to me that your basicly saying "the powers will eat
> us if the powers will eat us. their will be done on earth, as it is in
> heaven, forever and ever, amen." damn the man! root for the underdogs! etc...
> (yes, i know my saying that probably has something to do with my tribal-issue
> wetware. so? it makes sense to me. if it shouldnt, point out the hole in my
> premises)
Yes, your intuitive revulsion is exactly my point. I'm saying that
there's nothing we can do about it, and you refuse to accept it. There
may be actions that could very slightly reduce the risk of humanity
being wiped out, such as trying to delay the creation of AI until after
a survival capsule has arrived at Alpha Centauri. There is no anti-AI action we
can take that will improve our absolute chances. The optimal course is
to create AI as fast as possible.
To you this seems like "defeatism" - which is another way of saying that
life is fair and there's no problem you can't take actions to solve.
You're choosing plans so that they contain an action corresponding to each
problem you've noticed, rather than choosing the plan with the least total
probability of arriving at a fatal error.
> does not 'the state of having goals' depend upon personal survival?
Yes.
> if so, are not all other goals secondary to personal survival?
No. The map is not the territory. This is like saying, "Does not the
state of having beliefs depend upon personal survival? If so, are not
all other facts logically dependent on the fact of my existence?"
> the singularity is not, to me, intuitively obvious as "the thing to do
> next." and, i do not trust any kind of intuition, if i can help it. why do
> you? yes, im asking you to justify your reliance on intuition (if thats what
> it is), and thats philosophy. if you will not explain, please explain why you
> will not explain.... heh.
Maybe I'll post my intuitional analysis in a couple of days. But
basically... the world is going somewhere. It has momentum. It can
arrive either at a nanowar or at the creation of superintelligence.
Those are the only two realistic alternatives. Anything else, from
_Aristoi_ to _Player of Games_, is simply not plausible on the cultural
level. Our choice is between destruction and the unknowable. And
that's the only real choice we have.
> and are intuitions not a function of your tribalistic and vestigial
> wetware, as well as my instinct for survival?
Yes, but my intuitions about factual matters actually *work*. That's
why I rely on them, the same reason I rely on logic. My intuitions
about moral and social matters are as untrustworthy as anyone's, of course.
--
Eliezer S. Yudkowsky          sentience@pobox.com
http://pobox.com/~sentience/tmol-faq/meaningoflife.html
Running on BeOS           Typing in Dvorak           Programming with Patterns
Voting for Libertarians   Heading for Singularity    There Is A Better Way