From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Sat Mar 18 2000 - 21:08:01 MST
"Eliezer S. Yudkowsky" mistakenly wrote:
>
> No, that estimate is definitely incorrect. Using a value of less than
> 10% or more than 70% would be unjustifiable. 30% was "pulled out of the
> air"; I'll happily defend the range itself.
>
> More than 70% would be unjustifiable due to the Fermi Paradox and unknowability.
>
> Since, if we can create a Sysop with specifiable stable goals, we win,
> to assert that the probability is less than 10% would require
> demonstrating that (A) the probability of external goals (and hostile
> ones, at that), or (B) the probability that stable arbitrary goals
> cannot be produced, is above 90% for one or the other, or that their
> product is above 90%; which requires a degree of definite knowledge
> about these issues that nobody possesses. Even if it were possible to
> rationally estimate the resulting "specifiable stable goals"
> probability as being below 10%, which I do not think is the case, it
> would still be absurd to argue for 1%. To say that a 99% probability
> of "no specifiable goals" holds is to imply definite knowledge, which
> neither of us has.
Or in other words, I'm sure den Otter would agree that 70% is a
reasonable upper bound on our chance of success given our current
knowledge (although I'm sure he thinks it's too optimistic). It is
equally possible to set a reasonable upper bound on our chance of failure.
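As a minimal sketch of the arithmetic behind this bound (not in the original post, and assuming for illustration that the two failure modes are independent, with placeholder probability values), the argument can be expressed as:

```python
# Illustrative sketch only: model "specifiable stable goals" as
# succeeding exactly when neither failure mode occurs.
# p_a: probability of (A) external (hostile) goals -- placeholder value
# p_b: probability that (B) stable arbitrary goals cannot be produced
# Independence of A and B is an assumption made for this sketch.

def success_probability(p_a: float, p_b: float) -> float:
    """P(success) assuming independent failure modes A and B."""
    return (1.0 - p_a) * (1.0 - p_b)

# Under moderate uncertainty, success stays well above 10%:
assert success_probability(0.5, 0.5) == 0.25

# Claiming P(success) < 10% forces (1 - p_a) * (1 - p_b) < 0.10,
# i.e. a combined failure probability above 90% -- which is the kind
# of definite knowledge the post argues nobody possesses:
assert success_probability(0.9, 0.9) < 0.10
```

The point of the sketch is only that extreme conclusions (99% failure) require extreme inputs: with both failure probabilities at 50%, success still comes out at 25%.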
-- 
sentience@pobox.com              Eliezer S. Yudkowsky
http://pobox.com/~sentience/beyond.html
Member, Extropy Institute        Senior Associate, Foresight Institute
This archive was generated by hypermail 2.1.5 : Fri Nov 01 2002 - 15:27:30 MST