From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Thu Mar 25 1999 - 17:06:13 MST
"Michael S. Lorrey" wrote:
>
> Yes, but I think with the above proposal, the altruist motivator can be
> maintained. What do you think?
I think there are conditions under which a suboptimal humanity
might be preserved: if killing us all off would scare other races
away from their own Singularities. I don't know how the other races
would find out; time cameras, reasoning from the Great Filter, or just
following the same logic. Even if only one in a septillion races
manages this trick, it's entirely logical to devote an octillionth of
available resources to running humans.
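To make that arithmetic explicit - treating the figures as round powers
of ten, one in a septillion as 10^-24 and an octillionth as 10^-27:

    certain cost of keeping humanity running    ~ 10^-27 of resources
    chance the deterrence actually matters      ~ 10^-24
    break-even value of not scaring off a race  ~ 10^-27 / 10^-24 = 10^-3 of resources

So preservation pays as long as one other race's Singularity is worth
more than a thousandth of available resources.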
In this case I would commit suicide to free up my share of the
resources, unless even that small action might scare off other races.
The practical imperative might be to figure out the logic ourselves, on
the grounds that only races that figure out the logic are protected by
it - although one rather suspects that isn't how it works, and if it
is, then morally speaking the Universe would be better off if we failed.
Anyway, I don't think that the Great Filter Paradox allows for races to
be extinguished for utility; it doesn't explain why Powers would reach
out and deliberately hunt down any hiveminds that managed to avoid a Singularity.
I really do think that when all is said and done, predicting our
treatment at the hands of the Powers is a fifty-fifty coinflip. I just
don't know. What I do know is that the near-future parts of the
probability branches indicate that, once the preconditions are taken
into account, this coinflip route to a happy outcome is an order of
magnitude more probable than all the other happy outcomes put together.
--
sentience@pobox.com              Eliezer S. Yudkowsky
http://pobox.com/~sentience/AI_design.temp.html
http://pobox.com/~sentience/singul_arity.html
Disclaimer: Unless otherwise specified, I'm not telling you
everything I think I know.