"Michael S. Lorrey" wrote:
>
> Yes, but I think with the above proposal, the altruist motivator can be
> maintained. What do you think?
I think that there are conditions under which a suboptimal humanity might be preserved: if killing us all off would scare off other races from their own Singularity. I don't know how the other races would know: time cameras, reasoning from the Great Filter, or just following the same logic. Even if only one in a septillion races manages this trick, it's entirely logical to devote an octillionth of available resources to running humans.
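To spell the arithmetic out (a rough sketch of my own; the payoff V ~ R, i.e. that one scared-off race's Singularity is worth on the order of all available resources, is my assumption, not something stated above):

    # Rough expected-value sketch with illustrative numbers.
    # Assumption (mine): scaring off one other race's Singularity
    # forfeits a value V on the order of the total resources R.

    p = 1e-24         # "one in a septillion" chance extinction scares off a race
    fraction = 1e-27  # "an octillionth" of resources spent running humans

    R = 1.0           # normalize total available resources
    V = R             # assumed value of one other race's Singularity

    expected_loss_if_extinguished = p * V  # ~1e-24
    cost_of_preservation = fraction * R    # ~1e-27

    # Under these stand-in numbers, preservation beats extinction
    # by roughly three orders of magnitude.
    print(expected_loss_if_extinguished / cost_of_preservation)  # ~1000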
In this case I would commit suicide to free up my share of the resources, unless even that small action might scare off other races.
The practical imperative involved might be to figure out the logic, on the grounds that only races that figure out the logic are protected by it - although one rather suspects this isn't so, and in *that* case, morally speaking, the Universe would be better off if we failed.
Anyway, I don't think that the Great Filter Paradox allows for races to be extinguished for utility; it doesn't explain why Powers would reach out and deliberately hunt down any hiveminds that managed to avoid a Singularity.
--
sentience@pobox.com    Eliezer S. Yudkowsky
http://pobox.com/~sentience/AI_design.temp.html
http://pobox.com/~sentience/singul_arity.html
Disclaimer: Unless otherwise specified, I'm not telling you everything I think I know.