From: Samantha Atkins (samantha@objectent.com)
Date: Sat Mar 09 2002 - 05:09:16 MST
Eliezer S. Yudkowsky wrote:
> Anders Sandberg wrote:
>
>>I think Greg Egan has done the best "message within pi" trick so far in
>>_Diaspora_. The message of the Transmuters is both obvious (in a way),
>>informative and suggests a way out of the system. I think I will use
>>something similar when I start creating universes - give the inhabitants
>>a chance to get out if they poke deep enough.
>>
>
> If you wish to create a universe with truly sentient inhabitants that suffer
> and die until they perform some physics trick, then you are evil and must be
> stopped.
Says who? By what universal measure? You create beings that
develop sentience and evolve to ever greater capacity until they
transcend their substrate. Who says that you can increase the
variety of sentient beings, and especially of super-intelligent
beings, in any other manner? Perhaps they need to cook from the
seeds that you are capable of creating. Who says that it is
possible to evolve self-improving scenarios without some
possibility of failure and suffering acting as a driver? I
would like to believe there are other ways and work toward
creating them, but I am not about to claim that all beings who
could not find such a way are out-and-out evil, especially
since it is possible that the suffering and death are not a
permanent situation in a "sim" universe. Suffering and death
per se in a created universe do not prove the creator of said
universe is evil.
- samantha
This archive was generated by hypermail 2.1.5 : Sat Nov 02 2002 - 09:12:52 MST