Re: Carl Sagan's Contact (was: My Review A.I. the Movie (total spoiler I hop...

From: Richard Steven Hack (richardhack@pcmagic.net)
Date: Sat Mar 09 2002 - 08:12:50 MST


At 04:09 AM 3/9/02 -0800, you wrote:

>Eliezer S. Yudkowsky wrote:
>
>>If you wish to create a universe with truly sentient inhabitants that suffer
>>and die until they perform some physics trick, then you are evil and must be
>>stopped.
>
>
>Says who? By what universal measure? You create beings that develop
>sentience and evolve to ever greater capacity until they transcend their
>substrate. Who says that you can increase the variety of sentient beings
>and especially super-intelligent beings, in any other manner? Perhaps
>they need to cook from the seeds that you are capable of creating. Who
>says that it is possible to evolve self-improving scenarios without some
>possibilities of failure and suffering that act as drivers? I would like
>to believe there are other ways and work toward creating them but I am not
>about to claim that all beings who could not find such a way are out and
>out evil. Especially since it is possible that the suffering and death is
>not a permanent situation in a "sim" universe. Suffering and death per se
>in a created universe do not prove the creator of said universe is evil.
>
>- samantha
>

Well, I understand what Eliezer is saying. I, too, have frequently felt
that if there is a God, he is a major pain for setting things up the way
they are. The Gnostic concept, of course, is that the universe was created
by a "higher" God (or Goddess) than the "blind idiot god" that created our
world.

Having said that, it may turn out that you are right, too. Maybe this is
the only way to set up the situation.

Richard Steven Hack
richardhack@pcmagic.net

