From: Justin Jones (frjones@swbell.net)
Date: Tue Jan 12 1999 - 01:13:39 MST
----------
> From: Eliezer S. Yudkowsky <sentience@pobox.com>
> To: extropians@extropy.org
> Subject: Singularity Mind-Benders
> Date: Monday, January 11, 1999 7:18 PM
>
> Fun Problems for Singularitarians:
>
> ==
>
> If mortal life is totally meaningless, it would be logical to
> exterminate mortals for their spare atoms. Mortals, knowing this, will
> refuse to create Singularities. If mortals could bargain with the
> Singularity, it would obviously be to the Singularity's advantage to set
> aside a quadrillionth of computing power for Permutation-City-style
> accommodations, in return for existing at all. But we can't bargain with
> the Singularity until after we've created it and our hold is gone.
> Bearing this in mind, how can you bind the Singularity to the bargain?
> What is the Singularity's logical course of action?
Outwit the foolish mortals.
>
> Bonus question one: Suppose you have a time machine that can ONLY
> convey the information as to whether or not the Singularity will happen.
> If you change your plans as a result of the indicator, it resends.
> This is one bit of information and thus can be modulated to convey
> messages from the future. How do you negotiate?
Well... alter your actions until you see no singularity?
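The "one bit, repeated" idea can be made concrete: commit publicly to a plan of the form "act on bit k of the message," read the indicator, repeat. A toy Python sketch of that decomposition (my own illustration, not from the original post; the oracle here is just a stand-in for the fictional indicator):

```python
# Toy sketch: a single repeatable yes/no indicator, queried once per
# committed plan, reassembles into a multi-bit message from the future.
# The "oracle" below is a hypothetical stand-in for the time machine.

def one_bit_oracle(message_bits):
    """Future side: answers each committed query with the next message bit."""
    for bit in message_bits:
        yield bit  # indicator fires (1) or stays silent (0)

def receive(oracle, n_bits):
    """Present side: read n_bits one-bit answers and reassemble the message."""
    return [next(oracle) for _ in range(n_bits)]

msg = [1, 0, 1, 1]
assert receive(one_bit_oracle(msg), 4) == msg
```

So even a one-bit channel negotiates fine, as long as both sides agree on the commitment schedule in advance.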
>
> Bonus question two: In both cases above, it is necessary to plausibly
> threaten not to create a Singularity. The only other option, in the
> long run, is exterminating the human race. This has to be able to
> plausibly happen, either as a logical consequence or as an alternate
> future. How do you force yourself to destroy the Earth?
Dunno, still working on my logical skills.
>
> ==
>
> If a Singularity is a good thing, why haven't earlier Singularities sent
> robot probes to help it happen? If SIs commit suicide, why isn't the
> whole Universe full of mortals?
I don't care if people say there is a certain probability of other
intelligent life out there; I see no evidence of past singularities or
other civilizations, so I don't assume there have been any.
>
> ==
>
> How can you fight your future self, who automatically knows all of your
> plans, including the ones you're making right now? What if the future
> self is a transhuman?
Suicide?
PS: Are my answers worth anything at all? This is my first post on the list.
>
> ==
>
> Is there any way to oppose a Power running a simulation of you?
Are you really just a simulation? If not, would the simulation behave
exactly as you do? I think not.
>
> --
> sentience@pobox.com Eliezer S. Yudkowsky
> http://pobox.com/~sentience/AI_design.temp.html
> http://pobox.com/~sentience/sing_analysis.html
> Disclaimer: Unless otherwise specified, I'm not telling you
> everything I think I know.
This archive was generated by hypermail 2.1.5 : Fri Nov 01 2002 - 15:02:47 MST