>
> Bonus question one: Suppose you have a time machine that can ONLY
> convey the information as to whether or not the Singularity will happen.
> If you change your plans as a result of the indicator, it resends.
> This is one bit of information and thus can be modulated to convey
> messages from the future. How do you negotiate?
Well... keep altering your plans until the indicator shows no Singularity?
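The "one bit, resent on every plan change" setup amounts to a channel: if you commit in advance to condition your plans on each answer, the future side can use successive answers as the bits of a longer message. A minimal sketch of that decoding loop, where `oracle` is a purely hypothetical stand-in for the indicator (nothing like it exists, obviously):

```python
def decode_message(oracle, n_bits):
    """Read an n-bit message out of a one-bit indicator.

    Protocol: before query k, you commit to changing your plans based
    on the answer, so the indicator re-sends a fresh bit each round.
    `oracle(k)` is a hypothetical callable standing in for the time
    machine: True = "Singularity happens", False = "it doesn't".
    """
    bits = []
    for k in range(n_bits):
        bits.append(oracle(k))  # one bit per query/replan round trip
    return bits

# Toy stand-in for the future side, which wants to transmit 1011:
message = [True, False, True, True]
received = decode_message(lambda k: message[k], len(message))
```

The negotiation question then reduces to agreeing (across time) on what each bit position means, which is the hard part the code glosses over.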
>
> Bonus question two: In both cases above, it is necessary to plausibly
> threaten not to create a Singularity. The only other option, in the
> long run, is exterminating the human race. This has to be able to
> plausibly happen, either as a logical consequence or as an alternate
> future. How do you force yourself to destroy the Earth?
Dunno; I'm still working on my logic skills.
>
> ==
>
> If a Singularity is a good thing, why haven't earlier Singularities sent
> robot probes to help it happen? If SIs commit suicide, why isn't the
> whole Universe full of mortals?
I don't care if people say there is some probability of other
intelligent life out there; I see no evidence of past Singularities or
other civilizations, so I don't assume there have been any.
>
> ==
>
> How can you fight your future self, who automatically knows all of your
> plans, including the ones you're making right now? What if the future
> self is a transhuman?
Suicide?
PS: Are my answers worth anything at all? This is my first post on the list.
>
> ==
>
> Is there any way to oppose a Power running a simulation of you?
Are you really just a simulation? If not, would the simulation behave
exactly as you do? I think not.
>
> --
> sentience@pobox.com Eliezer S. Yudkowsky
> http://pobox.com/~sentience/AI_design.temp.html
> http://pobox.com/~sentience/sing_analysis.html
> Disclaimer: Unless otherwise specified, I'm not telling you
> everything I think I know.