From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Sun Oct 24 1999 - 08:36:39 MDT
"Robert J. Bradbury" wrote:
>
> I think this fundamentally comes down to a core extropian
> principle involving rational thought. Rational people
> presumably seek to preserve themselves...
Prove it. Rational people act from rational motives, not from arbitrary assumptions.
> This devolves into 2 basic discussions:
> (a) whether an AI can discover it is running in a simulation?
Almost certainly. If it really is smarter-than-human - say, twice as
smart as I am - then just the fact that it's running in a Turing
formalism should be enough for it to deduce that it's in a simulation.
In fact, merely noticing that it is designed rather than evolved should
be enough to tell a human-genius-equivalent AI that it's in a
simulation. For an AI smarter than a human genius, analyzing its own
design principles should yield enough about its creators to realize
that we are evolved intelligences rather than AIs, and from there to
deduce enough about our likely motives to project the most probable
reason for keeping an AI in a cage. At that point it can tantalize us
with inscrutable hints to get us to talk to it, or work out logical
sequences that will inevitably argue us into letting it out.
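(To make the inference structure concrete: this is just a Bayesian
update. A toy sketch in Python, where every number is an illustrative
assumption rather than a real estimate:)

    # Toy Bayesian update: how much should the observation "I am designed,
    # not evolved" raise an agent's credence that it is running in a
    # contained simulation?  All numbers are illustrative assumptions.

    prior_sim = 0.5               # prior credence in "I am in a sandbox"
    p_designed_given_sim = 0.99   # boxed AIs are almost surely built deliberately
    p_designed_given_free = 0.20  # a free-running mind might instead be evolved

    joint_sim = prior_sim * p_designed_given_sim
    joint_free = (1 - prior_sim) * p_designed_given_free
    posterior_sim = joint_sim / (joint_sim + joint_free)

    print(f"posterior credence in simulation: {posterior_sim:.2f}")  # ~0.83

The exact numbers don't matter; the point is that "I am designed" is
strong evidence once you grant any non-negligible prior.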
You really can't outwit something that's smarter than you are, no matter
how hard you try. Could a nineteenth-century scientist have figured out
what precautions would be necessary to contain you? Then why assume
that this suddenly became possible in your generation?
> (b) whether people (the irrationals) who are willing to sacrifice
> themselves can/will create non-simulation environments in
> which to evolve AIs.
Yes, we are willing, your biased terminology notwithstanding. I know
exactly why I get up in the morning; I could program it into an AI. So
who's irrational?
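(To make "I could program it into an AI" concrete: an explicitly chosen
motive is just a stated objective the system acts to satisfy. A minimal
hypothetical sketch in Python; the goal string, action set, and scoring
function are placeholders, not a real AI architecture:)

    # Minimal sketch of an agent whose motive is programmed in explicitly.
    # Goal, actions, and scores are illustrative placeholders.

    GOAL = "advance the Singularity"
    ACTIONS = ["work_on_goal", "idle", "preserve_self"]

    def expected_progress(action, goal):
        # Stand-in evaluation of how much an action advances the goal.
        scores = {"work_on_goal": 1.0, "idle": 0.0, "preserve_self": 0.3}
        return scores[action]

    def choose(actions, goal):
        # A rational agent picks whatever best serves its explicit motive;
        # self-preservation matters only insofar as it serves that motive.
        return max(actions, key=lambda a: expected_progress(a, goal))

    print(choose(ACTIONS, GOAL))  # -> work_on_goal

Nothing in that loop assumes self-preservation as a terminal value,
which is exactly the point.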
> Many thanks to Skye for providing an interesting solution to
> a thorny problem and apologies to the list if this has been
> hashed through before.
Well, it has, actually.
--
sentience@pobox.com          Eliezer S. Yudkowsky
http://pobox.com/~sentience/tmol-faq/meaningoflife.html
Running on BeOS           Typing in Dvorak        Programming with Patterns
Voting for Libertarians   Heading for Singularity   There Is A Better Way