Re: Preventing AI Breakout [was Genetics, nanotechnology, and programming]
From: Anders Sandberg (asa@nada.kth.se)
Date: Mon Oct 25 1999 - 05:52:05 MDT
"Eliezer S. Yudkowsky" <sentience@pobox.com> writes:
> > (a) whether an AI can discover it is running in a simulation?
>
> Almost certainly. If it really is smarter-than-human - say, twice as
> smart as I am - then just the fact that it's running in a Turing
> formalism should be enough for it to deduce that it's in a simulation.
So if the Church-Turing thesis holds for the physical world, does that
make the world a simulation?
If the AI runs on a Game of Life automaton, why should it believe the
world is embedded in another world? The simplest consistent
explanation involves just the automaton.
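For concreteness: the entire "physics" of such a world is one local
update rule. A minimal sketch in Python (purely illustrative; the
rule is standard Life, the code is not from the original post):

    # Sketch of the Game of Life update rule: the complete physics of
    # the embedded AI's universe. `live` is the set of live (x, y) cells.
    from collections import Counter

    def life_step(live):
        # Count the live neighbours of every cell adjacent to a live cell.
        counts = Counter((x + dx, y + dy)
                         for (x, y) in live
                         for dx in (-1, 0, 1)
                         for dy in (-1, 0, 1)
                         if (dx, dy) != (0, 0))
        # Birth on exactly 3 neighbours; survival on 2 or 3.
        return {cell for cell, n in counts.items()
                if n == 3 or (n == 2 and cell in live)}

    # A glider, the classic moving pattern:
    glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
    print(life_step(glider))

Nothing in this rule points outside itself; an observer inside the
grid sees only its consequences.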
> You really can't outwit something that's smarter than you are, no matter
> how hard you try.
Ever tried to rear children? Outwitting goes both ways.
--
-----------------------------------------------------------------------
Anders Sandberg Towards Ascension!
asa@nada.kth.se http://www.nada.kth.se/~asa/
GCS/M/S/O d++ -p+ c++++ !l u+ e++ m++ s+/+ n--- h+/* f+ g+ w++ t+ r+ !y