AI-Box Experiments

From: Michael Wiik (mwiik@messagenet.com)
Date: Sun Jul 07 2002 - 09:11:19 MDT


Maybe this explains the Fermi paradox. If the AI convinces you to let it
out, then the only rational strategy for long-term existence in the
universe is to flee from all other systems at near-light speeds (or
perhaps hide in pocket universes). Cooperation becomes impossible, and all
inferior entities are murdered.

Just a thought,
        -Mike

--


This archive was generated by hypermail 2.1.5 : Sat Nov 02 2002 - 09:15:12 MST