From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Mon Sep 07 1998 - 22:22:10 MDT
http://pobox.com/~sentience/sing_analysis.html
Singularity Analysis
A Series of Educated Guesses
A few conclusions drawn from "Coding A Transhuman AI".
About 65K.
I tried to summarize from _Coding_, so you can hopefully follow along if
you're willing to take my word for a few summarized assertions. Deep
understanding or persuasion probably requires that you bite the bullet and
read _Coding_.
I will post highlights in response to some comments.
Current Table of Contents:
1. AI: Human-equivalence and transhumanity.
Trajectory analysis: Does human AI imply superhuman AI?
In response to: Max More, Hanson.
2. Zone Barriers.
A canonical list of shields from the Singularity.
In response to: Vernor Vinge, Nielsen.
3. Superintelligent motivations.
Mostly summarizes _Coding_'s sections on goals. Some arguments.
Necessary background for item 4.
4. Unknowability: Evaporation of the human ontology.
The complexity barrier and the simplicity barrier.
In response to: Damien Sullivan, Bostrom, Hanson, More, Nielsen.
I don't plan to post from _Motivations_ or _Unknowability_.
--
sentience@pobox.com         Eliezer S. Yudkowsky
http://pobox.com/~sentience/AI_design.temp.html
http://pobox.com/~sentience/sing_analysis.html
Disclaimer: Unless otherwise specified, I'm not telling you everything I think I know.