From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Mon Dec 24 2001 - 00:48:03 MST
I guess I'd better at least do mine. I'm also tossing in a few other
definitions, including some from the glossaries of "General Intelligence
and Seed AI" and "Creating and Friendly AI", as long as I'm at it. I
don't know the attributions for some of those, so feel free to contribute
them.
BTW, is there a reason why ExI can't borrow Anders Sandberg's glossary?
--

COMPUTRONIUM: Matter that has been transformed from its natural state into a computer of the maximum physically achievable efficiency. (A true Extropian would argue that this <i>is</i> matter's "natural state".) What constitutes "computronium" varies with the level of postulated technology; a rod-logic <gl>nanocomputer</gl> is probably too primitive, since its basic elements consist of hundreds or thousands of atoms. More likely forms of computronium include three-dimensional quantum cellular automata, or exotic forms of matter such as neutronium, Higgsium, and monopolium.

FRIENDLY AI: An AI which is, broadly speaking, one of the good guys; an AI which operates roughly within humanity's moral frame of reference; an AI which has the potential and the will to become at least as philosophically enlightened, from our perspective, as intelligence derived from a human or group of humans; an AI sufficiently advanced to engage in independent real-world planning, which makes human-benefiting, non-human-harming decisions. [<a href="http://singinst.org/friendly/">Eliezer Yudkowsky</a>, 2000.]

GREY GOO: Out-of-control replicating <gl>nanotechnology</gl>; some <a href="http://www.foresight.org/NanoRev/Ecophagy.html">calculations</a> indicate that the entire ecosphere could be consumed within weeks or days (a toy illustration of the exponential arithmetic follows these entries). One of the primary risks threatening the complete destruction of humanity. [Eric Drexler, 1986.] Perhaps an even more dangerous variant is "red goo", or military nanotechnology.

HARD TAKEOFF: A <gl>Singularity</gl> occurring with extreme rapidity, over the course of hours or weeks rather than months or decades. Most hard takeoff scenarios involve Artificial Intelligence, because of the probable ability of an AI to rapidly absorb enormous amounts of computing power, run on basic computing elements with serial speeds of 2GHz (as opposed to 200Hz neurons), and recursively self-improve by rewriting its own source code; however, it is also conceivable that a hard takeoff scenario could develop out of brain-computer interfaces. (The serial-speed comparison is worked out after these entries.)

NANOCOMPUTER: A computer built using <gl>nanotechnology</gl> (manufacturing to molecular specifications). A lower bound on nanocomputing speeds has been set by calculating the speed of an acoustic computer using "rod logics" and messages that travel at the speed of sound; a one-kilogram rod logic, occupying one cubic centimeter, can contain 10^12 CPUs each operating at 1000 MIPS, for a total of a thousand billion billion (10^21) operations per second (see the arithmetic after these entries). Note that rod logics are the nanotech equivalent of vacuum tubes; electronic nanocomputers would be substantially faster. [Eric Drexler, 1986, 1992.]

NANOTECHNOLOGY: As used by venture capitalists, technology which operates on the nanometer (billionth of a meter) scale. As used by transhumanists, technology which uses precise positional control of reactants to mechanically synthesize large-scale structures to exact molecular specifications - "positional chemistry" or "mechanosynthesis". Molecular nanotechnology is distinguished by the observation that, in theory, it can produce virtually any material object, including a duplicate of itself, and can moreover operate on a scale that is small relative to human biology - allowing medical technology verging on total control of biology, including the halting or reversal of aging. [Eric Drexler, 1986, 1992.]
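For anyone who wants to see why exponential replication gets you "weeks or days" in the GREY GOO entry, here is a toy sketch. The seed mass, biosphere mass, and doubling time are illustrative assumptions of mine, not figures taken from the ecophagy paper linked above:

    import math

    # Toy exponential-replication model; all three parameters are assumed
    # for illustration, NOT taken from the Freitas ecophagy calculations.
    seed_mass_kg = 1.0          # assumed initial mass of replicators
    biosphere_kg = 1e15         # assumed order-of-magnitude mass to consume
    doubling_time_hours = 1.0   # assumed replicator doubling time

    doublings = math.log2(biosphere_kg / seed_mass_kg)   # ~50 doublings
    days = doublings * doubling_time_hours / 24
    print("%.0f doublings, roughly %.1f days under these assumptions"
          % (doublings, days))

The point is only that the number of doublings grows logarithmically with the mass to be consumed, so even generous changes to the assumed parameters leave the timescale short.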
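The serial-speed gap mentioned under HARD TAKEOFF, spelled out; the only inputs are the 2GHz and 200Hz figures quoted in the entry:

    # Serial-speed comparison using the figures quoted in the HARD TAKEOFF entry.
    transistor_hz = 2e9   # 2GHz basic computing elements
    neuron_hz = 200.0     # ~200Hz neurons
    print("Serial speed ratio: %.0e" % (transistor_hz / neuron_hz))   # -> 1e+07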
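And the rod-logic arithmetic from the NANOCOMPUTER entry; the CPU count and per-CPU speed are the figures quoted there:

    # Rod-logic throughput using the figures quoted in the NANOCOMPUTER entry.
    cpus = 1e12                      # CPUs in a one-kilogram, one-cc rod logic
    instructions_per_cpu = 1000e6    # 1000 MIPS = 1e9 instructions per second
    print("Total: %.0e operations per second"
          % (cpus * instructions_per_cpu))   # -> 1e+21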
NEUROHACK: A broad term covering most forms of biologically based intelligence enhancement, including brain-computer interfaces, genetic engineering for higher intelligence, addition of extra brain tissue, various proposed neurosurgical methods, et cetera; may also be used to refer to a sufficiently unusual and extreme natural perturbation to cognitive processing. [Eliezer Yudkowsky, 1998.]

SEED AI: An AI designed for recursive self-improvement; that is, improvement followed by another round of improvement at that higher level of intelligence. The theory holds that, rather than building a mind which is superintelligent from the start, only some bounded level of intelligence need be achieved for the AI to become capable of open-ended improvement of its own source code. [Eliezer Yudkowsky <a href="http://singinst.org/GISAI/">1998</a>, <a href="http://singinst.org/seedAI/">2000</a>, <a href="http://singinst.org/seedAI/seedAI.html">2001</a>.]

SINGULARITARIAN: Originally defined by Mark Plus to mean "one who believes the concept of a Singularity", this term has since been redefined to mean "Singularity activist" or "friend of the Singularity"; that is, one who acts so as to bring about a Singularity. [Mark Plus, 1991; <a href="http://sysopmind.com/sing/principles.html">Eliezer Yudkowsky</a>, 2000.]

--              --              --              --              --
Eliezer S. Yudkowsky                          http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence