From: D. den Otter (neosapient@transtopia.org)
Date: Fri Nov 12 1999 - 17:08:09 MST
----------
> From: Eliezer S. Yudkowsky <sentience@pobox.com>
> I would tend to lump together group minds, human-level AIs,
> human/computer hybrids, neurohacks, extra-neuron augments, and
> genetically engineered geniuses as being "us". Not really that much of
> a difference. The basic style of cognition is pretty much the same.
> And we'll all share the same fate when the Powers show up, whatever the
> heck that hidden variable contains.
Yep. So...work hard to become that Power.
> So far, den Otter would seem to be the only person with the
> self-consistency to admit that not wanting AIs to transcend implies not
> wanting *anyone* else to transcend.
Actually, it implies not wanting anyone or anything else
to have a head start, an "unfair" advantage, to transcend
before you do. Since it is unlikely that anyone can become
a Power all by himself, the synchronized ascension of the
project team is the obvious solution.