FW: Singularity vs. Ronald McDonald

From: Ben Goertzel (ben@webmind.com)
Date: Mon Jul 30 2001 - 08:44:14 MDT


> Date: Sun, 29 Jul 2001 11:27:57 EST
> From: "Stirling Westrup" <sti@cam.org>
> Subject: Re: Big Bang is Bunk
>
> Eliezer S. Yudkowsky wrote:
>
> > The Singularity is defined to occur as soon as any greater-than-human
> > intelligence comes into existence - big or small. It has to be genuine,
> > hardware transhumanity, not just humans put together in interesting
> > shapes, but that's it.
>
> By whose definition?

Not mine.

Counterargument 1)

A transhuman intelligence could conceivably come about *without* having
far-reaching effects for humanity as a whole.

An example of how this could happen is given in Stanislaw Lem's story
about the AI "Honest Annie," which, as soon as it became superintelligent,
cut off its communications with humans.

Counterargument 2)

It's not inconceivable either that we could create superb nanotechnology,
molecular assemblers, etc., but discover that, for complex reasons of
physical law and complexity science, there is NO WAY TO MAKE AN
INTELLIGENCE SIGNIFICANTLY GREATER THAN HUMAN INTELLIGENCE. I think this
is immensely unlikely, but it's not impossible. In this case we could
reshape the solar system into a perfect image of Ronald McDonald's butt,
create free food for all and infinite molecularly-induced orgasmic bliss
on command, travel between universes on quark-powered Razor scooters, and
do all sorts of other funky-cool Singularity-ish stuff, but NOT create
superhuman intelligence...

I happen to agree with Eli that the creation of transhuman intelligence
is both the most likely path to a Singularity, and very likely to lead to
a Singularity if it occurs. But these statements can't reasonably be
posited with 100% certainty, and hence they're not the *definition* of
the technological Singularity.

-- Ben G


