Eliezer S. Yudkowsky wrote:
> Otter and I appear to agree on the following:
<snip> Eliezer, saying Otter instead of denOtter is as incorrect
as referring to me as ike ones, or to you as kowsky. But that isn't
why I posted.
Seems like in the race between strong nanotech and AI, regardless
of which one happens first, the other can't be far behind, for each
would facilitate the other. Which leads me to this question:
Are there many possible scenarios regarding the outcome of
the Singularity, and is there a matrix or something that maps out
all the possible outcomes?
Reason: I have been trying to imagine a slow Singularity, wherein
everything changes, but much of humanity notices nothing for
some time afterwards.
Consider an historical singularity, lower case s: the stock market
crash of 1929. On the day of that crash (October 29) many
fortunes were wiped out, but my great grandmother's journal
recorded nothing unusual on that day, or the week following,
or the month following. They didn't notice; they didn't even
know anything untoward had happened.
Likewise, if a Singularity happened in the next 5 years, most
of humanity wouldn't know or care for some time afterward,
although you and I would face an immediate crisis.
Next, we tend to think of a Singularity that pretty much takes
everyone by surprise. [Well, not you Eliezer, you will likely have
written the thing. {8^D ] But what if we had a nearly universally
anticipated Singularity? Here's how that would work: we figure
out a way to simulate human intelligence, sort of, so that we have
a machine that can think but isn't very fast. Set it to witty
sarcasm mode and it thinks of a snappy comeback after about a
week. Suppose we write a human mind-simulator but find it
takes a jillion CPU cycles to figure out the punchline of a joke,
for instance.
Then everyone can extrapolate to see that if Moore's Law holds
for, say, 15 more years, then using *this* algorithm we would have
a pretty good silicon companion. We might well realize that if we
just manufactured a few billion such processors, we could effectively
double the number of brains working on humanity's problems,
and it is easy to see we could start an intellectual avalanche.
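A quick back-of-envelope, with assumed numbers rather than anything
official: if performance doubles every 18 months or so, then 15 years
is about 10 doublings, a factor of roughly a thousand. A tiny Python
sketch of that arithmetic:

    # rough Moore's Law extrapolation -- the 1.5-year doubling time
    # and the starting slowness of the simulator are assumptions
    doubling_time_years = 1.5
    years = 15.0
    speedup = 2 ** (years / doubling_time_years)   # about 1024
    print("speedup after %d years: ~%dx" % (years, speedup))

So a simulator that needs a week per wisecrack today would be down to
about ten minutes, and that is the sort of trend people could watch
coming for years in advance.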
Would that not be a highly anticipated Singularity? Would we
not have 15 years to try to make friends with computers,
kiss our butts goodbye, build spaceships, etc.?
What other scenarios have been suggested? Are these
documented?

spike