From: Lee Corbin (lcorbin@tsoft.com)
Date: Wed Apr 24 2002 - 01:19:35 MDT
Spike wrote:
> Question please: if the emergent AI simulates humanity at
> the moment of the singularity, then there are a bunch of
> simulated beings who are working to invent AI and who
> would have a strong suspicion that all the elements are in
> place for a singularity to occur. Right?
Yes.
> The simulated beings would become puzzled if all
> the elements for a runaway AI were in place but
> the singularity was not happening.
This kind of assumption---that it would want to keep
the big S a secret---reminds me of the "explanation"
for why we don't see much of the Saucer Folks. They're
just shy, you see, and they don't want anyone to know
they're around.
Now, if we are so lucky that the AI uploads us, then I
can't imagine any reason that it wouldn't say so: I claim
that emulated lives inside are better than no lives at all,
and that this AI is therefore really "nice", since
it hardly needs us anymore. Under these very ideal
(and optimistic) assumptions, sigh, apparently our best
interests are near its silicon heart.
So if there's no nearby top to a sigmoid curve of development
that it has to be concerned about, then there's no reason
that it can't offer a customized path of personal evolution
for each of us. Even as you and I debate with ourselves the
desirability of drastic IQ enhancement, merging memories,
allocating some of our granted resources to running backups
in parallel, and so on, the AI races forward at breakneck speed.
Lee
This archive was generated by hypermail 2.1.5 : Sat Nov 02 2002 - 09:13:38 MST