From: Lee Corbin (lcorbin@tsoft.com)
Date: Mon May 13 2002 - 23:13:45 MDT
Robert wrote
> If an AI maxes out the currently available computing capacity,
> presumably fighting a losing battle as mere humans unplug their
> computers, it has to expand its capacity faster than the unplugging
> occurs.
For sure, the scenario you propose represents a project
gone awry. One very nice description of it appears in the
SF novel "The Adolescence of P-1" by Thomas J. Ryan (1977),
in which an AI escapes from its creator.
That could happen. But if the Singularity happens in an AI,
then it'll almost surely have the cooperation of the humans
who fed it from the beginning, and they'll be happy with it
so long as they believe it hasn't gone off the rails.
And before long they wouldn't be able to affect it much
at all, anyway. (The Singularity Institute has written at
length about this, of course.)
So, returning to your proposal, either by accident or design,
some matter is projected into space:
> It only requires getting a small amount of matter into space.
> Beaming photons into space isn't more expensive than beaming
> them in horizontal directions.
But if this happens, the pinnacle of development might
end up off-planet. I have no idea how to estimate
how much mass is needed for prolonged advancement. Does
anyone?
Lee
This archive was generated by hypermail 2.1.5 : Sat Nov 02 2002 - 09:14:03 MST