Re: IA vs. AI was: Longevity vs. Singularity (fwd)

From: Robert J. Bradbury (bradbury@www.aeiveos.com)
Date: Fri Jul 30 1999 - 08:42:35 MDT


Eugene Leitl <eugene.leitl@lrz.uni-muenchen.de> wrote:

> paul@i2.to writes:
>
> > I agree. Why waste the benefits of diversity and *unique* informational
> > structures (us) in the sole pursuit of more raw materials for computronium?
>
> Because the atoms of your body could crank a lot more of diversity
> much more quickly than they do right now if rearranged
> properly. That's the rational agenda, in case the entity which devours
> you happens to be rational.
>
> Resistance is futile.

I agree with Eugene's atomic argument, but I would probably disagree
regarding the motivations of "rational entities". This goes back to
the whole question of "Why do we still see stars?". I think it revolves
around the fact that there are diminishing returns on any strategy
if carried out to its ultimate limits. In the example of converting
a solar system into computronium, you get a *huge* jump simply by
disassembling a single planet to harvest the full solar output
[a ~10^13 increase, from Earth's current power output to the full
solar output]. You get a ~10^8 increase in computation density by
going to nanotechnology (~10^5 more IPS in ~10^3 less volume).
Disassembling most of the other planets may get you only another
factor of ~3-4 in computational capacity [because a multi-layer
Matrioshka Brain has diminishing returns with each additional layer].
The extra material does help in terms of long-term memory storage
capacity, but it isn't clear at this point whether this is really
needed [you have to make an argument that there is some "rational"
purpose to remembering many details non-essential to survival].
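
Just to make the arithmetic explicit, here is a toy Python sketch of
those figures multiplied out (the round constants are only the rough
estimates above, nothing more precise):

# Back-of-the-envelope multiplication of the rough gains estimated above.
solar_harvest_gain = 1e13     # Earth's current power output -> full solar output
nanotech_density_gain = 1e8   # ~10^5 more IPS in ~10^3 less volume
extra_planets_gain = 3.5      # diminishing returns on additional MBrain layers

total_gain = solar_harvest_gain * nanotech_density_gain * extra_planets_gain
print("total processing gain ~ %.0e" % total_gain)  # ~10^21; first two factors dominate

which is roughly where the ~10^21 figure below comes from.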

Once you have gained something like 10^21 in processing capacity,
the gains of "re-engineering humans" are going to be so far down
on the scale that they will likely not merit much consideration.
The *real* issues are *what* you think about and *what* the optimal
computing architecture is for doing that. Earth is far more likely
to get removed because it stands in the line of sight of the highly
parallel communication lasers between two computation nodes than
because the SI needs us measly earthlings for computronium.
Given the tradeoff between the small cost of bouncing the beams
around the Earth (relative to all the other communication delays in
an MBrain) and any nostalgic attachment to Earth, I would think that
"saving" Earth wins.
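
For a rough sense of that tradeoff, here is an illustrative Python
sketch (the ~1 AU node separation and the grazing-detour geometry are
my assumptions, just to set a scale): the extra path from steering a
beam around Earth costs microseconds, against hundreds of seconds of
straight-line delay.

import math

C = 3.0e5         # speed of light, km/s
R_EARTH = 6.4e3   # Earth radius, km
BASELINE = 1.5e8  # node-to-node distance, km (~1 AU, an assumed scale)

# Detour that just grazes a sphere of radius R at the midpoint of a long
# baseline D: extra length = 2*sqrt((D/2)^2 + R^2) - D  (~2*R^2/D for R << D)
extra_km = 2 * math.sqrt((BASELINE / 2) ** 2 + R_EARTH ** 2) - BASELINE

print("straight-line delay: ~%.0f s" % (BASELINE / C))            # ~500 s
print("extra delay going around Earth: ~%.1e s" % (extra_km / C))  # ~2e-6 s

On those numbers the penalty for routing around Earth is lost in the
noise, which is why I think nostalgia wins.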

But I can't promise that this would be the case... Much depends on
how much of the MBrain is uploaded human consciousness vs.
self-evolved AI.

Robert


