Nick Bostrom wrote:
> Billy Brown wrote:
> >I won't pretend to know what it will actually do at that point,
> > but I can't see it being concerned about something as prosaic as its
> > supply of atoms.
>
> Why not? If it is better off (however slightly) with these atoms than
> without them, then in this scenario we could all be dead, unless we have
> been wise enough to make sure that the power is ethical.
Note: I assume here that we're talking about strong SI and some variant of the rapid self-enhancement scenario.
Well, first off, I think an SI is going to invent its own ethics long before it really deserves that label. IMO, everything any human has ever thought of on the topic is going to seem intuitively obvious to such an entity. Now, whether its morals will agree with ours is still an open question (although I side with Eliezer in expecting that the SI is more likely to be correct than we are).
But that aside, atoms are mostly useful for bulky, clumsy, primitive applications like nanotechnology and electronic computers. An SI with moderately advanced nanotech can convert a few cubic meters of mass into a computer in hours or less, and would probably do so. However, the delays involved in manipulating bulk matter on any larger scale are going to seem agonizingly long once it ports itself to that first machine. How willing would you be to wait several days for such a project if your consciousness ran 10^12 times faster than a human's?
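To put that impatience in perspective, here is a back-of-the-envelope sketch (plain Python, purely illustrative; the 10^12 speedup is just the figure assumed above, and the three-day wait is a made-up example) converting objective waiting time into subjective experience:

    # Rough subjective-time arithmetic; all numbers are illustrative assumptions.
    SPEEDUP = 10**12                  # assumed subjective seconds per objective second
    SECONDS_PER_DAY = 86400
    SECONDS_PER_YEAR = 365.25 * SECONDS_PER_DAY

    def subjective_years(objective_days, speedup=SPEEDUP):
        # Convert objective wall-clock days into subjective years at the given speedup.
        return objective_days * SECONDS_PER_DAY * speedup / SECONDS_PER_YEAR

    print(subjective_years(3))        # ~8.2e9 -- three days feels like ~8 billion years

At that rate, a three-day construction job is the subjective equivalent of several billion years, which is the sense in which "agonizingly long" is an understatement.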
IMHO, what happens next depends on the ultimate nature of physics. I see two important cases:
Billy Brown, MCSE+I
bbrown@conemsco.com