Eliezer S. Yudkowsky wrote:
> I really do think that when all is said and done, predicting our
> treatment at the hands of the Powers is a fifty-fifty coinflip. I just
> don't know. What I do know is that the near-future parts of the
> probability branches indicate that, after the preconditions are taken
> into account, this coinflip chance is larger by an order of magnitude
> than all the other happy outcomes put together.
I think it's interesting how one can arrive at the same conclusion from a completely different direction.
Personally, I would give us a very good chance of surviving the emergence of ultratechnology even without an early seed AI success. The kind of technology progression we would get in that situation looks like something humans could cope with, and the emergence of Powers would be gradual enough that there would never be a single invincible individual.
If, OTOH, IE (intelligence enhancement) is not that easy, then there is never going to be a single Power. Instead, we'll get a society of different kinds of Transhuman minds working to improve themselves as a group. That effectively puts us back in my first scenario, but with a faster rate of change and even less chance of disaster.
So, whichever way it works out, anything we can do to speed up progress (especially progress on IE) is a good thing. The longer we take to reach practical immortality, the more people will die before we get there.
Billy Brown, MCSE+I
bbrown@conemsco.com