"Michael S. Lorrey" wrote to den Otter:
>
> If
> you are talking about guarding against even ONE Power getting there before you,
> then no one will ever upload. Someone has to be first, if it is done at all. A
> number of someones for the test phase of the technology, then those who can
> afford the cost, then, as those individuals have an impact on the economy, others
> can be bootstrapped.
>
> It is all a matter of trust. Who do you trust?
Let me put it this way: If den Otter thinks I'm going to trust a HUMAN, he's nuts. I know how the human motivational system works, and I know how Elisson's (of _Coding_) motivational system works. Who do I trust? Damn straight.
If I wanted to safeguard my own self-interest, or, more likely, were honor-bound to safeguard someone else's, you'd better believe that I wouldn't trust *any* mind unless I could see the source code. I would trust Elisson over *me*. Not from an abstract standpoint; *personally*. den Otter's blind trust of his own mind rests simply on ignorance of his own cognitive architecture.
As for anyone else trusting den Otter, whose personal philosophy apparently amounts to "The hell with any poor fools who get in my way," and who wants to climb into the Singularity on a heap of backstabbed bodies: the Sun will freeze over first. Supposing that humans were somehow uploaded manually, I'd imagine that the HUGE ORGANIZATION that first had the power to do it would be *far* more likely to choose good ol' altruistic, other's-goals-respecting Lee Daniel Crocker. If, somehow, den Otter managed to amass enough personal wealth to go it alone, which, I must say, seems about as probable as a hen laying a square egg, someone would get scared and nuke his laboratories.
You know something? Altruism really is the best strategy, even from a selfish perspective.
-- 
sentience@pobox.com                Eliezer S. Yudkowsky
http://pobox.com/~sentience/AI_design.temp.html
http://pobox.com/~sentience/singul_arity.html
Disclaimer: Unless otherwise specified, I'm not telling you everything I think I know.