Lee Daniel Crocker wrote:
> Eli said:
> > As for anyone else trusting den Otter, whose personal philosophy
> > apparently states "The hell with any poor fools who get in my way," who
> > wants to climb into the Singularity on a heap of backstabbed bodies, the
> > Sun will freeze over first. Supposing that humans were somehow uploaded
> > manually, I'd imagine that the HUGE ORGANIZATION that first had the
> > power to do it would be *far* more likely to choose good ol' altruistic
> > others'-goals-respecting Lee Daniel Crocker...
>
> I wouldn't trust me if I were them. Sure, I can be quite tolerant
> of other humans and their goals--even ones I think mildly evil--if
> for no other reason than the value of intellectual competition and
> the hope of reform.
That is the hope.
> There's also my strong /current/ conviction
> that norms are not yet rationally provable, so my certainty of the
> superiority of my own goals is in question. Keeping around those
> with other goals keeps me honest. But if I were uploaded into a
> superintelligence that rationally determined that the world would
> be a better place with some subset of humanity's atoms used to a
> better purpose, I am more committed to rationality than to humanity.
I think this is the big wall that will keep us from reaching the Singularity. It won't be an issue of technology, but of integrity and trust.
I can imagine that, to overcome this block, the first X uploads will not only have to clear a battery of tests but will also have to agree to the imposition of hard-wired restrictions like Asimov's Laws until a predetermined number of individuals have been uploaded, so that the impact of any one individual can be mitigated by the rest. Say, 10,000 individuals.
>
> I am an altruist specifically because it serves my needs--I can't
> get what I want except through cooperation with others. If that
> condition changes, then it is likely my attitude would change with it.