From: Michael S. Lorrey (mike@lorrey.com)
Date: Thu Mar 25 1999 - 14:56:26 MST
Lee Daniel Crocker wrote:
> Eli said:
> > As for anyone else trusting den Otter, whose personal philosophy
> > apparently states "The hell with any poor fools who get in my way," who
> > wants to climb into the Singularity on a heap of backstabbed bodies, the
> > Sun will freeze over first. Supposing that humans were somehow uploaded
> > manually, I'd imagine that the HUGE ORGANIZATION that first had the
> > power to do it would be *far* more likely to choose good 'ol altruistic
> > other's-goals-respecting Lee Daniel Crocker...
>
> I wouldn't trust me if I were them. Sure, I can be quite tolerant
> of other humans and their goals--even ones I think mildly evil--if
> for no other reason than the value of intellectual competition and
> the hope of reform.
That is the hope.
> There's also my strong /current/ conviction
> that norms are not yet rationally provable, so my certainty of the
> superiority of my own goals is in question. Keeping around those
> with other goals keeps me honest. But if I were uploaded into a
> superintelligence that rationally determined that the world would
> be a better place with some subset of humanity's atoms used to a
> better purpose, I am more committed to rationality than to humanity.
That is the rub, but it is also the point of the testing process. If you
gave your word to first do no harm, would you keep it on mere principle,
even if humanity had no ability whatsoever to punish you for breaking it?
The point is to pick people whose sense of identity is deeply integrated
with their sense of personal integrity.
I think that this is the big wall that will keep us from reaching the
singularity. It won't be an issue of technology, but one of integrity and trust.
I can imagine that, to overcome this block, the first X uploads will
not only have to clear a battery of tests, but will also have to agree to the
imposition of hard-wired restrictions like Asimov's Laws until a predetermined
number of individuals have been uploaded, so that the impact of any one
individual can be mitigated by the rest. Say, 10,000 individuals.
>
> I am an altruist specifically because it serves my needs--I can't
> get what I want except through cooperation with others. If that
> condition changes, then it is likely my attitude would change with it.
Yes, but I think that with the above proposal the altruistic motivation can be
maintained. What do you think?