Re: Arbitrariness of Ethics (was singularity logic loop)

From: Samantha Atkins (samantha@objectent.com)
Date: Sun Apr 28 2002 - 19:36:07 MDT


Lee Corbin wrote:

> Samantha writes
>
>
>>>You may mean that a sufficiently intelligent and objective
>>>AI must deduce as a logical or scientific truth that a
>>>certain ethical system is correct. I've never seen any
>>>evidence that such exists.
>>>
>>Do you then eschew ethics? If not, then do you believe a vastly
>>more intelligent being without your evolutionary programming would?
>>
>
> I do not eschew ethics. But the reason that I do not has
> nothing to do with my intelligence, but with other ways
> that I happened to have been programmed by natural selection.

That's one theory, but it will require substantial proof.

> I derived from thousands of generations of hunter/gatherers
> whose environment, it turns out, happened to often favor the
> evolution of what we call ethical behavior. Equivalently,
> we may succeed in programming an AI to behave as if it also
> had an ethical system we admire.

So why would it not turn out that any interacting sentient
beings would find it beneficial to develop and practice ethical
behavior? We are talking about an entity that will be able to
recapitulate all human thinking to date, including ethics, in
vanishingly little time. It will ultimately decide for itself
what ethics to practice.

>
> So a vastly more intelligent being that happens to obtain
> from some puny humans writing a lot of code (that they
> really don't understand) may or may not behave itself.
> If we're lucky, it will take over and be very nice.
> Nice not because it reasons that it should be, but
> because its internal structure mandates that it
> wants to be.
>

I don't think that its "taking over" is itself nice; it depends
on exactly what is meant. You cannot fix behavior in the
internal structure of a self-evolving, super-intelligent being.
The very attempt would be unethical.

>
>>I pin my hopes on all sentients of however much raw power coming
>>to a mutual cooperative peace through realizing that they can
>>all benefit much more from such an arrangement than from thinly
>>veiled mutual distrust and soft (sometimes hard) war.
>>
>
> Sometimes circumstances favor cooperation. In modern times,
> for example, nations find it uneconomic and fruitless to
> wage war. But a thousand years ago circumstances were
> different, and often the best way to gain wealth was to
> take someone else's. Any tribe or nation that eschewed
> conflict got chewed up pretty quick.
>

I believe that the long view always favors cooperation. As we
increase in capabilities, in practical intelligence, and in
real abundance, I find it much less likely that war would seem
like a good alternative. That doesn't mean no defense if
attacked, though.

- samantha
