Arbitrariness of Ethics (was singularity logic loop)

From: Lee Corbin (lcorbin@tsoft.com)
Date: Sun Apr 28 2002 - 12:18:53 MDT


Samantha wrote

> Lee Corbin wrote:
>
> > There are just two reasons that it might be nice to
> > humans, so far as I know: one, I. J. Good's meta-
> > golden rule... and two, because someone built
> > it to be so nice.
>
> Or it decides that non-arbitrary ethics preclude destroying
> other sentients as much as possible.

That "it decides" that there exists a non-arbitrary
ethical system is no different, so far as I can tell,
from "someone built it to be so nice".

You may mean that a sufficiently intelligent and objective
AI must deduce as a logical or scientific truth that a
certain ethical system is correct. I've never seen any
evidence that any such system exists.

> > As the speed of light becomes so slow (from the SI
> > perspective) evolution even over the Earth could break
> > apart and become localized quickly, and in that case
> > systems that allowed anything to hold back their own
> > development would be at a serious disadvantage.
>
> This has the built-in assumption that endless-growth is such a
> strong and mandatory drive that all else is subservient. I
> don't think this is a given.

If an SI that at one time did control the entire Earth were
to break apart into sufficiently many pieces, as I was trying
to suggest, then any variation in acquisitiveness would, by
evolutionary principles, result in the more acquisitive pieces
growing at the expense of the less acquisitive ones.

> There are no other SIs in the neighborhood we can detect
> and it is not a given that they could think of nothing
> better than endlessly competing with one another throughout
> space-time.

Indeed, they could come to such an agreement, but as I said,
if they are sufficiently many, then the agreement might be
as unstable as an oil cartel's.

> Precisely why should it be that hungry, or that hungry that
> quickly? Cancerous levels of growth until all resources are
> consumed are simply not the only viable models of Singularity
> level beings.

Well, nothing is certain, but where there is vast potential
and great diversity among systems, that potential tends to
end up being exploited.

Some futurists, I take it, pin their hopes on a single all-
encompassing AI that would take over the Earth and solar system,
and institute a permanent Pax by force. That could happen,
though it's by no means certain that that's how everything is
going to shake out. But even if it does, the argument I alluded
to above about the speed of light and local information processing
would strain the internal loyalty of such a system.

Lee


