Re: electronic intelligence and ethics

From: Michael S. Lorrey (retroman@turbont.net)
Date: Tue Feb 22 2000 - 15:23:24 MST


Zero Powers wrote:

> Following up on QueeneMuse's interesting question about what the uploading
> experience will be like, one of the things I wonder about is what sort of
> life we (plain, trans-, or post-)humans will be allowed by the electronic
> intelligences (I prefer the term EI to AI) once they take over. That will
> depend, of course, on the nature of their ethical or moral systems. I think
> it's a safe bet to say that they will be driven by a desire to acquire and
> analyze data. I believe they will pick up the baton and run with the human
> scientific attempt to formulate a grand unified theory of everything.

Since I'm a proponent of the soft slow upload (i.e., humans slowly augmenting
their brains over time until, when the brain dies, the mind doesn't notice), my
opinion is that we will not notice any significant differences other than
greater abilities. New and increased abilities will still feel like they are
'our' abilities, not those of some machine we drive like a car or run like a
computer. Recent R&D on implants (cochlear and optical) shows that our minds
treat silicon implants as just one more part of ourselves.

> But what else (if anything) will motivate them? Will there be any "good"
> for them other than information? Any "evil" other than ignorance? Will
> they care at all about such trivialities as emotion, fairness, compassion
> and pain? Whether or not I want to survive the ascendancy of strong EI will
> depend largely upon this question. Unfortunately I'll never know the
> answers unless and until I survive to that time. Or unless I am persuaded
> by the musings of this list. I can't wait to hear your thoughts.

Based on my arguments above, I think that since uploaded humans will continue to
think of themselves as human, their motivations will be very similar to what
they are now; there will merely be greater growth and maturity in an uploaded
human's thinking. Because uploaded humans will have greater access to
information and a greater capacity to make rational decisions, human society
will move closer to the Bayesian ideal, so there will be less strife and
stupidity (except among those who refuse to augment themselves, of course),
although I don't know if it will disappear entirely. For example, those who
attempt to form borganisms in a coercive manner will be fiercely opposed by
extropians (extropy being the individualist form of transhumanism in general).
Feral humans will of course paint all transhumans with the scary BORG propaganda
brush (much as libertarians get painted with the same 'right wing nut' label as
religious conservatives), and Borganists will label feralists 'reactionary
luddite barbarians' and libertarians merely 'reactionary anarchists'.

Mike Lorrey
