From: den Otter (neosapient@geocities.com)
Date: Wed Feb 24 1999 - 04:45:09 MST
----------
> From: Billy Brown <bbrown@conemsco.com>
> I see two fatal criticisms of the idea suggested in the FAQ:
>
> First, it is mind control. Remember, posthumans are by definition fully
> sentient beings. Programming them to abide by a preordained moral code is
> no different than doing the same thing to our own children, or to each
> other. I can see no possible way to justify such an action on moral
> grounds.
Depends on your morals, right? For example, a rational Egoist shouldn't
have too much moral trouble with such a solution (though he may still
reject it on practical grounds, since whether a posthuman can be
controlled at all is questionable). In fact, the way we raise our kids
and indoctrinate people to keep them from anti-social and criminal
behaviour is a universally accepted, and indeed necessary, form of
programming. If you accept this, then there is no *moral* reason to
object to putting some basic moral rules into an AI.
> I suggest that this passage be amended to remove the advocacy of mass mind
> control. Perhaps something like this:
>
> In the first case, we could make sure that the first such entities possess
> a thorough understanding of, and respect for, existing human moral codes.
Better yet:
In the first case, we should make sure that such entities are *us*. Creating
separate SIs (from AI, rather than from ourselves) is a BIG mistake. We can
certainly make sure that the new entities thoroughly understand us, but that
by no means guarantees their respect for our moral codes.
> That would be enough to ensure that they at least think the whole thing
> through before adopting some completely alien viewpoint. It is also the
> strongest measure that I can see either a moral or a practical basis for.
Morals are subjective, but the most practical (=rational) course of action
would be not to create AIs with SI potential, but to work on uploading
instead. Again, our prime directive should _always_ be survival. Survival
is the prerequisite for _all_ other actions. Any philosophy that does not
value personal survival [in an optimal state] above everything else is by
definition irrational. It thus follows that transhumanism (with an
immortalist element) is the best philosophy currently available to us.