From: Technotranscendence (neptune@mars.superlink.net)
Date: Tue Feb 12 2002 - 05:32:28 MST
On Tuesday, February 12, 2002 5:47 AM Anders Sandberg asa@nada.kth.se wrote:
> It is not obvious that the "tyranny" is even possible as stated. It is
> the old problem of information again; while the AI may (somehow) get
> around the usual problem that the ruling entity does not have full
> information about the desires and actions of its subjects, it is still
> forced to deal with an immense optimization problem that has to be
> solved in real time. It is not clear whether this would involve
> preventing every bad encounter (and thus risking ending up in a local
> optimum leading to a far worse long-term outcome) or allowing some to
> happen in order to prevent worse problems in the future. To some extent
> this is what people do all the time - just think of a good host or
> hostess at a party - but it is not obvious at all that this can be
> scaled up arbitrarily.
I agree with the information aspect here.
> It is also not explained whether it is possible to ignore the advice of
> the AI in the above scenario. If it is, I see no ethical problem with it
> at all. If it overrides the free behavior of the citizens, then there
> are clearly problems with it (unless one is a utilitarian, in which case
> the issue becomes whether this system actually manages to maximize
> utility).
There's a presumption, in the quote, that the AI knows best, period. If
that were so, fine. But how would we know it is so? The problem of how
one evaluates the AI comes up -- and this is no different from, say,
turning over our freedom to any benevolent dictator, group, or party.
The promise that such an AI will only think of our interests, because
its own have been suppressed and because it's designed to have no
emotions or values, seems silly to me. Once it has the power, how will
one keep it from morphing into something else -- something with
interests antithetical to those it has power over?
This is not to say I stand against AI. I do not. But I don't believe
in giving more intelligent beings control over my life, period -- be
they merely people with ten more IQ points than me and two PhDs, or
superintelligent, passionless machines.
Cheers!
Daniel Ust
http://uweb.superlink.net/neptune/