Re: Yudkowsky's AI (again)

From: Bryan Moss (bryan.moss@dial.pipex.com)
Date: Fri Mar 26 1999 - 09:53:06 MST


Eliezer S. Yudkowsky wrote:
> But the key thing to note is that even this pseudo-
> absolute injunction dissolves under the influence of
> superintelligence. It's not a matter of a conscious
> condition stating: "I can wipe out humanity if I'm
> superintelligent." This trigger can be falsely
> activated - a paranoid schizophrenic may believe
> himself to be superintelligent.

Worse yet, an ultra-egoist[*] such as myself might think that a whimsical
urge to destroy humanity amounts to justifiable suicide.

BM

[*] This philosophy, based on discussions of egoism and utilitarianism
on this list, attempts to unite the larger concerns of society with those
of the individual.
