From: J. R. Molloy (jr@shasta.com)
Date: Fri Oct 06 2000 - 18:53:35 MDT
[Just cleaning up unread messages]
Eugene Leitl wrote:
> J. R. Molloy writes:
>
> > That's because you are an intelligent entity. The AI, in contrast, must
> > conform to the program that coders give it. <grrrin>
>
> Just as the atoms in your body must conform to the natural laws
> governing their behaviour. Or as the neurons in a biologically
> realistic simulation (such as of the lobster gastric ganglion)
> produce their spikes according to the program codifying their
> behaviour. Or as an NP-complete problem of nontrivial size is
> solved by a computer program.
No; rather, the AI must conform to the program that coders give it.
> These are all true statements, but they're also rather irrelevant,
> because they do not impose any noticeable constraints on system
> state. In all these cases, you can't prove that system A in state B
> reaches state C after N steps without traversing every single state
> leading from B to C.
These are all true statements, and they're also irrelevant.
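To make the "traversing every single state" point concrete, here is a
minimal Python sketch using the Collatz map as a stand-in deterministic
system; the map and the function names are illustrative choices, not
anything from the exchange above. No shortcut is known that predicts
where a Collatz trajectory ends up: you find out only by stepping
through every intermediate state.

def step(n: int) -> int:
    """One deterministic state transition (the Collatz map)."""
    return n // 2 if n % 2 == 0 else 3 * n + 1

def reaches(start: int, target: int, max_steps: int) -> bool:
    """Check whether `target` is reached from `start` within
    `max_steps` transitions -- by brute traversal, the only
    method known to work for this system."""
    state = start
    for _ in range(max_steps):
        if state == target:
            return True
        state = step(state)
    return state == target

# No known closed form predicts this outcome without iterating:
print(reaches(27, 1, 200))  # True -- 27 reaches 1 after 111 steps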
> Because behaviour is the phenotype of the above state changes over
> time, predicting behaviour with absolute certainty is impossible for
> any class of observers less than omniscient. All assuming, of course,
> that you can cleanly classify all behaviour into "desirable" and
> "undesirable" bins.
Right. So even if an AI becomes "unfriendly," that doesn't necessarily
make it "undesirable."
--J. R.
"Lord, grant me the serenity to accept the things I cannot change,
the courage to change the things I can, and the wisdom to hide the bodies
of the people I had to kill because they pissed me off."
--U. Biquitous