From: Eugene Leitl (eugene.leitl@lrz.uni-muenchen.de)
Date: Tue Sep 26 2000 - 07:07:18 MDT
Franklin Wayne Poley writes:
> In his 1997 book, Warwick says the less intelligent will not continue to
> be governed by the more intelligent. Others have said much the same. My
> reply is that intelligence is not motivation. I don't care how complex the
Horse puckey. Demonstrate an instance where these two are separate,
particularly in a Darwinian design.
> machine is or whether it has mobility and can evolve or self-improve. What
> I care about is that it stays under control. That is a prime concern of
Maybe it will, if you ask it nicely, and scratch it behind its ears.
> all when it comes to AI. Human-competitive AI has no "wants" except for
> those capabilities it is given. Competing with humans is not a bad
If it only has those capabilities you've given it, it's not AI, at
least certainly not human-competitive AI.
> thing. That's why we invent all kinds of useful machines. So just spell
> out in terms we can all understand how the machines will surpass humans
> and how they will remain under control.
I'd rather discuss the properties of Santa Claus and Rudolph the
Red-Nosed Reindeer; it's way more constructive.
Sorry for the sarcasm, but this is getting silly.