AI motivation

From: Rob Harris (rob@hbinternet.co.uk)
Date: Mon Oct 25 1999 - 02:31:10 MDT


>Eliezer's pointed out the incoherence of believing you can hard wire
>high level beliefs or motivations and I quite agree. You do get to
>specify what kind of feedback-inducing behavior gets reinforced or
>attenuated though.

Absolutely; I'm not completely fresh to this subject. The fact remains that
rewarding a particular behaviour provides motivation in itself. Whether you
type the line "Survive at all costs" into the "moti-con" or reward
existence-preserving behaviour, the result is the same: a "motivation" to
survive. So my point remains: who is going to create an AI with
gene-being-style motives, then grant the AI the powers necessary to
seriously challenge human affairs? That basically amounts to building an
extremely complicated Armageddon device. A bomb would be far, far easier.

Rob.


