From: Stuart Armstrong (dragondreaming@googlemail.com)
Date: Wed Jul 16 2008 - 06:57:31 MDT
>> One of those universals was the threat of violence, and the related
>> need for government.
>> ....
>> So the best guess we can make as to what the smartest government would
>> be like, is to take the smartest governments we have now (liberal
>> democracies with large governments), and extrapolate SLIGHTLY. It won't
>> be a very good guess, but it'll probably be the best one we have.
>
> Does the baby-sitter allow the little children to set the rules?
> What have you got against benevolent dictatorships if they
> really are benevolent?
In the real world, many, many things complicate the very concept of
benevolence. Even if the AI is a smart, knowledgeable, benevolent
dictator, there still remain humans' varying definitions of
benevolence. Something like CEV is a good compromise; but the question
remains as to whether humans would be happier with an AI deciding what
is best for them, or whether we would do best with some illusion of
control (and maybe some actual control).
But the more I continue along this vein, the more it strikes me how
uninformed my speculation is. I think we can conclude that the problem
of violence will remain after a singularity, along with the concomitant
need for an AI dictatorship or a government. Beyond that, I don't think
detailed speculation is very helpful.
Stuart