From: Tim Freeman (tim@fungible.com)
Date: Sat Oct 24 2009 - 13:35:29 MDT
From: Robin Lee Powell <rlpowell@digitalkingdom.org>
>The vast majority of people in the world (China, India, South
>America, Africa) are still more-or-less medieval peasant farmers;
>maybe not literally, but the mentality is going to be about the
>same. The world they would envision without more knowledge and more
>time to think about it and so on is going to look very much like the
>stereotypical Christian heaven: you get to lie around and eat
>grapes, and you never do anything because your every need is taken
>care of.
Hee hee. No coincidence there. That stereotypical Christian heaven
was made up by medieval peasant farmers.
>No thank you!! That's hell to me, and after a week or a
>month it would be hell to them too, but it's what they want right
>now.
I agree with you up to this point, but you didn't get to the end of
the scenario. I don't think it goes where you expect:
After a week or a month they want something else; the AI figures out
the new want and gives them that instead, and the problem you're
envisioning has solved itself.
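To make that loop concrete, here's a toy Python sketch (every name in
it is invented for illustration; nothing here comes from any actual
proposal):

    # Toy version of the loop: satisfy the current want, then
    # re-infer when it drifts. The "stuck in heaven" problem
    # dissolves on its own as soon as the wants change.
    drifting_wants = iter(["lie around and eat grapes",
                           "learn something new",
                           "build something"])

    def infer_current_want():
        return next(drifting_wants)   # stand-in for real inference

    for _ in range(3):
        want = infer_current_want()
        print("AI provides:", want)   # stand-in for real fulfillment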
Here's an example that bothers me: Nobody has lived 1000 years. Maybe
the human mind has bugs in that untested scenario. One possible bug
is that all 1000 year olds are suicidal. I'm concerned that the
Extrapolation step would figure out what people would want if they
"had grown up farther together", where "farther" means 1000 years, and
then correctly infer that everyone would want to die in that
situation. The AI gives them what they would want if they had grown
up farther together by killing them all now. I'd prefer that it let
the situation evolve naturally -- that way maybe people would kill
themselves at age 900 but they'd still get a decent life for a while.
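To pin down the difference, here's a toy Python contrast between
acting on the extrapolated endpoint and letting the situation evolve
(the age thresholds and the "suicidal at 1000" bug are stand-ins I
made up for the hypothetical, not claims about real minds):

    # Toy model of a mind with the hypothesized bug: desires are
    # fine for 900 years, then turn suicidal in untested territory.
    def desire_at(age):
        if age < 50:
            return "lie around and eat grapes"
        if age < 900:
            return "do something meaningful"
        return "die"   # the untested-scenario bug

    def extrapolated_policy(horizon=1000):
        # act *now* on what the person would want at the horizon
        return desire_at(horizon)

    def evolving_policy(lifespan=1000, step=50):
        # satisfy each desire as it actually arises
        return [desire_at(age) for age in range(0, lifespan, step)]

    print(extrapolated_policy())    # -> 'die', granted at age 0
    print(evolving_policy()[:2])    # the decent life comes first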
Here's another example that bothers me: Mpebna in Somalia is starving.
If Mpebna weren't starving, and had grown up in a gentler
environment, he would like, among other things, to have a virtual
reality system that allowed him to communicate visually with his
friends without the trouble of travelling to visit them. The FAI
comes into power, and poof! Mpebna is presented with a fancy VR
system. Mpebna doesn't know WTF it is, Mpebna is still starving, and
now Mpebna hates the people who deployed the FAI, since they could
have fed him and they chose not to. How exactly did the people who
deployed the FAI benefit from getting into conflict with Mpebna here?
The alternative was to give him food and wait patiently for him to
want something else.
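The same contrast in miniature (again, all names invented):

    # Toy version of the Mpebna case. Acting on the extrapolated
    # want skips over the current one; serving the current want
    # first avoids the conflict entirely.
    current_want = "food"
    extrapolated_want = "a fancy VR system"

    def act_on_extrapolation():
        return [extrapolated_want]   # poof: VR, still starving

    def act_on_current_then_wait():
        return [current_want, extrapolated_want]   # food now, VR later

    print(act_on_extrapolation())
    print(act_on_current_then_wait())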
-- Tim Freeman http://www.fungible.com tim@fungible.com