Re: Yudkowsky's AI (again)

From: Michael S. Lorrey (mike@lorrey.com)
Date: Mon Mar 29 1999 - 09:56:08 MST


"Eliezer S. Yudkowsky" wrote:

> "Michael S. Lorrey" wrote:
> >
> > Yes, but I think with the above proposal, the altruist motivator can be
> > maintained. What do you think?
>
> I really do think that when all is said and done, predicting our
> treatment at the hands of the Powers is a fifty-fifty coinflip. I just
> don't know. What I do know is that the near-future parts of the
> probability branches indicate that, after the preconditions are taken
> into account, this coinflip chance is larger by an order of magnitude
> than all the other happy outcomes put together.

Well, I was thinking about this while browsing your 'Staring into the Singularity'
piece, and found this bit:

> The exact time of Singularity is customarily predicted by taking a trend and extrapolating it, much as The Population
> Bomb predicted that we'd run out of food in 1977. For example, population growth is hyperbolic. (Maybe you learned it
> was exponential in math class, but it's hyperbolic to a much better fit than exponential.) If that trend continues, world
> population reaches infinity on Aug 17, 2027, plus or minus 1.8 years. Although it is impossible for the human population
> to reach those levels, some say that if we can create AIs, then the graph might measure sentient population instead of
> human population. These people are torturing the metaphor. Explain to me who designed the population curve to take
> into account developments in AI. It's just a curve, a bunch of numbers. It can't distort the future course of technology just
> to remain on track.
>
You must remember that the curve merely describes the aggregate effect of the
instinctual drive to reproduce in every individual human. While we are now seeing that
this may be a third-order S-curve rather than a hyperbola, the drive to reproduce is
still there in people. As people become more educated, live in a more productive
economy, and earn more money, they invest more in fewer children, as well as investing
more in themselves. These two changes are a result of higher life expectancy and a
lower infant death rate, but they retard actual population growth to below what might
be instinctually optimal. I still doubt that most people who may live a long time will
be reproductively active the whole time. Notice that the growth in the pet population
is concentrated in pets owned by senior citizens. The Tamagotchi toy phenomenon points
in the same direction.
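To make the shape question concrete: the hyperbolic fit Eliezer is describing has the
rough form N(t) = C / (T - t), which blows up to infinity at the finite date T, while
an S-curve flattens out at some carrying capacity instead. Here's a quick
back-of-the-envelope sketch in Python -- the parameters are my own illustrative
guesses, not a fit to real census data, and I'm using a plain logistic as a stand-in
for the S-curve:

    # Toy comparison of hyperbolic vs. logistic population models (billions).
    # Parameters are illustrative guesses, NOT a fit to real census data.
    from math import exp

    def hyperbolic(t, C=200.0, T=2027.6):
        # N(t) = C / (T - t): diverges to infinity as t approaches T.
        return C / (T - t)

    def logistic(t, K=11.0, r=0.03, t_mid=1990.0):
        # S-curve: flattens out at carrying capacity K.
        return K / (1.0 + exp(-r * (t - t_mid)))

    for year in (1950, 1975, 2000, 2020, 2027):
        print(year, round(hyperbolic(year), 1), round(logistic(year), 1))

With these made-up numbers both curves stay loosely near the historical range through
about 2000, then diverge wildly -- the hyperbola heads for infinity while the logistic
levels off near its cap, which is the disagreement in a nutshell.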

I conclude that the pressure for AI development will come in response to market demand
for companions to nurture, develop, and associate with that are less expensive than
producing biological children. If this holds true, then we may get AIs before uploads.
The AIs will look on humans as parents (for good or ill). Since we can see from the
human population that most children are brought up well (child abuse hypemeisters to
the contrary), we can posit that a) some AIs will be inimical to particular humans,
and a few inimical to humanity in general, but b) most of the AI population will be
generally amicable and caring toward humanity, and will help develop uploading
technology so that their 'parents' may join them rather than dying off.

Mike Lorrey


