RE: No Singularity?

From: Billy Brown (bbrown@transcient.com)
Date: Thu Nov 18 1999 - 12:49:29 MST


hal@finney.org wrote:
> Your term "sentient labor" is a bit unclear with regard to machines.
> What would you predict would happen to costs as more labor is taken on
> by machines? Would it matter if the machines are as smart as people,
> or even smarter?

I'm assuming a slavery-free society, where anything sentient is a citizen
and gets paid. In that case automation increases per-capita wealth, either
by reducing prices or by giving everyone more dollars to spend. Thus, our
ability to automate production is the ultimate constraint on per-capita
wealth.
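The per-capita wealth claim above reduces to simple arithmetic: if automation cuts the labor-hours needed per unit of output, output per person rises by the same factor, whether that shows up as lower prices or higher incomes. A toy sketch, with purely hypothetical numbers of my own:

```python
# Toy model: automation cuts the labor-hours needed per unit of output,
# which raises output (and hence real wealth) per person.
# All figures are illustrative assumptions, not data from this thread.

def per_capita_output(population, hours_per_person, hours_per_unit):
    """Units of goods produced per person per year."""
    total_hours = population * hours_per_person
    total_units = total_hours / hours_per_unit
    return total_units / population

# Before automation: 10 labor-hours per unit of output.
before = per_capita_output(1_000_000, 2000, hours_per_unit=10.0)
# After automation: the same work now takes 2 labor-hours per unit.
after = per_capita_output(1_000_000, 2000, hours_per_unit=2.0)

print(before, after)  # 200.0 vs 1000.0 units per person per year
```

Note that population cancels out of the formula: the gain is per-capita regardless of how many people there are, which is why the ability to automate production is the binding constraint.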

If you have slave labor, and you want to predict the cost that the overlords
pay for producing goods, the picture is analogous. Instead of replacing
people with machines you are replacing overlords with slaves. In the free
society, cost reduction is limited by the amount of sentient supervision
required to keep the automation going. In the slave society, cost reduction
(from the overlord's point of view) is limited by the amount of free
sentient supervision required to keep the slaves in line.

> It would seem on the one hand that you could just treat them like any
> other tool, like a shovel. Count how long the tool lasts and what
> its own production costs are (including labor to design it, etc.).
> Include maintenance costs for the tool as well. In the case of an AI,
> maintenance costs might include time spent periodically refurbishing
> and reorganizing the software state, elaborate versions of today's
> garbage collection systems.

Yes.

> This might require the AI to, say, spend
> time interacting with a high-information-quality environment, or to be
> dormant while the software refreshes itself for another work period.

That seems unlikely. At worst you might need one instance of the software
to do these kinds of things; then you just load all the other copies with
the data from that instance. Even that seems unlikely as an end state,
since you don't particularly want an AI slave to learn on its own.

> Many people have argued that as machines take over more tasks, everything
> will become cheaper because less labor is involved. Lyle in his pages
> argues that machines are, in effect, labor. The mere fact of substituting
> a machine for a human being does not inherently make products cheaper
> because the machine has expenses just as people do.
>
> Now, I think the flaw in Lyle's reasoning is that machines are designed
> to have lower expenses and be more productive than people. But he is
> right in that the economic gain is not due directly to the substitution
> of a machine for a person, which an overly simplistic analysis based
> on costs-as-labor might suggest. Rather, it is only when machines are
> economically able to do the same work for less money that we have a
> true savings.

But it is only when the machines are able to do the same work for less money
that we bother to build them in the first place. Lyle's argument here is
completely vacuous. Ever since the industrial revolution we have been
replacing people with machines in order to cut costs by reducing the amount
of labor required to do things. The process has accelerated significantly
with the invention of computers and telecommunications, and we should expect
even primitive AI to open up still more opportunities of the same kind.
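The substitution logic here is just a break-even test: a firm builds the machine only when its amortized cost (including the sentient supervision needed to keep the automation going) undercuts the wage bill it displaces. A minimal sketch, with hypothetical figures (none of these numbers come from the thread):

```python
# Break-even test for replacing human labor with a machine.
# All numbers are illustrative assumptions; the point is the decision
# rule, not the data.

def annual_machine_cost(purchase_price, lifetime_years, maintenance_per_year,
                        supervision_hours_per_year, wage_per_hour):
    """Amortized yearly cost of a machine, including the paid sentient
    supervision required to keep the automation going."""
    amortization = purchase_price / lifetime_years
    supervision = supervision_hours_per_year * wage_per_hour
    return amortization + maintenance_per_year + supervision

wage = 20.0                      # hypothetical $/hour for human labor
human_cost = 2000 * wage         # one full-time worker-year (2000 hours)

machine_cost = annual_machine_cost(
    purchase_price=50_000,       # hypothetical robot price
    lifetime_years=5,
    maintenance_per_year=4_000,
    supervision_hours_per_year=200,
    wage_per_hour=wage,
)

# Nobody bothers to build the machine unless it does the same work
# for less money -- exactly the condition in the text.
print(f"human:   ${human_cost:,.0f}/yr")
print(f"machine: ${machine_cost:,.0f}/yr")
print("substitute" if machine_cost < human_cost else "keep human labor")
```

On these assumed numbers the machine wins ($18,000/yr against $40,000/yr), and raising the supervision requirement is what erodes the advantage, which is the same limit described for both the free and the slave society above.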

Lyle's musings about sentient AI are just a smokescreen raised to confuse
the issue. We know perfectly well what the economic effects of non-sentient
automation are, because we've been doing it for generations. We also know
that we have a long, long way to go before we exhaust the possibilities of
this kind of automation. Finally, we can see that nanotechnology would
greatly extend these possibilities by making the manufacturing process more
amenable to automation. We should therefore expect the current cost
reduction trend to continue for a long time before it runs out of steam.

> Economic production is traditionally analyzed based on inputs of labor,
> raw materials, and capital. Perhaps in the future the distinction
> between labor and capital will become blurry in the case of robots.
> At that time we might see that people, too, are in a sense another form
> of capital. The sharp and historically contentious distinction between
> labor and capital may come to be seen as an illusion.

I don't think it makes sense to lump human and robotic labor together.
Humans (or "free sentient beings", if you want to be precise) are both
producers and consumers of goods. This means that human labor is always in
short supply, because each human wants to consume far more than he can
produce. Non-sentient machines, by contrast, produce without consuming, so
adding them expands supply without adding to demand.

Billy Brown, MCSE+I
bbrown@transcient.com



This archive was generated by hypermail 2.1.5 : Fri Nov 01 2002 - 15:05:48 MST