Re: No Singularity?

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Fri Nov 19 1999 - 13:37:49 MST


Once you start asking whether AI labor should be accounted as "labor",
then you're digging down below the level of abstraction at which it is
possible to talk about an "economy". Let's ask about standards of
living. If nanotechnology is around, do standards of living go up?
Yes. I don't care how you define the costs of producing things. I
don't care if you can twist your definition around so that a fancy
dinner costs the same amount in raw material and intelligent labor:
if you define atoms and labor as the basis of cost and assume that
those costs are fundamental constants, which is what Lyle does, then
it's tautologically obvious that the cost of dinner is a constant.

But what matters is the standard of living, and if I can have beef
Wellington for dinner every night, at little or no cost in terms of
mental energy and expenditure of physical effort - these being the two
factors that I would define as important to standards of living - then
the nanoSantas have arrived. And given that AIs, whatever else you can
say about them, are not likely to have to expend "mental energy" to
design a beef Wellington dinner in the first place, much less to rerun
the subroutines to produce a specific instance, this seems like a
reasonable outcome given AIs that interact economically with humanity;
that is, AIs that behave altruistically, or AIs that can be paid in a
humanly comprehensible unit of exchange. The cost of AI labor *is*
effectively zero to the AIs, and it seems plausible that the cost to
humans will be effectively zero as well. So however Lyle chooses to define labor
costs, standards of living still skyrocket, more than enough to justify
the phrase "Santa effect". (This is leaving out the effects of nanowar,
of course.) Likewise, while Lyle may choose to regard the cost of an
atom as constant, the cost to *me* - the expenditure of my time
required - goes way down once automated asteroid mining arrives.

The parts of "Geniebusters" which are not flat wrong are tautological.
The entire site is irrelevant. Have a nice day.

-- 
           sentience@pobox.com          Eliezer S. Yudkowsky
        http://pobox.com/~sentience/tmol-faq/meaningoflife.html
Running on BeOS           Typing in Dvorak          Programming with Patterns
Voting for Libertarians   Heading for Singularity   There Is A Better Way


This archive was generated by hypermail 2.1.5 : Fri Nov 01 2002 - 15:05:48 MST