From: Eliezer Yudkowsky (sentience@pobox.com)
Date: Tue Feb 04 1997 - 19:48:55 MST
[Saith Lee Daniel Crocker:]
> (7) Other complex forms will evolve to continue themselves and use
> resources at my expense if they can. To ensure the continuation of
> my consciousness, therefore, it is in my interest to learn as much
> as I can about the nature of such systems, and to create technologies
> that successfully defend against them. Because of (5), this is the
> course of action most likely to lead to my goal.
Really? So the logical course of action for a Power would be to stomp
on you before you stomped back. Similarly, the logical course of action
on my part would be to stomp on you before you stomped out my precious
Singularity. I prefer to think that cooperation between sufficiently
intelligent beings is axiomatic - that is, any two Powers will cooperate
on the Prisoner's Dilemma. You apparently believe it is axiomatic that
they will both defect.
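(A minimal sketch of the standard one-shot Prisoner's Dilemma, in
Python; the payoff numbers below are the usual textbook ones, not
anything specified in this thread:)

    # Standard one-shot Prisoner's Dilemma payoffs (textbook values,
    # chosen only for illustration): temptation > reward > punishment
    # > sucker's payoff.
    PAYOFFS = {
        ("cooperate", "cooperate"): (3, 3),  # reward for mutual cooperation
        ("cooperate", "defect"):    (0, 5),  # sucker's payoff vs. temptation
        ("defect",    "cooperate"): (5, 0),
        ("defect",    "defect"):    (1, 1),  # punishment for mutual defection
    }

    # Holding the other player's move fixed, defection always pays more...
    for other in ("cooperate", "defect"):
        assert PAYOFFS[("defect", other)][0] > PAYOFFS[("cooperate", other)][0]

    # ...yet mutual defection leaves both players worse off than
    # mutual cooperation - which is exactly the tension at issue here.
    assert PAYOFFS[("defect", "defect")][0] < PAYOFFS[("cooperate", "cooperate")][0]

Two Powers that cooperate take the (3, 3) cell; two that follow your
axiom take the (1, 1) cell.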
Putting the ethical issues aside, seeing as how neither of us is a
Power, consider that saying you'll do your best to stamp out all other
sentient entities is not a good idea. They might stamp first.
Generally speaking, trying to maximize your share of the pie in a
negative-sum game is a bad strategy, because if everyone follows it, the
pie shrinks. I have no trouble with working to increase individual
shares, as long as the game being played is always positive-sum. That
is, if you increase your share from 1% to 2%, you need only increase the
size of the pie by 1.1% in doing so. The best way to ensure this is to
declare that all interactions between players will be positive-sum; this
can be done simply by outlawing the use of coercive force, so that no
individual can be compelled to make a trade not beneficial to himself.
In other words, capitalist Libertarianism.
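(To make the arithmetic explicit - the 1%, 2%, and 1.1% figures are
from the paragraph above; the 99/98 threshold, and the reading of
"positive-sum" as "no player's absolute take shrinks", are my own
gloss:)

    # You move from a 1% share to a 2% share while the pie grows by a
    # factor g. Everyone else's share falls from 99% to 98% of the pie,
    # so their absolute take shrinks unless 0.98 * g >= 0.99.
    g_min = 0.99 / 0.98
    print(f"minimum pie growth: {(g_min - 1) * 100:.2f}%")  # ~1.02%
    # The 1.1% figure above clears this minimum with a little room to
    # spare, leaving every other player at least as well off as before.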
So I reject your philosophy on the grounds that it wouldn't work if
everyone followed it - and in fact won't even work for you, because
others will become aware that you intend to use coercive force to steal
their shares, and will launch a preemptive strike.
--
sentience@pobox.com Eliezer S. Yudkowsky
http://tezcat.com/~eliezer/singularity.html
http://tezcat.com/~eliezer/algernon.html
Disclaimer: Unless otherwise specified, I'm not telling you everything I
think I know.