From: Adrian Tymes (wingcat@pacbell.net)
Date: Mon Sep 25 2000 - 21:36:31 MDT
"Peter C. McCluskey" wrote:
> As the size of the benefit gets up into the trillions of dollars, the
> effects start to resemble making one person a world dictator. It is
> this kind of consideration that would probably cause me to oppose a
> "benefit" of this sort that exceeded something like $100 billion.
> I will be reluctant to believe people who are claiming they would
> support giving someone $100 trillion until they explain why they
> aren't deterred by the maxim "power corrupts".
Do I get advance (relative to most of the world) warning as to who this
individual will be? If so, then I can get a head start on corrupting
the recipient towards my views, like so:
* Cyber, nano, AI, etc. mean more power for those who master them
* Easiest way to get this is to fund its development for everyone
* You still get a big head start; others can catch up, but by then
you're doing something which might win you everything forever (this
is probably a minor lie)
* I can make this stuff for whoever funds me
And just as the recipient is pondering this, the money arrives...
Who says you can't use evil in the service of good? ^_^