> Nicholas Bostrom <bostrom@mail.ndirect.co.uk> wrote:
>
> > Safe Libertarian Future:
> > The scenario assumes that many humans value freedom and independent
> > personal existence more highly than anything else. When
> > nanotechnology approaches, they realise that if freedom is allowed
> > in a world with strong nanotech, some mad person will certainly
> > design a doomsday virus, and so they will have to give up their
> > freedom. But
> > then some bright person comes up with the idea that all people upload
> > and that only a robot is left with the ability to operate in the real
> > world. The whole system is hardwired so that the robot only executes
> > instructions that have been agreed upon by the majority of the
> > uploads. In their virtual reality, the uploads can do anything they
> > want: each one has unlimited individual freedom. The only thing they
> > can't do in the virtual reality is to mass murder a lot of other
> > uploads (the virtual physics doesn't allow destructive nanomachines
> > to be built, for example). The uploads cannot influence the external
> > world either, except by majority decision. But for many decisions
> > this should be feasible: e.g. colonising the galaxy to provide more
> > Lebensraum. One can even imagine refinements of this scheme such
> > that each individual would have his own robot that he could do what
> > he liked with, though this presupposes that the robots could be
> > built in such a way that nobody could use his robot to do anything
> > that would endanger the computer on which they all existed.
> >
> > This is the only way I can think of in which an almost completely
> > libertarian society, without any guardian or international
> > government, could exist long after the arrival of strong nanotech.
>
> There will always be plenty of people who'll refuse to get uploaded
> into some virtual-reality asylum. What about them? Would they be
> forced to upload (upload or die!)?
Yes.
> IMHO a democratic system
> like the one above is almost by definition a severe handicap in the
> event of a conflict with "free" outsiders, because while the
> democrats are busy debating and voting, the enemy has already
> launched his proton torpedoes or whatever.
There is a good chance we will never be attacked by aliens. And if we
were, those aliens would surely be smart enough not to attack unless
they were certain they could win easily, whatever we did. If alien
invasion really were an issue, we could develop an automated
missile-launch system or something like that.
> What would happen if someone trashed the robot (the only outside
> link)? Would the VRs be trapped in their "dreamworld" forever?
Well, I spoke of "the robot" figuratively. In reality this would
consist of millions of von Neumann probes expanding our computer in
all directions.
> Anyway, I think the only way you can stay reasonably free *and* safe
> is if everybody (possibly in small like-minded groups) leaves Earth
> and goes off in different directions. A million light-years or so
> seems (given the current laws of physics) like a pretty safe barrier.
Well, then you would agree that we need some temporary accommodation
for the next million years or so.
------------------------------------------------
Nicholas Bostrom
bostrom@ndirect.co.uk
*Visit my transhumanist web site at*
http://www.hedweb.com/nickb