Re: SI: Singleton and programming

From: Nick Bostrom (bostrom@ndirect.co.uk)
Date: Sun Nov 22 1998 - 16:32:42 MST


I'm still looking for a really good definition of what a singleton
is (maybe I will finally have time to finish that paper this
Christmas holiday?). However, I can say this about the concept I had
in mind:

1. It does not imply a unity of mind. The singleton could have one
unitary mind, or it could contain lots of independent minds (e.g.
human minds).

2. It has more to do with global efficiency. Robin Hanson's paper
about burning the cosmic commons in a Darwinian race to colonize
space depicts a scenario that is not compatible with the singleton
hypothesis since it would be globally wasteful.

3. You may ask, efficient for what? On this the singleton hypothesis
is silent. One can imagine any of a large number of global goals
any of which could be adopted by a singleton (e.g. the goal to
allow humans and posthumans to freely pursue their goals without
being coerced.)

And finally, why bother about what happens after the singularity (if
indeed there will be one)? Eliezer thinks that it can take care of
itself. Well, I think that for all we know, several different
post-singularity paths may be possible, and which one is actually
realized might depend on human choices between today and the final
moments before the singularity. We therefore want to understand what
the possibilities are so we can try to bring about the one we like
best.

Nick Bostrom
http://www.hedweb.com/nickb n.bostrom@lse.ac.uk
Department of Philosophy, Logic and Scientific Method
London School of Economics
