"Robert J. Bradbury" wrote:
>
> Brian wrote:
> > At some point the matter (as in atoms) must fall under someone's
> > control, and personally I don't relish the idea of having to
> > constantly protect myself from everyone else who can't be trusted
> > with nanotech and AI.
>
> This statement contains some assumptions that mis-represent
> the problem. You *are* already at war with bio-nanotech
> entities who have no interest in your personal survival.
> Your immune system is generally effective in defending
> against such entities without the requirement for constant
> intervention.
And how, pray tell, will I come up with an effective immune system
against hostile nanotech and/or SIs? I'm sorry Robert, but you
are the one misrepresenting the problem here :-)
>
> Further, you link nanotech and AI in a way
> that suggests a threat that may be invalid. You can have
> nanotech and you can have AI, but for either of them to be
> overwhelmingly threatening, their heat/surface-area dissipation
> has to have a very recognizable signature. You have to make
> a strong assertion that such organizations of matter can
> hide themselves from detection or elimination. If you cannot
> make that case then it does not matter whether or not such
> organizations of matter can be "trusted".
This doesn't make much sense to me either, but perhaps you can
explain how I am going to have some nanotech to defend against
the (most likely) superior nanotech of an SI. Am I forced to
engage in an intelligence war with it, trying to stay smart
enough just to keep my defenses worthy? And what about the other
people whose defenses fail?
>
> > All it takes is one Blight to wipe us out. That kind of threat
> does not go away as humans progress to transhumanity; rather, it
> increases in likelihood. What is the stable state if not Sysop
> or total death? There may be some other possibilities; can
> you name some?
>
> Well, evolution in an upward spiral committed to extropian
> principles seems like a stable state.
Sure, if every single person in the solar system agrees with you, that
might work. Personally I am not happy to be forced (coerced) by the
kind of environment you are describing into constantly evolving my
intelligence and defenses just to keep from being killed.
>
> Note that this is an unstable state for humanity, as it likely
> concludes that the instantiation of organized matter as "humans"
> is an inefficient use of the locally available resources and must
> be concluded.
And your answer for the Amish is? They get eaten by whoever lacks your
ethical standards?
>
> So, the question would be, can one (as an extropian)
> provide a coherent defense of the use of the material
> and energy required for one to remain "intact" in the
> light of more efficient reallocation of the resources
> you are consuming?
>
> This is the moral problem -- Are you committed to the evolution
> of matter to its highest extropic state, or are you committed
> to the preservation of yourself in a form that seems comfortable
> to you?
My personal commitments do not matter to another person, nor should
they. I will not be a willing party to a future where everyone is
coerced into upgrading. It should ideally be left to each person's
own choice.
>
> Do the needs of the many (the potential application of the atoms
> at our disposal) outweigh the needs of the one (your personal survival)?
>
In the Universe I want to live in, the needs of each individual will
be protected. If this makes it impossible for you to use the Earth's
atoms for your pet science project, too bad.
--
Brian Atkins
Singularity Institute for Artificial Intelligence
http://www.singinst.org/