"Robert J. Bradbury" wrote:
>
> I suspect this will not get linked into the discussion properly but
> one must make an attempt...
>
> My comments & links to the bills can be viewed at:
> http://nanodot.org/news/01/08/01/1333227.shtml
>
> Extropians should be very afraid -- you *are* having rights
> denied/made illegal.
All people should be afraid. Technologies that could cure many
misfortunes and produce near-miraculous things are being killed
when they are barely in their infancy out of fear and
ignorance. Government is arrogating to itself levels of power
for which I see not even a remote Constitutional justification.
And no real hue and cry is raised. We have become so inured to
meaningless pressure-group warfare in place of real political
sanity (if that is not an oxymoron) that we just assume
someone's ox will be gored (no offense to the other guy) and
that there is nothing to be done but to hope it isn't yours this
time around.
>
> Brian wrote:
> > At some point the matter (as in atoms) must fall under someone's
> > control, and personally I don't relish the idea of having to
> > constantly protect myself from everyone else who can't be trusted
> > with nanotech and AI.
>
> This statement contains some assumptions that mis-represent
> the problem. You *are* already at war with bio-nanotech
Starting with the idea that all matter must be under someone's
"control". Or that we will remain substantially the slightly
evolved chimps we are indefinitely, even as we gain godlike powers.
> entities who have no interest in your personal survival.
> Your immune system is generally effective in defending
> against such entities without the requirement for constant
> intervention. Further, you link nanotech and AI in a way
> that suggests a threat that may be invalid. You can have
> nanotech and you can have AI but for either of them to be
> overwhelmingly threatening their heat/surface area dissipation
> has to have a very recognizable signature. You have to make
> a strong assertion that such organizations of matter can
> hide themselves from detection or elimination. If you cannot
> make that case then it does not matter whether or not such
> organizations of matter can be "trusted".
>
If it moves fast enough, it will be unstoppable regardless of how
well it can be detected. Trust will come from a reorganization
of our expectations and of how we see the world and operate as a
consequence. It will not come from the same old MAD games. The
context is too radically fast and different for the old games to
work.
> > All it takes is one Blight to wipe us out. That kind of threat
> > does not go away as humans progress to transhumanity, rather it
> > increases in likelihood. What is the stable state if not Sysop
> > or total death? There may be some other possibilities, can
> > you name some?
>
> Well, evolution in an upward spiral committed to extropian
> principles seems like a stable state. Note that this is
> an unstable state for humanity as it likely concludes that
> the instantiation of organized matter as "humans" is an
> inefficient use of the locally available resources and must
> be concluded.
>
Regardless of what said humans do or do not think about the
matter? Are there some things more sacred than maximum
efficient usage of local resources? How about having the value
of all sentients no matter how "inefficient" as a central value
to be preserved where at all possible?
> So, the question would be, can one (as an extropian)
> provide a coherent defense of the use of the material
> and energy required for you to remain "intact" in the
> light of more efficient reallocation of the resources
> you are consuming?
>
Who the hell cares whether I can or cannot "coherently defend"
my right to the resources I consume? Who says I need to?
> This is the moral problem -- Are you committed to the evolution
> of matter to its highest extropic state or are you committed
> to the preservation of oneself in a form that seems comfortable
> to you?
>
No, I am not committed to the "evolution of matter to its highest
extropic state". I am committed to the highest evolution and
well-being of sentient beings, including humans. All of them.
You don't junk some of them because you can think of a more
efficient use of the resources or could build a more interesting
sentient being out of the parts. To do so denies the value of
sentient beings (real "people") completely. It leads one to seek
Borg perfection by ripping apart and ignoring the value
of all that is alive. It is a hideous vision that I will have no
part of.
> Do the needs of the many (the potential application of the atoms
> at our disposal) outweigh the needs of the one (your personal survival)?
We don't live in a zero-sum game unless we choose to.
- samantha
This archive was generated by hypermail 2b30 : Fri Oct 12 2001 - 14:40:03 MDT