From: Mike Lorrey (mlorrey@datamann.com)
Date: Thu Mar 14 2002 - 18:08:05 MST
hal@finney.org wrote:
> >
> > Fukuyama then draws on Aristotle and the concept
> > of "natural right" to argue against unfettered
> > development of biotechnology.
>
> I wonder how our local "natural rights" theorists see this. It's always
> seemed like a serious problem with the natural rights approach, that
> as we change what is natural we lose all guideposts as to what rights
> should exist.
I don't see any reason to 'lose' guideposts, nor do I see transhumanism
as being against a natural rights POV. Much extropic thought has been
focused on using technology to allow the individual to RECLAIM those
natural rights which have been confiscated by religions, statists, and
industrial complexes over the past couple of millennia. Aristotle's
statements about natural rights recognize that the use of technology as
man sees fit is a natural right (derived from the existence of the hand
and the inventive mind), something Fukuyama should be aware of.
Fukuyama contradicts himself when he supports using government force to
restrain the individual's right to determine his own path of
development. It indicates that, for all his posing about trusting the
individual, he still defaults toward statism when it comes to these
future technologies.
Part of it is that there are, in fact, real risks to these technologies.
What Fukuyama fails to note is that it is governments and state-like
multinational corporations that are far more likely to abuse these
technologies, seeking to use them as weapons or to exploit consumers and
employees. It is not the individual who needs to be restrained with
regard to these technologies; it is the state.
History has shown that far, far more damage is wrought by the state with
technology, in human lives lost as well as in lost liberty and
environmental vitality. Large-scale application of technology to
agriculture and industry, by both states and corporations, comes in
second to this, while use of technology by individuals is far less
damaging.
Of course, a major problem with individual action at all levels is that
both state and corporate mechanisms generally shield the individuals
within such structures from responsibility for their actions. A huge
percentage of the human race is still unable and/or unwilling to act in
its own *rational long-term self-interest*, and as such DOES tend to
act in ways which lend support to statist mistrust of individuals. This
is primarily due to a lack of the technology needed to efficiently
access and exploit resources, but also to undemocratic force monopolies
practicing looter economics.
>
> > His claim is that a
> > substantive human nature exists, that basic ethical
> > principles and political rights such as equality are
> > based on judgments about that nature, and therefore
> > that human dignity itself could be lost if human nature
> > is altered. Finally, he argues that state power,
> > possibly in the form of new regulatory institutions,
> > should be used to regulate biotechnology, and that
> > pessimism about the ability of the global community to
> > do this is unwarranted.
>
> Of course I hope that we do not see these kinds of restrictions, but
> contrary to many here, I welcome consideration of these issues by writers
> as influential as Fukuyama. It is time for the larger world to begin
> to grapple seriously with the changes ahead. They can no longer hide
> behind the comforting belief that it's all just science fiction.
I would love to get a chance to bend his ear for a while.