From: Mark Walker (mdwalker@quickclic.net)
Date: Tue Nov 27 2001 - 09:03:50 MST
----- Original Message -----
From: "Samantha Atkins" <samantha@objectent.com>
>
> > There are two big families of argument here, at least as I see it.
> > (1) 'Moratoriumists' believe that we do not have the wisdom at our
> > current stage of cultural development to use Person Engineering
> > Technologies (PETs), but leave open the possibility that in the
> > longer term this might be ethically ok.
>
> I have some amount of sympathy for this position. The amount of
> wisdom we humans apply to any level of technology today is not
> at all impressive. But I would rather see the technologies
> develop, including ones that might help produce more
> intellectually capable humans (and other sentient beings), than
> wait some indefinite period of time at our current levels and
> capabilities - even if waiting were remotely possible.
I guess most of us here agree. The trick, of course, is to make the case
compelling to a wider audience. Prima facie, I think the Moratoriumist
position might appeal to many secularists, as it appears to be the prudent
option.
> Personally I
> don't think more raw intelligence and computational power alone
> will be sufficient to arrive at wisdom but that is a different
> question. The other arguing point against this position is to
> attempt to pin down exactly how these technologies might be
> misused. I can't come up with any very compelling scare cases at
> the moment.
>
I can't either. Of course, the hardest sort of objection is the fear of the
unknown: what if there is an unknown and deleterious consequence of this
technology? (We might think of this as the Marie Curie objection.) I think
we will need a prepared answer to objections of this kind, which, as far as
I can see, can only come from carefully and patiently explaining the
procedures, benefits, and known risks of the technology.
This archive was generated by hypermail 2.1.5 : Sat Nov 02 2002 - 08:12:14 MST