From: Anders Sandberg (asa@nada.kth.se)
Date: Fri May 24 2002 - 06:59:39 MDT
On Fri, May 24, 2002 at 08:11:57AM -0400, Eliezer S. Yudkowsky wrote:
> Anders Sandberg wrote:
> >
> > Two articles in The Economist worrying about the ethics of brain science
> > and how it may create a (shudder!) posthuman future:
> >
> > http://www.economist.com/opinion/displayStory.cfm?story_id=1143583
> > http://www.economist.com/science/displayStory.cfm?story_id=1143317
> >
> > See the effect Fukuyama et al has? We better respond proactively - not
> > doing anything will lead to a walk-over situation. We need more books
> > like Greg Stock's.
>
> There is no "we". Nothing is ever done by "we". But if anyone wants to
> step up and say "I will write a letter to _The Economist_," I'm sure we'd
> all be grateful.
OK, sorry for the misuse of 'we'. Dangerous Scandinavian trait.
As for letter-writing, it might be an idea if the writer has something
constructive to say. Just disagreeing with the articles is not enough;
it is necessary to engage with their core ideas.
As I see it, the basic claims are:

1) Neuroscience has the potential to do a lot of amazing, worrying and
humanity-changing stuff. I think most of us agree totally here.

2) Neuroscience is not covered very much in current bioethical
discussions or subject to the same scrutiny as genetics. Again, I think
this is a good point.

3) Since changes to the human condition are bad or should be viewed
with suspicion, we need to monitor neuroscience more closely to handle
this.
Interpreted weakly, this claim might even be acceptable to many
transhumanists - we would like to get to posthumanity, but preferably
without disasters, so we would like to scrutinize every step carefully.
But the tone of the article - and this is what really affects people,
not the semantics - consistently suggests that posthumanity is bad, so
the meaning of 3) really becomes "we need to monitor the research so
that it doesn't allow posthumanity to come about".
Our basic disagreement is of course with this, but dealing with it is
harder. I think Greg Stock did an admirable job in his book of listing
the most common arguments against genetic augmentation and giving them
at least partial answers, but we need to dig deep into all of them and
provide strong counterarguments. Some points are of course
fundamentally impossible to resolve (like many theological issues), and
there the issue instead ought to be how to make it possible to co-exist
in the world. Others are far easier to deal with (though by no means
simple), since they are about likely consequences and can in principle
be analysed and answered. Whether genetic design is likely to create a
class society is something that can be at least partially answered. But
these arguments and answers need to form a somewhat coherent counter to
the worldview of our opponents.
Bill Joy went after biotech, nanotech and AI. Fukuyama focused on
biotech; the Economist did neurotech. Expect others to weigh in soon on
why we need to monitor AI research. Right now these are just sideshows
to the biotech spectacle, but they all fit into a greater trend of
thinking that sees our goals as fundamentally antithetical to everything
good and just. In order to be allowed to pursue our goals (and in order
to increase the likelihood that technological change does not cause
radically nasty consequences - a likely result of some of their
proposals!), we need to strengthen our own worldview and make it a
point of view that is accepted as valid in the debate.
--
-----------------------------------------------------------------------
Anders Sandberg                                      Towards Ascension!
asa@nada.kth.se                            http://www.nada.kth.se/~asa/
GCS/M/S/O d++ -p+ c++++ !l u+ e++ m++ s+/+ n--- h+/* f+ g+ w++ t+ r+ !y