Re: non-SI technothreats [was: Sweden & Germany to phase out nuclear power?]

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Thu Mar 28 2002 - 17:31:47 MST


"Robert J. Bradbury" wrote:
>
> On Thu, 28 Mar 2002, Dan Clemmensen wrote:
>
> > What part of this seems dangerous? I think a human race without SI is
> > increasingly dangerous. We've somehow managed to avoid nuclear war
> > so far, but nanotech and some forms of biotech look like threats that
> > may be beyond practical human control.
>
> I disagree. There are plenty of "threats" out there now (asteroids
> we are unaware of, 9.0+ earthquakes in the Pacific Northwest,
> slumbering volcanoes waiting to blow their top, etc.) that involve
> *no* tech and are significantly beyond the "control" abilities
> of technology currently available to us. In contrast, with biotech
> and nanotech I can cite very specific solutions to problems that
> one might anticipate could develop.

I think that Dan Clemmensen and I, when we talk about "threats", mean
"existential risks or things that impact existential risks". Anything that
is not an existential risk should be called by its proper name,
"inconvenience".

> The sun growing into a red giant and roasting the Earth is a virtual
> certainty.

Huh? Five billion years from now, the only thing that determines whether
the sun grows into a red giant and roasts the Earth will be how we feel
about it. I don't see how you can be so certain about which way public
sentiment will run.

Besides, aren't you one of the "mass-energy is the limiting resource"
types? If so, why would there still be a Sun?

-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence
