John Clark wrote:
>
> This solution gives insight into NASA's values and its idea of what the future
> will be like. We must take measures to make sure there is not one chance in
> a thousand of killing somebody in 100 years, especially for a reason as trivial as
> scientific knowledge. Even in the year 2100, a re-entering 17-ton non-radioactive
> satellite in a very predictable orbit will still be an awesome danger to our
> powerless descendants.

Nah, I have to admit I see their logic on this one. It's not acceptable
for *them* to decide to make *someone else* run a 1:1000 risk of death
in 100 years, even for a reason as important as scientific knowledge.
NASA doesn't know from Singularity; for all they know, in 100 years
those damn bureaucrats will have given up space travel and carpeted the
Earth with delicately balanced arcologies. I don't know whether the
thought flashed through their minds; however, responsible space travel
as practiced circa 2000 dictates that if you orbit it, you're
responsible for de-orbiting it safely. If NASA wanted the data, it was
their responsibility to put up a satellite where losing one gyroscope
doesn't force the decision to de-orbit it.

It's not that the 1:1000 risk is unacceptable in itself; it's that the
risk is neither traditional nor unavoidable. We accept what are
technically 1:1000 risks as "trivial" if the result is viewed as a
"special case", i.e., something *unexpected* happened. NASA's 1:1000
chance is a *known* 1:1000 chance and is therefore regarded as
unacceptable due to the way the human mind processes risks.
--              sentience@pobox.com    Eliezer S. Yudkowsky
                http://pobox.com/~sentience/beyond.html
Member, Extropy Institute        Senior Associate, Foresight Institute