Re: Sweden & Germany to phase out nuclear power?

From: Dan Clemmensen (dgc@cox.rr.com)
Date: Thu Mar 28 2002 - 12:01:28 MST


Samantha Atkins wrote:

> Dan Clemmensen wrote:
>>[...] We can and should employ the "least disruptive"
>> current technologies even if there are long-term side-effects, because
>> we (or our brilliant successors) can easily deal with them.
>
> Wait a sec. Nuclear power is cleaner and causes fewer deaths right now
> (stupidities like the Russian idiocy to one side). Do we not care about
> the people who die due to non-nuclear power in the meantime?
>

We care a lot. I'm fully aware that fission power is the current
rational choice. It is much less dangerous and disruptive than any
alternative by every rational measure. It is infeasible only because
the effort required to educate the public would be too large, and I see
no practical way to undertake it. The death and ecological destruction
caused by the failure to adopt fission power can be laid at the feet of
a mistaken belief system, like the deaths in the USSR in the 1920s and
1930s, the Holocaust, or the current famine in North Korea.

> Singularity, contrary to some opinions, is not inevitable. A long,
> endless war (the dumb US one turned more hot, for instance) could
> seriously slow down technological advances, as could socio-political
> counter-measures (many of which we seem to be attempting).

A short nuclear war, maybe. Long, endless wars tend to force certain
technologies forward. Eugene listed his requirements for halting the
singularity: they include police-state-level control of all software
development. I don't think that is even feasible any more.

>> In my opinion, the correct extropian evaluation uses a very sharp
>> discount rate, because we know that technology will advance much faster
>> than most analysts believe. Therefore, we should favor energy generation
>
> We actually "know" no such thing. Look how easy it was to stop a lot of
> high-tech startups in their tracks with the one-two punch of the tech
> slump and 9/11 and the additional blow of Bushite nonsense. Technology
> is not just technology, it involves people, politics and business.

How, exactly, were the dot-coms contributing to the advancement of
technology? The meltdown mostly affected dot-coms and certain
telecommunications sectors. The dot-coms were mostly worthless. The
telecoms that failed were all spending VC money, with five or ten
companies chasing each new niche product. The consolidation when more
than half of these companies failed did not slow the technological
progress of the survivors, except perhaps positively as the design
teams consolidated. I worked for IPOptical until we ran out of cash;
now I work for HyperChip. IPOptical was a bit behind and was depending
on outside hardware that the chip makers failed to deliver
(overambitious). HyperChip built their own hardware but could not hire
enough programmers because of the insane bubble. The bubble burst,
HyperChip hired a bunch of us, and we'll deliver a massively scalable
IP router this year.

>> but is a direct consequence in my rational analysis that the singularity
>> is highly likely to occur before 2020.
>
> I do not think this is highly likely any more. The world is a good deal
> different than it was a couple of years ago and not altogether for the
> good of such a prediction. I believe Singularity is possible by 2020
> but not so likely. If it does come before 2020 I would expect it to
> come from nanotech advances first rather than AI. I don't believe we
> have much of a workable plan for producing a SI that soon yet.

We have no plan. Well, some folks have some plans, but my analysis does
not depend on any particular plan. Rather, I expect that the increasing
amount of computational horsepower means that the brilliance required
of the software design decreases over time. When a clever designer
finally does create the SI, we will see in retrospect that a truly
brilliant design would have worked on hardware available several years
earlier, and that an SI is likely achievable on today's available
computing power.

>
>> This does not mean that I'm
>> waiting for the tooth fairy to bring about the singularity. It does mean
>> that I'm no longer interested in the power debate.
>
> Why? Because the SI (supposedly) will figure it all out for us as soon
> as it kicks in? This seems very dangerous to me.
>

The SI will either be benign or not. If benign, it will indeed "figure
it all out." But an SI may very well be a human-computer collaboration,
or may incorporate humans in some way.

What part of this seems dangerous? I think a human race without an SI
is in increasing danger. We've somehow managed to avoid nuclear war
so far, but nanotech and some forms of biotech look like threats that
may be beyond practical human control. The lesson of 9/11 is that
society makes increasing power (in the physics sense of the word)
available to any determined individual. This will only accelerate.



This archive was generated by hypermail 2.1.5 : Sat Nov 02 2002 - 09:13:07 MST