From: Dan Clemmensen (dgc@cox.rr.com)
Date: Sat Apr 13 2002 - 13:18:03 MDT
Harvey Newstrom wrote:
>
> On Friday, April 12, 2002, at 03:14 pm, Eugen Leitl wrote:
[SNIP 'gene's feasible solutions/activities]
>
>> Guess what? No one is doing this. Even if it would save money on the long
>> run. Instead, people go for idiotic schemes like that Los Alamos thing. I
>> don't see the point for going expensive hi-tech as long as the low tech
>> options haven't been exhausted.
>
>
> I believe this attitude is the root cause of the problem above. People
> are so convinced that this stuff is already happening, that they don't
> see any need to actually do any work themselves. There is an alarming
> trend on this group to dismiss real solutions on the theory that the
> Singularity will save us. I ha[...]
>
>> I'm also looking for practical solutions -- but the process as described
>> is completely bogus, as are quite a few processes proposed recently. I
>> wonder what makes them pass even minor scrutiny; probably the press is
>> clueless about physics.
>
>
> There does seem to be a lot of pseudoscience lately, even on this list.
> People just don't have enough basic science to evaluate whether
> something makes sense or not. Most people just jump on an idea based on
> their own desire for it to be true. I worry that most of our own pet
> technologies promoted on this list are really bogus. We seem to have
> more hype than real research. So many people are assured that this
> stuff is inevitable that I wonder if we are really slowing down
> progress. Most of what gets posted as "science" to this list is really
> just armchair conjecture. Maybe real scientists don't have time to
> participate. I'm not sure how to address this problem, but I am
> beginning to believe that pseudoscience from within our own ranks is the
> biggest threat to our future. I've seen people discount their own health,
> space travel, cancer research, scientific experimentation, and other
> "good things" on the theory that the Singularity will arrive first and
> do those things for us faster than we could do them ourselves. They then
> prefer to sit and do nothing and actively discourage these real advances
> in favor of a faith-based future that may never materialize. This
> really scares me, because it is the same religious attitude that
> prevents cult members from taking control of their own lives. They
> expect the apocalypse or revolution to come before any realistic goals
> can be met. To hear the old religious fallacies being repeated by
> our own people is disturbing.
>
I understand your concern, but I feel that it is misplaced. I think our
difference in perspective derives from fundamentally different
evaluations of the likelihood and timing of the Singularity. You perceive
my "singularity soon" evaluation as "pseudoscience" and "faith-based."
I perceive your "singularity later or never" evaluation as "luddite" and
"self-deluded" (to use words with a non-objective emotional content
equivalent to the ones you chose). We should try to work together to
create an evaluation matrix that we can agree on, using words with
higher rational content and lower emotional content. Here's a start.
Please feel free to adjust the terminology if you feel that mine is too
biased, and I'll do the same with any reply. Let's examine the
consequences for the four extreme scenarios:
1) We act as if the Singularity is late/never, and it is.
2) We act as if the Singularity is late/never, and it is not.
3) We act as if the Singularity is soon, and it is.
4) We act as if the Singularity is soon, and it is not.
Your complaints address scenario 4. If we all sit around doing nothing
and 4 occurs, things could very well get a lot worse before we realize
our mistake: at the extreme, we lose our one chance at saving the
planet, the race, and/or civilization. I understand and agree that this
extreme approach is a bad thing.
My complaints address scenario 2. If we aggressively allocate resources
to solving long-term problems with long-payback solutions, we will waste
resources that could otherwise be used to accelerate the Singularity.
Up-front costs are very real, and will not be recovered if the
investment is rendered technically obsolete. To the extent that this
delays the Singularity, the delay extends our vulnerability to
pre-Singularity catastrophes: at the extreme, we lose our one chance at
saving the planet, the race, and/or civilization.
(I think there is no major problem with the other two scenarios: the
advantages are obvious.)
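To make that concrete, here is a minimal sketch (in Python) of the kind
of evaluation matrix I mean. The payoff numbers and probabilities are
purely invented placeholders, not claims; only the structure matters.

    # Evaluation matrix sketch: payoffs[strategy][reality], higher is
    # better. The numeric values are illustrative placeholders only.
    payoffs = {
        "act_as_if_late": {"soon": 0.4, "late": 0.6},  # scenarios 2 and 1
        "act_as_if_soon": {"soon": 0.9, "late": 0.3},  # scenarios 3 and 4
    }

    def expected_payoff(strategy, p_soon):
        # Expected value of a strategy given P(Singularity is soon).
        row = payoffs[strategy]
        return p_soon * row["soon"] + (1.0 - p_soon) * row["late"]

    for p in (0.2, 0.5, 0.8):
        best = max(payoffs, key=lambda s: expected_payoff(s, p))
        print("P(soon)=%.1f -> prefer %s" % (p, best))

The argument then reduces to where the crossover probability lies, which
is something we could debate with numbers instead of labels.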
Sitting on our thumbs and doing nothing is a certain losing strategy.