From: Michael Roy Ames (michaelroyames@hotmail.com)
Date: Wed Jun 26 2002 - 10:45:43 MDT
Eugen Leitl has stated that he considers the FAI and SeedAI memes to be
dangerous, and wants to fight against them. I am sure there are others on
the list who, though not as vociferous, hold similar opinions - or at least
reservations about the ideas. Well, that's perfectly fine on the face of
it. Opposition to new ideas causes those who are promoting the ideas to
defend them. Ideas that are good and correct get improved further, and
ideas that are bad and false get discarded. This is how we all make
progress. It is a win-win situation where everybody's good ideas get
improved, and bad ideas get revealed as being bad and discarded.
But there is a type of opposition that turns that win-win into a lose-lose
scenario. When someone mindlessly opposes a new idea out of irrational belief, much time is wasted by both opposer and opposed. The 'Creationism' debate is a good example of this. But one does not have to hold an
irrational belief to turn opposition into a lose-lose activity. If someone
wanted to hamper the efforts of SIAI, they could simply 'distract' the
participants with semi-plausible objections. This would be intended to
'burn cycles' of the people most involved in promoting the new ideas. On
the surface this might appear to achieve the aim of slowing down the spread
and/or development of the meme, and in the very short term that might indeed
be the case. However, in the mid to long-term, this strategy is
counter-productive for both parties. It is counter-productive to the
opposer because the time taken by the opposed to respond to semi-plausible
objections could have been spent considering *well thought-out* objections -
the kind of objections that might improve the idea into a form that the
opposer would support!
A single, insightful, well-explained objection is worth a thousand
semi-plausible criticisms.
Eugen: If you want to be *effective* in your fight, do your homework! Give
us a good argument, not fuzzy, out-of-focus, or generalized criticisms. I
do not support FAI because of irrational belief, but because of a detailed
rational argument. If further arguments (or evidence) were presented that
invalidated FAI, then I would not continue with my support, but would move
to the 'next best' option... maybe one of your suggested options!
Michael Roy Ames
Ottawa, Canada