Re: POLITICS + POSTHUMAN FUTURES

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Sun Apr 14 2002 - 09:59:36 MDT


The only way I know of to keep a mailing list free of politics and
pseudoscience is to moderate it. For example, on a certain mailing list
that shall go unnamed, what threatened to degenerate into political
discussion was immediately met with:

> *****-as-moderator:
>
> Just a reminder... this is not the Extropians list and all political issues
> that do not contain *direct* futuristic content are not appropriate for
> ****. I'm not saying that a good pre-Singularity sapiens should never think
> about politics or that politics has no conceivable impact on the
> Singularity, but **** does specialize and there are more appropriate forums.
>
> *****-in-personal-capacity:
>
> The great social game of politics can affect the Singularity and it's okay
> to pay attention to it, but actually becoming emotionally involved in it is
> a separate issue. Naturally our hunter-gatherer genes predispose us to
> grant an enormous amount of attention to the modern-day equivalent of tribal
> factional politics, and naturally this is not the rational distribution of
> effort for a Singularitarian or anyone else who can leverage their effect on
> the future through more effective means than enjoyable political argument.
> Politics, viewed externally, is a real phenomenon that has a real impact on
> the Singularity; viewed internally, it is a sport with teams.
>
> *****-as-moderator:
>
> It's okay to have a favorite team. It may even genuinely be the best team
> for the Singularity. But please don't cheer your team on ****.

Okay, so I'm not fooling anyone. But I wish to point out that the price of
an unmoderated list is bucketloads of irrelevant crap, and eventually the
irrelevant crap may drive away the real posters. If so, it doesn't mean
that Extropy is dead - just that the Extropians list is sick. There are
other forums; I'd be sad to see this one go, but it isn't the whole
transhumanist movement in itself.

-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence
This archive was generated by hypermail 2.1.5 : Sat Nov 02 2002 - 09:13:31 MST