From: Samantha Atkins (samantha@objectent.com)
Date: Wed Feb 27 2002 - 00:11:18 MST
I see what Eliezer is doing as attempting to create a "way out" of
an otherwise fatally dangerous and unstoppable technological
runaway, by guided technological means. It has nothing specific
to do with "cultural activism" unless that term is used in a way
I am not familiar with. Eliezer says (correctly, by his thinking)
almost nothing about social/political goals and issues; they are
seen as distractions from the essential work. Obviously, this
would not necessarily be a healthy viewpoint if it were
widespread. But he may well be right.
- samantha
Mark Walker wrote:
> ----- Original Message -----
> From: "Eliezer S. Yudkowsky" <sentience@pobox.com>
>
>>Trying to influence present-day culture in the hopes of influencing
>>post-Singularity culture(*) is not the only possible way of entangling
>>yourself in the Singularity. If you overestimate the relative importance
>>of "cultural development" then you run the risk of investing your limited
>>resources incorrectly.
>>
>>-- -- -- -- --
>>Eliezer S. Yudkowsky http://singinst.org/
>>Research Fellow, Singularity Institute for Artificial Intelligence
>>
>>(*) Yeah, right.
>
> I am not sure we disagree here. I would count you among the most devoted
> of the cultural activists: FAI seems to be an attempt to ensure that we
> have the best probability of raising the right sort of mind children. Your
> documents are not attempts at 'pop' culture, but they are contributions to
> culture in a more inclusive sense. As you well know, there are some who
> think that any attempt to influence the direction of the singularity is
> futile. Some of us think that how the singularity occurs could be as
> important as the fact that it does occur.
>
> Mark.