Re: making microsingularities

From: Samantha Atkins (samantha@objectent.com)
Date: Mon May 28 2001 - 19:31:41 MDT


"Eliezer S. Yudkowsky" wrote:
>
> For the record, for those of you who aren't aware, there's a
> significant faction that regards the whole Slow Singularity scenario as
> being, well, silly.
>

But the fast path assumes a lot of "friggin magic". It assumes
that we can build not only a true AI but an SI, and instill
wisdom and Friendliness in it, or have it develop them on its
own.

This would of course be fabulous. But many have difficulty
believing it is possible. Many also have difficulty believing
that instantaneously making life as we know it irrelevant is
all that wise, even if we can pull it off. And, of course, the
consequences of miscalculation are completely catastrophic.

> Slow augmentation through expensive BCI implants may respect cultural
> differences. Over at SIAI, well, those of you who have read Greg Egan can
> think of it as the "Introdus wavefront".

Augmentation over time may be the only path that lets humans
grow into something more than human in a non-cataclysmic way
(without massive destruction, terror, and so on). It may also
be our best bet for keeping the positives of our evolution,
like compassion, as we go forward. It is not as flashy, and it
is still fraught with suffering and potential cataclysm, but it
is in keeping with who we are: a becoming-more that is not
imposed by a few, or by the SI One.

- samantha


