From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Sun Feb 17 2002 - 10:15:23 MST
"Robert J. Bradbury" wrote:
>
> Actually, no. So I may be drawing conclusions where there aren't
> many facts in evidence. I do wonder though if there isn't an implicit
> assumption that the NanoSantas or the SysOp or the Singularity
The term you're looking for is "fluffy bunnies". (As for why, let the
historical record show that this is Emil Gilliam's fault.)
> will arrive so soon that we don't need to prepare for the alternatives.
What this model completely fails to address is that our current actions can
affect the fluffy bunny ETA. We are not limited to waiting around passively
for the fluffy bunnies to arrive.
> Clearly! If one does believe the NanoSantas, SysOps or Singularity will
> arrive in the next 15-30 years, then investing of any type at all
> may be rather foolish. On the other hand if one has a life-philosophy
> of being prepared in case "shit happens", then it makes good sense.
You can also invest in accelerating the arrival of the fluffy bunnies.
Fortunately, fluffy bunny acceleration is an additive group effort.
** From #sl4, when an innocent newcomer was dragged in by her SO:
<Sarah> besides what does it mean to have singularity occur??
...
<EmilG> The Singularity means humanity disappears over the sunset with
a "The End" title across the horizon and a lush soundtrack, and everyone
hugs cute bunnies forever.
-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence