From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Thu Sep 25 1997 - 19:51:50 MDT
The Low Golden Willow wrote:
>
> Er, what were you thinking of by "Your generation"? I'm 22. And when
> the Singularity could be something like the Blight I don't think fear of
> it is connected to discomfort with computers. Or fear of being
> uploaded. It's more a fear of being wiped out.
Yes, but *why* does someone fear being wiped out?
Maybe it's indirect - xenophobia to fiction to the popular "Borg" image to fear -
but the root still lies in xenophobia. And someone totally comfortable with
computers is less likely to have an *irrational* fear, or to have a big
emotional investment in that fear.
I don't think I'd like to be at Ground Zero of a Blight either. This means
I'll be careful - not that I'll be frightened of the Singularity. Are you
frightened of your car because a crash at 60 mph would wipe you out? No,
because you're used to going very fast. Speed doesn't scare you.
But take someone who's never been in a car out on the Interstate...
Re: Generations - since my postulated threshold was learning to use a computer
before puberty (the usual maturation window), you might well be of a different
generation. Or maybe not. It was just a suggestion. I think some study or
other demonstrated that "generation gaps" were actually occurring at
four-or-five-year intervals, or something like that.
My little brother (age 12) is reading Broderick's "The Spike". Will he have a
still different attitude toward the Singularity when he grows older? Time will tell.
The early-adopter issue may confuse the picture as well. Is everyone of the
same age in the same generation?
And finally, you might not be a counterexample to my basic claim. If you
reject the Transcension on purely rational grounds (no fear) and I adopt it on
equally rational grounds, then this would exactly support my point.
> } You can use computers, but can you freely and without prejudice regard them as
> } human? Not that my Plus was human, but there may be a hardwired maturation
>
> Regard them as human? Half the time I regard myself as "an AI wannabee
> trapped in a male human body".
There ya go. I agree fully.
> The difference between the Culture and
> the Singularity isn't one of AI rights, or technical capability, or
> maximum intelligence. It's one of ethics, of the behaviors of
> civilization surviving through and beyond a rapid transition. If I was
> in the Culture I'd happily fork, or even destructively upload, and grow
> my way up to Mind status.
What is the "Culture", and who is this author whose books I clearly need to buy?
But going on your description of the Culture, it sounds like an Abort - a "go
two inches into the Singularity and freeze" scenario. Or a Kadath where
humans may upgrade at leisure.
The former sounds like a simple failure of imagination. The latter depends on
Power Ethics, which I'm tired of talking about. Flip a coin.
> But I'd be happy knowing that the Amish were still toiling the soil in
> Pennsylvania. Not because I have great love for the Amish, but because
> a Mind-civilization which had that much respect for pre-existing
> sentience would be a profoundly safe and libertarian (liberal) society
> for anyone else. The image of Singularity I get these days is one of
> everything going totally haywire. But I don't see any proof that's
> inevitable. So I root for civilization.
>
> } And I don't find it frightening at all to contemplate stepping entirely into
> } Other Reality. I grew up there, after all - it's as much home to me as
>
> There's a difference between "stepping into" and "being sucked up".
You're acclimated to uploads. You're not (instinctively) frightened at all.
You're just worried about preserving your Libertarian principles. You don't
want to coerce the Amish and you don't want to be coerced. Well, good for
you. If I were in charge of uploading humanity, you'd better believe it would
be voluntary - except *maybe* in case of death, and even then as a static
copy. But I don't expect to be in charge, and it looks to me like the logic
of Libertarianism breaks down if you're omniscient For All Practical Purposes.
> } Singularity. Then I continued the calculation to find that 3 years after
> } human-equivalent AIs, AIs reach infinite speed, and I called *that* the
>
> } You call me a Rapturist. "Humanity becomes something unimaginably different
>
> Funny that. "Infinite speed"?
Sure. Just like a Schwarzschild singularity is infinitely dense and
infinitely curved. Needless to say, I expect infinite speeds to be achieved
as the result of a single breakthrough, not an infinite series of Moore's Law
doublings. But yeah, I have no problem with the concept of thinking infinitely
fast. So I can't visualize it! All that means is that my semantic primitives
are deficient. Like, wow, I never would have guessed. I can't wait to fix the problem.
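(For what "infinite speed in finite time" even means, here's a minimal sketch -
one illustrative model among many, assuming an AI's thinking speed v feeds back
into its own rate of improvement:

\[ \frac{dv}{dt} = k\,v^2 \quad\Longrightarrow\quad v(t) = \frac{v_0}{1 - k v_0 t}, \]

which diverges at the finite time t* = 1/(k v_0). Plain exponential growth -
Moore's Law left to itself - stays finite forever; the feedback term is what
buys the finite-time blowup.)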
> } But I grew up, partially, in Other Reality, where all Laws are transient as
> } clouds, blown about on the winds of technology. All except one: Moore's Law.
> } Things always get faster, more powerful, more complex. Always. Where Other
>
> Always? This law which didn't exist a few decades ago?
1) Moore's Law has held for, what, 30 generations? Count those like human
generations - call it 30 years apiece - and that's a human millennium or
thereabout, right? Long enough for me.
2) I didn't exist a few decades ago either. I'm not about to speculate about
things outside my experience. (Ha, ha, ha.)
> And do you
> really not believe in sigmoid curves?
Sigmoids apply to *specific* technologies, not whole fields - much less to
enhanced intelligence. Maybe, if there's an unbreakable upper limit - which I
very strongly doubt - the Singularity will reach a ceiling and instantaneously
flatten. That's not a sigmoid; that's a hyperbola and a constant function
spliced together.
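(To make that distinction concrete - my notation, purely an illustration: a
sigmoid saturates smoothly, while the spliced curve runs hyperbolically into
the ceiling and stops dead.

\[ s(t) = \frac{L}{1 + e^{-k(t - t_0)}} \qquad\text{vs.}\qquad f(t) = \begin{cases} \dfrac{c}{t^* - t}, & t < t_1, \\ L, & t \ge t_1, \end{cases} \qquad t_1 = t^* - \frac{c}{L}. \]

The sigmoid s(t) only approaches its limit L asymptotically; f(t) actually
hits L at t_1 - continuity forces t_1 = t* - c/L - and flattens
instantaneously, a kink rather than a smooth rollover.)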
--
sentience@pobox.com Eliezer S. Yudkowsky
http://tezcat.com/~eliezer/singularity.html
http://tezcat.com/~eliezer/algernon.html
Disclaimer: Unless otherwise specified, I'm not telling you everything I think I know.