Bryan Moss wrote:
>
> Eliezer S. Yudkowsky wrote:
>
> > *My* hoped-for timeline:
> >
> > 1999-2007: Nothing much happens.
>
> How about, Eliezer [does all the work]

By "nothing much happens", I don't mean that we're sitting around
twiddling our fingers. I mean that no technological developments of
sufficient magnitude to disturb the world order occur.
> > July 6th, 2007: The Singularity Institute announces that
> > they're testing an AI. The press release uses the words
> > "self-understanding" and a lot of technobabble.
> > Absolutely nobody uses the words "Singularity" or
> > "transcend".
>
> Well, the press release will almost certainly say
> 'Singularity' in it somewhere, since it's from the
> Singularity Institute an' all.
Hm. That's actually a darned good point. I'll have to think about that one.
There really is a problem implicit in trying to do things neatly and quietly; if you're too quiet, you don't get the resources. There's something of a tradeoff between assembling the technophiles and panicking the technophobes... I think I tend to lean towards assembling the technophiles, since the technophobes can always panic anyway. (It doesn't matter whether or not you admit to trying to bring about the end of life as we know it, since you'll still be accused of it.)
When in doubt... I think I still believe human interactions in twentieth-century liberal democracies are sufficiently fair that honesty remains the best policy.
--
        sentience@pobox.com          Eliezer S. Yudkowsky
         http://pobox.com/~sentience/tmol-faq/meaningoflife.html
Running on BeOS           Typing in Dvorak          Programming with Patterns
Voting for Libertarians   Heading for Singularity   There Is A Better Way