Lyle Burkhead wrote:
>
> Eliezer writes,
>
> > Business: I believe we do have some high-financial-percentile folks
> > reading this list. I would like to see you post... a list of
> > what you're interested in funding (... Extropian business ideas? ... )
>
> I tried that almost three years ago. No response. Extropianism isn't
> about making money.
I am afraid that I agree with you.
> In another post Eliezer writes,
>
> > The most realistic estimate for a seed AI transcendence
> > is 2020; nanowar, before 2015. The most optimistic estimate
> > for project Elisson would be 2006; the earliest nanowar, 2003.
>
> To which den Otter replies
>
> > Conclusion: we need a (space) vehicle that can
> > move us out of harm's way when the trouble starts.
> > Of course it must also be able to sustain you for
> > at least 10 years or so. A basic colonization of Mars
> > immediately comes to mind.
>
> You have no idea how bizarre this discussion appears to an outsider. You
> guys are as far out of touch with reality as the Scientologists. Maybe
> more.
"You guys" I assume, includes me. Your statement about den Otter I agree with. A Mars colony is effectively impossible and ludicrously expensive compared to a L5; even L5 probably isn't practical in the time we have. Fantasyland.
About mine...
> This kind of thinking weakens you. This is not the way to see reality
> clearly. On a battlefield, in business, or anywhere, the one who sees
> clearly wins. Our way of thinking (calibration) is exemplified by the
> geniebusters site. It strengthens us. It does lead to clear perceptions.
The way to see reality clearly is to accept all the possibilities. Is there a 1% chance of Earth being destroyed? Obviously. So why not a 10% chance, or a 70% chance, or a 95% chance? Once you admit any chance at all, there's no principle that caps it at a comfortable number. There's no mysterious protective field that will prevent us from being killed by our own dumb mistakes, like thinking there's a protective field. We could learn to live with military nanotech, bring technocapitalism to the masses, get right to the verge of creating a seed AI, and get wiped out by a comet. Life ain't fair. Live with it.
I don't believe in genies, either, BTW. That kind of AI is powerful enough to shatter our reality, not just make free glow-in-the-dark frisbees.
--
sentience@pobox.com          Eliezer S. Yudkowsky
http://pobox.com/~sentience/AI_design.temp.html
http://pobox.com/~sentience/singul_arity.html
Disclaimer: Unless otherwise specified, I'm not telling you everything I think I know.