From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Tue Jan 08 2002 - 15:33:43 MST
Gordon Worley wrote:
>
> I know that you meant this rhetorically, but of course not. If I were
> able to upload right now, I'd rather work on making myself better than
> spending a lot of time figuring out how my mind works and then build a
> new one. I'd figure out how my mind works and then just become a
> Power. Now, I'm a pretty nice guy and won't do anything too unFriendly
> (or at least that's what I'm telling everyone ;->), but I think someone
> else might. Thus, we almost have to create Friendly AI (or something
> like it, but better) first unless we have a death wish for the universe.

Well, you, as an augmented upload, could go on to create a Friendly AI, an
intelligent substrate scenario, or whatever. I don't really think of
Friendly AI as being in opposition to uploading. Uploading has a
different set of implicit dangers than FAI, but the dangers aren't
necessarily worse, and if an upload can make it past those dangers (both
moral and cognitive) then there's no reason why an upload-originating SI
couldn't fill the role of Transition Guide. For me the strategic scenario
amounts to [Friendly AI || uploading] versus [a repeat of 9/11 with
military nanotech || a repeat of the anthrax scare with weapons-grade AIDS
that can be transmitted by mosquitoes]. If you phoned me right now and
said there was an uploading device in front of you but you had to decide
whether to get in within the next five seconds, I'd tell you to climb in,
and hope you got the rest right. It's just that uploading is a lot
*harder*, technologically, than AI; in terms of both computing capacity
and knowledge of cognitive science it is twenty years post-AI.
-- -- -- -- --
Eliezer S. Yudkowsky http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence