Re: Nightline comments on AI

From: Brian Atkins (brian@posthuman.com)
Date: Tue Aug 20 2002 - 20:53:34 MDT


Anders Sandberg wrote:
>
> On Tue, Aug 20, 2002 at 04:21:08AM -0400, Michael Wiik wrote:
> >
> > If true, then we have no superhuman intelligence upcoming to guide us
> > thru nano and biotech futures, and humanity is toast.
>
> If we think we need superhuman intelligence to survive those
> technologies, why do we think we can survive developing superhuman
> intelligence without any guide?
>
> Sorry, but this wish for some Big Daddy to hold in the hand really
> irritates me, regardless of whether the Daddy is an AI or a god. It is up
> to *US* to solve the problems, using our limited and fallible minds to
> come up with solutions. Some of these solutions might be minds of their
> own, of course.
>

I don't get it: is building a transhuman mind to speed our progress and
safety a good thing or a bad thing? I can't tell. Isn't getting some
transhuman minds onto this ball of dirt one of the main goals? Why does
it matter whether they have direct human ancestry or not? Are you going
to be personally disappointed if you aren't first? Why bang your head
against the wall unnecessarily, and potentially greatly delay the
development of transhuman tech, by promoting your meme of "we have to do
it ourselves!"?

Also, on the one hand you seem to be claiming we can deal confidently
with most existential risks, such as nanotech and biowarfare, yet
apparently you have no confidence when it comes to this one called "SI".
Even if it appears later in the historical timeline (which is
debatable), we will eventually have to face it.

-- 
Brian Atkins
Singularity Institute for Artificial Intelligence
http://www.singinst.org/


This archive was generated by hypermail 2.1.5 : Sat Nov 02 2002 - 09:16:18 MST