From: Brian Atkins (brian@posthuman.com)
Date: Mon Jun 17 2002 - 22:23:28 MDT
CurtAdams@aol.com wrote:
>
> In a message dated 6/17/02 17:08:15, brian@posthuman.com writes:
>
> >However, I think what you're saying there doesn't make complete sense. It's
> >the part about the "super-smart" AI that bugs me. If an AI has grown into
> >superintelligence then quite likely it is capable of constructing enough
> >computronium to let it fully /emulate/ the whole planet if necessary to test
> >out its new tech ideas much more quickly than realtime.
>
> Trivially, if the AI or its computronium have any impact on the world,
> it can't fully emulate the planet due to Godelization issues. In any case,
Can you explain these "Godelization issues" in more detail? I'm not seeing it.
> the best computer on the planet can't yet even fold a large protein by
> emulation. Emulation of just one human being to that level (which actually
> isn't good enough for emulating catalysis) is thus over 50 years away.
> By Kurzweil's estimates (hardly gospel, of course) it would be about as smart
> as every human being on the planet combined - clearly a superintelligence -
> and yet still Vastly short of being able to emulate the biosphere.
50 years away without an SI driving it, you mean. And that assumes we actually
have to go down to that low a level to get usable results. Kurzweil has also
mentioned that it turns out to take roughly a tenth of the computing power
previously estimated to replicate chunks of human brain functionality, such as
the auditory system, in software.
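Just to show where figures like "over 50 years" come from, here is a minimal
back-of-the-envelope sketch in Python. The ops/sec numbers and the doubling
time are placeholder assumptions of my own for illustration, not Curt's or
Kurzweil's actual figures:

    import math

    # All of these figures are assumptions for illustration only.
    current_ops    = 1e14   # rough ops/sec of a top machine circa 2002
    needed_ops     = 1e25   # guessed ops/sec for molecular-level emulation of one human
    doubling_years = 1.5    # assumed Moore's-law doubling time

    doublings = math.log2(needed_ops / current_ops)
    years = doublings * doubling_years
    print(f"~{doublings:.0f} doublings, roughly {years:.0f} years")

With those assumed inputs it works out to about 37 doublings, i.e. roughly 55
years, so the whole "N years away" argument is only as good as the exponents
you plug in.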
I think the point is that an SI can come up with much quicker and more thorough
ways to test out its ideas than we can hope to. Perhaps it can find a
mathematically bulletproof way to verify its code, perhaps it can cook up
a big quantum computer and use that to emulate large chunks of reality, or
perhaps it converts a chunk of the Moon into large amounts of computronium.
I just don't see any convincing reason to be at all sure that such an SI,
combined with drextech or better, would progress technologically at the kind
of snail's-pace rates humans work at. Even the rather ho-hum ideas we can
already cook up, like computronium, which are known to be doable without any
kind of magic physics, should be more than enough to accomplish what I'm
talking about.
--
Brian Atkins
Singularity Institute for Artificial Intelligence
http://www.singinst.org/