From: Ramez Naam (Exchange) (ramezn@EXCHANGE.MICROSOFT.com)
Date: Tue Nov 11 1997 - 18:14:48 MST
> From: Hal Finney [SMTP:hal@rain.org]
>
> It's not clear to me that minds and/or neurons can be expressed as
> formal systems. Equivalently, there can be no computer program which
> _exactly_ (exactly, exactly) simulates a brain. This is because we do
> not have a full understanding of the laws of physics. It is possible,
> as Penrose argues, that actually the laws of physics are nonlocal
> and/or have non-algorithmic properties. Therefore his argument does
> not apply even at the lowest level of our own brains.
>
> However, dealing with an upload or an AI is a different matter. Now we
> have a full understanding of the substrate which is executing the
> program. It is a completely deterministic, mechanical and logical
> machine, and is even relatively simple at the lowest level - as little
> as a dozen or so opcodes is probably enough to be a universal machine
> able to run an upload, by our current understanding.
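A quick aside on that last point before I get to the substance: the
opcode count can be pushed all the way down to one. A "subleq" machine
(subtract and branch if the result is non-positive) is Turing complete
with a single instruction. Here is a rough sketch in Python; the memory
layout and toy copy-program are just my own illustration, not anything
from Hal's post:

    def run_subleq(mem, pc=0, max_steps=10000):
        # The one instruction: mem[b] -= mem[a]; jump to c if result <= 0.
        # A negative operand is treated as "halt" (a common convention).
        steps = 0
        while 0 <= pc < len(mem) - 2 and steps < max_steps:
            a, b, c = mem[pc], mem[pc + 1], mem[pc + 2]
            if a < 0 or b < 0:
                break
            mem[b] -= mem[a]
            pc = c if mem[b] <= 0 else pc + 3
            steps += 1
        return mem

    # Toy program: copy the value in cell 15 into cell 16, then halt.
    #   cells 0-11:  four instructions (zero B, zero scratch, scratch -= A, B -= scratch)
    #   cells 12-14: halt marker (negative operands)
    #   cell 15: A = 7,  cell 16: B,  cell 17: scratch
    prog = [16, 16, 3,  17, 17, 6,  15, 17, 9,  17, 16, 12,
            -1, -1, -1,  7, 0, 0]
    print(run_subleq(prog)[16])   # prints 7

So the "relatively simple at the lowest level" part of the quoted claim
is, if anything, understated.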
Whoa. If we use "we do not have a full understanding of the laws of
physics" as a rationale, then the substrate of an AI is equally suspect
of being a non-formal system.

Penrose's argument strikes me as exceedingly disingenuous. It is the
interconnections between neurons (rather than the neurons themselves)
that give rise to the emergent information-processing power of the
brain. Those interconnections operate via molecular-level signaling,
not QM.
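For concreteness, that kind of signaling usually gets modeled with
perfectly classical dynamics. A leaky integrate-and-fire neuron, for
example, is nothing but arithmetic on a membrane voltage; the constants
below are round illustrative numbers I picked, not measured values:

    def simulate_lif(input_current, dt=0.1, tau=10.0, v_rest=-65.0,
                     v_reset=-70.0, v_threshold=-50.0, resistance=1.0):
        # Forward-Euler integration of dV/dt = (-(V - V_rest) + R*I) / tau.
        # When V crosses threshold, record a spike and reset the voltage.
        v = v_rest
        trace, spikes = [], []
        for step, i_in in enumerate(input_current):
            v += (-(v - v_rest) + resistance * i_in) * (dt / tau)
            if v >= v_threshold:
                spikes.append(step * dt)
                v = v_reset
            trace.append(v)
        return trace, spikes

    # 100 ms of simulated time under a constant drive:
    trace, spikes = simulate_lif([20.0] * 1000)
    print(len(spikes), "spikes")

The point is not that this toy model is right in detail, just that
nothing in that picture of signaling calls for anything beyond
ordinary, computable arithmetic.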
At some level quantum uncertainty affects everything, including a
"deterministic" AI substrate. But for all practical purposes, a
computer is a formal system, even if quantum fluctuations will cause it
to flip a memory bit every few thousand years. Does Penrose present any
compelling reason to think of the brain differently?
mez