Lee Corbin wrote:
>
> "Functionalism" is the name usually associated with the
> doctrine that if it quacks like a duck, walks like a
> duck, and acts like a duck in every way, then it's a
> duck. Functionalists believe that anything that acts
> like a human, etc., really does have human awareness,
> intelligence, and feelings.
Well, not exactly. My reading suggests that "functionalism" names
the (fading) cognitive-science assumption that the brain, like an
electronic computer, instantiates algorithms that are independent
of their hardware substrate.
In George Lakoff's words
( http://www.ex.ac.uk/~PErnest/pome10/art21.htm )
"Functionalism, first formulated by philosopher
Hilary Putnam and since repudiated by him, is the theory that
all aspects of mind can be characterized adequately without
looking at the brain, as if the mind worked via the manipulation
of abstract formal symbols as in a computer program designed
independent of any particular hardware, but which happened to be
capable of running on the brain's wetware. This computer program
mind is not shaped by the details of the brain."
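To make that substrate-independence claim concrete, here is a tiny
sketch of my own (not Lakoff's or Putnam's, and only an illustration):
a purely formal, symbol-manipulating Python function whose definition
says nothing whatsoever about the device that runs it.

# Illustrative sketch only: a formal symbol-manipulating procedure.
# Nothing in its definition mentions the hardware that executes it.
def successor_notation(n):
    """Write a non-negative integer as repeated applications of S to 0."""
    expr = "0"
    for _ in range(n):
        expr = "S(" + expr + ")"
    return expr

# On the functionalist view, mental states are supposed to be like this:
# individuated by formal role, so any substrate that realizes the same
# input/output structure -- silicon, wetware, whatever -- counts as
# "running" it.
print(successor_notation(3))   # S(S(S(0)))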
A lot of neuroscientists are dubious about functionalism,
so defined, without having rejected down-to-earth
scientific materialism. For example, in the discussion about
Gerald Edelman I posted here a year ago
( http://www.lucifer.com/exi-lists/extropians.2Q00/5580.html ),
I wrote:
"This contrast between the role of stochastic variation in the
brain and the absence of such a role in electronic devices such
as computers is one of the distinctions between what Edelman
calls "instructionism" in his own terminology (RP p. 30), but has
also been called "functionalism" or "machine functionalism" (RP
p. 30; BABF p. 220); and "selectionism" (UoC p. 16; RP
pp. 30-33). Up to the present, all human artifacts and machines
(including computers and computer programs) have been based on
functionalist or instructionist design principles. In these
devices, the parts and their interactions are precisely specified
by a designer, and precisely matched to expected inputs and
outputs. This is a construction approach based on cost
consciousness, parsimonious allocation of materials, and limited
levels of manageable complexity in design and manufacture. The
workings of such artifacts are "held to be describable in a
fashion similar to that used for algorithms".
By analogy to the hardware-independence of computer programs,
functionalist models of neural "algorithms" underlying cognition
and behavior have attempted to separate these functions from
their physical instantiation in the brain: "In the functionalist
view, what is ultimately important for understanding psychology
are the algorithms, not the hardware on which they are
executed... Furthermore, the tissue organization and composition
of the brain shouldn't concern us as long as the algorithm 'runs'
or comes to a successful halt." (BABF p. 220). In Edelman's
view, the capabilities of the human brain are much more
intimately dependent on its morphology than the functionalist
view admits, and any attempt to minimize the contribution of the
brain's biological substrate by assuming functional equivalence
with the sort of impoverished and rigid substrates characteristic
of modern-day computers is bound to be misleading."
And ( http://www.lucifer.com/exi-lists/extropians.2Q00/5578.html )
"In a biological system, much of the physical complexity needed to
support primary consciousness is inherent in the morphology of
biological cells, tissues, and organs, and it isn't clear that
this morphology can be easily dismissed: "[Are] artifacts
designed to have primary consciousness... **necessarily**
confined to carbon chemistry and, more specifically, to
biochemistry (the organic chemical or chauvinist position)[?]
The provisional answer is that, while we cannot completely
dismiss a particular material basis for consciousness in the
liberal fashion of functionalism, it is probable that there will
be severe (but not unique) constraints on the design of any
artifact that is supposed to acquire conscious behavior. Such
constraints are likely to exist because there is every indication
that an intricate, stochastically variant anatomy and synaptic
chemistry underlie brain function and because consciousness is
definitely a process based on an immensely intricate and unusual
morphology" (RP pp. 32-33)."
> ...I am also a functionalist, and it seems that practically
> all extropians are too.
Not me. (But am I an Extropian, I wonder?) ;->
Jim F.