From: Mark Walker (tap@cgocable.net)
Date: Sat Aug 11 2001 - 11:57:04 MDT
----- Original Message -----
From: Brian Phillips <deepbluehalo@earthlink.net>
> Leaving aside the question of AI, what's the list impression
> of what a superhuman mind might be like (yes I know this
> is asking the gorillas about a super-silverback :).
> I tend to think that superhuman intelligence might be best
> classified as levels of intelligence quantifiably different from our own.
> An individual with a 4 sigma IQ and a PerfectRecall chip in their
> cortex is smarter. But their baseline intel is the same, just more
> efficiently harnessed. What is everyone's favorite for a cohesive
> theory of general intelligence? I dislike Multiple Intelligences/Gardner
> because it lacks any evidence. Triarchic Intelligence/Sternberg is
> proposed in a much more scientific fashion (and its author is actually
> attempting to prove/disprove his hypothesis, unlike Gardner).
> In a nutshell, I can see giving everyone in the human race a 140 IQ,
> perfect recall, and a turbocharged cognition system. But what ELSE
> can we try to do?
Brian: You have a couple of questions here: 1. What is our best
understanding of intelligence? 2. How can we improve intelligence? Some
think that we need to answer 1 in order to make progress on 2. This may not
be the case. Sometimes it is possible to make technological progress in the
absence of theory. For example, the discovery of gunpowder did not require
an understanding of the chemistry underlying the phenomenon.
Sometimes theory does seem required: it is hard to imagine (possible but not
probable) that a civilization could build its first atomic bomb in the
absence of some atomic theory. My guess is that the technological ability to
create greater intelligence will precede a detailed theory of intelligence.
Allow me to quote myself here:
"The ease with which we might create a larger brain through genetic
engineering is underscored by the fairly recent discovery in developmental
genetics of homeobox genes: genes that control the development of the body
plans of a large number of organisms. For our purposes what is of interest
is that there are a number of homeobox genes that control the growth of
various brain regions.[9] For example, if you want to make a larger brain in
a frog embryo, simply insert some RNA from the gene X-Otx2 and voilà, you
have a frog embryo with a larger brain; specifically, the mid- and forebrain
mass is increased.[10] Homeobox genes also come in various degrees of
generality. Otx2 is obviously very general in its scope; in contrast, Emx1,
for example, controls the growth of the isocortex (one of the two regions of
the neocortex). Thus, if we believe that intelligence ... might be aided by
tweaking one area of the brain or another, there may be just the right
homeobox gene for the task."
I am inclined to think that we could create a greater intelligence from a
human zygote by the end of the decade if the project were taken on with the
same political enthusiasm as, say, the moon landing. On the other hand, it
is quite possible, as Jerry Fodor et al. have argued, that a general
understanding of our own cognition (and hence intelligence) may forever
elude us. There is a sense in which we may be able to create more than we
can understand.
Mark.
This archive was generated by hypermail 2.1.5 : Sat Nov 02 2002 - 08:09:44 MST