Re: Living under water

From: Robert J. Bradbury (bradbury@aeiveos.com)
Date: Wed Jun 27 2001 - 11:46:30 MDT


On Wed, 27 Jun 2001, Greg Burch wrote:

> In other words,
> all the dreams of a post-human future must pass through the eye of the
> needle of this crucial period of super-acceleration of change and progress
> and that the nature of that post-human future, if any, will be critically
> dependent on HOW we deal with it.

Agreed...
 
> I do not believe that we have the slightest basis for positing the
> possibility of consciousness outside of a social context. I am personally
> deeply suspicious of the notion of some kind of abstract "general
> intelligence" divorced from context and, most especially, divorced from
> social context.

I suppose it depends what one means by "consciousness" and "social".
We are back to suitcase terms again. I think it is a mistake to
assume that, just because a few primates were the only ones to
pass the mirror test (a claim recently disproven when dolphins
passed it as well), consciousness must be a "black" or "white" thing.
If Calvin's model of Darwinian selection of thoughts based on
successful experiences is correct, then consciousness could very well
fall along a linear scale, with the mirror test simply marking a level
near one end of it. It may be nothing more than an emergent property
of sufficiently complex systems, appearing once our systems reach
that complexity. If you take Minsky's "Society of Mind" concept,
then there may well be social scales that range from within the brain
up to the external societies studied by sociologists. There could
continue to be hierarchical societies within SIs. Consciousness may
end up being nothing more than the evaluation algorithm for the
direction in which a collection of thought patterns heads.
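
As a rough illustration of what I mean (a toy sketch, not Calvin's
actual model -- every name and number below is made up), a Darwinian
loop over candidate "thoughts", with an evaluation function playing
the role of consciousness, might look like:

import random

# Toy Darwinian loop over candidate "thoughts": an evaluation function
# (the stand-in for "consciousness" here) steers which direction the
# population of patterns heads in.  All names/numbers are illustrative.
def evaluate(thought, goal):
    # score a candidate by how close it lands to the goal
    return -abs(thought - goal)

def select_thoughts(goal=42.0, population=20, generations=50):
    thoughts = [random.uniform(0, 100) for _ in range(population)]
    for _ in range(generations):
        # keep the better-scoring half, then produce varied copies of them
        thoughts.sort(key=lambda t: evaluate(t, goal), reverse=True)
        survivors = thoughts[:population // 2]
        thoughts = survivors + [t + random.gauss(0, 1.0) for t in survivors]
    return thoughts[0]

print(select_thoughts())   # settles near the goal after enough generations

The only point of the sketch is that the "conscious" part is just the
scoring step that steers where the population of patterns heads.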

But I agree we still have a lot to learn about this.

> Sure, we have examples of domain-specific information processing, but the
> quest for an abstract general intelligence divorced from context is
> precisely the kind of Platonic idealism that has hobbled the AI community
> since its founding.

If my reading of Minsky's forthcoming book and some of Kurzweil's
comments is correct, then there isn't any such thing as "general
intelligence" (and I think I said as much at Extro5). It may be
nothing more than a collection of domain-specific strategies for
problem solving that we happen to recognize can be applied in
contexts other than those for which they were originally developed.
Many domain-specific strategies plus the ability to see patterns
in the similarities between situations: I think that gets you a
lot of the puzzle.
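
A toy sketch of that idea (nothing here is from Minsky or Kurzweil --
the strategies and the recognizer are placeholders) could be as simple
as a table of domain strategies plus a recognizer that maps a new
situation onto one of them:

# Toy "bag of domain strategies plus pattern matching" -- the strategies
# and the recognizer below are placeholders, not anything from Minsky
# or Kurzweil.
def sort_strategy(items):
    return sorted(items)

def lookup_strategy(items, target):
    return target in items

STRATEGIES = {
    "needs ordering": lambda s: sort_strategy(s["items"]),
    "needs lookup":   lambda s: lookup_strategy(s["items"], s["target"]),
}

def recognize(situation):
    # the "seeing patterns in similar situations" step: map a situation
    # onto a domain we already have a strategy for
    return "needs lookup" if "target" in situation else "needs ordering"

def solve(situation):
    # apply a domain-specific strategy outside its original context
    return STRATEGIES[recognize(situation)](situation)

print(solve({"items": [3, 1, 2]}))               # -> [1, 2, 3]
print(solve({"items": [3, 1, 2], "target": 2}))  # -> True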

> I predict that the greatest single
> insight we will gain in this quest is that consciousness is in fact an
> inherently social phenomenon and that, even alone, the workings of a
> conscious mind is the activity of a social machine turned in upon itself.

No argument -- but that doesn't make the case that an external social
environment is required. I've got a two-processor system; I can have
a society of 2 within the same machine. It's a reflection of the
society in my mind, where there are usually two voices speaking back
and forth.
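
For what it's worth, here is a minimal sketch of that "society of 2
in one machine" (the names are made up; it just shows two workers
trading messages within a single process):

import queue
import threading

# A "society of 2" inside one machine: two workers trading messages over
# queues, like the two voices going back and forth.  Names are made up.
def voice(name, inbox, outbox, rounds=3):
    for i in range(rounds):
        heard = inbox.get()
        print(f"{name} hears: {heard}")
        outbox.put(f"{name} reply #{i}")

a_in, b_in = queue.Queue(), queue.Queue()   # each voice's inbox
va = threading.Thread(target=voice, args=("Voice A", a_in, b_in))
vb = threading.Thread(target=voice, args=("Voice B", b_in, a_in))
va.start(); vb.start()
a_in.put("opening remark")                  # kick off the exchange
va.join(); vb.join()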

> As Robin Hanson has so cogently pointed out in his essay "Dreams of
> Autarchy", there really is no basis for believing that we can live in
> isolation.

I'd agree that our psychological need set is probably wired against it
now, but it doesn't have to remain that way in the future.

> [snip] for we are each of us societies within, a reflection of the
> social context that gives birth to the mental life we experience.

No argument.

> [snip] no doubt expressed in new ways; but still the basic challenges of
> moral decision and cultural exchange will persist.

I'm not so sure I'd agree completely with the "moral" aspect.
There could be an interesting discussion as to whether "optimality"
replaces "morality". Yes, there will still be information exchange
among the self-limited or SysOp-constrained identities composing an
SI, but the opportunities for any significant cross-pollination between
the SIs themselves will be few and far between unless they all decide
to show up at the Far Side Party.

I think my major point was that thinking about future systems,
social hierarchies, moralities, etc., and trying to discuss them
in terms of current frameworks and models is very limiting.

Robert
