From: Eliezer Yudkowsky (sentience@pobox.com)
Date: Sat Feb 01 1997 - 18:04:47 MST
[Saith John K. Clark:]
> OK, if you want to play that game, please give me a non circular definition
> of the word "definition".
No problem whatsoever from *my* philosophical standpoint.
Definition (n): An explanation intended to convey all information
necessary for the formation of a symbol.
Our minds contain cognitive objects called 'symbols'. Nobody knows
exactly where or how they are stored, though we think the hippocampus
creates them and the cerebellum retrieves them. Symbols generally have
words attached to them, be they auditory or ASL or whatever.
Symbols are widely thought to be handles to semantic structures, which
are composed of other semantic primitives (including further symbols)
and connected by various semantic links whose nature is not well
understood.
The network does ground out, because semantic primitives other than
symbols exist. As a general rule, a non-symbolic primitive is either a
stored experience or a transformation performed on the current working
memory or visualization, often an attempt to alter the visualization so
that it becomes analogous to abstracted extracts of previous experience.
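
To make that picture concrete, here is a toy sketch in Python. The
names (Experience, Transform, Symbol) and the structure are my own
illustration of the idea, not a claim about how the brain actually
represents anything:

class Experience:
    """Non-symbolic primitive: a stored chunk of raw experience."""
    def __init__(self, data):
        self.data = data

class Transform:
    """Non-symbolic primitive: an operation on the current working
    memory or visualization."""
    def __init__(self, apply_to):
        self.apply_to = apply_to  # function: visualization -> visualization

class Symbol:
    """A handle to a semantic structure built out of other primitives
    (further symbols, experiences, transforms), joined by semantic links."""
    def __init__(self, word, constituents=()):
        self.word = word
        self.constituents = list(constituents)

# "red" grounds out directly in experience; "apple" is defined partly in
# terms of other symbols and partly in terms of experience, so the chain
# of links still bottoms out in something non-symbolic.
red = Symbol("red", [Experience("...memories of red things...")])
round_ = Symbol("round", [Transform(lambda vis: vis)])  # stand-in transform
apple = Symbol("apple", [red, round_, Experience("...memories of apples...")])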
Classical AIs have no visualizational facilities and their symbols are
all defined in terms of other symbols, which is why classical AI is such
a horrible model of the mind.
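
Here is the same toy treatment of the circularity problem: a "classical"
lexicon in which every symbol is defined only in terms of other symbols
never bottoms out, while adding a single non-symbolic entry lets the
whole chain ground. Again, this is just an illustrative sketch of mine,
not anyone's actual AI:

# Two toy lexicons: in the "classical" one, every symbol is defined only
# by pointing at other symbols; in the grounded one, the chain eventually
# reaches something that is not a symbol at all.
circular = {
    "definition": ["explanation"],
    "explanation": ["statement"],
    "statement": ["definition"],
}

grounded_lexicon = dict(circular)
grounded_lexicon["statement"] = ["<EXPERIENCE: hearing someone speak>"]

def bottoms_out(lexicon, word, seen=None):
    """True if some chain of definitions reaches a non-symbolic primitive."""
    seen = set() if seen is None else seen
    if word not in lexicon:
        return True   # not a symbol in this lexicon: treat as non-symbolic
    if word in seen:
        return False  # we have gone around in a circle of symbols
    seen.add(word)
    return any(bottoms_out(lexicon, w, seen) for w in lexicon[word])

print(bottoms_out(circular, "definition"))          # False: pure circularity
print(bottoms_out(grounded_lexicon, "definition"))  # True: reaches experience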
Definition (n): An explanation intended to convey all information
necessary for the formation of a symbol.
--       sentience@pobox.com      Eliezer S. Yudkowsky
         http://tezcat.com/~eliezer/singularity.html
         http://tezcat.com/~eliezer/algernon.html
Disclaimer: Unless otherwise specified, I'm not telling you everything
I think I know.