From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Tue Jan 12 1999 - 12:27:41 MST
Billy Brown wrote:
>
> Agreed. You can only build a chain of ungrounded symbols so long before the
> system starts spitting out garbage. IMO there are at least two limitations
> at work here - you can only handle so many layers of indirection, and you
> have very little working memory with which to perform mental operations.
> Since human memory and logic are fuzzy, overflowing either boundary simply
> produces nonsense output - and since people don't normally have error
> checking at this level, they usually don't realize they aren't making sense.
>
> I would presume you still have the limited-working-memory problem when
> working with complex idea systems. If not, please explain - I've been
> trying to generalize around that one for years now.
I have an Algernic disability with respect to symbols; this includes
both chunking and what you call indirection. I can't work with
abstractions from abstractions at all. I can't manipulate systems of
systems; only the grounding. My abilities are all oriented towards
digging down, reducing, asking why; not building symbols about symbols.
I will never understand most of "Gödel, Escher, Bach" this side of the dawn.
I can't get lost in a maze of words; I don't have the infrastructure.
As for limited working memory, most of my thinking is about the relation
between two concrete objects, which - counting the relation, the
objects, and my thinking - takes only four places in short-term memory*.
> Unless they use rationality to divide people from animals. Then people who
> fit their definition of rational, within the limits of available processing
> power, might survive a general extermination. Not a likely scenario, but
> not impossibly unlikely, either.
I don't think that my level of rationality is so far away from an
idiot's, relative to the total width of the spectrum, that a random
division would lie anywhere in the middle of the human race. The only
major targetable landmark is realizing that your current actions can
depend on your projection of the Singularity's behavior and that a
possible causal link exists.
--
(*) = Not a very good way of looking at it. The objects and their
relation occupy a single mental simulation; at any given time I am
probably focusing on a particular set of rules and trying to deduce
facts, or vice versa. Anyway, I don't run out of RAM.
--
sentience@pobox.com          Eliezer S. Yudkowsky
http://pobox.com/~sentience/AI_design.temp.html
http://pobox.com/~sentience/sing_analysis.html
Disclaimer: Unless otherwise specified, I'm not telling you everything
I think I know.