From: Eliezer Yudkowsky (sentience@pobox.com)
Date: Tue Jan 14 1997 - 20:13:26 MST
> I don't think that it is the fact that Eli is young so much as the
> fact that he makes too many generalizations and assumptions.
I'll drink to that, or I would if alcohol weren't Evil.
The way I figure it, overgeneralizing is simply one of the inevitable
side effects of being in a major hurry. Which I am. I figure that it's
better to be wrong than to keep quiet... at least if your goal isn't
impressing anyone, but is instead to solve all the secrets of the
Universe within five years. If I'm wrong, I can be corrected, and then
I know something. If I'm right, I'm right. Silence is ignorance. To
err is to learn.
This is somewhat in conflict with the conventional view that silence is
golden. Fortunately I'm sure that the silence types won't object, since
they realize that "silence is golden" is a generalization and that they
don't really know whether silence is golden for all conceivable people.
I optimize my conversations, and my thoughts, for speed. It's not so
important to be right as to get X amount of thinking done about the
subject so it can coalesce into an understanding. Wrong and right don't
matter for that threshold; it's the quality and quantity and complexity
that count. Then, once the understanding coalesces, you can remove all
the wrong stuff. The important thing is to have a large amount of
thought to get that crucial click-click-click.
And then there's the simple stylistic issue; I'm reminded of the time that
I rephrased:
"Smartness is the ability to solve some problem"
as:
> Smartness is an abstraction,
> existing solely in the human mind, from the observation - also existing
> in the human mind - that System X - also in the mind - can solve problem
> Y - also in the mind. To give this term, "smartness", a useful
> definition, we say that if System X can solve more problems than System
> Y, or solve them more elegantly - where elegance is in the mind - then
> System X is smarter than System Y. The terms used may not precisely
> reflect the truth, not having definitions down to the level of such
> definitively real items as quarks or whatever the fundamental particles
> may be, but if we attempted so foolish an endeavor as to make all mental
> assertions correspond precisely to reality, we wouldn't be able to think
> or walk across the room. The most we can hope for is terms which are
> useful, experimentally testable, and precise. I believe that
> "smartness", or - so I don't get another lecture on there being multiple
> types of smartness - that type of smartness which has to do with the
> rotation of mental pictures - is experimentally testable, precise, and
> useful. Other types of smartness are more vaguely defined due to our
> primitive grasp of cognitive science, but are still useful.
Qualification and evasion of responsibility, for me, have always been
very much something done *after* the theory is in place. I
build it up, you smash it down, you build it up, I smash it down, but I
*never* deliberately *slow* it down, and I try not to do anything which
would slow me down. I've always viewed thinking from an engineering
standpoint; a sweeping generalization is okay, as long as it works. The
object is not to spew out thirty pages of blither to which nobody could
possibly object, but to be able to solve problems.
If I say: "Schools don't work because of a short-circuit between
learning and verification," it may not be true of every school or every
student. But it does tell you how to fix the problem; you set up
Institutes of Verification. And that's all I wanted from the
generalization.
Time will tell whether this method works better, at least for me, than
slow and careful critical thought. I figure that I'm good at making
sweeping and possibly incorrect statements on subjects that nobody else
has even been able to touch. I lay the foundations and get out. This
is the Meaning of Life, this is how to redo schools, this is how to
amplify intelligence, do you have any other questions?
--
sentience@pobox.com    Eliezer S. Yudkowsky
http://tezcat.com/~eliezer/singularity.html
http://tezcat.com/~eliezer/algernon.html
Disclaimer: Unless otherwise specified, I'm not telling you everything I think I know.