From: Eliezer Yudkowsky (sentience@pobox.com)
Date: Fri Feb 07 1997 - 00:34:48 MST
> 1) I am the one doing the choosing if I have free will.
> 2) I have free will if I cannot predict what I will do next.
> 3) I am doing the choosing.
This man is a Turing computationalist?
Lemme 'splain something, friend. The thing about Turing machines is
that all forces within the computation originate from clearly defined
causal units. What I'm trying to say is that whatever choice you make,
it *will* have a causal explanation within the computational framework.
Remember - "The choices are written in our brains, but we are the
writing?" Ya can't lift yourself up to some pure ethical plane and say
that your "choice" is made of your own "free will" while a thermostat
does not choose to keep the temperature at a certain setting. After
all, you know a lot more about yourself than a thermostat does, so your
will should be much less free.
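
To make the thermostat side of that concrete, here's a minimal,
purely illustrative sketch (my own hypothetical code, nothing from the
original exchange): the device's entire "decision" reduces to one
comparison, and every output has a complete causal explanation in the
inputs.

    # Illustrative sketch: a thermostat's "choice" is fully determined by
    # its setpoint and the current reading. Every output has an explicit
    # causal explanation -- there is nothing left over to call "free".
    def thermostat_decision(setpoint: float, reading: float) -> str:
        """Return the heater command implied by two numbers and a comparison."""
        if reading < setpoint:
            return "heat on"
        return "heat off"

    # Same inputs, same "choice", every time; trace the comparison and
    # you have the whole causal story.
    print(thermostat_decision(74.0, 71.5))  # -> heat on
    print(thermostat_decision(74.0, 76.0))  # -> heat off
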
The point I'm trying to make is that you'll assign meaning to something,
for causal reasons explicable entirely by Turing-based forces. Either
that, or start chanting, to use your phrase.
Now. The question is - will you assign meaning based on "what you
want", which is a fancy way of saying you'll let evolution do it for
you? Or will you assign meaning based on a genuinely true chain of
logic? In short, will you reason it out or accept the dictates of your
genes?
> You don't want the meaning of life, you want a way to always know what is the
> best way to maximize value, well I don't blame you, I want that too, it's
> called infinite intelligence.
Ah. I don't quite understand. Either things have value, or everything
is valueless. I'm not saying that there's only one "Meaning" which is
the best way to maximize value. I'm asking if anything has any value at
all.
And saying "Yes, it has value, to me" is simply ducking the point. You
say it has value? Well, I can take apart your brain right down to the
level of individual atoms and demonstrate that you assigned it value
simply because it tickled your pleasure centers. And I am not
impressed. If I build a computer programmed to spew out: "Cats have
value to me", fundamentally, nothing happens. So what?
I simply deny that value is an observer-dependent thing. You can have,
in memory, a declarative statement that "X has value". So what? A
CD-ROM can do the same thing with a lot less fuss. Certainly this
statement will have no causal influence on X - and what's more, it won't
have any logical force. And if I encrypt it with RSA, does X suddenly
lose its value? If I lose the encryption key, does X lose its value?
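
A toy sketch of the same point, with base64 standing in for RSA (any
cipher would do, which is exactly the point): the stored declaration,
encoded or not, has no causal influence on the thing it is about. This
is purely my own illustration, with hypothetical names.

    # Illustrative sketch: a stored declaration of value, encrypted or not,
    # exerts no causal influence on the thing it is about.
    # (base64 stands in for RSA here; the choice of cipher changes nothing.)
    import base64

    cats = ["Felix", "Tabby"]                 # the X in question
    declaration = "Cats have value to me"

    encoded = base64.b64encode(declaration.encode())  # "encrypt" the statement

    # Keep the encoded bytes, lose them, throw away the key -- the cats,
    # and whatever is or isn't valuable about them, are untouched either way.
    print(cats)     # ['Felix', 'Tabby'], exactly as before
    print(encoded)  # an opaque byte string with no logical force
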
My position is that we can't alter the Platonic structure of the
Universe through wishful thinking. We can't make things have value just
by declaring they do. Our declaration that "X has value" will cause us
to act in such a way as to maximize X. So what? What good is it,
unless X really *is* valuable?
Are you going to tell me that "value" is defined with respect to a
particular individual, and equals whatever that person acts to
maximize? That 74 degrees is the Meaning of Life for a thermostat?
Well, I am jumping out of the system. I'm saying that to *me*, acting
to maximize X *has* *no* *value*.
You see, you are arguing for a Meaning of Life, just a silly one.
You're saying that the Meaning of Life is a certain sort of behavior,
acting to maximize things. I don't see how this is any more reasonable
than saying that the Meaning of Life is "cats" or "worshipping God" or
"tuna loaf". So you're born, you live, you act to maximize something,
and you die. I see no place for intelligence, consciousness, brains,
neurons, thinking, subjectivity, or really much of anything in this
worldview.
In short, John K Clark, your ideal world is run by thermostats, who act
to maximize things with a purity far exceeding that of any human, and
have completely free will through their complete lack of
self-knowledge. Where I come from, we call this a "reductio ad
absurdum". I'm sure you disagree, though. At this point in our
discussions, you usually change the rules by subtly redefining a term.
My guess is that it will be "value". Well?
--
sentience@pobox.com  Eliezer S. Yudkowsky
http://tezcat.com/~eliezer/singularity.html
http://tezcat.com/~eliezer/algernon.html
Disclaimer: Unless otherwise specified, I'm not telling you everything I think I know.