From: John K Clark (johnkc@well.com)
Date: Sat Feb 08 1997 - 22:10:31 MST
On Fri, 07 Feb 1997 Eliezer Yudkowsky <sentience@pobox.com> wrote:
>The thing about Turing machines is that all forces within the
>computation originate from clearly defined causal units.
Thank you, Guru Eliezer, for those words of wisdom. Your pride is justified;
only someone who has achieved the 800th level of SAT enlightenment could
have deduced that a Turing Machine is deterministic even if it is not
predictable. Help me follow in your glorious footsteps with my own humble
query: do you think that maybe that's why they call a Turing Machine a
machine?
>This man is a Turing computationalist?
>Lemme 'splain something, friend.
No, let me explain something to you, buddy boy. I expect and want people to
emphasize the weak points in my arguments, I rather like sarcasm even when it's
directed at me, and insults don't bother me much, but deliberately
misrepresenting my position is going way over the line. And this was not
just a misunderstanding; you're not dumb enough to think that a deterministic
machine would be a revelation to me.
>What I'm trying to say is that whatever choice you make, it *will*
>have a causal explanation within the computational framework.
Within a computational framework, sure, but not within my computational
framework, and that makes all the difference. I don't understand myself and
I don't know what I will do next, and that is why I feel free. This feeling
is not an illusion; I really do feel free.
>Ya can't lift yourself up to some pure ethical plane and say that
>your "choice" is made of your own "free will" while a thermostat
>does not choose to keep the temperature at a certain setting.
I said "A being has free will if and only if he can not predict what he
will do next", the only part of this definition that is vague is "a being".
If you insist that a thermostat is a being then I would have to say that
being had free will.
>After all, you know a lot more about yourself than a thermostat does
That could be true, but it is far from obvious. I am astronomically smarter
than a thermostat, but the thing I'm trying to understand, myself, is
astronomically more complex.
>The point I'm trying to make is that you'll assign meaning to
>something, for causal reasons explicable entirely by Turing-based
>forces.
A keen grasp of the obvious.
>The question is - will you assign meaning based on "what you want"
Yes, but your way is much better: you will assign meaning in ways you don't
want.
>which is a fancy way of saying you'll let evolution do it for you?
The reason I have the personality I do is that something caused it to be
that way: my genes and my environment. I'm trying very hard to figure out
why this should be depressing, but I am not having much luck. Of course,
if different causes, different reasons, had acted on me, then I would have
found determinism depressing and would have been a great fan of the only
alternative, randomness.
>Or will you assign meaning based on a genuinely true chain of logic?
Which is a fancy way of saying you'll let the meaning of life do it for you.
>In short, will you reason it out or accept the dictates of your
>genes?
So you want to reason it out, you want to find the reasons, the causes, that
make the machine that calls itself Eliezer Yudkowsky act the way it does.
That sounds like a worthwhile thing to do to me, but understand: you'll be
able to find some of the causes, but Turing proved you can't find
them all.
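To make the Turing point concrete, here is a minimal sketch (my own
illustration, not anything from Turing's paper; "predictor" is a hypothetical
stand-in for any would-be oracle): a program that asks the oracle what it
will do and then does the opposite defeats every such oracle.

    # Minimal sketch: no fixed "predictor" can foresee what a program
    # that consults it will do. "predictor" is a hypothetical stand-in;
    # substitute any concrete guesser and "contrarian" still wins.

    def predictor(program):
        # Pretend-oracle: claims to return the action ('A' or 'B')
        # that `program` will take.
        return 'A'

    def contrarian():
        # Ask the oracle about ourselves, then do the opposite.
        guess = predictor(contrarian)
        return 'B' if guess == 'A' else 'A'

    print("predictor says contrarian will do:", predictor(contrarian))
    print("contrarian actually does:", contrarian())

Whatever you substitute for the predictor, the contrarian contradicts it;
that is the sense in which some of the causes stay forever out of reach.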
>I'm asking if anything has any value at all.
Yes, a dollar has the value of 100 cents, and PI has the value 3.14159...
Value is the amount of a specific measurement; in this thread the specific
measurement I'm talking about is the desire to maximize.
>You say it has value? Well, I can take apart your brain right down
>to the level of individual atoms and demonstrate that you assigned
>it value simply because it tickled your pleasure centers.
No shit, Sherlock!
>And saying "Yes, it has value, to me" is simply ducking the point.
How is that ducking the point? How does that prove that it doesn't have
value for me?
>And I am not impressed. If I build a computer programmed to spew
>out: "Cats have value to me", fundamentally, nothing happens.
It's not important whether you're impressed or not; you're not the one who
likes cats. It's only important whether the computer is impressed or not.
>So what?
So cats have value to the computer.
>I simply deny that value is an observer-dependent thing.
Then you have fallen so in love with your own theory that you simply deny the
results of experiments that conflict with it. That has never worked in
science in the past and is certainly not a path to the truth. Ask a thousand
people what they value and what they don't, and you will get a thousand
different answers.
>You can have, in memory, a declarative statement that "X has value".
>So what? A CD-ROM can do the same thing with a lot less fuss.
At least a thermostat changes its state, a CD-ROM is static, and a mind that
is static is not a mind.
>We can't make things have value just by declaring they do.
Why not? We can and do make value judgments, and we do so a thousand times
a day. You say we err when we do so, but what are the consequences of that
error? Where is the collapsed bridge? Exactly what is it that goes wrong?
>Our declaration that "X has value" will cause us to act in such a
>way as to maximize X.
Yes.
>So what?
So we maximize X and are happy.
>What good is it, unless X really *is* valuable?
According to my theory, an anvil is valuable to a blacksmith but it is not
valuable to a man trying to swim across the English Channel. According to
your theory the anvil would be of equal value to both. I'm right, you're not.
>74 degrees is the Meaning of Life for a thermostat?
74? Ridiculous! It's 42.
>I am jumping out of the system.
>I'm saying that to *me*, acting to maximize X *has* *no* *value*.
You are not jumping out of THE system, you are just jumping out of YOUR
system. If you really believe the above for any X, then your life would
indeed have no value to YOU. You're still alive, so I rather doubt that is
the case.
Things are very different for me. I think many things have value, but
if X = "The Meaning of Life" then acting to maximize X *has* *no* *value*
to *me*.
>I see no place for intelligence, consciousness, brains, neurons,
>thinking, subjectivity, or really much of anything in this worldview.
I see no need for anything to give its value seal of approval to
consciousness. I do see a very great need for consciousness to give its
value seal of approval to other things.
>At this point in our discussions, you usually change the rules by
>subtly redefining a term. My guess is that it will be "value". Well?
I've already given you my definition of value, I am still happy with it and
see no need to redefine it. I can't very well redefine your definition of
value because you never gave me one.
John K Clark johnkc@well.com
PS: I probably should have waited an hour or two before I wrote this reply;
it wouldn't have had such an ill-tempered tone. But there were reasons
that I acted as I did, even if I don't know what they were.