RE: Motivation and Motives

From: Lee Corbin (lcorbin@tsoft.com)
Date: Mon Sep 16 2002 - 23:15:42 MDT


gts writes

> > I'm confused. My preference would be to avoid calling
> > human heart beats human *behavior* and to also avoid
> > calling internal biological processes *motivated*.
>
> If that is your preference then you should stop trying to call internal
> biological processes "behavior," as you did when you challenged my axiom
> by citing the human heart-beat as an example that seems to defy it. :)

Yes, you're correct, I'm sorry. Bad example (at best).

> However, again, I am prepared to defend the notion that internal
> biological processes are unconsciously motivated behaviors.

We might agree here too, but, I'm afraid, the key question is,
*whose* unconsciously motivated behavior? Although it is exceedingly
peculiar to put it this way, I would admit that perhaps the
body could be said to have such a motive (e.g. to perform internal
biological processes), but *not* the person. To reiterate:
I am not motivated to send blood through my veins.

> > You do agree then that "the only human motivations are
> > the [ones like greed, love, etc.]"? Good. Then we are
> > closer to complete agreement. As a concession signifying
> > my good faith, I will throw in "the human body is motivated
> > to beat a human's heart" iff it can be said that mechanical
> > devices without intelligence can have motives.
>
> I've been thinking carefully about the "mechanical devices with possible
> motives," i.e., robots, and here is my conclusion: A robot should be
> considered an extension of its owner/creator.

Pardon me for interrupting, but I will throw in: if you hold
a flame near a bimetal strip that is oriented a certain way,
it will recoil away from the flame. Would you say that it is
motivated to do this? (After all, it's expanding on one side
faster than on the other, and perhaps you view this as a
sufficient motivation.) That's *not* a rhetorical question,
just something for you to add to your list of things to think
about.

> So then the behavior of a robot *IS* linked intimately to
> motivation and driven by the pursuit of the reward experience.

Another question: is it always the case that that which is
motivated is driven by the pursuit of the reward experience?

> >> I think what is needed here is a larger definition of "you".
> >
> > This is the rub, all right. I believe that I oppose an
> > enlarged definition.
>
>
> Yes, you seem to oppose a larger definition of "you" that would include
> your physical body and your most primitive drives such as the drive to
> live and be well. Why is that?

I do not know how unconscious any part of my "drive" to
keep on living and to be well may be, but it doesn't matter to me.
If I get uploaded, or survive this century some other way,
I don't care what happens to my body or even to my primitive
drives. ***I*** want to be in control, and to dictate, dismiss,
or rewrite my drives and urges at will.

> the truth is that we are as much physical beings as we are
> mental beings.) You are your body as much as you are your
> mind.

Totally disagree, of course.

> Even your mind itself cannot (in my view) be considered
> separately from your brain, at least with current technology.

Right. Only the discredited dualists believe that the mind
does not depend on the brain; but I and many others on this
list believe that the brain is only one of many hardware
devices that could in principle support my mind, and that
if it were replaced overnight by an equally suitable device
then I'd never know. You probably weren't around during our
interminable discussions of "What if this is a simulation?".
Most people here seemed to believe that it was possible that
we would find out that we had all been uploaded without our
knowledge. Tipler went on about it in his 1994 book "The
Physics of Immortality".

> Any future technology that might inject the
> mind into an inorganic substrate would require that the inorganic
> substrate also be encoded digitally or in some other way with the
> information that describes the physical attributes of the brain,
> including and especially the DNA that controls gene expression in each
> neuron and thus thought itself. (This idea is where Rafal and I parted
> company, because it seems that he like you would like to think the human
> psyche can exist separately from the information contained in the genes
> of the cells of the brain that hosts it).

Though apparently half the cryonicists believe as you do,
I think that only a minority of the transhumanists or
extropians do (though I could be wrong). Anyway, quite
a number of us (whom you might call "functionalists" --- a
not entirely correct term) do believe as Rafal does, and
moreover think that the mind is to the brain as the program
is to the computer.

> Good, so why can you not agree that the heart is
> motivated by you to keep beating?

As above, I don't think so because we disagree on what
"you" means, your usage being much more inclusive than
mine.

> > I'm implying there that the crucial difference is
> > *consciousness*, and we've already agreed that there
> > are many unconscious motivations.
>
> Right.
>
> > But I claim that unconscious motivations are not of
> > the sort that do these kinds of things [e.g., heart beats]
>
> But yet I think you've agreed above that a conscious decision to stop
> trying to live can often cause one to die.

Yes, sometimes. But you will surely admit that it's *rare*,
not *often*. After all, how many people can will themselves
to die when a fate worse than death awaits them?

> > In any case, I would say that though I would be highly motivated
> > to beat my heart if I could, I cannot consciously do so and so
> > therefore we should say that I am not so motivated.
>
> Try applying your reasoning above to your breath rather than your
> heartbeat. Here we see an internal process that you *can* control
> consciously, and which you will control unconsciously at other times.

Yes, that's true. But you've lost me. So what? :-)

Lee


