"Dan Fabulich" <daniel.fabulich@yale.edu> wrote on Sunday, May 07, 2000 6:14
AM,
> Some finite number of Harvey Newstroms wrote:
>
> > Then I must clarify my goals. I don't want some Harvey Newstromish
> > person to exist in the future. I want *this* current Harvey Newstrom
> > to continue to exist and evolve into the future. I do not want him
> > destroyed and replaced by a new Harvey Newstrom.
>
> That's an interesting goal to have. However, I fear that it begs the
> question as to why you have it in the first place.
>
> I happen to be willing to "die" (or whatever) so long as a pretty recent
> copy of me lives on. (By pretty recent, I mean on the order of months,
> maybe a year under extreme circumstances.)
Your full posting seems to represent our respective positions accurately.
We do have different goals for what we are trying to achieve by avoiding
death.
It almost sounds like you are saying that the ends justify the means.
That is, as long as another person is similar enough to you to carry out
your plans, you don't need to exist anymore. You don't seem to want to
continue living just because you like being alive, but to achieve some
future goals. As long as someone else is programmed with your meme-set to
carry out your goals, you seem ready to die.
> Why did you want to preserve your "self" (as such) in the first place?
I don't have a good proof for why I shouldn't die. I just would like to
avoid it. It seems like you question whether self-preservation is a useful
goal.
> More to the point, if "you" turn out to be the program that's running on
> the fantastic machine that is your body, then shutting down a copy
> shouldn't be too big a problem. If you happen to be your current
> consciousness stream, or control over it, or whatever, then destroying
> that is a pretty big deal as far as our primitive desires are concerned.
This is where the semantic problems come in. We do not all agree on what
the self is. I don't know all the answers, but I find it hard to agree with
a definition of self that allows selves I cannot detect. I don't know if
there are other copies of me. How can they be me if I don't even know that
they exist? I think, therefore I am. But I cannot think with the copy's
brain, any more than I can think with your brain. Until I get a replacement
brain that I can think with, the replacement is unacceptable.
> I take it as axiomatic that our goals, our ends, are what we are striving
> for; that the correct goals/ends are the correct things to strive for.
> With that in mind, do we have it as a goal to preserve our current
> consciousness stream?
I do. If self-preservation is just to create a new individual and allow
ourselves to die, then why limit ourselves to exact copies? Why not make
better individuals? Why not genetically engineer our children or build a
race of superior robots? All of these may be fine goals, but they have
nothing to do with my personal survival. If continuing this current
consciousness stream is not the goal, then why make the copy similar to me
at all?
> As far as I can tell, the answer is no. My lizard brain has nothing to
> say about whether I'm an adjective, as John Clark has been known to put
> it, or a noun, as you might have it. My intuitions just tell me to
> preserve "myself," and to avoid "death."
I have a twin. He is not me. I have had some people tell me that they
couldn't tell us apart. This does not convince me that I can die. I still
perceive an obvious difference between him and me. I have never gotten us
confused. I am obviously in this body, and he is obviously in that body.
No similarity in thinking or perceptions will suddenly merge our
consciousnesses into a single individual. From that point of view, I must
insist that different similar objects can and do exist. The question is
not, "Does the universe need two Newstrom brothers?" The question is, "Do I
want to continue living?" I see no connection between these two queries.
> In light of that, and in light of my other goals, it seems to me like it
> might be most efficacious to choose the definitions of "self" and "death"
> under which I'm best able to accomplish my goals. Now, only *I* have
> special knowledge of my own goals, and therefore *I* am in a unique
> position to pursue them. But as far as I can tell, the most useful
> definition (from the point of view of my goals) is the "adjectival" view.
I know what you are saying, but it sounds too much like fudging the answer.
A lot of people on this list like to choose definitions that best support
their goals. Sometimes reality is not cooperative. Sometimes things are
not the way we want them to be. I think we need to answer the questions
about reality first, and then choose goals that are possible within that
reality.
> So why should I care about consciousness streams? What sort of goals ARE
> you pursuing that makes it a good idea to decide that "avoiding your own
> death" means "maintaining control over this very consciousness stream"?
> For they seem to be drastically different from mine.
Continuing my consciousness stream is my goal. You seem to discount this
goal and ask what other good goals come out of my life. If I can't come up
with any great purpose for my life, should I kill myself? Maybe I will
think up a great purpose later. For now, maybe I'll just stick around for
the entertainment value. Continuing consciousness is a goal in and of
itself. If this stops, why would I care about any other goals that occur
afterwards?
--
Harvey Newstrom <http://HarveyNewstrom.com>
IBM Certified Senior Security Consultant, Legal Hacker, Engineer,
Research Scientist, Author.