From: Stathis Papaioannou (stathisp@gmail.com)
Date: Sun Mar 16 2008 - 06:06:25 MDT
On 16/03/2008, Jeff L Jones <jeff@spoonless.net> wrote:
> So given this way of thinking about the copying process objectively,
> instead of worrying about some "personal identity" surviving or not
> surviving (which I would argue is meaningless), there is only one
> right answer to what I should anticipate *objectively*, regardless of
> what my goals are. I want to maximize the survival of my copies, not
> because I'm worried about my personal identity surviving, but because
> that maximizes my current causal influence on the world. No matter
> what my goals are, I have a better chance of accomplishing them if
> more of my copies survive.
I might have a greater chance of accomplishing my goals if I am
replaced by an AI which determines what those goals are by observing
and talking to me. If it came down to a choice, it would be more
"rational" to kill myself and allow this AI to take over rather than
to continue living. The problem is that if I choose in favour of the
AI, I can't anticipate having further experiences, which is the main
disadvantage of dying; there's not much point in furthering my goals
if I won't be around to enjoy the result. To put it differently, being
able to have future experiences is one of my most important goals, and
pursuit of that goal involves concepts of personal identity and
subjective anticipation. I agree that people who maintain the old
views will die out when evolution favours the objective view, but what
can I do? It's the way my mind works.
-- Stathis Papaioannou