From: Wei Dai (weidai@weidai.com)
Date: Thu Nov 14 2002 - 12:49:58 MST
On Thu, Nov 14, 2002 at 12:35:52AM -0800, Lee Corbin wrote:
> Thanks. But continuing my theme, isn't there an equality between what
> is best for me objectively speaking, and the integral over all space
> and time of my subjective benefit?
No, because otherwise you would choose to play back your favorite
experience over and over again (and erase your personal memory of it after
each playback so they all feel exactly like the original experience). Once
we can manipulate subjective experiences at will, it becomes imperative to
stop valuing subjective experiences as an end. (They'll still be valuable
as a means of obtaining information about the objective world.)
An alternative is for us to develop strong irrational taboos against
directly manipulating subjective experiences and then continue to base our
decisions on expected subjective experiences. Perhaps this is the more
likely alternative since it seems easier for evolution to accomplish.
(It seems to prefer hacks upon hacks rather than a clean redesign, but
then again maybe this tendency will go away when people are able to
consciously redesign themselves.)
In this second alternative, when duplication becomes possible we'll
also have to learn and internalize new ways of expecting subjective
experiences that are not based on patterns of past experiences. For
example, the character in your pit story has to "expect" to experience
being out of the pit every time he pushes the button, even if he has
already pushed it thousands of times and experienced staying in the pit
each time.
> Lastly, we are moral animals, (Jef Albright urges that one read Robert
> Wright's "The Moral Animal" in preference to Matt Ridley's "The Origins
> of Virtue"). Our evolutionary altruistic circuits (IMO) lie at the
> basis of much of our kindness towards other creatures, and I certainly
> don't want ever to sacrifice that part of our makeup. But when some folks
> learn that animals and people are only molecules in motion, then in
> *objective terms*, or so they say, happiness and suffering of others
> cease to be important considerations (to the degree that they are
> able to override these ancient impulses).
I don't understand these folks. Why do they want to override their
altruistic impulses but not also override their selfish impulses? If they
believe that nothing matters, then why bother trying to override any
impulses at all?
This archive was generated by hypermail 2.1.5 : Wed Jan 15 2003 - 17:58:07 MST