RE: How to tell if you are a nice person (The VR Solipsist's Altruism)

From: Lee Corbin (lcorbin@tsoft.com)
Date: Sat Jun 22 2002 - 14:53:25 MDT


Colin writes
> Mother Theresa's 'good work', interpreted as pure altruism to
> the point of sainthood, is a 3rd party view. The 1st person
> view of M.Theresa will never be known. What can be said is that
> motivation can be tuned from selfish to unselfish and back again
> linguistically.
>
> Some 1st person views:
> e.g.. I, Saint Theresa, behaved apparently altruistically, to help others
> because:
> * (unselfish) I truly wanted to relieve suffering
> * (selfish) I feel better in the presence of people who weren't
> suffering.
> * (unselfish) I believe in setting a good example to others
> * (selfish) that others (the 'believers in the presence of altruism')
> behave in ways beneficial to me personally (I get the warm fuzzy
> kickback)
> * (unselfish) I believe in the dictates of my God.
> * (selfish) that I need to score God brownie points
> for the purposes of dealing with the pearly gate situation.

In some people, I think that the impulses you've listed
act as memes, or Minsky agents, fighting for control.
Of course, many of those will not be articulated, and
some not even consciously acted upon. I think that it's
a truism, for example, that the first two are both a
part of certain people's motives.

> The extent to which the 1st person altruistic behaver is deluded about
> the unselfishness of the situation is all that may be arguable.

Yes, for sure people will be strongly tempted to admit only
unselfish motives. But you've made me realize that even
scoundrels may have mixed motives.

> Colin Hales
> *I think I've been possessed by Daniel Dennett! Eeek!*

Good. Channeling Dennett probably increases one's accuracy.

> In Lee's example we have 'gods' that tell you that you are the only
> 'real' person. Everyone else is a puppet indistinguishable from the
> 'real thing'. This means that they are all 'shells' with whatever drives
> them located 'outside'. I propose that whatever it is 'outside' that
> drives the puppets is as sentient as the poor VR solipsist.

Oh, at least! That is, the "God", or Operating System, or advanced
Alien driving the experiment is definitely vastly superior to humans,
and if it has any emotions, they are likely far more advanced than ours.
The point is merely that the VR solipsist *cannot* affect the controlling
creature any more than you could be affected by toy soldiers you were
playing with.

> The fact that they are all avatars with 'real' veneers is irrelevant.
> More than that I propose that what we humans are experiencing now is
> indistinguishable from the proposed scenario. The software that enacts
> your world which contains all your history and relationships and the
> characters of all those you interact with, that had you fooled so
> well that you had to be informed of the 'deception', is 'off-board'
> but no less real, even though declared by Lee as such - this I think
> is at the heart of it. A tautology involving the use of 'real' and
> 'exists'.

I think that you mean "the software making up the OS that you've
already been interacting with all these years is equivalent to
the people (had they existed) that you have been interacting with".

Here is why I would say no. First, yes, the OS could have chosen
to implement your world, with you being the VR solipsist, by
making a "Mom" module, a "Dad" module, and so on. These modules
would have acted (computed) all the actions of Mom and Dad. In
addition, therefore, they would have also experienced the emotions
that you thought Mom and Dad were experiencing. But the OS had
a second option (and this is the one that I am hypothesizing):
It could have directly, itself, after having studied how people
appear and interact, and because it's so very intelligent,
simply *faked* how a Mom would act and how a Dad would act.
We know this is possible because some actors can portray
deep emotion without experiencing that emotion.

> As such, it would be pointless to alter your behaviour. Truman had
> an option. The simulated VR solipsist doesn't. The real issue here,
> IMO, is understanding what a 'real' sentient entity is.

Well, to use one of your own examples, why should not the VR solipsist
steal coins out of a beggar's cup if he is totally convinced that the
crime will not be detected? In fact, this is one of the ways that *my*
behavior would change, were I to discover that I'm the VR solipsist:
I'd steal from anyone I could safely steal from. Why not?

> I think we have to lose our prejudice of an absolute
> standard := 1 x healthy human as the only benchmark.

I don't know what your point is here (sorry).

> The most amoral entity I see in the simulation
> is the 'Gods' who think they can create viable
> sentiences and play with them like toys.

Oh, who gives a fart about them? Since the time
of the Greeks, or before, people have questioned
the conduct of gods, but without becoming skeptics
of their reality. If you were the VR solipsist,
you'd just have to deal with it.

Lee



This archive was generated by hypermail 2.1.5 : Sat Nov 02 2002 - 09:14:58 MST