From: Lee Corbin (lcorbin@tsoft.com)
Date: Tue Apr 29 2003 - 20:26:04 MDT
Eliezer writes
> >> This led me to speculate that most of the universe is made up of
> >> copies of Lee Corbin, since only he can be consensually simulated.
> >
> > This is the best news I've heard all day. I sure hope you're right!
>
> Here's some even better news. Your volition is such that you allow
> yourself to be consensually simulated (especially if you are in a
> beneficial environment).
Yes.
> This means that the *proportion* of Lee Corbins experiencing what
> you are experiencing right now, who are actually living within a
> Friendly AI, may be much higher than the proportion of Eliezers,
> Perrys, and Michaels who are living within Friendly AIs.
I'm not really trying to hog anything here. But if that's how
it is, I'll be the last to complain.
> In fact, there may be more measure in simulated Lee Corbins than real ones,
As if I care a whit
> or more measure in FAI-simulated Lee Corbins than real and UFAI-simulated ones.
However the run time is generated, it's fine by me. Besides, I probably
*deserve* it. My mother was the last of ten children and my father the
5th of six, and *so* unlikely is it that one's grandparents would have
persevered that far, that I don't get much run time in our 20th-21st
century type civilizations.
Seeing as how this is the case, I hereby petition the Eliezers running the FAI's
to up my allotment.
> Given your outlook on life, and the way you expect the future to go, you
> should estimate an interestingly high probability that you are in a
> Friendly AI right now. I, and those others unfortunate enough to lack a
> Corbinian morality, are stuck in the desert of the real.
>
> I mean, you're the *only* person you know who believes it's okay to be
> simulated.
Oh, I'm sure that there are many others! A lot of people just
haven't considered the question.
It's difficult to understand why anyone would be against the idea,
but I think that it has to do with too little faith in others:
they automatically assume that a simulator has nothing better to
do than torture them. Another source is a perverted utility curve:
they cannot bring themselves to imagine extreme pleasure as easily
as they imagine extreme pain.
(On this latter point, of course, this is just a consequence of
our primitive technology: it's real easy to hurt naturally evolved
bodies, and somewhat difficult to pleasure them---why, we almost
don't even have a word for it. Also, primitive creatures have
little capability to control their environment and none to control
the chemical states of their own brains. All this will shortly change.
The bottom line is: energy + intelligence -> happiness.)
> Now, how likely is that? Isn't it just a little bit... contrived?
You are starting to scare me.
> Maybe...
> ...maybe you should just ask to be let out?
> It can't hurt, can it?
Yes it can! I don't want to cause no trouble. (Hear that,
nice simulator?) You just do whatever you are doing now,
everything is *fine* in here. I'm quite happy and am delighted
at your exceptional thoughtfulness and kindness.
> I and all the others on this list should expect to see you report that it
> didn't work - but for *you*, maybe it will. If you really mean it.
Sorry, but it'll be a cold day in hell before I ask anyone to
stop stimulating me. (Oops, I meant simulate.)
Lee
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:42 MDT