Re: Bitter Pills

From: Randall Randall (wolfkin@freedomspace.net)
Date: Thu Jun 06 2002 - 00:07:04 MDT


Emlyn O'regan wrote:
> Barbara Lamar wrote:
>
>>> Lee Daniel Crocker wrote:
>>>
>>>>I share your repugnance, but not because it's "greed"--that's a
>>>>noble thing.
>>>
>>> I'll avoid doing business with you, Mr. Crocker, if you believe that
>>> "an excessive desire to acquire or possess, as wealth or power, beyond
>>> what one needs or deserves" is a noble thing.
>>
>> Many Randalls wrote:
>> Well, who decides what is "excessive"; what one "needs" or "deserves"?
>>
>> Since you're quoting that definition, I assume you believe that you
>> have a dollar figure above which a person is being greedy? Care to
>> share that figure, so we can all judge ourselves? :)
>
> ----
>
> (and this is me, using a crappy email client)

I've reformatted the above for you. Not sure whether you would want that;
if not, please say so. :)

> You can't have it both ways... the original comment was defining greed as a
> noble thing. Barbara in response, assuming that Lee would strive to be
> greedy (as he believes it noble), commented that she would avoid doing
> business with him on that basis. She wasn't defining him as greedy, she was
> assuming that he defined himself as greedy, and presenting her response to
> that.

However, I'm sure his definition of greedy doesn't include the term
"excessive". Hers does, which was why I asked that. :)

> This seems to be related to the reputation discussions in other threads.
> What Lee is saying (to many of us) when he says that greed is noble, is that
> he believes that it is a noble thing to look after one's own interests first,
> at the expense of all other possible considerations.

I'm not sure I would say "noble", as Mr. Crocker does; I might rather say
that it is basic. If my morality doesn't lead to more of me (in a longer
lifetime, say, or more children), then it is useless to me. One cannot
practice morality while non-existent, so morality must be a means to an end.
The most basic end is survival itself, so a lasting (workable, practical)
morality must have some connection, however tenuous, with survival.

> In terms of dealing with another entity in a business relationship,
> foreknowledge that the entity is willing to do whatever it takes to maximise
> its gain, no matter the cost to others (ie: myself), and with no regard to
> any moral imperative that I might deem important if I were in its shoes,
> seems to be a good reason to steer clear of doing business with that entity.
> More plainly, if I feel I can't trust someone, then I won't deal with them. I
> don't think that's irrational.

Certainly it isn't (or doesn't seem to me to be) irrational to refuse to deal
with those one doesn't trust. However, this means that any agent capable of
rudimentary reason about relationships will necessarily come to the
conclusion that in order to maximize self-benefit, one must build the trust
of others, and that the best way to do this is to be trustworthy.

Until recently in terms of history, the problem with this argument was that
one would expect to die just as one achieved the highest trust levels of
one's life, so there was an implied incentive to break trustworthiness near
the end of one's life, or if a major opportunity presented itself. Yet it
is still the case that lifelong trustworthiness could improve the lot of
one's children, so trustworthiness was selected for in mostly stationary
populations, on the whole. Civilization as we know it couldn't exist
otherwise, I would expect.

However, there's a new game on the horizon, one in which there isn't
necessarily going to BE any endgame, ever. Nor can there be, in a world of
accelerating progress, any opportunity major enough to offset all the
(presumed infinite) gain that could be accrued by being trustworthy at all
times. So one would expect rational self-serving (greedy, I think Mr.
Crocker would say) transhumanists to strive to be trustworthy in all
situations, except possibly lifeboat situations. This behavior
would correlate nicely with the best behavior expected of altruists, and
it seems to me that there is no accident in that: altruism is based on the
greed of genes, which are potentially immortal in the pre-singularity era.
I speak figuratively, of course, but the arguments from reason which drive
a greedy transhumanist to be as perfectly moral and trustworthy as she can
be drive Darwinian selection of genes as well, if I understand correctly.
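Incidentally, the endgame reasoning above matches what game theory says about
repeated games: with a known final round, backward induction unravels
cooperation in every round, while an effectively unbounded horizon can sustain
it. Here's a toy sketch of that, using the standard iterated prisoner's
dilemma with a grim-trigger strategy; the payoff values and function names are
my own illustration, not anything from the thread:

```python
# Toy sketch (my own illustration): why a known "endgame" breaks trust,
# while an unbounded future makes trustworthiness the greedy choice.

# Standard prisoner's dilemma payoffs: temptation (defect vs. cooperator),
# reward (mutual cooperation), punishment (mutual defection).
T, R, P = 5.0, 3.0, 1.0

def cooperation_sustainable(delta):
    """With probability `delta` of there being another round, cooperating
    forever under grim trigger beats a one-shot defection followed by
    permanent punishment iff R/(1-delta) >= T + delta*P/(1-delta),
    which simplifies to delta >= (T - R) / (T - P)."""
    return delta >= (T - R) / (T - P)

def finite_horizon_play(rounds):
    """With a known last round, defecting in that round is dominant; knowing
    that, defection is dominant in the round before, and so on back to the
    start. Every round unravels to defection."""
    return ["defect"] * rounds

# A short expected future (low delta) can't sustain trust;
# a long one (high delta) can.
print(cooperation_sustainable(0.4))   # below the (T-R)/(T-P) = 0.5 threshold
print(cooperation_sustainable(0.9))   # above it
print(finite_horizon_play(3))
```

The point of the sketch is just the one made in the paragraphs above: the
longer (or less bounded) the expected future, the more a purely self-serving
agent gains by being trustworthy throughout.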

Hope this helps explain where I'm coming from... :)

-- 
Randall Randall <randall@randallsquared.com>
Crypto key: randall.freedomspace.net/crypto.text
...what a strange, strange freedom:
   only free to choose my chains... -- Johnny Clegg


This archive was generated by hypermail 2.1.5 : Sat Nov 02 2002 - 09:14:37 MST