From: Richard Steven Hack (richardhack@pcmagic.net)
Date: Fri Mar 01 2002 - 18:35:20 MST
At 02:36 PM 3/1/02 -0800, you wrote:
>On Fri, 1 Mar 2002, Richard Steven Hack wrote:
>
> > My point is that a fully developed nano entity has no need for "allocation
> > of resources" - at least not as we understand it. Without biological
> > death, and with the nano ability to construct anything at will from cosmic
> > resources, the only need to construct anything will be in pursuit of goals
> > we cannot now imagine.
>
>Hmmmm... There are goals we "can" imagine; survival comes at the top
>of my list. While the cosmic resources are large you don't have
>access to a relatively "infinite" amount of them over short time
>periods (perhaps of the order of millions of years). You hit the
>limits of the solar system fairly rapidly after the singularity
>takes off (probably in less than 1000 years).
I would say sooner than that by far - IF you assume that large-scale
consumption of resources by posthumans is likely. My point is that this
is NOT likely - certainly not for pure survival reasons. Unless you can
make a case for that...
You also assume that some significant percentage of the human race goes
posthuman. That might not be the case. If one assumes that only a small
percentage of people desire the Transhuman state - and that the majority
of the human species does not - it may well turn out that only a small
percentage (say, 1%) become posthuman. That gives us (at the present
population) at most 60 million posthumans. I will grant that kind of
number would eat up the solar system IF they need or want to. And I also
suspect such a number will have no problem (in fact, I suspect ONE
posthuman will have no problem) eliminating the rest of the human species
if they become a problem to their (its) survival.
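Just to make the scale concrete, here is a rough back-of-envelope sketch
in Python. The numbers are illustrative assumptions (a 2002 world
population of ~6 billion, usable solar-system mass of ~2e30 kg, a 1 kg
replicator seed, a one-year mass-doubling time), not anything established:

  import math

  # All figures below are illustrative assumptions, not established numbers.
  population = 6e9                      # rough 2002 world population
  posthumans = 0.01 * population        # the 1% scenario above
  print(f"posthumans at 1%: {posthumans:,.0f}")        # 60,000,000

  seed_kg = 1.0                         # assumed replicator seed mass
  solar_system_kg = 2e30                # dominated by the Sun's mass
  doublings = math.log2(solar_system_kg / seed_kg)
  print(f"doublings to consume the solar system: {doublings:.0f}")  # ~101
  # At one doubling per year, that is roughly a century - well under
  # Robert's 1000-year figure, which is why I say "sooner than that by far".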
> > Granted, it is possible that some of these goals
> > may require cooperation between such entities, but it is not certain that
> > this cooperation will need to be "traded for" - it might be freely
> given if
> > the goal is considered desirable by all the entities concerned.
>
>Agreed. But meta-"entities" at interstellar distances have a
>difficult time "cooperating" on anything. The light speed limits
>on probes and the spreading of communications beams make the costs
>of "cooperation" very high unless the communication requirements
>for cooperation are *extremely* small relative to the amount
>of computation that has to go into what is being communicated.
You're assuming that they operate on that scale. They may not. Over time
they may, but if you assume the speed of light as a limit (I don't, given
the potential for extraordinary physics discoveries by posthuman
intelligence), then it will be a very long time (relative to our standards,
not posthuman standards, for whom I suspect time will be mostly irrelevant)
before posthumans spread very far anyway.
And if they can't cooperate, that bolsters my point that they may not bother.
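For a sense of the raw light-speed cost being argued here, a quick sketch
(the distances are standard astronomical values; treating one round-trip
message as the minimal unit of "cooperation" is my simplifying assumption):

  # Round-trip signal delay at c; one exchange per round trip is a
  # simplifying assumption for the "cost of cooperation" point above.
  distances_ly = {
      "Alpha Centauri": 4.37,
      "across the Milky Way": 100_000,
      "to Andromeda": 2_500_000,
  }

  for place, ly in distances_ly.items():
      round_trip_years = 2 * ly          # light covers 1 ly per year
      print(f"{place}: {round_trip_years:,.0f} years per exchange")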
> > [snip] There may be such a thing as "posthuman
> > economics" but I have yet to determine what it might entail, other than
> the
> > exchange of information.
>
>I don't think you can escape from "economics" as a paradigm for
>optimal resource allocation. If I give you free tickets
>to an award-winning Broadway play and the Metropolitan Opera for the
>same night you have to make a choice. There are opportunity costs
>for using your resources (your time) for one thing and not another.
>
>Now presumably a completely self-regulating mind could "forget" about
>the opera tickets and not realize that it had to pay the opportunity
>cost of going to the play, but that doesn't mean that the cost wasn't
>"really" paid. If the Universe is handing JBrains and MBrains stuff
>to think about they actually have to make actual choices. Costs will
>be incurred.
Not necessarily - my choice may be to ignore both sets of tickets and trash
them. You're speculating from a human viewpoint. Also, are we sure there
are "opportunities costs" for posthumans? If you aren't going to die in a
few decades, and you have everything you need to survive, why be in a hurry
to do anything? You can do everything eventually (assuming you want to and
assuming you can do enough of it to be satisfied before whatever
"end-of-the-universe-thingy" occurs). You can make choices but they may
not cost you much.
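To put that intuition in numbers: measure the opportunity cost of an
evening as the fraction of remaining lifetime it consumes (both
remaining-lifetime figures here are illustrative assumptions):

  # Opportunity cost of a four-hour evening as a fraction of time left.
  # Both remaining-lifetime figures are illustrative assumptions.
  evening_years = 4 / (24 * 365)         # four hours, expressed in years

  for label, remaining_years in [("human (~50 years left)", 50.0),
                                 ("posthuman (~1e9 years left)", 1e9)]:
      fraction = evening_years / remaining_years
      print(f"{label}: {fraction:.1e} of remaining time")

For the human the evening is about one part in a hundred thousand of
what's left; for the posthuman it is about one part in ten trillion -
effectively free.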
> > I don't think a "hive mentality" is at all likely, either. I suspect that
> > posthumans will be just the opposite - absolute individuals with no need
> > for social interaction as we know it - but perhaps with the capacity for
> > extremely intimate social relations when desired or necessary.
>
>Sure, free-floating intergalactic JBrains & MBrains.
>
> > [snip] that this will not take long. I predict that by the end of this
> > century, humans will no longer exist
>
>I'm not so sure that you can make that conclusion. I could easily see
>the "post"-humans becoming "Angels" for the humans.
>
>If there is a "moral" (ethical?) posthuman path, it would require
>that natural humans be allowed to follow a "natural" course.
>This of course gets into the very complex issue of whether
>posthuman morality trumps human morality.
>
>Robert
>
And, of course, from previous posts, you may assume that I believe it does.
BTW, the reason I theorize that the UFOs are the result of a NON-human
prehistoric civilization is precisely because I would assume one based on
humans MIGHT act as "Angels" to the race following them. I see no evidence
of this. Of course, it may simply be that once posthuman, you do not
care what happens to other sentient entities. Even Star Trek has the
Prime Directive...
Richard Steven Hack
richardhack@pcmagic.net