From: Richard Steven Hack (richardhack@pcmagic.net)
Date: Sat Mar 02 2002 - 01:06:37 MST
At 11:06 PM 3/1/02 -0500, you wrote:
>Richard Steven Hack wrote:
> >
> >
> > I would say sooner than that by far. IF you assume that consumption of
> > resources by posthumans on a large scale is likely. My point is that this
> > is NOT likely - certainly not for pure survival reasons. Unless you can
> > make a case for that...
>
>It is a rather standard human characteristic of a general incapacity to
>properly appreciate exponential growth curves, whether it is saving
>money for retirement, or calculating resource consumption rates for a
>posthuman civilization.
>
>That being said, one error that Robert Bradbury's most extreme "Pave the
>Universe" scenarios make is to fail to account for the fact that each
>greater generation under Moore's Law not only enables greater power,
>processing, information, etc, but it allows the resource utilization
>efficiency to rise as technology improves (this is, in fact, how new
>generations of computers are able to squeeze greater performance out of
>the same number of silicon electrons). It should not be surprising that
>each new Moore generation would improve the utilization of mass and
>energy resources by similar scales of exponential improvement.
>
>Right now, energy and matter consumption efficiency doubling times in
>human civilization still lag quite a bit behind computational efficiency
>doubling times, but they are also improving. The advent of
>nanotechnology will give this resource efficiency doubling time a kick
>in the pants much as photonic circuits will give Moore's Law a kick in
>the pants in a few years with computers.
That seems reasonable.
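The doubling-time comparison above can be sketched numerically. This is purely an illustration - the doubling times used below are hypothetical placeholders, not measured figures - but it shows how a curve with a much slower doubling time still compounds dramatically over a few decades:

```python
# Illustrative sketch of two exponential improvement curves with
# different doubling times. The figures (1.5 and 6.0 years) are
# hypothetical, chosen only to show how even a lagging curve compounds.

def improvement_factor(years: float, doubling_time_years: float) -> float:
    """Total multiplicative improvement after `years` of steady doubling."""
    return 2.0 ** (years / doubling_time_years)

# e.g. computational efficiency: assumed 1.5-year doubling time
compute = improvement_factor(30, 1.5)    # 2^20, roughly a million-fold

# e.g. matter/energy efficiency: assumed slower 6-year doubling time
resource = improvement_factor(30, 6.0)   # 2^5 = 32-fold over 30 years
```

Even the "lagging" resource curve improves 32-fold in thirty years under these assumed rates, which is the sense in which resource-efficiency gains can offset raw consumption growth.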
> >
> > You also assume that some significant percentage of the human race goes
> > posthuman. That might not be the case. If one assumes that only a small
> > percentage of people desire the Transhuman state, and if one assumes that
> > the majority of the human species do not, it may well come to it that only
> > a small percentage (say, 1%) become posthuman. That gives us (at the
> > present population) at most 60 million posthumans. I will grant that kind
> > of number would eat up the solar system IF they need or want to. And I
> > also suspect such a number will have no problem (in fact, I suspect ONE
> > posthuman will have no problem) eliminating the rest of the human species
> > if they become a problem to their (its) survival.
>
>With practical immortality, it is irrelevant what percentage of each
>generation chooses to live in a transhuman state. If ANY percentage
>chooses to, eventually the vast majority of humanity WILL be transhumans,
>and the 'naturals' will be considered as much a feral minority as the
>bushmen of the Kalahari are today.
>
>The best revenge is to outlive your enemies.
Agree with the latter statement. However, I wasn't referring to the notion
that eventually there will only be transhumans - that assumes ALL humans
die out, which is unlikely, since humans do breed. The question is whether humans
will actively resist transhumanism. One can be optimistic and assume that
eventually most humans will transcend, but this is not a certainty. We
still have Kalahari bushmen... In fact, a very large percentage of the
world's population has yet to make a phone call. I would say this
indicates there will be a significant lag time between the early adopters
and the rest of the species. This lag time is likely to be shorter than
any other, but if the effect of it is to traumatize the population into
resisting transcendence, the result could be disastrous - for humans...
> > >
> > >Agreed. But meta-"entities" at interstellar distances have a
> > >difficult time "cooperating" on anything. The light speed limits
> > >on probes and the spreading of communications beams makes the costs
> > >of "cooperation" very high unless the communication requirements
> > >for cooperation are *extremely* small relative to the amount
> > >of computation that has to go into what is being communicated.
> >
> > You're assuming that they operate on that scale. They may not. Over time
> > they may, but if you assume the speed of light as a limit (I don't, given
> > the potential for extraordinary physics discoveries by posthuman
> > intelligence), then it will be a very long time (relative to our standards,
> > not posthuman standards for whom time I suspect will be mostly irrelevant)
> > before posthumans spread very far anyway.
>
>The problem is that you have no idea what the potential for
>'extraordinary physics' is, on an objective basis. The record of human
>scientific discovery doesn't bear this out.
The record of human scientific discovery doesn't bear out a lot of
things. That is irrelevant to whether a significant change in physical
theory is still possible. I do not know enough physics to have confidence
in the Theory of General Relativity - especially since I know there ARE
scientists with apparently adequate degrees who still argue that exceeding
the speed of light MAY be possible.
> For instance, the speed of
>light has been known since the days of ancient India (a means of
>calculating light speed has been found in an ancient Sanskrit text), and
>that knowledge has not changed in several thousand years. No new
>scientific development has shown it's actually possible to surpass the
>speed of light in a practical way (i.e. even in the recent FTL
>experiments, the wavefront NEVER exceeded light speed).
>
> > >
> > >I don't think you can escape from the "economics" as a means for
> > >optimal resources allocation paradigm. If I give you free tickets
> > >to an award-winning Broadway play and the Metropolitan Opera for the
> > >same night you have to make a choice. There are opportunity costs
> > >for using your resources (your time) for one thing and not another.
> > >
> > >Now presumably a completely self-regulating mind could "forget" about
> > >the opera tickets and not realize that it had to pay the opportunity
> > >cost of going to the play, but that doesn't mean that the cost wasn't
> > >"really" paid. If the Universe is handing JBrains and MBrains stuff
> > >to think about, they have to make actual choices. Costs will
> > >be incurred.
> >
> > Not necessarily - my choice may be to ignore both sets of tickets and trash
> > them. You're speculating from a human viewpoint. Also, are we sure there
> > are "opportunity costs" for posthumans? If you aren't going to die in a
> > few decades, and you have everything you need to survive, why be in a hurry
> > to do anything? You can do everything eventually (assuming you want to and
> > assuming you can do enough of it to be satisfied before whatever
> > "end-of-the-universe-thingy" occurs). You can make choices but they may
> > not cost you much.
>
>This is a rather cultish interpretation of things, in a further extreme
>beyond the regular nano-santa giddiness that occurs here and elsewhere
>among some transhumanists.
No it is not. My point is simply that the notion of "opportunity costs"
is not, AFAIK, a proven principle of the universe OR of information
theory. And the question is not whether there ARE costs; the question is
whether those costs exert pressure on one's survival. Among humans, they
may. For posthumans, I remain unconvinced, and I doubt anyone can prove the
point, since there is no posthuman existence to point to as evidence.
>By ignoring the tickets, you certainly are forgoing an opportunity, even
>if you trash them.
Are you saying that any time someone offers me something, I have to worry
about it? Is that normal human behavior? No wonder humans never get anything
done... It is irrelevant whether I am forgoing the opportunity - what is
relevant is whether the opportunity was significant to my survival
priorities, and what the actual cost - i.e., the actual impact on my
survival probability - is of forgoing it. Posthumans MAY - I
do not say this as a certainty because I am not yet posthuman - have much
less to worry about in this regard than humans.
Basically, your argument is an attempt to scale up human problems to a
posthuman scale. Understandably so - as Vinge noted, you can't think in
Transhuman terms if you aren't a Transhuman, so people tend to bring
the issues down to their level. But it's not correct unless we are talking
about truly immutable physical laws that apply equally to any entity. And
as noted above, I doubt that "opportunity costs" qualify as such a law.
> So long as all information is not evenly distributed
>in the universe, there WILL be opportunity costs for any intelligence at
>ANY level of development. This is an information equivalent of the laws
>of thermodynamics.
See above.
> > >
> > >I'm not so sure that you can make that conclusion. I could easily see
> > >the "post"-humans becoming "Angels" for the humans.
> > >
> > >If there is a "moral" (ethical?) posthuman path, it would require
> > >that natural humans be allowed to follow a "natural" course.
> > >This of course gets into the very complex issue of whether
> > >posthuman morality trumps human morality.
> > >
> > BTW, the reason I theorize that the UFOs are the results of a NON-human
> > prehistoric civilization is precisely because I would assume one based on
> > humans MIGHT act as "Angels" to the race following them.
>
>Actually, given the Fermi Paradox, and if you accept that a)
>intelligence is common in the universe, b) intelligent entities will
>tend to spread out through interstellar space as they advance, and c)
>nanotechnology and quantum computation and quantum encryption are
>standard developmental technologies of technological intelligences, then
>it logically follows that there could easily be intelligent posthuman
>races existing here on Earth, who originated from one or many other
>worlds in our galaxy, and who exist in a nanotechnological environment
>as 'crypto-dirt', who communicate via quantum encrypted packet
>signalling which we are unable to discriminate from background
>radiation. Such posthuman civilizations could exist here and could be
>hacking into the nervous systems of individuals to produce hypnogogic
>experiences with 'angels' and 'aliens' and 'demons', and, of course,
>making people hallucinate UFO's where none actually exist.
>
>This conjecture is at least as likely as your nano-santa dreams (and
>requires the exact same set of circumstances to be true).
I agree that theory is a possibility. I thought about that a couple years
ago. In fact, the distinction between that theory and mine lies merely in
the origin of the entities involved - if they evolved here on Earth, it's
my theory - if elsewhere, it's yours. Otherwise, it's the same theory. As
to its truth, that is something we will not likely determine until we are
posthuman. As someone said once, "The gods will not speak to us face to
face until we ourselves have a face...."
> > I see no evidence
> > of this. Of course, it may simply be that once posthuman, you simply do
> > not care what happens to other sentient entities. Even Star Trek has the
> > Prime Directive...
>
>Star Trek is BUNK.
But it does illustrate my point.
Richard Steven Hack
richardhack@pcmagic.net
This archive was generated by hypermail 2.1.5 : Sat Nov 02 2002 - 09:12:43 MST