On Sun, 31 Oct 1999, Hal & Spike commented on extending life to 500 years.
> Spike Jones, <spike66@ibm.net>, writes:
> > ... they *do not want* to live that long, even if they could
> > do so with good health!
>
> I believe that once it is possible actually to live for 500 years in good
> health, many fewer people will be willing to kill themselves on principle.
>
You have to factor in lifestyle and economic situation. "Most" people hate their jobs and wouldn't want to work for 500 years. But if you point out that by investing a few dollars a year they could retire by the time they are 100 or so and never have to work again, then they see things differently.
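A quick back-of-the-envelope illustration of the compounding point. The contribution, real return, and horizon here are numbers I've picked purely for the sketch, not anything from the thread:

    # Toy compound-interest sketch; all numbers are assumptions for illustration.
    def future_value(annual_contribution, real_return, years):
        """Value of start-of-year contributions after `years` years."""
        total = 0.0
        for _ in range(years):
            total = (total + annual_contribution) * (1 + real_return)
        return total

    if __name__ == "__main__":
        contribution = 1_000   # dollars per year (assumed)
        rate = 0.05            # 5% real return (assumed)
        years = 75             # roughly age 25 to age 100
        print(f"Paid in:  ${contribution * years:,.0f}")
        print(f"Value after {years} years: "
              f"${future_value(contribution, rate, years):,.0f}")

Even a thousand dollars a year at a modest real return ends up roughly ten times what was actually paid in over a 75-year horizon.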
There is also the fact that most people are pretty bored (all you have to do is look at sensationalist TV to know that). I suspect those shows are watched primarily by people with IQs in the lower half of the distribution. So now you also have to bump those people up to an IQ of around 120 or 130, where they have the ability to create an interesting life for themselves.
So the only way to ask the question would be:
Would you like to live 500 years as an independently wealthy, gifted, healthy individual?
Asked that way, the answer is "yes" more often. Fortunately, we will have the technologies to do this.
Getting back to the start of this thread (Xiaoguang Li's proposal), I'd offer a couple of comments.
Vinge's singularity is not a singularity in the true black-hole
sense. There are clearly limits on the matter, energy, and perhaps
volume of the universe. More importantly, there are limits on how
effectively you can utilize those resources. So the curve may be
exponential for a while, but it doesn't go to infinity. In "reality"
it eventually levels out (see the sketch after the list below).
I suspect it goes through 4 stages:
- The energy-limited expansion (the historic past, until you have
  all of the power available from the sun -- theoretically within
  days of having full nanoassembly capacity).
- The matter-gravity-well limited expansion (after you have all of a
  star's energy, until you have finished restructuring all the
  material in your solar system -- limited by the size of the gravity
  wells in which the material is located).
- The speed-of-light-foresight limited expansion (look in your
local region for uncolonized spaces and colonize them; look
in those local regions and colonize them, etc. It makes no
sense to colonize distant regions because they may be colonized
by the time you get there. It also assumes there is some reason
for colonization).
- The intergalactic-distances limited expansion (these distances
are so large and the foresight time required so great that you
only do this very slowly, if at all. It may simply be easier
to wait for intelligence to evolve on its own).
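Here is the leveling-out point as a tiny numerical sketch, entirely my own illustration -- the growth rate and resource ceiling are arbitrary. It runs the same growth rate with and without a fixed ceiling:

    # Exponential growth vs. the same growth rate under a fixed resource
    # ceiling (a logistic curve).  The unconstrained curve blows up; the
    # constrained one levels out at the ceiling.  Rate and ceiling are
    # arbitrary assumptions.
    import math

    def exponential(t, rate=0.1):
        return math.exp(rate * t)

    def resource_limited(t, rate=0.1, ceiling=1000.0):
        # Logistic curve starting at 1.0 with the same initial growth rate.
        return ceiling / (1.0 + (ceiling - 1.0) * math.exp(-rate * t))

    if __name__ == "__main__":
        for t in (0, 50, 100, 150, 200):
            print(f"t={t:3d}  unconstrained={exponential(t):14.1f}  "
                  f"limited={resource_limited(t):7.1f}")

The unconstrained column keeps multiplying; the limited column stalls near the ceiling, which is the punctuated, leveling-out behavior described above.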
Separately, there is a bandwidth/energy-cost limit on the rate at which you can expand knowledge "collectively", depending on the inter-entity distance and the energy cost of communicating across that distance.
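To get a rough feel for that cost (again my own toy numbers -- the receiver aperture and the per-bit detection threshold are assumptions): with an isotropic transmitter and a fixed receiving aperture, the fraction of each transmitted joule that gets captured falls off as 1/d^2, so the transmit energy per delivered bit grows as d^2.

    # Transmit energy per delivered bit vs. distance, for an isotropic source
    # and a fixed receiving aperture.  Aperture size and per-bit threshold are
    # arbitrary assumptions; the d**2 scaling is the point.
    import math

    AU = 1.496e11          # metres
    LIGHT_YEAR = 9.461e15  # metres

    def captured_fraction(distance_m, receiver_area_m2):
        """Fraction of isotropically radiated energy the receiver intercepts."""
        return receiver_area_m2 / (4.0 * math.pi * distance_m ** 2)

    def transmit_energy_per_bit(distance_m, receiver_area_m2=1e6, needed_j=1e-19):
        """Transmit energy so the receiver collects needed_j per bit."""
        return needed_j / captured_fraction(distance_m, receiver_area_m2)

    if __name__ == "__main__":
        for name, d in [("1 AU", AU), ("1 light-year", LIGHT_YEAR),
                        ("1000 light-years", 1000 * LIGHT_YEAR)]:
            print(f"{name:>17}: {transmit_energy_per_bit(d):.2e} J per bit")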
Then you are back to the energy limits when you have to switch over from fusing light elements to dropping iron into black holes for energy. Of course you can switch over to this sooner if you are a "think fast, die soon" fan.
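For scale, a quick sketch using rough textbook efficiency figures (about 0.7% of rest mass for hydrogen fusion, roughly 6% for accretion onto a non-rotating black hole, up to ~40% for a rapidly rotating one -- the exact numbers depend on details I'm glossing over):

    # Energy released per kilogram of fuel under rough, textbook efficiencies.
    # Fusion runs out once you reach iron; accretion onto a black hole works
    # on anything, including iron, at a much larger fraction of m*c**2.
    C = 2.998e8  # speed of light, m/s

    def yield_joules(mass_kg, efficiency):
        return efficiency * mass_kg * C ** 2

    if __name__ == "__main__":
        print(f"Fusion, H -> He   (~0.7%): {yield_joules(1.0, 0.007):.2e} J/kg")
        print(f"Accretion, slow hole (~6%): {yield_joules(1.0, 0.06):.2e} J/kg")
        print(f"Accretion, fast hole (~40%): {yield_joules(1.0, 0.40):.2e} J/kg")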
Whether we end up as AIs or IAs, it is the limits above that both will be facing. The singularity is a self-perceived rate-of-change problem; taken literally, it can't happen. What you get instead is punctuated evolution.
Robert