From: Robert J. Bradbury (bradbury@aeiveos.com)
Date: Sun Mar 24 2002 - 09:57:25 MST
(I'm somewhat playing devil's advocate here...)
On Sun, 24 Mar 2002, Amara Graps wrote:
> The reason I don't like this ('adapt people to technology')
> perspective is that it can easily skew the humans' value system to
> one of: technology has a higher value than humans.
When you realize what an absolute kludge humans are (speaking
from the standpoint of a programmer), that conclusion isn't very
hard to reach.
> I think that _humans_ are first and foremost and from here,
> we seek means of survival (*and growth*) on this planet/outwards
> working, as best we can, with the complex interaction between
> the Earth's biology, sociology, chemistry, physics, politics,
Are we? If you accept that humans are sub-optimal, then doesn't
growth [really, more importantly -- evolution] become much more
important than survival?
> I agree that technology moves faster than we can keep up with
> it, but we should always keep in mind that _it is 'us'_ who are
> creating it.
Actually, Amara, that isn't true. In everything from our computers
to our cars to our planes to even your telescopes and probes,
it's our *tools* that are doing much of the creation. As a team
at Brandeis has shown, if you give a program a way to recognize
something "useful", it can do the creation aspect quite easily,
with no input from the human other than having gifted it with
the ability to say "ah-ha". I could go on about computers
writing, creating art, data mining, etc.
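(A toy sketch of that principle -- not the Brandeis team's actual
system; every name and parameter below is invented for illustration.
The human contributes only the "ah-ha" recognizer, and a simple
evolutionary loop does the creating.)

import random

TARGET = "AH-HA"                          # stands in for any "useful" artifact
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ- "

def aha(candidate):
    # The sole human contribution: a recognizer that scores how
    # "useful" (here: how target-like) a candidate is.
    return sum(c == t for c, t in zip(candidate, TARGET))

def mutate(candidate, rate=0.1):
    # Copy a candidate, randomly altering some characters.
    return "".join(random.choice(ALPHABET) if random.random() < rate else c
                   for c in candidate)

def evolve(pop_size=200, generations=500):
    # Start from random noise; no design knowledge is built in.
    pop = ["".join(random.choice(ALPHABET) for _ in TARGET)
           for _ in range(pop_size)]
    for gen in range(generations):
        pop.sort(key=aha, reverse=True)
        if aha(pop[0]) == len(TARGET):    # the program says "ah-ha"
            return gen, pop[0]
        # Keep the top tenth; refill with mutated copies of survivors.
        survivors = pop[:pop_size // 10]
        pop = survivors + [mutate(random.choice(survivors))
                           for _ in range(pop_size - len(survivors))]
    return generations, pop[0]

gen, best = evolve()
print("generation %d: %s" % (gen, best))

Swap in a different recognizer and the same loop will "create"
something else entirely, with no further human input.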
We have passed out of the realm where humans are the sole
creators on the planet. We began leaving it long ago --
it takes a furnace and a hammer to make a blade.
> I really think that we can't emphasize enough the value of
> humans/transhumans. It is from this position of our high value, that
> we will/can address the problems of which technology can bring.
No argument. But as we have discussed many times, we don't have
a good solution for when "The needs of the gods outweigh the
needs of the demi-gods, or the non-gods".
There was a good (repeat) episode about this on Stargate SG-1
last night. A race of relatively primitive people (X) has been
relocated to planet 'paradise'. Planet paradise is the only
known planet with a Stargate that has the right atmospheric
conditions for X to live. An alien spaceship shows up holding
the remnants of civilization Y, a 10,000-year-old civilization
that has been wiped out, and starts terraforming planet paradise
so that it can support the resurrection of civilization Y.
Of course, the terraforming resources on the ship can only be used
once, and planet paradise is the only planet out of millions examined
capable of supporting Y. The terraforming process is completely
deadly to people X, and there isn't enough time to evacuate
them even if they could find a suitable alternate planet.
The dilemma for the Stargate team is whether to sacrifice
people X or attempt to destroy the spaceship, rendering
civilization Y permanently extinct. They discover an interesting
solution to the problem which I will not disclose here.
I don't think these types of moral dilemmas are going to go
away anytime soon. I think one has to come up with better
arguments than asserting a priori that the value of humans, or
even the rights of humans, must be preserved indefinitely.
Anyone who has read any science fiction can presumably
come up with more than a few cases where that simply
doesn't seem to be justifiable.
> What values, ethics, politics enable that?
The same values and ethics (I'll leave politics out of the equation)
that can rationally recognize when oneself (as an individual
or a group or a species) is no longer relevant in the grand
scheme of the universe.
Robert