On Fri, Sep 14, 2001 at 11:16:30AM -0700, J. R. Molloy wrote:
> From: "Anders Sandberg" <asa@nada.kth.se>
> > What would transhuman values be?
>
> Rationalism, pure intelligence, quest for knowledge, and extropic principles.
Why don't you think these are part of human values?
My guess is that you would say they have been tainted in the human
mind by all sorts of "useless hypotheses" and emotions, and that we
need a transhuman mind free of them to achieve these values. But we
can strive for values even if we are not good at achieving them. By
making the above values transhuman, you are implying that humans can
never achieve them, which is both a very pessimistic claim about
humans, at odds with much of the humanist foundations of
transhumanism, and a ready excuse for not pursuing them until
super-neurotech is here to fix our brains.
Personally I consider pure intelligence to be a contradiction in
terms: intelligence is something practical that gets its metaphorical
hands dirty in learning about and changing the real world. Emotion is an
important part of intelligence too, if only as a value selection
system. I seriously doubt even a highly reconfigured mind could do
away with it and remain functional in the real world.
--
-----------------------------------------------------------------------
Anders Sandberg                                      Towards Ascension!
asa@nada.kth.se                            http://www.nada.kth.se/~asa/
GCS/M/S/O d++ -p+ c++++ !l u+ e++ m++ s+/+ n--- h+/* f+ g+ w++ t+ r+ !y