Re: Definitions, semantics...(WAS:basic logic)

From: Harvey Newstrom (mail@HarveyNewstrom.com)
Date: Wed May 15 2002 - 12:25:39 MDT


On Wednesday, May 15, 2002, at 09:54 am, Alex Ramonsky wrote:

>
>
> KPJ wrote:
>
>> It appears as if Eliezer S. Yudkowsky <sentience@pobox.com> wrote:
>> |
>> |Uploads, cyborgs, mutants, transspecies, simulations, and discorporates
>> |are not transhuman unless they are smarter than Homo sapiens sapiens. A
>> |transhuman is a transhuman mind.
>>
> Surely a human mind in an immortal body would be transhuman, even if it
> were no smarter after the transfer (at first)...?

This is interesting. I'm not sure we have specifically defined
transhumanism. Perhaps we have merely given examples, and everybody
assumed a different criterion for why the examples were transhuman.

I always assumed that transformation to a different state would be
transhuman. Transmogrifying into Eeyore would be transhumanism for me
after I transcended my human condition. Eliezer seems to assume that
there must be some improvement above the human condition. Therefore,
transmogrifying into a super-friendly, super-smart,
artificially-intelligent Tigger would be transhumanism for him.

(Wait a minute. I'm Eeyore again! How come I'm never Tigger?)

--
Harvey Newstrom, CISSP <www.HarveyNewstrom.com>
Principal Security Consultant <www.Newstaff.com>

