Re: Jaron Lanier Got Up My Shnoz on AI

From: John Grigg (starman2100@lycos.com)
Date: Tue Jan 15 2002 - 02:27:11 MST


J.R.,

You took some serious hits!! Before you abandon ship on these points, would you like to expound further??

I do realize even Hans Moravec believes humans may become hopelessly obsolete. But, I think part of the extropian dynamic of courage is to not give up on oneself. I think a lot of substantial upgrading could be done to the standard homo sapiens sapiens, so hang in there!

> I think of cryonics as a neo-Luddite technology that seeks
> to preserve (freeze) unenlightened biological entities
> instead of evolving to higher capabilities.

So, in your mind it sounds like you are saying, "out with the old (let them die), and in with the new (A.I. and otherwise)." Natural selection at its finest! Will you be implementing this concept on yourself? ;)

best wishes,

John

J. R. Molloy <jr@shasta.com> Wrote:

> I think of cryonics as a neo-Luddite technology that seeks
> to preserve (freeze) unenlightened biological entities instead of
> evolving to higher capabilities.

But it's a little tricky to go on a program of self-improvement and evolve
higher capabilities if you're dead; I think it would cramp my style.

> Artificial sentience has no monetary value for the same reasons
> that human sentience has no monetary value.

There is no way that could be true. If sentience had no monetary value,
evolution would not have invented it. Sentient beings must behave in ways
that non-sentient beings do not, in this case in ways that enhance survival.
Behavior always has monetary value.

> Furthermore, we can attain superlative sentience, while our creative
> ability may remain undeveloped.

You can't have superlative sentience without new superlative thoughts,
and that's just another name for creativity.

          John K Clark jonkc@att.net



This archive was generated by hypermail 2.1.5 : Sat Nov 02 2002 - 09:11:40 MST