From: Alan Grimes (alangrimes@starpower.net)
Date: Thu Jan 31 2002 - 10:55:29 MST
Dark something wrote:
> I don't think what individual humans wish for right now really matters
> to their fate at all;
You are speaking against a fundamental tenet of humanism...
I'm not saying that you are incorrect, but rather that yours is an
extremely unpleasant viewpoint.
> I think it would be considered unethical for a Next-Level
> entity to *not* convince a human (which it could do quite easily) to
> accept the uploading/transcension process.
While the inclusion of "I think" proves that your ego is better
restrained than Eliezer's, you must always remember that all ethics are
relative.
> particles, all genes, all living things, all rocks, on many subsequent
> levels and all over the place) is integrated into the Final Pattern.
How is this different from, or better than, a Gaia spirituality?
> I have faith that the Sysop will
Faith is dangerous -- ALWAYS.
> create an environment for us that will lead to the greatest possible
> growth/enlightenment/happiness/fulfillment/intelligence/etc.,
We appear to already be in that environment. Our challenge lies in
improving ourselves, not in crawling into a box with lots of blinking LEDs.
> regardless of what we think now.
Huh?
Anyone who respects me at all will not manipulate me in such a way.
[natural death]
> Only an unethical, subhuman AI would allow this to happen without a
> fight.
That sounds very authoritarian. Please be clear about where you stand on
the political spectrum.
> And AIs always win in fights with humans, perfectly and
> physically harmlessly in cases of intellect/memetics.
An AI can do many things. But that capability alone has never been, nor
ever will be, connected with what is right, proper, and just.
--
DOS LIVES! MWAHAHAHAHA
http://users.rcn.com/alangrimes/ <my website>