From: Lee Corbin (lcorbin@tsoft.com)
Date: Sun Apr 28 2002 - 12:38:58 MDT
Samantha wrote:
> Robert J. Bradbury wrote:
>
> > Nonvolitional uploading would violate Extropian principle #6
> > of self-direction. Some people might prefer to evolve
> > individually into an Aristoi. It is impossible to do
> > that if they are constrained within a SysOp/AI.
>
> Suppose a benevolent or moderately compassionate SI sees that
> many humans are so mad... I would not be surprised at all if
> such an SI at least kept up to the fraction of a second scans
> of the individuals from which they could be reintroduced in
> perhaps different circumstances if calamity struck them.
Yes, I totally agree: in my eyes it would be no less than
criminal to do otherwise. For those who disagree with us:
Would your conscience allow you to leave people you find
marooned on a raft at sea adrift to die? Would it allow you
to fail to resurrect humans who died in pre-history, if one
day we could? All of these cases are extremely similar from
the kind of advanced perspective a benevolent AI will have.
Lee Corbin