Samantha Atkins wrote:
>
> "Eliezer S. Yudkowsky" wrote:
> >
> > Samantha Atkins wrote:
> > >
> > > I would also point out, once again, that a belief that it is
> > > possible for the essence of you to jump substrates at all is
> > > grounds for something that looks very much like a
> > > technologically backed doctrine of the soul, and even soul
> > > migration and possible reincarnation. I embrace those
> > > implications; you would demean others who believe something like
> > > this is possible on grounds different from yours.
[snip]
> I don't see how this says anything about what I was speaking of.
> If the essence of you can be not only preserved but uploaded and
> partially or wholly downloaded into other bodies then this is
> very much like the notion of a "soul" that inhabits this body and
> that can survive the end of this body. It is a technological means
> by which this old dream/vision/wish can be made real.
The point is that "soul" is probably the most famous of all the suitcase
terms. I once proposed that it should be dissected, at least, into
"immortal soul", "extraphysical soul", "weird-physics neurology",
"morally-valent soul" (atomic game-theoretical unit of moral
responsibility; "free will"), "qualia", "mind-state preservation", and
"self-continuity".
The issues involved in uploading bear on mind-state preservation,
self-continuity, qualia, and possibly weird-physics neurology. It would
be both unjustified and sloppy to import the suitcase term "soul" and
thereby conflate the postulation of extraphysical immortality and
religious judgement with the idea of technological transference of the
self-continuity of an informational pattern to a new substrate.
Essentially the same
issue of pattern continuity versus material continuity is raised by
molecular replacement in the human body.
> > I tend to be confident about my ability to handle what goes on in my head,
> > but I don't mess with social group polarization, I don't mess with
> > attaching moral valency to predictions (learned that the hard way), and I
> > don't mess with religious analogies. I tend to regard these things as the
> > cognitive equivalent of sticking your head in a microwave oven.
>
> I tend to regard disowning hard questions using absurd analogies
> as the equivalent of sticking your head in the oven and lighting
> a match. <g>
I am not disowning the hard questions, but yes, I am suggesting that we
should give absolutely no a priori confidence to past attempts at
answering them; any relevant ideas should be considered on a
case-by-case basis rather than as part of a category. Experience to
date has shown (a) that past ideas are usually wrong and (b) that
people have a tendency to become sentimentally attached to past ideas. I
would therefore argue either for complete neutrality (if confident of
rationality) or the use of a small corrective bias against ideas to which
you fear you may have become sentimentally attached (meta-rational
correction for estimates of self-fallibility).
I don't think that anything is gained by dragging soulism into this and I
am not inclined to grant it the benefit of the doubt.
-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence
This archive was generated by hypermail 2b30 : Fri Oct 12 2001 - 14:39:42 MDT