From: Samantha Atkins (samantha@objectent.com)
Date: Thu Aug 16 2001 - 18:47:57 MDT
"Robert J. Bradbury" wrote:
>
> Samantha commented on my comment regarding whose hands the
> "blood of billions" shall stain and whether any of them/us
> will be around in need of absolution (you can see my Catholic
> background showing here...).
>
> While I understand your point Samantha, I believe my fears
> and your fears lie in different areas. I'm less afraid of
> nanotech and more afraid of amoral AIs than you probably
> are (correct me if this is incorrect). I also don't believe
> I would want to live in a world managed by any kind of
> intelligent SysOp (no matter how friendly). That probably
> derives from my growing up in Massachusetts, right next
> to the "Live free or die" state of New Hampshire. There
> are also the Tea Party & Concord Green incidents that are
> part of my heritage. You can take the boy out of Boston
> but can't take the Boston out of the boy.
>
Actually, I was mainly reacting to the seeming implication that
guilt, over acting either too slowly or too quickly, should be
a significant motivator. I think it is healthier, generally and
especially on the REALLY BIG things, to be motivated primarily
by what one wishes to achieve and be part of, by the positive
rather than the negative. That sort of motivation is more
likely to have us act at the right speed and to keep us focused
on what we wish for rather than on avoiding "blood on our
hands".
I share some misgivings about a Sysop scenario. I am not sure
it is a valid and viable "solution". But it was not one fear
over another that provoked my response. My greatest fear is
that we act without knowing what it is we want to achieve and
thus act very unwisely with any and all technologies at our
disposal. The price of the attendant errors could be our
extinction. But I don't draw much of a distinction as to which
form of technology is likely to be the most dangerous.
> However, I don't view our perspectives as mutually exclusive
> given the current state of the universe (lots of matter
> and energy for constructing things and lots of physical
> "space" that can be allocated for security zones). So
> it seems like both of our visions are realizable. However
> what doesn't seem realizable to me is the "resurrection"
> of those whose information content is irretrievably lost.
> We may be able to recreate a reasonable facsimile of Sasha,
> one that I can even "believe" is him, but in my heart I
> will always question how close it can be to the original.
>
I do not know whether we are in a simulation, or whether such a
simulation would be run with or without backups. Most likely
the best course is to assume we are not. If not, and if time
travel is impossible (without creating alternate worldlines),
then there is indeed such irreparable loss. However, whether we
view it as a tragedy for which we bear personal responsibility
and guilt is another question. I don't believe such a burden of
guilt is necessary to create the conditions for avoiding such
loss. I think creating this kind of guilt can get in the way
and be dangerous in and of itself.
> The blood of billions *will* be on the hands of those arguing
> against cryonics whether or not the reanimations or recreations
> are done with or without either nanotech or the SysOp.
No, it won't. As long as there is freedom, those who don't see
cryonics as viable bear no guilt whatsoever. Honest opinion is
not guilt-producing. Forcing others to forgo their own opinions
is cause for guilt.
- samantha