From: James Higgins (jameshiggins@earthlink.net)
Date: Sat Jun 22 2002 - 16:42:58 MDT
At 03:13 PM 6/22/2002 +0200, you wrote:
>On Sat, 22 Jun 2002, Eliezer S. Yudkowsky wrote:
>
> > > Considerable leverage is available to people to inhibit the kinetics
> > > of early stages via legacy methods.
> >
> > Name EVEN ONE method.
>
>Isn't it quite obvious? Laws and their enforcement.
>
>If you're working with radioactive materials, especially fissiles, nerve
>agents, pathogens, or recombinant DNA, you're subject to them. I distinctly
>hope that anything involving molecular self-replication in a free
>environment and ~human-level naturally intelligent systems will see heavy
>regulation, at least initially.
>
>Maybe your AI will turn out to be all love and light. Then maybe not.
Damn! I haven't laughed that hard in YEARS! Laws? Are you really serious?
I bet you believe in gun control too (because we all know that if you outlaw
all guns, criminals won't use them, right?).
Why the HELL would anyone working on the Singularity obey the law? By the
time anyone even realized that they hadn't, it would be too late. Not to
mention the fact that it takes government bodies YEARS to understand
moderately complicated technology. Look at the Internet for an example:
many government bodies still don't get it, and that is based on 20+ year-old
technology...
>Once again, we're talking about avoiding specific developments. The
>relevant threat for this forum is runaway superintelligence, a Singularity
>turned Blight and death of us all due to side effects.
Do you suppose the superintelligence will obey the laws too? (sorry,
couldn't resist)
>Don't expect a specific scenario. The more specific it is, the more
>irrelevant. What we could do is draft broad guidelines, which necessarily
>need to be adaptive in nature.
>
>I told you the inhibiting aspects: regulating, tracking, enforcing. This
>is not pretty, nor does it guarantee 100% success, but it's a lot better
>than nothing. Ask your friendly molecular biologist working in Level 3
>facilities how they're coping with the regulation load. Remember that the
>temporal scope of the regulations is limited, the hard limit being the
>late stages of the Singularity, which will shrug off any regulations imposed
>by previous players due to the rising power gradient.
And these regulations are why there have never been any bio-terrorism
problems? Oh wait, there have been. Well, all it would take is ONE
Singularity incident and that's it. Thinking that ANY regulation could
have ANY noticeable effect on this problem is ridiculous. Unless you plan
to take away everyone's computers and closely observe what is being done
with the few computers you do allow. Is that what you are proposing?
My apologies, in advance, to the list for not playing nice in this post. I
just couldn't believe this was a serious suggestion.
James Higgins