Re: Ethical basics

From: ben goertzel (ben@goertzel.org)
Date: Thu Jan 24 2002 - 16:28:00 MST


I think there is a lot of truth in what Vlad is saying.

It is nice that the segments of society with narrow-minded, largely
religiously-guided notions of ethics are focusing their attention on human
cloning and genetic engineering and so forth, and leaving AI alone.

As soon as the first really impressive "Real AI" results come out, it's
quite possible that these forces will turn against *us*.

Fortunately, though, computing is even harder to squelch than biology...
computers are everywhere in the Western world, software can be easily
transmitted from place to place, country to country, etc.... By the time
the forces of narrow-mindedness start to pay attention to us, it will
probably be too late for them to cause much damage. Especially if these
early "Real AI" programs are doing good deeds. After all, the reason
they'll fail to stop human cloning and so forth is that it does so much
good for curing disease.... It's hard for the average American to really
hate a branch of science that saves cute little kids -- or their parents --
from dying...

I think that if the Singularity is going to go down relatively smoothly with
Joe and Jane America, we have to be sure that our AI software, when it's at
the stage of "scarily smart but not yet superhuman", is doing things that are
palpably for the good of humanity.

If it's my team's AI software that gets there first, then this will be the
case, because (in our spare time ;) we are working on applications in the
area of genetics and drug discovery...

-- ben g

----- Original Message -----
From: "Vlad Vul" <vul@mail.ru>
To: "Eliezer S. Yudkowsky" <sl4@sysopmind.com>
Sent: Thursday, January 24, 2002 3:59 AM
Subject: Re: Ethical basics

> Hello Eliezer,
>
> Thursday, January 24, 2002, 12:17:38 AM, you wrote:
>
> These are M. Anissimov's words:
> >> Look forward to Singularitarian ideas reaching far, far beyond the
> >> "core audience of scientifically literate aggressive rationalists" in
> >> the very near future. Instead of insisting that the meme be stapled
> >> down to its original tiny group, (however rational and intelligent they
> >> may be) perhaps we should be considering what variants would have the
> >> best net result in conditions of imminent mass propagation.
>
> ESY> As for trying to propagate the meme outwards, I've been trying to do
> ESY> that for six years! And I've made some progress, although not as much
> ESY> progress
>
> I think wide propagation of the Singularity is a mad and dangerous idea.
> Most people suffer from future shock, and they will be furious to learn
> that progress is speeding up. Then most people, including governments,
> will BELIEVE in the Singularity, and they will do ALL they can to stop it.
>
> All we will get from propagating it is a huge backlash. If a doctor of
> math, Theodore Kaczynski (the Unabomber), and the chief scientist of Sun,
> Bill Joy ("Why the Future Doesn't Need Us"), don't like the idea of the
> end of humanity, why do you think ordinary people would not become your
> enemies?
>
> Now nobody cares about the Singularity because nobody knows about it. And
> those who know don't believe it in their hearts, as we do. But then they
> will...
> Do you want to be more wanted than Bin Laden? :o)
>
> So IMHO we need some sort of conspiracy: moving ahead while keeping the
> knowledge private within a tiny group.
>
>
> --
> Best regards,
> Vlad mailto:vul@mail.ru
>
>
