From: Samantha Atkins (samantha@objectent.com)
Date: Wed Mar 10 2004 - 23:11:26 MST
On Mar 5, 2004, at 1:33 AM, Marc Geddes wrote:
> I also wonder just how much Information on AGI should
> be shared with the general public whilst the projects
> are going on? I see that Ben is publishing quite a
> lot about his project, and Eliezer has publicly
> published quite a lot as well. Be aware that anyone
> with net access can read all that. Dictators in China
> and North Korea, the odd psychopath... is it wise to
> provide too much information about how to create AGI?
> On the one hand, sharing can advance research, on the
> other, the risk of someone creating Unfriendly A.I.
> is increased. And of course, shorter-term A.I. results
> could have substantial proprietary value.
>
The number of people who can understand a workable design even for much
lesser projects is small. The number who both understand it and have
the time/talent/resources to implement it or anything like it is
infinitesimal.
One of the things most likely to lead straight to a dystopia is being
such control freaks about information and knowledge that the advancement
of human effective intelligence is stillborn before it becomes capable
enough and concentrated enough to succeed at AGI. Our proprietary
attitudes toward knowledge may be more on the problem side of the
equation than the solution side.
> To tell you the truth, I was slightly uneasy even
> posting those few very general ideas to sl4 you see in
> my last couple of posts.
>
I really don't see why. Those ideas, plus a hell of a lot of
inspiration, genius, and hard work, still may well not be enough.
> Most of the people working in A.I. probably visit sl4
> and copy everything down. They're probably ripping
> off all our ideas without a second thought.
>
"Ripping off"? I wish that someone would "rip off" something critical
to increasing the intelligence available before it is much too late.
> The marketing side of it needs to be a lot more
> careful as well. Some things (like the Sysop idea)
> will just cause people to go ballistic. Other things,
> like wild speculation about life after the
> Singularity, will just cause people to dismiss it all
> as sci-fi fantasy. I would never have talked about
> the Singularity at all. And for God's sake don't talk
> about life after the Singularity! Most people just
> don't believe a word of this stuff. There is too much
> hype and far too many 'slip-ups' on the marketing
> side. And the few non-scientific people in the general
> population who do believe this stuff are scared
> shitless by it. AI will upset religious and social
> norms.
For Newton's sake, don't try to tell us what we should and should not
talk about! One person's rant is another's inspiration. Religious
and social norms need to be upset. We need to grow new norms as
quickly as we can.
>
> All of this may sound a bit paranoid, but really Sing
> Inst needs to start thinking about these things. It's
> just 'Singularity Realism'.
One person's realism is another person's paranoia. :-)
- s