From: Ben Goertzel (ben@goertzel.org)
Date: Tue May 21 2002 - 17:18:56 MDT
>
> >In my opinion, Darpa is not yet ready to fund the Singularity mission,
> >although as it gets closer, the US military will catch on first and
> >provide funding before the commercial sector. The safety of the
> >nation is almost priceless.
>
> I'm curious about this statement. My gut feeling is that the military
> would consider research aimed at the Singularity as something potentially
> very *harmful* to the nation. If the research is successful, it would mean
> the END of the government and the military. My feeling is also that there
> are a great deal of narrow minded people in the government/military
> sector, who perhaps are afraid of such a massive upheaval.
>
> Don't you think that the military would use their funds to support AI
> intended for warfare instead of, say, Eliezer's Friendly AI project?
>
> Have you yourself talked to people at DARPA (or other military programs)
> about the Singularity? If so, what was their reaction?

The military will have its own slant on the Singularity. It will put a lot
of $$ into AGI with the objective of getting there before anyone else, on
the intuition that if there is going to be a major breakthrough that alters
the nature of the world fundamentally, they want to be the ones steering it,
not the "bad guys."

Sure, it may not make any difference who steers the advent of the
Singularity. And it may not make any difference whether one tries for
Friendly AGI or not. But the military, like Eliezer, will make its best
effort.

I don't think the military will place a huge focus on Friendly AI, but I
suspect they will focus more on defensive than offensive applications of
AI. Offensive applications would be too easily taken by enemies and used
against us. Software is a lot easier for enemies to steal or copy than
offensive weapons like nukes or fighter planes, say. There is a huge risk
in developing aggressive software -- all it takes is one spy programmer to
give the code to China, and then we may have an enemy on our hands with
exactly the same software. For this reason I suspect that developing AGI
aimed at national defense in the literal sense will be a much bigger
priority. We wouldn't care nearly as much if China pirated our defensive
AGI as we would if they stole our killer AGI.

-- Ben G