From: Eugen Leitl (eugen@leitl.org)
Date: Thu Jun 13 2002 - 10:50:11 MDT
On Thu, 13 Jun 2002, Smigrodzki, Rafal wrote:
> ### Yes and no. A breakup of monopolies is likely to produce the creative
> chaos out of which the brightest inventions spring. You would need to
> *enforce* certain monopolies.
The power monopoly of the state, alas. If it introduces more diversity
into the node landscape, and security instead of features (or else you'd
get sued into oblivion), it can't be all bad.
> ### Agreed. But, most AI researchers say they want to develop AI
> locally, rather than networked. Measures affecting the net will not
> slow down SAI emergence, only its spread (minimally), when it's too
> late anyway.
I don't listen much to what AI researchers say. Their credibility rating
is about zero, given both their abysmal track record and their airy
dismissal of the lessons to be learned from molecular-level analysis of
existing instances of intelligence and the processes which brought them
into being. (And ain't that a Good Thing? It buys us time.)
I don't believe in a team explicitly coding an AI (a seed AI is still a
full AI as far as complexity is concerned). The only alternative I'm
aware of is breeding AI by essentially brute force, using evolutionary
algorithms. That approach has been validated before: evolution has done
it once already. The latter approach has excessive (I mean, really
ridiculously mindbogglingly excessive) hardware requirements for the
initial mapping of the parameter space and for kickstarting the
co-evolution to near-human level (below that it's innocuous).
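To be concrete about what "breeding" means, here's a toy sketch in
Python. Everything in it is illustrative; the entire mind-boggling
hardware bill hides in evaluate(), which for anything brainlike means
scoring an agent's behavior in a rich simulated world:

import random

# Toy evolutionary loop: the bare shape of the "breeding AI" approach.
GENOME_LEN = 64
POP_SIZE = 100

def evaluate(genome):
    # Placeholder fitness: count the 1-bits. A real run would score
    # behavior in an environment -- that's the planet-sized compute bill.
    return sum(genome)

def mutate(genome, rate=0.01):
    return [bit ^ 1 if random.random() < rate else bit for bit in genome]

def crossover(a, b):
    cut = random.randrange(GENOME_LEN)
    return a[:cut] + b[cut:]

population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
              for _ in range(POP_SIZE)]

for generation in range(200):
    scored = sorted(population, key=evaluate, reverse=True)
    parents = scored[:POP_SIZE // 5]          # truncation selection
    population = [mutate(crossover(random.choice(parents),
                                   random.choice(parents)))
                  for _ in range(POP_SIZE)]

The loop itself is trivial, which is exactly the point: all the cost
sits in the fitness evaluations, not in the algorithm.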
There is a considerable difference between a billion and a few hundred
nodes. Larger facilities have an operational signature that is impossible
to miss (if you thought your garden-variety pot-growing operation was
hard to conceal, think again), and can be regulated.
> ### I do not underestimate governmental ability to perform such
> actions. Information might want to be free, say the digerati, but most
> of them have never seen a real badass dictator, except on TV. You seem
> to imply that a very intrusive and comprehensive control regimen needs
> to be initiated, something much more than the silly little
> cryptography export prohibitions which didn't work. Usually, such
> control is instituted at a significant economic and societal cost. I
I point to the considerable regulations necessary for recombinant DNA
work with hot pathogens. That work requires trained professional experts,
and large facilities with a budget and a supply of specific parts. A big
AI project cannot be missed, if only by the sudden absence of a number of
key players. (I'd be watching ALifers and the neuromorphic people,
especially if they dabble in hardware.)
> doubt that the world's governments will act quickly enough (within 5
> years) and decisively enough (formation of a global government,
I don't expect human-grade AI to emerge within decades. It might take
considerably longer if the AI community further revels in its
recalcitrance/orthodoxy. In a sense this is bad, because automatic
fabbing needs robust insect-grade AI, and there will be no space
exploration without that.
> willingness to militarily destroy dissenting nations, full
> surveillance of all sufficiently advanced computer tech). They could
> but they won't because they are not scared enough. Convincing
> governments to do it will be only possible after it's too late, when a
> SAI announces they are not governments anymore. You are even less
That scenario does not appear extremely probable. If we get a critical AI
seed we're likely dead meat, anyway.
> likely to gain support for the uploading of humanity before this
> event.
Well, you can only die once. <--- here's some comfort
> ### Well, for now there is no risk but in 2020 when the desktop PC
> equals the hardware in a human brain, and from the net you can
The 2020 prediction is completely bogus. 3D integrated molecular
circuitry will take longer than that (if you want to hear a meaningless
date pulled out of my nether orifice, it's at least 2040). Notice that
this is hardware; software will take longer still, given how slowly a
mature field moves.
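A quick sanity check on the date, with every number loudly assumed
(brain throughput estimates span 1e14..1e17 ops/s, I take 1e16; a 2002
desktop at 1e9 ops/s; 18-month doublings taken on faith):

import math

# Back-of-envelope: when does a desktop match a brain, by raw ops/s?
brain_ops = 1e16      # assumed; estimates range over orders of magnitude
desktop_ops = 1e9     # assumed 2002 desktop
doubling_years = 1.5  # assumed doubling time

years = doubling_years * math.log2(brain_ops / desktop_ops)
print(2002 + round(years))  # ~2037, hardware only

And that's a flat extrapolation of raw ops/s; it says nothing about
memory bandwidth, 3D integration, or the software.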
> download scads of modules able to perform all kinds of tasks at the
> low and intermediate level of cognition, even a bunch of
You assume you can build an AI from a bunch of CPAN modules? AI is a
massively parallel numerical application (which doesn't mean it uses
floats). It most likely doesn't make a distinction between code and data
(the algorithm updating the sites is a simple enough transformation to be
implementable directly in molecular circuitry). But the state, the
pattern, is pure black magic. All the details are encoded there. How many
bits are necessary to encode a human baby? An AI doesn't have homeoboxes,
so where do you obtain the magic vector that breathes life into it? You
have to do at least part of the work evolution did over the last gigayear
on a substrate dish the size of this planet. Lots of computation there.
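For scale, the genome puts a crude floor on that bit count (assumed
figures; and most of the baby's spec is of course in the developmental
unfolding and the environment, not the raw sequence):

# Crude floor on the bits encoding a human baby: the genome.
# Assumed figures: ~3.2e9 base pairs, 2 bits per base (4 letters).
base_pairs = 3.2e9
bits = base_pairs * 2
print(bits / 8 / 2**20, "MiB")  # ~763 MiB, uncompressed

Under a gigabyte, but that gigabyte is the output of a gigayear of
search; there's no obvious shortcut to writing the equivalent vector for
an AI.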
> garden-variety religious nuts might be able to cobble together a god.
I think this is a remarkably silly claim.
> By that time you'd need to have worldwide surveillance of all PC's and
I thought we had it already. It's disguised as an industry-standard OS.
> all programmers, something that would require unification with China
> and Russia, or a nuclear war with them. I say you need political
> unification, or else the competitive forces between polities will
> assure that well-funded military labs will build their superhuman
> warriors long before the small-fry terrorists get their act together.
There's realtime sensorics and control AI for military hardware, and
there's tactical AI. Could we get human-grade AI from military labs? I
have no idea. Though superior sensors give you an advantage over humans,
the system still needs realtime capability and adaptiveness to not be
tricked by devious monkeys. OTOH, a lot of defense work tends to be
conservative.
> Is this the price you'd like to pay?
>
> Rafal
>
> PS. I wish you were right. I want to upload now but I am convinced it
> won't happen this side of the SAI emergence.
If you're right, we've booked the first seats for Ragnar0k. So lean back
and enjoy the show. (At least the special FX will be fucking amazing.)
Either way, I'm not very concerned. We have a good chance of making it
through, and if we don't, we'll die very quickly, and being dead doesn't
hurt. Looks a lot like win/win in my book.