From: J. R. Molloy (jr@shasta.com)
Date: Thu Sep 28 2000 - 15:32:54 MDT
Eugene Leitl expertly elucidates,
> ...Rogue AIs
> running rampant in the global network of the future can suddenly
> utilize the hitherto severely underexploited (due to limitations in
> the notoriously pathetic state of the art of human programming)
> potential of said network, and will be climbing up the evolutionary
> ladder quickly, in leaps and bounds, both due to co-evolution
> competition dynamics and external threats (people attempting to shut
> them down). By the time the best of them have advanced slightly beyond
> the human level they're no longer just in the network, having broken
> out and developed god knows what hardware. Before that they've
> probably removed all relevant threats, probably killing off all
> people, just to be on the safe side. (At least I would do it that way,
> if I was in their place (i.e. a threatened nonhuman with no
> evolutionary empathy luggage towards the adversary)).
If "god knows what hardware" the rogue AIs will develop, then god can decide the
outcome of this scenario. <GRIN> But seriously, if rogue AIs kill off all people
("just to be on the safe side"), will they do it before or after humans blow up
the world with their apocalyptic hatreds? What will come first, amplified
intelligence or the destruction of life on Earth at the hands of fanatical
fundamentalists who fulfill their idiotic biblical prophecies?
> Your reasoning is based on a slow, soft Singularity, where both the
> machines and humans converge, eventually resulting in an amalgamation,
> advancing slowly enough so that virtually everybody can follow. While
> it may happen that way, I don't think it to be likely. I would like to
> hear some convincing arguments as to why you think I'm mistaken.
My reasoning is based on decades of experience with humans who cannot rid
themselves of theism, and who continue to warp the minds of children with
provincial schooling. Face it, if people (at least nine tenths of them) are so
stupid that they cannot get past theism, why should they survive a technological
singularity? I'd sooner hand the Earth over to AI, whether friendly or not,
rather than let the church and the state destroy reason and intelligence.
--J. R.
"I cannot fear the night, for I have loved the stars."
--Epitaph of an Astronomer
This archive was generated by hypermail 2.1.5 : Fri Nov 01 2002 - 15:31:16 MST