From: John Marlow (johnmarlow@gmx.net)
Date: Wed May 16 2001 - 00:35:17 MDT
> I think what is needed is honesty, less bullshit from journalists
> and scientists and promoters of this or that. Or if they are
> promoters of X or Y, let them state it clearly.
Clearly stated: I'm a promoter of nanotechnology as the solution to
most of our problems and the means to the realization of most or all
of our aspirations.
Clearly stated: I think it more likely that the damned thing will
kill us all--so I'm a bit ambivalent about speeding things up.
:)
And, obviously, there'll be no stopping it, so we might as well face
the beastie and do our best.
I also think it entirely conceivable that a completely nonbiological
consciousness (which is what most seem to be thinking of when they
refer to AI/SI) may be flat-out impossible. "Self-awareness" as
generally discussed in relation to AI/SI is not consciousness.* A
computer which interprets data and makes logical decisions based upon
those data is not displaying consciousness. A computer which passes
the Turing Test is a parlor trick; the test itself is meaningless
because its definition of intelligence is meaningless.
You may well ask for a definition of consciousness.
Well, hey--if you gotta ask...
jm
*Simple example: A computer detects heat which will damage its
components, and eliminates the heat source or moves components away
from it. Another: A computer discovers a way to improve its source
code (because it has been programmed to do just that), and does so.
Is this "consciousness"? I think not.
Of course, consciousness is not a necessary component of the SkyNet
scenario.
--
On 16 May 2001, at 1:39, Spudboy100@aol.com wrote:

> In a message dated 5/16/2001 12:47:50 AM Eastern Daylight Time,
> johnmarlow@gmx.net writes:
>
> << My meaning was that there may BE no "long term." We'll be
> extraordinarily fortunate to stagger through another century. >>
>
> I understand your suggestion of many threats to human and biotic
> existence, be it a Spike or a Nuke or Global Warming or whatever.
>
> We may indeed wind up in the shit can of annihilation, but that has
> been pimped before by many journalists and by Paul Ehrlich in The
> Population Bomb, the Club of Rome, and the rest, including neo-nazi
> survivalists of the 1980s and such. I would suggest that the human
> species will survive, and that it may well have to deal with
> problems that have to do with "Skynet scenarios."
>
> I hold with George Dyson's more convivial view that we will be part
> of what makes SIAI work, but it will take astronomically more
> computational power than even Kurzweil has suggested. For example,
> look how much computer capability has increased over the last 20
> years; it is having an impact, but nothing that has approached the
> science fiction of a William Gibson or a Bruce Sterling.
>
> We do suck at environmentally friendly technology for energy and
> for using raw materials. That is largely a part of the marketplace,
> and how it interweaves with human psychology. Why split water for
> hydrogen fuel cells if gasoline is so cheap (circa 2000)? Why make
> hydrogen fuel cells if the old IC engine is the only thing to
> drive? You get the idea; it's all linked.
>
> Pessimism, as such, seems to be worse for a society, especially a
> democracy (republic), than optimism. People who are pessimistic
> settle for less and are less critical of their "leadership." People
> can also turn to psychotic types of leadership out of a sense of
> desperation.
>
> I think what is needed is honesty, less bullshit from journalists
> and scientists and promoters of this or that. Or if they are
> promoters of X or Y, let them state it clearly.
>
> Peter Jennings is happy to editorialize (American Broadcasting
> Company) while he reads the news. Verification of what he claims is
> often a harder matter to obtain.
>
> We need honesty to make good choices.
>
> Mitch

John Marlow