From: Ben Goertzel (ben@goertzel.org)
Date: Tue Aug 08 2006 - 06:03:44 MDT
Hi,
About the "problem" of a lack of widespread understanding of the Singularity...
Ricardo said:
> First of all, I think it's not even a problem of people being able to
> understand the singularity. Right now, I would say the problem is more
> fundamental - people don't even hear about it. I'd say that even most
> computer programmers don't know about it!
>
> Second, even in the cases that people have heard about it and don't
> understand it, I don't think it's a matter of understanding the
> evolution of computer speed and capacity. I'd say it has more to do
> with the fact that many educated people have a very rough idea of how
> our brain works, and an even rougher idea of what things a computer
> can do and how it does those things. Those two things combined cause
> people not to understand how a computer could simulate/emulate a
> brain.
Actually, I have talked to a lot of folks about the Singularity, and I
don't think either of these is the main problem.
Most educated, open-minded, non-religious people I talk to are willing
to believe that
* something like a Singularity could happen eventually
* eventually AIs will probably supersede human intelligence
* ultimately, brain scanning will let us map out the workings of the
brain and replicate them in some kind of engineered machinery
What most people seem to have a really hard time believing are the
timing estimates being put forth by Kurzweil, me, and others ---
or even the order of magnitude of these estimates.
The appeal to exponential growth is understood by any educated person
... but so is the appeal to failed predictions from the past (such as,
to take a single amusing example, Jetsons-style commuter spacecraft,
which were projected by many enthusiastic futurists for the early 21st
century, yet are nowhere to be seen today...)
In other words, it's a lot easier to convince educated, open-minded,
non-religious people that
"The Singularity is Coming ... Eventually ... If Humanity Doesn't
Destroy Itself First"
than that
"The Singularity is Near"
And while these two propositions are not that far apart in
intellectual content, they are quite far apart in pragmatic
consequences...
-- Ben G