From: Harvey Newstrom (mail@HarveyNewstrom.com)
Date: Mon Jul 30 2001 - 09:11:05 MDT
Spike Jones wrote,
> The more I learn about the singularity, the more puzzling the entire
> concept becomes to me. I realized I had misunderstood the term
> singularity when I saw Eli's definition as the point when machine
> intelligence exceeds human intelligence. It is not clear to me that
> the accelerating bootstrapping process would automatically occur,
> even if we get a super-capable machine. I suppose it is because
> I am failing to understand machine *motivation*. I understand what
> motivates humans, but I am at a loss to explain what would motivate
> a machine. spike
I may be confused, but I'm not sure that this was the original meaning of
the term singularity. I think some people have decided that AI will be the
soonest and most likely cause of the singularity, and so have taken the
term to refer to AI itself.
I think the term originally was more generic. Any exponential growth curve
soon looks almost vertical on a human timescale, and a faster-than-
exponential curve actually runs off to infinity in finite time. It becomes
impossible to predict what exists on the other side of that wall of
vertical/near-infinite rate. I believe that this generic description was
the original meaning of the term. This generic runaway growth could come
about through self-modifying AI, runaway nanotechnology, uploading of the
entire universe into a simulation, a sudden change in the simulation, the
advent of a true religion, strange matter destroying the universe, the Big
Crunch, etc.
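To make that distinction concrete, here is a minimal Python sketch (the
function names, rates, and starting values are just illustrative
assumptions, not anything from the discussion above): plain exponential
growth x' = k*x gets steep but stays finite forever, while hyperbolic
growth x' = x^2 genuinely diverges at a finite time, which is the "wall"
in the generic singularity picture.

    from math import exp

    def exponential(t, x0=1.0, k=1.0):
        # x' = k*x  ->  x(t) = x0 * e^(k*t): steep, but finite for every t
        return x0 * exp(k * t)

    def hyperbolic(t, x0=1.0):
        # x' = x^2  ->  x(t) = x0 / (1 - x0*t): diverges as t -> 1/x0
        return x0 / (1.0 - x0 * t)

    # Sample both curves approaching t = 1, where the hyperbolic
    # solution (with x0 = 1) hits its wall:
    for t in (0.0, 0.5, 0.9, 0.99, 0.999):
        print("t=%.3f  exponential=%10.2f  hyperbolic=%10.2f"
              % (t, exponential(t), hyperbolic(t)))

Running it shows the exponential curve still under 3 at t = 0.999 while
the hyperbolic curve has already passed 1000 on its way to infinity.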
Does anybody else have memory of an earlier, more generic meaning for the
term singularity? Am I having false memories, or is my mind-wipe fading?
:-)
-- Harvey Newstrom <http://HarveyNewstrom.com> <http://Newstaff.com>