Re: What is the singularity?

From: Party of Citizens (citizens@vcn.bc.ca)
Date: Tue Jul 31 2001 - 12:33:54 MDT


And I gather from your web site that you think this could well occur
within a decade. How would you describe this projection:

(a) pulled out of the air
(b) a joke
(c) a fantasy
(d) a serious opinion
(e) all of the above
(f) none of the above

POC

On Mon, 30 Jul 2001, Eliezer S. Yudkowsky wrote:

> "J. R. Molloy" wrote:
> >
> > Eliezer's definition is when AI exceeds the intelligence of the smartest humans.
>
> No, hardware-improved humans also count, as long as it's genuine hardware
> improvement and not just using Google or forming corporations or whatever
> the latest Fad Pseudo-Singularity of the Month happens to be.
>
> > I ask because a list member has expressed fear that a system
> > which identifies incorrect thinking might do so with extropians. Wouldn't that
> > actually be a friendly thing to do? I mean, if extropians think incorrectly, a
> > friendly AI would be doing all sentient beings a big favor by removing that
> > incorrect thinking, right?
>
> By *identifying* the incorrect thinking, to those people who would want an
> AI to help them identify incorrect thinking.
>
> -- -- -- -- --
> Eliezer S. Yudkowsky http://singinst.org/
> Research Fellow, Singularity Institute for Artificial Intelligence
>
