From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Sun Nov 28 1999 - 14:38:49 MST
"Eliezer S. Yudkowsky" wrote:
>
> "D.den Otter" wrote:
> > Add to that the fact that
> > humans, if allowed to continue freely with their development
> > after the SI has ascended, would most likely create and/or
> > become superintelligences (i.e. competition for resources and
> > a potential threat),
>
> This is the fundamental disagreement. I simply don't believe in
> competition between SIs. I think the motives converge, and multiple SIs
> would merge, or all but one would commit suicide, or whatever.
Let me rephrase that: granting den Otter's assertion that SIs would be in
competition for resources, a proposition about which I am agnostic (what
resources? for doing what, exactly?), or rather his assertion that multiple
SIs could not use resources efficiently, it follows that they would merge
or dismultiply or whatever.
--
sentience@pobox.com       Eliezer S. Yudkowsky
http://pobox.com/~sentience/tmol-faq/meaningoflife.html
Running on BeOS           Typing in Dvorak          Programming with Patterns
Voting for Libertarians   Heading for Singularity   There Is A Better Way
This archive was generated by hypermail 2.1.5 : Fri Nov 01 2002 - 15:05:52 MST