From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Thu Jul 14 2005 - 18:54:51 MDT
Russell Wallace wrote:
> On 7/14/05, Eliezer S. Yudkowsky <sentience@pobox.com> wrote:
>
>>Okay. Let me know if you have anything new to contribute to the conversation
>>that started here:
>>
>>http://www.sl4.org/archive/0401/#7483
>
> Hmm... actually I think within the assumptions that the participants
> in that conversation appeared to be making, you're right.
> Specifically, I think it's about the speed and timing of takeoff...
>
> The combination of these three things implies that the far-future
> population will have evolved from a single, compact core, and the
> combination of 1 and 2 means that the core will have started with a
> simple, low-entropy goal system. Given that, yes, Darwinian evolution
> may well not apply.
>
> My vision of the default scenario differs on all three counts. I think
> there is no single threshold beyond which the rules abruptly change;
> if we ever do have "each hour now longer than all the time that went
> before" in Vinge's memorable phrase, that will require highly
> developed ultratechnology, which means it will be late, which means
> ultratechnology will be widely available. I also think intelligence is
> not quite as dominant over numbers as you think it is, so a single
> entity that got ahead could still be pulled down.
>
> Thus it will resemble previous transitions (biological, cultural,
> technological) in that the population going into the transition will
> be large, with very high total entropy; there will be no single goal
> system, and no single core to preserve it; thus, Darwinian evolution
> will dominate the overall dynamics because it will be the _only_ force
> with global scope; thus, the population will converge on a nonsentient
> optimal self-replicator from many directions.
Your argument fails to carry even given its assumptions; it addresses only a
small portion of the necessary conditions for natural selection. In
particular, you did not address my points about extreme-fidelity replication,
insufficient frequency of death to ensure multiple generations, and
insignificant covariance of goal-system content with replication speed. Any
one of these conditions is sufficient to rule out natural selection as a
significant shaper of the post-Singularity order, even if we grant slow
takeoff, late development, and force of numbers as an even counterweight to
greater intelligence.
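To make the dependence concrete: the covariance condition is just the
Cov(fitness, trait) term of the Price equation, and the necessity of all
three conditions is easy to see in a minimal toy replicator model. The
sketch below is illustrative only; every name and parameter in it is made up
for the example, and the only point is the qualitative contrast between the
runs.

import random
import statistics

# Toy replicator model: every agent carries one heritable "goal" value in
# [0, 1].  Each generation some agents die, survivors keep their slots, and
# freed slots are filled by offspring whose production probability may or
# may not covary with the goal value.
def simulate(mutation_rate, death_rate, selection_strength,
             generations=500, pop_size=500, seed=1):
    rng = random.Random(seed)
    pop = [0.5] * pop_size                # homogeneous founding population
    for _ in range(generations):
        offspring = []
        for goal in pop:
            # With selection_strength > 0 the goal value covaries with the
            # probability of producing an offspring this generation.
            p_repl = 0.5 + selection_strength * (goal - 0.5)
            if rng.random() < p_repl:
                child = goal
                if rng.random() < mutation_rate:          # imperfect copying
                    child = min(1.0, max(0.0, child + rng.gauss(0.0, 0.1)))
                offspring.append(child)
        # Death frees slots; survivors are never displaced by offspring.
        survivors = [g for g in pop if rng.random() > death_rate]
        rng.shuffle(offspring)
        pop = survivors + offspring[:pop_size - len(survivors)]
    return statistics.mean(pop)

# All three conditions present: the mean goal value climbs well above 0.5.
print(simulate(mutation_rate=0.1, death_rate=0.2, selection_strength=1.0))
# Knock out any one condition and the mean stays at or near its founding value:
print(simulate(mutation_rate=0.0, death_rate=0.2, selection_strength=1.0))  # perfect-fidelity copying
print(simulate(mutation_rate=0.1, death_rate=0.0, selection_strength=1.0))  # nobody ever dies
print(simulate(mutation_rate=0.1, death_rate=0.2, selection_strength=0.0))  # zero covariance with replication

The particular numbers do not matter; what matters is that removing any
single ingredient leaves the population mean essentially where it started,
which is the sense in which each condition is individually necessary.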
-- 
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence