Samael wrote:
If you are counting on AI self-termination to stop the Singularity, you'll
have to explain why it affects every single AI.
>
> But why would it _want_ to do anything?
>
> What's to stop it reaching the conclusion 'Life is pointless. There is no
> meaning anywhere' and just turning itself off?
>
Nothing stops any particular AI from deciding to do this. However, that
doesn't stop the Singularity unless it happens to every AI.
The Singularity only takes one AI that decides to extend itself rather than
terminate.