What is the probability that a disaster unrelated to nuclear weapons will set progress toward human-level AI back by decades or centuries? Consider, for example, runaway climate change, a bioengineered plague, self-replicating nanomachines, economic collapse, a planetary totalitarian government that restricts technology development, or something as yet unknown.