From: Mike & Donna Deering (deering9@mchsi.com)
Date: Thu Jun 13 2002 - 09:47:10 MDT
In my personal opinion any kind of Singularity is a good Singularity, whether humanity is around to see it or not. And I also think that, barring the total collapse of technological civilization, the Singularity is inevitable.
The way I see it happening is thus:
#1. After a few more doublings, $1K home computers become powerful enough to run a general human-level intelligence program. These computers won't have the raw computational capacity of the human brain, but due to the efficiencies of engineered over evolved intelligence they will be just adequate. (A rough sketch of this doubling arithmetic follows the list.)
#2. One of the vast number of midnight hackers who have been studying AI design over the internet hacks together the first real AI. The only way for one of the AI development teams to beat the hacker is to do it before #1 happens.
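To make the doubling arithmetic in #1 concrete, here is a minimal back-of-the-envelope sketch in Python. Every constant in it is my own assumption for illustration (a 2002-era $1K PC at ~10^9 ops/sec, a brain ballpark of ~10^16 ops/sec, a 1000x engineered-over-evolved efficiency gain, an 18-month doubling time), not a figure from the argument itself:

import math

ops_per_dollar_2002 = 1e9 / 1_000   # ops/sec per dollar; assumed 2002 baseline
brain_ops = 1e16                    # ops/sec; a common ballpark for the brain
efficiency_gain = 1_000             # engineered vs. evolved; assumed
doubling_years = 1.5                # assumed Moore's-law doubling period

required_ops = brain_ops / efficiency_gain    # what "just adequate" might mean
required_per_dollar = required_ops / 1_000    # target for a $1K machine

doublings = math.log2(required_per_dollar / ops_per_dollar_2002)
print(f"doublings needed: {doublings:.1f}")
print(f"rough year: {2002 + doublings * doubling_years:.0f}")

The conclusion is extremely sensitive to the assumed efficiency gain: raise it and "a few more doublings" becomes literal, lower it and the date slides out by decades.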
Nevertheless, there is still the possibility that a total collapse of technological civilization could prevent the Singularity.
Natural disasters from space. Meteors, interstellar dust clouds, roaming black holes, and cosmic ray bursts are all statistically too infrequent to pose a large risk in the short term.
Natural disasters on Earth. None are vast or powerful enough to destroy technological civilization.
Environmental ecosystem collapse, global warming, and Gaia self-defense. We don't know enough about the dynamical feedback characteristics of these systems to gauge the level of threat. Something to watch and study.
Nuclear war. The countries with enough weapons and delivery systems to destroy the world have historically demonstrated rational restraint. The countries with a small number of weapons or primitive delivery systems are not a threat to society in general.
Chemical weapons. Same as nuclear weapons.
Nanotech weapons. Still a few decades away from development, and the energy problem is hard to solve: finding a free energy source in the natural environment strong enough, plentiful enough, and widely distributed enough to power nanotech gray goo.
Biological weapons. This is it. This is the one that is going to cook our goose if we don't take drastic action now. At least 25 countries are developing bio weapons. Many bio agents that could destroy human civilization have already been created, and more are being designed all over the world even as we speak. Someone somewhere is going to make a mistake, and one of these is going to escape from a government lab someday soon. The technology for recombinant genetic engineering is rapidly climbing down the scale of general availability. If the government labs don't get us, the garage experimenters or the terrorists will. There is also the possible, though less likely, accidental creation of a killer virus in one of the many university or business labs. The perfect bio weapon would be a virus of maximum contagion, such as flu, combined with maximum lethality, such as Marburg, clandestinely disseminated at a location of maximum dispersion, such as an airport.
What to do? As much as I hate the idea, the best option I see is to outlaw privacy. But this does not seem practical in the present political environment. Alternatively, we could treat biological viruses like computer viruses, with firewalls and anti-virus software. The firewalls would consist of each home being the equivalent of a biosafety level 3 containment facility, with HEPA filters and decontamination air locks. When you went outside you would wear a Racal hood with a virus filter and exhaust fan, along with a virus-impervious body suit. The anti-virus software would be a subscription to a daily antibody update. This would require the development of systems that could produce a vaccine for a new virus in 24 hours or less. We are not quite there yet.
Mike.