From: Matt Mahoney (matmahoney@yahoo.com)
Date: Mon Nov 24 2008 - 08:53:33 MST
--- On Mon, 11/24/08, Petter Wingren-Rasmussen <petterwr@gmail.com> wrote:
> My conclusion: We can't know what traits, if any, an AI with superhuman
> intelligence will inherently have. Several, maybe all, of the traits we want
> AIs to have can be gained through evolution. Can we afford the risk of
> developing AIs of this level without applying evolutionary pressure?
It depends on what you mean by "intelligence" and "risk". In a battle, we often attribute greater intelligence to the victor. Can bacteria outsmart us? It seems so when they evolve resistance to every drug we try against them. DNA encodes information, and protein performs computation. Viewed this way, the kilogram of bacteria in your digestive tract has more storage and computing power than your brain.
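The storage half of that claim survives a back-of-envelope check. Here is a minimal sketch, assuming roughly 1e14 bacterial cells in the gut, about 4 million base pairs per bacterial genome at 2 bits per base, and about 1e15 synapses in the brain at roughly one bit each; all of these figures are order-of-magnitude assumptions, not numbers from measurement:

    # Back-of-envelope comparison. All figures below are rough
    # order-of-magnitude assumptions.
    bacteria = 1e14                     # cells in the human gut
    genome_bits = 4e6 * 2               # ~4 Mbp per genome, 2 bits per base
    gut_bits = bacteria * genome_bits   # ~1e21 bits of DNA storage

    synapses = 1e15                     # rough synapse count
    brain_bits = synapses * 1           # ~1 bit per synapse, ~1e15 bits

    print("gut DNA storage: %.0e bits" % gut_bits)
    print("brain storage:   %.0e bits" % brain_bits)
    print("ratio:           %.0e" % (gut_bits / brain_bits))

Under those assumptions the gut's DNA holds on the order of a million times more bits than the brain's synapses.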
If AIs reproduce, modify themselves, and compete for computing resources (materials and energy), then they will evolve. If AIs are smarter than us, then they will apply selective pressure to us, not the other way around. We won't be at the top of the food chain any more.
Is this a risk? What is your opinion of the extinction of Homo erectus, or viewed another way, its evolution into Homo sapiens?
-- Matt Mahoney, matmahoney@yahoo.com