From: Phillip Huggan (cdnprodigy@yahoo.com)
Date: Fri Dec 09 2005 - 13:10:10 MST
I agree. Without a goal system an AGI would be just like a really good wikipedia or the personalized google AI-search-agent we will likely see in a year or two.
I really don't like the rapture idea of us needing an AGI to make us smarter. I like me the way I am and don't need an AGI robot to shove (insert whatever fields of knowledge you think will make people "smarter") computer chips into my brain. If the goal is to make me kinder, how will this be effected? Tabs of Ecstasy shoved down my throat? Seriously, how can an AGI forcibly make me smarter without killing off my identity? Will it broadcast images of suffering children and tell me about all the underfunded high rate-of-return research programs? Sounds too much like a mother-in-law for my liking.
David, you are smart enough to handle MNT in a way that doesn't create a new extinction risk. Don't bomb cities with it and you are set. It is the other people with MNT who create the risk. The risks are not that different from well-known military ambitions. An MNTed missile defense and MNTed missiles delivering nukes/neutron bombs/EMPs: that is the risk posed by irresponsible MNT administrators. The solution to UFAI and MNT risks is identical: don't allow hostile forms of these development programs to occur (I don't think defenses to an AGI attack are possible until we have a much greater appreciation of extreme physics). Only the time-scales and odds of success differ depending on whether the solution is carried out by FAI, an Oracle, or plain old people with MNT.
David Picon Alvarez <eleuteri@myrealbox.com> wrote:
From: "P K"
> That would never happen. For the AI to give an order it would have to have
> a goal system. Passive AI does NOT have a goal system. Let me take another
> shot at explaining passive AI.
Intelligence *requires* goals. Even subhuman theorem provers need goals.
All these arguments aside, we don't just want an AI to carry out our goals; we
partially want an AI to work out what our goals should be. Yes, think about
it: if we were smart enough, we'd be in a position to avoid most existential
risks from MNT...
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:54 MDT