[p2p-research] 10 Reasons
Paul D. Fernhout
pdfernhout at kurtz-fernhout.com
Mon Sep 28 01:24:41 CEST 2009
Ryan wrote:
> Interesting how the common good bias is coming into this discussion...
>
> Sent to you by Ryan via Google Reader: 10 Reasons via Accelerating
> Future by Michael Anissimov on 9/25/09
>
> My “10 Reasons” document from August 2004 is getting some great play on
> StumbleUpon and other venues. 5,000 visits so far this month. Check out
> my “10 Reasons to Develop Safe Artificial Intelligence”:
>
> 1. Because human cultures aren’t exotic enough.
Try elephants, whales, and dolphins. And since the West often defines
"human" as Judeo-Christian with European ancestry, try other religions and
cultures.
> 2. Because intelligence should be fluid, not rigid.
Try octopuses.
"600 lb Octopus Escapes Through Hole Size of Quarter"
http://www.youtube.com/watch?v=SCAIedFgdY0
> 3. Because we need someone to help us organize the data we’re drowning
> in.
Try Google.
> 4. Because aliens aren’t showing up, we should make our own.
Try illegal immigrants from other cultures.
> 5. Because a virtual world would be a cool place to grow up in.
Only if you are a virtual person.
"Our Babies, Ourselves: How Biology and Culture Shape the Way We Parent"
http://www.amazon.com/Our-Babies-Ourselves-Biology-Culture/dp/0385483627
> 6. Because we need new perspectives and thinkers.
Or pay attention to the old ones, like Buckminster Fuller, Ursula K. Le
Guin, Jacque Fresco, Roxanne Meadows, Mahatma Gandhi, John Holt, Grace
Llewellyn, and so on...
> 7. Because it would be interesting to engineer new emotions.
Might be nice to try using the ones we have.
http://en.wikipedia.org/wiki/The_Theory_of_Moral_Sentiments
"How selfish soever man may be supposed, there are evidently some principles
in his nature, which interest him in the fortunes of others, and render
their happiness necessary to him, though he derives nothing from it, except
the pleasure of seeing it. Of this kind is pity or compassion, the emotion
we feel for the misery of others, when we either see it, or are made to
conceive it in a very lively manner. That we often derive sorrow from the
sorrows of others, is a matter of fact too obvious to require any instances
to prove it; for this sentiment, like all the other original passions of
human nature, is by no means confined to the virtuous or the humane, though
they perhaps may feel it with the most exquisite sensibility. The greatest
ruffian, the most hardened violator of the laws of society, is not
altogether without it."
That was from Adam Smith!
> 8. Because sci-fi stereotypes need to be shattered.
Sure. :-)
> 9. Because humans are often biased away from the common good.
Sure. :-(
> 10. Because AI is coming whether we like it or not, so it might as well
> be safe.
Read the 1979 sci-fi novel "The Two Faces of Tomorrow" by James P. Hogan,
available online:
"The Two Faces Of Tomorrow"
http://www.webscription.net/10.1125/Baen/0671878484/0671878484.htm
"""
From chapter 5:
"About a week ago, Titan came within a hair's breadth of killing five
people," Lewis told him somberly. Dyer stared at him incredulously. Before
he could say anything, Lewis went on. "It appears that hesper program
structures are capable of integrating to a far greater degree than anybody
thought. They're starting to link things together in ways they were never
supposed to and the results in behavior are impossible to predict."
Hoestler explained, in response to the still bemused look on Dyer's face.
"It used the Maskelyne mass-driver to bomb an ISA survey team on the Moon.
Could have wiped them out."
"What?" Dyer turned an incredulous face toward Lewis but the Dean nodded
regretfully to confirm Hoestler's words.
"One of the hesper-controlled subsystems in the Tycho node was given the
job of shifting a piece of terrain that was forming an obstruction," he
explained. "It was supposed to use normal earth-moving equipment to do it,
but nobody bothered to tell it that. Somehow it managed to connect together
information from several subsystems that shouldn't have been connected, and
came up with what it thought was a better shortcut to solving the problem.
According to the people who analyzed the system dump afterward, it seemed
quite proud of itself."
"""
> Ironic that Jamais Cascio has accused Singularitarians like myself of
> not being interested in culture. It’s not a matter of dancing, it’s a
> matter of survival. If we do not program the first recursively
> self-improving seed AI appropriately, we will all perish. And death is
> so final.
Well, it may be more than just one AI. And, unlike in "The Two Faces of
Tomorrow," people may be a more integral part of an emerging noosphere. But
for those who believe in a coming singularity: accept that it might be a
reflection of our values, and let's get our moral house in order first
(universal health care, global abundance, a global basic income, a gift
economy, local subsistence through 3D printing of open designs, renewable
energy, mutual security, an end to compulsory schooling, reduced prison
populations and an end to the drug war, widespread understanding of
conflict resolution techniques, a good balance of meshworks and
hierarchies, a "global mindshift", and so on).
--Paul Fernhout
http://www.pdfernhout.net/