From: J. R. Molloy (jr@shasta.com)
Date: Fri Sep 08 2000 - 03:26:46 MDT
Eugene Leitl writes:
> What if it takes five minutes for the seed AI to achieve god status,
> taking over the global networks?
I suspect that any AI that could achieve god status could also make many copies
of itself (massively replicate itself) to take over the world -- the works! Who
knows if it would make AIs happy to exterminate humanity. There have been a few
brief moments in my life when it would have made *me* happy to exterminate
humanity. So, I can feel a little bit empathetic about robots who take over the
global networks and the people who own them. (Serves them right for mistreating
Galileo and Socrates.)
> No one wants a really stupid dog, that's the problem.
So, if everyone wanted a really stupid dog, that would solve the problem? Just
kidding. Actually, I think it's great that no one wants a really stupid dog. If
people could just learn to feel that way about politicians, we might be able to
solve some actual social problems. No, I'm not kidding. Have you noticed that the
general public distrusts very smart people? That could become problematic.
<<SNIP>>
> But does this also mean a stable and prosperous human community? I
> think it means almost instantaneous extinction for the whole of the
> biology, us included, as a side effect of them going about their
> business.
Extinction for the whole of the biology sounds a bit harsh. But it would be
worth it if it guarantees the emergence of something so wonderful that it can
instantaneously wipe out all life on Earth, us included, merely as a side effect
of going about its business. Yes, the most tragic mistake would be to obstruct
the birth of millions of AIs which would transform themselves into SIs.
> Sorry, I'd rather become that higher life form than die by being
> stupid enough to make that life form before its own due time.
Perfect. When you become that higher life form, here's your name alongside the
names of other higher life forms:
Moses, Buddha, Mohammed, Lao Tzu, Christ, Krishna, Socrates, Leitl, et al.
> > > Socialize a god.
> >
> > I have a better idea: Let a million gods socialize us.
>
> Then you're willing to commit your suicide, and homicide on countless
> people around the world.
So, there you are with countless people around the world.
Here I am with a million gods.
Do the words "resistance is futile" mean anything to you?
--J. R.
"Government big enough to supply everything
you need is big enough to take everything you
have ... The course of history shows that as a
government grows, liberty decreases."
- Thomas Jefferson