From: Max M (maxmcorp@inet.uni-c.dk)
Date: Sun Nov 17 1996 - 06:41:11 MST
> Have you read _The Time Ships_ by Baxter by any chance? He discusses this
> scenario in the book, in the final part of the book.
No, can't say that I have. But I have received a lot of reading suggestions
that I will check out. Hopefully they have them at amazon.com, as American
books are horribly expensive here in Denmark. 3 times the US price at
least.
>I often argue for what could be seen as the robot scenario
>with humans as hangers-on: by developing AI based on human-computer
>symbiosis, we ensure that we will remain important parts of any spreading
>AI civilization.
This means that very early on we must take the question of robo-control
into consideration.
> Why should I care whether my descendants are meat or metal?
> The best choice would be to go there myself;
Agreed.
> I have often considered the possibility that our creations will outsmart
> humanity... they will simply become too smart for humans at our current
> level of intelligence.
> The only real solution I see is self-augmentation/transformation and
> intense cyborganization.
Maybe something as simple as making robot societies self-policing:
robots looking after robots, a sort of police function.
> We also need to train our minds and develop a very strong mental
> discipline within ourselves.
That will probably be close to impossible for the majority of people living
today.
> It's survival of the fittest. It always has been, and it always will be.
By saying that you imply that humans are merely a slightly more highly
evolved animal, and I don't agree with that. Humans have intelligence. Maybe
some future race will have a higher intelligence, but that doesn't mean that
they will by default win some kind of survival battle. We are different from
animals in a lot of ways. For one thing, we can make conscious decisions and
pursue an explicit goal.
> We cannot remain the type of creatures we are now. We must transform
> ourselves to survive. We also cannot wait until machine takeover is
> imminent. We must start *now*, using whatever we have available to
> transform ourselves, disciplining our minds, training ourselves to be
> more intelligent, creative, resourceful and adaptable.
> Strange paradox: we must change profoundly to survive. That's life.
Yes.
> Wait a minute -- destroyed? Just because an entity is more powerful
> and intelligent than humans doesn't mean it's going to set out to destroy
> us.
It could also happen out of sheer negligence on their part, like a lot of
species are destroyed by us now.
> Apparently you expect some kind of Paget scenario, in which our
> robot-creations declare war on us.
It's certainly a possibility, but it can be made a lot less likely if we
factor it into our designs from the beginning.
MAX M
New Media Director
Private: maxmcorp@inet.uni-c.dk
http://inet.uni-c.dk/~maxmcorp
Work: maxm@novavision.dk
http://www.novavision.dk/
This is my way cool signature message!!