From: Tony Belding (tlbelding@htcomp.net)
Date: Sun Apr 19 1998 - 16:49:39 MDT
Anders Sandberg <asa@nada.kth.se> wrote:
AS> Why do we have pets? I think it is because we want companionship, but
AS> not necessarily with the demands of other humans and quite often
AS> because we enjoy companionship with a *different* kind of being.
Very often the psychology is that of a parent-child relationship. This is
true not only for the human owners, but also possibly for the pets. For
example, if you watch the behavior of cats, much of it is based on their
perception of humans as mother cats. That's not surprising. If you're a cat,
mother is the one who feeds, cleans, and protects you. So, your human owner
fits that definition pretty closely.
AS> So I wouldn't be surprised if we in the future keep semi- or fully
AS> intelligent pets. They might have rights on their own; we already give
AS> animals rights,
Speak for yourself! I don't acknowledge any rights of animals, and AFAIK the
law here in Texas doesn't either. Neither does most of society, I believe,
except for the few animal-rights extremists.
Animals, like most things, have the value that people assign to them. My pet
cat is beloved to me; I would be hurt deeply if anything happened to him. But
a scraggly feral cat prowling around my back yard at night has no value to me;
I would shoot one without even blinking. I don't think there's anything
abnormal about that. It's human nature.
This reminds me of the abortion debate. When does a fetus become a person?
My answer is that it happens whenever people -- mostly the parents, but also
possibly others -- have invested enough physical and emotional resources in it
to /care/ about it as a person.
AS> and one could imagine a canine union acting as the lobby for
AS> intelligent dogs (demands for fair treatment,
That sounds like a particularly amusing vision of Hell.
I'm opposed to the whole idea of animal rights for exactly this reason -- it
sets a bad precedent for the future. If you recognize animal rights, what's
next? It all comes down to economics. See if you can follow my reasoning
here...
You could recognize rights of robots if they meet a certain level of AI.
At some point you would have to call it "sentient" and recognize it as a
person instead of a mere machine.
THEN it might be considered improper to own them. That would be slavery.
They would have to be set free and be paid for their work. (Why do you call
them FREE if you have to PAY them? No, don't answer that!) Under these
circumstances, the robots change from being *servants* of humanity to being
competitors. Instead of finally lifting the burden of WORK from our
civilization, the robots would simply take our jobs and leave us with nothing.
There are four basic economic resources: land, labor, capital, and
entrepreneurial ability. I see the coming era as a change from a
labor-oriented economy to a capital-oriented economy. This change has already
started; it happens whenever workers are replaced by machines. The ultimate
goal should be for *all* workers to be replaced by machines, and all the
former workers become capitalists -- owners of the machines. But if we grant
that sufficiently advanced machines might also be /people/, then it throws a
monkeywrench into the whole plan.
It may seem I've rambled a long way from the subject of pets, but this is all
interconnected. The same technology that will blur the distinction between
labor and capital will also blur the distinction between machines and living
creatures. It's hard to say that life is sacred unless you can define life.
Likewise, it's hard to say sentience is sacred unless you can define
sentience. I haven't yet heard a convincing definition of either, and this
may lead to many conflicts.
AS> These pets do not need to be just animals or improved
AS> animals, they might be AIs or biotechnological constructs.
There, you see? Exactly what I was saying...
AS> Overall, I think uplifting of other species is a good idea, but the
AS> ethics is quite complicated.
I don't see anything to gain by it. In a way, animals are too much like us.
Although not as intelligent, they share the same basic drives we do,
programmed by eons of survival and reproduction. That means ultimately they
will probably WANT the same things we do. Biology is destiny. Even if you
shed the biology, its legacy remains. So, I am more interested in creating
entirely artificial creatures that can be programmed from the ground up.
Without an evolutionary legacy, they can be given /drives/ that are completely
unlike natural life forms: innate compulsions to do things that are useful to
US.
-- Tony Belding http://hamilton.htcomp.net/tbelding/
This archive was generated by hypermail 2.1.5 : Fri Nov 01 2002 - 14:48:56 MST