Rob wrote:
>> If this AI is instead a purely rational master problem solver, then
>> humanity will surely disagree with much of its philosophical output too.
Clint wrote:
> There is no such thing as objective morality, and don't try to tell me you
> were talking about philosophy when it's obvious you're talking about
> philosophical morality. What "should" be done is always subjective because
> it begins with what one "feels" it should be.
Rob responded:
I was talking about philosophy. I have absolutely no interest in
entertaining the possibility of "objective morality". This is an American
preoccupation, born out of an insanely inflated society-wide
self-righteousness. I am not American, and there is no objective morality -
it's a ridiculous proposition.
Rob wrote:
>> thinking about constructing a solid definition for "intelligence", then
>> think about how you might program a computer to possess this quality, and
>> what use it would be.
Clint wrote:
> This is a very BAD BAD way to go. Instead work on making it self-aware.
> Consciousness has NOTHING to do with intelligence. Many people consider
> me several times more intelligent than most of my peers; does that make me
> more conscious than them?
Rob responded:
I was proposing thinking about the implications and benefits of making a
computer "intelligent" to try to wash out some of the AI drivel that keeps
being bounced around the place. I did not mention consciousness - it has
nothing to do with my point.