In a message dated 7/2/98 1:01:27 AM Eastern Daylight Time, johnkc@well.com writes:
<< Good point, I should have said, I predict we will never find a deeper reality. >>
Okay. And then I could say that I predict we will. We've both made predictions now. So how does the idea that the best we can hope for is the making of predictions prevent us from discovering "deep reality" (not that I'm really sure I know what that means)?
<< >Our brains are different from computers not merely in terms of the quantity of connections but in organization and quality.
I think the difference is that present computers are much simpler than brains, but computers are getting more complex every day and brains are not. >>
You're assuming mental content to be a function only of complexity. I think that there are a lot of reasons not to assume that.
<< You keep making statements about the subjective mental states of cells and computers and lightning and other people, but that's a dead end, I know nothing about such things and never will, all that I can observe is that I am conscious, everything else is conjecture. Intelligent behavior on the other hand I can study and observe in other things.>>
This is interesting. You don't think that you know that other human beings have mental content? You don't know what they mean when they say "I am feeling pain" or such? And you are not sure if lightning is or is not conscious?
<< It doesn't follow, aspects of the world could be information, in fact they must be, and there is no reason there can't be information about that information, or information about that.>>
Information IS an aspect of the world according to the argument I sketched out, and there can be information about information within it. So if those are your reasons for justifying the charge that the conclusion of the argument doesn't follow, then the charge fails.
<< >Proving the story [brain in a vat] right or wrong doesn't necessarily have anything to do with proving anything more fundamental than information.
If you and everyone and everything you know are nothing but software programs then ... >>
Then what? A scenario in which we don't know how to reason is simply self-refuting. I don't see how the argument I sketched relies upon particular physical phenomena. The fact that I think the world is flat, and that underneath us is a gigantic Range Rover whose movements cause the wind and earthquakes and such, would not prevent me, even in that situation, from possibly concluding that there must be something about which there can be information before there can be information.
On a separate note, I really think that the brain-in-the-vat scenario assumes a lot less about consciousness than the software analogy, which I don't think necessarily works.
<< >>Me:
>>But nothing can provide anything but information.>>
>What about food, or light, or oxygen?
It's not unusual for programs to require certain specific bits of information to keep from crashing. >>
You said that nothing can provide anything but information. I disagreed, and somewhat obviously pointed out that things can provide other things besides information. I have no idea what that reply means.
<< I have a hunch that individual cells are not conscious and I have a hunch that you are, although the only thing in the universe that I know with absolute certainty to have that attribute is me. It's irrelevant however because language doesn't need a frill like consciousness to work. >>
Unless you're going to make every single cause-and-effect sequence an instance of language, which is strikingly absurd unless the word is being used as some stretched poetic metaphor, there must be consciousness for there to be a language.
<< I wouldn't know, no two lightning bolts on planet earth have ever been identical, and no two target trees for them to hit either. >>
Assume that it does. We can always substitute another hypothetical similar enough to satisfy the argument (a bat and ball, pins and a ball, etc.).
<< A language with an infinite number of letters is as meaningless as a language that has only one. Also, the genetic code has a grammar, for example, 3 letters per word; lightning has no grammar. >>
Oh, I disagree. If your conception of language is correct, each and every one of those letters means something precise to the tree that gets hit. If we can say that the "grammar" of the genetic code is the fact that it works in 3 chemical units, then we can just as easily say that the "grammar" of lightning is its angle, or even the absence of other bolts hitting the same tree (bolt X communicates and no other bolt communicates, therefore the tree behaves thus...).
<< >CAU doesn't mean anything to the ribosomes.
Ok if you say so, but tell me one thing, in what way would the ribosome act differently if CAU did mean something to it? >>
It wouldn't, necessarily.
<< >If you hand me a message, AND I understand it, then it has meaning to me. If I do not understand it, it has no meaning to me despite the fact that it causes certain reactions in my brain.
But there are lots of messages that will produce nearly identical states in your mind, you will treat one gibberish message much like another even if they're quite different. You'll react to "dxjkhq" in much the same way you would to "kszvbe". In language on the other hand even a small word like "not" can dramatically change the way you react to a message.>>
I agree. I'm not sure how that contradicts the paragraph it responds to though. Was it meant to contradict it?
Andrew