Lyle Burkhead wrote:
> That's the kind of point I'm making in
> Geniebusters. It's a general point about the structure of the
> situation.
>
> But some people just don't get it.
> . . .
> Some students could learn arithmetic, but when you
> gave them an abstract proposition in geometry, they just didn't get
> it. They would come to that point and stop, like an ass stops
> before a bridge.
Interesting. Suppose that you yourself had been introduced to geometry by a math teacher, a teacher who thought you were smart enough to appreciate what a triangle might be, by analogy with something a bit more familiar. Say the teacher made reference to a triangular corner cut from a piece of construction paper. Now, after thinking about this a little, let's say that your reaction was to write an essay called "Construction Paper Busters", in which you insist that whatever the future of triangles may hold, it would be a mistake to believe that triangles are construction paper! Who isn't "getting it" in a situation like this?
In the actual situation of nanotech discussions, which I think we are supposed to be talking about, the comparison of AI's to genies appeared in Drexler's _Engines of Creation_, in a paragraph near the end of the chapter titled "Thinking Machines". This short paragraph opens by stating that "eventually", "some AI systems" could be expected to have great technical and social abilities. The paragraph then says that "given charge of enough energy, materials, and assemblers" such a system could aptly be called a "genie machine", with the quote marks provided by Drexler himself, in this case. In other words, it's a comparison, or analogy, to relate this advanced prospect to something more familiar, the "genie" of legend. Finally, the paragraph ends by warning about the "danger" of this particular kind of "engine of creation". Nothing here suggests that Drexler has confused AI's with genies. More importantly, nothing suggests that Drexler thinks that technology will solve every problem for us, which is pretty much the confusion underlying Lyle Burkhead's "Geniebusters" idea.
Unfortunately, I've seen some signs, myself, that some people respond far too literally to Arthur Clarke's dictum that "any sufficiently advanced technology is indistinguishable from magic". For instance, I've read some things by Vernor Vinge that seemed interesting, even scary, at first, concerning some sort of unstoppable takeover by AI's, only to do a double take on why he should assume that a particular AI would get all the world's resources, unopposed. The point is, if some people think AI's may have absolutely magical power, that's not Drexler's problem! You've really missed something if you think that this is what he means in speaking about programmable assemblers, nanofactories, or what have you. Anyway, while we're discussing this, let's try for a major advance in geometry -- "Construction Paper Busters", anyone?
David Blenkinsop <blenl@sk.sympatico.ca>