Re: Birth of a Thinking Machine (fwd)

From: Samantha Atkins (samantha@objectent.com)
Date: Sat Jul 14 2001 - 18:24:41 MDT


It has been a while since I read about Cyc, but as I understand
it, Cyc might very well understand "ball" as a "spherical object
generally used in a variety of games and having various
properties depending on the game," and "red" as the apparent
color of the ball together with red's relationships to other
colors. These facts, definitions, and relationships are, I
think, part of what the project includes in its ontologies.

Do they not go at least this far?
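
For concreteness, here is a toy sketch (in Python) of the sort of
purely relational representation I mean. All of the predicate and
constant names are invented for illustration; none of this is
actual CycL:

    # Toy symbolic knowledge base: each symbol is characterized
    # only by its relations to other symbols (invented names,
    # not real CycL).
    kb = [
        ("ball", "isa",        "spherical-object"),
        ("ball", "used-in",    "games"),
        ("red",  "isa",        "color"),
        ("red",  "similar-to", "orange"),
        ("ball", "has-color",  "red"),
    ]

    def facts_about(symbol):
        # Everything the KB "knows" about a symbol is just more
        # triples of symbols; no perceptual referent appears
        # anywhere in the system.
        return [fact for fact in kb if symbol in fact]

    print(facts_about("red"))
    # [('red', 'isa', 'color'), ('red', 'similar-to', 'orange'),
    #  ('ball', 'has-color', 'red')]

A program like this can answer relational queries about "red" and
"ball," which is the kind of understanding I am asking about;
whether symbols related only to other symbols can count as
understanding at all is, I take it, Eliezer's objection.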

- samantha

"Eliezer S. Yudkowsky" wrote:

> Actually, what I said was that Cyc didn't understand the sentences that
> were being programmed in, because it didn't understand the words. It
> doesn't help to correctly parse the syntax of the sentence "the ball is
> red" as "apply(adjective: 'red', noun: 'ball')" if the AI doesn't have a
> visual cortex, an abstracted symbol for 'red'ness, and a prototype image
> for 'ball'. Cyc conceivably has abstract knowledge about which symbols
> tend to be associated with each other, and occasionally, how those symbols
> fit into more abstract network-like properties such as causal
> relationships. But the symbols are still pretty much devoid of content by
> human standards. Cyc has a tremendous amount of syntax but still very
> little in the way of semantics, except for those symbols that are
> fortunate enough to refer to fundamentally network-like or logic-like
> referents. So Cyc *might* understand "before", "after", "cause", or even
> "supercategory", but not "ball" or "red".
>
> -- -- -- -- --
> Eliezer S. Yudkowsky http://singinst.org/
> Research Fellow, Singularity Institute for Artificial Intelligence


