From: Eliezer Yudkowsky (sentience@pobox.com)
Date: Sat Dec 14 1996 - 17:05:23 MST
> > But, do feelings and emotions not have a logical and reasonable
> > basis? Are they not designed to direct us towards goals? Procreation?
> > Self-preservation? Attainment of wealth? I maintain that emotions ARE
> > designed logically, just designed generally on an unconscious level hence
> > we are sometimes unable to see the reasoning behind them.
Something to help clear this up:
Our minds run on a goal-based architecture. That fact isn't as
objective as quarks, but it is true. Various goals are manufactured and
maintained by evolution. Others are transmitted culturally and are
ultimately grounded in those same goals manufactured by evolution. Our
cognitive architectures attach two attributes to each goal: a
"justification" and a "value". Query: Given a cognitive architecture
that operates through
Power-perfect reasoning and does not give an a priori value to any goal,
will it ever correctly produce a goal with a positive value?
Answer: It will produce at least one Interim goal of a positive value:
Finding out what the Ultimate goal is. Presupposing that there exists a
goal of positive value, other goals may be viewed as subgoals to this
unknown goal. Finding out what the Ultimate goal is constitutes one such
subgoal. Thus the system will produce an "Interim Meaning of Life" with
positive value: "Find out what the Meaning of Life is."
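The derivation above can be put in code. This is my own toy illustration, not anything from the original post: the `Goal` class and `derive_interim_goal` function are hypothetical names, and the specific value 1.0 merely stands in for "positive".

```python
# Toy sketch of the argument: a system that assigns no a priori value to
# any goal can still derive one positive-value Interim goal -- finding out
# what the Ultimate goal is.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass(eq=False)
class Goal:
    name: str
    value: Optional[float] = None              # None = value unknown, not zero
    justification: List["Goal"] = field(default_factory=list)

def derive_interim_goal(goals: List[Goal]) -> Optional[Goal]:
    """If any goal's value is unknown but could be positive, then learning
    that value is itself a subgoal with positive value, because it serves
    every candidate Ultimate goal."""
    unknown = [g for g in goals if g.value is None]
    if unknown:
        return Goal("find out what the Ultimate goal is",
                    value=1.0,                 # positive, by the argument above
                    justification=unknown)
    return None

# The Ultimate goal is presupposed to exist, but its value is unknown:
interim = derive_interim_goal([Goal("Ultimate goal")])
```

Note that the interim goal's justification slot points at the unknown-value goal, exactly as a subgoal points at the goal it serves.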
A non-Interim Meaning of Life can be defined as a positive-value goal,
produced by a clean system, which does not have any hypotheticals in the
recursive "justification" network. Because the cognitive architecture
starts with a clean slate, the system as a whole is self-justifying;
that is, the network produced by following the "justification" slots of
the Meaning makes no outside references and yet produces a goal with
positive value.
Tada! A complete definition of the Meaning of Life which
relies on nothing more than a grasp of Doug Lenat's twenty-year-old
Automated Mathematician's architecture.
-- 
sentience@pobox.com Eliezer S. Yudkowsky
http://tezcat.com/~eliezer/singularity.html
http://tezcat.com/~eliezer/algernon.html
Disclaimer: Unless otherwise specified, I'm not telling you everything I think I know.
This archive was generated by hypermail 2.1.5 : Fri Nov 01 2002 - 14:35:53 MST