Eliezer S. Yudkowsky wrote:
> Since I'm one of the people who may wind up actually deciding the seed AI's
> default motivations (in the event there isn't a forced solution), I've
> given some thought to the issue. I have two possible solutions:
> 2) Overlay informed consent. We may find it very difficult to conceive of
> simultaneously "knowing" and "not knowing" something, but I can imagine a
> cognitive architecture which would "protect" the core Buffy processes while
> maintaining the awareness and processing of the external self. Any given
> sequence of cognitive events, including emotional bindings dependent on the
> belief that Sunnydale is real, would proceed as if the knowledge that the
> world is a simulation did not exist, and memories of that experience would
> be formed; however, a smooth blend between that untouched core and the
> external awareness would be maintained. Thus you could remain "you" while
> being someone else.
This is the mental model I've always used when thinking about sub-
selves that may need to operate in an environment as though they were
autonomous (such as when doing anthropological research on
humans...)
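
A toy sketch of the kind of partitioned belief store this suggests (all names
here are hypothetical and purely illustrative, not anything proposed in the
thread): a "core" sub-self is denied access to selected meta-knowledge while
an "outer" overseer retains full access, and memories formed by the core are
still shared between the two views.

    # Hypothetical illustration only: a belief store that masks selected
    # meta-knowledge from a protected "core" sub-self while the external,
    # fully informed self sees everything. Shared memories give a crude
    # version of the "smooth blend" described in the quoted passage.

    class PartitionedSelf:
        def __init__(self, beliefs, masked_keys):
            self._beliefs = dict(beliefs)    # everything the whole system knows
            self._masked = set(masked_keys)  # knowledge hidden from the core
            self.memories = []               # experiences shared by both views

        def core_knows(self, key):
            """What the protected core sub-self can see."""
            if key in self._masked:
                return None                  # core proceeds as if this were unknown
            return self._beliefs.get(key)

        def outer_knows(self, key):
            """What the external, fully informed self can see."""
            return self._beliefs.get(key)

        def core_experience(self, event):
            """The core acts and remembers without the masked knowledge."""
            self.memories.append(event)


    if __name__ == "__main__":
        buffy = PartitionedSelf(
            beliefs={"sunnydale_is_real": True, "world_is_simulation": True},
            masked_keys={"world_is_simulation"},
        )
        print(buffy.core_knows("world_is_simulation"))   # None: hidden from the core
        print(buffy.outer_knows("world_is_simulation"))  # True: visible to the outer self
        buffy.core_experience("patrolled the graveyard")
        print(buffy.memories)                            # shared by both views

This obviously says nothing about how a real cognitive architecture would
implement such masking; it only makes the knowing/not-knowing partition
concrete enough to reason about.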
--
Stirling Westrup  | Use of the Internet by this poster
sti@cam.org       | is not to be construed as a tacit
                  | endorsement of Western Technological
                  | Civilization or its appurtenances.