Re: Galaxy Brain Problem

From: Geoff Smith (geoffs@unixg.ubc.ca)
Date: Wed Aug 13 1997 - 15:42:47 MDT


On Wed, 13 Aug 1997, Anders Sandberg wrote:

> On Wed, 13 Aug 1997, Geoff Smith wrote:
>
> > As someone in another thread pointed out, game theory does not
> > apply to post-singularity entities.
>
> Huh? Could this someone please explain why it would not apply after
> the singularity? Many situations in game theory will not be changed
> if the players are ultra-intelligent (in fact, game theory often
> assumes the players are very rational, more rational than most humans
> are).
>

I was wondering how this idea got into my head, so I looked
back in the list a bit and found this quote from Eliezer S. Yudkowsky:

> I'm just not sure that game theory applies to posthumans. It's based on
> rather tenuous assumptions about lack of knowledge and conflicting
> goals. It works fine for our cognitive architecture, but you can sort
> of see how it breaks down. Take the Prisoner's Dilemma. The famous
> Hofstadterian resolution is to assume that the actions of the other are
> dependent on yours, regardless of the lack of communication. In a human
> Prisoner's Dilemma, this, alas, isn't true - but assuming that it is,
> pretending that it is, is the way out.

So this is the man to talk to ;)
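
For what it's worth, here is a minimal sketch of the contrast he is
drawing, using the usual textbook payoffs (the numbers are my own
illustration, not from his post):

# Standard Prisoner's Dilemma payoffs: (my score, your score).
# Mutual cooperation beats mutual defection, but defecting against
# a cooperator pays best of all.
PAYOFFS = {
    ("C", "C"): (3, 3),   # both cooperate
    ("C", "D"): (0, 5),   # I cooperate, you defect
    ("D", "C"): (5, 0),   # I defect, you cooperate
    ("D", "D"): (1, 1),   # both defect
}

def best_reply(opponent_move):
    # Classical game theory: whatever the other player does,
    # defecting scores strictly higher, so "D" dominates.
    return max(("C", "D"), key=lambda m: PAYOFFS[(m, opponent_move)][0])

def superrational_choice():
    # Hofstadter's assumption: identical reasoners must reach the
    # same choice, so only the diagonal outcomes (C,C) and (D,D)
    # are reachable -- and (C,C) pays better for both.
    return max(("C", "D"), key=lambda m: PAYOFFS[(m, m)][0])

print(best_reply("C"), best_reply("D"))   # D D: defection dominates
print(superrational_choice())             # C: cooperation wins on the diagonal

Classical dominance reasoning picks D no matter what the other player
does; the Hofstadterian move restricts attention to the diagonal, where
C wins. Whether posthumans actually satisfy the "identical reasoners"
assumption is, I take it, the open question.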

I'm not knowledgeable enough in this area to lean either way.

geoff.


