Re: Singularity Card Game Alpha Test

From: Anders Sandberg (asa@nada.kth.se)
Date: Thu Mar 07 2002 - 07:45:06 MST


On Tue, Mar 05, 2002 at 06:33:53PM -0800, Adrian Tymes wrote:
> Anders Sandberg wrote:
>
> We can only pump so many memes into a single simple game. Perhaps
> another one can speak to the realities of tech deployment.

A good idea. Maybe I could try writing my own version, if you don't
mind? My version would likely deal less with the race towards the
singularity and more with the politics and economics of technology.

> >Having a competition element is good for motivating people; the trick is
> >to make the game a complex mixture of cooperation and competition.
>
> Hmm. Perhaps if we pump up the random element? That is, when it is
> the world's turn to play, play as many cards as there are players. You
> can pass on your turn if you want, shuffling your hand into the deck and
> drawing a new hand...but the world will continue apace. You can either
> enjoy the ride, or try to steer it towards your goal. For any given
> technology or event, you may be given the opportunity to put it into
> play...but if you don't, someone else eventually will.

This is a good idea. Even if you are a die-hard luddite, you had better
participate rather than just sit back. Depending on the size of the
deck, certain cards will likely not reappear very often, so getting rid
of them is semi-permanent.

Hmm, maybe one could even play this game luddite-wise: you try to ban
all technologies so that singularity *and* disaster become impossible.
For a luddite win, all technologies enabling new developments have to
be banned, and the bans have to be upheld for a certain time. Sounds
like a real challenge.
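
To make the mechanic concrete, here is a rough sketch in Python of how
the world turn and a luddite win condition might work. The hand size,
the number of turns a ban must hold and all the names are placeholders
of my own invention, not Adrian's actual rules:

import random

def world_turn(deck, num_players, in_play):
    # The world plays as many cards as there are players, so the
    # game advances even if every player passes.
    for _ in range(num_players):
        if deck:
            in_play.append(deck.pop())

def pass_turn(hand, deck, hand_size=5):
    # Passing: shuffle your hand back into the deck, draw a new one.
    # (hand_size=5 is an invented number.)
    deck.extend(hand)
    random.shuffle(deck)
    return [deck.pop() for _ in range(min(hand_size, len(deck)))]

def luddite_win(enabling_techs, turns_banned, required_turns=3):
    # A luddite win: every enabling technology is banned, and each
    # ban has been upheld for required_turns turns in a row.
    return all(turns_banned.get(t, 0) >= required_turns
               for t in enabling_techs)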
 
> The problem here would seem to be that it gives too much of a chance
> of world-caused Singularity or Catastrophe, which means a good chance
> that no one can win.

> >Remember the goal. If you claim this, then many players will realize
> >that the sure way of winning the game is not to play the game. What are
> >the benefits of the technologies? To the player there are none in the
> >current form, just an increase of abstract numbers. Even a "Clinical
> >Immortality" card doesn't cut it as a motivator, since it would maybe be
> >a motivator for the humans in the game world, but does not per se mean
> >much for the player. The singularity is just the end, and has no other
> >meaning. Instead of promoting progress the message seems to be that
> >increasing these technologies increases risk, and the singularity is
> >just someone winning. That doesn't seem very extropian to me.
>
> Actually...re-read the conditions. When you have X points on average, Y
> points will put you over the edge. When you have X*2 points, you need
> Y*2 points. And any single card has the same value. Thus, deploying
> new tech increases the tolerance, on average.

But this seems to assume that "the solution to tech problems is more
tech", and that the true goal should be some kind of balanced form of
technology. Personally, I don't see why you couldn't have a singularity
based only on biotech (maybe something like _Blood Music_ or the
Edenists of Peter F. Hamilton), AI or nanotech. This might be more of a
game-design solution, of course - this kind of gliding threshold is
rather neat.
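
As I read it, the margin scales in proportion to the average already in
play; something like this rough Python sketch (base_average and
base_margin are invented reference numbers, and this is my reading
rather than Adrian's stated rule):

def points_needed_to_tip(average_points, base_average=10,
                         base_margin=5):
    # With average X points in play, Y more points tip the world
    # over; with 2X in play you need 2Y. So deploying new tech
    # raises the average and with it everyone's margin.
    return base_margin * (average_points / base_average)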

> >Perhaps a better way of handling it would have a "tolerance" level for
> >the different points. It represents how much the world can handle, what
> >institutions can deal with issues and how much people have adapted to
> >it. The points must remain below the tolerance for the world to work;
> >tolerance is increased using either safety technologies or perhaps
> >direct payment of money (representing the building of institutions). To
> >reach singularity the world needs to adapt. This seems to be closer to
> >the memetic goal of the game and transhumanism.
>
> So why not just do pure tolerance (at least, as pure as you can get)
> *and* suppress all new technologies? This seems to be just the same
> abstract numbers you were objecting to.

Sure, but given the assumptions about costs I had made, every player
would eventually run out of money. Then they would not be able to
prevent new tech from emerging.
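
In the same sketch form, my tolerance idea would look something like
this (the starting money, tolerances and costs are all placeholder
numbers):

from dataclasses import dataclass, field

@dataclass
class Player:
    money: int = 10  # invented starting amount

@dataclass
class World:
    points: dict = field(
        default_factory=lambda: {"Bio": 0, "Nano": 0, "Robot": 0})
    tolerance: dict = field(
        default_factory=lambda: {"Bio": 5, "Nano": 5, "Robot": 5})

    def raise_tolerance(self, kind, player, cost=3):
        # Tolerance rises through safety tech or direct payment,
        # representing institution-building. cost=3 is invented.
        if player.money >= cost:
            player.money -= cost
            self.tolerance[kind] += 1

    def disaster(self):
        # The world only works while every point total stays at or
        # below its tolerance.
        return any(self.points[k] > self.tolerance[k]
                   for k in self.points)

Once a player's money is gone, raise_tolerance simply stops working for
them, which is the mechanism I had in mind above.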
 
> >>>I would like to add a fourth kind of risk: Social risk. While
> >>>technologies may cause other risks, bans make society more
> >>>dangerous. If the Social risk goes too high, we end up with a social
> >>>disaster. The risks can easily be represented by markers moved along
> >>>a line.
> >>>
> >>Problem: what is a disaster for some may be heaven for others. The
> >>world will not end if, say, America becomes a police state under martial
> >>law, even if it would suck. The world would recover from such a state
> >>within (current) human lifespans.
> >
> >The biotech gray goo scenario of Greg Bear's _Blood Music_ seems rather
> >nice from my point of view - does this mean that we should regard the
> >Bio/Nano disaster in the game as similarly relative?
>
> No, because the biotech gray goo you refer to is not the one I'm
> referring to. Perhaps I should specify "mindless gray goo".

Sure. But how is this different from the social risk described by Orwell
as "If you want a picture of the future, imagine a boot stamping on a
human face--for ever"? With ubiquitous law enforcement, a paranoid
culture and AI-based enforcement you could get it even if all AI is
obedient, all biotech under control and the nanotech under lock and key.
It might even be self-reinforcing and impossible to get rid of. It is
the social version of gray goo: a permanently entrenched society that
does not promote human growth.

What worries me about your system of bans is that it suggests that
banning technologies is a good thing and that it carries no cost. If
antibiotics or the Internet were banned, in the real world this would
cause hundreds of thousands of deaths and billions in economic losses.
In the game it would merely remove a few Bio or Robot points.
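
A simple fix would be to give bans an explicit price, something along
these lines (all three magnitudes are invented placeholders):

def ban_technology(points, player_money, kind,
                   tech_removed=2, social_added=1, money_cost=1):
    # A ban removes a few tech points, but is not free: it raises
    # the Social risk and costs money, standing in for the deaths
    # and economic losses a real ban would cause.
    points[kind] = max(0, points[kind] - tech_removed)
    points["Social"] = points.get("Social", 0) + social_added
    return player_money - money_cost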

> >I think it is
> >important not to isolate the game from social reality. If it is intended
> >to convey a transhumanist point of view, it had better demonstrate that
> >technology alone isn't enough; we had better make sure our culture isn't
> >turned into something nasty as we advance.
>
> Again, define "something nasty". For any given permutation, some of the
> audience will be predisposed to think it's actually a *good* thing...so
> better to just avoid that topic entirely, no?

It is your game, and you may do with it as you like, but I think leaving
out issues like this would make it less interesting and actually less
likely to spread the positive memes you want to spread. We already have
enough games and scenarios where technological development is pursued
for its own sake, and far too few that dare to look at how society
interacts with technology.

This is one of the most obvious weaknesses of transhumanism today, and
many of our critics latch on to it: we do not integrate our
technological visions with social visions. That suggests either that we
do not care in the least about the rest of humanity, that we naively
think technology is the only thing that matters, or that we have
unsavory social visions we do not reveal. All three views are wrong, but
we have to show them untrue ourselves.
 

-- 
-----------------------------------------------------------------------
Anders Sandberg                                      Towards Ascension!
asa@nada.kth.se                            http://www.nada.kth.se/~asa/
GCS/M/S/O d++ -p+ c++++ !l u+ e++ m++ s+/+ n--- h+/* f+ g+ w++ t+ r+ !y

