Re: Singularity Card Game Alpha Test

From: Anders Sandberg (asa@nada.kth.se)
Date: Tue Mar 05 2002 - 06:55:28 MST


On Sun, Mar 03, 2002 at 09:28:40PM -0800, Adrian Tymes wrote:
> Someone asked for the rules to this. Like I said, there were no formal
> rules, but here's my best shot at making some up.
>
> http://www.wingedcat.org/singulcard/
>
> Feel free to forward this URL to anyone else that may be interested.
> If I get enough suggestions - *especially* ideas on how to fill up the
> card list - I think I know someone who could get this tested and
> published (Cheap-Ass Games, or a similar publisher). But that won't
> happen with just four cards on the list; I'd need at least forty (or
> somewhere in that neighborhood) before I'd approach them with this.

I like the basic simplicity of the game, a bit like Nim. The
question is whether to keep it simple and clean, or to make it more
complex but also more closely tied to real ideas about technological
futures.

My main problem with it is the technological and economic
assumptions. Is the singularity really a goal in itself, and why
would a disaster automatically happen if the points are unbalanced?
It seems to assume technology is developed because somebody just
decides to, and that things can be perfectly banned. I would like to
make the game a bit more Illuminati-like, even if this may lose
some of the elegance.

What about this: players have "Money" (money/power/knowhow) that can
be invested. You get a certain amount of money for playing certain
cards (representing that you invented something, gained a temporary
monopoly or became the big expert in the field), and pay money for
certain cards (like bans and unbans). This means that it might be
tempting to play a powerful card even if it increases the risks. In
fact, you may have to do that in order to earn enough money to
afford a ban.

Bans cost money, proportional to the importance of the technology or
how hard it is to ban - banning basement technology is inherently
more expensive than banning nuclear weapons. Some cards make banning
inherently harder (like the Internet), some easier (like Global Law
Enforcement). They need not be just bans, but could also be
regulations and other "softer" measures.

The goal should be to make the situation such that everybody can
profit from developing dangerous technology, but that cooperation is
necessary to uphold bans. Some technologies that are very useful
(like nanoimmune systems) also require dangerous technologies.

The game ends in a disaster if too many risk points accumulate.
For example, Nano-manipulators have just +1 Nano Risk (they are not
very risky in themselves), while Desktop Fabrication has perhaps +3
Nano Risk (it is an application) and Free Replicators +10 Nano Risk.
Risk can be reduced by developing safeguard technologies such as
Nanoimmune Systems (-10 Nano Risk) and policies such as Secure
Nanolabs (-2 Nano Risk). So the race is to gain enough points while
not allowing the risk to grow too large.

I would like to add a fourth kind of risk: Social risk. While
technologies may cause other risks, bans make society more
dangerous. If the Social risk goes too high, we end up with a social
disaster. The risks can easily be represented by markers moved along
a line.
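
The bookkeeping for the four tracks could be as simple as this rough
sketch (the disaster threshold of 20 is an arbitrary placeholder, not
a proposed value):

    # Sketch of the risk tracks: markers moved along a line, with the
    # game ending in disaster if any track reaches its threshold.
    DISASTER_THRESHOLD = 20

    risks = {"Bio": 0, "Nano": 0, "Robo": 0, "Social": 0}

    def adjust_risk(kind, delta):
        risks[kind] = max(0, risks[kind] + delta)
        if risks[kind] >= DISASTER_THRESHOLD:
            raise RuntimeError(kind + " disaster - everybody loses")

    # The Nano example from above:
    adjust_risk("Nano", 1)    # Nano-manipulators
    adjust_risk("Nano", 3)    # Desktop Fabrication
    adjust_risk("Nano", 10)   # Free Replicators
    adjust_risk("Nano", -10)  # Nanoimmune Systems
    print(risks["Nano"])      # 4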

Each turn, a card is also drawn from the deck and played by the
"world" - representing events and developments not foreseen by the
players. If it is not a valid play it simply vanishes; if the
technology or event can be played, it is played.

Some possible cards based on this (I have not tried to set serious
point values here, since I'm more interested in the general ideas of
content and gameplay; there is a rough data sketch of these cards
after the list):

Tech: Internet: +2 Robo, +2 Social Risk. Ban cost: 10

Tech: Ubiquitous Law Enforcement: +10 Social Risk, +5 Money,
requires Distributed Processing and Nanosurveillance. Ban cost: 5

Tech: Distributed Processing: +1 Robo, +2 Money, requires Internet.
Ban cost: 5

Tech: Friendly AI: +4 Robo, -5 Robo Risk, +5 Money. Requires
Cognitive Engineering. Ban cost: 5

Tech: Free Replicators: +5 Nano, +5 Money, +10 Nano Risk. Ban cost: 7

Tech: Nanoimmune Systems: +3 Nano, +5 Money, -10 Nano Risk, requires
Nanomedicine and Distributed Processing. Ban cost: 7

Tech: Space Habitat: Decreases all risks by 5, -10 Money. Ban cost: 2

Unban: Treaty Defection: Unbans a technology, and gives the player
the money on the card.

Unban: Freedom of Speech Lawsuit: Unbans a technology, -10 Money

Ban: Oversight Committee: Halves the risk of the controlled
technology. -5 Money

Event: Secure Nanolabs Protocol: -2 Nano Risk, -5 Money

Event: Global Law Enforcement: Halves the cost of bans, +5 Social Risk

Event: Unexpected Synergy: Increase all points and risks by 3.

Event: Replicator Bug: If Free Replicators are played, increase
their Nano Risk by 2.

Event: High Tech Terrorist: If played by the world and the Bio or
Nano Risk is within 5 points of disaster, the game ends in disaster
(otherwise the card has no effect). If played by a player under the
same conditions, that player can destroy the other players and win.
Under any other conditions it may be played to provide a free ban of
any technology.
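
And the promised data sketch: one possible way of encoding a few of
the tech cards above as plain records (the field names and the split
between points and risks are my own choices, nothing more):

    # One possible encoding of some of the tech cards above.
    TECH_CARDS = [
        {"name": "Internet", "points": {"Robo": 2}, "risk": {"Social": 2},
         "money": 0, "requires": [], "ban_cost": 10},
        {"name": "Distributed Processing", "points": {"Robo": 1}, "risk": {},
         "money": 2, "requires": ["Internet"], "ban_cost": 5},
        {"name": "Free Replicators", "points": {"Nano": 5}, "risk": {"Nano": 10},
         "money": 5, "requires": [], "ban_cost": 7},
        {"name": "Nanoimmune Systems", "points": {"Nano": 3}, "risk": {"Nano": -10},
         "money": 5, "requires": ["Nanomedicine", "Distributed Processing"],
         "ban_cost": 7},
    ]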

-- 
-----------------------------------------------------------------------
Anders Sandberg                                      Towards Ascension!
asa@nada.kth.se                            http://www.nada.kth.se/~asa/
GCS/M/S/O d++ -p+ c++++ !l u+ e++ m++ s+/+ n--- h+/* f+ g+ w++ t+ r+ !y

