From: Adrian Tymes (wingcat@pacbell.net)
Date: Tue Mar 05 2002 - 19:33:53 MST
Anders Sandberg wrote:
> OK. But if you do not create a game that is playable and fun, it won't
> matter what memes you try to spread through it.
>
> I would love to have a very clean game that manages to convey core
> transhuman ideas.
On both of these, we are in agreement.
>>>The goal should be to make the situation such that everybody can
>>>profit from developing dangerous technology, but that cooperation is
>>>necessary to uphold bans. Some technologies that are very useful
>>>(like nanoimmune systems) also require dangerous technologies.
>>
>>Prerequisites are already explicitly allowed for. Adding in the profit
>>motive taints the meme with "tech is developed so the rich get richer",
>>not "tech is developed to make the world a better place". While they
>>are both true, it is better to promote only the latter meme, not the
>>former.
> I think it is actually a bad thing. First, it spreads an erroneous image
> of how technology is developed (technology is built because somebody
> somewhere decided mankind should have it, with no links to economics,
> culture or happenstance, so technology can simply be regulated by
> controlling that guy), and this image is already doing a lot of damage
> to the transhuman project (for example political attempts at
> relinquishment). The altruist model of tech development is even worse,
> since it so easily lends itself to centralism and utilitarian
> projects. Both are dangerous oversimplifications, missing the true
> complex mixture of motivators behind development - people develop
> technology out of altruism, profit, curiosity, and random ideas, as
> shaped by the surrounding society.
We can only pump so many memes into a single simple game. Perhaps
another one can speak to the realities of tech deployment.
> Having a competition element is good for motivating people; the trick is
> to make the game a complex mixture of cooperation and competition.
Hmm. Perhaps if we pump up the random element? That is, when it is
the world's turn to play, play as many cards as there are players. You
can pass on your turn if you want, shuffling your hand into the deck and
drawing a new hand...but the world will continue apace. You can either
enjoy the ride, or try to steer it towards your goal. For any given
technology or event, you may be given the opportunity to put it into
play...but if you don't, someone else eventually will.
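In rough Python, for concreteness (the object interfaces - passes(),
choose_card(), board.play() - and the hand size are placeholders of
mine, not proposed rules):

  import random

  def world_turn(deck, num_players, board):
      # The world plays as many cards as there are players.
      for _ in range(num_players):
          if deck:
              board.play(deck.pop())

  def player_turn(player, deck, board):
      if player.passes():
          # Passing: shuffle your hand back into the deck and draw
          # a fresh one; the world rolls on regardless.
          deck.extend(player.hand)
          random.shuffle(deck)
          player.hand = [deck.pop() for _ in range(player.hand_size)]
      else:
          board.play(player.choose_card())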
The problem here would seem to be that it gives too much of a chance
of world-caused Singularity or Catastrophe, which means a good chance
that no one can win.
>>>The game ends in a disaster if too many risk points are accumulated.
>>>For example, Nano-manipulators have just +1 Nano risk (they are in
>>>themselves not very risky) while Desktop Fabrication has perhaps +3
>>>risk (it is an application) and Free Replicators +10 risk. Risk can
>>>be reduced by developing safeguard technologies such as Nanoimmune
>>>Systems (-10 Nano Risk) and policies such as Secure Nanolabs (-2
>>>Nano Risk). So the race is to gain enough points while not allowing
>>>the risk to grow too large.
>>
>>The point is that the knowledge itself, and the world's possession of
>>it, *is* the risk. And the promise. The two are the same.
>
> Remember the goal. If you claim this, then many players will realize
> that the sure way of winning the game is not to play the game. What are
> the benefits of the technologies? To the player there are none in the
> current form, just an increase of abstract numbers. Even a "Clinical
> Immortality" card doesn't cut it as a motivator, since it would maybe be
> a motivator for the humans in the game world, but does not per se mean
> much for the player. The singularity is just the end, and has no other
> meaning. Instead of promoting progress the message seems to be that
> increasing these technologies increases risk, and the singularity is
> just someone winning. That doesn't seem very extropian to me.
Actually...re-read the conditions. With X points in play on average, Y
more points will put you over the edge; with X*2 points, you need Y*2.
Any single card has a fixed value, so each card matters relatively less
as the total grows. Thus, deploying new tech increases the tolerance,
on average.
And as for "not to play", i.e. pass each turn - that's taking a gamble
that the world will not self-destruct from its own plays. Plus, check
the scoring: Singularity - everyone wins, with someone perhaps winning a
bit more; Catastrophe - everyone loses equally.
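To spell out the arithmetic (the 1:2 ratio below is invented purely
for illustration):

  def margin_to_disaster(points_in_play, ratio=0.5):
      # Hypothetical rule: disaster hits only after another
      # points_in_play * ratio points accumulate.
      return points_in_play * ratio

  # With 10 points out, a fixed +5 card eats the whole 5-point margin;
  # with 20 points out, the same card eats only half the 10-point one.
  assert margin_to_disaster(10) == 5.0
  assert margin_to_disaster(20) == 10.0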
> Perhaps a better way of handling it would be to have a "tolerance" level for
> the different points. It represents how much the world can handle, what
> institutions can deal with issues and how much people have adapted to
> it. The points must remain below the tolerance for the world to work;
> tolerance is increased using either safety technologies or perhaps
> direct payment of money (representing the building of institutions). To
> reach singularity the world needs to adapt. This seems to be closer to
> the memetic goal of the game and transhumanism.
So why not just do pure tolerance (at least, as pure as you can get)
*and* suppress all new technologies? This seems to be just the same
abstract numbers you were objecting to.
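Spelled out, the proposal does reduce to the same kind of bookkeeping.
Using Anders' own example values (everything else - function names, the
exchange rate - is an invented placeholder):

  # Risk deltas per card; positive = riskier, negative = safeguard.
  NANO_RISK = {
      "Nano-manipulators": +1,
      "Desktop Fabrication": +3,
      "Free Replicators": +10,
      "Nanoimmune Systems": -10,
      "Secure Nanolabs": -2,
  }

  def world_ok(risk, tolerance):
      # The world holds together only while risk stays below what
      # institutions and people have adapted to handle.
      return risk < tolerance

  def buy_tolerance(tolerance, money, rate=1):
      # Direct payment stands in for institution-building.
      return tolerance + money * rate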
>>>I would like to add a fourth kind of risk: Social risk. While
>>>technologies may cause other risks, bans make society more
>>>dangerous. If the Social risk goes too high, we end up with a social
>>>disaster. The risks can easily be represented by markers moved along
>>>a line.
>>>
>>Problem: what is a disaster for some may be heaven for others. The
>>world will not end if, say, America becomes a police state under martial
>>law, even if it would suck. The world would recover from such a state
>>within (current) human lifespans.
>
> The biotech gray goo scenario of Greg Bear's _Blood Music_ seems rather
> nice from my point of view - does this mean that we should regard the
> Bio/Nano disaster in the game as similarly relative?
No, because the biotech gray goo you refer to is not the one I'm
referring to. Perhaps I should specify "mindless gray goo".
> I think it is
> important not to isolate the game from social reality. If it is intended
> to convey a transhumanist point of view, it had better demonstrate that
> technology alone isn't enough: we had better make sure our culture isn't
> turned into something nasty as we advance.
Again, define "something nasty". For any given permutation, some of the
audience will be predisposed to think it's actually a *good* thing...so
better to just avoid that topic entirely, no?