What Game Theory reveals about the science of cooperation
By Daniel Hunt

Picture this. You’ve moved into halls for the first time. In the beginning, everyone in the flat is on their best behaviour and the place is sparkling clean. But as the weeks go by, one of your housemates begins to show their true, slobbish nature. The dishes pile up in the sink and mould festers in teacups.
What do the rest of you do? Quite often, you stop making an effort. No one buys milk to share anymore. You leave the hob grimy and greasy. The bins overflow to the floor. If your flatmates aren’t cleaning for you, why should you clean for them?
As frustrating as it can be, if this dynamic plays out in your flat there may be some comfort in knowing that you are simply obeying the rules of mathematics.
There’s an area of research called game theory which, as the name suggests, investigates how to make winning decisions – ones where the outcome depends not just on your own choices but on everyone else’s too. The theory doesn’t only apply to games – it can be used to explain situations such as military posturing or even the setting of prices in the economy.

Robert Axelrod’s game
Imagine you’re invited to a massive tournament by a billionaire. Think Squid Game, minus putting your life at risk. In each game, you face one opponent; the rules are simple. You can choose to cooperate or to cheat. If you both cooperate, you each get $300. If your opponent cheats and you don’t, they alone receive $500. If you both cheat, you only get $100 each.
It’s important that this isn’t a zero-sum game. Unlike Squid Game, you don’t die when you lose a round, but you do stand to lose money. The tournament involves playing a repeated game against each of many different opponents, with each game lasting around 200 rounds. You can’t just make it up as you go along; you need a strategy. How do you walk away with the most money possible?
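To make the rules concrete, here’s a minimal sketch of the payoff structure in Python. The dollar amounts come from the rules above; the article doesn’t state what a player earns when they cooperate while their opponent cheats, so $0 is assumed here.

```python
# Payoff matrix for a single round, as (your payoff, their payoff).
# Assumption: cooperating against a cheat pays $0 (not stated in the rules).
PAYOFFS = {
    ("cooperate", "cooperate"): (300, 300),
    ("cooperate", "cheat"):     (0, 500),
    ("cheat", "cooperate"):     (500, 0),
    ("cheat", "cheat"):         (100, 100),
}

def play_round(my_move, their_move):
    """Return the payoffs for one round of the game."""
    return PAYOFFS[(my_move, their_move)]
```

Notice the tension built into the numbers: cheating against a cooperator pays best in a single round ($500), but mutual cooperation ($300 each) beats mutual cheating ($100 each) – which is exactly why repeated play changes the calculation.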
This is the competition that political scientist Robert Axelrod presented in 1980. He invited researchers to submit their strategies as computer programs grounded in game theory. While the details of each program varied, they all fell into one of two camps: the ‘nice’ strategies never cheated first, whereas the ‘nasty’ programs attempted to gain an advantage by cheating first.
While some researchers were convinced of the benefits of cooperation and mutual benefit, the more cynical among them believed that being ‘nice’ would open them up to being exploited.
However, the results of Axelrod’s competition were clear. Of the fifteen entries, the eight nice strategies beat all seven nasty ones. When Axelrod repeated the tournament with more players, the nice strategies once again far outperformed the nasty ones.
What’s most illuminating about Axelrod’s research is that a program called ‘tit-for-tat’ won both competitions. While this strategy never cheats first, if its opponent cheats, it retaliates once and then returns to cooperating.
The strength of this strategy lies in its simplicity. Tit-for-tat players aim to cooperate so their opponent knows they may ‘do business’ together. They are forgiving, meaning they can return to cooperating even after a dispute. They aren’t pushovers though. By retaliating, they ensure their opponent knows they aren’t to be taken advantage of.
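The strategy is simple enough to sketch in a few lines of Python. This is an illustrative simulation, not Axelrod’s original program: the payoff values follow the rules described earlier (with $0 assumed for a player who cooperates while being cheated), and `always_cheat` stands in for the simplest possible ‘nasty’ strategy.

```python
# Payoffs from the rules above: (your payoff, their payoff).
# Assumption: cooperating against a cheat pays $0.
PAYOFFS = {
    ("cooperate", "cooperate"): (300, 300),
    ("cooperate", "cheat"):     (0, 500),
    ("cheat", "cooperate"):     (500, 0),
    ("cheat", "cheat"):         (100, 100),
}

def tit_for_tat(opponent_history):
    """Cooperate on the first round; afterwards, copy the opponent's last move."""
    return opponent_history[-1] if opponent_history else "cooperate"

def always_cheat(opponent_history):
    """The simplest 'nasty' strategy: cheat every round."""
    return "cheat"

def match(strategy_a, strategy_b, rounds=200):
    """Play a repeated game and return each player's total winnings."""
    hist_a, hist_b = [], []
    total_a = total_b = 0
    for _ in range(rounds):
        move_a = strategy_a(hist_b)  # each player sees only the other's past moves
        move_b = strategy_b(hist_a)
        pay_a, pay_b = PAYOFFS[(move_a, move_b)]
        total_a += pay_a
        total_b += pay_b
        hist_a.append(move_a)
        hist_b.append(move_b)
    return total_a, total_b
```

Run head-to-head over 200 rounds, `always_cheat` edges out tit-for-tat by a single round’s sucker payoff ($20,400 to $19,900), but two tit-for-tat players cooperating throughout earn $60,000 each – which is why nice strategies dominate once they meet each other often enough.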
Tit-for-tat strategies in the real world
This is a style often seen in global politics. The idea of mutually assured destruction suggests that stockpiling nuclear weapons is actually a path to peace because no two nuclear-armed countries will go to war if it leads to their annihilation. Implicitly or explicitly, it’s a belief endorsed by the leaders of countries like Britain, Russia and China. You could argue this qualifies as a tit-for-tat game strategy (if you still consider yourself cooperative while you’re holding a gun to your opponent’s head).
Much like a nuclear war, the dirty flat scenario is one of mutually assured destruction. A mouldy plate is a declaration of war that can only lead to the annihilation of your supply of utensils, as they pile up by the dozens in the sink.
But unlike a nuclear war, claiming there’s no coming back might be a little dramatic. If you set up a cleaning rota or, better yet, don’t break the clean streak in the first place, you can maintain a cycle of cooperation.
Nuclear weapons also serve as a more convincing example of cooperation. Since the 1980s, a process of nuclear disarmament has reduced the number of nuclear weapons to around one-fifth of their Cold War peak. It’s a process which has relied on cooperation, particularly between Russia and the United States. If one side were to cheat, it would bring disarmament to a halt. Unfortunately, disarmament appears to have halted anyway due to rising global tensions.
The limitations of game theory
Like any scientific model, game theory isn’t the be-all and end-all. It assumes players are as rational as robots – not motivated by spite, selflessness or greed. It doesn’t apply to zero-sum situations, in which cooperation is impossible. Furthermore, it only works when the game is repeated – as in Squid Game, you can’t have a wider strategy if a single loss spells the end of the game.
Most importantly, the best strategy depends on the environment you’re in. Axelrod’s competition found nice strategies were better on average when they played with both nice and nasty players alike. But if a nice player is placed in a room full of nasty players, they, at best, become nasty too and, at worst, are mercilessly exploited.
So next time you find yourself dealing with a nasty situation, think of game theory. Foster cooperation if you can, but don’t be a doormat.
Oh, mathematicians. Did we need computer programs for that?