2009-06-18

Overthinking

There are a few things about Game Theory that tend to fall down when its models, which assume rational actors, are tested with actual people. In most cases, the players will act in "sub-optimal" ways. In Prisoner's Dilemma trials, people tend to keep quiet as opposed to ratting. In Centipede Game trials, players continue playing for several turns instead of taking the pot immediately, as the theory predicts.

While there are certainly many reasons for this sort of behaviour, and many theses have been written on the subject of limited rationality, non-perfect information, and so on, I think a lot of it boils down to one thing: When making actual decisions, humans tend not to overthink. And really, that's what Game Theory is a lot of the time: Taking a situation where there are consequences and rewards for acting in certain ways and finding the optimal way to play. But humans just don't do that kind of overthinking when actually sitting down to play one of these games.
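To make that kind of overthinking concrete, here is a rough Python sketch of the backward-induction calculation for a four-move Centipede Game. The payoff numbers and variable names are my own illustrative assumptions, not from any actual trial; the point is just that the "optimal" calculation says to grab the pot at the very first move, which is exactly what real players tend not to do.

# TAKE[i] = (payoff to P1, payoff to P2) if the mover at node i takes the pot.
# P1 moves at nodes 0 and 2, P2 at nodes 1 and 3; the numbers are illustrative.
TAKE = [(4, 1), (2, 8), (16, 4), (8, 32)]
END = (64, 16)   # (P1, P2) payoffs if both players pass at every node

def solve(node=0):
    """Return (P1 payoff, P2 payoff, plan) under pure backward induction."""
    if node == len(TAKE):
        return END[0], END[1], {}
    mover = node % 2                       # 0 -> P1 moves, 1 -> P2 moves
    p1_pass, p2_pass, plan = solve(node + 1)
    pass_payoffs = (p1_pass, p2_pass)
    # The mover takes whenever taking now gives them at least as much as
    # whatever optimal play from the next node onward would give them.
    if TAKE[node][mover] >= pass_payoffs[mover]:
        return TAKE[node][0], TAKE[node][1], {**plan, node: "take"}
    return pass_payoffs[0], pass_payoffs[1], {**plan, node: "pass"}

p1, p2, plan = solve()
print(plan)    # {3: 'take', 2: 'take', 1: 'take', 0: 'take'}
print(p1, p2)  # 4 1 -- the "rational" game ends on the very first move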

People who do such overthinking (they might rather think of it as being efficient) are always flabbergasted by amateurs and underthinkers who pull off an amazing move despite their inefficiency. Watch a professional poker player play an amateur and you'll see what I mean.

One of my favourite examples of where this kind of overthinking works against someone also comes from one of my favourite webcomics: The Order of the Stick. Specifically, this comic, where Redcloak the Goblin is interrogating O'Chul the Paladin.

Redcloak is making a terrible leap in his logic: That everyone thinks the same way he does, that everyone makes their decisions based on optimality. He even comments a couple comics later that "Apparently, [the paladins] reserved their efficiency for killing goblin women and children."

Redcloak is unable to comprehend that decisions are made for a whole host of reasons beyond simply greedy optimality; in this case, the decision was made as part of a final promise between friends.

So the moral of the story is: Don't overthink things. You should still think through your actions, prepare for contingencies, etc., but don't overdo it. Only Game Theorists and computers have time for that sort of math.

2008-12-07

The God Game

Yes, yes, it's been a while. Travel got in the way.

I had a conversation on a discussion board I frequent about the nature of God. What does God want? How do you reconcile free will? All that jazz.

I then had a thought that since we were discussing possible motivations and actions of God, some Game Theory might be in order. So if one is willing to look beyond the fallacy of applying human definitions to a deity, here is some analysis to help explain why God does or does not do something.

There are three basic assumptions about any given Game Theoretic player:
1. They know the rules of the game. While this is a problematic assumption for real world experiments, I think it's pretty safe for God.
2. They are self-interested. That is, they are most interested in improving their personal utility. This is tricky and we will get to it later.
3. They are rational. They can compute and compare the different options available to them. Depending how much potency you ascribe to God or Gods, this can be a pretty safe assumption.

But because God is not mortal, we must discuss what other assumptions we can ascribe to God.
Omniscience is common. In this context, it would mean supreme rationality. God has no need to compute and compare options in order to find the optimal solution; God just knows. Thus, God knows what the optimal solution is in each possible situation and has no reason not to do it.
Omnipotence is also common. But what would this mean as far as Game Theory is concerned? Basically, it means God can cheat: God can change the number, nature, and value of the players and their options to whatever God likes. If this is so, why would God not do this?

But we must come back to the issue of self-interest. What is God interested in? Often people will ascribe human emotions and motivations to the Gods of their mythos (Greek and Norse Gods are great examples), and as a result, their goals are not unlike human goals on a grander scale. Thus it is not much of a surprise that Gods like that are often jerks. Greek Gods in particular were notoriously spiteful.

But what really is the goal of God?

Since God is often portrayed as a creator, it would also make sense to think of God in a Mechanism Design situation, as a central authority manipulating a game so that the players act in a specific way. This is actually a very apt position for God.

Omniscience and Omnipotence still raise the age-old questions, though. Why not interfere? Why such poor design? What is the goal? And thus the question of motivation comes back.

While we can't necessarily assume what God's interest is, we can be sure that whatever God is doing, God is doing it optimally.

2007-07-13

Throw the Fight.

A staple of sports movies and conspiracies, paying off someone to give up a competition has its own interesting game theoretic psychology.

For simplicity, I will consider a 2-person competition, from the eyes of one competitor, i.e., the one getting bribed.

Game: Player P is in a 2-person competition (boxing, chess, whatever) where he has a probability of winning w, the prize for winning is worth c, and the prize for losing is 0 (for simplicity).

P is approached by some people who have vested interests in P losing. They offer P a reward of b for losing on purpose.

P now has a choice to make, and so we will look at a couple cases depending on P's attitudes to determine whether or not P should take this offer.

CASE 1: P cares only for his own utility.
In this case, it's a fairly simple matter of comparing possibilities.
If he denies the bribe, his expected utility is w*c. If he takes the bribe, his utility is b.
If b>=w*c, then it is clear P should take the bribe, because while c by itself may be greater than b, there is still the chance that P will lose anyway, getting a utility of zero.
If b<(w*c), then on pure expected value he should decline, and it becomes a matter of how much of a gambler P is. If w is greater than 1/2, then it is wise to decline the bribe, since he has a good chance of actually winning c. If w is less than 1/2, then it may be wiser to take the bribe, since the sure prize of b is better than a slim chance of winning c.
But, as I said, it depends on how much of a gambler P is.
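For what it's worth, Case 1 boils down to a one-line comparison. Here is a quick sketch; the function name and the parameter values in the example calls are made up purely for illustration.

def case1(w, c, b):
    """Compare the sure bribe b against the expected prize w*c."""
    expected_prize = w * c
    if b >= expected_prize:
        return "take the bribe"
    return "decline, if P is willing to gamble on w"

print(case1(w=0.6, c=10_000, b=5_000))  # w*c = 6000 > 5000, so decline (and gamble)
print(case1(w=0.3, c=10_000, b=5_000))  # w*c = 3000 <= 5000, so take the bribe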

CASE 2: P discounts utility for pride.
I.e., P has a distaste for the dishonesty of the bribe. She would prefer to win or lose by her own power.
There is a discount factor d, where d is P's "pride". It is added to her payoff when she refuses and subtracted when she accepts the bribe.
So her expected utility for not taking the bribe is w*(c+d) + (1-w)*d = (w*c)+d.
Her utility for taking the bribe is b-d.
If b>=(w*c)+2d, then b-d>=(w*c)+d and we are in a similar situation to the one above. This represents the "everyone has their price" situation. P would be hard-pressed to refuse such an offer, since her "pride" has been factored into the bribe.
If (w*c)+d <= b < (w*c)+2d, then b-d<(w*c)+d, but it's still not a clear-cut decision, because (w*c)+d is still only an expected utility. "That's pride fuckin' wit' you," as Marcellus Wallace from Pulp Fiction would say. There's no clear decision to make here, unfortunately, and such decisions are tests of one's character.
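Here is a matching sketch of Case 2, again with made-up numbers. The last branch covers the remaining case, b < (w*c)+d, which the analysis above leaves implicit: there the bribe, after the pride hit, doesn't even match the expected prize, so refusing is the obvious move.

def case2(w, c, b, d):
    """Same comparison as Case 1, with a pride discount d."""
    refuse = w * (c + d) + (1 - w) * d   # simplifies to w*c + d
    accept = b - d
    if accept >= refuse:                 # i.e. b >= w*c + 2d
        return "everyone has their price: take the bribe"
    if accept >= w * c:                  # i.e. w*c + d <= b < w*c + 2d
        return "grey zone: that's pride messing with you"
    return "refuse: the bribe, after pride, doesn't match the expected prize"

print(case2(w=0.5, c=10_000, b=9_000, d=1_000))  # accept 8000 >= refuse 6000: take
print(case2(w=0.5, c=10_000, b=6_500, d=1_000))  # accept 5500 < 6000 but >= 5000: grey zone
print(case2(w=0.5, c=10_000, b=5_500, d=1_000))  # accept 4500 < w*c 5000: refuse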

All this is not even taking into account possible legal repercussions of the bribe, but I think this is a decent enough model without them. Pride in a competition is a tricky thing, and even if you are unscrupulous and selfish, some prices are still insufficient.

Until next time, cheers.