We show that computing a Nash equilibrium is an intractable problem. Since by Nash’s theorem a Nash equilibrium always exists, the problem belongs to the family of total search problems in NP, and previous work establishes that it is unlikely that such problems are NP-complete. We show instead that the problem is as hard as solving any Brouwer fixed point computation problem, in a precise complexity theoretic sense. The corresponding complexity class is called PPAD, for Polynomial Parity Argument in Directed graphs, and our precise result is that computing a Nash equilibrium is a PPAD-complete problem.

In view of this hardness result, we are motivated to study the complexity of computing approximate Nash equilibria, with arbitrarily close approximation. In this regard, we consider a very natural and important class of games, called anonymous games. These are games in which every player is oblivious to the identities of the other players; examples arise in auction settings, congestion games, and social interactions. We give a polynomial time approximation scheme for anonymous games with a bounded number of strategies.

That is from the abstract of Constantinos Daskalakis's thesis [pdf], underlining emphasis mine. I had a tweet about this earlier this week, but I wanted to say a bit more about it. Since we can view the market as an algorithm to solve a computational problem (I like this blog post and there is also this paper), the market is probably not a Nash equilibrium. What is also interesting is that the Nash equilibrium problem is as hard as solving the Arrow-Debreu equilibrium problem (an application of the Brouwer fixed point theorem) ... although that shouldn't be surprising since the proof of existence of Nash equilibria can also proceed via Brouwer's theorem.

Daskalakis proceeds to look at algorithms to find approximate Nash equilibria, but this result makes me think that economic equilibria may not even be *approximate* Nash or Arrow-Debreu equilibria. The process of tâtonnement (an algorithm for finding approximate equilibria) would proceed until you reached either a satiation point (one all parties can live with) or simply the most likely point (the average of all possible system configurations) ... i.e. the maximum entropy point (see also this).

Also, one of the ways to "break" intractability is with randomness -- that is the idea behind *Monte Carlo* algorithms -- see this old *Scientific American* article [pdf]. Remember when *Scientific American* was good?

Economists will probably just ignore this like they ignore the SMD theorem [Kirman, pdf], though.
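For anyone who hasn't seen tâtonnement before, here is a minimal sketch of the idea: nudge the price in proportion to excess demand until the market (approximately) clears. The demand and supply curves, step size, and tolerance below are all invented for illustration; nothing here comes from Daskalakis's thesis.

```python
# Toy tatonnement: adjust price toward zero excess demand.
# The demand/supply curves are made up for this sketch.

def excess_demand(p):
    demand = 10.0 / p   # hypothetical demand curve
    supply = p          # hypothetical supply curve
    return demand - supply

def tatonnement(p=1.0, step=0.1, tol=1e-6, max_iter=10000):
    """Raise price when demand exceeds supply; lower it otherwise."""
    for _ in range(max_iter):
        z = excess_demand(p)
        if abs(z) < tol:    # approximately cleared
            break
        p += step * z
    return p

p_star = tatonnement()  # converges near sqrt(10) for these toy curves
```

The point of the post is precisely that there is no guarantee a real economy runs anything like this loop to convergence, or that the loop converges at all for general excess-demand functions (that's the SMD theorem's bite).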

Steve Keen would probably like this. And if Keen is to be believed about Austrians, so would they.

I am not sure I understand. Is this in reference to the non-mathematical approach?

No, in reference to having more ammo to attack economists with. I could see him saying your final sentence for example.

He already complains about them ignoring the SMD theorem.

"Also, one of the ways to "break" intractability is with randomness..."

Sounds a bit like the use of dither to resolve difficulties in signal processing?

That is an interesting analogy!

The reason I raise the above is that I understand you work professionally with signal processing algorithms.

Henry

O/T: While Googling Fielitz and Borchardt (to see if they perhaps published any other examples, or if anybody else out there found applications), I ran across this write up on you Jason. The folks there are sure not fans of conflating information theory and thermodynamics! For example:

http://www.humanthermodynamics.com/TNETIT.pdf

The person who runs that site commented here awhile ago (no idea where it is now) and told me that he put that up there.

I have no idea what his issue is with "conflating" the two. There is no technical problem, and I asked what bad idea would end up in one's head from "conflating" the two but got no response.

What is funny is that in that pdf link, the author says:

"Equations that have the same ‘form’ but different functions, as used in different fields, as mentioned, are what are called mathematical isomorphisms. This is another area of common confusion: people thinking that just because two equations are isomorphic that they are the same."

Actually, that is exactly what the existence of a mathematical isomorphism means. Much of the 20th century progress in physics comes from treating mathematical isomorphisms with respect.

If the equations governing two systems are isomorphic, the only way the systems wouldn't be the same (in some way) is if one or the other equation doesn't describe its system accurately -- i.e. information theory or thermodynamics is wrong.

But thermodynamics is just information theory that is subjected to constrained optimization with a particular Lagrange multiplier (temperature).
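To sketch what I mean (this is just the standard maximum entropy derivation, not anything from the linked pdf): maximize the Shannon entropy subject to normalization and a fixed average energy,

$$ S = -\sum_i p_i \log p_i, \qquad \sum_i p_i = 1, \qquad \sum_i p_i E_i = \langle E \rangle, $$

and stationarity of the Lagrangian $\mathcal{L} = S - \alpha \left( \textstyle\sum_i p_i - 1 \right) - \beta \left( \textstyle\sum_i p_i E_i - \langle E \rangle \right)$ gives

$$ p_i = \frac{e^{-\beta E_i}}{Z}, \qquad Z = \sum_i e^{-\beta E_i}, $$

the Boltzmann distribution, with the Lagrange multiplier identified as $\beta = 1/k_B T$. Temperature enters only through that multiplier.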

I can only conclude that it is some ineffable quality that "true thermodynamics" has that thermodynamics derived from information theory doesn't. Like monetary policy effectiveness or awesomeness.

Thanks for sharing your thoughts! "Conflating" was my word: my perception of what that author would have said.

Here's his comment.

I am not a physicist nor an information theorist, but IMHO Fielitz and Borchardt have made a very important discovery. The fact that information transfer relationships can completely describe important physical phenomena is amazing. More important, I think, is that information transfer relationships in many cases provide a better understanding of a system's dynamics than even a complete physical description can provide. It really establishes that information transfer is perhaps of primary importance in physical phenomena, and therefore likely to be a profound descriptor of any applicable system.

Todd, after writing a paper with Jason, how well would you say you understand the information transfer relationships now?

One of the takeaways I get (from F&B) is basically this (I'll put it in quotes, even though it's only my attempt to paraphrase them):

"Square laws and exponential laws abound in nature, and information equilibrium is a possible way to explain many of them"

You just need (as Jason has explained):

"Two process variables

Micro scale degrees of freedom

Plausible information transfer channel"

But there aren't that many criteria to check. So the deciding factor is usually just looking at empirical data.

Hi Todd,

The latter part of your comment is not true of any physical systems ... Do you have any examples?

The formalism represents a massive simplification. I don't think there are many cases where that kind of simplification leads to better understanding ... Even in the economics examples, it's essentially a reformulation of supply and demand.

I think it is useful in Econ because they've made some assumptions that they can't seem to question (utility maximization) ... Maybe it could shake it up.

Even in the EEG, it's another way to understand the well known power law distributions.

... And you see what it takes to trigger my boosterish enthusiasm mitigation mode. Excitement is cool, but skepticism is always warranted.

In my comment "Square laws" should be "Power laws."

I was thinking of F+B's diffusion transfer and K-Trumpler effect results, not of anything else. In diffusion transfer, information transfer relations led to the correct understanding, which is irrespective of the underlying diffusion constant of the media. So yes I am a believer, but not because of you or me ;)
