Tuesday, November 22, 2016

How do maximum entropy and information equilibrium relate?

I think I need a better post about how information equilibrium and maximum entropy relate to one another. As I'm still learning myself, this is partially for my own benefit. The short answer is that information equilibrium is a way of saying that the two maximum entropy probability distributions from which two different random variables A and B are drawn are in some sense (i.e. informationally) equivalent to each other. This equivalence shows up in specific relationships between A and B defined by an information equilibrium condition. For example, fluctuations in one will show up as fluctuations in the other.

Maximum entropy seeks to maximize the function H(X) over possible probability distributions p(X) for random variable X

p(X = xk) = pk

H(X) = - Σ pk log pk

subject to some constraint, where pk represents the probability of an event in some abstract space indexed by k. For example, if that constraint is that X has a maximum and a minimum value, then p is a uniform distribution. The outcome of entropy maximization is a distribution p(X) that maximizes H(X).
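As a quick check of this, here's a minimal Python sketch (my own illustration, not from the paper) comparing H for a uniform distribution versus a skewed one on the same bounded support of four events:

```python
import math

def shannon_entropy(p):
    """H(X) = -sum pk log pk (natural log), skipping zero-probability events."""
    return -sum(pk * math.log(pk) for pk in p if pk > 0)

n = 4  # four events on a bounded (min/max constrained) support
uniform = [1.0 / n] * n          # the maximum entropy distribution
skewed = [0.7, 0.1, 0.1, 0.1]    # any other distribution on the same support

h_uniform = shannon_entropy(uniform)  # log(4)
h_skewed = shannon_entropy(skewed)
print(h_uniform, h_skewed)
```

The uniform distribution attains H = log 4; any departure from uniformity lowers H, which is what "entropy maximization subject to a support constraint" selects against.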

Information equilibrium is essentially a rephrasing of the question of whether two maximum entropy distributions of random variables A and B are (informationally) equivalent to each other (e.g. because they represent different observables based on the same underlying variable C), and it tells us how one distribution will change relative to the other if they are. Symbolically,

H(A) = H(B)

with underlying probability distributions p1(A) and p2(B). This is fairly meaningless for a single observation of A and B (any single probability is proportional to another), but for a series of observations (particularly time series) we can see if both series of random variables A and B represent the same underlying information (or at least approximately so). This doesn't tell us if A or B (or maybe some unknown C) is the true information source, but only that A and B are in some sense equivalent.
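To make the time series idea concrete, here's a toy Python sketch (the affine maps, sample size, and bin count are my own illustrative choices, not from the post): two observables built from the same underlying uniform variable C yield matching binned entropy estimates, without telling us which of A, B, or C is the true information source:

```python
import math
import random

random.seed(1)

def hist_entropy(xs, bins=10):
    """Estimate H from a histogram of observations (illustrative, not the paper's method)."""
    lo, hi = min(xs), max(xs)
    counts = [0] * bins
    for x in xs:
        i = min(int((x - lo) / (hi - lo) * bins), bins - 1)
        counts[i] += 1
    n = len(xs)
    return -sum((c / n) * math.log(c / n) for c in counts if c > 0)

# Two observables driven by the same underlying uniform variable C:
C = [random.random() for _ in range(100000)]
A = [2.0 * c for c in C]         # A is one readout of C
B = [5.0 + 3.0 * c for c in C]   # B is another readout of C

hA, hB = hist_entropy(A), hist_entropy(B)
print(hA, hB)  # the two estimates agree: A and B carry the same information
```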

In the case of uniform distributions p1(A) and p2(B), we can rewrite this as a differential equation relating the stochastic variables A and B (in the limit that A and B represent the sum of a large number of "events", for example A is total energy made up of the energy of millions of atomic kinetic energy terms)

P ≡ dA/dB = k (A/B)

where P turns out to represent an abstract price in applications to economics when A is abstract demand and B is abstract supply. We can think of instances of the random variable A as "demand events", B as "supply events" and the two coinciding per the condition above as a "transaction event". The practical result of this is that it gives us a framework to understand time series of exponential growth functions and power laws (and fluctuations around them) that is directly related to supply and demand.
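Here's a quick numerical sketch of that differential equation (the values of k, A0, and B0 are illustrative assumptions, not from the post): integrating dA/dB = k A/B forward with Euler steps reproduces the power law general solution A = A0 (B/B0)^k, and the abstract price P = k A/B falls out alongside it:

```python
# Euler-integrate the information equilibrium condition dA/dB = k * A / B
# and compare with the closed-form power law A = A0 * (B / B0)**k.
k, B0, A0 = 2.0, 1.0, 1.0   # illustrative choices
steps = 100000
B, A = B0, A0
dB = (10.0 - B0) / steps
for _ in range(steps):
    A += k * (A / B) * dB   # information equilibrium condition
    B += dB

closed_form = A0 * (B / B0) ** k   # power law solution (= 100 for k=2, B=10)
price = k * A / B                  # abstract price P
print(A, closed_form, price)
```

For k = 1 the solution is linear (A grows with B); for k ≠ 1 it is a power law, which is why the framework naturally produces the exponential growth and power law behavior mentioned above.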

Since p1(A) and p2(B) represent maximum entropy probability distributions, changes (shocks) will be accompanied by entropic forces restoring maximum entropy (i.e. the most likely state given prior information). We can interpret e.g. the forces of supply and demand as manifestations of these entropic forces (as well as things like sticky prices or so-called statistical equilibrium). Here are some simulations of these entropic forces for supply and demand.
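Here's a toy sketch of an entropic force in this sense (not the post's linked simulations; the state and agent counts are arbitrary): after a "shock" puts every agent in one state, undirected random moves alone drive the occupation numbers back toward the maximum entropy (uniform) configuration — no explicit restoring force is coded in:

```python
import math
import random

random.seed(0)

def dist_entropy(counts, total):
    """Entropy of the occupation distribution across states."""
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

n_states, n_agents, n_steps = 10, 1000, 10000
agents = [0] * n_agents          # "shock": every agent in state 0
counts = [0] * n_states
counts[0] = n_agents             # low-entropy starting configuration

for _ in range(n_steps):
    i = random.randrange(n_agents)
    counts[agents[i]] -= 1
    agents[i] = random.randrange(n_states)  # undirected random move
    counts[agents[i]] += 1

H_final = dist_entropy(counts, n_agents)
print(H_final, math.log(n_states))  # H relaxes toward the maximum, log(10)
```

The "force" here is purely statistical: uniform occupation is simply the overwhelmingly most likely state given the random dynamics, which is the sense in which supply and demand forces are interpreted as entropic.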

Additionally, if we can identify A as the source of the information, we can turn the equivalence into a bound (you can't receive more information at B than is sent by A)

H(A) ≥ H(B)

This is called non-ideal information transfer. If we use uniform distributions as above, this means the differential equation becomes a bound.
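A minimal sketch of the bound (the lossy map B = A // 2 is a hypothetical example, not from the post): when B is a non-invertible readout of A, the receiver's entropy cannot exceed the source's:

```python
import math
from collections import Counter

def entropy(p):
    return -sum(pk * math.log(pk) for pk in p if pk > 0)

# Source A takes four equally likely values.
pA = {0: 0.25, 1: 0.25, 2: 0.25, 3: 0.25}

# Hypothetical lossy readout: B = A // 2 merges pairs of events, so B
# cannot receive more information than A sends.
pB = Counter()
for a, p in pA.items():
    pB[a // 2] += p

H_A = entropy(pA.values())   # log 4
H_B = entropy(pB.values())   # log 2
print(H_A, H_B)              # H(A) >= H(B)
```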

These concepts are explored further in the paper. Here are some slides presenting these ideas. Here are some simulations showing the supply and demand (and price) relationships captured by the ideas above.

Update 23 November 2016

Here's the above relationship in picture format


1 comment:

  1. This kind of thing (in your post here) is helpful. Ideally you can team up with the guy who did these short videos for Sean Carroll and make it even more clear at some point.
