## Tuesday, October 27, 2015

Update: animations!

In the comments with Ken Duda on the Info EQ 101 post, I realized that the way I've been presenting the example invites confusion between σ and n, and also glosses over some important details. So let's set up an example where you roll a six-sided die (σ = 6) five hundred times (n = 500), once each for "supply" (ns = 500) and "demand" (nd = 500). This represents 500 widgets supplied by 6 firms being allocated among 6 different firms on the demand side.

The information revealed by each roll is log₂ 6. The information revealed by n rolls is n log₂ 6.
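As a quick numerical check of the formulas above (a minimal sketch; the variable names are my own):

```python
import math

sigma = 6    # faces on the die
n = 500      # number of rolls

bits_per_roll = math.log2(sigma)  # information revealed by one roll
total_bits = n * bits_per_roll    # information revealed by n rolls

print(f"bits per roll: {bits_per_roll:.4f}")   # log2 6 ≈ 2.585
print(f"bits in {n} rolls: {total_bits:.1f}")  # 500 log2 6 ≈ 1292.5
```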

After the first 10 rolls (10 widgets), you have two very different empirical distributions:

In these graphs, you add a box to the column labelled 4 when you roll a 4. After 500 rolls, you have approximately equal, approximately uniform distributions:

This is why we must assume nd, ns >> 1 (many rolls of the die). Only then do the empirical distributions approximate the theoretical (uniform) distribution (and therefore each other). We can think of these as the distribution of widgets supplied and the distribution of widgets demanded. For nd, ns ~ 1 they are not equal, and you get cases where too many goods are supplied to one firm and too few to another.
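A short simulation (hypothetical code, not from the post) shows how rough the two empirical distributions are at n = 10 and how close they get at n = 500:

```python
import random
from collections import Counter

random.seed(42)  # fixed seed so the sketch is reproducible

def empirical_dist(n, sigma=6):
    """Roll a sigma-sided die n times; return relative frequency of each face."""
    counts = Counter(random.randint(1, sigma) for _ in range(n))
    return {face: counts.get(face, 0) / n for face in range(1, sigma + 1)}

for n in (10, 500):
    supply = empirical_dist(n)  # "supply" rolls
    demand = empirical_dist(n)  # "demand" rolls
    # largest gap between the two empirical distributions
    gap = max(abs(supply[f] - demand[f]) for f in supply)
    print(f"n = {n:3d}: max |supply - demand| = {gap:.3f}")
```

With a handful of rolls the gap between supply and demand frequencies is large; with 500 rolls both hug the uniform 1/6 line.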

We can also see that the empirical information entropies of the two distributions are 1) not exactly equal to each other and 2) not equal to the theoretical entropy of log₂ 6 per widget (gray line):

However, all three become approximately equal when nd, ns >> 1 (e.g. after 500 rolls). The KL divergence also shrinks (note: log scale) as nd, ns >> 1:
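The trend in the plots can be reproduced numerically. A sketch (my own function names; I compute the KL divergence of each empirical distribution against the theoretical uniform distribution, which avoids the zero-count faces you get at small n):

```python
import math
import random
from collections import Counter

random.seed(0)
SIGMA = 6
UNIFORM = [1 / SIGMA] * SIGMA  # theoretical die distribution

def roll_dist(n, sigma=SIGMA):
    """Empirical face frequencies from n rolls of a sigma-sided die."""
    counts = Counter(random.randrange(sigma) for _ in range(n))
    return [counts.get(k, 0) / n for k in range(sigma)]

def entropy(p):
    """Shannon entropy in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def kl(p, q):
    """KL divergence D(p || q) in bits (q must be nonzero wherever p is)."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

for n in (10, 100, 500):
    supply, demand = roll_dist(n), roll_dist(n)
    print(f"n = {n:3d}:  H_s = {entropy(supply):.3f}  H_d = {entropy(demand):.3f}"
          f"  max = {math.log2(SIGMA):.3f}  D(s||uniform) = {kl(supply, UNIFORM):.4f}")
```

Both empirical entropies climb toward log₂ 6 ≈ 2.585 bits and the divergence falls toward zero as n grows.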

Now, the 4×4 boards I drew in the previous post represent rolls of a 16-sided die (σ = 16), and I showed only about 6 rolls. It looked like this:

In the limit of a large number of rolls nd, ns >> 1 (where information equilibrium becomes a good model), it should look more like this:
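The same point can be made numerically for the 4×4 board (a sketch with my own helper function): with only ~6 rolls the empirical entropy cannot even reach log₂ 16 = 4 bits, while with many rolls it gets close.

```python
import math
import random
from collections import Counter

random.seed(1)

def entropy_bits(rolls):
    """Empirical Shannon entropy (bits) of a list of die rolls."""
    n = len(rolls)
    return -sum(c / n * math.log2(c / n) for c in Counter(rolls).values())

sigma = 16  # a 4x4 board is equivalent to a 16-sided die
for n in (6, 500):
    rolls = [random.randint(1, sigma) for _ in range(n)]
    print(f"n = {n:3d}: H = {entropy_bits(rolls):.3f} bits "
          f"(theoretical max = {math.log2(sigma):.3f})")
```

Six rolls can reveal at most log₂ 6 ≈ 2.585 bits per roll of structure across 16 cells, so the n = 6 board necessarily looks sparse and lumpy.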

Thanks, Jason. These concrete examples really help me.

-Ken

Cheers, Ken.