Comments on Information Transfer Economics: "The economic allocation problem"

Comment (2016-01-12 17:02):

I think I regenerated the figure (which was randomly generated) after writing the text with an earlier version.

I think it is best to view information equilibrium as a necessary, but not sufficient, condition for two distributions to be equal. As in the previous post you commented on, there is a hierarchy: equal, matched, information equilibrium. We are generalizing the idea of equilibrium/equality.

For example, a normal distribution, a uniform distribution, and a Pareto distribution can have the same information entropy for a particular choice of the parameters σ vs. a, b vs. x0, α:

normal ~ log σ + ...
Pareto ~ log (x0/α) + ...
uniform ~ log (b − a)

But the sequence of draws will show differences. In your case, they'd show almost exactly the inverse frequencies of 1's and 0's: one averages nine 0's out of ten and the other averages nine 1's out of ten. Changing the labels in your case eliminates the issue, but for distributions like the three above, the KL divergence will differ and you'd have to set up a more complex model to handle it. This is also why the KL divergence is only a measure of information loss, not an absolute measure of information loss.

In your example, we have the case where almost every demand isn't met and almost every supply doesn't fall on a demand.
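The claim that a normal, a Pareto, and a uniform distribution can have the same entropy for suitable parameters can be checked numerically. A minimal sketch using scipy; the specific choices b − a = 10 and α = 2 are illustrative assumptions, not values from the post:

```python
import numpy as np
from scipy import stats

# Target: the differential entropy of a uniform distribution on [0, 10),
# H = log(b - a) = log(10) nats. Solve each entropy formula for a parameter.
H_target = np.log(10.0)

# normal: H = log(sigma) + (1/2) log(2*pi*e)  =>  sigma = (b - a) / sqrt(2*pi*e)
sigma = 10.0 / np.sqrt(2 * np.pi * np.e)

# Pareto: H = log(x0/alpha) + 1 + 1/alpha, with alpha = 2 fixed (assumed)
alpha = 2.0
x0 = np.exp(H_target - 1 - 1 / alpha + np.log(alpha))

H_uniform = stats.uniform(loc=0, scale=10).entropy()
H_normal = stats.norm(scale=sigma).entropy()
H_pareto = stats.pareto(b=alpha, scale=x0).entropy()

# All three match log(10) ≈ 2.3026 nats, but draws from them look nothing alike.
print(H_uniform, H_normal, H_pareto)
```

Sampling from each (`.rvs()`) then makes the point in the comment concrete: equal entropies, visibly different sequences of draws.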
The two sequences contain the same information (they are complements of each other).

Your example just comes down to the symmetry of the binary information function: there are two binary probabilities, p and 1 − p, that yield the same information.

One way to think of it is as the other solution to a quadratic equation, given that we've made our measure of information positive definite.

Jason Smith

Comment (2016-01-12 16:40):

"A person shows up at store #4 (space grid point 4) at time t = 2 (time grid point 2) wanting to buy blueberries, represented on the grid as a white translucent box. If there are blueberries available (represented by a blue cube), then blue cube and white box match up and a transaction occurs."

Whether we start counting from 0 or 1 at the apparent first square in either dimension, this appears not to be the case.

Assuming we count from 1, then t = 2, s = 4 has one pint of blueberries but no demand. Counting from 0 gives a square with neither supply nor demand.

OK, not very important perhaps, but I like to check these things.

You write:

"I'm going to simplify this picture a bit by saying the blueberries don't go bad"

... and that demand is persistent as well.

"In information theory, this loss can be measured via the Kullback-Leibler divergence (calculated in the graph). In general, I(destination) ≤ I(source). Assuming equality is called ideal information transfer or information equilibrium. 
It turns out to be equivalent to assuming ideal markets where there is no excess supply or demand -- the probability distributions are equal."

However, simplifying to two space possibilities and integrating over time again, and assuming (as I did in your more recent draft lecture post) that for demand we have P(demand at space = 1) = 0.1 and P(supply at space = 1) = 0.9 (i.e., both Bernoulli processes, unequal to each other but with equal information entropy), what happens? I realize that the KL divergence still applies and that information has to be lost... but are your words in this section 100% accurate?

Tom Brown
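The Bernoulli example above can be made concrete: the two processes with P(1) = 0.1 and P(1) = 0.9 have identical Shannon entropy, since the binary entropy function satisfies H(p) = H(1 − p), yet the KL divergence between them is far from zero, so one distribution is a poor model of the other even though the entropies match. A quick pure-Python check using the probabilities from the comment:

```python
import math

def binary_entropy(p):
    """Shannon entropy of a Bernoulli(p) source, in bits."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def kl_bernoulli(p, q):
    """KL divergence D(Bernoulli(p) || Bernoulli(q)), in bits."""
    return p * math.log2(p / q) + (1 - p) * math.log2((1 - p) / (1 - q))

p, q = 0.1, 0.9

print(binary_entropy(p), binary_entropy(q))  # equal: ≈ 0.469 bits each
print(kl_bernoulli(p, q))  # ≈ 2.536 bits: nonzero despite equal entropies
```

This is the hierarchy from the first comment in miniature: the two distributions are matched in entropy without being equal, and the nonzero KL divergence registers the information lost when one is used to transmit the other.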