$$

N(m) =\sum_{i} n_{i} = \sum_{i} m^{a_{i}} = \sum_{i} e^{- a_{i} \log 1/m}

$$

has the form of a partition function

$$

Z(\beta) = \sum_{i} e^{- \beta E_{i}}

$$

where $\beta = 1/kT$ corresponds to $\log 1/m$ and the state energies $E_{i}$ correspond to the exponents $a_{i}$; the Boltzmann weights are the maximum entropy probability distribution over the states. If we take that analogy at face value, the expected value of a random $a_{i}$ under that maximum entropy distribution would be

$$

\langle a \rangle = - \frac{\partial \log Z(\beta)}{\partial \beta} = \frac{\sum_{i} a_{i} m^{a_{i}}}{\sum_{i} m^{a_{i}}}

$$
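As a quick numerical check of this formula, here is a minimal sketch of the Gibbs-weighted average (the uniform draw over $[0,1]$, the sample size, and the values of $m$ are illustrative assumptions, not from the post):

```python
import random

def expected_exponent(a_values, m):
    # Gibbs-weighted average <a> = sum(a_i * m^a_i) / sum(m^a_i),
    # i.e. -d log Z / d beta with beta = log(1/m)
    weights = [m ** a for a in a_values]
    return sum(a * w for a, w in zip(a_values, weights)) / sum(weights)

random.seed(42)
a = [random.uniform(0.0, 1.0) for _ in range(1000)]  # illustrative uniform draw

for m in (2.0, 10.0, 100.0):
    print(f"m = {m:6.1f}  <a> = {expected_exponent(a, m):.3f}")
```

Because the weights $m^{a_{i}}$ grow with $a_{i}$ when $m > 1$, the value of $\langle a \rangle$ shifts as $m$ changes rather than staying fixed.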

Since $N \sim m^{\langle a \rangle}$, we now have an exponent that varies with $m$ -- exactly what we have observed with the exponent $\kappa(N, M)$! (see here or here). The resulting function $N \sim m^{\langle a \rangle}$ overestimates the result of adding the markets together. Here are the results for uniformly distributed $a_{i}$ with $a_{i} \in [0,1]$, $[0,2]$ and $[0,4]$ (each plot of $\langle a \rangle$ appears alongside the corresponding graph of $N \sim m^{\langle a \rangle}$):
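One way to see why the exponent varies is that $\langle a \rangle$ is exactly the local slope $d \log N / d \log m$, which follows from $\langle a \rangle = -\partial \log Z / \partial \beta$ with $\beta = \log 1/m$. A sketch verifying this identity numerically (the uniform draw over $[0,2]$ and the sample size are illustrative assumptions):

```python
import math
import random

random.seed(0)
a = [random.uniform(0.0, 2.0) for _ in range(500)]  # illustrative draw

def N(m):
    # aggregate of the individual markets: N(m) = sum_i m^a_i
    return sum(m ** ai for ai in a)

def avg_a(m):
    # <a> = sum_i a_i m^a_i / sum_i m^a_i
    z = sum(m ** ai for ai in a)
    return sum(ai * m ** ai for ai in a) / z

# central difference of log N with respect to log m should match <a>
m, h = 10.0, 1e-5
slope = (math.log(N(m * math.exp(h))) - math.log(N(m * math.exp(-h)))) / (2 * h)
print(f"d log N / d log m = {slope:.6f},  <a> = {avg_a(m):.6f}")
```

The two printed numbers agree to within the finite-difference error, and evaluating `avg_a` at different values of $m$ shows the effective exponent drifting with the money supply.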

One puzzle: in statistical mechanics, higher temperature means that more of the higher-energy states are occupied, yet the observation in economics appears to be that higher $m$ (money supply) means that more of the lower-$a_{i}$ states are occupied in order to produce this figure. I'll have to look into this a little more to fully understand how it works. Still, this may be a pretty important result, at least for information transfer economics. Stay tuned!
