There are a couple of loose ends that need tying up regarding the IT index. One of them is the derivation of the information equilibrium condition (see also the paper) with non-uniform probability distributions. This turns out to be relatively trivial and only involves a change in the IT index formula. The information equilibrium condition is

$$

\frac{dA}{dB} = k \; \frac{A}{B}

$$

And instead of:

$$

k = \frac{\log \sigma_{A}}{\log \sigma_{B}}

$$

with $\sigma_A$ and $\sigma_B$ being the number of symbols in the "alphabet" chosen uniformly, we have

$$

k = \frac{\sum_{i} p_{i}^{(A)} \log p_{i}^{(A)}}{\sum_{j} p_{j}^{(B)} \log p_{j}^{(B)}}

$$

where $p_{i}^{(A)}$ and $p_{j}^{(B)}$ represent the probabilities of the different outcomes. The generalization to continuous distributions is also trivial and is left as an exercise for the reader.
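As a quick numerical sanity check, the generalized index is just the ratio of the two Shannon entropies (the minus signs cancel between numerator and denominator), and for uniform distributions it reduces to the original $\log \sigma_{A} / \log \sigma_{B}$ formula. Here is a minimal sketch; the function name `it_index` is just an illustrative label, not anything from the paper:

```python
import math

def it_index(p_a, p_b):
    """IT index as the ratio of Shannon entropies of the two distributions.

    Equivalent to sum(p log p) / sum(p log p) since the signs cancel.
    """
    h = lambda p: -sum(x * math.log(x) for x in p if x > 0)
    return h(p_a) / h(p_b)

# Uniform distributions recover the original formula log(sigma_A)/log(sigma_B):
sigma_a, sigma_b = 8, 4
p_a = [1 / sigma_a] * sigma_a
p_b = [1 / sigma_b] * sigma_b
print(it_index(p_a, p_b))  # log(8)/log(4) = 1.5
```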

However, while it hasn't come up in any of the models yet, it should be noted that the above definitions imply that $k$ is positive. But it turns out that we can handle negative $k$ by simply using the transformation $B \rightarrow 1/C$ so that:

$$

\begin{align}

\frac{dA}{dB} = & - |k| \; \frac{A}{B}\\

-C^{2} \; \frac{dA}{dC} = & - |k| \; A C\\

\frac{dA}{dC} = & |k| \; \frac{A}{C}

\end{align}

$$

That is to say, an information equilibrium relationship $A \rightleftarrows B$ with a negative IT index is equivalent to the relationship $A \rightleftarrows 1/C$ with a positive index.
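This equivalence can also be checked numerically. The general-equilibrium solution of $dA/dB = -|k| \, A/B$ is a power law $A \propto B^{-|k|}$, so under $C = 1/B$ we should get $A \propto C^{+|k|}$. A minimal sketch, where the initial conditions `A0`, `B0` are illustrative:

```python
# Solution of dA/dB = -|k| A/B with A(B0) = A0 is A(B) = A0 * (B/B0)**(-|k|).
# Substituting C = 1/B should give the positive-index form A(C) = A0 * (C/C0)**(+|k|).
abs_k = 2.0
A0, B0 = 1.0, 1.0

def A_of_B(B):
    """Negative-index solution in terms of B."""
    return A0 * (B / B0) ** (-abs_k)

def A_of_C(C):
    """Positive-index solution in terms of C = 1/B."""
    C0 = 1 / B0
    return A0 * (C / C0) ** (abs_k)

# The two forms agree at every point:
for B in (0.5, 1.0, 2.0, 3.0):
    assert abs(A_of_B(B) - A_of_C(1 / B)) < 1e-12
print("negative-index and transformed positive-index solutions agree")
```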
