I thought it might be a good idea to put a bunch of definitions I use frequently into a single reference post. All of this stuff is discussed in my paper as well. Let's start with two macroeconomic observables $A$ and $B$.

__Information__

This is information entropy in the Shannon sense, not "meaningful knowledge" like knowing the fundamental theorem of calculus or how to play bridge. See here for an introduction to information theory specific to this blog.
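As a minimal sketch of the Shannon sense of "information" (my own illustration, not from the paper), the entropy of a distribution is just a sum over its probabilities, and it is maximized by the uniform distribution:

```python
import math

def shannon_entropy(probs):
    """Shannon information entropy in nats: H = -sum_i p_i log p_i."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# A uniform distribution over n states maximizes entropy, with H = log n
uniform = [0.25] * 4
print(shannon_entropy(uniform))  # log(4) ≈ 1.386
```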

__Information equilibrium__

The notation $p : A \rightleftarrows B$ represents an information equilibrium (IE) relationship between $A$ and $B$ with price $p$. I also refer to this as a market (it should be thought of as the $A$ and $B$ market, with $p$ being the $B$ price of $A$). It stands for the differential equation

$$
p \equiv \frac{dA}{dB} = k \; \frac{A}{B}
$$

which is derived from assuming the fluctuations in the information entropy of two uniform distributions have equal information content. These fluctuations register as fluctuations in the "price" $p$. The differential equation has the solution

$$
\begin{align}
A & = A_{ref} \; \left( \frac{B}{B_{ref}} \right)^{k}\\
p & = k \; \frac{A_{ref}}{B_{ref}} \; \left( \frac{B}{B_{ref}} \right)^{k-1}
\end{align}
$$
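We can check this solution numerically. The sketch below (with illustrative parameter values, not taken from any fit) defines $A(B)$ from the general solution and confirms via a finite difference that its derivative satisfies the information equilibrium condition $dA/dB = k \, A/B$:

```python
# Illustrative parameters, not from the post
k, A_ref, B_ref = 2.0, 100.0, 50.0

def A(B):
    """General solution A = A_ref (B / B_ref)^k."""
    return A_ref * (B / B_ref) ** k

# Verify p = dA/dB = k A / B at a sample point via a central finite difference
B, h = 75.0, 1e-6
dA_dB = (A(B + h) - A(B - h)) / (2 * h)
print(dA_dB, k * A(B) / B)  # both ≈ 6.0
```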

__Information transfer index__

Frequently shortened to IT index, this is the parameter $k$ in the information equilibrium relationship above (or in the information transfer relationship below).

__Non-ideal information transfer__

The notation $p : A \rightarrow B$ represents an information transfer (IT) relationship between $A$ and $B$ with price $p$. It is basically an information equilibrium relationship with information loss, such that the information in fluctuations in $A$ is only partially registered in changes in $B$. The differential equation becomes a differential inequality

$$
p \equiv \frac{dA}{dB} \leq k \; \frac{A}{B}
$$

Via Gronwall's inequality (see here), the information equilibrium relationship defined above is a bound on this information transfer relationship. The observed price $p^{*}$ will fall below the information equilibrium price, $p^{*} \leq p$. The same applies to the observable $A$; we will see the observed value fall below the information equilibrium value, $A^{*} \leq A$.
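A quick numerical illustration of the Gronwall bound (my own sketch, with made-up parameters): if we integrate the differential inequality with a fluctuating effective index $k'(B) \leq k$ (representing information loss), the resulting $A^{*}$ stays at or below the ideal information equilibrium solution.

```python
import random

random.seed(0)
k, A_ref, B_ref = 2.0, 100.0, 50.0  # illustrative values

def A_ideal(B):
    """Information equilibrium solution A = A_ref (B / B_ref)^k."""
    return A_ref * (B / B_ref) ** k

# Integrate dA/dB = k'(B) A/B with fluctuating k'(B) <= k (Euler steps);
# by Gronwall's inequality the result is bounded by the IE solution.
A_star, B, dB = A_ref, B_ref, 0.01
while B < 100.0:
    k_eff = k * random.uniform(0.5, 1.0)  # information loss: k' <= k
    A_star += k_eff * A_star / B * dB
    B += dB

print(A_star <= A_ideal(B))  # True: A* falls below the IE value
```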

__Partition function approach__

The information equilibrium partition function approach starts with an ensemble of markets $p_{i} : A_{i} \rightleftarrows B$ with common factor of production $B$ and defines the partition function

$$
Z = \sum_{i} e^{-\beta k_{i}}
$$

where $\beta \equiv \log (b + 1)$ and $b \equiv (B - B_{ref})/B_{ref}$. The normal application of the partition function in e.g. thermodynamics follows. It is derived from assuming a maximum entropy distribution of $B$ among the markets $A_{i}$ where the macrostate (the collection of all the markets $\{ A_{i}\}$) has a well-defined ensemble average $\langle k \rangle$ (see here for more details).
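Computing an ensemble average works just as in thermodynamics: weight each market by its Boltzmann factor and normalize by $Z$. The sketch below (with an illustrative random draw of IT indices, not an empirical ensemble) shows $\langle k \rangle$ falling as $B$ grows, since larger $\beta$ weights the low-$k$ markets more heavily:

```python
import math
import random

random.seed(0)
# Ensemble of markets with IT indices k_i (illustrative draw)
k_values = [random.uniform(0.5, 2.0) for _ in range(1000)]

def ensemble_average_k(B, B_ref=100.0):
    """<k> = sum_i k_i e^{-beta k_i} / Z, with beta = log(b + 1), b = (B - B_ref)/B_ref."""
    b = (B - B_ref) / B_ref
    beta = math.log(b + 1)
    weights = [math.exp(-beta * k) for k in k_values]
    Z = sum(weights)
    return sum(k * w for k, w in zip(k_values, weights)) / Z

# As B grows, beta grows and low-k markets dominate, so <k> falls
print(ensemble_average_k(110.0) > ensemble_average_k(300.0))  # True
```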

__Entropic force__

Entropic forces are essentially the same in information transfer economics as in thermodynamics. They are "emergent" forces that do not have a description in terms of individual agents (e.g. atoms in thermodynamics). They arise from a tendency to maintain or achieve a particular maximum entropy distribution, or to keep two distributions in information equilibrium.