I thought it might be a good idea to put a bunch of definitions I use frequently into a single reference post. All of this stuff is discussed in my paper as well. Let's start with two macroeconomic observables $A$ and $B$.

__Information__

This is information entropy in the Shannon sense, not "meaningful knowledge" like knowing the fundamental theorem of calculus or how to play bridge. See here for an introduction to information theory specific to this blog.

__Information equilibrium__

The notation $p : A \rightleftarrows B$ represents an information equilibrium (IE) relationship between $A$ and $B$ with price $p$. I also refer to this as a market (it should be thought of as the $A$ and $B$ market, with $p$ being the $B$ price of $A$). It stands for the differential equation

$$
p \equiv \frac{dA}{dB} = k \; \frac{A}{B}

$$

which is derived from assuming the fluctuations in the information entropy of two uniform distributions have equal information content. These fluctuations register as fluctuations in the "price" $p$. The differential equation is a generalization of one Irving Fisher wrote in his thesis. It has the solution

$$
\begin{align}

A & =A_{ref} \; \left( \frac{B}{B_{ref}} \right)^{k}\\

p & = \frac{A_{ref}}{B_{ref}} \; \left( \frac{B}{B_{ref}} \right)^{k-1}

\end{align}

$$

I will occasionally write the relationships without the "detector" $p$ or explicitly calling out the information transfer (IT) index:

$$
A \overset{k}{\rightleftarrows} B

$$

__Information transfer index__

Frequently shortened to IT index: the parameter $k$ in the information equilibrium relationship above, or in the information transfer (IT) relationship below.


__Log-linear form__

See this post for the log-linear form of an information equilibrium relationship.

__Dynamic equilibrium__

An information equilibrium relationship implies that

$$

\frac{d}{dt} \log \frac{A}{B} = (k - 1) r

$$

if $B$ is growing at a rate $r$, i.e. $B \sim e^{rt}$. More about this here or in this presentation. Essentially, $A/B$ grows (or decreases) at a constant rate on a logarithmic scale (a constant continuously compounded annual rate of change).

Because the price $p$ is proportional to $A/B$ via the information equilibrium condition above, the same thing goes for prices:

$$

\frac{d}{dt} \log \frac{A}{B} = \frac{d}{dt} \log \frac{p}{k} = \frac{d}{dt} \log p = (k - 1) r

$$
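This constant logarithmic growth rate is easy to check numerically. The IT index $k$, growth rate $r$, and reference values below are illustrative, not fitted to any data:

```python
import numpy as np

# Illustrative parameters: IT index k and growth rate r
k, r = 2.0, 0.03
A_ref = B_ref = 1.0

t = np.linspace(0.0, 50.0, 501)
B = B_ref * np.exp(r * t)                      # B grows at rate r
A = A_ref * (B / B_ref) ** k                   # information equilibrium solution
p = (A_ref / B_ref) * (B / B_ref) ** (k - 1)   # implied price

# d/dt log(A/B) should be the constant (k - 1) r, and likewise for log p
slope_AB = np.gradient(np.log(A / B), t)
slope_p = np.gradient(np.log(p), t)
assert np.allclose(slope_AB, (k - 1) * r)
assert np.allclose(slope_p, (k - 1) * r)
```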

__Non-ideal information transfer__

The notation $p : A \rightarrow B$ represents an information transfer (IT) relationship between $A$ and $B$ with price $p$. It is basically an information equilibrium relationship with information loss, such that the information in fluctuations in $A$ is only partially registered in changes in $B$. The differential equation becomes a differential inequality

$$
p \equiv \frac{dA}{dB} \leq k \; \frac{A}{B}

$$

Via Gronwall's inequality (see here), the information equilibrium relationship defined above is a bound on this information transfer relationship. The observed price $p^{*}$ will fall below the information equilibrium price, $p^{*} \leq p$. The same applies to the observable $A$; we will see the observed value fall below the information equilibrium value, $A^{*} \leq A$.
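A toy simulation can illustrate the bound: integrating $dA^{*}/dB = (k - \mathrm{loss}) \, A^{*}/B$ with an arbitrary nonnegative loss term keeps the observed $A^{*}$ below the ideal information equilibrium solution. The random loss profile and parameter values here are purely illustrative:

```python
import numpy as np

# Illustrative parameters and an arbitrary nonnegative information loss
k, A_ref, B_ref = 2.0, 100.0, 50.0
rng = np.random.default_rng(0)

B = np.linspace(B_ref, 4 * B_ref, 10_000)
loss = rng.uniform(0.0, 0.5, size=B.size)

# Forward-Euler integration of dA*/dB = (k - loss) A*/B
A_star = np.empty_like(B)
A_star[0] = A_ref
for i in range(1, B.size):
    dB = B[i] - B[i - 1]
    A_star[i] = A_star[i - 1] + (k - loss[i - 1]) * A_star[i - 1] / B[i - 1] * dB

# The ideal information equilibrium solution is a Gronwall-type bound
A_ideal = A_ref * (B / B_ref) ** k
assert np.all(A_star <= A_ideal * (1 + 1e-9))  # observed A* stays below the bound
```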

__Partition function approach__

The information equilibrium partition function approach starts with an ensemble of markets $p_{i} : A_{i} \rightleftarrows B$ with common factor of production $B$ and defines the partition function

$$
Z = \sum_{i} e^{-\beta k_{i}}

$$

where $\beta \equiv \log (b + 1)$ and $b \equiv (B - B_{ref})/B_{ref}$. The normal application of the partition function in e.g. thermodynamics follows. It is derived from assuming a maximum entropy distribution of $B$ among the markets $A_{i}$ where the macrostate (the collection of all the markets $\{ A_{i}\}$) has a well defined ensemble average $\langle k \rangle$ (see here for more details).
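A minimal sketch of this partition function, using a handful of hypothetical IT indices $k_{i}$ (the values are made up), computes the ensemble average $\langle k \rangle$ the same way one would in thermodynamics:

```python
import numpy as np

# Hypothetical IT indices for an ensemble of markets (illustrative values)
k_i = np.array([0.5, 1.0, 1.5, 2.0])
B_ref = 100.0

def ensemble_average_k(B):
    b = (B - B_ref) / B_ref
    beta = np.log(b + 1.0)            # beta = log(b + 1)
    weights = np.exp(-beta * k_i)     # Boltzmann-like factors
    Z = weights.sum()                 # partition function
    return (k_i * weights).sum() / Z  # <k> = sum_i k_i e^{-beta k_i} / Z

# At B = B_ref, beta = 0 and <k> is just the unweighted mean of the k_i.
# As B grows, beta grows and low-k markets dominate, so <k> falls.
assert ensemble_average_k(200.0) < ensemble_average_k(110.0)
```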

__Entropic force__

Entropic forces are essentially the same in information transfer economics as in thermodynamics. They are "emergent" forces that do not have a description in terms of individual agents (e.g. atoms in thermodynamics). They arise from a tendency to maintain or achieve a particular maximum entropy distribution, or to keep two distributions in information equilibrium.


__Economic state space__

Refers to the set of IT indices making up the partition function above. Entropic forces can maintain a particular distribution of IT indices. More about this here.

I will also refer to the set of all available baskets of goods (region constrained by a budget constraint) as the economic state space. Gary Becker called this the opportunity set. It is discussed more in these presentation notes (with slides).
