Thursday, March 31, 2016

Fairness = tendency to maximize entropy?

File under speculation.

I re-read one of my old posts and it had this quote (from this post by Mark Thoma):
Max Burton-Chellew ... said: 'Game theory can be used to predict how a self-interested and rational person will behave in social situations. However, economic games, in which people have to make decisions on how to allocate money to themselves and others, have consistently shown that these predictions fare poorly. In particular, it seems that people are overly generous and altruistic, and appear to be primarily motivated by concerns of fairness rather than maximising income. ...'
This "fairness" leads to these games resulting in outcomes that are closer to the entropy optimum than the game theory solution.

Maybe humans evolved a sense of fairness because uniform distributions can help to maximize entropy? Not only does this improve the functioning of markets, but it can have other benefits.

Maybe markets are a "hack" of the human sense of fairness? Since we have a sense of fairness, markets can work well. In the past, our sense of fairness has overcome the natural propensity of markets to create e.g. wildly unequal distributions of income through social norms and other human behaviors (religion, wars).

Anyway: speculative.

Does saving make sense?

There were a couple of posts that came up recently that discuss how "saving" doesn't mean what you think it means (in economics). Here's Nick Rowe with a good macro class version. Here's Steve Roth's critical take on the term. (N.b. I'm not critiquing either post in this post).

I find Nick Rowe's (i.e. the traditional) definition of saving to be a perfectly acceptable definition:
... "saving" means literally anything you do with your income except buying newly-produced consumption goods or paying taxes.
Which is encapsulated in the identity Y = C + T + S. This says there are two economically interesting things you do with your income (consume and pay taxes = C + T) and everything else (S = "saving" = boring!). Let's try to illustrate what is going on in the macroeconomy using the information equilibrium picture (e.g. here). Say taxes are blue, consumption is red and saving is everything else (yellow). The growth states of a macroeconomy might look like this:


This says e.g. consumption on item X went up in location Y, while taxes collected from activity A went down in state S, etc. Integrating up these diagrams from each quarter (year, month, etc) gives you the level of the contribution to Y on that color coded segment.

Does it make sense to break up Y into C, T and S in this case? Not really. Each type is uniformly distributed throughout the states. But what if there's a central government and all changes in taxes collected are correlated? Well, in that case we have a diagram that looks like this:


And in this case, it does make sense to break out T.

However, since S is "everything else" in the economics definition, it doesn't really make sense to break it out. It should be Y = C + T + R where R means residual. My diagram above makes a case that Y = X + T where T is taxes and X is everything else.

That's the idea behind breaking up income Y into a set of separate measures: you've identified some highly correlated components of your economic system -- so correlated, they appear to move together as a single unit to a decent approximation. And you've likely identified some sort of effect on the economy that goes along with that correlated component.
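To make that concrete, here's a minimal Python sketch (mine, not from the original post; all numbers are made up) of the box-diagram idea: many components of Y with independent growth rates, plus one block of "tax" components that share a single growth rate and therefore move as a unit.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical illustration: 100 components of nominal output Y, each with its
# own growth rate. Most components draw independent growth rates; the "tax"
# components all share one common rate, so they move as a correlated block.
n_components = 100
tax_indices = np.arange(0, 20)           # pretend the first 20 components are taxes

growth = rng.normal(loc=0.04, scale=0.02, size=n_components)  # uncorrelated draws
growth[tax_indices] = rng.normal(loc=0.04, scale=0.02)        # one shared draw

# Aggregate growth of Y is (roughly) the weighted sum of component growth rates.
weights = np.full(n_components, 1.0 / n_components)
print("aggregate growth:   ", weights @ growth)

# Breaking out T makes sense because its components share a growth rate;
# the remaining components are the "everything else" residual.
print("tax block growth:   ", growth[tax_indices].mean())
print("residual block std: ", growth[20:].std())
```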

However, that also means that in most of those national income accounting identities, there will be a "residual". In Y = C + I + G + NX, it's I. In some post-financial crisis models, there's an argument to add a financial sector (gray):


This model is

Y = X + G + F

where G is government spending, F is the financial sector's contribution to GDP (whatever that is), and X is everything else.

Note these break-outs don't have to move exactly as a single block, and you could have something that looks like this at some period of time (imagine a transition from the first graph above to this graph -- a jump in consumption growth and a fall in "saving" growth, with Y growth staying the same):


In this case, it's a bit fuzzy to nail down exactly what is happening to what (this is probably the most realistic version -- with the large box versions being an approximation to this diagram). But overall, there are a few things that appear to operate as correlated units.

So the bottom line is that an accounting identity of the form:

Y = A + B + ... + R

is a statement that someone (at some time) found A and B (and ...) seem to move as correlated units in a way that is useful to explain some effect or policy. And then there is everything else (R).

In our old friend the national income accounting identity, we have

Y = C + I + G + NX

where R = I. This identity happened because in the Great Depression consumption of newly produced goods seemed to be correlated (and falling), and while it was theoretically possible to try and export your way out of that mess, not everyone could (and most of your trading partners had the same problem). So boosting government spending was a potential solution.

This only makes sense if the real empirical distribution looks like that last one with G in blue and say C + NX in yellow and I (the residual) in red.

But if that isn't what the empirical distribution looks like, then boosting government spending (moving blue boxes) might just displace some red and yellow boxes with no net result -- e.g. due to Ricardian equivalence, or monetary offset. That is to say: declaring Y = C + I + G + NX does not mean the distribution usefully breaks up into C, I, G and NX.

...

Update 1 April 2016

In case you were wondering if this is how a real economy looks, here's some data. I used the components of CPI instead of nominal output, but that should be a good proxy. They should be weighted by the relative size of each component, but this just counts up the number of CPI components with a given growth rate (and normalizes it).
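For reference, the counting-and-normalizing step described above is just a histogram over components. Here's a minimal sketch with synthetic growth rates standing in for the actual CPI component series:

```python
import numpy as np

# Synthetic stand-in for the CPI-component exercise: given a year-over-year
# growth rate for each component, count how many components fall in each
# growth-rate bin and normalize the counts.
rng = np.random.default_rng(1)
component_growth = rng.normal(loc=0.02, scale=0.01, size=200)  # made-up data

counts, edges = np.histogram(component_growth, bins=20)
density = counts / counts.sum()          # normalized distribution of components

for lo, hi, d in zip(edges[:-1], edges[1:], density):
    print(f"{lo:+.3f} to {hi:+.3f}: {d:.3f}")
```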


As you can see, this shows a peaked distribution around the CPI growth rate (inflation rate), much like the graphs above (although those show nominal growth rates).

...

Update 10 December 2016: Which definition?

Steve Roth stops by in comments below to dispute my interpretation of the national income accounting identities. Let's consider two possible definitions of what we could mean by the "accounting identity" Y ≡ A + B. First a picture, and then some words:


  1. Y is defined as the sum of A and B which are independently defined.
  2. Y is independently defined. A is independently defined as a subset of Y. B is then "not A" and Y = A + Y\A (A plus the Y-complement of A).
In the second definition, B is the residual (what is left over after defining A), and there may be things in B (subsets of B) that are not (loosely speaking) well-defined except in terms of Y and A.

Now let's say (in the following, by consumption, I mean private consumption, and everything is relative to the "current year" and for "final use"):

Y = current-year produced goods and services
A = current-year produced goods and services that are consumed (C)
B = current-year produced goods and services that aren't consumed (I)

It seems obvious to me that we are looking at the second definition when it comes to the national income accounting identities Y ≡ C + I and Y ≡ C + S (ignoring government and exports, but including them just creates three well-defined A's above, with I and S still remaining residuals).

It is critical to note that neither definition has any implications for the relative dynamics of Y, A, or B. It's a definition, not a model (or, as Steve puts it, a counterfactual). More on this here.

We can adapt the old example of shoes and socks used to discuss the axiom of choice in math to help intuition here.

The first definition could be A = left shoes and B = right shoes and Y = shoes. And that works for the other definition as well because the complement of right shoes in all shoes are the left shoes.

However if Y is instead the set of things you wear on your feet, then the Y-complement of right footwear includes not only left shoes but also socks, such as tube socks, that are neither left nor right. That is to say the second picture is the one we should have in our head, not the first.
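The distinction is easy to state as a set operation. Here's a tiny sketch (with obviously hypothetical items):

```python
# Definition 2 as a set operation: Y is defined, A is a defined subset of Y,
# and B is just the Y-complement of A. The complement can contain things
# (tube socks) that aren't the "opposite" of anything in A.
Y = {"left shoe", "right shoe", "tube sock"}   # things you wear on your feet
A = {"right shoe"}                              # right footwear
B = Y - A                                       # everything else (the residual)

print(B)  # {'left shoe', 'tube sock'} -- not just left shoes
assert A | B == Y and not (A & B)               # A plus Y\A partitions Y
```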

When we look at Y = C + I (ignoring government and exports for simplicity), we have a situation where Y is defined and C is defined (Y that is consumed) leaving I to be the residual (Y that is not consumed, which could be anything and which economists confusingly dub "investment" since most normal people consider buying stocks or bonds "investing" rather than e.g. toilet paper bought for company bathrooms).

Steve additionally says that I isn't a residual, but S is (in Y = C + S), which is problematic for maintaining total income equal to total expenditure.

In our shoes and socks example, we have footwear sold and footwear bought. These should be the same.

However in one case Steve says the definition Y = C + I is more like definition 1 rather than definition 2 while the other (Y = C + S) is more like definition 2 than definition 1. This means in one case (items sold) we could potentially have things in the complement that aren't the "opposite" (e.g. the tube socks) that don't appear in the definition of the other (items bought).

We'd fail to maintain the income/expenditure identity. That is to say

C + S = Y = C + I + X

where X is unknown. To be explicit with the footwear example, we have:
Left shoes bought + not left shoes bought = footwear = left shoes sold + right shoes sold + other stuff sold (tube socks)
But previously we said as a matter of definition Y ≡ C + I. So X must be zero (i.e. the empty set).

Essentially, Steve must be changing the definitions of savings and investment to be different from economists' definitions. That's fine, but he can't then call me out for being wrong for not going along with his definitions. In those cases we usually say something like "I don't like those definitions" ‒ and I agree! That was the point of my post above. I think my post gives a far more well-defined way to proceed. We can even address some of those dynamics questions by linking the definitions of our partitions of Y to partitions that have sensible dynamics. Keynesians like to think that Y = C + I + G means that boosting G boosts Y, and if G is a correlated unit (as represented in the post above by the big blue box), then that is probably true. And if I is a residual, then there's really no telling what happens to I when G changes (∂I/∂G is not just model dependent, but not even well-defined).

The key issue is that when you define something by what it is not, you have to be careful about what you've put in your set.

Wednesday, March 30, 2016

Information equilibrium on the Wicksellian roundabout


In an earlier post, I showed the results of a traffic jam on the Wicksellian roundabout. Now, I'm going to continue from that point and try to describe the results in terms of information equilibrium.


Let's say I have $n_{N}$ cells and $n_{M}$ dots (motes?) as pictured above. Each cell represents a quantity demanded for motes. In equilibrium, the motes are uniformly distributed among the cells -- the most likely state. If we are moved away from equilibrium (by the traffic jam), the equilibrating force is the entropic force restoring this most probable state. That is to say random moves will eventually restore this state.

Let's take $n_{X} \; dX = X$ where $dX$ measures the size of an individual element of $X$. Irving Fisher would have written down this equation:

$$
\frac{N}{dN} = \frac{M}{dM}
$$

We can rearrange and generalize (adding a constant $k$, the information transfer index, and defining the abstract price $p$):

$$
\text{(1) }\; p \equiv \frac{dN}{dM} = k \; \frac{N}{M}
$$

with general solution:

$$
\frac{N}{N_{ref}} = \left( \frac{M}{M_{ref}} \right)^{k}
$$
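To make the intermediate step explicit, equation (1) separates as

$$
\frac{dN}{N} = k \; \frac{dM}{M} \;\; \Rightarrow \;\; \log \frac{N}{N_{ref}} = k \; \log \frac{M}{M_{ref}}
$$

and exponentiating both sides gives the power law above.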

Let's keep the number of motes constant, $M = M_{ref}$ (as is the case in the simulation).

$$
\begin{align}
p & \equiv \frac{dN}{dM} = k \frac{N_{ref}}{M_{ref}} \left( \frac{M}{M_{ref}} \right)^{k - 1}\\
& =  k \frac{N_{ref}}{M_{ref}} \left( \frac{M_{ref}}{M_{ref}} \right)^{k - 1}\\
& = k \frac{N_{ref}}{M_{ref}} \sim \frac{1}{\rho_{ref}}
\end{align}
$$

where $p$ is the abstract price and $\rho$ is the density of motes per cell. In information equilibrium, the price is constant if we don't change $N$ or $M$ (demand for/supply of motes).

The "traffic jam" is an excess (non-equilibrium) demand for motes in that cell (or being ok with an excess supply of motes). Note that this means the traffic jam is a case of non-ideal information transfer. The information revealed by selecting some number of cells is not equal to the information revealed by selecting some number of motes (accounting for the proportionality constant $k$) when the distribution of motes is uneven.

We can account for this by looking at the information

$$
I(P_{N}(X)) \geq I(P_{M}(X))
$$

Define $0 < \alpha < 1$ such that

$$
\alpha \; I(P_{N}(X)) = I(P_{M}(X))
$$

where $P_{A}$ is the probability distribution of $A$ from which the random variable is drawn. Re-deriving equation (1) leads us to

$$
\text{(2) }\; p \equiv \frac{dN}{dM} = \alpha \; k \; \frac{N}{M} = \alpha \; p^{*}
$$

where $p^{*}$ is the ideal (information equilibrium) price. So what does the price look like in our scenario?


We can see the price drops at the onset of the traffic jam and then steadily returns to equilibrium. [Ed. note: see update below. This is one possibility. More likely this should be viewed as $N/N^{*}$]

What is this supposed to represent in the real world? We have $M$ being the money supply (don't care which measure at this level of detail -- motes = bank notes), and $N$ being nominal output. We can show that a fall in the price level leads to a fall in nominal output since

$$
N^{*} = (1/k) \; M \; p^{*}
$$

where $N^{*}$ is ideal (information equilibrium) output, so that non-ideal (measured) output is

$$
N = \alpha N^{*}
$$

Since we were looking at the situation where the number of cells (and therefore information equilibrium output) was constant, this means the fall in nominal output looks just like the fall in the price level.

However, in this particular formulation, real output is unchanged since

$$
\frac{N}{p} = \frac{\alpha \; N^{*}}{\alpha \; p^{*}} =  \frac{N^{*}}{p^{*}}
$$

PS I have to think about this some more to make sure I have this right. Do we apply $\alpha$ entirely to the ideal price $p^{*}$, or is it applied to the ideal output $N^{*}$, where some goes toward output and some goes toward the price level? In the latter case, there would be a real effect because

$$
\frac{N}{p} = \frac{\alpha \; N^{*}}{\alpha_{p} \; p^{*}}
$$

where $\alpha = \alpha_{r} \alpha_{p}$.

PPS There's also this:






...

Update 31 March 2016

In working through the equations, I have now convinced myself that $\alpha$ modifies ideal (information equilibrium) nominal output (not the price). It modifies the price alone if

$$
\frac{d \alpha}{dM} = 0
$$

In this case, a traffic jam doesn't result in a change in real output, only nominal. But generally $\alpha$ depends on $M$, so

$$
\begin{align}
p \equiv \frac{dN}{dM} & = \alpha \frac{dN^{*}}{dM} + \frac{d\alpha}{dM} N^{*}\\
& = \alpha p^{*} + \frac{d\alpha}{dM} N^{*}\\
& = p^{*} \left(\alpha + \frac{M}{k} \; \frac{d\alpha}{dM} \right)\\
p & = \alpha' \; p^{*}
\end{align}
$$

Which means real output would be affected. I trudged through the algebra and came up with this (but I don't trust it, n.b. this is not a call for someone to fix it -- I wouldn't trust your answer either):

$$
\alpha' = \alpha - \frac{\alpha}{k} + \frac{1}{\log \sigma_{M}}
$$

The last term is a finite system effect. And for a large system and $k = 1$, the impact of a traffic jam is purely real (i.e. not nominal). For the quantity theory of money ($k = 2$) and a large system, a traffic jam is half real and half price level. 

Tuesday, March 29, 2016

Coordination and complexity

The answer to everything on Twitter today seems to be "Gary Becker's irrational agents", but one conversation with @unlearningecon has led me to put down my thoughts on "random" agents.

I touched on my thinking in this dialog post:
Algorithmic complexity is loosely defined as the length of a computer program required to produce a string of output. The behaviors of those people in the economy, they can be represented by a string of transactions. You are saying the program required to produce that string is almost as long as the string itself. ... if the program was as long as the string, then you'd have something that is algorithmically random. ... very complex behavior can be thought of as random ... If you consider a person as an information source for that string of transaction information, the complexity of that string, as you look for a longer and longer time, approaches the information entropy of that string.
I will continue to use the string of transactions construct in the following.

@unlearningecon suggested that there are complex coordinations (like fashion, advertising) that would move us away from the maximum entropy requirement for Gary Becker's irrational agent demand curve to work out. And that is true!

Even if behavior was very complex, we'd imagine that showing agents advertising for Macbooks could impact their string of transactions. Even if that meant 1% more purchased a Macbook, that's enough of a coordination to be problematic for the maximum entropy argument.

That assumes we're showing the advertising when every agent can conceive of a need for a Macbook and has the money to purchase it. Thus, the effect of advertising on a complex agent becomes not just dependent on the complexity of the program that produces the string of transactions as output, but on the history of output (Have I already purchased a Macbook? Did I buy something else with the money?). The consumption baskets are also different for different aged people and for different incomes.

But then there's things like this:


How does the random agent approximation to a complex agent survive an onslaught of Mac startup chimes?

It doesn't.

Seriously, this kind of thing results in a permanently depressed economy relative to one with more diversity. Be glad the economy isn't just Macbooks.

But that's what saves the Becker model. Changes into and out of states like the one pictured matter, but once you've accepted non-ideal information transfer/coordination, i.e.

I(S) = α I(D)

with α < 1, then we're just talking about relative information. Are there more Macbooks than usual? Plus this kind of thing happens in many markets. In fact, this kind of fall in diversity happened in one particularly important market a long time ago.

Once you can predict a college student has a Macbook, the event of a purchase of a Macbook by a college student (a demand event meeting a supply event) ceases to carry any information. A light that is always on doesn't communicate anything. The market ceases to be a channel for information transfer.
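To put a number on that: the information carried by observing an event of probability p is −log₂ p bits, which goes to zero as the event becomes certain. A quick sketch (the probabilities are made up):

```python
import numpy as np

# The information carried by an event with probability p is -log2(p) bits.
# A transaction you can predict with near-certainty carries almost none.
for p in (0.5, 0.9, 0.99, 0.999999):
    print(f"P(college student has a Macbook) = {p}: {-np.log2(p):.6f} bits")

# A "light that is always on" is the p -> 1 limit: zero bits per observation,
# so the market state stops transferring information.
```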

But the ultimate arbiter of whether we can say complex human agents are effectively random agents is whether the approximation leads to a useful (empirically successful) effective theory. In the case of macro, it seems to -- at least as a first order approximation. In the case of prediction markets, it also seems to work -- again at first order.

But we need a first order framework to even start to try and understand the complicated higher order effects like every college student buying a Macbook.

...

Update 30 March 2016

One could imagine the collection of Macbooks as a filled Fermi sphere -- all of the action takes place at the Fermi surface where some people decide to get a Macbook, a Dell or not have a laptop.

I am tempted to say a similar thing happens in elections, but there the level matters -- turnout is as important as swing voters, if not more (in the US).

Traffic model on the Wicksellian roundabout

Nick Rowe had a post about the "Wicksellian roundabout" almost exactly 2 years ago where he wanted to model an economy as money flowing around from one person to another. I took it up here. He mentioned a Japanese video that he couldn't remember a link to -- here's that video:


Nick characterized it as one car slowing down ("What happens if one car slows down temporarily?"), however in the mathematical model (of "jamitons") there is no real "cause":
However, above a critical threshold density (that depends on the model parameters) the flow becomes unstable, and small perturbations amplify. This phenomenon is typically addressed as a model for phantom traffic jams, i.e. jams that arise in the absence of any obstacles. The instabilities are observed to grow into traveling waves, which are local peaks of high traffic density, although the average traffic density is still moderate (the highway is not fully congested). Vehicles are forced to brake when they run into such waves. In analogy to other traveling waves, so called solitons, we call such traveling traffic waves jamitons.
That is to say a microscopic slowdown in one of the cars is amplified into a travelling solitary wave -- a jamiton.

This is mostly just background information. In the rest of this post (and in subsequent posts as this is a work in progress), I'm going to apply the information transfer framework to traffic where there is a cause -- a slowdown in one cell of the Wicksellian roundabout. I've used the traffic model before as an analogy for the economic information transfer model (here, here and here). Here's a 20-cell roundabout with a temporary obstacle in one cell (modeled as a decrease in probability of leaving that cell). I plot the density in that cell as well as the density in all of the cells versus time:



You can see the jam creates a travelling wave, although this one appears to propagate forward before dissipating in the normal density fluctuations. If we look at a single time frame during the jam, you can see the loss in entropy (maximum entropy occurs when the "cars" are uniformly distributed across the cells):


Information equilibrium (in economics, general equilibrium) is restored as entropy returns to its maximum over time.
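For anyone who wants to reproduce the qualitative behavior, here is a minimal sketch of the kind of simulation described above (my own parameter choices, not the ones behind the original figures):

```python
import numpy as np

rng = np.random.default_rng(0)

# n_cells cells arranged in a ring, n_motes "cars"; each step every car tries
# to hop forward one cell with some probability. A temporary obstacle is
# modeled as a reduced probability of leaving one cell for a window of time.
n_cells, n_motes, n_steps = 20, 100, 500
p_move, p_jam = 0.5, 0.05            # normal vs. obstructed exit probability
jam_cell, jam_window = 0, range(100, 200)

position = rng.integers(0, n_cells, size=n_motes)
density_history = []

for t in range(n_steps):
    for i in range(n_motes):
        p = p_jam if (position[i] == jam_cell and t in jam_window) else p_move
        if rng.random() < p:
            position[i] = (position[i] + 1) % n_cells
    counts = np.bincount(position, minlength=n_cells)
    density_history.append(counts / n_motes)

# Entropy of the spatial distribution: maximal when the cars are uniform,
# dipping during the jam and recovering afterwards.
rho = np.array(density_history) + 1e-12
entropy = -(rho * np.log(rho)).sum(axis=1)
print("min entropy during run:", entropy.min(), " max entropy:", entropy.max())
```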

Saturday, March 26, 2016

Effective information equilibrium theory


While the original information equilibrium condition was derived using a uniform distribution, it led to an equation that contains a conformal symmetry:

$$
\frac{dA}{dB}=k \frac{A}{B}
$$

such that if $A \rightarrow \alpha A$ and $B \rightarrow \beta B$, we get the original equation back. If you take the effective field theory approach (which I mention in the paper), you would write down all the possible terms on the right hand side that are consistent with the symmetry to first (or lower) order in the process variables. That only adds a constant term:

$$
\frac{dA}{dB}= c + k \frac{A}{B}
$$

This is one of the earliest differential equations to ever be solved (in 1694 by John Bernoulli) and the solution is:

$$
\frac{A}{A_{0}} = \frac{c}{1-k}\frac{B_{0}}{A_{0}} \left( \frac{B}{B_{0}} \right) + \left( \frac{B}{B_{0}} \right)^k
$$
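As a quick sanity check (mine, with arbitrary parameter values), you can verify numerically that this expression satisfies $dA/dB = c + k \; A/B$:

```python
import numpy as np

# Numerical check that the solution above satisfies dA/dB = c + k*A/B.
c, k, A0, B0 = -0.5, 2.0, 1.0, 1.0   # arbitrary illustrative values

def A(B):
    return A0 * ((c / (1 - k)) * (B0 / A0) * (B / B0) + (B / B0) ** k)

B = np.linspace(0.1, 5.0, 1000)
dA_dB = np.gradient(A(B), B)                    # numerical derivative
residual = dA_dB - (c + k * A(B) / B)
print("max |residual|:", np.abs(residual).max())  # ~0 up to finite differences
```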

The big change is the addition of a linear component -- so that for small values of $B$ (with $k > 1$), we no longer have $A \sim B^k$, but $A \sim B$. The limit for $B >> B_{0}$ is the same as the original solution (i.e. $A \sim B^k$). This could have potential application to e.g. the minimum wage or minimum costs of production. Here are graphs depending on whether c is positive or negative:


The price ($p = dA/dB$) shows a similar behavior:




Second order?

Going to second order in the conformal symmetry adds several terms:

$$
\frac{dA}{dB}= k_{00} + k_{10} \frac{A}{B} + k_{20} \left( \frac{A}{B} \right)^2 + k_{11} \frac{A}{B} \frac{dA}{dB} + k_{02} A \frac{d^{2}A}{dB^{2}}
$$

Mathematica can (implicitly) solve this if you take $k_{02} = 0$ (making it a first order nonlinear differential equation) -- it's a mess. However, the general form is a nonlinear second order differential equation that requires numerical solution. Including just the additional $k_{20}$ term is solvable in closed form (but also a mess). Numerically, it affects the behavior more when $A/A_{0} >> B/B_{0}$:
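With $k_{02} = 0$ you can also just integrate it numerically by rearranging for $dA/dB$. A sketch with placeholder coefficients (not the values used for the figures here):

```python
import numpy as np
from scipy.integrate import solve_ivp

# With k02 = 0 the equation above is a first-order nonlinear ODE. Rearranging:
# dA/dB = (k00 + k10*r + k20*r**2) / (1 - k11*r) with r = A/B.
# Coefficient values are arbitrary placeholders chosen to avoid singularities.
k00, k10, k20, k11 = 0.1, 0.7, 0.05, -0.1

def rhs(B, A):
    r = A / B
    return (k00 + k10 * r + k20 * r**2) / (1.0 - k11 * r)

A0, B0 = 1.0, 1.0
sol = solve_ivp(rhs, t_span=(B0, 20.0), y0=[A0], dense_output=True, rtol=1e-8)

B = np.linspace(B0, 20.0, 50)
print(sol.sol(B)[0])   # A(B) along the trajectory
```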

The price shows similar behavior:

...

Update 29 March 2016

I thought I'd combine the first and second order result, with both using negative coefficients. This gives a result that is relatively flatter near the endpoints and relatively steeper in the middle:


Friday, March 25, 2016

Interest rate and monetary base updates

I've updated some graphs regarding interest rates and the monetary base prediction (see the prediction link for more):



With the second one, we can be sure something has changed, but the drift towards the level marked "C" seems slow. Of course, the drift up to the level "0" took a little over a full year.

...

Update 28 March 2016

Also, this graph:


Bernanke endorses policy recommendation of information equilibrium model

Ben Bernanke. Image from Wikimedia Commons.

Ben Bernanke is coming around to the information equilibrium model in his latest post. Ok; just kidding. But an interest rate peg seems to be the way inflation gets generated when you're stuck with low inflation. See here:

Gold was irrelevant

Will the UK be the first to exit the Great Recession?

Does a liquidity trap ever end?


...

Update


I also found this footnote fascinating (not to say the whole post isn't good):
[4] So how did the pre-1951 Fed succeed in capping the rates on very long-term bonds? It’s a bit of a puzzle. Probably wartime controls and limited liquidity in the bond markets helped. It may also be that, for most of the rate-targeting period, investors were comfortable that inflation and short-term rates would stay low indefinitely and thus had no reason to challenge the peg (see Eichengreen and Garber). By late 1947, with inflation rising and wartime controls eliminated, the peg was coming under pressure, and the Fed had to buy a significant quantity of long-term securities to keep the peg in place (Eichengreen-Garber, p. 182).

Wednesday, March 23, 2016

Some information transfer model basics

I do this kind of thing from time to time to make sure I have everything straight in my own head. There's no reason not to share it with you.






Information transfer as a common language for economics, part II

On my previous post, Cameron Murray linked me to something else he's written on how to teach pluralism in economics. In it he has a chart comparing different schools of economics. I'd like to first answer the chart's questions for the IT framework. Then I'll come back and explain how the IT framework is a bit more general than these schools and you can represent each within it.

First, the IT framework answers to the chart questions:
The economy is made up of ... 
Information
(e.g. the information entropy of the state space of potential allocations)
Information transfer channels
(i.e. markets) 
Individuals are ... 
Complex
Potential for rationality to be emergent
The mechanism by which the economy explores the state space 
The world is ... 
Certain if markets are large and ideal
Uncertain if markets are small and/or non-ideal/poorly designed
(i.e. do not transfer information) 
The most important domain of the economy is ... 
Model dependent 
Economies change through ... 
Exploration of and invention of new areas of state space 
Policy recommendations ... 
Model dependent, but generally:
Subsidize/allow individual exploration of state space
(e.g. minimum income, unemployment insurance, school, health insurance ... or could be interpreted as less regulation letting individuals explore state space)
Markets contain potential for "bad coordination" (panics)
(i.e. need government to mitigate using "good coordination")
Now here are the ways the IT framework can encompass the various schools ...
Classical: Essentially ideal information transfer in large markets. Could use the emergent rationality above. Interference in the market creates bad coordinations. 
Neoclassical: Essentially ideal information transfer in mesoscale markets (so that there is rare non-ideal information transfer, but potentially large fluctuations/uncertainty due to smaller markets). Could use the emergent rationality above. Interference in the market creates bad coordinations or good coordinations.
Marxist: Marxist economics is basically classical economics, but with a different view of the market solution to the allocation problem. The market allocation is bad, so replace it with a different algorithm.
Developmentalist: This school just focuses on the exploration of the state space (development). Allows for government interventions to create good coordinations. 
Austrian: Essentially ideal information transfer in mesoscale markets (so that there are potentially large fluctuations/uncertainty due to smaller markets). Could use the emergent rationality above. Interference in the market creates bad coordinations.
Schumpeterian: Non-rational entrepreneurship is one way to describe state space exploration. Complex world means that IT framework can be used to construct an approximation of the economy. 
Keynesian: Information transfer (ideal and non-ideal) in mesoscale markets (so that there are potentially large fluctuations/uncertainty due to smaller markets). Could use the emergent rationality above as an approximation in equilibrium. Interference in the market necessary to create good coordinations.
Institutionalist: Institutions set up the specific information transfer channels. Institutions can affect ideal or non-ideal information transfer as well as good and bad coordinations. 
Behavioralist: The emergent rationality is an approximation, but in most interesting cases it does not apply. Non-ideal information transfer dominates because of human behavior. The IT framework is more a way to show how far we are from rationality by illustrating the ideal case.
...


I also wanted to reproduce part of my reply to Cameron because I think it helps describe the IT framework:
FWIW, in the IT framework, those variables [in the models] represent probability distributions over some domain ... e.g. space, time. You have equilibrium when e.g. the spatial probability distribution of 'demand events' for X is equal to the spatial probability distribution of 'supply events' of X -- and there are a large number of units of X (so that the probability distribution comes close to being the actual distribution). In that case, the information required to construct both distributions is equal I(Pd(X)) = I(Ps(X))
Since the whole thing simplifies if you talk about uniform distributions, you can think of e.g. NGDP as the total number of 'demand events' (measured in e.g. dollars). When uniformly distributed (not true, but works to leading order), the information in a string of 'demand events' drawn from that distribution is just proportional to the number of events ... I(Pd(D)) ~ NGDP
However, since this definition is pretty malleable, it could easily fit distributions of property rights, accounts, etc. you discuss in your follow-up. 
For example, I've used it for the distribution of price states in a time series. The distribution over nominal interest rate states is equal to the distribution over "the price of money", which is proportional to velocity in this very basic model. 
The IT framework is just one example of a possible way to address this issue, but I agree that more needs to be done in this regard.
Also, in my comment I used the term demand event, which I think is much better than demand widget. When a demand event and a supply event meet, you have a transaction event.

Tuesday, March 22, 2016

Information equilibrium: a common language for multiple schools


Cameron Murray has a great post about the challenge of reforming economics in which he points out two challenges: social and technical. The social challenge is that different "schools" are tribal, and reconciliation isn't rewarded. Just read Murray on this.

The second challenge is something that I have tried to work towards answering:

[H]ow do you teach a pluralist program when there is no recognised structure for presenting content from many schools of thought, which can often be contradictory, and when very few academics are themselves sufficiently trained to do so?
...
What is needed is a way to structure the exploration of economic analysis by arranging around economic problems around some core domains. Approaches from various schools of thought can be brought into the analysis where appropriate, with the common ground and links between them highlighted.
Despite being completely out of the mainstream, the information equilibrium framework does not have to subscribe to a specific school of economic thought. I actually thought this is what you were supposed to mean by framework (other economists disagree and include model-specific results in what they call frameworks). In fact, I defined framework by something that is not model specific:
One way to understand what a framework is is to ask whether the world could behave in a different way in your framework ... Can you build a market monetarist model in your framework? It doesn't have to be empirically accurate (the IT framework version is terrible), but you should be able to at least formulate it. If the answer is no, then you don't have a framework -- you have a set of priors.
This is what pushed me to try and formulate the MMT and Post-Keynesian (PK) models that use "Stock Flow Consistent" (SFC) analysis as information equilibrium models. The fact that my criticism of an aspect of SFC analysis upset the MMT and PK tribes (see the post and comments) led me not to post the work I'd done.

But in the interest of completeness and showing that the information equilibrium framework allows you to talk about completely different schools of economics with the same language, let me show the model SIM from Godley and Lavoie as an information equilibrium model.

SFC models as an information equilibrium model

First, divide through by $Y$ (this represents an overall scale invariance), so all the variables below are fractions of total output (I didn't change the variable names, though, because it would get confusing).

Define the variable $B$ to be government spending minus taxes.

$$
B \equiv G - T
$$

Define $x$ to be a vector of consumption, the variable $B$, taxes, disposable income and high powered money:

$$
\begin{bmatrix}
C \\
B \\
T \\
Y_{D} \\
H
\end{bmatrix}
$$

Define the matrix $A$ to be

$$
\begin{bmatrix}
1 & 1 & 1 & 0 & 0 \\
1 & 1 & 0 & -1 & 0 \\
0 & 0 & 1 & 0 & 0 \\
1 & 0 & 0 & -\alpha_{1} & -\alpha_{2} \\
0 & 0 & 1 & 1 & 0
\end{bmatrix}
$$

Define the vector $b$ to be

$$
\begin{bmatrix}
-1 \\
0 \\
-\theta \\
0 \\
-1
\end{bmatrix}
$$

The SFC model SIM from Godley and Lavoie is then

$$
A x + b = 0
$$

$$
H \rightleftarrows Y_{D}
$$

with [Ed. note: I originally got my notes confused because I wrote $Y_{D}$ as $D$ through part of them and $B$ instead of the $Y_{D}$ I use here, so left off the following equation]

$$
B \equiv \int_{\Gamma} dY_{D}
$$

where the second equation is an information equilibrium relationship [and the third is a path integral; in the model SIM, they take $\Gamma$ to effectively be a time step]. The issue that I noticed (and that upset the SFC community) is that it's assumed that the information transfer index is 1, so that instead of:

$$
H \sim Y_{D}^{k}
$$

You just have

$$
H \sim Y_{D}
$$

and the velocity of high powered money is equal to 1. Also, there is no partial equilibrium -- only general equilibrium so you never have high powered money that isn't in correspondence with debt (or actually in the SFC model, exactly equal to debt).

Even with this assumption, however, the model can still be interpreted as an information equilibrium model. There is supply and demand for government debt that acts as money. This money is divided up to fund various measures e.g. consumption.
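A quick structural check (mine, with placeholder parameter values) makes the point explicit: the accounting relations $A x + b = 0$ alone are rank-deficient, which is consistent with needing the additional information equilibrium relationship $H \rightleftarrows Y_{D}$ (or the dynamics in Godley and Lavoie's SIM) to close the model.

```python
import numpy as np

# Accounting system A x + b = 0 from above, x = [C, B, T, Y_D, H].
# Parameter values are placeholders for illustration only.
alpha1, alpha2, theta = 0.6, 0.4, 0.2

A = np.array([
    [1, 1, 1,       0,       0],
    [1, 1, 0,      -1,       0],
    [0, 0, 1,       0,       0],
    [1, 0, 0, -alpha1, -alpha2],
    [0, 0, 1,       1,       0],
], dtype=float)
b = np.array([-1, 0, -theta, 0, -1], dtype=float)

print("rank of A:", np.linalg.matrix_rank(A))   # 4 < 5: one relation missing
# One consistent (minimum-norm) solution of A x = -b; it is not unique.
x, *_ = np.linalg.lstsq(A, -b, rcond=None)
print("one consistent solution:", x)
```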

Market monetarism as an information equilibrium model

Over time, I have attempted to put the various models Scott Sumner writes down into the information equilibrium framework. The first three are described better here.

1) u : NGDP ⇄ W/H

The variable u is the unemployment rate. H is total hours worked and W is total nominal wages.

2) (W/H)/(PY/L) ⇄ u

PY is nominal output (P is the price level), L is the total number of people employed and u is the unemployment rate.

3) (1/P) : M/P ⇄ M

where M is the money supply. This may look a bit weird, but it could potentially work if Sumner didn't insist on an information transfer index k = 1 (if k is not 1, that opens the door to a liquidity trap, however). As it is, it predicts that the price level is constant in general equilibrium and unexpected growth shocks are deflationary in the short run.


This 4th one is described here.

4) V : NGDP ⇄ MB and i ⇄ V

where V is velocity, MB is the monetary base and i is the nominal interest rate. So that in general equilibrium we have:

V = k NGDP/MB

log i = α log V

Or more compactly:

log i = α log NGDP/MB + β
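If you wanted to estimate α and β, that compact form is an ordinary least-squares fit of log i against log NGDP/MB. A sketch with synthetic data standing in for the actual series:

```python
import numpy as np

# Estimating log i = alpha * log(NGDP/MB) + beta. The arrays here are
# synthetic stand-ins; in practice you'd use the measured nominal rate,
# NGDP, and monetary base series.
rng = np.random.default_rng(2)
ngdp_over_mb = np.exp(rng.normal(2.0, 0.5, size=100))            # synthetic
log_i = 0.3 * np.log(ngdp_over_mb) - 1.0 + rng.normal(0, 0.05, 100)

X = np.column_stack([np.log(ngdp_over_mb), np.ones_like(log_i)])
(alpha, beta), *_ = np.linalg.lstsq(X, log_i, rcond=None)
print(f"alpha = {alpha:.3f}, beta = {beta:.3f}")
```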

More models!


More mainstream Keynesian and other models all appear here or in my paper. Here's a model that is based on the Solow model. However, I think showing how the framework can illustrate both Market Monetarism and Post Keynesianism using the same tools gives an idea of how useful it is.

I can even put John Cochrane's asset pricing equation approach in the framework!

The interesting part is that it lays bare some assumptions (e.g. that the IS-LM model is an AD-AS model with low inflation).

And despite my protests, expectations can be included. It just involves looking at the model temporally rather than instantaneously.

Monday, March 21, 2016

An RLC circuit with R = S and L = F


An RLC circuit is a simple electric circuit with a resistor, inductor and capacitor in it -- with resistance R, inductance L and capacitance C, respectively. It's one of the simplest circuits that displays non-trivial behavior.

You can derive an equation for the behavior by using Kirchhoff's laws (conservation of the stocks and flows of electrons) and the properties of the circuit elements. Wikipedia does a fine job.

You arrive at a solution for the current as a function of time that looks generically like this (not the most general solution, but a solution):

$$
i(t) = A e^{\left( -\alpha + \sqrt{\alpha^{2} - \omega^{2}} \right) t}
$$

with $\alpha = R/(2L)$ and $\omega = 1/\sqrt{L C}$. If you fill in some numbers for these parameters, you can get all kinds of behavior:


As you can tell from that diagram, the Kirchhoff conservation laws don't in any way nail down the behavior of the circuit. The values you choose for R, L and C do. You could have a slowly decaying current or a quickly oscillating one. It depends on R, L and C.
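To see how much the parameter choices matter, here's a minimal sketch evaluating the solution above for a couple of (R, L, C) choices (the values are arbitrary):

```python
import numpy as np

# The generic solution above, evaluated for different (R, L, C) choices to
# show how different the behavior can be even though Kirchhoff's laws are
# obeyed in every case.
def current(t, R, L, C, A=1.0):
    alpha = R / (2 * L)
    omega = 1 / np.sqrt(L * C)
    # complex sqrt handles both the overdamped and oscillatory cases
    s = -alpha + np.sqrt(complex(alpha**2 - omega**2))
    return np.real(A * np.exp(s * t))

t = np.linspace(0, 10, 5)
print("overdamped (slow decay):   ", current(t, R=5.0, L=1.0, C=1.0))
print("underdamped (oscillating): ", current(t, R=0.2, L=1.0, C=1.0))
```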

Now you may wonder why I am talking about this on an economics blog. Well, Cullen Roche implicitly asked a question:
Although [stock flow consistent models are] widely used in the Fed and on Wall Street it hasn’t made much impact on more mainstream academic economic modeling techniques for reasons I don’t fully know.
The reason is that the content of stock flow consistent modeling is identical to Kirchhoff's laws. Currents are flows of electrons (flows of money); voltages are stocks of electrons (stocks of money).

Kirchhoff's laws do not in any way nail down the behavior of an RLC circuit.

SFC models do not nail down the behavior of the economy.

If you asked what the impact of some policy was and I gave you the graph above, you'd probably never ask again.

What SFC models do in order to hide the fact that anything could result from an SFC model is effectively assume R = L = C = 1, which gives you this:


I'm sure to get objections to this. There might even be legitimate objections. But I ask of any would-be objector:
How is accounting for money different from accounting for electrons?
Before saying this circuit model is in continuous time, note that there are circuits with clock cycles -- in particular the device you are currently reading this post with.

I can't for the life of me think of any objection, and I showed exactly this problem with an SFC model from Godley and Lavoie:


But to answer Cullen's implicit question -- as the two Mathematica notebooks above show, SFC models don't specify the behavior of an economy without assuming R = L = C = 1 ... that is to say Γ = 1.

Update:

Nick Rowe is generally better than me at these things.

Empirical economics, a denouement

“Our basic state of knowledge in economics is way below where you would think it was,” he says, adding that “the thing that annoys noneconomists about economists is their unbelievable certainty that they know what they are talking about, when the actual reality is they do not really know.”
That's from a nice article at the IMF on David Card, the economist behind showing via a natural experiment that raising the minimum wage doesn't necessarily increase unemployment.

Waiting for a philosophy of economics

Noah Smith: One of the issues with economics is that there are ten million theories with no way to choose between them.

Jason Smith: I'd agree -- I think economics needs a framework to reject theories.

John Handley: So you're a Popperian?

Jason Smith: Not really. That falsification stuff is good in principle, but Popper believed general relativity falsified Newtonian gravity, and "falsified" is not a word I would use there.

John Handley: But general relativity is the current fundamental theory of gravity, so Newton is false, right?

Jason Smith: Newtonian gravity is an effective theory of gravity for small field strengths. It gets most spacecraft where they're going. One of the few practical applications of general relativity is a small correction to GPS results due to satellites being a bit farther out in the gravitational field, changing the speed of their clocks relative to one on the ground.

John Handley: So it's not false?

Jason Smith: You could say it's false on galactic scales and near black holes. But really, a physicist would just say that's outside the theory's domain of validity. Relativity represented a paradigm shift in how we looked at Newton.

John Handley: Paradigm shifts and theory rejection ... you're a postpositivist. Empirical evidence in favor of a hypothesis is irrelevant and falsification is all that matters.

Jason Smith: Maybe for the current state of economics, but that's not an absolute. In physics, what's needed are more experiments and more theories. Theory rejection isn't a problem in physics since there aren't ten million theories -- there's only one.

John Handley: So postpositivism is just your philosophy of economics ...

Jason Smith: I offer a lot of empirical evidence in favor of the information transfer models on my blog. Theory rejection is important, but so is theory validation. Economics seems to have a problem with rejection, not with validation -- all kinds of models are consistent with the data.

John Handley: You don't seem to have a proper philosophy of science.

Jason Smith: It seems like a lot of the philosophy of science is done by people who aren't scientists. Why do they take it up?

Alexander Nehamas: I will tell you why I became a philosopher. I became a philosopher because I wanted to be able to talk about many, many things, ideally with knowledge, but sometimes not quite the amount of knowledge that I would need if I were to be a specialist in them.

Jason Smith: That explains a lot.

John Handley: Then what is your philosophy of economics?

Jason Smith: I'd say it's more of a pragmatic empiricism.

John Handley: I don't think that word means what you think it means.

Jason Smith: Oh, no. I've stumbled into more philosophy. What word?

John Handley: Empiricism.

Jason Smith: Why?

John Handley: Empiricism is most closely associated with Hume and, while it serves as the backdrop for newer philosophy of science, is neither new nor worthy of attention when considering a new philosophy of economics. Empiricism is an epistemology that elevates experience over a priori deductions.

Jason Smith: No! That is exactly what I mean!

John Handley: You can't be serious.

Jason Smith: I'd update personal experience a bit for the information age, but you shouldn't believe anything in economics unless you've run the regressions yourself. In the age of FRED, all of the data is a few clicks away.

John Handley: But that's old-fashioned age of Enlightenment stuff.

Jason Smith: Yes, but that's where economics is! It's hard to see with all the sophisticated math developed for physics and the modern data aesthetics, but deep down economics is a field without a Newton.

Noah Smith: There are lots of empirical successes in economics.

Jason Smith: Yes, and there were lots of empirical successes in physics before Newton. That's what Newton did -- couched the empirical successes in a theoretical framework.

S: Couched.

Jason Smith: Couched.

S: Tomatoed. Are you done blogging?

Jason Smith: Almost, sweetie.

S: When you're done, can I play Goat Simulator?

Jason Smith: Of course .... And done.

...

References

http://informationtransfereconomics.blogspot.com/2016/01/falsifiability-isnt-empirical-validity.html

Update:

I forgot to add this somehow relevant picture (in my mind) from the Galileo museum ... Geocentrism had cooler models ...


Sunday, March 20, 2016

Economics doesn't need new theories; it needs to eliminate theories

David Sloan Wilson's latest article at Evonomics misunderstands the state of (macro)economics and Noah Smith's empirical tack. The key to understanding Smith is this sentence:
Ten million cool theories are of little use beyond the “gee whiz” factor if you can’t pick between them. Until recently, econ was fairly bad about agreeing on rigorous ways to test theories against reality, so paradigms came and went like fashions and fads.
... and this blog post. Smith is calling for what I will call negative empiricism: theory rejection. There's more from Smith here.

Wilson's response is:
By Smith’s own account, the field of economics is experiencing an empirical revolution. Unlike the past, it has become necessary to test theories against reality. That places the field of economics many decades behind the field of evolution and numerous fields in the human social sciences that have been rigorously evidence-based all along. 
Emphasis mine. You can tell that Wilson has misunderstood Smith by using the word "necessary". Wilson should have said: Unlike the past, it has become possible to test theories against reality. That was Noah's point; that's the issue. There are ten million theories because there was no way to reject them.

Now I agree with Wilson's call for a framework -- I've made it on this blog several times (e.g. here). In fact, I've made exactly the same kind of statement as Wilson. Here's me:
Also, I was inspired to do this because of Noah Smith's recent post on why macroeconomics doesn't seem to work very well. Put simply: there is limited empirical information to choose between alternatives. My plan is to produce an economic framework that captures at least a rich subset of the phenomena in a sufficiently rigorous way that it could be used to eliminate alternatives.
Here's Wilson:
The main reason that the so-called orthodox school of economics achieved its dominance is because it seemed to offer a grand unifying theoretical framework. Too bad that its assumptions were absurd and little effort was made to test its empirical predictions. Its failure does not change the fact that some unifying theoretical framework is required to prevent the “ten million cool theories” problem. What Nick Hanauer, Eric Liu, and I are saying (Smith misses that our arguments are cut from the same cloth) is that a combination of evolutionary theory and complexity theory offers the best prospect for a unifying theoretical framework.
But note the misunderstanding again (emphasis mine). It's not that little effort was made. The data was uninformative. It was not possible to reject theories. You literally could not tell if the assumptions were absurd as far as the macroeconomic outcomes were concerned. Maybe they were absurd at the micro scale, but there is no reason to believe that absurd micro assumptions necessarily result in an absurd macro theory. In fact, one of the ways macroeconomics might be tractable is if it doesn't matter what you assume about agents. Maybe the relatively simple traditional utility maximizing rational agent framework would work out.

For example, treating molecules as point particles with zero size and no internal structure (absurd) is perfectly acceptable if you have an ideal gas. If we had discovered atoms before the thermodynamics of gasses, and happened to follow Wilson's approach, we'd conclude that statistical mechanics is absurd. It treats atoms as if they have no electrons or nuclei! How silly!

But then because Wilson misunderstands the empirical problem in macroeconomics, he suggests his own (along with Hanauer and Liu's) even more complex framework!

Smith is skeptical because the data has only recently started to eliminate possibilities and he believes enlightenment will come from empirically testing theories -- not new theory fads:
To me, that seems like a much bigger deal than any new theory fad, because it offers us a chance to find enduringly reliable theories that won’t simply disappear when people get bored or political ideologies change.
Wilson's evolution as a framework (as well as Hanauer and Liu's list of assumptions) are exactly the new theory fads Smith is talking about.

...

Now you may ask how the information transfer framework fits in here. Well, the primary thing is that the information transfer framework is a much simpler framework than even the framework Wilson calls the "so-called orthodox school of economics". Because it is simpler, it can be rejected. The data becomes informative. It's not some new "more realistic" or "more complex" approach. It's a simplification. And it seems to work.