Noah Smith likes slides (as do some of the commenters and twitterers (?) ...). A downloadable pdf version is [here, pdf] (let me know if my google drive settings aren't appropriate); here are some (DRAFT) slide images. Here you go:
References:
Smith, Jason. Information equilibrium as an economic principle. arXiv:1510.02435 [q-fin.EC]
... and references therein
Smolin, Lee. Time and symmetry in models of economic markets. arXiv:0902.4274 [q-fin.GN]
Becker, Gary S. Irrational Behavior and Economic Theory. Journal of Political Economy Vol. 70, No. 1 (Feb., 1962), pp. 1-13
Chen, M. Keith and Lakshminarayanan, Venkat and Santos, Laurie, The Evolution of Our Preferences: Evidence from Capuchin Monkey Trading Behavior (June 2005). Cowles Foundation Discussion Paper No. 1524. Available at SSRN: http://ssrn.com/abstract=675503
Fielitz, Peter and Borchardt, Guenter. A general concept of natural information equilibrium: from the ideal gas law to the K-Trumpler effect. arXiv:0905.0610 [physics.gen-ph]
Fisher, Irving. Mathematical Investigations in the Theory of Value and Prices. (1892).
Shannon, Claude E. (July 1948). A Mathematical Theory of Communication. Bell System Technical Journal 27 (3): 379–423.
Okun, Arthur M. (1962). Potential GNP, its measurement and significance.
Since Noah Smith wants to see slides, I'm going to try and put some together (in part for my talk in June). Here's an animation of when an information equilibrium description of economics would apply for agents that have a choice of two goods with the same price and a budget constraint:
And here's what it could look like if behavioral economics got involved. These could represent a recession (left) or a shift in preferences (right) from both goods to a single good (e.g. Betamax vs VHS):
... Then we stumbled on something very new. After 25 years of doing this [split-brain] test we finally asked a different question [of an experimental subject]. ‘Why did you do that?’
It is the inventive interpretive mind first applying itself to our personal life and then to our social existence that is our core skill. Once humankind realized it possessed this technology, we seized on it to thrive in and control our niche on earth.
At that moment, his left brain was immediately confronted with a puzzle. Again, it knows why the right hand pointed to the chicken but why did the left hand point to the shovel? So, on the spot the left brain said, ‘Oh, the chicken claw goes with the chicken, and you need a shovel to clean out the chicken shed.’
That one simple observation, now repeated dozens of times on several patients, revealed another special capacity of the dominant left brain. We called this device the ‘interpreter’ and have come to realize it is the storyteller.
...
Yet when Yuval Harari is talking about gaining control of people by the use of fictions, he is talking about the kinds of abstractions and ideas everybody can understand — money, religion, politics and preferences, the kinds of things an interpreter is at work on all day long.
Emphasis mine. This is why I cringe when economists want to hear a "story". Humans can rationalize anything. Humans can see patterns in randomness. I think economics is more in the thrall of "the interpreter" than most fields -- precisely because it deals with us.
I'm not saying I'm immune! Far from it. But if you start with the assumption that you don't understand economic agents, then you aren't as tempted to construct a story about what they think.
One of the results of information equilibrium is a power law relationship between the information source and the information destination. If $A \rightleftarrows B$ (i.e. $I(A) = I(B)$) then

$$
\frac{A}{A_{ref}} = \left( \frac{B}{B_{ref}} \right)^{k}
$$
where $ref$ refers to the reference values for the integration (which just become parameters of the model). I refer to this as the "general equilibrium" solution to make a connection with economics: both $A$ and $B$ are adjusting to changes in the other. The "partial equilibrium" solutions follow from assuming $A$ or $B$ adjusts slowly to changes in the other (i.e. is roughly constant), and they result in supply and demand curves (see the paper). We can write the general equilibrium solution in log-linear form:
$$
\log A = k \log B + c
$$
These are power law relationships -- not power law distributions. Therefore information equilibrium relationships should be between two aggregates, not rank orders or cumulative distributions. It seems that a lot of economics deals with the latter, but there are a few examples that fit this mold. First, there is CEO compensation (C) versus firm size (S) from Xavier Gabaix (I borrowed a graph from this blog post about Gabaix's recent paper). We have the model $S \rightleftarrows C$ so that
$$
\log S = k \log C + c
$$
And this works pretty well ...
... but not perfectly. What we have here is probably some measurement error, but also deviations from ideal information transfer (non-ideal information transfer) so that $I(C) < I(S)$ and therefore
$$
\log S \geq k \log C + c
$$
We'd say the information in the firm size isn't reaching the CEO's compensation for large and small firms. Since we expect ideal transfer to be a better approximation as our variables go to infinity, it seems likely that the high end represents more measurement error (fewer of the largest firms) than non-ideal information transfer. (I realized that after I drew on the figure.)
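To make the fitting procedure concrete, here is a minimal sketch in Python using synthetic data -- `k_true` and `c_true` are made-up values for illustration, not Gabaix's estimates:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic data obeying log S = k log C + c plus noise; we then
# recover (k, c) by ordinary least squares in log-log space.
k_true, c_true = 3.0, 1.0
log_C = rng.uniform(0.0, 4.0, size=200)                    # log CEO compensation
log_S = k_true * log_C + c_true + rng.normal(0.0, 0.1, size=200)  # log firm size

# np.polyfit returns [slope, intercept]; the slope is the IT index k.
k_fit, c_fit = np.polyfit(log_C, log_S, 1)
print(round(k_fit, 2), round(c_fit, 2))
```

With real data the residuals would not be symmetric noise: non-ideal information transfer pushes points below the line, which is why a naive least-squares fit is biased (more on that below the GDP example).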
Here's another example relating GDP and population (P) from Nature:
This could be represented by the model $P/A \rightleftarrows GDP/A$ where $A$ is just the area unit to obtain population and GDP density. I drew in a schematic curve (again) that represents information equilibrium (ideal information transfer) and shows how non-ideal information transfer data would fall below the line. We can also see how allowing for non-ideal information transfer means you need to take a different view towards fitting the data -- we expect our results to fall below the line so a simple power law fit will be biased. I imagine new techniques will have to be developed to make this process more rigorous.
...
Update
In Gabaix's paper, footnote 7 (on page 194, regarding the CEO pay) is an information equilibrium model:
$$ \frac{dw}{dS} = k \frac{w}{S} $$
with $k \equiv d \log w/dr$.
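Separating variables in that footnote's equation shows directly why it is an information equilibrium model -- it integrates to the same power law form used above:

$$
\frac{dw}{w} = k \frac{dS}{S} \;\; \Rightarrow \;\; \log \frac{w}{w_{ref}} = k \log \frac{S}{S_{ref}} \;\; \Rightarrow \;\; w = w_{ref} \left( \frac{S}{S_{ref}} \right)^{k}
$$

i.e. $\log w = k \log S + c$, the general equilibrium solution.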
If you really think you've got something new and better, you publicize the hell out of it. You make slides explaining the basics. You write working papers -- free and ungated -- showing how the new thing works, and why it works. You give examples. You draw pictures. You teach. You educate. The burden of publicization and education is on you.
There is also this blog. All of it is free and ungated. And it doesn't even throw out a lot of mainstream economics (update 3 Nov 2016: a link to a list of examples).
...
Update
I'm also trying to keep to Sean Carroll's alternative science respectability checklist ...
There was an uptick in inflation in the January 2016 numbers that some have considered noteworthy (e.g. here -- although that is a year-over-year measure, which John Handley takes on here, arguing that it's most likely due to the very low Jan 2015 number).
I wanted to see if it had something to do with seasonal adjustment (since this was floated as an explanation of why Q1 GDP has been so low -- q.v. here). CPI inflation shows the same jump (see the table where I've been comparing CPI to the IT model here) and, unlike PCE, has both seasonally adjusted and unadjusted data on FRED. Here are the differences:
One thing that jumps out is that the Great Recession appears to have changed how seasons work. Here are all the months, warmer months (northern hemisphere) in reds, colder months in blues:
There is no particular reason that this affects January more than other months -- actually January is one of the few months that wasn't strongly affected by the Great Recession. Maybe the January CPI number isn't big -- it's just that the other monthly numbers got small.
Paul Krugman has a nice post with a concise definition of the liquidity trap:
... we are still in or near a liquidity trap, a situation in which cutting interest rates as far as possible isn’t enough to restore full employment.
Note that the current interest rate target range is 0.25-0.5%, so zero/low interest rates will only tend to be associated with a liquidity trap -- they aren't a requirement. He also provides a graph, but here's my improved version (nominal interest rates in black, real in red and the natural rate in blue):
The idea behind the liquidity trap is that monetary policy can't get the red line down to the blue line, and is therefore unable to restore full employment. It doesn't actually matter where the red line (or black line) is -- just that the red line can't get there. QE is supposed to produce inflation, lowering the real interest rate ... but it doesn't seem to have worked that way in the real world (there was almost no effect on inflation).
So really there are two questions here:
Why can't the central bank restore full employment?
Why can't the central bank create inflation?
The standard liquidity trap answers are:
The central bank cannot lower real interest rates to the natural rate.
Inflation expectations are too well anchored (fear that the punch bowl will be taken away).
So what about models in the information transfer framework? There are a couple of ways to understand it with two different models. The first is the monetary information equilibrium model (described in my paper on the arXiv) where the effects of monetary policy depend on a parameter called the information transfer (IT) index k. The IT index is mostly controlled by the relative size of the economy (NGDP) to the monetary base (minus reserves). When k >> 1, the model behaves like a quantity theory of money. When k ~ 1, the model behaves like an IS-LM model ... and when k ≈ 1, the price level loses its dependence on monetary factors. I describe this in terms of the IS-LM model in this post, and in terms of interest rates in this post (see here for the connection between those two posts). Since the result largely depends on the IT index, the reason why k ≈ 1 becomes the answer to the two questions. The simplest description is that k represents a Lagrange multiplier (like temperature in thermodynamics), and the maximum entropy distribution associated with k near 1 consists of more low growth and low inflation states: larger economies are colder economies. I discuss this in more detail here.
So the monetary information equilibrium model answers are:
Monetary policy does not effectively control NGDP or inflation when k ≈ 1.
There are more ways to configure a large economy out of many low growth/low inflation states than a few high growth/high inflation states. The result is low inflation and k → 1 over time (all advanced economies tend toward this state).
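The "more ways to configure" intuition can be illustrated with a toy calculation (my illustration here, not the paper's derivation): uniformly partition a fixed total output among many agents and look at the distribution of individual shares. Most of the probability mass sits in low-output states:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy maximum-entropy illustration: cut a fixed total "output" at
# n - 1 uniformly random points, giving a uniform random partition
# among n agents (a flat Dirichlet distribution, scaled).
n, total = 1000, 1000.0
cuts = np.sort(rng.uniform(0.0, total, n - 1))
shares = np.diff(np.concatenate(([0.0], cuts, [total])))

# The marginal distribution of shares is approximately exponential,
# so the typical (median) agent sits well below the mean -- most
# configurations put most agents in low-output states.
print(np.median(shares) < shares.mean())
```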
A second way to address this in the information transfer framework is with what I (with a bit of snark) called a quantity theory of labor. In this model, there is no direct monetary control over NGDP or inflation -- both follow from the size of the civilian labor force. Our answers in this case are simple:
The central bank doesn't control full employment.
The central bank doesn't control inflation. Inflation falls for demographic reasons.
That first answer is actually a pretty good explanation for the remarkable regularity in employment recoveries -- if the central bank isn't involved, then the different central bank policies from the 1940s to the 2010s shouldn't have shown an impact on employment. In a sense, I produced this model as a reductio ad absurdum. It is fairly empirically successful for several countries (here), so it represents a real Occam's razor: is your model more empirically accurate than this quantity theory of labor? No? Well, it sucks to be your model, then.
Now you may ask: Why do you have two models? Which one is it??! This also has two answers:
I am a scientist and am capable of holding more than one model in my head at once.
The labor model may well still be the reason why larger economies are colder economies (or the monetary model may be what is behind the size of the labor force).
Nick Rowe did a recent post on his alpha and beta money (this time in terms of modeling Tony Yates' "fairy tale" about escaping the zero bound) that he has discussed before. For whatever reason, when I read it this time, I thought about two different kinds of money in information equilibrium with each other and set about building the model:
$$
N \rightleftarrows M2 \rightleftarrows M0 \rightleftarrows S
$$
with $N$ being nominal output. Here M2 is Rowe's beta money and M0 (monetary base without reserves) is alpha money. However, I didn't take this much farther than that -- at least with regard to Rowe's model. This was just more for my own personal illumination. This information equilibrium relationship looks like:
Use of the (slowly) time-varying $k_{1}$ and $k_{3}$ can turn the nominal output and aggregate supply equations into identities by proper choice of the parameters because
$$
a^{\log b/\log a} = b
$$
where the exponent is our "model" for the information transfer index
$$
k_{1}(N, M2) = \frac{\log (N/c_{1})}{\log (M2/c_{1})}
$$
This allows us to solve the first information equilibrium relationship $N \rightleftarrows M2$
$$
P_{1} = \frac{dN}{dM2} = k_{1} \frac{N}{M2}
$$
$$
\log N \sim k_{1}(N, M2) \log M2 \propto \log N
$$
and see
$$
\log P_{1} \sim (k_{1} - 1) \log M2 = \log N - \log M2 = \log V_{M2}
$$
where $V_{M2}$ is the velocity of M2. The parameter choice is essentially $N_{ref} = M2_{ref}$. The interesting bit is where we solve for the best fit parameters for the middle relationship and set $P_{2} = CPI$. There we essentially have the old price level model except with M2 as the information source instead of $N = NGDP$. The model itself is pretty accurate:
Here is the information transfer index, graphed both ways I've defined it in the past (so pick the one you like):
For the front end, we have the nominal output model -- which is perfect, because we designed it that way (by choosing $N_{ref} = M2_{ref}$):
The information transfer index is approximately constant:
And we can see that the price $P_{1}$ is proportional to the velocity of M2 (the y-axis label on this graph is wrong):
The model $N \rightleftarrows M2$ is basically contentless as I've put it -- it amounts to choosing the price $P_{1} = V_{M2} \equiv N/M2$. I repeat -- do not read anything into this pseudo-empirical success. It is a definition.
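To make the "it is a definition" point concrete, here is a minimal numerical check of the identity $a^{\log b/\log a} = b$ and the resulting velocity interpretation (the numbers are illustrative, not empirical values):

```python
import math

# Illustrative values: nominal output N, money stock M2, and an
# arbitrary reference scale c1.
N, M2, c1 = 18_000.0, 12_000.0, 100.0

# "Model" for the IT index: k1 = log(N/c1) / log(M2/c1)
k1 = math.log(N / c1) / math.log(M2 / c1)

# The identity a**(log b / log a) = b means this choice of k1 makes
# N = c1 * (M2/c1)**k1 hold by construction -- a definition, not a
# prediction.
assert math.isclose(c1 * (M2 / c1) ** k1, N)

# The abstract price P1 = k1 * N / M2 is then proportional to the
# velocity of M2, V = N / M2 (equal to it when N_ref = M2_ref).
P1 = k1 * N / M2
assert math.isclose(P1 / k1, N / M2)
print(round(k1, 3))
```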
The only reason I put this piece here is to represent Rowe's alpha and beta money. One interesting take-away is that the abstract price in the AD-AS model $P : N \rightleftarrows S$ is actually the product of the abstract prices in the individual pieces
$$
P = P_{1} P_{2} P_{3}
$$
Here we have (take $k_{3} = 1$ for simplicity meaning $P_{3}$ is constant):
$$
P \sim V_{M2} \times CPI
$$
meaning the AD-AS model that has the price level as the vertical axis is only valid for constant velocity.
When governments sell a lot of bonds, people think the government is sooner or later going to soak up these bonds with taxes, and do not spend.
That is John Cochrane in his recent post on the "fiscal theory of the price level". I don't know about you, but I'd have to look up how many bonds the government has recently sold, if this represents an increase or decrease relative to the past rate of increase or decrease, and then calculate what that should mean in terms of my spending. I have done this exactly never times.
Yes, there is Friedman's "as if" metaphor, but then that means Cochrane's statement should just be treated as a metaphor. However, it is used to formulate a mathematical model in terms of agent expectations. These aren't expectations, but metaphorical expectations.
An information equilibrium argument is more plausible:
When governments sell a lot of bonds, the state space of individual consumption changes such that individual agents do not on average move to states of increased consumption in future time periods.
This is at least consistent with the empirical observation that people don't behave the way Cochrane says -- they could just behave "as if" they expect future taxes.
However, the state space description above is equivalent to an intertemporal budget constraint, which is only a valid assumption near an equilibrium. Therefore we shouldn't act "as if" Cochrane's statement is true in general. Which makes sense of our intuition that Cochrane's description of human behavior is very odd.
...
PS Blogging via mobile. Will add links later. ... Update +6 hours Updated and added links.
This is a bit of an aside; I'm going to create an additional model that can be related to information equilibrium that may provide a source of helpful analogies. Consider a transistor with emitter current $i$ and base voltage $V$ and consider the information equilibrium relationship $V \rightleftarrows i$ (base voltage is the information source and emitter current is the information destination):
$$
\frac{dV}{di} = k \frac{V}{i}
$$
with slow changes in the voltage relative to current (current adjusts faster to changes in voltage than voltage adjusts to changes in current). That gives us the "partial equilibrium" (in economics parlance) solution
$$
k V_{0} \log \frac{i}{i_{ref}} = V - V_{ref}
$$
take $V \gg V_{ref}$ and rearrange to obtain
$$
i = i_{ref} \exp \frac{V}{k V_{0}}
$$
This is the Ebers-Moll model of a transistor in the forward region, acting as an amplifier.
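As a sanity check on the partial equilibrium solution, here is a short numerical integration of $dV/di = k V_{0}/i$ compared against the closed form (the parameter values are hypothetical; $V_{0} = 0.026$ V plays the role of the thermal voltage):

```python
import math

# Hypothetical parameter values for illustration only.
k, V0 = 1.0, 0.026          # V0 ~ thermal voltage
i_ref, V_ref = 1e-12, 0.0   # reference (saturation-scale) current
i_final = 1e-3

# Partial equilibrium: V changes slowly, so dV/di = k*V0/i.
# Euler-integrate in log i (to handle the huge dynamic range) and
# compare to the closed form V(i) = V_ref + k*V0*log(i/i_ref).
n = 200_000
log_i = [math.log(i_ref) + t * (math.log(i_final) - math.log(i_ref)) / n
         for t in range(n + 1)]
V = V_ref
for a, b in zip(log_i[:-1], log_i[1:]):
    i_a = math.exp(a)
    di = math.exp(b) - i_a
    V += k * V0 / i_a * di          # dV = (k*V0/i) di

exact = V_ref + k * V0 * math.log(i_final / i_ref)
print(abs(V - exact) < 1e-3)
```

Inverting the closed form gives the exponential $i = i_{ref} \exp(V/kV_{0})$ quoted above.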
I brought this up because Cesar Hidalgo used a transistor metaphor in Why Information Grows:
Now consider that we can push [a chemical] system to one of [its] steady states by changing the concentration of inputs ... Such a system will be “computing,” since it will be generating outputs that are conditional on the inputs it is ingesting. It would be a chemical transistor.
Information equilibrium relationships can represent supply and demand (information flowing between them ... see the paper). We add a new metaphor borrowed from Hidalgo: information flowing between the base voltage and the emitter current in a transistor. In this case, traveling along supply and demand curves can be seen as the linear region of an amplifier which faithfully reproduces the information in the weak signal at the output.
In the example, voltage is the demand for electrons and current is the supply of electrons. Our partial equilibrium solution represents movements along a demand curve. Note that the abstract price in this example is the (effective) resistance $R$ [1] -- the RHS of the first equation is half of Ohm's law.
$$ P \equiv \frac{dV}{di} = k \frac{V}{i} \propto R $$
such that we get the price relationship of a demand curve (increase base voltage $V$ and you get a fall in the price/resistance):
$$
R = P = \frac{k V_{0}}{i_{ref}} \exp \left( - \frac{V}{k V_{0}} \right)
$$
Changes in temperature (which change the thermal voltage that maps to $V_{0}$, see 25 Feb update below) are shifts of the demand curve.
Footnotes
[1] Note this is the effective resistance: assuming a voltage applied at the base produces the amplified current at the emitter. It's not the resistance of the current across the transistor (collector-emitter). For a large enough voltage, the effective resistance can go close to zero ... but this isn't a superconductor. The current is coming from the other terminal of the transistor.
[Updated 26 Feb 2016]
...
PS Sorry for the terseness of the post. It was composed one finger tap at a time on an iPad. [Updated 26 Feb 2016 to be a little less terse.]
...
Update 25 Feb 2016
I have extended the above results to include the Early effect (Early is a person's name). The Early effect can be derived through a charge conservation argument (depolarization across the PN boundaries). I'll derive it via information equilibrium below. But there is a question: why should a physical process know about conserving information? The answer is in the question. Information equilibrium is an information conservation argument -- information carried by those charges. In places where we expect the transistor to operate without information loss (the forward active region), then we should expect charge conservation to produce results consistent with information equilibrium.
Here's the derivation and a diagram to go along with it:
Interestingly, we can add the IP3 point (well, not exactly, but its IV analog instead of power) to get a good example of how ideal and non-ideal information transfer describe the system at various times. An amplifier is operating successfully when the information in is equal to the information out ... just louder. This happens in the "linear" (log-linear) region after the saturation, but before the nonlinearities (compression) described by e.g. the "IP3 point" kick in. Inside this region, information equilibrium is a good approximation:
But in the other regions we should expect less current than an ideal (information) amplifier because I(i) < I(V). And that is generally what is observed.
...
PS The onset of current at some small voltage at the base is what gives a transistor its switch-like properties.
I've combined this calculation (solving for NGDP by integrating a differential equation) with this finding (that the nominal shocks in the DSGE form of the model are approximately given by shocks to employment) -- the latter meaning that the noise function n(t) is given by changes in employment.
The result is fairly good (for such a simple model) over short periods (the validity of the DSGE form). In the graphs: the IT model is blue, the IT model using empirical shocks is red (this should recover the original data -- errors are numerical), and the NGDP data is black.
One thing to note is that the employment shocks result in a much smoother transition to recession. That's because employment is much smoother than NGDP.
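A schematic version of that calculation looks like the following sketch (all numbers are invented for illustration, not the model's actual calibration; the stand-in employment series replaces the empirical shocks):

```python
import numpy as np

rng = np.random.default_rng(0)

# Schematic: integrate d(log NGDP)/dt = g + n(t), where the nominal
# shock n(t) is taken to be the growth rate of a (stand-in, random
# walk) log employment series.
dt = 0.25                     # quarterly steps (assumed)
g = 0.04                      # trend nominal growth rate (assumed)
log_emp = np.cumsum(rng.normal(0.0, 0.005, 120))  # stand-in employment
n = np.gradient(log_emp, dt)  # shocks ~ employment growth

log_ngdp = np.empty_like(n)
log_ngdp[0] = np.log(10_000.0)   # arbitrary starting NGDP
for t in range(1, len(n)):
    # forward-Euler step of the differential equation
    log_ngdp[t] = log_ngdp[t - 1] + (g + n[t - 1]) * dt

ngdp = np.exp(log_ngdp)
print(ngdp[0] < ngdp[-1])        # trend growth dominates small shocks
```

Because the shocks enter through a smooth employment series, the integrated NGDP path transitions into downturns more gradually than the raw NGDP data -- the point made above.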