By far, the most common response to the work I've been doing on this blog from people with a background in economics is that they have no idea what I am talking about. It's nice being noticed by Scott Sumner, Nick Rowe, David Glasner and Stephen Williamson, but they have all said at one time or another that they don't understand what this blog is about. At least that's better than Noah Smith's response (deleting my comments**). Mike at Free Radical disagreed with the philosophical approach (I am skeptical that economics can tractably model human behavior). Robin Hanson thought I was using the word 'information' in a different way than I am (in the Shannon information entropy sense, not in The Market for Lemons or game theory sense). I'd say the major issues are my explanatory writing skills, my background in physics (I use the word isentropic way too much) and my lack of formal education in economics (I have worked my way through most of Romer's Advanced Macroeconomics, but all that has helped me do is write the information transfer model as a DSGE model***).
So let me try to do another Vox-style explainer post ... on the right side of the blog there is a series of posts under the heading "Information transfer economics for beginners", so those are some more resources.
It is based on the original title of this paper by Fielitz and Borchardt: Information transfer model of natural processes: from the ideal gas law to the distance dependent redshift. They have since changed from "information transfer" to "natural information equilibrium", and I have followed suit, more frequently referring to information equilibrium than information transfer. In that paper, Fielitz and Borchardt came up with a kind of generalized thermodynamics and connected a couple of different things in physics to information theory that hadn't been connected before. I derived supply and demand (or at least something that looks exactly like supply and demand diagrams) from their equations by assuming demand was an information source and supply was an information destination. I thought that was pretty cool, so I started a blog to see if anything more could come out of it.
In truth, the information theory mostly serves as a motivation for a set of equations, and then the blog is mostly about the consequences of those equations. The equations the information theory sets up are for an abstract diffusion process. (In the model, money is like time and output is like distance.) I do sometimes go back to the information theory for insight -- for example observed prices will tend to fall below a theoretical maximum price because you can't get more information out of a signal than it contains.
Pretty much. Instead of writing $MV = PY$, I would write:
$$
\log PY \sim k \log M
$$
$$
\log P \sim (k -1) \log M
$$
The big difference is not only that $k$ changes (yes, like velocity), but that it changes deterministically. The deterministic formula for $k$ comes from the underlying information theory: it is a ratio of the Hartley information of two sets of given size. However, you can rewrite that ratio in an interesting way using the properties of logs. A set $A$ has $|A|$ elements, and the Hartley function is
$$
H(A) = \log_{b} |A|
$$
where the information is measured in units based on base $b$. Our deterministic $k$ above is
$$
k \sim \log_{M} PY
$$
[update 1/19/2015, so you don't have to dig at the link] or
$$
k \sim \frac{\log PY}{\log M}
$$
That's why I've called the changing $k$ the unit of account effect in the past. It represents the units with which we measure everything in economics ... and $PY$ is just NGDP. The proper money aggregate $M$ is determined empirically: it's not M2 or the monetary base. The answer turned out to be strikingly simple: physical currency.
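As a back-of-the-envelope check, here's the arithmetic in Python with round numbers of roughly the right magnitude for the U.S. (illustrative only, not the fitted model results):

```python
import math

# Illustrative round numbers, not fitted data:
# NGDP ~ $17 trillion, physical currency ~ $1.2 trillion.
NGDP = 17.0e12   # PY, nominal output in dollars
M = 1.2e12       # physical currency, in dollars

# The information transfer index: k ~ log(PY) / log(M)
# (the ratio is the same in any log base)
k = math.log(NGDP) / math.log(M)
print(f"k = {k:.3f}")  # a bit above 1

# Price level, up to the constants dropped by the ~ notation:
# log P ~ (k - 1) log M
log_P = (k - 1.0) * math.log(M)
print(f"log P ~ {log_P:.3f}")
```

Note that because $k$ is a ratio of logs, it changes slowly even when $M$ and $NGDP$ change a lot.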
Well, $k$ has a tendency to fall over time in economies because $M$ grows faster than NGDP (and NGDP has recessions). Eventually $k$ gets close to 1 (from above) and you have something that looks pretty much exactly like a liquidity trap.
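A toy illustration of that drift (growth rates made up for illustration, not fitted to data): when $M$ grows faster than NGDP, the ratio of logs declines toward 1.

```python
import math

# Sketch: M grows faster than NGDP, so k = log(NGDP)/log(M) drifts down.
# Starting values and growth rates are made up for illustration.
NGDP, M = 1.0e12, 0.05e12     # starting values (dollars)
g_ngdp, g_m = 0.04, 0.07      # 4% NGDP growth vs 7% currency growth

ks = []
for year in range(60):
    ks.append(math.log(NGDP) / math.log(M))
    NGDP *= 1 + g_ngdp
    M *= 1 + g_m

print(f"k starts at {ks[0]:.3f} and falls to {ks[-1]:.3f}")
assert all(a > b for a, b in zip(ks, ks[1:]))  # monotone decline toward 1
```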
This is where I have, in the past, gone full heterodox, with all due respect to the field of economics. Expectations might be able to wiggle inflation around a bit, but in the end the price level comes from $k$ and $M$ through the equations above.
However, I have since lightened up on this, and in fact one possible interpretation of the information transfer model is as information flowing from an expected future to a realized present. Instead of considering all possible levels of aggregate supply and aggregate demand consistent with a given price level, we consider all possible expected futures consistent with present macro conditions. That is still different from typical expectations models in economics. A credible central bank sets an inflation target and expectations center around that target -- it determines $\pi_{t+1}$ from a model (e.g. the central bank can hit its target). In our case, we are agnostic about how expectations are set and take the result to be the average over the entire set of expectations consistent with macro conditions -- it determines $\pi_{t+1}$ from the set of all $\pi_{t+1}$ consistent with e.g. $\pi_{t}$, $m_{t}$, etc.****
This is like thermodynamics. The pressure of an ideal gas is the most likely pressure given its volume, temperature, etc. The actual value of pressure will vary -- but since there are so many molecules, the variance is small and you get a deterministic result. In the information transfer model, there are so many people with so many different inflation expectations that effectively, the final result depends entirely on how big $M$ is.
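A minimal sketch of that law-of-large-numbers point (the expectation numbers are made up for illustration): individual inflation expectations can be all over the place, but the spread of the aggregate average collapses as the number of agents grows.

```python
import random
import statistics

random.seed(42)

# Each agent's inflation expectation is noisy (mean 2%, sd 5%, made-up
# numbers). The economy-wide average over n agents has spread ~ 5%/sqrt(n),
# so the aggregate is effectively deterministic for large n.
def mean_expectation(n):
    return statistics.fmean(random.gauss(0.02, 0.05) for _ in range(n))

# Spread of the aggregate, estimated over 500 repeated draws
spread_small = statistics.stdev(mean_expectation(10) for _ in range(500))
spread_large = statistics.stdev(mean_expectation(10_000) for _ in range(500))
print(spread_small, spread_large)
```

With many agents, the variance of the average is tiny, just as pressure is effectively deterministic for a mole of gas molecules.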
Where does the model differ from thermodynamics?
Probably the biggest difference is that there is no second law and in fact, a fall in entropy is linked to recessions. This is where human behavior matters. Humans can coordinate themselves in a way that atoms cannot. Most of the time humans are uncoordinated (maybe an economist would say coordinated by the market here), but sometimes we panic together and that results in a recession.
How deep does the connection with thermodynamics go?
Really deep. In fact, there may be a way to apply partition functions and define an economic temperature (proportional to $1/\log M$). Secular stagnation and liquidity traps may be emergent properties of economies that don't exist for individual markets -- so no amount of models that just put together separate markets will capture these emergent properties.
Are there more ordinary applications of this approach?
Yes. Here's an example of a two-good market that basically gets the standard result. And here's how to derive the ISLM model.
Footnotes:
** It's his blog; he can do what he wants.
*** A neat takeaway is that the different limits of the information transfer index ($\kappa = 1/k$) drop out simply in the DSGE model. You get either "liquidity trap" inflation = constant ($\kappa$ = 1) or "quantity theory" inflation = money growth rate ($\kappa$ = 1/2).
**** The $m$'s and $\pi$'s here are the log-linear versions of $M$ and $P$ such as you would see in a DSGE model.
Can you give a simpler explanation of what

log PY ~ k log M

log P ~ (k - 1) log M

means? You lost me at that point.
Another instance where my physics background leads to a lack of clarity. The ~ symbol is from here. It really just allows you to write down the essence of an equation: if y = m0*x + b0 for a line, then you can write y ~ x.
In the case above, I left off some constants and lower-order terms (they change much more slowly than the leading terms in the equation):
log PY = k log M + k log m0 - log(p0 y0)
log P = (k - 1) log M + (k - 1) log m0 + log k - log(p0)
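Since the ~ notation drops constants and lower-order terms, a quick numeric check (with made-up constants) shows the dropped pieces fading relative to the leading term as M grows:

```python
import math

# The dropped constant in log PY = k log M + k log m0 - log(p0 y0) does not
# grow with M, so its relative size shrinks. Constants here are made up.
k, m0, p0, y0 = 1.1, 2.0, 1.5, 3.0

gaps = []
for M in (1e3, 1e6, 1e9):
    full = k * math.log(M) + k * math.log(m0) - math.log(p0 * y0)
    leading = k * math.log(M)
    gaps.append(abs(full - leading) / abs(full))
    print(f"M = {M:.0e}: relative size of dropped terms = {gaps[-1]:.3f}")
```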
As far as what the P, Y, M and k mean, they are from the equation of exchange or the Cambridge equation, where P is the price level, Y is output, M is the 'money supply' and k is sort of like the Cambridge k -- except the information transfer model provides a new model for it.
What these equations do is describe a 'quantity theory of money' where monetary expansion ('printing money') leads to increased inflation (P goes up) and output (PY goes up).
http://en.wikipedia.org/wiki/Quantity_theory_of_money
Pure quantity theories have mostly been rejected by looking at the data (unless k or V = velocity of money are pure fudge factors which makes the quantity theory unfalsifiable) so there are various modifications -- for example, market monetarists have something like an expectations-augmented quantity theory of money in their minds. There is also debate about which monetary aggregate to use for M.
I look at MV = PY as being two descriptions (each equal to the other) of an economy which lives in a container of size one-year.
Yes, this would be the equivalent of treating PY as the sum of the velocities of each gas particle in a container where there are many particles and many velocities.
Yes, this would be the equivalent of treating MV as the average velocity (V) of a single representative particle (M), with the caveat that the representative particle represents less than the total number of particles in the container.
Now we notice that MV changes from year to year. Something is happening to the entire container so that one year varies from the adjacent year.
It seems to me that this is equivalent to accelerating the entire container of gas. If we accelerate the entire container of gas, during acceleration, the particles will concentrate on one side of the container and de-concentrate on the opposite side. A thermometer would measure that the concentrated side has become warmer, the de-concentrated side has become cooler.
The above description can be applied to an economy. It seems to me that adding money (which is an increase in term M) imbalances the part of the economy that first receives the added money. This added money does not reach the later-reached sectors of the economy until much later, maybe not until the money addition stops. This would correspond with the visual image of a container of gas where the particles would remain concentrated on one side until the acceleration stopped.
It is time to turn to the use of logarithmic notation to represent this process. I don't see any need for logs up to this point. There is, however, another potential justification. Taxes tend to be uniformly applied at every monetary exchange so a repetitive transfer to government occurs.
Logarithmic notation can be used to learn the number of money supply turnovers based on a tax rate influenced money supply. I wrote a post “Finding the Exponent in the Fiat Decay Model” which can be found at
http://mechanicalmoney.blogspot.com/2014/11/finding-exponent-in-fiat-decay-model.html
It seems to me that the exponent here is equivalent to the k (or velocity) of your

k ~ log_M PY

equation.
The Fiat Decay Model is based on the theory that a one-time injection of fiat money into an economy can be completely recovered by government using taxation. A one-time injection will create a predictable maximum GDP as it slowly decays with each succeeding transaction.
Perhaps my understanding of your IT theory is improving but I cannot confidently compare your result to my work. You may find that the parallels and analogies here in this comment are not too good. Please remember that I am trying, and be patient.
Hi Roger,
In the equations above, k in the ITM is essentially a model for velocity in the quantity theory -- I will do a check of the math, but k does seem to be similar to the exponent in your decay model, since tax receipts are roughly proportional to GDP:
http://research.stlouisfed.org/fred2/graph/?g=XAJ
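Here is a toy sketch of how I read your Fiat Decay Model -- my own reconstruction for illustration, not your actual calculation:

```python
# Toy reading of the Fiat Decay Model described above (my reconstruction,
# not Roger's code): a one-time injection is taxed at rate `tax` on each
# transaction, so after n turnovers the amount still circulating is
# m0 * (1 - tax)**n, and the GDP generated is the geometric sum of all
# the transactions. Numbers are made up for illustration.
m0, tax = 100.0, 0.20   # $100 injection, 20% tax per exchange

circulating = m0
gdp = 0.0
turnovers = 0
while circulating > 0.01 * m0:   # until 99% is recovered by government
    gdp += circulating           # each turnover adds spending to GDP
    circulating *= (1 - tax)
    turnovers += 1

# The number of turnovers is set by the exponent:
# n ~ log(fraction remaining) / log(1 - tax),
# and total GDP approaches the geometric-series limit m0 / tax.
print(turnovers, round(gdp, 2))
```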