Saturday, January 10, 2015

What did I miss? — Comments on the work of others

I missed quite a bit when I was on vacation, and have only now started to go back through the past month of my RSS feeds. Here is the information transfer perspective on some of those posts (in no particular order):


Sumner calls for price level targeting. In the information transfer model (ITM), there appears to be a maximum achievable inflation rate, and for the EU it has fallen below 2% (and is probably closer to 1%), making any level target that requires an inflation rate above that maximum untenable. In a sense, Sumner is calling for solving traffic congestion by eliminating the speed limit. That can work at low traffic volumes (analogous to an economy that obeys the quantity theory), but not at high volumes (a liquidity trap economy).
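For a rough feel for where that ceiling comes from, here is a stylized numerical sketch (not the fitted model: it assumes a price level of the form P ~ (M/M0)^(1/κ − 1), holds κ fixed in each line, and the base growth rate is an illustrative number I picked). As the information transfer index κ approaches 1, the same rate of monetary base growth buys less and less inflation:

# Stylized sketch (illustrative assumptions, not the fitted ITM):
# price level P ~ (M/M0)**(1/kappa - 1), so with base growth mu the
# inflation rate is roughly (1/kappa - 1) * mu. As kappa -> 1 the
# exponent vanishes and achievable inflation is capped near zero.
mu = 0.05  # assumed 5% annual monetary base growth (illustrative)

for kappa in (0.5, 0.7, 0.9, 0.99):
    pi = (1.0 / kappa - 1.0) * mu
    print(f"kappa = {kappa:4.2f}  ->  inflation ~ {100 * pi:4.2f}% per year")

With κ near 1/2 this reproduces quantity-theory behavior (5% base growth, roughly 5% inflation); with κ near 1 it is the liquidity-trap regime where even aggressive base growth cannot reach a 2% path.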


A redefinition of the unit of account (or any other monetary regime change, including engaging in a bit of hyperinflation) could change conditions, and this appears to be what happened during WWII. However, the zero lower bound itself isn't the issue: in the ITM the liquidity trap can occur above the ZLB. Monetary policy ineffectiveness is a general property of cold economies, not a consequence of how the unit of account is embodied (although changing the unit of account will in general redefine the relationship between inflation and NGDP).


Like the "Econ 101" analysis of the minimum wage, Uber's surge pricing also hides many model-dependent assumptions if you look at the situation carefully. The "Econ 101" analysis says that increased prices will reduce demand and/or incentivize supply during peak travel times for Uber's car service. As I show at the link earlier in this paragraph, that is a model-dependent result (for all the reasons Interfluidity gives). If you look at a single market, taking the price P → α P in the differential equation for the supply and demand curves only changes the price elasticities and leaves the equilibrium quantity demanded/supplied the same (you can think of it as modifying the information transfer index; a sketch of the algebra is below). One would need to specify how the car service market interacts with other markets in order to figure out the effect; hence the result is model dependent. The "Econ 101" analysis isn't "impeccable": it assumes an implicit model where everyone is the same.
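Here is that single-market algebra spelled out (a minimal sketch in my notation, using the generic information equilibrium condition with demand D, supply S, information transfer index κ, and surge multiplier α):

$$
P \equiv \frac{dD}{dS} = \frac{1}{\kappa} \; \frac{D}{S} \;\;\Rightarrow\;\; D \propto S^{1/\kappa}
$$

$$
P \rightarrow \alpha P : \qquad \alpha P = \frac{\alpha}{\kappa} \; \frac{D}{S} = \frac{1}{\kappa/\alpha} \; \frac{D}{S}
$$

The relationship between D and S (and with it the equilibrium quantity) is untouched; the rescaling is simply absorbed into the index, κ → κ/α, which shows up as a change in the measured price elasticities rather than the "Econ 101" shift.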


In the ITM, the spikes in inflation in the 70s and early 80s are about half monetary and half due to some effect outside the model, such as the oil shocks. The subsequent fall in inflation has been destined since the monetary regime changes of the WWII era (the gold standard, Bretton Woods, wartime hyperinflation). Unemployment appears to be a result of mass panic (coordination) and is only weakly linked to inflation: negative shocks to NGDP do result in falls in inflation, but the effect depends only on log NGDP, so even large shocks translate into modest inflation moves (a rough illustration is below). At first glance, in the ITM, inflation and unemployment don't have a strong, direct relationship (see here or here).
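To put rough numbers on that log dependence (purely illustrative, and assuming NGDP enters only through a logarithm measured relative to a reference scale):

$$
\Delta \log \text{NGDP} = \log(0.9 \, \text{NGDP}) - \log(\text{NGDP}) = \log 0.9 \approx -0.105
$$

so even a severe 10% fall in NGDP shifts log NGDP by only about 0.1, a small perturbation when log NGDP relative to the reference scale is of order several; anything that depends on NGDP only through that logarithm responds correspondingly weakly.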


In the ITM, exchange rates are more a story of statistical noise (most movements carry no information) and of the relative aggregate demand of the two countries in question.


Reading Woolley's post gives you a sense of how difficult the agent-based/microfounded problem is. How do we know when we've incorporated all of the effects? Aging, storage of goods, preferences, or even the preferences of others (a sizable fraction of the stuff in my kitchen was purchased for me for Christmas or my birthday because my friends and family know I like to cook). How do we order these effects by the size of their contribution? They all seem to be the same size in Woolley's model. How do we know when to stop? Shouldn't we incorporate fads in here? It sounds like a nightmare.

And how do we know if the stagnation we get out of this is really the stagnation we see? If you kept adding effects without a well-defined ordering criterion (as opposed to, e.g., a Feynman diagram expansion, which has one) until you observed stagnation, that would be bad methodology. And if it's the sum of a bunch of complicated effects that depend specifically on the micro theory, why does it look the same across several countries that have similar macro properties? (This argument is similar to the one I've used for unemployment: economic recoveries have a remarkable regularity, so it's unlikely that search costs or the internet or any other specific mechanism has much to do with it; it's also related to this argument.)

In the ITM, secular stagnation appears to be an emergent macro phenomenon with a relatively simple explanation: in a large economy, a given dollar is more likely to be found facilitating a transaction in a low-growth market. We don't need to know the details of the preferences behind that transaction, or even what the product is, in order to figure that out.
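Here is a toy Monte Carlo version of that statement. It is only an illustrative sketch, not the ITM calculation: the assumption that individual markets grow quickly when young and slow down as they mature, along with every number in it (the growth-rate range, the fade rate, the horizon), is mine for the purpose of the illustration.

# Toy sketch (illustrative assumptions, not the ITM itself): markets grow fast
# when small and slow down as they mature; dollars are spread in proportion to
# market size, so in a bigger, older economy a randomly chosen dollar is more
# likely to sit in a large, slow-growing market, dragging aggregate growth down.
import numpy as np

rng = np.random.default_rng(0)
n_markets, n_years = 1000, 50

start = rng.integers(0, n_years, n_markets)      # year each market appears
g0 = rng.uniform(0.05, 0.25, n_markets)          # assumed initial growth rates
size = np.zeros(n_markets)

for t in range(n_years):
    born = start <= t
    age = t - start[born]
    g = g0[born] * np.exp(-0.1 * age)            # growth fades as markets mature (assumption)
    size[born] = np.where(age == 0, 1.0, size[born] * (1 + g))
    if t in (10, 30, 49):
        total = size[born].sum()
        dollar_weighted = (size[born] * g).sum() / total   # growth seen by a random dollar
        print(f"year {t:2d}: economy size {total:7.1f}, "
              f"dollar-weighted growth {dollar_weighted:.3f}, simple average {g.mean():.3f}")

The dollar-weighted growth rate sits below the simple average across markets and drifts down as the economy gets larger, without any reference to what the individual markets sell or why.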
