Saturday, December 3, 2016

Saving the scissors


Noah Smith has a great new post up about "Econ 101-ism" and the labor market. As they say, read the whole thing (including the comments). He has a nice discussion of falsification right off the bat. I only have one quibble with this line:
So since people have different expectations for a theory ... whether a theory has been falsified will often be a matter of opinion.
I'd rather say that "falsified" is not a useful term for any theory except one that should never be used (e.g. aether). Whether a theory is "good enough" (sign/direction, relative magnitude, order of magnitude, 10% error, 1% error, etc.) for the problem at hand will always be a matter of opinion.

But really, just read Noah. Towards the end he says we should stop using Econ 101 for the labor market:
If econ pundits, policy advisors, and other public-facing econ folks were scientifically minded, we'd stop using this model in our discussions of labor markets.
But he then laments that the simple framework probably won't be abandoned:
The fact that this theory is such a simple, clear, well-understood tool - so good for "organizing our thinking", even if it doesn't match reality - will keep it in use long after its sell-by date
Stephen Williamson comments:
Partial equilibrium supply/demand is a simple tool that we can teach to someone with little technical expertise, which can help them think about the basics of economic processes. But for the questions in [Noah's post], it's not even a matter of the theory being "false" - it's just the wrong tool for the job.
David Andolfatto comments as well:
As I (and others) have argued before, Marshall's scissors do not seem like the best organizing framework for the labor market. The scissors assume anonymous spot markets. In contrast, most labor markets involve relationships.
I am 100% behind everything being said. The problem is that abandoning "Econ 101" leaves economics with a dearth of easy-to-communicate tools for understanding what happens in reality. Good, you might say, Econ 101 is wrong about this stuff. And that's true. However, if we have to resort to heterogeneous agents or matching theory ‒ or worse, macro models ‒ then most people are going to replace supply and demand with zero-sum heuristics. To a large extent, that has already happened. "Immigrants take our jobs," is the common refrain.

What we need is something that's as easy to understand as Marshall's scissors and hasn't been falsified. To that end, let me present the information equilibrium approach to the problem ... which can hopefully save the scissors by clearly defining the scope of the partial equilibrium approach.

I will just look at the labor supply shock below as I've looked at the minimum wage a few times before (notably, here and here).

*  *  *

Let's start by saying the nominal output of jobs (the aggregate demand for jobs) is $N$ and the labor supply is $L$. These derive from distributions over the possible states of the economy (jobs available in Seattle during the summer versus jobs available in Albuquerque in the winter, and workers available in Chicago in the spring), and we have equilibrium when the two distributions are the same. Everyone who wants a job at a specific time in a specific place has found one. The picture we have looks something like this:


where the blue density is the distribution of workers and the white density (with level curves) is the distribution of jobs. Think of something like this picture of population:



These represent distributions over possible states in the economy, and as such are inherently heterogeneous (e.g. just add dimensions for different kinds of jobs). What we want to know is what happens to the information entropy of the distributions when we change either distribution by a little bit (e.g. $N \rightarrow N + d N$). The simplest case would be for uniform distributions and keeping the relative information entropy constant. This results in the information equilibrium condition

$$
W \equiv \frac{dN}{dL} = k \; \frac{N}{L}
$$

where we've defined the "wage" $W$ as the "exchange rate" [1] between aggregate demand for labor and labor supply. The parameter $k$ is called the information transfer index. We can write a shorthand for this relationship using the following notation: $W : N \rightleftarrows L$. This differential equation has the solution [2]

$$
\frac{N}{N_{0}} = \left( \frac{L}{L_{0}}\right)^{k}
$$

where $N_{0}$ and  $L_{0}$ are parameters that we use to define the equilibrium (the state where the distributions pictured above match). We can also solve for the wage $W$:

$$
W = k \; \frac{N_{0}}{L_{0}} \; \left( \frac{L}{L_{0}}\right)^{k - 1}
$$

Note that we've already changed how we're approaching "Econ 101". This is the "usual" (i.e. general equilibrium) case of adding to the labor supply, and we've left open the possibility that this increases wages (if $k > 1$). Let's rewrite this in terms of a difference from equilibrium, $\Delta X \equiv X - X_{0}$:

$$
\begin{align}
1+ \frac{\Delta N}{N_{0}} = \left( 1+ \frac{\Delta L}{L_{0}}\right)^{k}\\
\frac{W}{W_{0}} = \left(1+ \frac{\Delta L}{L_{0}} \right)^{k - 1}
\end{align}
$$

where $W_{0} \equiv k N_{0}/L_{0}$ [3]. This defines a family of relationships that depend critically on $k$:


Let's go back to our original relationship and ask what happens if $N$ changes slowly [4] relative to a change in $L$ (i.e. supply changes quickly). In this case we find that [3]

$$
W = W_{0} \; \exp \left( k\;\frac{\Delta L}{L_{0}}\right)
$$

this traces out a supply curve. Compared to the general equilibrium solution above, this is partial equilibrium. Changes in $L$ are movements along the labor supply curve. Shifts of the labor supply curve shift the parameter $L_{0}$ which defines equilibrium (note that shifting $L_{0}$ also changes $W_{0} \equiv k N_{0}/L_{0}$, so a positive/rightward shift of the supply curve represents a fall in price).

Likewise, we can ask what happens if $L$ changes slowly with respect to $N$; in this case, we get a demand curve defined by [3]:

$$
W = W_{0} \; \exp \left( - \frac{\Delta N}{k N_{0}}\right)
$$

We can show these with the traditional Marshallian scissors graphs [5] with the supply curve in red, and the demand curve in blue:


The second graph shows the rightward shift of the supply curve (movement along the demand curve) from a sudden shock of additional labor supply resulting in lower wages. We've finally gotten to the Econ 101 result, but we've had to make some additional assumptions to get here. Namely, that the supply shock is fast or large (or fast and large) relative to the change in demand.
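To make the contrast concrete, here is a minimal numeric sketch (mine, not from the original post; the parameter values $k$, $N_{0}$, $L_{0}$ are purely illustrative) comparing the general equilibrium solution to the Econ 101 result where the supply shift re-defines the equilibrium:

```python
import numpy as np

# Illustrative parameters only (not estimated from any data)
k, N0, L0 = 1.5, 100.0, 100.0
W0 = k * N0 / L0                            # equilibrium wage, W0 = k N0/L0

dL = np.array([-0.1, 0.0, 0.1, 0.2]) * L0   # labor supply changes

# General equilibrium: W/W0 = (1 + dL/L0)^(k - 1); wages *rise* for k > 1
W_general = W0 * (1 + dL / L0) ** (k - 1)

# Partial equilibrium (fast/large supply shock): the supply curve shifts,
# L0 -> L0 + dL, so the new equilibrium wage k N0/(L0 + dL) *falls*
W_partial = k * N0 / (L0 + dL)

for d, wg, wp in zip(dL, W_general, W_partial):
    print(f"dL = {d:+6.1f}:  general eq W = {wg:5.2f},  Econ 101 W = {wp:5.2f}")
```

With $k > 1$, added labor raises the general equilibrium wage while the partial equilibrium (Econ 101) wage falls ‒ the two regimes give opposite signs.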

This is where David Andolfatto's comment above comes in (along with Noah's talk of general equilibrium and matching). Under what circumstances can we ever say that a labor shock is fast relative to demand? It takes time to find jobs, and people need stuff to live on while they are looking. In a sense, a big influx of migrants would probably first be a positive demand shift.

This is not to say there's never a scenario where Econ 101 might apply in a labor market. It does seem to be true that a tight labor market raises wages exactly as Econ 101 says. It's possible that closing a major employer in one town might put downward pressure on wages, but that also might be tied up with a demand shock. Basically, we should probably look at the general equilibrium solutions in the labor market.

Where do the partial equilibrium solutions matter? When demand and supply can be separated and we can definitely make the assumption that supply changes faster than demand. A good example would be dumping a bunch of blueberries or Magic: The Gathering cards on the market. You can usually change the supply of either much faster than the demand for either. In this case you might briefly fall into the "partial equilibrium" regime, like the simulations at this link (same as the Magic cards link):


However, the "usual case" is that an increase in labor supply either increases wages or leaves them the same, and you have to bend over backward with the assumptions to get the Econ 101 result. But lab experiments have shown supply and demand to be a useful description at times (see here and here), so sometimes we are in the Econ 101 domain of validity.

Can we save the scissors by clearly defining the scope?

PS Commenter Unknown makes a great point at Noah's post:
People don't oppose increasing minimum wage because of econ 101. They deploy econ 101 because they oppose increasing the minimum wage, and the opposition to it does not have a prior justification that has anything whatsoever to do with economics.

*  *  *

Footnotes:

[1] You can think of an exchange rate as the ratio of a tiny amount of dollars ($dD$) for a tiny amount of Euros ($dE$), or $dD/dE$. This is also how Irving Fisher thought about exchange in his 1892 thesis.

[2] If we're presenting this without calculus, we can simply start here, just like in introductory physics without calculus you start with

$$
S = \frac{1}{2} a t^{2}
$$

which is the result of an integral (integrating the constant $a$ twice with zero constants of integration). In fact, you solve the information equilibrium differential equation by integration.
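Explicitly, separating variables and integrating (the constants of integration get absorbed into $N_{0}$ and $L_{0}$):

$$
\int \frac{dN}{N} = k \int \frac{dL}{L} \;\; \Rightarrow \;\; \log \frac{N}{N_{0}} = k \, \log \frac{L}{L_{0}}
$$

which exponentiates to $N/N_{0} = (L/L_{0})^{k}$ above.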

[3] We can write these in even more compact form by defining $x \equiv X/X_{0}$ and $\Delta x \equiv \Delta X/X_{0}$

$$
\begin{align}
1+ \Delta n & = \left( 1+ \Delta \ell \right)^{k}\\
w & = \left(1+ \Delta \ell  \right)^{k - 1}
\end{align}
$$

along with the supply and demand curves

$$
\begin{align}
w & = e^{k \Delta \ell}\\
w & = e^{-\Delta n/k}
\end{align}
$$

[4] Technically, we ask

$$
\frac{dN}{dt} \ll \frac{dL}{dt}
$$

This defines the scope (domain of validity) of the partial equilibrium solutions.

[5] The angle brackets are unnecessary to the main thrust of this post, but are explained here.

Friday, December 2, 2016

A May 2017 recession?

There's been some discussion out there that we're due for a recession. For example, the US has never gone more than 10 years without one. I thought I'd try this simplistic model based on the structure of the unemployment rate (more here). It's still uncertain, but it does say we should expect a higher unemployment rate rather than a lower one in the next few years:


The vertical line is the estimate of the recession date (per the link above) of the latter half of May 2017 (plus or minus 4 months, technically 2017.4 ± 0.4).

Note that this model is very simplistic ‒ it allows for a zero or even negative unemployment rate. That means that adding the prior that the unemployment rate will typically be above 3-4% would increase the expected unemployment rate, so our estimate is a conservative one. Here are the probability distributions for the unemployment rate for 2017 Q1 through 2018 Q1 (by quarter):


Here are the results in table form (end of quarter):

2017 Q1       5.0 ± 0.3 %
2017 Q2       5.4 ± 0.8 %
2017 Q3       6.2 ± 2.0 %

The rest of the uncertainties are too high to be useful.

Anyway, this is more for fun. This "model" is independent of the information equilibrium model, and is based entirely on an assumption of a "dynamic equilibrium" in the unemployment rate (such that it drops by about 0.05 percentage points per month, or 0.6 percentage points per year).
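As a quick sketch of that drift assumption (mine, not the model code; the starting rate here is an assumed placeholder, not data):

```python
# Extrapolate the stated dynamic equilibrium: absent a recession shock, the
# unemployment rate drifts down by about 0.05 percentage points per month.
drift = -0.05    # percentage points per month (from the post)
u0 = 4.9         # assumed/illustrative starting rate, in percent
for months in (3, 6, 12):
    print(f"{months:2d} months out: u = {u0 + drift * months:.2f} % (no-recession baseline)")
```

The recession estimate above is what pushes the forecast distributions up and away from this falling baseline.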

It is interesting that a possible information equilibrium (IE) indicator of recession (see here and here) has been showing that a recession is possible. In the past, short-term interest rates above the IE model value have been a precursor to recessions, and we seem to be entering another period of "high" interest rates:


Chamberlin (1948) vs Smith (1962): non-ideal vs ideal information transfer

I was reading an interesting historical perspective on economics centering around the year 1952 (written as a massive tweetstorm). I may have more to say about it later, but one thing that caught my eye was a quote from Vernon Smith about Chamberlin's (1948) experiment:


Smith is essentially saying that there wasn't enough exploration of the state space to come to equilibrium. Chamberlin's experiment was essentially redone in List (2004), which I was able to reproduce via a simulation with random agents. In my code, I let the agents "circulate" until no more transactions could occur. However, if I limit the time (allow only a few attempts at transactions), you certainly don't get the expected equilibrium (left/first is limited, right/second is from the previous link):


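Here is a skeleton of that kind of random-agent simulation (a Python reconstruction of the general idea, not the author's actual Mathematica code; the uniform reservation values and midpoint pricing are assumptions for illustration):

```python
import random

def simulate(max_rounds, n=100, seed=42):
    """Randomly match buyers and sellers; trade one unit at the midpoint
    price whenever the buyer's reservation value covers the seller's cost."""
    rng = random.Random(seed)
    buyers = [rng.uniform(0, 100) for _ in range(n)]    # reservation values
    sellers = [rng.uniform(0, 100) for _ in range(n)]   # unit costs
    prices = []
    for _ in range(max_rounds):
        rng.shuffle(buyers)
        rng.shuffle(sellers)
        unmatched_b, unmatched_s = [], []
        for v, c in zip(buyers, sellers):
            if v >= c:                       # mutually beneficial: trade
                prices.append((v + c) / 2)   # both agents exit the market
            else:                            # no trade: try again next round
                unmatched_b.append(v)
                unmatched_s.append(c)
        buyers, sellers = unmatched_b, unmatched_s
        if not buyers or not sellers:        # no one left who can trade
            break
    return prices

for rounds in (1, 1000):   # limited "circulation" vs. (nearly) unlimited
    p = simulate(rounds)
    print(f"rounds={rounds:4d}: trades={len(p):3d}, mean price={sum(p)/len(p):5.1f}")
```

Limiting max_rounds truncates the exploration of the state space: fewer of the mutually beneficial trades ever happen. Whether the price also undershoots, as in the figures, depends on details not reproduced here (the assigned List (2004) values and the bargaining protocol).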
This is an example of non-ideal information transfer, where the price (and quantity) fall below the equilibrium because of incomplete transfer of information between supply and demand. In a sense, Chamberlin's (1948) conclusion
Perhaps it is the perfect Market which is "strange"; at any rate, the nature of the discrepancies between it and reality deserves study.
should be taken as support that markets sometimes fail ‒ especially when the state space hasn't been fully explored. Different auction/market constructions can lead to different "efficiency" (more or less ideal information transfer, more or less exploration of the state space). Vernon Smith's (1962) experiments show ideal information transfer (shown with my random agent simulation):



Thursday, December 1, 2016

Store of value and medium of exchange are incompatible


From Wikipedia.

How's that for a bold title? Well, either bold or trivial. Anyway, let's assume information equilibrium is correct. I thought I'd give some reasons why the debate over bitcoin scaling (H/T Frances Coppola in tweets here, here) is not resolvable. The store of value and medium of exchange purposes of money are incompatible under information equilibrium.

We'll follow the usual information equilibrium argument for money as a medium of exchange here. We'll start with two goods $Y_{1}$ and $Y_2$ and two demands for those goods $X_{1}$ and $X_2$. So we start with $X_{i} \rightleftarrows Y_{i}$ (with IT indices $k_{i}$). Let's introduce money $M$ so that we have

$$
X_{i} \rightleftarrows M  \rightleftarrows Y_{i}
$$

This gives us two (well, four ... two for each $i$) information equilibrium conditions

$$
\begin{align}
\frac{dX_{i}}{dM} & = \frac{k_{i}}{k^{(m)}_{i}} \; \frac{X_{i}}{M}\\
\frac{dM}{dY_{i}} & = k^{(m)}_{i} \; \frac{M}{Y_{i}}
\end{align}
$$

This follows via the chain rule (no pun intended) and the identity $M/M = 1$. This represents our medium of exchange. You can buy $Y_{1}$ with money and the seller can then buy $Y_{2}$ with that money. In this arrangement, what does store of value mean? It means that the exchange rate of money for $Y_{1}$ is constant ‒ money never buys less (or more) of $Y_{1}$. Therefore

$$
\frac{dM}{dY_{1}} = \;\text{const}
$$

In order for this to be true, we must have $k^{(m)}_{1} = 1$, and therefore $M \propto Y_{1}$. However, you could have picked $Y_{2}$, and if you did, you'd find $M \propto Y_{2}$. Therefore:

$$
Y_{1} = \alpha \; Y_{2}
$$

where $\alpha$ is some constant. This means that $Y_{1} \rightleftarrows Y_{2}$ with IT index 1 so that $Y_{1}$ and $Y_{2}$ grow at the same rate $R$. You can then, by induction, prove this for any pair of goods $Y_{i}$ and $Y_{j}$.
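To spell out the constancy step above, integrating the second information equilibrium condition gives

$$
M \propto Y_{1}^{k^{(m)}_{1}} \;\; \Rightarrow \;\; \frac{dM}{dY_{1}} \propto Y_{1}^{k^{(m)}_{1} - 1}
$$

which is constant only when $k^{(m)}_{1} = 1$ (and the same goes for any other good you pick).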

In order to have money $M$ operate as both a store of value and a medium of exchange, the entire economy basically has to scale the exact same way. You can't introduce new products. Effectively we're stuck with an economy that is just a scaled version of the economy in the past (by a factor of $e^{R \; t}$). (Every good would also have to grow so that there was always the same relative number of units, which would present a problem for bulk goods like blueberries or fluids like beer.) This is connected to the idea that money is something of a physical manifestation of scale invariance (at least in the information equilibrium picture). However, the reason the problem exists is easy to understand: you can exchange money for one good, and then that money can be exchanged for a different good. Because that simple sequence of transactions is possible, every pair of goods has to grow at the same rate relative to the money stock. That is to say, it's basically because money is a medium of exchange.

And the problem is basically the store of value assumption. There's nothing wrong with an ensemble of markets with different/changing growth rates (like the discussion here, except it uses labor and productivity). With an ensemble, you'd have inflation that decreases over time (like the productivity that decreases over time) ‒ but that is exactly a degradation of the value of money.

So we have an incompatibility. Money can be a store of value, but only if it's not a medium of exchange. And money can be a medium of exchange, but only if it's not a store of value.

Why don't you just show us why it's better?

If you think you have a better economic modeling paradigm, just show that it's better. If you rely on arguments against other paradigms, you'll end up looking foolish when you have to make the same "mistakes" as those paradigms.

Pedro Serôdio points us to a paper [pdf] that trashes DSGE in favor of agent based modeling. It says a couple things about DSGE having unrealistic assumptions (these are just two examples):



However a few pages later, we get this:


I fail to see the difference between "not the most descriptively accurate" and "unrealistic". Here's the definition of realistic:
re·al·is·tic
adjective
2. representing familiar things in a way that is accurate or true to life
So a good definition of "unrealistic" is "not representing things in a way that is accurate" ‒ meaning that in order for ABMs to deliver meaningful statistical implications, research must often employ unrealistic assumptions.

This is not to say I think ABMs are misguided. It's a tool, and tools can be used effectively or not. Basically, my view can be boiled down to a few points. First, I've used an agent based model here (an incredibly simple agent, but an agent nonetheless). But the major issue with complex agents -- basically the one above -- is the same issue with any complex economic model: too many parameters. My agent model was simple. Check out this agent, though:


... and the authors called it "stylized" (and "Mark 0") meaning it's only going to get more complex.

Second, I think it is an interesting question as to whether information equilibrium relationships arise out of agent based models. Some preliminary results say yes, they can. Note that this agent model is also pretty simple (a few lines of Mathematica code).

And finally, there are some basic reasons to expect that macroeconomics is either independent of the details of its agent substrate (macro is a theory restricted to a low-dimensional subspace of the million-dimensional agent space) or completely intractable (macro is a theory that requires most of the million-dimensional agent space). I wrote about this more here (with a Socratic dialog).

Instead of arguing that mainstream paradigm X sucks (e.g. X = DSGE), you should just show us that your new paradigm Y works. Show us that Y makes successful predictions. Show us that Y describes the data well (like this). Show us that Y is useful. This is how science works.

Another IT model success: forecasting [using] exchange rates

At the beginning of the year, I used the IT model to forecast a relative fall in Canada's GDP compared to the US via the decline in the exchange rate, and that turns out to have happened. Here's the original graph:


And here's the latest data through 2016 Q3 (new and revised data are darker blue/darker yellow = brown):


Note that due to revisions in the GDP data used in the original fit, the fit changed a tiny amount.

Update: Also it seems that, as usual, the exchange rates over-react. So this is useful directional information, but not necessarily magnitude information.

Update: The title should read forecasting using exchange rates. The exchange rate data is available daily while GDP tends to come out a month after the quarter it represents ends.

Wednesday, November 30, 2016

Economic theory and male answer syndrome

I found this via Pedro Serôdio. Early on, it hit a phrase that made me LOL
For me the attraction of the work of Kondratieff, Schumpeter and Carlota Perez in the modern era, though I am critical of them all ...
The author probably would let us know that his likes and retweets aren't endorsements, either. There is literally no reason for the phrase "though I am critical of them all" in the paragraph in which it appears except as signalling (the criticisms are never discussed).

Anyway, let's look at a bit more:
First you would have to fix the problem Paul Romer identifies in “The Trouble With Macroeconomics”: over-abstract models, divorced from data, based on over-restricted assumptions. Deference to authority where “objective fact is displaced from its position as the ultimate determinant of scientific truth”.
Ah, good. Making macroeconomics more empirical is laudable.
Next, you would have to relentlessly chase down the sources of the massive miscalculation of risk preceding 2008. These include a failure to factor in crime and malfeasance; the inability to measure or trace risk in the shadow banking system; the failure even to model the dynamics of banking as a separate and distinct agent. And the complete failure to model or accept the need to account for irrational human behaviours.
Wait -- how do you know this? Didn't you just say in the previous paragraph that macroeconomics is divorced from data? Then there are no empirically grounded models you could be using to base the importance of these particular mechanisms in describing macroeconomic data. Basically, this paragraph is divorced from the data in exactly the same way the author just said macroeconomics is divorced from the data.

I've said this many times. Don't just say that including irrational human behaviors or banking yields better models. Build these models and show that they are better empirically. That is to say, understand the first paragraph before writing the second.
... macroeconomics should suddenly become instead of a theory based on the assumption of equilibrium and rationality, one based on the assumption of disequilibrium and shocks – not just external shocks but shocks generated inside the system.
Um, you probably shouldn't base a theory on an assumption about the very thing the theory is trying to understand.

This is something that many economists (of all stripes) seem to do and it baffles me. Well, it baffles me as a scientist -- I totally understand it from a sociological/human behavior perspective.

Let me call it "economist answer syndrome", which is very close (if not usually identical) to "male answer syndrome". What should be the fundamental questions of economics (What are recessions? What determines the economic state? Is there a useful definition of equilibrium?) are instead presented as answers (Recessions are monetary. Endogenous shocks. No.). The answers differ from economist to economist. The various "schools of economics" are probably best described as specific answers to what should be the research programs of economics.

A good example of this is that second paragraph above. It's all answers. Risk was miscalculated leading to a financial crisis that caused a recession that was missed because macro models left out banking. The question form is to ask what role banks played in the crisis. In fact, some theories out there say that the financial crisis was a symptom, not a cause of the recession. If we were being scientific, as Paul Romer would have us be, then we should present this as a question, potentially presenting a mechanism and some data as evidence backing up that mechanism. If we're just saying stuff, then there are people out there that say the financial crisis was a symptom. He said, he said.

People often say that economics is politically biased, but really I think the issue is more that economics simply uses the political mode of thinking (where there are answers for anything of political interest) rather than the scientific one (where there are questions about anything of scientific interest).
One thing that would happen is that the future would start sending signals to the present via the market ...
There is actually a way to turn this vague statement into something meaningful (using information theory to describe the communication channel carrying those signals). It leads to the theory advocated on this blog (which should be noted is not entirely at odds with mainstream economics).
... so that they assume breakdown, irrationality, crime, inadequate and asymmetric information ...
This is just more male answer syndrome, more assumptions.

But the information transfer framework does allow (from the start) for markets to break down (it's called non-ideal information transfer). It turns out that non-ideal information transfer might be useful when looking at recessions, but not for the typical market state.

Update 1 December 2016

I was one of the Dean's Scholars at the University of Texas as an undergrad, and the director of the program was Alan Cline. He was the one who introduced me to "male answer syndrome"; it was one of the things he highlighted in a message he gave us on graduation day. Ever since then I've tried to follow his advice -- to stop and listen first, to think before proffering theories.

Tuesday, November 29, 2016

Causality, Newcomb's paradox, and rational expectations

Quantum eraser experiment. From Wikipedia.

I've at times enjoyed philosophy, but things like Newcomb's problem (which was linked to at Marginal Revolution today) generally make me roll my eyes. There are two basic questions with this thought experiment: are we ceding the infallibility of the predictor, and are we ceding the potential lack of causality in the prediction? These turn out to be linked by causality.

There's a lot of persuasion in the problem that the predictor is infallible, but the problem doesn't come out and say so. Is the predictor an oracle in the computer science sense? There's really no reason to continue this discussion if we don't have an answer to this.

David Edmonds says "You cannot influence a decision made in the past by a decision made in the present!" At a fundamental level, the quantum eraser basically says that Edmonds' statement is generally wrong as stated (you just can't send a signal/communicate by influencing a past decision with a present decision). The way out of that is that we're dealing with a macroscopic system in the ordinary world, but in the ordinary world there's no such thing as an oracle. The Stanford Encyclopedia of Philosophy has more to say.

However, I think this problem is illustrative of a paradox with expectations in economics, so let's reformulate it.

Firm X (which determines most of the economy) can decide to cut output if it expects less aggregate demand. Normal output is 100 units, cut back is 50.

However, there's also a central bank. The central bank's forecasts have always been right. However, if the central bank forecasts that firm X will keep output the same, they will cut back on aggregate demand (raising interest rates, making firm X lose money). And if the central bank forecasts firm X will cut back on output, they'll boost aggregate demand. The boost/cut is +/-50 units.

Here's the original table:

Predicted choice  |  Actual choice  |  Payout
B (+1M)           |  B              |  1M
B (+1M)           |  A+B            |  1M + 1k
A+B (0)           |  B              |  0
A+B (0)           |  A+B            |  1k

And here's our new table:

Predicted choice    |  Actual choice    |  Payout
Cut back +50 (r-)   |  Cut back (50)    |  100
Cut back +50 (r-)   |  Keep same (100)  |  150
Keep same -50 (r+)  |  Cut back (50)    |  0
Keep same -50 (r+)  |  Keep same (100)  |  50

This might sound familiar: it's Scott Sumner's retro-causal Fed policy. An expected future rate hike yields lower expected output (expected avg = 25). And assuming the Fed is infallible (predicted = actual, i.e. rational/model-consistent expectations), the optimal choice is to cut back output (take box B). However, assuming the Fed is fallible (not rational expectations), the optimal choice is to keep output the same (expected result = 100). Basically, this is Edmonds' answer above: take both boxes. When a model assumption (rational expectations) reproduces philosophical paradoxes, it's probably time to re-examine it.
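Reading the expected payouts off the table (with "fallible" idealized as a 50/50 coin flip over the two predictions ‒ an assumption for illustration):

$$
\begin{align}
E[\text{cut back} \, | \, \text{infallible}] & = 100\\
E[\text{keep same} \, | \, \text{infallible}] & = 50\\
E[\text{cut back} \, | \, \text{50/50}] & = (100 + 0)/2 = 50\\
E[\text{keep same} \, | \, \text{50/50}] & = (150 + 50)/2 = 100
\end{align}
$$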

The question at hand is whether rational expectations can move information from the future into the present. I've discussed this in more detail before in the context of so-called "neo-Fisherism". The rational expectations "operator", much like the oracle/predictor "operator", acts on a future (expected/predicted) state and moves information (sends a signal) into the present. In general such an operator -- were this information genuine (i.e. the predictor is infallible) -- violates causality. In quantum physics, there are cases where it appears on the surface that there might be causality violation (such as the quantum eraser above), but in every case no communication can occur (usually meaning locality is violated, but not causality).

So the question really is: are we suspending causality and allowing superluminal communication? If that is the premise of Newcomb's paradox or rational expectations, then there is nothing wrong with someone who can exactly predict the future (they're probably getting the information from the future) or with future actions causing present conditions. If we're not, then the obvious choice is to assume that even rational expectations or infallible predictors can be wrong and take both boxes.

This so-called "philosophical paradox" should explicitly say whether we are suspending causality in our thought experiment instead of being mealy-mouthed about it.

Monday, November 28, 2016

The scope of introductory economics

I butted into a conversation between David Andolfatto (DA) and Noah Smith (NS) on Twitter about methodology in economics. Let me start with the conversation (edited slightly for clarity) that led to me jumping in:
DA Noah, we all have models (thought-organizing frameworks) embedded in our brains. Unavoidable. No alternative. 
NS Understanding is not the same as thought-organization. Very different things. 
DA OK, let's step back and define terms. How do you define "understanding" something?
NS Let's say "understanding" means having a model that is both internally and externally valid.
DA "Validity" is a statement concerning the logical coherence of a sequence of statements, conditional on assumptions. 
NS That's internal validity. External validity means that the model matches data. 
DA Yes, but one needs a well-defined metric with which we judge "match the data." ... In my view, this judgement must be made in relation to the question being asked.
This is where I butted in:
JS I LOLed at 1st 1/2 of this. Well-defined metric is theory scope plus stats (works for every other field). Econ does not yet get scope. 
DA "Econ does not yet get scope." What does this mean? 
JS A model's scope (metrics for "matching data") should be derived alongside the model itself. Doesn't seem to happen in Econ texts. 
DA At the introductory level, the "empirics" we seek to explain/interpret are largely qualitative in nature. So it does happen. ... But better job could be done at upper levels, for sure.
So what did I mean? Basically, economics isn't approached as an empirical theoretical framework with well-defined scope. It is not set up from the beginning to be amenable to experiments that control the scope and produce data (quantitative or qualitative observations) that can be compared with theory. I'll try and show what I mean using introductory level material -- even qualitative.

Let me give a positive example first from physics. One of the first things taught is elastic and inelastic collisions. Inelastic collisions are almost always described qualitatively, because even graduate students probably wouldn't be able to quantitatively describe the energy absorption in a rubber ball bouncing or friction slowing something down. You can sometimes approximate the latter with a constant force. The experimental setup is not too terribly different from how Galileo set up his tracks (he used balls, but now we know about rotational inertia, so that comes later):


These are set up to mimic the scope of elastic collision theory: approximately frictionless, approximately conserved kinetic energy. That scope is directly related to the scope of the theory. And we show what happens when scope fails (friction slowing the little carts down).

Now I'd say it isn't critical that students carry out these cart experiments themselves (though it helps learning) -- it would probably be a hard hurdle to surmount for economics. Simply describing the setup and showing the results of these experiments would be sufficient, and there exist real economics papers that do just this.

In introductory economics, some general principles are usually discussed (like here from Krugman, but Mankiw's book starts similarly), then things proceed to the production possibilities frontier (PPF), upward sloping supply curves and downward sloping demand curves. This is probably the best analogy with the physics scenario above.

The assumptions that go into this are rationality and a convex PPF -- these should define the scope of the (qualitative) theory (i.e. individuals are rational and the PPF is convex, which requires 2 or more goods). The way that demand is taught also requires more than one good (so there is a trade-off in marginal utility between the two).

Now first: rationality generally fails (the charitable version is that it's a mixed bag) for individuals in most lab experiments. So there is either an implicit assumption that this is only for a large number of individuals (i.e. collective/emergent rationality, which isn't ruled out by experiments) or only deals with rational robots. As economics purports to be a social theory, we'll have to go with the former.

Additionally, the classic experimental tests of "supply and demand" (e.g. Vernon Smith, John List) do not approach economics this way. In those experiments, individuals are assigned utilities or reservation prices for a single good. You could imagine these as setting up "rational" agents analogous to the "frictionless" carts in the physics example, but we're still dealing with a single good. As an aside, there is an interesting classroom demo for the PPF using multiple goods, but like the experiments designed to show demand curves, this one isn't actually showing what it's trying to show (the area relationship of the pieces immediately leads to a quadratic PPF surface, which will have convex PPF level curves).

Here are a couple of graphics from Vernon Smith (1962):



So, are these results good? Are the fluctuations from theory due to failures of rationality? Or maybe the small number of participants? Is it experimental error? The second graph overshoots the price -- which is what the information equilibrium (IE) approach does by the way:


[Ed. note: this is a place holder using a positive supply shift until I get a chance to re-do it for a demand shift, which will give the same results, just inverted. Update: updated.]

In the IE model, the fluctuations are due to the number of participants, but the overshoot depends on the details of the size of the shift relative to the size of the entropic force maintaining equilibrium (the rate of approach to equilibrium, much like the time constant in a damped oscillator).

I use the IE model here just as a counterpoint (not arguing it is better or correct). The way scope is taught in introductory economics, we have no idea how to think about that overshoot [1]. Rationality (or the assigned utility) tells us we should immediately transition to the new price. The analogy with introductory physics here would be a brief large deviation from "frictionless" carts or conservation of energy in an elastic collision.

Aside from rationality, which is a bit of a catch-all in terms of scope, there are issues with how fast shifts of supply and demand curves have to be to exhibit traditional supply and demand behavior. The changes Vernon Smith describes are effectively instantaneous. However, much of the microeconomics of supply and demand depends on whether the changes happen slowly (economic growth, typically accompanied by inflation) or quickly (such as this story about Magic cards). And what happens after supply and demand curves shift? Is the change permanent, or do we return to an equilibrium (as IE does)? Does the speed of changes have anything to do with bubbles (see Noah Smith as well)?

In a sense, much of this has to do with the fact that economics does not have a complete theory about transitions between different economic states -- but supply and demand curves are all about transitions between different states. And what happens if nothing happens? Does the price just stay constant (a kind of analogy with Newton's first law)? The EMH says it follows a random walk -- does it return to the equilibrium price as the supply and demand theory seems to suggest? With so much of economics and econometrics looking at time series (even Smith's experiment above), one would expect introductory economics to at least address this.

Another issue is what David Glasner and John Quiggin have called the macrofoundations of micro (here's Krugman as well) -- the necessary existence of a stable macroeconomy for microeconomic theory to make sense. This also impacts the scope, but could probably be left out of the introduction to supply and demand much like the Higgs vacuum can be left out of the introduction to physics.

Overall, one doesn't get a good sense of the true scope of the theory in introductory economics, and it isn't taught in such a way that is consistent with how the classic experiments are done.

This issue carries over into introductory macroeconomics. One of my favorite examples is that nearly all of the descriptions of the IS-LM model completely ignore that it makes an assumption about the (well-documented) relationship between the money supply and output/inflation in its derivation that effectively limits the scope to low inflation. But I never see any economist say that the IS-LM model is only valid (is in scope) for low inflation. In the IE version, this can be made more explicit.

Paul Pfleiderer's chameleon models [linked here] point to one problem that arises from not treating scope properly in economics: models that flip back and forth between being toy models and policy-relevant models. This is best understood as flipping back and forth between different scope ("policy relevant" means the theory's scope is fairly broad, while toy models tend to have narrow or undefined scope). Generally, because of the lack of attention to scope, we have no idea if a given model is appropriate or not. One ends up using DSGE models to inform policy even if they have terrible track records with data.

...

Footnotes:

[1] That overshoot is also the only thing in my mind that tells me this experiment actually measures something rather than being completely tautological (i.e. impossible for any result other than orthodox supply and demand to emerge).

Saturday, November 26, 2016

The effect of a December 2016 Fed interest rate hike

Last year when the Fed raised short term interest rates from a range between 0 and 25 basis points to a range between 25 and 50 basis points, I predicted (based on the information equilibrium [IE] model) that the monetary base (the path labeled C in the graph below) would start to fall (I had no idea how fast) relative to no change (the path labeled 0 in the graph below). That turned out to be a pretty successful prediction. The Fed now stands poised (according to Fed watchers like Tim Duy) to raise rates again after its December meeting to a range between 50 and 75 basis points. What will be the impact? What can we learn about the IE model?

The key question is whether the monetary base will start to fall faster or not. The IE model predicts a lower equilibrium monetary base for the range 50-75 basis points (labeled C' in the graph below). If the distance to the equilibrium matters, then the rate of fall should accelerate a bit. However, it is possible the drift rate has to do with factors other than the distance to the equilibrium (such as volume of trading). I illustrated both of these paths. The solid line is a time series model forecast based on the Mathematica function -- which auto-selects an ARIMA process -- fit to weekly source base data after the Dec 2015 announcement. The dashed line is a possible accelerated path that is simply adjusted to cover the new distance (to the new equilibrium C') a year later. The RMS error is shown as a yellow region (and blue for the period after the original estimate of reaching the equilibrium C).


Data after the December 2015 Fed meeting is shown in orange (both weekly source base and monthly adjusted monetary base). The expected paths (assuming immediate adjustment) are shown in gray as 0 (no change), C (25-50 bp after 2015), and C' (50-75 bp after 2016). The black line represents zero reserves and the dashed black line represents the average 40 billion in reserves from 1990 to 2008.
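For reference, here is a rough Python analogue of that forecasting step (a sketch only: the file name is a hypothetical placeholder, and the ARIMA order is fixed by hand rather than auto-selected the way the Mathematica fit does it):

```python
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Hypothetical input: weekly source base observations after the Dec 2015
# announcement, with columns "date" and "base" (billions of dollars).
base = (pd.read_csv("source_base_weekly.csv", parse_dates=["date"])
          .set_index("date")["base"])

model = ARIMA(base, order=(1, 1, 1))    # order chosen by hand (an assumption)
fit = model.fit()

forecast = fit.get_forecast(steps=52)   # one year of weekly steps
print(forecast.predicted_mean.tail())   # point forecast (cf. the solid line)
print(forecast.conf_int().tail())       # interval (cf. the shaded error band)
```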

*  *  *

The model is described in more detail in my paper.

Here's a cool 3D visualization of the model.

Some additional discussion about the December 2015 prediction is here.

Here is the IE model's performance compared with DSGE and other models.

Here are a series of interest rate forecasts compared with the Blue Chip Economic Indicators (BCEI) forecast. The BCEI forecast has continued to be incorrect -- much worse than the IE model.