Wednesday, January 10, 2018

Labor shortages reported by firms

Via Twitter (H/T @traderscrucible), I came across survey data on firms reporting shortages of "qualified" workers. It looks remarkably correlated with JOLTS data (e.g. here), so I ran it through the dynamic information equilibrium model. In general the model works fine, but because the series is so short there is some ambiguity in the dynamic equilibrium: there are two local minima, one at about 0.07/y and the other at about 0.11/y. I thought this made for an interesting case study of which model we should believe.

Scenario 1:

0.07/y dynamic equilibrium.
2008.8 recession center (lagging indicator)
overshooting step response to recession
no indication of next recession (lagging indicator)


Scenario 2:

0.11/y dynamic equilibrium.
2008.9 recession center (lagging indicator)
no overshooting
signs of next recession (leading indicator)


Which is it? Neither dynamic equilibrium slope (nor any other model parameter) seems wrong -- both are comparable to the 0.098/y value for JOLTS openings or the -0.096/y value for the unemployment rate. My guess is that Scenario 1 is correct: it preserves consistency as a lagging indicator at the cost of positing an entirely plausible overshoot in the survey data. It also seems unlikely that this series would go from having one of the longest lags to one of the longest leads (assuming the other JOLTS leading indicators are accurate, there is another recession in the next year or so). It is of course arguable that the upcoming recession (if it is indeed upcoming) might be a different type of recession from the Great Recession and its accompanying financial crisis (e.g. the financial crisis was a surprise, whereas low future growth due to a labor "shortage" is more slow-rolling). In either case, auxiliary hypotheses are needed to resolve the ambiguity in either direction [1].
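To make the ambiguity concrete, here is a minimal sketch of the kind of slope-plus-logistic-shock fit behind these dynamic equilibrium estimates. This is my own illustration with synthetic data and made-up parameter values, not the actual model code; on short, noisy real series, different starting guesses for the slope can land in different local minima (the clean synthetic data here recovers the true slope from both guesses):

```python
import numpy as np
from scipy.optimize import curve_fit

def log_diem(t, alpha, c, a0, t0, b0):
    """Log of the series: dynamic equilibrium slope alpha plus one logistic shock."""
    return alpha * t + c + a0 / (1.0 + np.exp(-(t - t0) / b0))

# Synthetic "labor shortage" series: 0.09/y equilibrium, one 2008-centered shock
rng = np.random.default_rng(0)
t = np.arange(2003.0, 2018.0, 0.25)
y = log_diem(t, 0.09, -180.0, -0.8, 2008.8, 0.5) + 0.02 * rng.standard_normal(t.size)

# Fit starting from each of the two candidate slopes
fits = []
for slope_guess in (0.07, 0.11):
    p, _ = curve_fit(log_diem, t, y,
                     p0=[slope_guess, -180.0, -0.8, 2008.8, 0.5], maxfev=20000)
    fits.append(p)

for p in fits:
    print(round(p[0], 3))  # recovered dynamic equilibrium slope
```

The actual fits involve more machinery (multiple shocks, entropy-based slope estimation); this is only the skeleton.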

Whatever the final resolution, I thought it was fascinating that survey data from a somewhat vague question (What does "qualified" mean to the survey respondent? [2] Are the firms answering "yes" to a perceived shortage offering below-market wages?) posed to human beings follows a mathematical formula. True, it is probably because the data is directly anchored by the unemployment rate. However, using this model we can potentially predict how humans will answer a question in the near future -- a question that I thought would be clouded by politics. People report that inflation or the deficit is higher if the President is of the opposite political party, so why wouldn't this affect whether you think it's easy or hard to find "qualified" workers ... and there is of course footnote [2].

...

Footnotes

[1] It should be noted that this isn't an indication of a degenerating research program per Lakatos: eventually more data will resolve the dynamic equilibrium slope.

[2] To a significant fraction of HR managers hiring for particular jobs, "qualified" includes being white and male, per numerous studies that e.g. submit otherwise-identical resumes with different genders or with names that 'sound black' versus 'sound white'.

JOLTS follow-up

I thought I'd also show the plot of the JOLTS quits data against the ensemble of leading indicator forecasts:




Tuesday, January 9, 2018

Happy JOLTS data day

The week after the latest unemployment rate data is released, we get the Job Openings and Labor Turnover Survey (JOLTS) data at FRED. I've been tracking these series as potential leading indicators of recessions since last summer. There isn't much change in the results; however, I do want to start posting the job openings counterfactual shock estimate alongside the hires estimate. In the leading indicators post, I noted that hires seems to experience its shock earlier than the other indicators. However, I also noted that I have exactly one recession to work with [1], so that should be taken with a grain of salt. With the latest data, the indicator that came second [2] (i.e. openings) seems to be showing a possible shock as well (but the series is much noisier and therefore more uncertain).

Here are the two measures with the latest shock counterfactual (in gray):


And here are animations of the evolution of the shocks counterfactuals:



And finally, here are the latest points on the Beveridge curve (also hinting at a shock which would take it back along the path between the 2001 and 2008 labels on the graph):


Note that my most recent paper, available at SSRN, discusses these models and the theory behind them.

...

Footnotes:

[1] The JOLTS data series on FRED begins in December of 2000, effectively at the start of the 2001 recession, so only one complete recession exists in the data.

[2] The center and width of the shocks to various JOLTS measures: hires, openings, quits, and the unemployment rate:


Monday, January 8, 2018

Qualitative economics done right, part 3

Ed. note: This post is late by almost a year. As mentioned below, part of the reason is that I think Wynne Godley's work has been misrepresented by some of his proponents. I added footnote [1] and the text referencing it, and toned down footnote [3].
This was originally going to be a continuation in a series of posts (part 1, part 2, part 2a) based on an UnlearningEcon tweet:
[Steve] Keen (and Wynne Godley) used their models to make clear predictions about crisis
It was part of a debate about what it means to predict things with a qualitative model. I covered Keen in part 2. This post was going to focus on Wynne Godley. One of Godley's contributions on the subject is his "sectoral balances" approach, which is uncontroversial and not exclusively MMT or Post-Keynesian (for example, here is Brad DeLong using the approach).

Now, UnlearningEcon says "predictions about crisis" (i.e. how it would play out), not "predictions of crisis" (i.e. that it would occur), which leaves a large gray area of interpretation. However, many of the references to Godley by the heterodox economics community say that he predicted the global financial crisis. As a side note, I wonder if Martin Wolf's FT piece saying Godley helps us understand the crisis lent credence to others saying he predicted it?

However in my research, I found that Godley himself doesn't say many of the things attributed to him. He doesn't predict a global financial crisis. He doesn't tell us that the bursting of a housing bubble will lead to a global financial crisis. In the earliest documented source [pdf], Godley says that falling house prices (as already observed in 2006) will lead to lower growth over the next few years (more on this below). This has little to do with "heterodox economics" and in fact is indistinguishable from the story told by mainstream economists like Paul Krugman. For example, Krugman was warning about the effect of a deflating housing bubble on the broader economy in the summer of 2005:
Meanwhile, the U.S. economy has become deeply dependent on the housing bubble. The economic recovery since 2001 has been disappointing in many ways, but it wouldn't have happened at all without soaring spending on residential construction, plus a surge in consumer spending largely based on mortgage refinancing. ... Now we're starting to hear a hissing sound, as the air begins to leak out of the bubble. And everyone ... should be worried.
Unfortunately, Godley's policy note linked above is completely misrepresented in a paper by Dirk Bezemer that I have been directed to on multiple occasions as "documentation" of how the heterodox community predicted the global financial crisis. It was even cited in the New York Times. The paper is “No One Saw This Coming”: Understanding Financial Crisis Through Accounting Models [pdf], and its introduction claims that it is simply a survey of economic models that anticipated the crisis:
On March 14, 2008, Robert Rubin spoke at a session at the Brookings Institution in Washington, stating that "few, if any people anticipated the sort of meltdown that we are seeing in the credit markets at present”. ... [‘no one saw this coming’] has been a common view from the very beginning of the credit crisis, shared from the upper echelons of the global financial and policy hierarchy and in academia, to the general public. ... The credit crisis and ensuing recession may be viewed as a ‘natural experiment’ in the validity of economic models. Those models that failed to foresee something this momentous may need changing in one way or another. And the change is likely to come from those models (if they exist) which did lead their users to anticipate instability. The plan of this paper, therefore, is to document such anticipations, to identify the underlying models, to compare them to models in use by official forecasters and policy makers, and to draw out the implications
Godley's paper above is cited and purportedly quoted to provide a basis for using Stock Flow Consistent models because of their supposed validity. Bezemer's purported quotes of Godley are:
“The small slowdown in the rate at which US household debt levels are rising resulting form the house price decline, will immediately lead to a …sustained growth recession … before 2010”. (2006). “Unemployment [will] start to rise significantly and does not come down again.” (2007)
These quotes appear in a table at the end of the paper (p. 51) as well as in the text (p. 36), but neither of these quotes appear in the cited references to Godley. The second one doesn't appear in any form in any of the cited papers that could be construed as Godley (2007) — which is great for Godley as unemployment in the US has since fallen to levels unseen in almost two decades [1]. [Update: the source has been found, but it is not one of the cited ones.] The first is cobbled together from a few words in a much longer passage in Godley (2006) linked above:
It could easily happen that, if house prices stop rising or if the financial-obligations ratio published by the Fed continues to rise, the debt-to-income ratio will slow down during the next few years, much as it did in the late 1980s and early 1990s. ...
The results are a bit surprising, since the apparently quite small differences between debt levels in the four scenarios generate such huge differences in the lending flows. In particular, Scenario 4, the lowest projection, shows that the debt percentage only has to level off slowly and then fall very slightly for the flow of net lending to fall from 15 percent of income in 2005 to 5 percent in 2010. ...
The average growth rates for 2005–10 come out at 3.3 percent, 2.6 percent, 1.8 percent, and 1.4 percent. The last three projections imply sustained growth recessions—very severe ones in the case of the last two. ...
Is it plausible to suppose that the growth of GDP would slow down so much just because of a fall in lending of this size? Figure 7, which shows past (and projected Scenario 4) figures for net lending combined with successive, overlapping three-year growth rates, suggests that it could. Major slowdowns in past periods have often been accompanied by falls in net lending.
Bezemer also says "This recessionary impact of the bursting of asset bubbles is also a shared view," which is to say that the predictions of Godley and Keen [2] about the negative impact of a fall in housing prices are not unique to their models. A good example is the aforementioned Krugman quote; he probably didn't use an SFC model or some non-linear system of differential equations.

But the original discussion with UnlearningEcon was about the usefulness of qualitative economic models (per the title of this post). The thing is that Godley's models were quantitative and do look a bit like real data:


Of course, the debt data does look a bit like the counterfactual path shown (in shape; as usual, I have no idea what heterodox economists mean when they say "debt" and therefore what their graphs represent, so I plotted several different data sources). However, the GDP growth rates miss the giant negative shock associated with the global financial crisis. This means the model definitely misses something, because debt did follow the shape of the path Godley used as the worst-case scenario.


I wouldn't call this a prediction about the global financial crisis, but rather just a model of the contribution of housing assets to lower GDP growth. But still, it was a quantitative model (one of Godley's sectoral balance models based on the GDP accounting identity). And this is all Godley says it is [3].

Doing the research for this post has given me a newfound respect for Wynne Godley (and Marc Lavoie), but also a real sense of the sloppiness of heterodox economics more broadly, including MMT and stock-flow consistent approaches. Maybe because it is such a tribal community (see [3]), there is little introspection and genuine peer review. I know from my own efforts that I get few critiques of my conclusions from people who agree with those conclusions. This leads me to try to be my own "reviewer #2", even to the point where I have built two independent versions of the models I show on this blog on separate computers.

...

Footnotes:

[1] People will undoubtedly bring up other measures of unemployment. However these do not appear to contain additional information not captured in the traditional "U3" measure — U6 ~ α U3 for some fixed α.
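As a toy illustration of that claim (with made-up numbers, not actual BLS data): if U6 carried no extra information, a single proportionality constant would map U3 onto it, and a one-parameter fit through the origin would recover it:

```python
import numpy as np

# Hypothetical U3 values (%) and a U6 series built as a fixed multiple
# of U3 plus small noise -- illustrative numbers, not actual BLS data
u3 = np.array([4.4, 5.0, 6.1, 7.2, 8.5, 9.9])
u6 = 1.8 * u3 + 0.05 * np.random.default_rng(2).standard_normal(u3.size)

# Least-squares estimate of alpha in U6 ~ alpha * U3 (fit through the origin)
alpha = (u3 @ u6) / (u3 @ u3)
print(round(alpha, 2))
```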

[2] Bezemer also says that Steve Keen predicted the crisis:
“Long before we manage to reverse the current rise in debt, the economy will be in a recession. On current data, we may already be in one.” (2006)
But in the original source, this is in reference to Australia. Australia hasn't had a recession since 1991 (as of September 2016, Australia had managed to rack up 100 quarters without a recession, and at 25 [now 26!] years is second only to [now tied with] the Netherlands, which went 26 years from 1982 to 2008).

[3] I do want to take a moment to mention that Wynne Godley and Marc Lavoie are far more reasonable than you might be led to believe by their proponents out in the Post-Keynesian and MMT community. They'd probably be fine with what I pointed out about SFC models, since the "fix" is just adding a parameter.

On Twitter (see the whole thread), there was an excellent example of how the supporters of Godley and Lavoie aren't doing them any favors. Simon Wren-Lewis showed how a non-flat Phillips curve implies a Non-Accelerating Inflation Rate of Unemployment [NAIRU]. It's a pretty basic argument ...
If π(t) = E[π(t+1)] - a U + b, there exists a U at which inflation is stable (the NAIRU): U* = b/a.
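Spelling out the algebra in that one-liner: imposing stable inflation $\pi(t) = E[\pi(t+1)] = \pi^{*}$ gives

$$
\pi^{*} = \pi^{*} - a U + b \;\; \Rightarrow \;\; a U = b \;\; \Rightarrow \;\; U^{*} = \frac{b}{a}
$$

a unique NAIRU whenever the Phillips curve slope $a$ is nonzero, i.e. whenever the curve is not flat.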
Post-Keynesian blogger and Godley and Lavoie fan Ramanan said that they (G&L) showed there was an exception, therefore Wren-Lewis's argument was not valid.

Wren-Lewis responded "[t]hat is obviously not a NAIRU model, because you are saying the [Phillips curve] is flat", which is what I also said:
But [Ramanan]'s purported exception has a flat piece, so it's not a counterexample to [Simon Wren-Lewis]'s argument.
I added that
Technically, [Ramanan]'s [Phillips curve] has two point NAIRUs plus a continuum (between [two of the] points on his graph).
Which, it turns out, is exactly what Marc Lavoie said to Ramanan (and he quoted it on his blog):
Another way to put it is to say that there is an infinite number of NAIRU or a multiplicity of NAIRU (of rates of employment with steady inflation).

Friday, January 5, 2018

Labor market update: comparing forecasts to data

The latest data for the unemployment (U) rate and the (prime age) civilian labor force (CLF) participation rate are available, so I get to test whether the models have failed. Here's the last unemployment model update [1] (which includes a discussion of "structural unemployment"), and here's the post about the novel dynamic equilibrium "Beveridge curve" for CLF/U [2] shown below. Now let's add the newest data points (shown in black in the figures below).

First, the unemployment rate forecast remains valid:


And it's still looking better than the history of forecasts from FRB SF and FOMC:


As discussed in the second link [2] above, here are the two CLF forecasts (with and without a shock in 2016):


The "Beveridge curve" (the theory of these dynamic equilibrium "Beveridge curves" is discussed in my latest paper) relating labor force participation to the unemployment rate (a curve you likely would not have seen unless you use the dynamic information equilibrium model) also discussed in [2] is also on track with the latest data:


The shocks to CLF are in red-orange and the shocks to U are in green. In the absence of recession shocks, the data should continue to follow the dotted blue line upwards from the black point. However, it is likely that we will have a recession in the meantime, and so — like the rest of the curve — we will probably see the data deviate toward another dynamic equilibrium (another gray hyperbola). The only place I have seen so far where these kinds of Beveridge curves are stable enough to be useful is the classic Beveridge curve (data for which will be available next Tuesday). This stability arises from both the size and timing of the shocks being approximately equal. In the case above, the shocks to CLF are not only much smaller but also much later (even years later), which causes the Beveridge curve above to become a spaghetti-like mess.

Thursday, January 4, 2018

Structural breaks, volatility regimes, and dynamic equilibrium

In scanning through the posters, papers, and discussions of the preliminary schedule of the upcoming ASSA 2018 meeting in Philadelphia I found a lot of interesting sessions (e.g. two machine learning sessions). As a side note, those who think economics ignores alternative approaches should note the (surprising number of) sessions on institutionalist, Marxian, Feminist, and other heterodox approaches.

One poster from the student poster session caught my eye — in particular the identification of low volatility and high volatility regimes in the S&P 500:


That's from "Structural Breaks in the Variance Process and the Pricing Kernel Puzzle" by Tobias Sichert [pdf]. It seems these low volatility and high volatility regimes line up with the transition shocks of the dynamic information equilibrium model (green line):


The top picture is the dynamic information equilibrium model with shock widths (full width at half maximum, described here). The bottom graph is the structural breaks from Sichert's paper (black indicating the start of a low-volatility regime, red indicating the start of a high-volatility one, per the figure at the top of this post). However, the analysis starts in 1992, so that point isn't so much the beginning of a low-volatility regime as the beginning of the data being analyzed (therefore I indicated it with a dashed line). I colored in the high-volatility regimes with light red, and we can see these regions line up with the shock regions in the dynamic equilibrium model. The late 1990s and early 2000s appear as a single high-volatility regime in Sichert's analysis, and the Great Recession regime seems to continue for a while after the initial shock — possibly due to a step response? However, overall, volatility looks like a good independent metric for identifying periods of dynamic equilibrium (low volatility) and shocks (high volatility).
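A rolling-volatility classifier along these lines is easy to sketch. This is my own simplified illustration on synthetic data, not Sichert's structural-break test; the function name and thresholds are made up for the example:

```python
import numpy as np

def high_vol_regime(prices, window=60, factor=1.5):
    """Label each return as high-volatility (True) when the trailing
    rolling standard deviation exceeds `factor` times its median.
    A crude stand-in for a formal structural-break test."""
    r = np.diff(np.log(prices))
    vol = np.array([r[max(0, i - window):i].std() for i in range(1, r.size + 1)])
    return vol > factor * np.median(vol)

# Synthetic index: calm stretch, turbulent stretch, calm again
rng = np.random.default_rng(1)
rets = np.concatenate([0.005 * rng.standard_normal(250),
                       0.030 * rng.standard_normal(120),
                       0.005 * rng.standard_normal(250)])
prices = 100.0 * np.exp(np.cumsum(rets))

regime = high_vol_regime(prices)
print(regime[:200].mean(), regime[300:360].mean())
```

On this synthetic series, the calm stretches are classified as low volatility and the turbulent middle as high volatility, which is the qualitative pattern in the figures above.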

Wednesday, January 3, 2018

Canada's below-target inflation


Some years ago, I predicted that Canada would begin to undershoot its 2% inflation target, and then touted the success of the information transfer monetary model when that prediction came true. However, I now mostly see the monetary model as at best a local approximation, with the dynamic information equilibrium model being better empirically (discussion in terms of US inflation at this linked post).

To that end, I thought I'd put together how you'd look at Canada's below-target inflation in terms of the dynamic information equilibrium model (of all items CPI). In this case, the dynamic equilibrium is approximately 2%, and the undershooting is due to a long-duration shock possibly triggered by the global financial crisis/Great Recession.



The first graph is the full CPI level dataset from FRED. The second shows more recent CPI level data. The third shows year-over-year inflation. The main shocks are a demographic shock centered at 1978.65 ± 0.04 (width [1] = 3.0 y) and a post-crisis shock centered at 2017.7 ± 4.9 with a width of 3.3 years. There are two additional shocks in 1991 and 1993 to deal with the bump in the CPI.

...

Footnotes

[1] I've been a bit sloppy on this blog about what I mean by the "width" of a transition, although I nearly always use the "width" or "inverse steepness" parameter $b_{0}$ of the logistic function

$$
f(t) = \frac{a_{0}}{1+e^{-\frac{t-t_{0}}{b_{0}}}}
$$

Since the derivative is nearly a Gaussian function, we can think of the 1-standard deviation width $\sigma$, which is approximately

$$
\sigma \approx \sqrt{\frac{8}{\pi}} b_{0} \simeq 1.6 b_{0}
$$

based on matching the leading order of the Taylor series. The other possible measure is the full width at half maximum ($FWHM$) which is

$$
FWHM = 2 b_{0} \log \left(3 + 2\sqrt{2} \right) \simeq 3.5 b_{0}
$$

Therefore $b_{0} \simeq 3.0\;\text{y}$ means $\sigma \simeq 4.8\;\text{y}$ and $FWHM \simeq 10.6 \;\text{y}$. Using the $\sigma$ measure, 95% of the shock occurs within a $4 \sigma$ span (i.e. $2 \sigma$ on either side), or about 19.1 years.
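The conversions in this footnote are easy to check numerically (a quick sketch of the formulas above; the function name is just for the example):

```python
import numpy as np

def logistic_widths(b0):
    """Convert the logistic 'inverse steepness' b0 into the approximate
    Gaussian 1-sigma width and the full width at half maximum (FWHM)."""
    sigma = np.sqrt(8.0 / np.pi) * b0
    fwhm = 2.0 * b0 * np.log(3.0 + 2.0 * np.sqrt(2.0))
    return sigma, fwhm

# The b0 = 3.0 y demographic shock from the Canadian CPI fit
sigma, fwhm = logistic_widths(3.0)
print(round(sigma, 1), round(fwhm, 1), round(4 * sigma, 1))  # → 4.8 10.6 19.1
```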

Tuesday, January 2, 2018

Most popular posts of 2017


As part of an annual tradition of looking back on the previous year, here's a compilation of the most popular posts of 2017. The most popular posts on my blog tend to be the critiques and discussions of methodology, and this year was no exception.

What mathematical theory is for
Scientists first use equations to relate different empirical observations to each other. Only after that is successful can you begin to build a framework — and only after you have that framework can you reasonably talk about theorems or toy models.
"Heterodox" economics has a tendency to think it is somehow better than "mainstream" economics, but often it is exactly the same, just with different priors. I deconstruct a garbage derivation and show that it is basically just a restatement of the original assumptions. I also made a fun diagram illustrating the issues with reasoning from an accounting identity.
This post was used as a first draft of what became my article at Evonomics titled "Hayek Meets Information Theory. And fails." In it, I describe how neoclassical economics can basically be recovered as a "perfect information" limit of an information-theoretic framework without all of the baggage.
I used a debate between two philosophers about human comprehension as a jumping-off point to reiterate my claim that I think economists assert human agency more than is necessary out of bias (both the "gut" version as well as methodological). Chris Dillow linked to this post (as well as What mathematical theory is for listed above), directly discussing it with a post of his own that is much more pragmatic contra my more philosophical take.
Honorable mention: On these 33 theses
While this post (at the time of writing) has fewer pageviews than the previous ones, it was posted at the end of the year, and judging by the distribution of views it might have made the leaderboard for 2017 if only it had had more time. The post itself is yet another entry in my series of criticisms (e.g. here or here) of "manifestos" for what economics should be that effectively assume the outcomes of research programs. If you're writing that economics should use evolutionary paradigms, you should instead be writing about the successful economic models you're constructing using evolutionary paradigms. Don't tell me — show me. I see these manifestos as more a pitch for funding than "real work".
I do not write treatises about how economics should adopt the information equilibrium framework. I instead show how to use the model to understand empirical data (or sometimes theoretical puzzles) and rely on those descriptions to motivate adopting information equilibrium. Instead of saying you should use a Wasserstein GAN rather than a traditional GAN for machine learning, you should show how using a WGAN gets the results you want. The original paper on WGANs does exactly that. No computer scientist would have taken the paper seriously if the authors had just given some hand-waving theoretical motivation. For the same reason, I do not take seriously any "new paradigm for economics" or "heterodox approach to economics" that doesn't show some results. However interesting or deep you think your insight into the problem is, and however seductive you might find your approach, I can guarantee you that a sizable fraction of people disagree with you. MMT seems theoretically satisfying to its proponents; to me, it just assumes its conclusions. I like the information equilibrium approach, but as I've discovered over time through comments and tweets, many people find it unsatisfying. The only thing that seems to win anyone over (or at least get a hearing) is empirical results.
Anyway, this is one of my pet peeves in the econoblogosphere. If you think economics is really just methodology and philosophy — that's fine. But some of us have seen the failures of macroeconomics over the past couple decades and want to figure out how to fix it, not just talk about how to fix it.
Honorable mention: 18 signs you are not having a productive conversation about economics
Another late entry that received a lot of attention, this list is a product of my frustration with the "debate" about the state of economic theory (in particular, macroeconomics). This "debate" doesn't seem to go anywhere on either side with the same tired criticisms or defenses being brought up again and again.


... What was your favorite post?


Monday, January 1, 2018

New paper up at SSRN


I put a new paper up at SSRN (Maximum entropy and information theory approaches to economics) that I believe is accessible; it has been accepted and is no longer under review.
It covers some of the material I've covered in presentations (collected here), but with a lot more details and explanations. It also contains the derivation of my favorite equation I've come up with here:


That's probably the best shot I'll get at an equation I could engrave on my tombstone.

Monday, December 25, 2017

Checking some long term market forecasts against data

Since Mathematica seems to have stopped supporting S&P 500 data, I've finally recovered the original forecast using some cobbled-together data sources with a bit of re-sampling:


Most of the rise since the beginning of 2017 has been pretty much on trend; the most recent data is a bit above (probably due to the increased likelihood of stock buybacks in the wake of the tax bill passing) — but still within the expected error.

The same events are likely influencing the bond market with yields up recently, but again the path is consistent with the forecast I made in 2015:



*  *  *

And then there's bitcoin ...

Bitcoin went through a bit of a crash recently, which has helped reduce the uncertainty in the expected path (per the model I've been following here):


As the previous link states, I've given up on this as a useful forecasting tool (bitcoin is too volatile, and estimates of shock amplitudes also seem to be initially too small and then too large). Instead, it's more of a post hoc description of the data that is consistent with dynamic equilibrium. The only thing I'd take seriously from this graph is the slope of the future path (i.e. down, due to the −2.6/y dynamic information equilibrium rate). Another shock could hit in 2018 (or the current shock could continue, with the recent fall being a temporary fluctuation), but in the absence of future shocks, and taking at face value the model's estimate that the current shock is mostly over, the graph above is what I'd expect [1]. Note that the best fit suggests a rebound from the current losses, but new data could easily revise that estimate (which is why I consider the model useless for forecasting, but not necessarily for describing data after the fact).
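For a concrete sense of what a −2.6/y dynamic equilibrium slope implies, here is a toy calculation that ignores shocks entirely (the function name is my own, not the model code):

```python
import numpy as np

def no_shock_path(p0, t0, t, rate=-2.6):
    """Projected price along the dynamic equilibrium alone: log price
    falls at `rate` per year in the absence of shocks (toy calculation)."""
    return p0 * np.exp(rate * (np.asarray(t) - t0))

# One shock-free year cuts the price by a factor of exp(2.6) ~ 13.5
print(round(no_shock_path(1.0, 2018.0, 2019.0), 3))  # → 0.074
```

Of course, the whole point of the post is that shocks dominate bitcoin's actual path; this only quantifies the "down" slope of the dotted line.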

...

Footnotes:

[1] One thing that I find interesting is that in other cases the exchange rates between two currencies represent (effectively) a ratio of the GDPs of the two countries. Is this bitcoin exchange rate a ratio of the "bitcoin economy GDP" to the US GDP? There seems to be a financial industry swarming around bitcoin (with e.g. futures markets just recently) which some people seem to be making money off of (regardless of its long term sustainability). I'd say there are far more questions than answers here.