Thursday, May 3, 2018

Three sigma deviation in the 10-year rate

So I'm continuing to track the 10-year interest rate forecast from nearly 3 years ago. While the forecast did well before the 2016 election, today we're above a 3-sigma deviation from the estimated model error (the 99.9th percentile, or about 1 in 1000). Of course, with nearly 800 data points we might expect to see at least *one* 3-sigma event. A similar deviation happened in the early 80s (Sep 1981 to Jun 1982), making this the second such period.
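The "at least one 3-sigma event in ~800 points" intuition is easy to check with a quick back-of-the-envelope calculation. Here's a sketch (not code from the original analysis) assuming a normal error model and, unrealistically for daily rates, independent observations:

```python
from math import erf, sqrt

# One-sided tail probability of a 3-sigma deviation under a normal
# error model: P(X > mu + 3*sigma) ~ 0.00135, i.e. roughly 1 in 740.
p_tail = 0.5 * (1 - erf(3 / sqrt(2)))

# With ~800 observations (treated as independent, which daily interest
# rates are not -- so take this as a rough upper bound on surprise):
n = 800
expected = n * p_tail                      # expected number of 3-sigma events
p_at_least_one = 1 - (1 - p_tail) ** n     # chance of seeing at least one

print(p_tail, expected, p_at_least_one)
```

Under those assumptions you'd expect about one such event in the sample, so a single 3-sigma excursion on its own isn't strong evidence against the model.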

This is extremely interesting because that period is exactly when the Fed raised the discount rate to its maximum level, which (according to the standard narrative) kicked off the second dip of the double-dip recession. However, as in the 80s, there appear to be signs of an upcoming recession in other data that might be leading indicators.

I will admit this is speculative, but given the timing of the previous 3-sigma event, it may become clear that the US is in a recession within the next 6 months (the NBER won't officially declare it until a few quarters later).

Now you might wonder how raising interest rates to only about 2% could trigger a recession today in the same way raising interest rates to 14% did in the 80s. I admit I don't have a good answer to this except to say that increasing labor force participation in the 80s probably provided a sufficient tailwind that the Fed had to do much more.

In any case, this makes for an excellent test of the model. Interest rates should come back down in the near term (about 6 months). A possible mechanism to bring them down is recession. The longer rates stay at the 99.9th percentile of their range or beyond, the more likely it is that the model can be rejected.
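The rejection logic can be sketched numerically. Under an independence assumption (which daily rates clearly violate, since they are highly autocorrelated, so treat these numbers as illustrative only), the probability that several successive observations all sit beyond the 99.9th percentile falls geometrically:

```python
# Illustrative sketch, not part of the original analysis: the chance
# that k successive *independent* observations all exceed the 99.9th
# percentile of the model error distribution.
p = 1e-3                       # one-sided 99.9th-percentile tail mass
tail_probs = {k: p ** k for k in (1, 2, 3)}
for k, prob in tail_probs.items():
    print(k, prob)             # falls by a factor of 1000 per observation
```

Because of the autocorrelation, the effective number of independent observations is much smaller than the number of days, which is why a sustained excursion rejects the model only gradually rather than immediately.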

Here's a zoomed-in, non-log-scale version of the graph at the top of the page (the green band was the forecast of the green line, while the gray bands represent 50% and 90% confidence limits on the model error from the observed path):
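For readers who want to see how bands like these are built, here's a hedged sketch (not the author's code) of normal-theory 50% and 90% confidence bands around a forecast path; the path and error standard deviation below are made-up placeholders, not values from the post:

```python
import numpy as np

# Hypothetical forecast path for the 10-year rate (%), and an assumed
# model error standard deviation (percentage points) -- placeholders.
forecast = np.linspace(2.0, 2.5, 100)
sigma = 0.3

# Two-sided 50% and 90% normal quantiles.
z50, z90 = 0.674, 1.645

# Each band is forecast +/- z * sigma.
band50 = (forecast - z50 * sigma, forecast + z50 * sigma)
band90 = (forecast - z90 * sigma, forecast + z90 * sigma)
```

An observed path wandering outside the 90% band roughly 10% of the time is consistent with the model; staying outside the 3-sigma analogue of these bands is what the post flags as anomalous.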


  1. To be clear about my understanding, you assert the chain of causality is increasing Fed discount rate -> higher interest rates -> decreased business activity -> increasing unemployment?

    1. I'm actually working on a draft piece about not claiming such causality!

      The Fed generally raises interest rates between recessions, and since all the rates are fairly correlated, this just means that the Fed may well be cut off in its rate-raising behavior by a recession.

      Even though a drunk at a bar is cut off by last call, last call wasn't caused by how drunk he is. His drinking binge was just ended by a separate process.

      I think I will use this analogy in my upcoming post.


Comments are welcome. Please see the Moderation and comment policy.

Also, try to avoid the use of dollar signs as they interfere with my setup of mathjax. I left it set up that way because I think this is funny for an economics blog. You can use € or £ instead.
