Monday, December 11, 2017

JOLTS data day!

The latest data from the Job Openings and Labor Turnover Survey is out today on FRED and we're here with another update of the forecast performance/recession indicator. Here are the hires and openings data:


Here's the update of the hires shock counterfactual evolution (a fall in hires might be a leading indicator):


Here's the updated Beveridge curve as well:


Sunday, December 10, 2017

Another unemployment rate forecast comparison

Paul Romer tweeted a graph of an unemployment rate projection. I'm not sure where it came from — my guess is the World Bank — but I thought I'd add it to the forecast from the dynamic information equilibrium model last updated here. Already it (the thick dark blue line) looks fairly wrong (no confidence limits were given for it; the dynamic equilibrium model's bands are 90%):


Of course the dynamic information equilibrium forecast is conditional on the lack of shocks (which can be identified via the algorithm discussed here). The forecast Romer tweeted could be the result of a very broad but small amplitude shock to the dynamic equilibrium model, but such a shock would be unlike any other adverse shock in the US data since the Great Depression.

Saturday, December 9, 2017

Latest unemployment numbers and structural unemployment

The latest monthly unemployment numbers for the US came out on Friday (unchanged at 4.1% from last month), so I've once again added the new data points to my old model forecast graphs to see how they're performing (just great, by the way — more details below). There were also several mentions of the old "structural unemployment" argument (against fiscal or monetary stimulus) made in the wake of the financial crisis — and of how poorly that argument has held up as unemployment has fallen to its lowest levels in years. In particular, Paul Krugman noted:
Remember when all the Very Serious People knew that high unemployment was structural, due to a massive skills gap, and could never be expected to return to pre-crisis levels?
He linked back to an old blog post of his where he showed an analysis from Goldman Sachs about state unemployment rates and then looked at unemployment rates and the subsequent recovery by occupation. The data showed that occupations (and states) that had been hit harder (unemployment increased more) had recovered faster (unemployment had declined more). Krugman said this indicated unemployment was cyclical, not structural:
So the states that took the biggest hit have recovered faster than the rest of the country, which is what you’d expect if it was all cycle, not structural change. ... the occupations that took the biggest hit have had the strongest recoveries. In short, the data strongly point toward a cyclical, not a structural story ...
What was interesting to me was that the pattern Krugman showed is actually just a consequence of the dynamic information equilibrium model — the larger the shock, the faster the subsequent fall, because the dynamic equilibrium fixes (d/dt) log u(t) at a constant value. In fact, the data Krugman shows matches up pretty well with what you'd expect from the dynamic equilibrium model:


This tells us that the dynamic equilibrium is the same across different occupations (much like how the dynamic equilibrium is the same for different races, or for different measures of the unemployment rate). All of this tells us that unemployment recoveries [1] are closer to a force of nature (or "deep structural parameters" in discussions of the Lucas critique). But on another level, this is also just additional confirmation of the usefulness of the dynamic equilibrium model for unemployment.
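For anyone who wants to see the mechanics, here's a minimal numpy sketch of that point — not the actual model code, and the slope value is just illustrative — showing why a constant (d/dt) log u(t) means a bigger shock recovers faster in level terms:

```python
import numpy as np

# Between shocks, dynamic equilibrium says (d/dt) log u(t) is a constant
# negative slope. A bigger shock only raises the peak; the decline in log u
# afterward has the same slope, so the *level* of u falls faster.
alpha = -0.09                     # illustrative equilibrium slope (1/yr)
t = np.arange(0.0, 9.0)           # years after the peak of a shock

for peak in (6.0, 9.0, 12.0):     # peak unemployment rates in percent
    u = peak * np.exp(alpha * t)  # constant-slope decline in log u
    print(f"peak {peak:4.1f}%: first-year drop {u[0] - u[1]:.2f} pp, "
          f"log slope {np.log(u[1]) - np.log(u[0]):+.3f}/yr")
```

The log slope comes out identical for every shock size; only the drop in the level differs — which is exactly the pattern in the occupation and state data Krugman showed.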

*  *  *

As I mentioned above, I also wanted to show how the forecasts were doing. The first graph is the model forecast alone. The second graph shows comparisons with the (frequently revised) forecasts from the FRB SF. The third graph shows a comparison with the (also revised) forecast from the FOMC.





...

Footnotes:

[1] The shocks to unemployment are non-equilibrium processes in this model. It remains an open question whether these shocks can be affected by policy, or whether they too are a force of nature.

Tuesday, December 5, 2017

Does increased compensation cause increased productivity?

Noah Smith has an article at Bloomberg View asking why compensation hasn't risen in lockstep with productivity. Recent research seems to say it at least rises a little when output rises in the short run, but not one-for-one:
This story gets some empirical support from a new study by economists Anna Stansbury and Larry Summers, presented at a recent conference at the Peterson Institute for International Economics. Instead of simply looking at the long-term trend, Stansbury and Summers focus on more short-term changes. They find that there’s a correlation between productivity and wages — when productivity rises, wages also tend to rise. Jared Bernstein, senior fellow at the Center on Budget and Policy Priorities, checked the results, and found basically the same thing.
I thought the long run data would be a good candidate for the dynamic information equilibrium model, but the fits came out with some surprising results. It's true that the two measures appear correlated. Real output per hour (OPH) rises faster, at about 1.46%/y, while real compensation per hour (CPH) rises at about 0.45%/y. This has held up throughout the data that isn't subject to a non-equilibrium shock (roughly the "Great Moderation" and the post-global financial crisis period).

But the interesting part of this particular framing of the data is the timing of the shocks — shocks to real compensation per hour precede shocks to real output per hour:


The shocks to CPH (t = 1952.1 and t = 1999.6) precede the shocks to OPH (t = 1959.8 and t = 2001.0). Real compensation increases before real output increases. It's not that compensation captures some part of rising output; it's that giving people raises increases productivity.
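For reference, the functional form behind these fits is the usual dynamic equilibrium ansatz — sketched here with σ a logistic function; the shock amplitudes aᵢ and widths bᵢ are fit parameters I'm not quoting in this post:

log X(t) ≈ α t + Σᵢ aᵢ σ((t − tᵢ)/bᵢ) + c

with α ≈ 1.46%/y for X = OPH, α ≈ 0.45%/y for X = CPH, and the shock centers tᵢ as quoted above. "Shocks to CPH precede shocks to OPH" is then just the statement that the fitted centers for CPH come earlier than the corresponding centers for OPH.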

Now it is entirely possible this framing of the data isn't correct (there is a less statistically significant version of the dynamic equilibrium that sees the periods of the shocks as the equilibrium, and the 80s and 90s — as well as the post-crisis period — as the shocks). However, there is some additional circumstantial evidence that the productivity shocks correspond to real world events. The late 90s shock seems associated with the introduction of the internet to a wider audience than defense and education, while the 40s and 50s shock is likely associated with a post-war increase in production efficiency in the US. It is possible increased compensation is due to the increased skills required to use new technologies and methods — with those raises and increased starting salaries needing to happen before firms can implement these technology upgrades [it costs more to get labor with the latest skills]. Those could well be just-so stories (economists like stories, right?), but I believe the most interesting aspect is simply the plausible existence of this entirely different (but mathematically consistent) way to look at the data, along with its entirely different policy implications (i.e. needing to find ways to directly raise wages instead of looking for ways to increase growth or productivity).

...

Update: I added a bit of clarifying text in the last paragraph — e.g. "upgrades" and the bracketed parenthetical — where the wording was ambiguous. Also, "post-crisis period" replaces "2010s" because the latter could be confused with the actual shock to output in 2008–9, whereas I am actually referring to the period after the shock that we are still in.

Monday, December 4, 2017

Supply and demand and science

Sometimes recognizing a symmetry is the biggest step.

Sometimes when you give a seminar or teach a class, a student or attendee brings up a point that perfectly sets you up to explain your topic. A few tweets from @UnlearningEcon and Steve Roth gave me this opportunity today:
UE: Confused by the amount of times it is claimed demand and supply has loads of empirical evidence behind it when I've barely seen any 
UE: Naming a couple of obvious insights or point estimates is not sufficient to prove the full model! Can somebody give me an actual falsifiable test, please? 
UE: Conclusion: demand-supply is largely an article of faith which people 'prove' with a couple of casual observations. Science! 
SR: Thing is you can't measure demand (desire) and supply (willingness) to buy/sell — necessarily, across a range of prices at a point in time. Only observe the P/Q where they meet. Why S/D diagrams are always dimensionless.
There's an almost perfect analogy here with the concept of "force" in physics. Force F, often described using the equation F = m a [1] or better F = dp/dt, is actually a definition. That is to say it's F ≡ dp/dt. At the time of Newton [2], it was an article of faith — an article of faith that organized a bunch of disparate empirical "point estimates" and insights from Kepler and Galileo.

That is all to say Newton's contribution represents more of a statement of "looking at it this way, it's much simpler" than an application of the oversimplified "scientific method" we were all taught in school, which involves forming a hypothesis, collecting data, and using that data to confirm or reject the hypothesis. Unfortunately contributions to science like Newton's aren't easily reproduced in classrooms, so most people end up thinking hypothesis testing with data is all there is. Note that Einstein's famous contributions were like Newton's in the sense that they organized a bunch of disparate empirical point estimates (in this case, deviations from a Newtonian world).

Though Newton and Einstein get all the plaudits, both of their big contributions are really specific instances of what is probably one of the greatest contributions to physics of all time: Noether's theorem. Emmy Noether was asked by David Hilbert about energy conservation in General Relativity, but she ended up proving the more general result that conservation laws are consequences of symmetry principles. Newton's symmetry was Galilean invariance; Einstein's were Lorentz covariance (special relativity) and general covariance (general relativity). Newton's laws are really just a consequence of conservation of momentum.

That gives us a really good way to think about Newton-like contributions to science: they involve recognizing (or steps towards recognizing) general symmetry principles.

What does this have to do with supply and demand?

This is where Steve Roth's tweet and the work on this blog come in. Supply and demand relationships seem to be a consequence of the scale invariance (the dimensionlessness Steve points out) of information equilibrium [3]. In fact, realizing that supply and demand is a consequence of the scale invariance encapsulated by the information equilibrium condition gives us a handle on the scope conditions — where we should expect supply and demand to fail. And since those scope conditions (which involve e.g. changing supply much faster than demand can react) can easily fail in real-world scenarios, we shouldn't expect supply and demand as a general theory to always be observed and empirically validated. Good examples are labor and housing markets, where it is really hard to make supply change faster than demand (in the former case because adding workers adds wage earners, which adds demand; in the latter case because it is impossible to add housing units fast enough). What we should expect is that when the right conditions are in effect, supply and demand will be a useful way to make predictions. One of my favorite examples uses Magic: The Gathering cards.
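If you want to see the scale invariance concretely, here's a minimal Python sketch with synthetic data (not a claim about any real market): the information equilibrium condition as used on this blog, dD/dS = k D/S, is solved by D ∝ S^k, and rescaling D or S by arbitrary constants leaves the index k — the only observable content — unchanged:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "market" with demand D in information equilibrium with supply S:
# dD/dS = k D/S, whose general solution is D = D_ref (S/S_ref)**k.
k_true = 1.3
S = np.linspace(1.0, 10.0, 200)
D = 2.0 * S**k_true * np.exp(rng.normal(0.0, 0.02, S.size))  # small noise

def fit_k(S, D):
    # the information transfer index is the slope of log D versus log S
    return np.polyfit(np.log(S), np.log(D), 1)[0]

print(fit_k(S, D))              # ~ 1.3
print(fit_k(7.0 * S, 0.5 * D))  # still ~ 1.3: rescaling changes nothing
```

This is the sense in which supply and demand diagrams are dimensionless: the overall scales of D and S drop out, and only k (and ratios) carry information.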

Yes, I realize I have obliquely implied that I might be the Isaac Newton of economics [4]. But that isn't the main point I am trying to make here. I'm trying to show that attempts at "falsification" aren't the only way to proceed in science. Sometimes useful definitions help clarify disparate insights and point estimates without being directly falsifiable. The information equilibrium approach is one attempt at understanding the scope conditions of supply and demand. There might be other (better) ones. Without scope conditions, however, supply and demand would be either falsified (since counterexamples exist) or unfalsifiable (defined in such a way as to be unobservable [5]).

...

Footnotes

[1] You may think that mass and acceleration are pretty good direct observables making force an empirical observation. While acceleration is measurable, mass is problematic given that what we really "measure" is force (weight) in a gravitational field (also posited by Newton). Sure, this cancels on a balance scale (m₁ g = m₂ g → m₁ = m₂), but trying to untangle the epistemological mess is best left to arguments over beer.

[2] Actually Newton's Lex II was a bit vague:
Lex II: Mutationem motus proportionalem esse vi motrici impressae, et fieri secundum lineam rectam qua vis illa imprimitur.
A somewhat direct translation is:
Second Law: The alteration of motion is ever proportional to the motive force impressed; and is made in the direction of the right line in which that force is impressed.
The modern understanding is:
Second Law: The change of momentum of a body is proportional to the impulse impressed on the body, and happens along the straight line on which that impulse is impressed.
Where momentum and impulse now have very specific definitions as opposed to "motive force" and "motion". This is best interpreted mathematically as

I ≡ Δp

where I is impulse and p is the momentum vector. The instantaneous force then follows via the fundamental theorem of calculus (so no additional assumption about relationships in the world is involved):

I = ∫ dt F

F ≡ dp/dt

The alteration of "motion" (i.e. momentum) is Δp (or the infinitesimal dp), and the rest of the definition says that the F vector (and the impulse vector I) is parallel to the p vector. Newton would have written something like f = ẋ in his own notes, using his fluxions (i.e. f = dx/dt).

[3] I've talked about this on multiple occasions (here, here, here, or here).

[4] At the rate at which new ideas become incorporated into economic theory, I will probably have been dead for decades and someone else (with the proper credentials) will have independently come up with an equivalent framework.

[5] People often make the point that demand isn't directly observable (or, as Steve Roth says, neither supply nor demand is observable). My tweet-length retort to this is that the wavefunction in quantum mechanics isn't directly observable either. In fact, given the scale invariance of the information equilibrium condition, we actually have the freedom to re-define demand as twice or half any given value. This is analogous to what is called gauge freedom in physics (a result of gauge symmetry). The electric and magnetic potentials are only defined up to a "gauge transformation" and are therefore not directly observable.

To me, this is a very satisfying way to think about demand. It is not a direct observable, but we can compute things with a particular value knowing that we can scale it to any possible value we want (at least if we are careful not to break the scale invariance in the same way you try not to break the gauge invariance in gauge theories). Nominal GDP might be an incomplete measure of aggregate demand, but so long as aggregate demand is roughly proportional to NGDP we can proceed. What is important is whether the outputs of the theory are empirically accurate, such as this example for the labor market.

Information transfer economics: year in review 2017

I finally published my book in August of this year. Originally I was just going to have an e-book, but after requests for a physical paperback version I worked out the formatting. I'm glad I did — it looks nice!

With 2017 coming to a close, I wanted to put together a list of highlights like I did last year. This year was the year of dynamic information equilibrium as well as presentations. It was also the year I took some bigger steps in bringing my criticisms of economics and alternative approaches to the mainstream, having an article at Evonomics and publishing a book.

I'd like to thank everyone who reads, follows and shares on Feedly and Twitter, or who bought my book. It is really only through readers, word of mouth, and maybe your own blog posts on information equilibrium (like at Run Money Run) that there is any chance the ideas presented here might be heard or investigated by mainstream economists.

I'd also like to thank Cameron Murray for a great review of my book, Brennan Peterson for helping me edit my book, as well as Steve Roth at Evonomics (and Asymtosis) for being an advocate and editor of my article there.

Dynamic information equilibrium


The biggest thing to happen was the development of the dynamic information equilibrium approach to information equilibrium. The seeds were planted in the summer of 2014 in a discussion of search and matching theory where I noted that the rate of unemployment recovery was roughly constant — I called it a "remarkable recovery regularity". Another piece was looking at how the information equilibrium condition simplifies given an exponential ansatz. But the Aha! moment came when I saw this article at VoxEU.org that plotted the log of JOLTS data. I worked out the short "derivation", and applied it to the unemployment rate the next day.
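That "derivation" is short enough to sketch here. If A and B are in information equilibrium — dA/dB = k A/B, so A ∼ B^k — and B grows roughly exponentially, B ∼ exp(β t), then

(d/dt) log (A/B) ≈ (k − 1) β = constant

i.e. the ratio of the two quantities has a constant log-slope except during non-equilibrium shocks. That's the constant-slope-plus-shocks structure visible in the JOLTS and unemployment data.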

Since that time, I have been tracking forecasts of the unemployment rate (and other measures) using the dynamic equilibrium model. I even put together what I called a "dynamic equilibrium history" of the US contra Friedman's monetary history. As opposed to other economic factors and theories, the post-war economic history of the US is almost completely described by the social transformation of women entering the workforce. Everything from high inflation in the 70s to the fading Phillips curve can be seen as a consequence of this demographic change.

Dotting the i's and crossing the t's


Instead of haphazardly putting links to my Google Drive, I finally created Github repositories for the Mathematica (in February) and eventually Python (in July) code. But the most important thing I did theoretically was rigorously derive the information equilibrium conditions for ensembles of markets, which turned out to obey equations formally similar to those for individual markets. This was a remarkable result (in my opinion) because it means that information equilibrium could apply to markets for multiple goods — and therefore to macroeconomic systems. In a sense it makes rigorous the idea that the AD-AS model is formally similar to a supply and demand diagram (and tells us under what scope it applies). The only difference is that we should also see slowly varying information transfer indices, which would manifest as, e.g., slowing growth as economic systems become large.
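Schematically, the result is that an ensemble of markets Aᵢ ⇄ B with individual indices kᵢ obeys

d〈A〉/dB ≈ 〈k〉 〈A〉/B

which has the same form as the single-market information equilibrium condition — except that the ensemble-average index 〈k〉 can drift slowly as the distribution of markets changes, which is where the expectation of slowing growth in large economies comes from.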

Connections to machine learning?


These are nascent intuitions, but there are strong formal similarities between information equilibrium and Generative Adversarial Networks (GANs) as well as a theoretical connection to what is called the "information bottleneck" in neural networks. I started looking into it this year, and I hope to explore these ideas further in the coming year!

Getting the word out


Over the past year or so, I think I finally reached a point where I understood the ideas worked through on this blog well enough to begin outreach in earnest. In May I published an article at Evonomics on Hayek and the price mechanism that works through the information equilibrium approach and its connection to Generative Adversarial Networks (GANs). In August, I (finally) published my book on how I ended up researching economics, on my philosophy of approaching economic theory, and on some of the insights I've learned over four years of work.

I also put together four presentations throughout the year (on dynamic equilibrium, a global overview, on macro and ensembles, and on forecasting). Several of my presentations and papers are collected at the link here. In November, I started doing "Twitter Talks" (threaded tweets with one slide and a bit of exposition per tweet), which were aided by the increase from 140 to 280 characters — in the middle of the first talk! They covered forecasting, macro and ensembles, as well as a version of my Evonomics article.

*  *  *

Thanks for reading everyone! This blog is a labor of love, written in my free time away from my regular job in signal processing research and development.

Saturday, December 2, 2017

Comparing the S&P 500 forecast to data (update)

I haven't updated this one in a while — the last time was in September — mostly because there seem to be some issues with Mathematica's FinancialData[] function such that it's no longer pulling in the archived data used to compute the projection. So I did a kind of kludgy workaround where I just overlaid an updated graph of the latest data on an old graphic:


Thursday, November 30, 2017

Comparing my inflation forecasts to data

Actually, when you look at the monetary information equilibrium (IE) model I've been tracking since 2014 (almost four years now, with only one quarter of data left) on its own, it's not half bad:


The performance is almost the same as the NY Fed's DSGE model (red):


A more detailed look at the residuals lets us see that both models have a bias (IE in blue, NY Fed in red):


The thing is that the monetary model looks even better if you consider the fact that it only has 2 parameters while the NY Fed DSGE model has 41 (!). But the real story here is in the gray dashed and green dotted curves in the graph above. They represent an "ideal" model (essentially a smoothed version of the data) and a constant inflation model — the statistics of their residuals match extremely well. That is to say that constant inflation captures about as much information as is available in the data. This is exactly the story of the dynamic information equilibrium model (last updated here) which says that PCE inflation should be constant [1]:


Longtime readers may remember that a year ago, after being asked to add a constant model to my reconstructions of the comparisons in Edge and Gurkaynak (2011), I noted that it didn't do so well against various models, including DSGE models. However, there are two additional pieces of information: first, that was a constant 2% inflation model (the dynamic equilibrium rate is 1.7% [2]); second, the time period used in Edge and Gurkaynak (2011) contains the tail end of the 70s shock (beginning in the late 60s and persisting until the 90s) that I've associated with women entering the workforce:


The period studied by Edge and Gurkaynak (2011) was practically aligned with a constant inflation period per the dynamic information equilibrium model [3]. We can also see the likely source of the low bias of the monetary IE model — in fitting the ansatz for 〈k〉 (see here) we are actually fitting to a fading non-equilibrium shock. That results in an over-estimate of the rate of the slow fall in 〈k〉 we should expect in an ensemble model, which in turn results in a monetary model exhibiting slowly decreasing inflation over the period of performance for this forecast instead of roughly constant inflation.

We can learn a lot from these comparisons of models to data. For example, if you have long term processes (e.g. women entering the workforce), the time periods you use to compare models are going to matter a lot. Another example: constant inflation is actually hard to beat for 21st-century inflation — which means the information content of the inflation time series is pretty low (and that complex models are probably flat-out wrong). A corollary is that it's not entirely clear monetary policy does anything. Yet another example: if 〈k〉 is falling for inflation in the IE model, it is falling much more slowly than we can resolve in the data.
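To make the "constant inflation is hard to beat" point concrete, here's a minimal sketch using synthetic data standing in for PCE inflation (not the actual series or the actual models — the noise level is made up) that compares the residuals of a constant-inflation model with a smoothed "ideal" model:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for year-over-year PCE inflation: the 1.7% dynamic
# equilibrium rate quoted above plus noise (the noise level is made up).
n = 60                                   # e.g. 15 years of quarterly data
infl = 1.7 + rng.normal(0.0, 0.5, n)

w = 9
smoothed = np.convolve(infl, np.ones(w) / w, mode="same")  # "ideal" model
core = slice(w // 2, n - w // 2)         # drop the zero-padded edges

for name, model in [("constant 1.7%", np.full(n, 1.7)),
                    ("smoothed 'ideal'", smoothed)]:
    r = (infl - model)[core]
    print(f"{name:>16}: bias {r.mean():+.3f} pp, RMS {r.std():.3f} pp")
```

If the data really is just a constant rate plus noise, the residual statistics of the two come out nearly the same — which is the situation the graphs above suggest for the actual inflation data.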

Part of the reason I started my blog and tried to apply some models to empirical data myself was that I started to feel like macroeconomic theory — especially when it came to inflation — seemed unable to "add value" beyond what you could do with some simple curve fitting. I've only become more convinced of that over time. Even if the information equilibrium approach turns out to be wrong, the capacity of the resulting functional forms to capture the variation in the data with only a few parameters severely constrains the relevant complexity [4] of macroeconomic models.

...

Footnotes:

[1] See also here and here for some additional discussion, and where I previously made the point about the dynamic equilibrium model being effectively a constant inflation model.

[2] See also this on "2% inflation".

[3] You may notice the small shock in 2013. It was added based on information (i.e. a corresponding shock) in nominal output in the "quantity theory of labor" model. It is so small it is largely irrelevant to the model and the discussion.

[4] This idea of relevant complexity is related to relevant information in the information bottleneck, as well as to effective information in Erik Hoel's discussion of emergence that I talk about here. By related, I mean I think it is actually the same thing, but I am just too lazy and/or dumb to show it formally. The underlying idea is that describing a set of data well enough with a function of only a few parameters is the same process at work in the information bottleneck (a few neuron states capture the relevant information of the input data) and in Hoel's emergence (where you encode the data in the most efficient way — with the fewest symbols).

Tuesday, November 28, 2017

Dynamic information equilibrium: world population since the neolithic

Apropos of nothing (well, apropos of Matthew Yglesias's new newsletter, where he referenced this book from Kyle Harper on Ancient Rome), I decided to try the dynamic information equilibrium model on world population data. I assumed the equilibrium growth rate was zero and fit the model to the data. The prediction is about 12.5 billion humans in 2100 (putting it toward the middle-to-high end of these projections), with an equilibrium population of about 13.4 billion.
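For concreteness, "equilibrium growth rate of zero" means the log of population is flat except during the logistic transitions — a sketch of the form being fit:

log P(t) ≈ c + Σᵢ aᵢ σ((t − tᵢ)/wᵢ)

with no slope term, so the population saturates at exp(c + Σᵢ aᵢ) — the ≈ 13.4 billion equilibrium value above — once the last transition completes.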

There were four significant transitions in the data centered at 2390 BCE, 500 BCE, 1424, and 1954. The widths (transition durations) were ~ 1000 years, between 0 and 100 years (highly uncertain, but small), ~ 300 years, and ~ 50 years, respectively. Historically, we can associate the first with the neolithic revolution following the Holocene Climate Optimum (HCO). The second appears right around the dawn of the Roman Republic. The third follows the Medieval Warm Period (MWP) and is possibly another agricultural revolution that is ending, while the final one is our modern world and is likely associated with public health and medical advances (it began near the turn of the century in 1900). Here's what a graph looks like:


I included some random items from (mostly) Western history to give readers some points of reference. The interesting thing is that "exponential growth" with a positive growth rate of 1% to 2% is really only a local approximation. Over history, the population growth rate is typically zero:


Some major technology developments seem to happen on the leading edge of these transitions (writing, money, the horse collar/heavy plow, computers). Maybe a more systematic study of technology would yield some pattern — my hypothesis (i.e. random guess) is that there are bursts of tech development associated with these transitions as people try to handle the changes in society during the population surges. There are likely social organization changes as well — the third transition roughly coincides with the rise of nation-states, and the fourth with modern urbanization.

Tuesday, November 21, 2017

Dynamic information equilibrium: UK CPI

The dynamic information equilibrium CPI model doesn't just apply to the US. Here is the UK version (data is yellow, model is blue):


The forecast is for inflation to run at about 2.1% (close to the US dynamic equilibrium of 2.5%) in the absence of shocks: