Monday, December 25, 2017

Checking some long term market forecasts against data

Since Mathematica seems to have stopped supporting S&P 500 data, I've finally recovered the original forecast using some cobbled-together data sources with a bit of re-sampling:


Most of the rise since the beginning of 2017 has been pretty much on trend; the most recent data is a bit above (probably due to the increased likelihood of stock buybacks in the wake of the tax bill passing) — but still within the expected error.

The same events are likely influencing the bond market with yields up recently, but again the path is consistent with the forecast I made in 2015:



*  *  *

And then there's bitcoin ...

Bitcoin went through a bit of a crash recently, which has helped reduce the uncertainty in the expected path (per the model I've been following here):


As the previous link states, I've given up on this as a useful forecasting tool (bitcoin is too volatile, and estimates of shock amplitudes seem to start out too small and end up too large). Instead it's more of a post hoc description of the data that is consistent with dynamic equilibrium. The only thing I'd take seriously from this graph is the slope of the future path (i.e. down, due to the −2.6/y dynamic information equilibrium rate). Another shock could hit in 2018 (or the current shock could continue, with the recent fall being a temporary fluctuation), but in the absence of future shocks, and taking at face value the model's estimate that the current shock is mostly over, the graph above is what I'd expect [1]. Note that the best fit suggests a rebound from the current losses, but new data could easily revise that estimate (which is why I consider the model useless for forecasting, but not necessarily for describing data after the fact).
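For concreteness, the shock-free expected path is just a constant slope in log price. A minimal sketch (the −2.6/y rate is from the model above; the starting price is a hypothetical placeholder, not a value from this post):

```python
import math

# Dynamic information equilibrium rate (per year), from the model above
alpha = -2.6

def expected_path(p0, times):
    """Expected shock-free path: log p(t) = log p0 + alpha * t."""
    return [p0 * math.exp(alpha * t) for t in times]

# Hypothetical starting price, for illustration only
times = [0.0, 0.5, 1.0]
path = expected_path(15000.0, times)
# Absent shocks, the price falls by a factor of exp(-2.6) ≈ 0.074 per year
```

Any future shock would add a (logistic) term on top of this log-linear decline, which is why the slope, not the level, is the only part I'd take seriously.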

...

Footnotes:

[1] One thing that I find interesting is that in other cases the exchange rates between two currencies represent (effectively) a ratio of the GDPs of the two countries. Is this bitcoin exchange rate a ratio of the "bitcoin economy GDP" to the US GDP? There seems to be a financial industry swarming around bitcoin (with e.g. futures markets just recently) which some people seem to be making money off of (regardless of its long term sustainability). I'd say there are far more questions than answers here.

Wednesday, December 20, 2017

18 signs you are not having a productive conversation about economics

Inspired by Beatrice Cherrier's tweet thread containing both Chris Auld's "18 signs you’re reading bad criticism of economics" and UnlearningEcon's "18 Signs Economists Haven’t the Foggiest", I'd like to present my "18 signs you are not having a productive conversation about economics":

  1. Contains any discussion about the usefulness or lack of usefulness of mathematics that does not reference empirical data
  2. Makes claims about assumptions' realism or lack thereof without reference to empirical data
  3. Contains any reference to "all models are false", pro or con
  4. The cause of recessions is assumed or claimed to be known
  5. Elides the difference between macro- and micro-economics
  6. Uses the elision of macro- and micro-economics as an argument against a critique of economics, or ignores the fact that most people think of macroeconomics when they think of "economics" in their daily lives
  7. Places any significant meaning on the words "neoclassical", "neoliberal", or "mainstream" except as a historian of economic thought or via citing one
  8. Assumes accurate forecasting is the only use of macroeconomic theory
  9. Insinuates inaccurate forecasting is not a problem for macroeconomic theory
  10. Ignores the fact that existing DSGE models are no better than simple regressions or stochastic processes for forecasting
  11. Ignores the fact that the finite difference equations used in the DSGE framework are sufficiently general so as to encompass nearly any dynamic model
  12. Uses the existence of autoregressive models, use of microeconomic data, or estimating parameters of structural models to argue that macroeconomics is sufficiently empirical
  13. Elides the difference between "not rejected" and "useful"
  14. Elides the difference between academic and popular economics  
  15. Uses the elision of the difference between academic and popular economics as an argument against a critique of economics 
  16. Makes a claim about the history of economic thought without referencing a historian of economic thought
  17. The cause of inflation is assumed or claimed to be known
  18. Mentions "money", or critiques a mention of "money" by asking what is meant by "money"


Tuesday, December 19, 2017

Emergence, rationality, and human behavior

Irrational mind from rational brain; illustration by Douglas Hofstadter in Gödel, Escher, Bach. Is this a perturbation to rationality or emergent from rationality?

[Gary] Becker acknowledged that ... he was pushing the envelope. In his Nobel address he discusses this explicitly (Becker 1993). “I have intentionally chosen certain topics for my research—such as addiction—to probe the boundaries of rational choice theory. … My work may have sometimes assumed too much rationality, but I believe it has been an antidote to the extensive research that does not credit people with enough rationality”. 
Becker’s last sentence suggests an alternative definition of behavioral economics: crediting people with just the right amount of rationality and human foibles. The trick is in figuring out what is just the right amount. The approach taken by most behavioral economists has been to focus on a few important ways in which [Homo sapiens, HS] diverge from Homo economicus [HE].
That is from a recent article by Richard Thaler in JPE. Thaler's characterization suggests a particular interpretation of economic theory where real humans are perturbations from a rational Homo economicus:

(1) HS = HE + dB

Now this "behavior perturbation theory" formulation is entirely plausible. It represents the typical approach to quantum field theory where the electrons we observe are "ideal" electrons plus quantum interactions (dI):

Electron = Ideal electron + dI

However, this formulation is incorrect for thermodynamics. We do not say [1] that diffusion is an atom plus some diffusion perturbation (dD):

(✕) Diffusing atom = Atom + dD

but rather it is an entropic force that does not exist for a single atom and requires a collection of a large number of atoms [7]:

Diffusion = d(Σ Atoms)

Diffusion is a pseudo-force that exists due to the possible configurations of a collection of atoms, but not for any single atom. This sets up a different paradigm from (1). Economic forces might well arise from collections of humans:

(2) Economic forces = d(Σ HS)

That is to say that economic forces (macro or micro) are gradients in the state space accessible by real humans. In this paradigm, you might be able to observe many behavioral effects (dB) relative to Homo economicus that don't show up in economic forces as a simple aggregate, if at all. This might explain Thaler's lament:
The field of behavioral economics has been around for more than three decades, but the application of its findings to societal problems has only recently been catching on.
Could this be because:

d(Σ HS) ≈ d(Σ HE)

(i.e. on average the idiosyncratic human behaviors dB 'cancel' such that Σ dB ≈ 0)? Or what about:

d(Σ HS) ≈ HE?

This approximation is something I speculated about awhile ago (that a rational representative agent might be emergent from irrational humans). My own particular view is that macroeconomics can be understood using (2) with the addition of perturbations due to collective behavior (CB):

(3) Economic forces = d(Σ HS) + dCB

These "collective perturbations" [6] are things like so-called groupthink or the panics that accompany financial crises that lead to correlations among Homo sapiens. This would also include information cascades (where the "conventional wisdom" undergoes a "phase transition" from thinking e.g. fundamentals justify asset prices to "this is a bubble"). Notice that you can't really have "groupthink" or "information cascades" without a collection of Homo sapiens or other agents.
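Returning to the Σ dB ≈ 0 possibility: it's straightforward to sketch numerically. A toy simulation (all values hypothetical, assuming the idiosyncratic deviations are zero-mean and uncorrelated):

```python
import random

random.seed(42)

N = 100_000
HE = 1.0  # the "rational" Homo economicus value (hypothetical units)

# Homo sapiens = Homo economicus + idiosyncratic behavioral deviation dB
agents = [HE + random.gauss(0.0, 0.5) for _ in range(N)]

# With zero-mean, uncorrelated dB the deviations largely cancel on
# aggregation: the average of many "irrational" agents sits close to
# the rational value, so dB need not appear at the macro scale at all
aggregate_HS = sum(agents) / N
```

Correlated deviations (the groupthink and information cascades above) are exactly the case where this cancellation fails, which is what the dCB term is meant to capture.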

Overall, there are several possible paradigms for understanding economic forces, and each of these paradigms indicates which effects are important. But there's another way to organize these paradigms in terms of scales, emergence, and information theory. Erik Hoel wrote an interesting paper (or rather series of papers) that I discuss here where he uses information theory to describe how theories explain observations and data. You can think of a theory as a way to decode empirical evidence. The codes can be more or less efficient depending on how much information is lost (and whether lost information is relevant information). But it is unlikely a single code (theory) is efficient across all messages (phenomena), so you end up with specific codes that work well for collections of phenomena. The collections of phenomena are split up by what we can call scales, and so in economics we might have a macro scale (economic forces) and a micro scale (humans). There is no reason that a particular code (theory) at one scale has to have anything to do with a code at a different scale, and it also does not necessarily make sense to cross over from one scale to another because each scale has its own agents (degrees of freedom).

That may seem like a tangent, but I think it is critical to understanding the economic theory paradigms above. A behavioral theory may well describe Homo sapiens at the individual human scale, but may not apply (or have any direct analog) at the economic/macro scale [2]. Paradigm (3) keeps everything at the collective scale (collective forces with collective perturbations [3]) while paradigm (1) tells us that behavior crosses scales from individuals to economic forces among societies. Of course, either paradigm could be the most efficient way to understand economic theory, but I think there is a bit of an over-simplification in Thaler's 'Homo economicus plus behavioral perturbations' view [4]. However since economic forces at the macro scale aren't very well understood, I don't think we should limit ourselves to any specific paradigm [5] — but we should keep in mind that the connection between theories at different scales may be tenuous (or simply not informative).

At the top of this post, I have a picture of an illustration from one of my favorite books: Gödel, Escher, Bach by Douglas Hofstadter. Hofstadter is trying to illustrate how rationality (illustrated by math facts) at one level (computer code or neuron behavior) can in fact result in irrationality (illustrated by 2+2=5) at another level (an artificial intelligence or human behavior). The connection between 10+6=16 and 2+2=5 is more than just tenuous; it is a complete logical disconnect. But this picture illustrates my point about how the "perturbations to rationality" view is limiting: in what sense is the irrationality of 2+2=5 a perturbation to the rationality of 10+6=16?

This disconnect could also occur when transitioning from the human scale to the economic scale in my modification of Hofstadter's illustration:



The irrationally behaving humans (made up of rationally behaving machine-like neurons) can themselves be collected into a rational macroeconomic system. With this post, I am only emphasizing that this is a theoretical possibility given the present state of understanding economic forces — something to keep in mind.

...

Footnotes:

[1] It is possible to set up an effective theory that works like this, but it would be far more limited in scope than the correct approach. Actually, I think that is a good way to think about behavioral economics formulated as perturbation theory: it functions over a limited scope.

[2] This could also apply to agent-based modelling. I'm not saying it is definitely true or anything like that, only that assuming agent-based modelling is the only way to answer questions is potentially flawed (more on that here).

[3] This is not to say that collective perturbations are not reducible to individual behavior, but rather that I am thinking in terms of "weak emergence" where the theory of individual behavior that yields the collective behavior is not as informative or useful as just understanding the collective behavior at its own scale.

[4] Thaler may well be adopting this view for the purpose of persuasion: "economic theory is fine with some behavioral tweaks" rather than "burn it down".

[5] Individuals can limit themselves to study one paradigm if they want.

[6] These perturbations fall under the heading of "non-ideal information transfer".

[7] This "schematic" equation relates directly to the gradient ("d") of the entropy ("Σ Atoms") in the actual definition of an entropic force.

Monday, December 18, 2017

Big facts

Chris Dillow has a great post about "big facts" in economics which he illustrates with three examples. The first is the Efficient Markets Hypothesis:
Take, for example, the efficient markets hypothesis. Researchers have found countless small facts that seem to refute this – well over 100 anomalies at the last count. All these, however, run into the Big Fact – that fund managers do not beat the market.
The second is involuntary unemployment:
Here’s a second example. A Big Fact is that the unemployed are significantly less happy than those in work. This is inconsistent with ideas that unemployment is voluntary: people should be happy if they’re on holiday. It thus refutes labour market-clearing real business cycle theories.
The third is the inability to predict recessions:
A third Big Fact is that mainstream economic forecasters have consistently failed to predict recessions – something which pre-dates the 2008 crisis – and in fact are much worse recession predictors than the simple yield curve.
Off the top of my head, I think I'd only add Okun's law to this list. Maybe the disappearing Phillips curve as well. But overall, I've addressed all three of these big facts using the information equilibrium framework:

Thursday, December 14, 2017

Dynamic information equilibrium model of initial claims


Weekly initial unemployment insurance claims (ICSA) were put up on FRED today, and so I thought I'd apply the dynamic information equilibrium model to the data. First, we need to divide by the size of the Civilian Labor Force (CLF) per the model $ICSA \rightleftarrows CLF$ in order to observe the ratio such that:

$$
\frac{d}{dt} \log \frac{ICSA}{CLF} \approx (k-1) r + \sum_{i} \sigma_{i}(t) \equiv \alpha + \sum_{i} \sigma_{i}(t)
$$

For the data since the mid-90s, there are two shocks $\sigma_{i}(t)$ corresponding to the early 2000s recession and "the Great Recession". The forecast, conditional on the absence of shocks (detectable e.g. via this algorithm), calls for a continued fall in the initial claims rate:


The dynamic equilibrium rate $\alpha$ is −0.103/y (that is to say roughly a 10% relative fall per year — e.g. a 1% initial claims rate would fall by 0.1 percentage points to 0.9%). I'll continue to follow this forecast in the new year.
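The quoted rate can be sanity-checked directly: between shocks, the model reduces to log-linear decay of the claims ratio at the rate α. A minimal sketch (assuming the shock terms are zero):

```python
import math

alpha = -0.103  # dynamic equilibrium rate for ICSA/CLF (per year)

def claims_ratio(r0, t):
    """Initial claims rate t years out, absent shocks:
    d/dt log(ICSA/CLF) = alpha  =>  r(t) = r0 * exp(alpha * t)"""
    return r0 * math.exp(alpha * t)

# A 1% initial claims rate falls to roughly 0.9% after one year
r1 = claims_ratio(0.01, 1.0)
```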

...

Update 1 August 2019

lol. I never did track this one until today (a year and a half later). It did fine — here's the original graph and a zoomed-in version:



Wednesday, December 13, 2017

On these 33 theses


The other day, Rethinking Economics and the New Weather Institute published "33 theses" and metaphorically nailed them to the doors of the London School of Economics [1]. They're re-published here. I think the "Protestant Reformation" metaphor they're going for is definitely appropriate: they're aiming to replace "neoclassical economics" — the Roman Catholic dogma in this metaphor — with a pluralistic set of different dogmas — the various dogmas of the Protestant denominations (Lutheran, Anabaptist, Calvinist, Presbyterian, etc). For example, Thesis 2 says:
2. The distribution of wealth and income are fundamental to economic reality and should be so in economic theory.
This may well be true, but a scientific approach does not assert it; instead it collects empirical evidence for hypotheses about observables that are affected by the distribution of wealth. A dogmatic approach just assumes it, which is just as dogmatic as neoclassical economics assuming the market distribution is efficient.

In fact, several of the theses are dogmatic assertions of things that either have tenuous empirical evidence in their favor or are simply untested hypotheses. These theses are not things you dogmatically assert, but rather things you should show with evidence:
11. ... Economics needs a deeper understanding of how markets behave, and could learn from the science of complex systems ...
21. ... Our understanding of GDP growth may be improved if we see innovation as occurring within a constantly-evolving, disequilibrium ecosystem ...
23. Private debt also profoundly influences the rate at which the economy grows[,] and yet is excluded from economic theory. The creation of debt adds credit-financed demand, and affects both goods and asset markets. ...
25. The way in which money is created affects the distribution of wealth within society. ...
27. Economics needs a better understanding of how instability and crises can be created internally within markets, rather than treating them as ‘shocks’ that affect markets from the outside.
There were two more theses that are also dogmatic assertions but can in fact be shown to be false in one relatively empirically accurate approach:
4. Policy does not ‘level’ the playing field, but tilts it in a direction.
12. Institutions shape markets, and influence the behaviour of all economic actors. ...
To first order in the information equilibrium approach, it seems empirically that economic policy does not affect a lot of things, from GDP to unemployment, over the bulk of the time series, and many institutions (e.g. the business news, central banks) serve as nuclei of coordination (groupthink) that cause problems (e.g. financial crises) for the sparse shocks in the time series. I'm not asserting this approach is correct, but rather that these theses exclude information equilibrium from the purportedly pluralistic set of approaches.

As with the other theses above, this is primarily because they represent hypotheses that need to be tested that are not true in many theories and may not be true in reality. I've written that focusing on human decision-making may be a longstanding bias (unchallenged assumption) in economic theory that has held it back. But I would not write up a manifesto where one thesis is that human decisions don't matter. Leave it to research to figure out [2].

A few theses seem like the authors regret choosing economics and wish they'd chosen a different field like ecology, psychology, or physics:
6. ... [The economy] depends upon a continual through-flow of energy and matter, and operates within a delicately balanced biosphere.
11. ... Economics needs a deeper understanding of how markets behave, and could learn from the science of complex systems, as used in physics, biology, and computing.
15. ... Mainstream economics therefore needs a broader understanding of human behaviour, and can learn from sociology, psychology, philosophy, and other schools of thought.
The first entry in the last section on teaching economics almost had me spitting out my coffee:
29. [Economics education should] also [include] a wide range of current perspectives – such as institutional, Austrian, Marxian, post-Keynesian, feminist, ecological, and complexity.
Austrian economics is basically garbage, and I'm not sure what they mean by complexity. Usually teaching (at least at the undergraduate level) is reserved for approaches that have withstood the test of time and are generally agreed as useful [3]. The speculative or alternative methodologies are more appropriate to graduate school (they require more critical thinking skills). You write a thesis on post-Keynesian economics, but you take a test on supply and demand.

This thesis was weird given that the third rationale in the preamble said that economics wasn't being scientific:
31. Economics should not be taught as a value-neutral study of models and individuals. ...
It's true that sometimes science isn't "value neutral", but the key here is admitting and documenting your values and biases — not saying that you should go ahead and include your value system in a scientific approach to your subject.

*  *  *

Overall, these 33 theses represent a lot of unfounded assumptions and hypotheses presented as facts, which feels to me almost more dogmatic than "mainstream" (or neoclassical) economics. A neoclassical synthesis education has produced a range of voices from Brad DeLong to John Cochrane (and even Steve Keen, who has a PhD in economics). I agree that economics should be taught with eyes wide open to where the approach gets things wrong, but it seems that a traditional economics education is just as likely to generate a person who can question the mainstream (both DeLong and Cochrane do so extensively!) as any other field that teaches critical thinking — dare I say physics?

In the end, this effort is not terribly dissimilar to a reviewer asking why you didn't cite his or her paper in your manuscript, or an attendee at a conference asking a question about how the presenter's approach relates to the attendee's research. Why don't college economics classes teach my research? I ask that as a tongue-in-cheek rhetorical question about my own approach (of course they shouldn't teach information equilibrium in economics textbooks yet [4]), but I think Steve Keen actually intends to get his approaches [1] in their current form into introductory economics textbooks.

Again, the way to be scientific is not to make assertions about how inequality affects growth. In fact, I would question the results in a paper on that subject from anyone signing on to these theses. These economists have admitted a bias in favor of finding a negative effect of increased inequality on an economy. When their paper comes out and says that inequality causes lower growth, my first reaction is going to be "of course you found that", not "I want to read this paper". It is no different from the way I look at papers from Tyler Cowen or John Cochrane that say lower taxes or less regulation improve growth. In a document that says economics has fallen short of the standard of science, explicitly claiming bias on major research topics seems like an odd choice.

...

PS Some of the writing is just funny:
5. The nature of the economy is that it is a subset of nature ...
...

Footnotes:

[1] Steve Keen was one of the people "nailing" the document to the doors. Part of the document says that economics is "developing more as a faith than as a science" which I find terribly ironic. If I had to choose an approach that was less scientific than mainstream economics, I'd probably choose Keen's (see here, here, here, here, or here). Kate Raworth was also in the pictures and I have issues with her dogmatic approach as well.

[2] This is basically a problem that I've found many times in economics: what should be a research question is instead asserted as a definition or dogmatic position. I documented several of these with regard to what recessions are. If these 33 theses are supposed to form the basis of an economic theory framework, then it shouldn't make major assumptions about the objects of study in that framework. Economic theory is supposed to study the effects of inequality and recessions, not assert their properties.

[3] What's funny about this is that the 33 theses starts out with a statement that "neoclassical economics made a contribution historically and is still useful", which would mean that the consensus framework among the various pluralistic approaches that should be taught in undergraduate curriculum is in fact neoclassical economics.

[4] But that doesn't stop me from dreaming ... [see here and here]

How is the CPI forecast holding up?

Let's update the dynamic information equilibrium CPI (all items) forecast graph with the latest data (previous update here):



Here is the year-over-year inflation version as well:



How is the civilian labor force participation forecast holding up?

I've added some recent points to the prime age Civilian Labor Force (CLF) participation rate forecast (both the original as well as the "tiny 2016 shock" version):



Note: there appears to be a slight difference in the solution found for the original shock. It is possible I altered the initial guess for the optimization, or maybe my newly updated Mathematica 11 from a couple weeks ago chose a different optimization method with the "Automatic" setting. It is not a huge difference (the post-2015 period appears shifted down by about 0.1 percentage points relative to the previous solution), but I'll try to find the reason and recover the previous result.

Monday, December 11, 2017

JOLTS data day!

The latest data from the Job Openings and Labor Turnover Survey is out today on FRED and we're here with another update of the forecast performance/recession indicator. Here are the hires and openings data:


Here's the update of the hires shock counterfactual evolution (a fall in hires might be a leading indicator):


Here's the updated Beveridge curve as well:


Sunday, December 10, 2017

Another unemployment rate forecast comparison

Paul Romer tweeted a graph of an unemployment rate projection. I'm not sure where it came from — my guess is the World Bank — but I thought I'd add it to the forecast from the dynamic information equilibrium model last updated here. Already it (thick dark blue line) seems to be fairly wrong (no confidence limits were given; the dynamic equilibrium model uses 90%):


Of course the dynamic information equilibrium forecast is conditional on the lack of shocks (which can be identified via the algorithm discussed here). The forecast Romer tweeted could be the result of a very broad but small amplitude shock to the dynamic equilibrium model, but such a shock would be unlike any other adverse shock in the US data since the Great Depression.

Saturday, December 9, 2017

Latest unemployment numbers and structural unemployment

The latest monthly unemployment numbers for the US came out on Friday (unchanged at 4.1% from last month) and so I've yet again put the new data points on my old model forecast graphs to see how they're performing (just great, by the way — more details are below). There were several mentions of the old "structural unemployment" argument (made against fiscal or monetary stimulus in the wake of the financial crisis), noting that it hadn't held up well as unemployment has fallen to the lowest levels in years. In particular, Paul Krugman noted:
Remember when all the Very Serious People knew that high unemployment was structural, due to a massive skills gap, and could never be expected to return to pre-crisis levels?
He linked back to an old blog post of his where he showed an analysis from Goldman Sachs about state unemployment rates and then looked at unemployment rates and the subsequent recovery by occupation. The data showed that occupations (and states) that had been hit harder (unemployment increased more) had recovered faster (unemployment had declined more). Krugman said this indicated unemployment was cyclical, not structural:
So the states that took the biggest hit have recovered faster than the rest of the country, which is what you’d expect if it was all cycle, not structural change. ... the occupations that took the biggest hit have had the strongest recoveries. In short, the data strongly point toward a cyclical, not a structural story ...
What was interesting to me was that the data Krugman showed was actually just a result of the dynamic information equilibrium model — the larger the shock, the faster the fall, since the dynamic information equilibrium is a constant slope of (d/dt) log u(t). In fact, the data Krugman shows match up pretty well with the result you'd expect from the dynamic equilibrium model:


This tells us that the dynamic equilibrium is the same across different occupations (much like how the dynamic equilibrium is the same for different races, or for different measures of the unemployment rate). All of this suggests that unemployment recoveries [1] are closer to a force of nature (or "deep structural parameters" in discussions of the Lucas critique). But on another level, this is also just additional confirmation of the usefulness of the dynamic equilibrium model for unemployment.
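The "bigger hit, faster recovery" pattern follows directly from a constant (d/dt) log u: the relative decline rate is shared, so larger shocks produce larger absolute falls. A toy sketch (the rate and shock sizes here are hypothetical, chosen only for illustration):

```python
import math

# Hypothetical dynamic equilibrium rate for log u (per year);
# the fitted rate is in the posts linked above
alpha = -0.09

def u(t, u_eq, shock):
    """Unemployment rate t years after a shock, decaying at a constant
    relative rate: u(t) = (u_eq + shock) * exp(alpha * t)."""
    return (u_eq + shock) * math.exp(alpha * t)

# Two occupations: one hit harder (bigger shock) than the other
u_small = [u(t, 4.0, 2.0) for t in (0, 1, 2)]
u_big = [u(t, 4.0, 6.0) for t in (0, 1, 2)]

# The relative decline is identical, but the absolute fall is larger
# for the harder-hit occupation: "bigger hit, faster recovery"
drop_small = u_small[0] - u_small[2]
drop_big = u_big[0] - u_big[2]
```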

*  *  *

As I mentioned above, I also wanted to show how the forecasts were doing. The first graph is the model forecast alone. The second graph shows comparisons with the (frequently revised) forecasts from the FRB SF. The third graph shows a comparison with the (also revised) forecast from the FOMC.





...

Footnotes:

[1] The shocks to unemployment are non-equilibrium processes in this model. It remains an open question whether these shocks can be affected by policy, or whether they too are a force of nature.

Tuesday, December 5, 2017

Does increased compensation cause increased productivity?

Noah Smith has an article at Bloomberg View asking why compensation hasn't risen in lockstep with productivity. Recent research seems to say it at least rises a little when output rises in the short run, but not one-for-one:
This story gets some empirical support from a new study by economists Anna Stansbury and Larry Summers, presented at a recent conference at the Peterson Institute for International Economics. Instead of simply looking at the long-term trend, Stansbury and Summers focus on more short-term changes. They find that there’s a correlation between productivity and wages — when productivity rises, wages also tend to rise. Jared Bernstein, senior fellow at the Center on Budget and Policy Priorities, checked the results, and found basically the same thing.
I thought the long run data would be a good candidate for the dynamic information equilibrium model, but it produced some surprising results. It's true that the models appear correlated. Real output per hour (OPH) seems to rise faster at 1.46%/y, while real compensation per hour (CPH) rises at about 0.45%/y. This has held up throughout the data that isn't subject to a non-equilibrium shock (roughly the "Great Moderation" and the post-global financial crisis period).

But the interesting part of this particular framing of the data is the timing of the shocks — shocks to real compensation per hour precede shocks to real output per hour:


The shocks to CPH (t = 1952.1 and t = 1999.6) precede the shocks to OPH (t = 1959.8 and t = 2001.0). Real compensation increases before real output increases. It's not that compensation captures some part of rising output; it's that giving people raises increases productivity.

Now it is entirely possible this framing of the data isn't correct (there is a less statistically significant version of the dynamic equilibrium that sees the periods of the shocks as the equilibrium, and the 80s and 90s as well as the post-crisis period as the shocks). However, there is some additional circumstantial evidence that the productivity shocks correspond to real world events. The late 90s shock seems associated with the introduction of the internet to a wider audience than defense and education, while the 40s and 50s shock is likely associated with a post-war increase in production efficiency in the US. It is possible increased compensation is due to increased skills required to use new technologies and methods — with those raises and increased starting salaries needing to happen before firms can implement these technology upgrades [it costs more to get labor with the latest skills]. Those could well be just-so stories (economists like stories, right?), but I believe the most interesting aspect is simply the plausible existence of this entirely different (but mathematically consistent) way to look at the data, along with the entirely different policy implications (i.e. needing to find ways to directly raise wages instead of looking for ways to increase growth or productivity).
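To see what the two dynamic equilibrium rates imply over the long run, here's a small sketch using the fitted rates quoted above (the 30-year horizon is an arbitrary illustration, and shocks are assumed absent):

```python
import math

# Dynamic equilibrium growth rates from the fits in the post (per year)
g_oph = 0.0146  # real output per hour
g_cph = 0.0045  # real compensation per hour

def level(g, t, x0=1.0):
    """Log-linear dynamic equilibrium path: x(t) = x0 * exp(g * t)."""
    return x0 * math.exp(g * t)

# Over 30 shock-free years, the productivity/compensation gap grows by
# a factor of exp((g_oph - g_cph) * 30) ≈ 1.35
gap = level(g_oph, 30) / level(g_cph, 30)
```

This steady widening between the equilibrium paths is why the long-term productivity-compensation divergence shows up even before any shocks are considered.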

...

Update: I added a bit of clarifying text — e.g. "upgrades" and the bracketed parenthetical — in the last paragraph, which was ambiguous. Also the reference to the "post-crisis period" replaces "2010s" because the latter could be confused with the actual shock to output in 2008/9, whereas I am actually referring to the period after the shock that we are still in.

Monday, December 4, 2017

Supply and demand and science

Sometimes recognizing a symmetry is the biggest step.

Sometimes when you give a seminar or teach a class a student or attendee brings up a point that perfectly sets you up to explain your topic. A few tweets from @UnlearningEcon and Steve Roth gave me this opportunity today:
UE: Confused by the amount of times it is claimed demand and supply has loads of empirical evidence behind it when I've barely seen any 
UE: Naming a couple of obvious insights or point estimates is not sufficient to prove the full model! Can somebody give me an actual falsifiable test, please? 
UE: Conclusion: demand-supply is largely an article of faith which people 'prove' with a couple of casual observations. Science! 
SR: Thing is you can't measure demand (desire) and supply (willingness) to buy/sell — necessarily, across a range of prices at a point in time. Only observe the P/Q where they meet. Why S/D diagrams are always dimensionless.
There's an almost perfect analogy here with the concept of "force" in physics. Force F, often described using the equation F = m a [1] or better F = dp/dt, is actually a definition. That is to say it's F ≡ dp/dt. At the time of Newton [2], it was an article of faith. It was an article of faith that organized a bunch of disparate empirical "point estimates" and insights from Kepler and Galileo.

That is all to say Newton represents more of a statement of "looking at it this way, it's much simpler" than an application of the oversimplified "scientific method" we were all taught in school that involves forming a hypothesis, collecting data, and using that data to confirm or reject the hypothesis. Unfortunately contributions to science like Newton's aren't easily reproduced in classrooms, so most people end up thinking that hypothesis testing with data is all there is. Note that Einstein's famous contributions were like Newton's in the sense that they organized a bunch of disparate empirical point estimates (in this case, deviations from a Newtonian world).

Though Newton and Einstein get all the plaudits, both of their big contributions are really specific instances of what is probably one of the greatest contributions to physics of all time: Noether's theorem. Emmy Noether was asked by David Hilbert about energy conservation in General Relativity, but she ended up proving the more general result that conservation laws are consequences of symmetry principles. Newton's symmetry was Galilean invariance; Einstein's were Lorentz covariance (special relativity) and general covariance (general relativity). Newton's laws are really just a consequence of conservation of momentum.

That gives us a really good way to think about Newton-like contributions to science: they involve recognizing (or steps towards recognizing) general symmetry principles.

What does this have to do with supply and demand?

This is where Steve Roth's tweet and the work on this blog come in. Supply and demand relationships seem to be a consequence of the scale invariance (the dimensionlessness Steve points out) of information equilibrium [3]. In fact, realizing supply and demand is a consequence of the scale invariance encapsulated by the information equilibrium condition gives us a handle on the scope conditions — where we should expect supply and demand to fail. And since those scope conditions (which involve e.g. changing supply much faster than demand can react) can easily fail in real-world scenarios, we shouldn't expect supply and demand as a general theory to always be observed and empirically validated. Good examples are labor and housing markets, where it is really hard to make supply change faster than demand (in the former case because adding workers adds wage earners, which adds demand; in the latter because it is impossible to add housing units fast enough). What we should expect is that when the right conditions are in effect, supply and demand will be a useful way to make predictions. One of my favorite examples uses Magic: The Gathering cards.
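
A minimal numerical illustration of that scale invariance (my own sketch; the index value k is hypothetical): the information equilibrium condition dD/dS = k·D/S is unchanged when D is rescaled by any constant, which is exactly the freedom to redefine demand up to a factor:

```python
import numpy as np

k = 1.7                       # hypothetical information transfer index
S = np.linspace(1.0, 10.0, 1001)

def max_rel_violation(D, S):
    """Max relative violation of dD/dS = k * D / S on the grid interior."""
    dD_dS = np.gradient(D, S)
    return np.max(np.abs(dD_dS[1:-1] / (k * D[1:-1] / S[1:-1]) - 1.0))

D = S ** k                                      # a solution: D ∝ S**k
err_original = max_rel_violation(D, S)
err_rescaled = max_rel_violation(100.0 * D, S)  # D -> 100 D still works
```

Both violations sit at the finite-difference error level, so a rescaled demand satisfies the same condition — the theory cannot pin down the overall normalization of D.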

Yes, I realize I have obliquely implied that I might be the Isaac Newton of economics [4]. But that isn't the main point I am trying to make here. I'm trying to show how attempts at "falsification" aren't the only way to proceed in science. Sometimes useful definitions help clarify disparate insights and point estimates without being directly falsifiable. The information equilibrium approach is one attempt at understanding the scope conditions of supply and demand. There might be other (better) ones. Without scope conditions however, supply and demand would be either falsified (since counterexamples exist) or unfalsifiable (defined in such a way as to be unobservable [5]).

...

Footnotes

[1] You may think that mass and acceleration are pretty good direct observables making force an empirical observation. While acceleration is measurable, mass is problematic given that what we really "measure" is force (weight) in a gravitational field (also posited by Newton). Sure, this cancels on a balance scale (m₁ g = m₂ g → m₁ = m₂), but trying to untangle the epistemological mess is best left to arguments over beer.

[2] Actually Newton's Lex II was a bit vague:
Lex II: Mutationem motus proportionalem esse vi motrici impressae, et fieri secundum lineam rectam qua vis illa imprimitur.
A somewhat direct translation is:
Second Law: The alteration of motion is ever proportional to the motive force impressed; and is made in the direction of the right line in which that force is impressed.
The modern understanding is:
Second Law: The change of momentum of a body is proportional to the impulse impressed on the body, and happens along the straight line on which that impulse is impressed.
Where momentum and impulse now have very specific definitions as opposed to "motive force" and "motion". This is best interpreted mathematically as

I ≡ Δp

where I is impulse and p is the momentum vector. The instantaneous force then follows via the fundamental theorem of calculus (so no additional assumptions about relationships in the world are needed):

I = ∫ dt F

F ≡ dp/dt

The alteration of "motion" (i.e. momentum) is Δp (or infinitesimal dp), and the rest of the definition says that the F vector (and impulse vector I) is parallel to the p vector. Newton would have written something like f = ẋ in his own notes using his fluxions (i.e. f = dx/dt).
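
A quick numerical check of these definitions (my own example with an arbitrary force profile, not anything from Newton): integrating the force over time reproduces the change in momentum, I = ∫F dt = Δp:

```python
import numpy as np

# Arbitrary time-varying force over 0..3 seconds
t = np.linspace(0.0, 3.0, 3001)
F = 5.0 * np.sin(t)                            # newtons

# Momentum from F = dp/dt with p(0) = 0, via the trapezoidal rule
dp = 0.5 * (F[1:] + F[:-1]) * np.diff(t)
p = np.concatenate(([0.0], np.cumsum(dp)))

impulse = np.sum(dp)                           # I = ∫ F dt
delta_p = p[-1] - p[0]                         # Δp
# impulse and delta_p agree, and both match the analytic 5 * (1 - cos 3)
```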

[3] I've talked about this on multiple occasions (here, here, here, or here).

[4] At the rate at which new ideas become incorporated into economic theory, I will probably have been dead for decades and someone else (with the proper credentials) will have independently come up with an equivalent framework.

[5] People often make the point that demand isn't directly observable (or as Steve Roth says, neither supply nor demand is observable). My tweet-length retort to this is that the wavefunction in quantum mechanics isn't directly observable either. In fact, given the scale invariance of the information equilibrium condition, we actually have the freedom to re-define demand as twice or half any given value. This is analogous to what is called gauge freedom in physics (a result of the gauge symmetry). The electric and magnetic potentials are only defined up to a "gauge transformation" and are therefore not directly observable.

To me, this is a very satisfying way to think about demand. It is not a direct observable, but we can compute things with a particular value knowing that we can scale it to any possible value we want (at least if we are careful not to break the scale invariance in the same way you try not to break the gauge invariance in gauge theories). Nominal GDP might be an incomplete measure of aggregate demand, but so long as aggregate demand is roughly proportional to NGDP we can proceed. What is important is whether the outputs of the theory are empirically accurate, such as this example for the labor market.

Information transfer economics: year in review 2017

I finally published my book in August of this year. Originally I was just going to have an e-book, but after requests for a physical paperback version I worked out the formatting. I'm glad I did — it looks nice!

With 2017 coming to a close, I wanted to put together a list of highlights like I did last year. This year was the year of dynamic information equilibrium as well as presentations. It was also the year I took some bigger steps in bringing my criticisms of economics and alternative approaches to the mainstream, having an article at Evonomics and publishing a book.

I'd like to thank everyone who reads, follows and shares on Feedly and Twitter, or who bought my book. It is really only through readers, word of mouth, and maybe your own blog posts on information equilibrium (like at Run Money Run) that there is any chance the ideas presented here might be heard or investigated by mainstream economists.

I'd also like to thank Cameron Murray for a great review of my book, Brennan Peterson for helping me edit my book, as well as Steve Roth at Evonomics (and Asymtosis) for being an advocate and editor of my article there.

Dynamic information equilibrium


The biggest thing to happen was the development of the dynamic information equilibrium approach to information equilibrium. The seeds were planted in the summer of 2014 in a discussion of search and matching theory where I noted that the rate of unemployment recovery was roughly constant — I called it a "remarkable recovery regularity". Another piece was looking at how the information equilibrium condition simplifies given an exponential ansatz. But the Aha! moment came when I saw this article at VoxEU.org that plotted the log of JOLTS data. I worked out the short "derivation", and applied it to the unemployment rate the next day.

Since that time, I have been tracking forecasts of the unemployment rate (and other measures) using the dynamic equilibrium model. I even put together what I called a "dynamic equilibrium history" of the US contra Friedman's monetary history. As opposed to other economic factors and theories, the post-war economic history of the US is almost completely described by the social transformation of women entering the workforce. Everything from high inflation in the 70s to the fading Phillips curve can be seen as a consequence of this demographic change.

Dotting the i's and crossing the t's


Instead of haphazardly putting links to my Google Drive, I finally created Github repositories for the Mathematica (in February) and eventually Python (in July) code. But the most important thing I did theoretically was rigorously derive the information equilibrium conditions for ensembles of markets, which turn out to obey equations formally similar to those for individual markets. This was a remarkable result (in my opinion) because it means that information equilibrium could apply to markets for multiple goods — and therefore macroeconomic systems. In a sense it makes rigorous the idea that the AD-AS model is formally similar to a supply and demand diagram (and under what scope it applies). The only difference is that we should also see slowly varying information transfer indices, which would manifest as, e.g., slowing growth as economic systems become large.

Connections to machine learning?


These are nascent intuitions, but there are strong formal similarities between information equilibrium and Generative Adversarial Networks (GANs) as well as a theoretical connection to what is called the "information bottleneck" in neural networks. I started looking into it this year, and I hope to explore these ideas further in the coming year!

Getting the word out


Over the past year or so, I think I finally reached a point where I sufficiently understood the ideas worked through on this blog that I could begin outreach in earnest. In May I published an article at Evonomics on Hayek and the price mechanism that works through the information equilibrium approach and its connection to Generative Adversarial Networks (GANs). In August, I (finally) published my book on how I ended up researching economics, my philosophy of approaching economic theory, and some of the insights I've learned over four years of work.

I also put together four presentations throughout the year (on dynamic equilibrium, a global overview, macro and ensembles, and forecasting). Several of my presentations and papers are collected at the link here. In November, I started doing "Twitter Talks" (threaded tweets with one slide and a bit of exposition per tweet), which were aided by the increase from 140 to 280 characters — in the middle of the first talk! They covered forecasting, macro and ensembles, as well as a version of my Evonomics article.

*  *  *

Thanks for reading everyone! This blog is a labor of love, written in my free time away from my regular job in signal processing research and development.

Saturday, December 2, 2017

Comparing the S&P 500 forecast to data (update)

I haven't updated this one in a while — the last time was in September — mostly because there seem to be some issues with Mathematica's FinancialData[] function such that it's no longer pulling in the archived data used to compute the projection. So I did a kind of kludgy workaround where I just overlaid an updated graph of the latest data on an old graphic:


Thursday, November 30, 2017

Comparing my inflation forecasts to data

Actually, when you look at the monetary information equilibrium (IE) model I've been tracking since 2014 (almost four years now, with only one quarter of data left) on its own, it's not half-bad:


The performance is almost the same as the NY Fed's DSGE model (red):


A more detailed look at the residuals lets us see that both models have a bias (IE in blue, NY Fed in red):


The thing is that the monetary model looks even better if you consider the fact that it only has 2 parameters while the NY Fed DSGE model has 41 (!). But the real story here is in the gray dashed and green dotted curves in the graph above. They represent an "ideal" model (essentially a smoothed version of the data) and a constant inflation model — the statistics of their residuals match extremely well. That is to say that constant inflation captures about as much information as is available in the data. This is exactly the story of the dynamic information equilibrium model (last updated here) which says that PCE inflation should be constant [1]:


Longtime readers may remember that I noted a year ago that a constant model didn't do so well in comparison to various models including DSGE models after being asked to add one to my reconstructions of the comparisons in Edge and Gurkaynak (2011). However there are two additional pieces of information: first, that was a constant 2% inflation model (the dynamic equilibrium rate is 1.7% [2]); second, the time period used in Edge and Gurkaynak (2011) contains the tail end of the 70s shock (beginning in the late 60s and persisting until the 90s) I've associated with women entering the workforce:


The period studied by Edge and Gurkaynak (2011) was practically aligned with a constant inflation period per the dynamic information equilibrium model [3]. We can also see the likely source of the low bias of the monetary IE model — in fitting the ansatz for 〈k〉 (see here) we are actually fitting to a fading non-equilibrium shock. That results in an over-estimate of the rate of the slow fall in 〈k〉 we should expect in an ensemble model, which in turn results in a monetary model exhibiting slowly decreasing inflation over the period of performance for this forecast instead of roughly constant inflation.

We can learn a lot from these comparisons of models to data. For example, if you have long term processes (e.g. women entering the workforce), the time periods you use to compare models are going to matter a lot. Another example: constant inflation is actually hard to beat for inflation in the 21st century — which means the information content of the inflation time series is actually pretty low (meaning complex models are probably flat-out wrong). A corollary of that is that it's not entirely clear monetary policy does anything. Yet another example is that if 〈k〉 is falling for inflation in the IE model, it is a much slower process than we can see in the data.
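
The "hard to beat" point can be sketched numerically (my own toy example with made-up numbers, not the actual PCE series): if inflation is a constant rate plus noise, the residuals of a constant model already sit at the noise floor, leaving essentially nothing for a more complex model to capture:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical quarterly inflation: constant 1.7% plus noise (20 years)
true_rate, noise_sd = 1.7, 0.8
inflation = true_rate + rng.normal(0.0, noise_sd, 80)

# "Constant inflation" model: predict the sample mean everywhere
constant_model = np.full_like(inflation, inflation.mean())
resid_sd = np.std(inflation - constant_model)

# resid_sd comes out near noise_sd: any model's residuals are bounded
# below by the noise floor, so the constant model is near-optimal here
```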

Part of the reason I started my blog and tried to apply some models to empirical data myself was that I started to feel like macroeconomic theory — especially when it came to inflation — seemed unable to "add value" beyond what you could do with some simple curve fitting. I've only become more convinced of that over time. Even if the information equilibrium approach turns out to be wrong, the capacity of the resulting functional forms to capture the variation in the data with only a few parameters severely constrains the relevant complexity [4] of macroeconomic models.

...

Footnotes:

[1] See also here and here for some additional discussion, and where I made the point about the dynamic equilibrium model as a constant inflation model before.

[2] See also this on "2% inflation".

[3] You may notice the small shock in 2013. It was added based on information (i.e. a corresponding shock) in nominal output in the "quantity theory of labor" model. It is so small it is largely irrelevant to the model and the discussion.

[4] This idea of relevant complexity is related to relevant information in the information bottleneck, as well as effective information in Erik Hoel's discussion of emergence that I talk about here. By related, I mean I think it is actually the same thing, but I am just too lazy and/or dumb to show it formally. The underlying idea is that finding functions with a few parameters that describe a set of data well enough is the same process as in the information bottleneck (a few neuron states capture the relevant information of the input data) and in Hoel's emergence (where you encode the data in the most efficient way — the fewest symbols).