Wednesday, February 14, 2018

Comparing CPI forecasts to data

New CPI data is out today, and here is the latest data point as both a continuously compounded annual rate of change and a year-over-year change. The latest uptick is consistent with a general upward trend after the post-recession shock to the labor force.
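For concreteness, the two measures relate to the index like this: the continuously compounded annual rate annualizes the log change between consecutive months, while year-over-year compares to the same month a year earlier. A minimal sketch in Python (the index values below are made up for illustration, not actual BLS data):

```python
import math

# Hypothetical monthly CPI index levels (illustrative only, not actual BLS data)
cpi = {"2017-01": 243.6, "2017-12": 247.9, "2018-01": 249.2}

def continuously_compounded_annual_rate(p_now, p_prev, months=1):
    """Annualized continuously compounded rate of change between two index levels."""
    return (12.0 / months) * math.log(p_now / p_prev)

def year_over_year(p_now, p_year_ago):
    """Simple year-over-year fractional change."""
    return p_now / p_year_ago - 1.0

cc = continuously_compounded_annual_rate(cpi["2018-01"], cpi["2017-12"])
yoy = year_over_year(cpi["2018-01"], cpi["2017-01"])
print(f"cc annual rate: {100 * cc:.1f}%, year-over-year: {100 * yoy:.1f}%")
```

The continuously compounded measure is noisier since it annualizes a single month's change, which is why the two series in the graph look so different.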



Tuesday, February 13, 2018

Some historical myths about Einstein and relativity

Which "thought experiment" leads you this geometry?

One of the things I've noticed ever since I started doing some "freelance" (or "maverick" or "nutcase") economic research is how many strange accounts of how special relativity came about are out there in the world. It's a story frequently invoked by people from all walks of life from economists to philosophers to general fans of science as an example of an ideal process of science. However, the story invoked is often at odds with what actually happened or with how physicists today view the outcome.

The popular re-telling actually has many parallels with the popular but erroneous [1] re-telling of how 70s inflation "proved Friedman was right" in macroeconomics — even to the point where some practitioners themselves believe the historical myths. The popular (but false) narrative goes something like this: Michelson and Morley conclusively disproved the idea of the aether and in order to solve the resulting problems, Einstein used intuition and some thought experiments about moving clocks to derive a new theory of physics that refuted the old Newtonian world.

This should immediately raise some questions. 1) What problems with Newtonian physics would be caused by showing the aether (which doesn't exist) doesn't exist? 2) Why do physicists still use Newtonian physics? 3) Isn't Einstein famous for the equation E = mc² — which thought experiment leads to that?

The real story is more like this: Maxwell had produced an aether-based framework that was unifying the physics of light waves, electricity, and magnetism but there were some counterintuitive aspects of this framework that all had to do with moving charges and light sources involving a bunch of ad hoc mathematical modifications like length contraction, models of the aether, and an inconsistency in the interpretation of Maxwell's equations; Einstein came up with a general principle that unified all of these ad hoc modifications, made the aether models unnecessary, and resolved the asymmetry.

This answers my questions 1) through 3) above. 1) The aether was shown to be unnecessary, not erroneous. 2) Newtonian physics is a valid approximation when velocity is small compared to the speed of light. 3) E = mc² is a result of Lorentz invariance (i.e. math), not the thought experiments that help us get over the counterintuitive aspects of Lorentz invariance.
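To sketch how E = mc² falls out of the math rather than a thought experiment: Lorentz invariance implies the standard textbook energy-momentum relation (stated here without derivation):

```latex
E^2 = (pc)^2 + (mc^2)^2
\quad\Rightarrow\quad
E = mc^2 \;\text{ for a body at rest } (p = 0),
\qquad
E \approx mc^2 + \frac{p^2}{2m} \;\text{ for } p \ll mc
```

The second term of the low-momentum expansion is just the Newtonian kinetic energy, which also previews the "effective theory" point below.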

Now I am not a historian, so you should take this blog post as you would any amateur's. I did an undergraduate research project on the motivations for special relativity as part of my interdisciplinary science honors program [2], presented the result in a seminar, and I'm fairly familiar with the original papers (translated from German and available in this book). I also spent a bit of time talking with Cecile DeWitt about Einstein, but I'd only really use this to confirm the popular notion that Einstein had a pretty robust sense of humor so direct quotes should be considered with that in mind.

Let's begin!

Myth: Einstein was "bad at math"

This takes many forms, from denial that the theoretical advances Einstein made involved extremely advanced math at the time, to the claim that he was actually bad at math, leading him to his "thought experiments". This myth likely arises from a quote from a 1943 letter in response to a high school student (Barbara Wilson) who had called Einstein one of her heroes (emphasis mine):
Dear Barbara: 
I was very pleased with your kind letter. Until now I never dreamed to be something like a hero. But since you have given me the nomination, I feel that I am one. It's like a man must feel who has been elected by the people as President of the United States. 
Do not worry about your difficulties in mathematics; I can assure you that mine are still greater.
This probably was just said as encouragement, and Einstein might even have been thinking about his own crash course in differential geometry and comparing himself to mathematicians he knew like his teacher Minkowski. Einstein was something of a mathematical prodigy when he was younger and all of his work on relativity is mathematically challenging even for modern physics students. It would be hard to look at mathematics like this and say the person who was able to use it to produce an empirically successful theory of gravity was "bad at math". Also, here's the blackboard he left after his death:

You forgot to contract the indices on the Christoffel symbols.

Update 19 January 2019 (H/T Beatrice Cherrier). Apparently the math was so obscure at the time that only Einstein and his close, mostly German, colleagues really understood it, and due to English-German animosity in WWI it took some time to leak out to English physicists. The article that's from also has other material related to the rest of this post.

Myth: Einstein's thought experiments led to relativity

There are quite a few versions of this idea, but really it is more the reverse. Math led Einstein to conclusions he used thought experiments to understand (i.e. explain to himself and others) because of how counterintuitive they were. Maxwell's equations and their Lorentz invariance led Einstein to effectively promote a symmetry of electromagnetism to a symmetry of the universe. Einstein later used Minkowski's mathematical representation of a 4-dimensional spacetime as the framework for what would become general relativity.

It's somewhat ironic because Mach — who coined our modern use of "thought experiment" and from whom Einstein had learned "relativity" — believed that human intuition was accurate because it was honed by evolution. But why would evolution provide humans with the capacity to intuitively understand the bending of space and time (or the quantum fluctuations at the atomic scale)? Einstein turned that upside down, and used Mach's thought experiments to instead explain counterintuitive concepts like time dilation and length contraction. I think a lot of people conflate Einstein's and Mach's ideas of "thought experiments", which led to this myth [3]. You can read more about this here.

I once had a commenter on this blog who decided to argue against even direct quotes from Einstein saying he got the idea of space-time for general relativity from Minkowski's 4-dimensional mathematics. Although some things in physics get named for the wrong person (the Lorentz force wasn't first derived by Lorentz), it's called Minkowski space-time for a reason.

This is a powerful narrative for some reason; I suspect it is the math-phobic environment that seems unique to American discourse. It is fine as an American to freely admit you are bad at math and still think of yourself as somehow "cultured" or "intellectual" (or in fact to elevate your status). The myth that Einstein didn't need math to come up with relativity plays into that.

Myth: The aether was disproved just before (or by) relativity

As I talked about here, there were actually several different theories of the aether (e.g. aether dragging), and various negative results over 50 years from Fizeau's experiment to Michelson and Morley's were often seen as confirmation of particular versions. Experiments continued for many years after Einstein's 1905 paper [3], and despite the modern narrative that Michelson and Morley's experiment led to special relativity, it was really more about mathematical theory than experiment [4].

I'm not entirely convinced that the aether has been completely "disproved" in the popular imagination or even among physicists anyway. We frequently see general relativity and gravitational waves explained through the "rubber sheet" analogy, which might as well be called an "aether sheet". If the strong and weak nuclear forces hadn't been discovered in the meantime it is entirely possible that Kaluza and Klein's 5-dimensional theory that combined general relativity and electromagnetism would have become the dominant "standard model" and the aether could have been re-written in history as what space-time is made of [5].

What the #$@& is this substance that's oscillating here?

Myth: Special relativity "falsified" Newtonian physics

This one can be partially blamed on Karl Popper, but also on various representations and interpretations of Popper. I've frequently found descriptions of Popper's idea of falsification that say something like "Eddington's 1919 experiment falsified Newton's theory of gravity and caused it to be replaced with Einstein's". For example, here:
Popper argues, however, that [General Relativity] is scientific while psychoanalysis is not. The reason for this has to do with the testability of Einstein’s theory. As a young man, Popper was especially impressed by Arthur Eddington’s 1919 test of GR, which involved observing during a solar eclipse the degree to which the light from distant stars was shifted when passing by the sun. Importantly, the predictions of GR regarding the magnitude shift disagreed with the then-dominant theory of Newtonian mechanics. Eddington’s observation thus served as a crucial experiment for deciding between the theories, since it was impossible for both theories to give accurate predictions. Of necessity, at least one theory would be falsified by the experiment, which would provide strong reason for scientists to accept its unfalsified rival.
As best as I can tell, Popper only thought that Eddington's experiment demonstrated the falsifiability of Einstein's general relativity (e.g. here [pdf]): Eddington's experiment could have come out differently, meaning GR was falsifiable. I have never been able to find any instance of Popper himself saying Newton's theory was falsified (falsifiable, yes, but not falsified). Popper was a major fanboy for Einstein, which doesn't help — it's hard to read Popper's gushing about Einstein and not believe he thought Einstein had "falsified" Newton. Also it's important to note that general relativity isn't required for light to bend (just the equivalence principle), but the relativistic calculation predicts twice the purely "Newtonian" effect. That is to say that light bending alone doesn't "falsify" Newtonian physics, just the particular model of photon-matter gravitational scattering.
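For reference, the factor of two mentioned above comes from the standard deflection results (quoted without derivation): treating light as a Newtonian corpuscle grazing the sun at impact parameter b gives half the general relativistic value:

```latex
\delta_{\text{Newton}} = \frac{2GM}{c^2 b},
\qquad
\delta_{\text{GR}} = \frac{4GM}{c^2 b} = 2\,\delta_{\text{Newton}}
```

Eddington's measurement distinguished between these two predictions, not between "light bends" and "light doesn't bend".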

In any case, both Newtonian gravity and Newtonian mechanics are used today by physicists unless one is dealing with a velocity close to the speed of light, significant gravitational fields, or sufficient precision to warrant the corrections (such as in your GPS, which includes some corrections due to general relativity). The modern language we use is that Newtonian physics is an effective theory.
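A quick way to see why Newtonian physics works as an effective theory at everyday velocities is to compute the size of the Lorentz factor correction. A sketch in Python (the ~3.9 km/s GPS orbital speed is approximate, and this captures only the special relativistic part of the GPS clock correction, not the larger general relativistic one):

```python
import math

C = 299_792_458.0  # speed of light (m/s)

def lorentz_gamma(v):
    """Exact Lorentz factor; Newtonian physics corresponds to gamma = 1."""
    return 1.0 / math.sqrt(1.0 - (v / C) ** 2)

# GPS satellites orbit at roughly 3.9 km/s
v_gps = 3.9e3

# Fractional error from the Newtonian approximation, ~ v^2 / (2 c^2)
error = lorentz_gamma(v_gps) - 1.0
seconds_per_day = 86400.0
print(f"fractional correction: {error:.2e}")
print(f"clock offset per day: {error * seconds_per_day * 1e6:.1f} microseconds")
```

The correction is parts in 10¹¹ — invisible in everyday life, but a few microseconds per day matters when your positioning system relies on nanosecond timing.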

More myths?

I will leave this space available for more myths that I encounter in my travels.

...

Footnotes

[1] Read James Forder on this.

[2] Dean's Scholars at the University of Texas at Austin

[3] I sometimes jokingly point out that there is a privileged frame of reference that observers would agree on: the Big Bang rest frame. We only recently discovered our motion with respect to it in the 1990s. This idea also complicates some of the "thought experiments" used to explain special relativity (i.e. an absolute clock could be defined as one ticking in the rest frame of the CMB).

[4] I blame Popper for this:
Famous examples are the Michelson-Morley experiment which led to the theory of relativity
Einstein actually begins [pdf] with the "asymmetries" in Maxwell's equations, and relegates the aether experiments to an aside:
Examples [from electrodynamics], together with the unsuccessful attempts to discover any motion of the earth relatively to the “light medium,” suggest that the phenomena of electrodynamics as well as of mechanics possess no properties corresponding to the idea of absolute rest.
The paper itself is titled On the electrodynamics of moving bodies, further emphasizing that Einstein's motivation was more about understanding the "asymmetries" of Maxwell's equations and Lorentz's electrodynamics. Einstein's paper basically reformulates Lorentz's "stationary aether" electrodynamics, but does it without recourse to the aether.

Experiments like Michelson and Morley's (along with Fizeau's 50 years prior and a long list of others) were part of a drumbeat of negative results in measurements of motion with respect to the aether. In a sense, Einstein is telling us the aether (and therefore any attempt to measure our motion with respect to it) is basically moot — not that some experiment "disproved" it:
The introduction of a “luminiferous ether” will prove to be superfluous inasmuch as the view here to be developed will not require an “absolutely stationary space” provided with special properties, nor assign a velocity-vector to a point of the empty space in which electromagnetic processes take place.
[5] For example: "In the early 1800s Fresnel came up with the wave theory of light where the electromagnetic vibrations occurred in a medium called the luminiferous aether that we now refer to as space-time after Kaluza and Klein's unification of the two known forces in the universe: gravity and electromagnetism."

Monday, February 12, 2018

Economic seismograms: labor and financial markets


Steve Randy Waldman wrote a tweet asking whether stock market falls imperfectly predicted recessions or caused them, to which I responded saying the former in the "Phillips curve era" and the latter in the "asset bubble era" (both described here). But I thought I'd show a dynamic information equilibrium history chart that helps illustrate this a bit better for the US data. I first started making these graphs a few months ago, partially inspired by this 85 foot long infographic from the 1930s; I thought they provided a simpler representation of the important takeaways from the dynamic information equilibrium models (presentation here or see also my paper) that I plan on using in my next book. Be sure to click on the graphics to expand them.

The light orange bars are NBER recessions. The darker orange bars represent the "negative" shocks (in the sense that you'd consider a bad change in the measure — unemployment rate goes up or the stock market goes down), with the wider ones meaning a longer duration shock. The blue bars are "positive" shocks (unemployment rate goes down, stock market goes up). The models shown here are the S&P 500, unemployment rate, JOLTS (quits, openings, hires), and prime age Civilian Labor Force participation rate.

As you can see in the top graph, major shocks to the S&P 500 precede recessions (and unemployment shocks) in the Phillips curve era (the 1960s to roughly the 1980s) and are basically concurrent with recessions (and unemployment shocks) in the asset bubble era (late 90s to the present).

At the bottom of this post, I focused in on the latter five labor market measures. This graph illustrates the potential "leading indicators" in the JOLTS data with hires coming first, openings second, and quits third. I don't know if the order is fixed (if there is a recession coming up, openings appears to be leading a bit more than hires). The other interesting piece is that shocks (in both directions [1]) to prime age CLF participation lag shocks to unemployment. There's an intuitive "story" behind this: people become unemployed, search for a while, and then leave the labor force.


PS In order to have a single post to reference for some of my more outside-the-mainstream conjectures, I thought I'd include these measures that illustrate my contention (described here) that the "great inflation" of the 1970s was primarily a demographic phenomenon of women entering the workforce. I present two measures of inflation (CPI and PCE) as well as the civilian labor force (total size) alongside the employment population ratio for men and women.



...

Footnotes

[1] You may be asking why there's a positive shock to unemployment, but no (apparent) shock to any of the JOLTS measures. That's an excellent question. The answer probably lies in the fact that shocks to unemployment are made up of a combination of smaller shocks to the other measures as well as a shock to the matching function itself. Therefore the shock to hires and openings might be too small to see in those (much noisier) measures. One way to think about it is that the unemployment rate is a sensitive detector of changes in hires, openings, and the matching function.

Wednesday, February 7, 2018

What is the chance of seeing deviations in three JOLTS measures?

JW Mason had a post the other day wherein he said:
The probability approach in economics. Empirical economics focuses on estimating the parameters of a data-generating process supposed to underlie some observable phenomena; this is then used to make ceteris paribus (all else equal) predictions about what will happen if something changes. Critics object that these kinds of predictions are meaningless, that the goal should be unconditional forecasts instead (“economists failed to call the crisis”). Trygve Haavelmo’s writings on empirics from the 1940s suggest a third possible goal: unconditional predictions about the joint distribution of several variables within a particular domain.
To that end, I thought I'd look at the joint probabilities of the JOLTS data time series falling below the model estimates. First, let's look at some density plots of the deviation from the model (these are percentage points) for JOLTS hires (HIR), openings (JOR), and quits (QUR) for the data from 2004-2015 and then place the data from January 2017 to the most recent (Dec 2017) on top of it (points):


Can we quantify this a bit more? I looked at two measures using the full 3-dimensional distribution: the probability of finding a point that is further out from the center as well as the probability that at least one of the data series has a worse negative deviation than the given point and plotted both of those measures versus the distance from zero:



The first measure doesn't account for the correlation between the different series very well, but does give a sense of how far out these points are from the center of the distribution. The second measure gives us a better indication of not only the joint probabilities but the correlation between them — even if one of the three series is far from the center, it can be mitigated by one that is closer especially if they are correlated.

While there is a 19% chance that one of the hires, openings, or quits series could've come in worse than it did on Tuesday (based on the data from 2004-2015), that's not all that small a probability, leaving open the possibility that the data is simply on a correlated jog away from the model. This is basically capturing the fact that most of the deviation is coming from the openings data while the other two are showing smaller deviations:
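To make the two measures above concrete, here's a minimal sketch of how they can be computed from an empirical sample of correlated model deviations. The sample here is synthetic (randomly generated, not the actual JOLTS deviations), and the covariance values and "latest" point are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-in for model deviations of the three JOLTS series
# (hires, openings, quits): correlated Gaussian noise, NOT actual data
cov = 0.04 * np.array([[1.0, 0.6, 0.5],
                       [0.6, 1.0, 0.7],
                       [0.5, 0.7, 1.0]])
deviations = rng.multivariate_normal(np.zeros(3), cov, size=5000)

def p_farther_out(sample, point):
    """Measure 1: probability a historical deviation lies farther from zero."""
    return np.mean(np.linalg.norm(sample, axis=1) > np.linalg.norm(point))

def p_any_worse(sample, point):
    """Measure 2: probability at least one series shows a worse negative deviation."""
    return np.mean((sample < point).any(axis=1))

latest = np.array([-0.05, -0.30, -0.02])  # hypothetical latest deviations
print(p_farther_out(deviations, latest), p_any_worse(deviations, latest))
```

The second measure automatically accounts for correlation between the series: if two series tend to move together, a bad deviation in one is "discounted" when the others stay near the center.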


Tuesday, February 6, 2018

JOLTS data ... and that market crash?

The latest JOLTS data does seem to continue the deviation from the dynamic information equilibrium we might see during the onset of a new shock (shown here with the original forecast and updated counterfactual shock in gray; post-forecast data is in black):




I will admit that the way I decided to implement the counterfactual shock (as a Taylor expansion of the shock function that looks roughly exponential on the leading edge) might have some limitations if we proceed into the shock proper, because adding successive terms causes the longer ranges of the forecast to oscillate wildly back and forth, as can be seen here for a sine function. Using the full logistic function isn't necessarily a solution because it produces a series of under- and over-estimates (see here). Basically, forecasting a function that grows exponentially at first is hard. One other measure is the joint function of openings and unemployment making up the Beveridge curve, which is starting to show a deviation from the expected path as well (moving almost perpendicularly to it):
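The oscillation issue can be illustrated directly: truncated Taylor expansions of a logistic shock function about its center alternately over- and under-shoot, and beyond the series' radius of convergence the swings grow with each added term. A sketch (the coefficients are the standard Taylor series of the logistic function about zero):

```python
import math

def logistic(t):
    """Logistic function, a stand-in for the shock function's shape."""
    return 1.0 / (1.0 + math.exp(-t))

def taylor_partial_sum(t, n_terms):
    """Truncated Taylor series of the logistic about t = 0:
    1/2 + t/4 - t^3/48 + t^5/480 - 17 t^7/80640 + ..."""
    coeffs = [0.5, 0.25, 0.0, -1.0 / 48, 0.0, 1.0 / 480, 0.0, -17.0 / 80640]
    return sum(c * t ** k for k, c in enumerate(coeffs[:n_terms]))

# Beyond the radius of convergence (pi, set by the poles at t = ±i*pi),
# each added term makes the long-range forecast swing more wildly:
t = 4.0
for n in (2, 4, 6, 8):
    print(n, taylor_partial_sum(t, n) - logistic(t))
```

The errors alternate in sign and grow in magnitude, which is exactly the "wild oscillation" problem with extrapolating the leading edge of an exponential-looking shock.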


This brings me to the discussion around the latest market crash which included a lot of "the market is not the economy" and a pretty definitive "literally zero percent chance we are in a recession now" from Tim Duy. The only thing I would bring up is that the JOLTS data is a possible leading indicator of a recession and that data is not obviously saying "no recession" — and is in fact hinting at one (in the next year or so).

Coincidentally, I just updated the S&P 500 model I've been tracking and the latest drop puts us almost exactly back at the dynamic equilibrium (red, data and ARMA process forecast is blue, post-forecast data is black):


Which is to say that we're right where we'd expect to be — not on some negative deviation from equilibrium (just a correction to a positive deviation). I think it is just coincidental that the market fell to exactly the dynamic equilibrium model center; I wouldn't read too much into that. The fluctuations we see are well within the historical deviations from the dynamic equilibrium (red band is the 90% band).

...

Update 7 February 2018

I thought I'd add in the interest rate model forecast that's been going on for over three years as well. Note that the model prediction is for monthly data, therefore the random noise in daily data will have somewhat larger spread, but it is still a bit high (which is one of the possible precursors of recession, connected to yield curve inversion in the model, see also here or here):


Sunday, February 4, 2018

Long term exercises in hubris: forecasting the S&P 500

I've been tracking the S&P 500 forecast made with the dynamic information equilibrium model. The latest mini-boom and subsequent fall are still within the normal fluctuations of the market:


However, I wouldn't be surprised if the massive giveaway to corporations in the latest Republican tax cut did in fact constitute a "shock" (dashed line in the graph above). Also relevant: the multi-scale self-similarity of the S&P 500 in terms of dynamic equilibrium.

...

Update 5 February 2018

Ha!

Also, the close today brings us almost exactly back to the dynamic equilibrium:


Also bitcoin continues to fall (this is not a forecast, but rather a model description):

...

Update 26 February 2018

Continued update of S&P 500 and bitcoin:



Saturday, February 3, 2018

African American unemployment spike

There's almost a sense of dramatic irony that after the State of the Union speech last week where credit was taken for the stock market and African American unemployment, both reversed themselves in the most recent data. While the spike in unemployment is outside the 90% confidence bands for the dynamic information equilibrium model for black unemployment, I do think it is just a fluctuation (statistical or measurement error):


We'd expect 90% of the individual measurements to fall inside the bands, so occasionally we should see one fall outside. It's not an actual increase in human suffering, and in fact is consistent with the continued decline in unemployment seen by the model. The unemployment rate is somewhat of a lagging indicator of recessions as well, so we should expect to see a decline in one or more JOLTS measures first if this is the leading edge of a recession.

However.

We should always keep our minds open to alternative theories, and along with the spike in hate crimes since the 2016 election it is possible that employers have felt more empowered to discriminate against African Americans. JOLTS data is not broken out by race, and so a racially biased decline in hires could well be hidden in the data (e.g. it could be partially responsible for the potential decline we are currently seeing in the aggregate measures — why would JOLTS hires fall when the "conventional wisdom" is that the economy is doing "great"?). This "leading" indicator wouldn't be as good a leading indicator for a racially biased recession. In the past two recessions, the shocks to unemployment hit African Americans a couple months later (the centers are at 2002.0 vs 2001.8, and 2009.0 vs 2008.8), so a recession where black unemployment leads would be anomalous.

I don't think that is what is happening (it's just a single data point after all), but it can't be ruled out using available data. And after the experience of the past two years, I wouldn't put money on the better angels of white Americans' nature.

Friday, February 2, 2018

Unemployment and labor force participation (models vs data)

The latest employment situation data is out and the unemployment rate holds steady at 4.1%. This is still in line with the dynamic information equilibrium model (here or in my recent paper) as we begin the model's second year of accurate forecasting:


The data is also still in line with some of the latest forecasts from the Fed and FRBSF (but not their earlier ones):


Note that the unemployment rate seems to be a lagging indicator compared to JOLTS data (out next Tuesday 6 February 2018), so while there is some evidence in the JOLTS hires data of a possible turnaround it won't show up in the unemployment rate for several months.

Also out is the latest labor force participation data which doesn't help us distinguish between the two models (with and without a small positive shock in 2016) as it's consistent with both:


And finally there is the novel "Beveridge curve" connecting labor force participation and unemployment rate:


Update:

In light of this post by JW Mason, I decided to add the error bands to the "Beveridge" curve above based on the individual errors. It's not exactly looking at the probability of the joint distribution of multiple variables, but it's a step in that direction.


Thursday, February 1, 2018

When did we become gluten intolerant?

I don't know about you all, but I've been doing this since the early 2000s.

The dynamic information equilibrium approach I talk about in my recent paper doesn't just apply to economic data. The idea of comparing the information content of observing one event relative to observing another event has rather general application. As an example, I will look at search term frequency. Now if the English language were unchanging, given that there are a huge number of speakers, we'd expect relative word frequencies to remain constant and the distributions to be relatively stable. Changes to the language would show up as "non-equilibrium shocks" — a change in the relative frequency of use that may or may not reach a new equilibrium. A given word becomes more or less common and therefore has a different information content when that word is observed (a "word event").

We might be able to see some of these shocks in Google trends data — a collection of "word events" entered as search terms. It's only available since 2004, so we can really only look at language changes that happen within a few years. Longer changes (e.g. words falling into disuse) won't show up clearly, but this time series is well-suited for looking at fads.

I wanted to try this because I read an offhand comment somewhere (probably on Twitter) that said something like "everyone suddenly became gluten intolerant in 2015" [1]. What does the search data say?


The gluten transition in the US is centered near January 2009, but takes place over about 6 years (using the full width at half maximum for the shock). It "begins" in the mid-2000s and we seem to have achieved a new equilibrium over the past couple years.
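For the curious, the "full width at half maximum" here refers to the transition rate (the time derivative of the logistic shock in the dynamic equilibrium model). A sketch of how the logistic width parameter relates to the FWHM (the center and width values are illustrative, not the fitted ones):

```python
import math

def shock_rate(t, center=2009.0, width=1.7):
    """Time derivative of a logistic shock 1/(1 + exp(-(t - center)/width))."""
    x = math.exp(-(t - center) / width)
    return x / (width * (1.0 + x) ** 2)

def fwhm(width):
    """Full width at half maximum of the shock rate: 2 ln(3 + 2*sqrt(2)) * width."""
    return 2.0 * math.log(3.0 + 2.0 * math.sqrt(2.0)) * width

# A width parameter of about 1.7 years gives a transition spread over ~6 years
print(f"FWHM: {fwhm(1.7):.1f} years")
```

This is also why the "width" of a transition is somewhat arbitrary: the FWHM is about 3.5 times the logistic width parameter, and other conventions give still other numbers.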

How about avocado toast? That happened around 2015 in the US:


However, I did notice on Twitter that there were a lot more, and earlier, references to avocado toast from Australians (in fact I think it was a mention in Australian media that made me realize it wasn't just the breakfast I'd been making myself for years after a Chilean friend introduced me to it — it's been a common dish in Chile, where it's called "palta", for a long time). Was this hunch visible in the data? Yes — almost a full year earlier:


So anyway, I just wanted to show a fun application of the information equilibrium framework. It applies to a lot of situations where there is some concept of balance between different things: supply and demand, words and their language, cars and the flow of traffic, neurons and the cognitive state, or electrons and information.

...

Update 2 February 2018

The "macro wars" (Nov 2007–Mar 2011):


...

Footnotes:

[1] Update: found it.
As a casual student of American food faddism, something that is still more than alive and well today (Yes, it’s an amazing coincidence that a sizable percentage of the educated liberal upper middle class all became gluten intolerant over a 3 year period. Must be pollution or something), I always love stories about our ridiculous food history.
It's a 6-year period above, but the definition of the "width" of a transition is somewhat arbitrary (I used the full width at half maximum above).

Economic growth, path dependence, and non-equilibrium shocks

The Maddison database on economic growth just completed a new revision, and I thought it would be a great opportunity to see how information equilibrium models handle the data. However, the data is entirely in terms of "real" GDP without accompanying price level information. Now I fully understand the motivation behind it: not only are they trying to separate out purely inflationary effects but also compare data across countries. It is hard enough to compare UK pounds sterling in 1918 to pounds in 2018, much less to US dollars in 2018, in any meaningful way — at least in the traditional view of economic growth. The problem here is that the separation of price level from output (NGDP = P Y, where we can just "divide by P" and then make some adjustments across different countries) is model dependent. I wrote a post about how this can be seen to generate fluctuations based on nothing but the fact that the widths of shocks are different. These graphs show RGDP per capita as well as NGDP and the deflator, which have fundamentally the same general structure:


That these shocks are not in fact identical, cancelling in the RGDP result, makes me think that it is the separation of price level data from nominal output that is the issue here — at least during non-equilibrium shocks — not that something we define as real output is fluctuating with a low growth spell in the 1970s.

I'm not in any way saying the Maddison project and databases like it are wrong-headed or even incorrect, only that it represents a particular view of "real income" that (while shared with the vast majority of economists) is model dependent.

One of the benefits (in my view) of the information equilibrium framework is its manifest scale invariance that essentially says the basic properties of any relationship (aside from non-equilibrium shocks) are captured by information transfer indices (which are related to Lyapunov exponents [1]). This property coupled with the dynamic information equilibrium approach tells us that levels do not matter to the underlying economic processes as much as growth rates — and growth rates can be separated from levels.

This property also makes it unclear as to whether cross national comparisons can be made meaningful in terms of measures like income. There would be no requirement for any relationship you derive to be stable through non-equilibrium shocks, therefore you end up with a lot of path dependence [2]. Add in the fact that you can potentially separate the economic processes from the income levels (that in turn depend on various currencies), and there is a case to be made that cross-national comparisons (like intertemporal ones) are more sociological than economic.

Then again, this might be considered a point against using the information equilibrium framework. But as different nations are structured differently it seems to me to be as difficult to compare living standards the US (with no national healthcare system and people reduced to poverty by medical costs) with living standards the UK (which has one and no medical bankruptcies that I am aware of) as it is to compare Rothschild's level of wealth in an era without antibiotics [2] to "equivalent" wealth today.

But what can you say about economic growth then? Since I couldn't use the Maddison database, I had to get my macro data fix somewhere and started working with the UK GDP time series. Here are the series of shocks necessary for a dynamic equilibrium growth rate of about 3.5%/y [3]:



I left out the Great Recession boom/bust shock in the model curve for some reason (probably because I was more focused on the data before 1950; see here for the US version as well as a more detailed look at the last 50 years of UK data), but the shocks are: 1916.7 (WWI), 1921.1 (end WWI, Ireland independence), 1929.7 (Great Depression), 1939.8 (WWII), 1951.5 (a post-war economic boom), and 1977.2 (the demographic transition of women entering the workforce also visible in US data (or see here)).

The "story" for the UK is mostly of major geopolitical events as well as the high inflation and growth associated with the demographic shift. Geopolitical events like wars seem (to me at least) mostly unpredictable on the 20-50 year time scale. But what is interesting to me is that we could potentially understand global economic conditions in a different way based on these "shocks". One question I would like to answer is how much of the modern disparities in income per capita can be attributed to where countries are relative to the major demographic shift that occurred in the 60s and 70s in several parts of the world. For example it appears that as countries develop, there is first a decline in women's labor force participation followed by a rise:


This is just one factor, but it goes back to the question of whether it makes sense to directly compare income/output levels for countries that haven't had the same series of shocks — or whether we should compare the sequence of shocks.

...

Footnotes:

[1] Lyapunov exponents measure the separation of paths in phase space in the system. In economic systems, we might say the paths of different agents or firms. Now the IT index k ~ 1/λ which means that a low k system has much faster separation in phase space than a high k system. But since high k is associated with high growth, this means it is low growth systems that have a much faster separation in phase space. An open question here is whether this has any bearing on e.g. inequality where the phase space paths (i.e. income time series trajectories) separate much faster — associating low growth and high inequality.

I'd also like to point out that this is how one might go about this issue scientifically. If I had made a declaration somewhere on this blog that "inequality is critical to understanding economic growth" like some heterodox economists have, I would have to make a great effort to show that I was not just finding what I wanted to find.

[2] Which may be the only way to make sense of the price level.

[3] This is lower than the estimate here of 3.9%/y due to the addition of data from the early 1900s. The boom preceding the "Great Recession" and the Lawson boom also become more noticeable when you zoom in a bit to the last 50 years.