Wednesday, October 5, 2016

Keen, chaos, and equilibrium

A brief back and forth on Twitter about Roger Farmer's very concise and critical response to Steve Keen quickly degenerated into squabbling among Keen, Noah Smith, and David Andolfatto. Here is the crux of Farmer's critique:
... are economic systems [described by chaotic nonlinear dynamics]? The answer is: we have no way of knowing given current data limitations. Physicists can generate potentially infinite amounts of data by experiment. Macroeconomists have a few hundred data points at most. ... Where does that leave non-linear theory and chaos theory in economics? Is the economic world chaotic? Perhaps. But there is currently not enough data to tell a low dimensional chaotic system apart from a linear model hit by random shocks. Until we have better data, Occam’s razor argues for the linear stochastic model.
Actually, as a physicist, I would say that even if the economy were a complex nonlinear chaotic system, linear stochastic models would still be its effective theory description. Regardless of what the quantum theory of gravity turns out to be, general relativity -- and even Newton's universal law of gravitation -- remains its long-distance effective theory.

Anyway, this prompted me to write something about Steve Keen's article in Forbes. Keen suffers from a problem that all public economists seem to suffer from: asserting matters of opinion as matters of fact, and presenting ongoing research programs as well-established frameworks. This will be made clear as we progress. Let's begin, shall we?

Equilibrium, but what kind?

I propose to ban the discussion of 'equilibrium' in economics unless it is accompanied by an adjective describing it. Olivier Blanchard says that "Macroeconomics is about general equilibrium ... ". General equilibrium (which doesn't violate my new rule) is the idea that there is a price vector that eliminates excess supply and excess demand -- and that an economy in general equilibrium has neither. As Keen notes, this has been expanded from referring to excess supply and demand at one point in time to an intertemporal version that sees e.g. the market for blueberries in spring of next year as a separate market from currently available blueberries.

This idea encapsulates two hundred years of economic wisdom. It passes cursory inspection as not being entirely false. If I go to the grocery store, it is not overflowing with blueberries -- or if it is, they are on sale. I frequently like to post this graph as a joke:

It's 1 - U, where U is the unemployment rate. General equilibrium tells us there should be no excess supply of labor, i.e. that 1 - U ≈ 1. Presently 1 - U = 0.951. This physicist would say that's a jolly good first order approximation! Even at the height of the worst recessions in the US, 1 - U ~ 0.8 to 0.9. And this holds over multiple time periods. Another way of looking at unemployment equilibrium is as a dynamic equilibrium where dU/dt is constant over long periods of time punctuated by recessions.
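This dynamic equilibrium view is easy to sketch numerically. Below is a minimal toy simulation -- the decline rate, starting unemployment, and shock size are all illustrative assumptions, not fits to data -- in which the logarithmic rate of decline of U is constant between recessions and a recession appears as a sudden step:

```python
import numpy as np

# Toy "dynamic equilibrium" sketch (all parameters are made up):
# between recessions, d(log U)/dt is roughly a negative constant;
# a recession appears as a sudden upward step in log U.
alpha = 0.09                        # assumed rate of decline of log U (per year)
t = np.arange(0.0, 20.0, 1/12)      # 20 years, monthly
log_u = np.log(0.08) - alpha * t    # start at 8% unemployment
log_u[t >= 10] += np.log(2.0)       # stylized recession doubles U at year 10

u = np.exp(log_u)
# Outside the shock, the log-slope recovers the assumed constant:
slope = np.diff(log_u[t < 10]) / np.diff(t[t < 10])
print(round(slope.mean(), 3))  # ≈ -0.09
```

Fitting the slope of log U between shocks recovers the assumed constant, which is the sense in which the equilibrium is "dynamic" rather than a state of rest.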

The information equilibrium framework couches this idea of general equilibrium in terms of the information content of events drawn from the probability distributions of supply and demand -- but notes that it appears to hold only approximately in empirical data.

However, Keen wants to throw this out and assert (without evidence) "[m]acroeconomics is about complexity". I am of the opinion that asserting complexity in macroeconomics is the equivalent of Krugman's "very serious people" asserting whatever conventional wisdom they're asserting. Heads nod, and chins are stroked. Yes, complexity is important. (But what kind?) Keen says:
In the mid-20th century, other modelling disciplines developed the concept of “complex systems”, along with the mathematical and computing techniques needed to handle them. These developments led them to the realisation that these systems were normally never in equilibrium — but they were nonetheless general models of their relevant fields. ... Economics needs to embrace the reality that, even more so than the weather, the economy is a complex system, and it is never in equilibrium.
Emphasis in the original. There is no evidence that this approach actually captures economic phenomena, and in graphs later in the article Keen shows things that look nothing like an actual economy (as Noah Smith points out in the Twitter conversation linked above, saying "This oscillatory employment picture hardly matches anything in the actual economy" and "This picture also fails to match reality pretty starkly", with accompanying graphs). And even if these graphs did look even remotely like a real economy, Keen's use of 'equilibrium' in the previous quote is completely wrong.

David Andolfatto asks of Keen's use of the word equilibrium: "Reading the first page of Keen's reply to [Blanchard], it seems he's confusing 'equilibrium' with 'a state of rest?'" to which Farmer replies "I'll let [Keen] answer that. I suspect he does understand that equilibria can be non-stationary". Keen does kind of reply by referring to "dynamic equilibrium", but in his article he uses it to mean a 'state of rest', or at least a stationary point -- which he identifies with the center points of the Lorenz model. This is where I started pulling out my hair.

Global weather is a system that isn't in global thermodynamic equilibrium at maximum entropy. Full stop. That's what is meant by "never in equilibrium". The Earth is differentially heated and has two different fluids (air and water) that can move this heat around. However, it is in local thermodynamic equilibrium -- because otherwise the idea that it will be 60 °F in Seattle today would not make any sense. And to a good approximation, it is always in local thermodynamic equilibrium. If any of you have ever been outside on a nice day, you can see that the weather can be in local thermodynamic equilibrium over a fairly wide area. There are some places where the wind seems to blow constantly, but it's only because the Earth is so big and we are so small that tiny pressure imbalances can knock houses down. [And generally, the Earth's atmosphere is well described by a mechanical pressure equilibrium.] But we only say the Earth's weather is not in global thermodynamic equilibrium because we developed a concept of thermodynamic equilibrium that is relevant to the microfoundations of the weather system, i.e. physics.

Let me illustrate these points using a weather example. Here is the barometric pressure for Seattle for 2016 so far:

You can actually see the complex emergent structures (high/low pressure systems) passing through with a quasi-cyclic frequency as well as the "great moderation" that is summer in the Pacific Northwest. But we only know this because we've studied weather from thousands of weather stations all over the Earth and we understand the basic physics (thermodynamics) involved.

Now pretend that this is the only data we have [1]. One city with approximately hourly measurements over part of a year -- about 8000 data points. And we don't have a well-established theory of physics underlying it all. Can we really tell the difference between the complex nonlinear system of pressure systems and seasons (including the entire theory of physics that it entails) ... and a pressure equilibrium of about 1013 mbar subjected to stochastic shocks? No, and that is Farmer's point. My joke graph in this case looks like this:

[The three lines are the average pressure at sea level, alongside the maximum and minimum pressures ever measured.]

[Update 11 Oct 2016] I hope the point wasn't lost because I wasn't explicit about what I meant by confounding the nonlinear dynamics of pressure systems with the linear stochastic view. If you view the Earth as a nonlinear dynamical system (which you should), you can see high and low pressure systems in the pressure time series data being generated and moved by differential heating, Coriolis forces, etc. Seattle can be affected by Canadian highs and Pacific lows coming from various directions (at various speeds).

However, if you only have data from Seattle, then much of the information about the speed and direction of a particular pressure system is lost. Two "shocks" that are farther apart and of different amplitudes because a low pressure system moved slower and passed to the south will be exactly equivalent to two shocks with random timing and amplitude (e.g. an AR process). [end update]
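To make the AR analogy concrete, here is a minimal sketch of a single station's pressure as a linear stochastic process around the 1013 mbar equilibrium; the persistence and shock amplitude are illustrative assumptions, not fits to Seattle data:

```python
import numpy as np

# Hedged sketch: one station's pressure as an AR(1) process around a
# 1013 mbar equilibrium. The parameters below are made-up illustrations.
rng = np.random.default_rng(0)
n = 8000                 # roughly hourly data over part of a year
phi = 0.99               # assumed persistence of deviations
sigma = 0.5              # assumed shock amplitude (mbar)

p = np.empty(n)
p[0] = 1013.0
for i in range(1, n):
    p[i] = 1013.0 + phi * (p[i-1] - 1013.0) + sigma * rng.normal()

# The series wanders around equilibrium much like real station data;
# from this one series alone there is no way to recover the speed or
# direction of the "pressure systems" a nonlinear model would posit.
print(round(p.mean(), 1))
```

The point is precisely that a reader handed this series and a real barometer trace would struggle to say which came from a nonlinear weather model and which from random shocks.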

Whither microeconomics?

There is one piece of Keen's article that I half agree with. He quotes physicist Philip Anderson:
As [Philip Anderson] put it “Psychology is not applied biology, nor is biology applied chemistry”. The obvious implication for economists is that macroeconomics is not applied microeconomics.
I completely agree that we should not assume that macroeconomics cannot be studied as a field unto itself, separate from microeconomics, as the microfoundations purists would have us believe. But I also don't believe that this means we should assume macroeconomics is completely independent of microeconomics. In fact, I think Keen should pay more attention to microfoundations. The understanding of weather systems Keen points to as an example to be followed critically depends on understanding the underlying physics (the microfoundations) -- and given the physics, even with the paucity of data in the weather example above, we could accept the complex nonlinear system explanation.

Basically, while there isn't enough data to assert the macroeconomy is a complex nonlinear system, if Keen were to develop a convincing microfoundation for his nonlinear models that get some aspects of human behavior correct, that would go a long way towards making a convincing case. Keen wants to have it both ways: to be unhindered by the constraints of microfoundations and to have insufficient data to reject his models.


Keen: I didn't even use the word 'chaos'!

Andolfatto: Steve, you used 'complexity' ... which we took to mean nonlinear dynamics. Don't split hairs.

Keen's article also contained several paragraphs with references to the Lorenz system, the most famous chaotic system resulting from a simple set of differential equations.
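For reference, here is a minimal sketch of that system -- simple Euler integration with the classic parameter values -- showing its defining feature, sensitive dependence on initial conditions:

```python
import numpy as np

# The Lorenz system with the classic parameters, integrated with a
# crude Euler step (fine for illustration at this small step size).
sigma, rho, beta = 10.0, 28.0, 8.0 / 3.0

def lorenz_step(state, dt=0.002):
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return (x + dx * dt, y + dy * dt, z + dz * dt)

# Two trajectories starting a tiny distance apart diverge exponentially --
# the hallmark of chaos that makes long-run prediction hopeless.
a, b = (1.0, 1.0, 1.0), (1.0, 1.0, 1.0 + 1e-8)
for _ in range(20000):   # 40 time units
    a, b = lorenz_step(a), lorenz_step(b)
separation = np.sqrt(sum((u - v) ** 2 for u, v in zip(a, b)))
print(f"{separation:.3g}")
```

After 40 time units the initially microscopic gap has grown to the scale of the attractor itself, which is what "chaotic" means in practice.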

Three-equation Monte

Let me illustrate this with a macroeconomic model which in essence is even simpler than Lorenz’s, since it can be derived from three macroeconomic definitions that no macroeconomist can dispute:
  • The employment rate is the ratio of the number of people with a job to the total population;
  • The wages share of output is the total wage bill divided by GDP; and
  • The private debt ratio is the ratio of private debt to GDP
Yes, those are three definitions. Keen continues:
Differentiate those three definitions with respect to time. That generates the following three statements ... which, like the definitions themselves, have to be accepted by all macroeconomists simply because they are true by definition:
  • The employment rate will rise if economic growth exceeds the sum of population and labour productivity growth;
  • The wages share of output will rise if wages rise faster than labour productivity; and
  • The debt ratio will rise if debt grows faster than GDP
You mean differentiation and sneaking in the concept of labor productivity.
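To spell out the joke: Keen's first definition only becomes his first statement after labor productivity is introduced. A sketch of the logarithmic differentiation, writing λ for the employment rate, L for employment, N for population, Y for output, and a ≡ Y/L for labor productivity:

```latex
\lambda \equiv \frac{L}{N}
\;\Rightarrow\;
\frac{\dot{\lambda}}{\lambda} = \frac{\dot{L}}{L} - \frac{\dot{N}}{N},
\qquad
Y \equiv a L
\;\Rightarrow\;
\frac{\dot{L}}{L} = \frac{\dot{Y}}{Y} - \frac{\dot{a}}{a}

\frac{\dot{\lambda}}{\lambda}
= \frac{\dot{Y}}{Y} - \frac{\dot{a}}{a} - \frac{\dot{N}}{N} > 0
\quad\Longleftrightarrow\quad
\frac{\dot{Y}}{Y} > \frac{\dot{a}}{a} + \frac{\dot{N}}{N}
```

So the employment rate rises exactly when growth exceeds the sum of productivity and population growth -- but only because productivity was defined into the identity.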


In the end, Keen thinks macroeconomics is complex, calls general equilibrium primitive, and makes a case for a research program based on nonlinear dynamics. He presents the first two opinions as facts. As Farmer points out, there is insufficient data to say whether macroeconomic cycles (of which there have only been a dozen or so in the post-war period from which we have quality data) are the result of nonlinear dynamics or of a linear model subject to shocks -- therefore declaring the cycles to be evidence of nonlinear dynamics is an opinion.

His case for nonlinear dynamics as a framework (no doubt using his Minsky software) is not convincing (to me, but that is just my opinion). If it were just being presented as a research program, that would be fine. But it isn't:
This is not a complete model either of course ... But it is a lot closer to reality than DSGE models were before the crisis, and much more realistic than they can ever hope to be in the future, because they cling to a false modelling choice that forces equilibrium onto a manifestly non-equilibrium system.
So the road to a better, more realistic macroeconomics begins with a step away from the equilibrium and micro-foundations pillars on which it has to date been constructed. Will economists be brave enough to take that step? Only time will tell.
The thing is that this statement comes right after this graph [2]:

This is supposed to characterize this data (from the US):

Closer to reality? More realistic than DSGE models? More realistic macroeconomics? Here's a DSGE forecast of inflation for 2014 to 2019 (red):

You would have to be brave to abandon that DSGE model for Keen's graph!


Update 6 October 2016

Narayana Kocherlakota follows up with a piece that is more lenient regarding Farmer's point about the difficulty of distinguishing nonlinear models from linear ones with stochastic shocks (he makes a few other points as well, and his piece is well worth reading). I did like his description of the process by which the mainstream stays the mainstream:

  1. Mainstream consists of a class of models C.
  2. Someone proposes new class of models C’.  It is clear to most that the answers to many questions of interest will be different if C’ is true rather than C.
  3. It’s argued (usually by a theorist!) that we can’t readily distinguish C’ from C using existing data.
  4. One response to (3) is: let’s figure out, and undertake, a class of (possibly very expensive!) experiments that would allow us to distinguish C’ from C.  I don’t know much about the hard sciences but this response seems common there. 
  5. The standard approach to (3) in macro is: let’s abandon C’ and keep C.  Since (3) is a common issue, C tends to remain the mainstream.

The best argument in favor of Keen's proposed C' comes from a place of fairness because C (the DSGE paradigm) isn't really preferred by the data either. We can give the horses a try after all the King's men have had a go.

However, in all sciences it should be the power to explain the data that is the ultimate arbiter. The issue is not just that Keen's C' is indistinguishable from C given the existing data, but that his C' does not appear to explain any existing data that I've seen. If it were such a good model, you'd think there'd be some theoretical curves going through some data that are easy to find on the internet, but there aren't any. The only one I've come up with is my own graph! Happy to be corrected.

Additionally, Kocherlakota seems to suggest that there could be data we could obtain now to distinguish C from C'. However, the issue is that without established microfoundations we would need hundreds of years of data to nail down the difference between nonlinear cycles and linear models with stochastic shocks (with established microfoundations, you wouldn't need as much data).

I should emphasize that I have no problem with anyone studying any given C' -- maybe there will be a breakthrough. I actually think Keen's nonlinear equations could be used to constrain DSGE models (i.e. a specific DSGE model is a perturbative expansion of the nonlinear theory near a particular point in the "dynamic equilibrium"). It is possible that Keen's C' could establish relationships between the different coefficients in models in C that could be rejected given the data. But his attitude is one of disdain, so I doubt he'll try to write out a linear approximation to his model in DSGE form [3].
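The "perturbative expansion" idea in the previous paragraph can be sketched generically: linearize a nonlinear system at a fixed point and read off the local linear dynamics from the Jacobian. Purely for illustration, the Lorenz system stands in below for one of Keen's nonlinear models (this is my assumption, not his construction); a log-linearized DSGE model is exactly this kind of local approximation:

```python
import numpy as np

# Sketch: linearize a nonlinear system at a fixed point via a
# finite-difference Jacobian. Lorenz (classic parameters) is the stand-in.
sigma, rho, beta = 10.0, 28.0, 8.0 / 3.0

def f(s):
    x, y, z = s
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

# One of the Lorenz fixed points (the "center points" in the post's language)
q = np.sqrt(beta * (rho - 1.0))
fixed = np.array([q, q, rho - 1.0])

# Numerical Jacobian: the linear model valid near the fixed point
eps = 1e-6
J = np.column_stack([
    (f(fixed + eps * e) - f(fixed - eps * e)) / (2 * eps)
    for e in np.eye(3)
])

# The flow vanishes at the fixed point, and the Jacobian's eigenvalues
# determine the local linear dynamics around it.
print(np.allclose(f(fixed), 0.0, atol=1e-8))
print(np.sort(np.linalg.eigvals(J).real))
```

The eigenvalues are the "relationships between coefficients" a linearized version of a nonlinear model would impose on its linear approximation -- which is the kind of testable constraint Keen could, but doesn't, extract.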

But we should always consider first whether a model explains the empirical data.



[1] Actually, this is not a bad proxy! Look at US NGDP growth data:

[2] Also note the time scale in this graph: 100 years. We don't have 100 years of decent data yet.

[3] Note that I go out of my way to connect to mainstream economic theory: here, or here for DSGE specifically. That's the burden you have when bringing something from outside the mainstream. Keen doesn't seem to want to do the dirty work of showing how his models compare and contrast with DSGE theoretically. As an aside, this seems to be a common failure of outside-the-mainstream approaches -- no one wants to do the work of learning the mainstream because their approach is oh so much better and we can't dirty it by bringing it down to the level of the mainstream. Sean Carroll has a good summary of the approach you should take; my point here is his point #2.


  1. I think that there may be a problem with your first joke graph. The official US unemployment rate was redefined by the Reagan administration to make it smaller. If the chart represents the official numbers, it is apples and oranges, apples before the change, oranges afterwards.

    An epsilon of 5% is a pretty good first approximation, but it also represents a political choice to make it look good. ;)

    1. The series appears to have been updated after changes were made in order to maintain consistency, so the entire series uses broadly the same methodology. The biggest change was in 1994, when they started to use computers and ask more questions in the CPS.

      The methodology has been largely unchanged since the 1940s.

      Is there some specific reference you can send me about this change?

      In any case, that would then apply only after the 1980s, and we'd still have 25 years of "naive equilibrium" with 1 - U ~ 1.

    2. So the chart is of U3?

    3. Yes, but even U6 would work for my argument (peaking at ~17%).

  2. "Keen suffers from a problem that all public economists seem to suffer: asserting matters of opinion as matters of fact"

    Raghuram Rajan wrote something about that (I'm sure you'll disagree with some of his points, though):

    I think there is a lot of wasted effort in what Keen does, but I also think that he's right in his assertion that the stock and flow of private sector debt is a crucial macroeconomic variable and phenomenon.

    1. I like this quote:

      "Under some conditions, individual behavioral aberrations cancel one another out, making crowds more predictable than individuals. But, under other conditions, individuals influence one another in such a way that the crowd becomes a herd led by a few."

      This is the essence of the difference between information equilibrium and non-ideal information transfer.

      If someone could find something that some debt measure is in information equilibrium with, I'd happily go along with it.

      This graph:

      Looks a bit like the graphs here:

    2. What about the correlation Keen has found between private debt and unemployment? See 7:22-8:12 here:

      If you want to play with the data on private debt, Keen is using the BIS database:

      He's using the data series "United States - Private non-financial sector - All sectors - Market value - Domestic currency - Adjusted for breaks", I believe. FRED should have the data too, but the link you've found doesn't include debt of non-financial corporations.

    3. Correction to above: The data series is
      "Private non-financial sector - All sectors - Market value - Percentage of GDP - Adjusted for breaks"

  3. "Here's a DSGE forecast of inflation for 2014 to 2019 (red):"

    Yeap, that's a fairly useful inflation prediction for 2017 - a range of 0 - 3%.

    1. I didn't carry over the error band labels, but they are the 90% and 70% bands (effectively 2 sigma and 1 sigma).

      This is actually near the limit of possible predictions due to the noise in the data.

    2. "This is actually near the limit of possible predictions due to the noise in the data."

      Pretty obvious.

      But of what use is a prediction which has a 0 - 3% range?

      I would say none.

      And what's noise? Something the scientists can't explain and in effect pretend doesn't exist.

      Perhaps science should get interested in the noise.

  4. Keen's graphs are not predictions, they are theoretical model outcomes.

    1. I am not sure what this means. "Theoretical model outcomes" -- i.e. the output of theory -- should also look like the data regardless of whether it is a prediction or a fit to existing data.

      And Keen himself has said that the model predicted the 2008 financial crisis.

      This is a "theoretical model outcome" that looks like the data:

    2. "I am not sure what this means. "

      I am sure you do.

    3. No, really. I wasn't talking about predictions in the above post, but rather about explaining presently available data -- aka "retrodictions". The theory (which isn't a prediction) doesn't look like the data (which isn't from the future).

      So I have no idea what you mean by saying Keen's graphs aren't predictions.

    4. Keen isn't modeling a particular economy.

      He's formulated a model and played with the variables.

      That's all.

    5. But the theoretical model outputs don't look like any data from any country.

      And Keen seems to take these model outputs as indication of real world phenomena; here's Keen:

      "This simple model’s complex behaviour explains one thing that remains a mystery to conventional macroeconomics: the fact that the 2008 crisis was preceded by the “Great Moderation” ... It therefore generates a “Great Moderation” followed by a crisis—precisely the sequence that we experienced in the real world"

      If it's just playing with variables, why was this published with the title "Olivier Blanchard, Equilibrium, Complexity, And The Future Of Macroeconomics"?

      This is a textbook case of Pfleiderer's chameleon models (look it up). We are supposed to take the model conclusions as informing policy, but if we question the unrealistic assumptions, it's just a toy model.

    6. "But the theoretical model outputs don't look like any data from any country."

      Why should they, he is playing with models.

      His quote says the model explains the 2008 crisis, not predicts.

      Yeah, thanks I've read Pfleiderer.

      It is just a toy model, like the 1000s of DSGE models in 1000s of published papers.

    7. "This is a textbook case of Pfleiderer's chameleon models (look it up). We are supposed to take the model conclusions as informing policy, but if we question the unrealistic assumptions, it's just a toy model."

      I can't speak for Keen, but as far as I understand, what he is doing is endeavouring to formulate a model which can be used to explain the behaviour evident in the crash of 2008.

  5. In physics you often come across theories that seem very different -- e.g. Schrodinger's wave mechanics and Heisenberg's matrix mechanics, or the three theories formulated by Feynman, Schwinger and Tomonaga -- but eventually turn out to be identical.

    We may harbour the same doubt about General Equilibrium and Marshallian economics. One claims to be "general" while labelling the other as partial equilibrium. And yet so many of their results, if one ignores the fancy math, are the same. For example, both claim there is no such thing as involuntary unemployment. This leads one to suspect that they might not be so very different.

    Here is my post that argues that they are actually mathematically equivalent and that both make the same assumption that aggregate demand is constant. Warning: not very easy reading.

    1. Thanks for the link. I'd generally agree -- the Marshallian supply and demand diagrams are what you get for perturbations of supply and demand around one price in a general equilibrium price vector.

  6. Herb Gintis has some great papers on general equilibrium theory and the emergent behavior that arises in agent-based models, and based on my quick and non-expert readings, what you see emerge from complexity does not look so very different from what simple blackboard GE models predict. So whilst one says "the equilibrium price is $1", the other might say "the price will meander around in the vicinity of $1 but might occasionally deviate for no apparent reason".

    it's a cliche to say that different models are good for different jobs - this is why I side with Narayana Kocherlakota that Farmer is wrong to write off models with deterministic cycles, which might offer us useful insights. But I agree with you that a basis in micro behaviour would probably be helpful there.

    Nothing economists can do would have been as valuable as predicting and *averting* the financial crisis, and having a model which says "credit booms end in disaster", motivated by Minsky's idea that periods of stability in finance breed fragility, certainly looks like a great call. It seems ungenerous, to say the least, to argue that Keen's model might not actually be any good, even though he did appear to call the crisis for the right reason (although I am really not sure his prediction and reasoning quite got at how a credit bubble bursting nearly brought down the global banking system in the way that it did), but ... I think despite all that, his model might not be much better than all the stuff he criticizes in such strong terms.

    1. also I want to quickly say that I do not regard "no involuntary unemployment" as a defining characteristic of GE - there are GE models with equilibrium unemployment.

    2. Luis,

      I agree that I simplified GE in the discussion above and e.g. search/matching models can lead to non-zero rates of unemployment.

      And yes, I agree that Minsky's idea seems to be an insight into human behavior. My focus is on the mathematical models implementing such insights. A complex nonlinear model and a linear model subject to a stochastic AR process can both produce output that reflects Minsky's insight. You are not constrained to nonlinear models or stochastic linear models alone. The latter just happens to be simpler.

      Thank you for the link to Gintis's papers -- I actually have come to some of the same conclusions using random agents.

    3. "... there are GE models with equilibrium unemployment."

      These are models based on frictions - different than involuntary unemployment.

    4. Anonymous,

      I do not understand your definition of involuntary unemployment if someone that is unemployed because of a "friction" isn't involuntarily unemployed.

    5. It's a matter of strict definition.

    6. It's not a matter of what is happening to the unemployed person while they are looking for a job - frictional unemployment.

      It is a matter to do with how the unemployed person came to lose his job. A neoclassical economist would say that the only way a person loses his job is if he is not willing to accept the market rate of pay - this is voluntary unemployment.

      A Keynesian would say (because he doesn't assume away involuntary unemployment) that a person loses his job not because his desired wage is over the market rate but because a business folds or retrenches because its prospects of selling all its output have disappeared.

    7. This is not the definition of involuntary unemployment.

    8. So what is your version?

    9. "Involuntary unemployment occurs when a person is willing to work at the prevailing wage yet is unemployed."

      From your link.

      This is consistent with my definition (which emphasizes the reduction in demand).

      The rest of explanations in the Wiki deal with variations on the neoclassical definition of voluntary unemployment.

    10. No it's not.

      You said: It is a matter to do with how the unemployed person came to lose his job.

      The definition of involuntary unemployment has nothing to do with how the person became unemployed in the first place, only with whether that person can find another job.

    11. If a person is willing to work at the current wage then how is it that he does not have a job?

    12. And there you have the neoclassical rhetorical question!

      It doesn't happen in neoclassical economy. John Keynes wrote a book explaining how the neoclassical model can fail. Quite a lot of pages. Called it The General something or other.

    13. It's not a rhetorical question - well, mostly not.

      It is a valid question which you won't address.

    14. I think in search and matching models I've seen jobs destroyed by random shocks. And you are unemployed, involuntarily, whilst searching for a new one.

    15. what I haven't seen - and it is a few years since I read lit - is models where hiring and firing by firms is a function of aggregate market conditions.

      just by the by, I really don't understand this heterodox objection to getting important things into models via 'frictions'. If you are going to have a model in which people can trade things and set prices and take decisions, and you think this world does not operate like a very simple Arrow-Debreu model, then your account of the world will include whatever it is that you think explains that. Frictions is just a word for those things.

    16. Who's objecting to frictions as a factor in delaying employment?

    17. oh sorry, I'm making the mistake of commenting on something I have observed at various points in my life, not on this thread.

      keep an eye out though, you will see disparaging comments from het econ about frictions

    18. "keep an eye out though"

      I'll be looking at myself in the mirror, then.

    19. "who objects to that?"

      "I object to that"


    20. There are theories of frictional unemployment and there are theories of frictional unemployment.

  7. The problem with your physics analogies is that atoms don't have learning faculties and rarely watch the six o'clock news.

    Macro models without individual actors fail the Lucas critique. Searching for the 'deep parameters' is like looking for the end of the rainbow. It doesn't exist.

    Policies affect individual entities and their reaction via the learning process changes the aggregation outcome. So you really don't know how things will change *other than to try it out in a simulated analogue* where you can get a feel for how thing will change *if you include actual humans in that analogue*.

    So you don't just run agent based models, you actually invite people in to play them. Multi-player games involving hundreds of people allow you to see how real people respond to changes in policy.

    You get much more accuracy if your non-player characters watch actual humans playing the game and train their neural nets based upon those responses.

    1. Random,

      The problem with your argument against physics analogies is that there is no particular evidence of these news shocks having a sustained macroeconomic impact. As a first approximation, you should always assume that a given theoretical idea (e.g. humans watch the news and react to it) is neither confirmed nor rejected by the macroeconomic data. This is a good starting point.

      It is true that transient impacts due to news shocks are observed -- for example, many of Draghi's announcements move exchange rates. But that movement dissolves into the noise of the data within days.

      The onus is on you to present evidence that complex human behavior has an impact on macroeconomic outcomes. Behavioral economists have been at it for 30 years and haven't come up with anything definitively impacting macroeconomics (micro yes, but not macro).

      I have no objection to you simulating an economy using a massive online game. However there are general theoretical arguments that the complexity of humans should not matter -- the gist of which is that the existence of a small set of macroeconomic aggregates (interest rates, NGDP, private debt, various measures of the money supply, inflation, unemployment, etc) implies that the millions of dimensions in the agent problem (millions of agents with thousands of parameters describing them) are mapped to a low-dimensional subset in the macro problem (NGDP, interest rates, etc, and their coefficients in the model).

      It's a bit like insisting on a quark-level simulation of an atomic nucleus in order to understand an ideal gas for no reason other than the fact that you happen to be a quark and think your own actions are relevant.

      There is really only one way out of this: that there are millions of macroeconomic aggregates out there that are meaningful that we just don't measure. It is possible! I just have no idea what they are ... and you haven't presented any.

      But I would love to see some data from a massive online game -- if only to show that the output could be explained with information equilibrium :)
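      To make the dimensional-reduction point concrete, here is a toy sketch (mine, purely illustrative -- not any model discussed in this thread): a "micro" economy of heterogeneous agents whose aggregate is pinned down to high accuracy by a couple of numbers, because relative fluctuations of a sum over N agents fall like 1/√N.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "micro" economy: each draw is an idiosyncratic agent-level quantity.
# The macro aggregate is just the sum over N agents.
def aggregate(n_agents):
    return rng.lognormal(mean=0.0, sigma=1.0, size=n_agents).sum()

# Re-draw the whole economy 50 times and look at the aggregate's spread.
n = 100_000
samples = np.array([aggregate(n) for _ in range(50)])
rel_fluctuation = samples.std() / samples.mean()

# Relative fluctuations scale like 1/sqrt(N): the high-dimensional agent
# description collapses to a few well-determined aggregate numbers.
print(f"relative fluctuation of the aggregate: {rel_fluctuation:.4f}")
```

      The point of the sketch is only that a macro aggregate over many agents is far lower-dimensional than the agent description; whether the real economy's aggregates behave this way is exactly what is in dispute above.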

    2. "t's a bit like insisting on a quark-level simulation of an atomic nucleus in order to understand an ideal gas for no reason other than the fact that you happen to be a quark and think your own actions are relevant."

      There are times when human beings, en masse, think as one - the madness of crowds - atomic particles (I presume) don't do this.

    3. They do:

      But what you are saying implies an even larger dimensional reduction, making my point.

    4. So you are saying that atomic particles do exhibit macro states (I have no idea about the physics) - effectively exhibiting mass behaviour?

      How does this then fit in with your model?

    5. Does entropy operate/predominate at such small temperatures or do other forces come into play?

    6. Are you not aware of how my model works?

    7. The concept of temperature cannot be separated from the concept of entropy.

    8. I'm not a physicist so can you answer the question:

      Does entropy operate/predominate at such small temperatures or do other forces come into play?

    9. Entropy and temperature are two sides of the same coin, so your question doesn't really make sense. If there is a temperature and thermodynamic equilibrium, then there is an entropy and it is maximized. These statements are equivalent.

    10. I think I can understand that point.

      But how is it that particles exhibit behaviour which, at first blush, does not look to me like behaviour governed by entropy? Are there other forces at work which at very low temperatures overwhelm entropy?

    11. Actually it is the opposite: other forces can prevent the maximum entropy solution from being realized, but at low temperatures, those forces become less important than the state space (entropy).

    12. So how can entropy at one temperature engender maximum dispersion and at a much lower temperature engender aggregation?

    13. At lower temperature, fewer states are accessible (insufficient energy).

      Think of water. At low temperature it clusters into solid ice. At higher temperature it is a gas and fills its container.
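      To put a formula behind "fewer states are accessible": in thermal equilibrium, the probability of finding a system in a state of energy E is given by the Boltzmann factor (standard statistical mechanics, stated here for completeness):

```latex
% Probability of occupying a state with energy E at temperature T:
p(E) \propto e^{-E / k_B T}
```

      As T → 0 the exponential suppresses everything but the lowest-energy (ordered) states -- ice. As T grows, many states become comparably likely and sheer state counting (entropy) takes over -- gas.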

    14. OK, got it.

      So what has this to do with your model?

    15. It's an example where a billion-plus dimensional agent problem can be described by a much smaller dimensional macro problem.

      Pretty sure that's what this bit of the thread is about. But I may be mistaken.

    16. This doesn't sound very convincing.

      It seems to me that you don't want to admit that human behaviour cannot always be described by atomistic behaviour.

    17. Always admitted that. Forms the basis of the model. The fundamental tenet -- you can't receive more information at B than has been sent from A:

      I(A) ≥ I(B)

      If I(A) = I(B), you have information equilibrium which can be simulated using dumb particles and the right state space.

      If I(A) > I(B), then there is information loss due to e.g. correlations of the agents (not maximum entropy).

      The model then looks at the real world and empirically observes I(A) ≈ I(B). So I(A) > I(B) doesn't seem to matter very much. But when it does, the model is bounded by the differential equation

      dA/dB ≤ k (A/B)

      which can be used to estimate stochastic differential equations.

      So I do admit it! I embrace it with my heart full of love and wonder!

      ... but this has nothing to do with the dimensional reduction that we were talking about at the beginning of this thread. That is a more general argument that if you think e.g. the unemployment rate or NGDP is an interesting measure of something, then either the details of the agents don't really matter too much, or you have a representative agent. I go with the former, because that is consistent with the information equilibrium approach.
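      For completeness, the ideal case of that bound -- the equality dA/dB = k (A/B) -- integrates in closed form; this is just a separable ODE, nothing specific to economics:

```latex
% Separate variables and integrate from a reference point (A_0, B_0):
\frac{dA}{A} = k \, \frac{dB}{B}
\;\;\Rightarrow\;\;
\ln\frac{A}{A_0} = k \ln\frac{B}{B_0}
\;\;\Rightarrow\;\;
A = A_0 \left(\frac{B}{B_0}\right)^{k}
```

      In the non-ideal case I(A) > I(B), the same integration turns the inequality into the bound A ≤ A₀ (B/B₀)^k (for B ≥ B₀).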

    18. "The model then looks at the real world and empirically observes I(A) ≈ I(B). "

      Some of the time - not all of the time.

      And the times that it doesn't, how does your model deal with them (you know, the real world)?

      "So I(A) > I(B) doesn't seem to matter very much."

      Yes, who cares about millions of people who want to work but can't - I guess they're just like the next atom, buzzing about blissfully.

    19. For someone who doesn't "understand all aspects of" my model, and doesn't "aim to become an expert in" my model, you certainly say a lot about it.

      "And the times that it doesn't, how does your model deal with them (you know, the real world)?"

      I answered this in my comment above which you apparently did not read all the way through:

      But when it does, the model is bounded by the differential equation ... dA/dB ≤ k (A/B) ... which can be used to estimate stochastic differential equations.

    20. "And the times that it doesn't, how does your model deal with them (you know, the real world)?"

      I wasn't referring to whether your model had a solution or not. I was alluding to policy responses to ameliorate the ensuing unemployment, which you don't seem to pay any attention to because "...I(A) > I(B) doesn't seem to matter very much".

    21. And like the neoclassical economists, you don't want to deal with it.

    22. No, you did not say that. Here is what you said:

      "The model then looks at the real world and empirically observes I(A) ≈ I(B). " [quoting me]

      Some of the time - not all of the time.

      And the times that it doesn't, how does your model deal with them (you know, the real world)?

      In your comment, "them" obviously refers to the times when we don't have I(A) ≈ I(B). (Which I have now responded to, twice).

      Now you are saying "them" actually refers to policy responses to negative outcomes. You didn't actually say this; you're just changing the subject. But I'll respond anyway.

      I don't ignore policy responses because I(A) > I(B) -- that makes no sense. What do you think I(A) > I(B) means? I have literally said that I(A) > I(B) in the context of recessions means that you need coordinated government policy to counteract the fall in entropy.

      In general, I've defended the use of fiscal stimulus (when k ~ 1, i.e. inflation is low and the ISLM model is a good approximation), argued that monetary policy is useless when k ~ 1, and consider a UBI to be the best policy to maximize exploration of the state space.

      You obviously don't know my views. You just make things up about what I or my models say. This isn't constructive. You should really sit down with yourself and figure out exactly what you want.

      1. Do you want to fight against right-wing, "neoliberal", or "classical economics" views that bring things like austerity and take away government programs for the poor?

      2. Or do you want to discuss mathematical economic theory?

      By attacking a fellow traveler with regards to leftist politics, you apparently don't want #1. And by refusing to learn the mathematics of the model, you apparently don't want #2.

      What, sir, do you really want?

    23. Feel free to respond with "I wanna rock!" or "I'll tell you what I want, what I really really want."

    24. What I really really want is for you to explain how you run your models.

      I remember seeing one last year, for inflation (I think, not sure). The graphic was dynamic and showed the model prediction against the data - the graphic showed the curve of the model prediction changing with each new data point. At least, that's what I thought was happening. Consequently, the model prediction followed the data quite well. However, this seems to me to be a kind of curve fitting - recalibrating the model on the run. Is this what happens or am I entirely off beam?

    25. I believe you are speaking of this model of the interest rate, and while it does adjust to data as it comes in, the point is that it predicts behavior for the next ten years before it happens -- that data isn't fit (and it doesn't truly *need* to be adjusted either). All of the model details down to the actual code used to run it are in the paper.

      However, let's address your pejorative use of "curve fitting" -- as in "this seems to me to be a kind of curve fitting" -- as if models could be anything other than curve fitting. I've seen this critique in various places around the internet. It seems to be distantly related to a valid critique, simplified to the point where it no longer makes sense.

      All science everywhere is curve fitting. Orbits of planets. High energy particle physics.

      There is no part of quantitative science where parameters weren't fit to the data.

      Now I think the curve fitting critique evolved from a genuine critique called overfitting. This is where your model has too many parameters given the number of data points. A good rule of thumb is that you should have 20 data points per parameter.

      In the interest rate example, the model has 2 parameters. It is fit to 10 years of monthly interest rate data (120 data points, which it describes fairly well) and extrapolated out another 10 years. That has an extremely good Akaike information criterion (there exist technical measures for your gut feelings about what things "seem"). Since "underfitting" is good in science (explaining a lot of data variance with few parameters), we actually just call this a good model instead.

      I'd recommend removing "curve fitting" from your vocabulary and replacing it with "overfitting".

      The models I've put together with only 5 parameters have explanatory power on par with DSGE models with 40+ parameters. So while overfitting is a decent critique of mainstream economic theories, you can't just regurgitate it against the information transfer model -- it doesn't apply.
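      For the curious, here is a minimal sketch of the overfitting point on synthetic data (not the actual interest rate model): compute the least-squares AIC for a 2-parameter fit versus a 12-parameter fit on 120 points.

```python
import numpy as np

rng = np.random.default_rng(42)

# 120 synthetic "monthly" observations from a genuinely 2-parameter process.
n = 120
x = np.linspace(0.0, 1.0, n)
y = 1.5 + 0.3 * x + rng.normal(0.0, 0.2, size=n)

def aic_for_polyfit(degree):
    """Least-squares AIC: n*ln(RSS/n) + 2k, with k = degree + 1 parameters."""
    coeffs = np.polyfit(x, y, degree)
    rss = float(np.sum((np.polyval(coeffs, x) - y) ** 2))
    return n * np.log(rss / n) + 2 * (degree + 1)

aic_2_param = aic_for_polyfit(1)    # 2 parameters: 60 data points each
aic_12_param = aic_for_polyfit(11)  # 12 parameters: only 10 data points each

# The 12-parameter fit always has a smaller residual, but it pays a
# complexity penalty of 2k; AIC (lower is better) typically prefers
# the 2-parameter model here.
print(aic_2_param, aic_12_param)
```

      The rule of thumb of ~20 data points per parameter shows up here as the penalty term 2k overwhelming the small improvement in residual sum of squares.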

    26. Yes that's the model and thanks for the explanation.

      However, I would prefer the term progressive curve fitting. I have no problem with curve fitting per se. The thing is that you tout this as a predictive model. If you were to say here is my ten-year prediction based on data to date, and hold the model there and see what happens in the data over the prediction period, I would say fine. But your model updates its prediction as each new data point appears, and over a period of time it appears that your model is a close fit to the data. Whereas if you look at the three static graphics, the prediction is way off compared to the run of subsequent data.

      So what is the value of your model?

    27. No idea what "progressive curve fitting" is; is it something you've made up?

      Not sure what your standard for "way off" is either. For a 2 parameter model, the predictions are amazing for economics.

      Because economic variables are stochastic processes, it is actually impossible to have a model that never updates a forecast as new data arrives. What you want to aim for is the fewest updates, and my version above updated every 10 years. That's an improvement over the best econ models, which update every couple of quarters.

      Additionally, the parameter values converge (the changes get smaller and smaller with each update), and the prediction error is on the order of the irreducible error in the measurement.

      In fact, those facts mean my model is the best possible model of interest rates. You can't do better than irreducible measurement error. A line in a two-dimensional plane (the data is a flat graph) requires at least 2 parameters. And those parameters converge to a single value.

      I apparently will never satisfy your desires for my model because you make up new kinds of math and add new kinds of requirements. You've now reached a point where no model that meets your requirements can possibly exist. A psychic telling you the values of the interest rate for the next thousand years is the only "model" that can make you happy.

      If you think my model is useless, the problem is with you, because you can't accept that any kind of model is useful. I have no interest in meeting your impossible requirements, but I would like you to understand that they are unreasonable. Either you assume that econ isn't understandable by humans, or you don't understand the requirements you've set for models.

      And since you obviously don't understand the requirements you've invented, I wonder how you came up with them in the first place?

    28. "No idea what "progressive curve fitting" is; is it something you've made up?"

      It's what your model does.

      "..the predictions are amazing for economics."

      Yes, it's very good at predicting the past - even then only once you have enough data points.

      "If you think my model is useless, the problem is in you because you can't accept any kind of model is useful."

      A model is only of use if it has predictive value. Yours doesn't. Most models don't. They are continually defeated by reality, by radical uncertainty. Models really only have a value as pedagogical machines. They assist in the understanding of the way an economic process might work and then only within the confines of their ceteris paribus conditions.

      From what I've seen of your model and others, you would be better off with a psychic. If your model is so good why don't you back it with serious money.

      As a kid I used to make model aeroplanes. One I didn't get to make was the Lancaster bomber. The prototype weighed over 16.5 tonnes and could carry a 10 tonne payload. I wonder how effectively a 200 gm plastic model would have performed in a bombing raid over Germany? (Ooops, don't mention the war.)

    29. "No idea what "progressive curve fitting" is; is it something you've made up?"

      Watching you play dumb is not very edifying. :-)

    30. Yes, my model does have predictive value. Apparently you can't be bothered to even check the predictions link right there on the front page.

      RE: progressive curve fitting

      It's not on wikipedia.

      And note that wikipedia does have a lot of math I don't know. For example, I didn't know what the Chapman-Robbins bound was. So if progressive curve fitting were a thing that I didn't know about, it would very likely appear there. But it doesn't. Because you made it up.

      I've actually used many different methods to make forecasts from the information transfer framework. Progressive curve fitting is not one. Curve fitting, yes. But that is what all of science is as I mentioned above.

    31. "It's not on wikipedia."

      This is your argument?! LOL!

  8. "It fits in just fine."


    1. Are you not aware of how my model works?

    2. Why don't you explain how it fits in.

    3. Wait -- so you haven't read my blog?

    4. Why don't you want to explain how it fits in?

    5. Because from the back and forth I've surmised this is the same anonymous commenter I have explained non-ideal information transfer to before, and it wasn't of any use. It's just sealioning.

      So it's a better use of my time to make inside jokes :)

    6. I don't understand all aspects of your model, and I don't recall seeing an explanation of how particles exhibiting macro states fits in. I don't aim to become an expert in your model.

      If you don't want to explain how it fits in, then I can only presume it would expose weaknesses in your model.

    7. Maybe you should give me the benefit of the doubt.

    8. Because I'm a stand-up guy.

      Fun at parties.

      Actually, that's not true. I tend to sit in the corner and sulk.


Comments are welcome. Please see the Moderation and comment policy.

Also, try to avoid the use of dollar signs as they interfere with my setup of mathjax. I left it set up that way because I think this is funny for an economics blog. You can use € or £ instead.
