Friday, October 30, 2015

Core PCE inflation update

The new core PCE inflation data is out today and I currently have two head-to-head comparisons of the IT model with the FOMC forecasts and the forecast of the NY Fed DSGE model. Here are the latest results:

The IT model continues to do as well as a smoothed version of the data and the constant inflation model (right graph, blue and green lines). The difference between the NY Fed DSGE model and the IT model is still not significant (the blue and red distributions aren't sufficiently separated to say the models are distinct). Note that the IT model here has only two (independent) parameters, while the DSGE model has something close to 29. It is true that the DSGE model describes more macro variables, but that also means that the inflation rate depends on more than the monetary base (minus reserves).

However, the FOMC was pretty much wrong -- and they control monetary policy! I'll wait until the full year's data comes in to make a call, but 3% inflation for Q4 seems unlikely to me.

Update 11/3/2015:

The RGDP growth result is not quite as bad for the FOMC, but the IT model is still better:

Moral intuition versus logical intuition

Stumbling and Mumbling has a good counterpoint to my post about Scott Sumner's claim that economics is counterintuitive. I updated that post, but I thought I'd put the update out as a post on its own. 

As I said about Sumner's claim: "I agree -- there are lots of counterintuitive frames for results in economics." Framing (as Stumbling and Mumbling shows) inflation and unemployment as both bad can lead to the conclusion -- using the moral 'bad begets bad' heuristic -- that you get both together as opposed to the Phillips curve view (which may not be true at all times, but for sake of argument here let's say it is) that there is a trade-off. However, if framed in another way (would you rather have one, the other, both or neither) my intuition is that people would use a more 'logical' zero-sum heuristic. If you want low inflation, you have to tolerate higher unemployment.

Which brings us to the paradox of thrift. Is this result counterintuitive? Yes, if you use the 'good begets good' heuristic. If we all 'tighten our belts', we'll get more national income. But it's not counterintuitive if you use the logical frame where my spending is your income. Saying that we should reduce everyone's income in order to have more national income is illogical on its face.

It is our moral heuristics that make us think economic results are counterintuitive ... using moral logic. This applies to the examples Sumner gives (price gouging, ATM fees). I immediately applied my logical frame when writing this post rather than my moral frame. I've always associated the word counterintuitive with a logical frame. The utilitarian solution to the trolley car problem with the fat man isn't counterintuitive so much as a dilemma.

It's not that economic results are counterintuitive, it's that they are immoral. They violate our moral heuristics like 'good begets good'. Since morality doesn't have a widely accepted purely logical framework [1], all that exists is our set of moral heuristics. Therefore violating those heuristics is violating morality. Price gouging as a result of the basic economics of supply and demand is a moral dilemma, not a counterintuitive result.


[1] Utilitarianism is a somewhat logical framework, but is not the only theory of morality. It in fact leads to some of the same morally problematic results as economics, for what are basically the same reasons (assigning a real number to human moral preferences).

Thursday, October 29, 2015

Knowing less than we think: NGDP edition (corrected)


GDPnow (and the Blue Chip consensus) is a forecast of real GDP, not nominal GDP, as pointed out by eli below. Whoops. I should have noticed the "real" in the label on the graph. The result is basically in line with both projections and the ITM forecast based on PCE inflation. But again, that's more a statement of ignorance on the part of the ITM.

Original (incorrect) post:

So the advance estimate of 2015 Q3 NGDP is in (2.7% NGDP growth) and it appears GDPnow and the Blue Chip consensus were a bit low. The Blue Chip consensus at least had some 'error' bands (a range). The ITM forecast was right on (within the 50% confidence interval), but its confidence intervals are so wide that that's more of a statement of ignorance:

Wednesday, October 28, 2015

Sterling-Euro exchange rates

Simon Wren-Lewis calculated the equilibrium exchange rate between the Euro and Pound back in 2003:
When I calculated an equilibrium sterling euro rate in 2003, my estimate was 1.365 E/£.
which corresponds to 0.73 £/€. Which is completely consistent with the information transfer model result:

We both expect a return to a slightly lower value (for €/£, a higher value for £/€). In the ITM it's part of the long run trend of the relative size of the two economies. Wren-Lewis's reasoning is a bit more complicated.

Do counterintuitive economic results make sense?

coun·ter·in·tu·i·tive adj. Contrary to what intuition or common sense would indicate
It is counterintuitive to me that markets are useful for solving counterintuitive problems. Counterintuitive? Here's Scott Sumner:
We know that economics is really, really counterintuitive.  It doesn’t seem logical that imports would be good for the economy, or that price gouging in a natural disaster would be good for consumers, or that regulations banning banks from charging fees on ATMS would be bad for bank consumers.  But they are.
I agree -- there are lots of counterintuitive frames for results in economics. The whole Freakonomics-industrial complex is built around counter-intuition. But if you think about it, the view of markets as aggregation mechanisms for rational information should not produce counterintuitive rational results. Adding up everyone's intuition -- when their intuition is in some way aligned (hence common sense) -- and getting a counterintuitive result is itself counterintuitive.

Ah, but remember the fallacy of composition ...

Yes, let's remember it. Let's say the information the market is 'aggregating' consists of each person's probabilities that A, B or C will happen: P(A), P(B) and P(C) [1]. Since these are real numbers (and you are rational), you can say P(A) < P(B) and P(B) < P(C), for example, and therefore P(A) < P(C).

However, if everyone's information can be represented as a real number, and if they're allowed to have different information (different orderings), the aggregated result may not be representable as a real number (it's not guaranteed to be transitive). This is basically Arrow's impossibility theorem. Therefore the market can think P(A) < P(B), P(B) < P(C) and P(C) < P(A).
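You can see the intransitive aggregate in a minimal three-voter sketch (a Condorcet cycle; the voters and orderings here are illustrative, not from any particular market):

```python
# Three rational "voters", each with a transitive ordering over A, B, C
# (read left to right as most probable to least probable).
voters = [
    ("A", "B", "C"),
    ("B", "C", "A"),
    ("C", "A", "B"),
]

def majority_prefers(x, y):
    """True if a majority of voters rank x above y."""
    wins = sum(v.index(x) < v.index(y) for v in voters)
    return wins > len(voters) / 2

# Each individual ordering is transitive, but the majority aggregate
# is a cycle: A beats B, B beats C, and yet C beats A.
print(majority_prefers("A", "B"))  # True
print(majority_prefers("B", "C"))  # True
print(majority_prefers("C", "A"))  # True
```

The aggregate "ordering" can't be assigned a set of real-number probabilities, which is the point above.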

This is a counterintuitive result, but it's also not rational. That is to say the market mechanism can produce irrational counterintuitive results.

There are a couple ways out of this:

  • Markets can produce the same rational result as a subset of the people (a "dictator"), in which case the market will reward the dictator for being right and punish others for being wrong. In equilibrium, this will mean most of the people will have the same view as the dictator, and in that case the market outcome should be intuitive.
  • Markets can produce an emergent rational result (intuitive or counterintuitive) from irrational (I like to say 'complicated') humans. Everybody's intuition would be irrational. However, this also means markets are not information aggregation mechanisms.

So basically, markets either can produce irrational counterintuitive results or they do not really aggregate information [2]. They can easily produce rational intuitive results, but not rational counterintuitive results.

Banks charge for money and gougers charge extra for gas and water in disaster areas because the market solves an allocation problem, not an information problem. There is however no way to tell whether this solution is optimal according to any objective function because finding the optimum of the economic linear programming problem is intractable -- it's too big. Since I'm always on the lookout for an opportunity to link to the greatest blog post of all time, here is Cosma Shalizi:
Planning for the whole economy [i.e. solving the linear programming problem] would, under the most favorable possible assumptions, be intractable for the foreseeable future, and deciding on a plan runs into difficulties we have no idea how to solve. ... That planning is not a viable alternative to capitalism (as opposed to a tool within it) should disturb even capitalism’s most ardent partisans. It means that their system faces no competition, nor even any plausible threat of competition.
Basically, there is no way to check whether lots of imports, price gouging or ATM fees are good outcomes or not because you can't solve the linear programming problem with an objective function that defines good to check and see if they are in fact good.

That means in addition to counterintuitive results either being irrational or not aggregating information, we can't check to see if the counterintuitive results are actually "good". We can't say markets produce good, rational and counterintuitive results that aggregate information. You have to leave "good" off generally, and you get your choice of either "rational", "counterintuitive" or "aggregate information".

In general, the information transfer model says to drop the last one. Markets solve an allocation problem with a maximum entropy solution, not an information aggregation problem. Does it optimize utility (i.e. is it good)? Who knows. It just is. People aren't rational, but the market creates rational rankings of goods (prices). Are these prices good? Again, who knows. They just are.

Update 10/30/2015

Stumbling and Mumbling has a good counterpoint to my post above. But as I said: "I agree -- there are lots of counterintuitive frames for results in economics." Framing inflation and unemployment as both bad can lead to the conclusion -- using the moral 'bad begets bad' heuristic -- that you get both together as opposed to the Phillips curve view (which may not be true at all times, but for sake of argument here) that there is a trade-off. However, if framed in another way (would you rather have one, the other, both or neither) my intuition is that people would use a more 'logical' zero-sum heuristic. If you want low inflation, you have to tolerate higher unemployment.

Which brings us to the paradox of thrift. Is this result counterintuitive? Yes, if you use the 'good begets good' heuristic. If we all 'tighten our belts', we'll get more national income. But it's not counterintuitive if you use the logical frame where my spending is your income. Saying that we should reduce everyone's income in order to have more national income is illogical on its face.

It is our moral heuristics that make us think economic results are counterintuitive ... using moral logic. This applies to the examples Sumner gives above and I immediately applied my logical frame when writing this post rather than my moral frame. I've always associated the word counterintuitive with a logical frame. The utilitarian solution to the trolley car problem with the fat man isn't counterintuitive so much as a dilemma.

It's not that economic results are counterintuitive, it's that they are immoral. They violate our moral heuristics like 'good begets good'. Price gouging as a result of the basic economics of supply and demand is a moral dilemma, not a counterintuitive result.



[1] Imagine a prediction market where probabilities are translated into prices for options.

[2] We are breaking the assumptions of the Arrow theorem: individuals are rational and "no dictator".

A uniform distribution arises ...

I forgot to add the animations for the previous post; here are the two sets of rolling dice approaching a uniform (empirical) distribution as the number of rolls goes to infinity:

Tuesday, October 27, 2015

Info EQ 101, addendum

Update: animations!

In comments with Ken Duda on the Info EQ 101 post, I realized that the way I've been presenting the example allows confusion between σ and n to occur and also loses out on some important details. So let's set up an example where you roll a six-sided die (σ = 6) five hundred times (n = 500) each for "supply" (ns) and "demand" (nd). This represents 500 widgets being supplied by 6 firms that are going to be allocated among 6 different firms (demand).

The information revealed by each roll is log₂ 6. The information revealed by n rolls is n log₂ 6.

After the first 10 rolls (10 widgets), you have two very different empirical distributions:

In these graphs, you add a box to the column labelled 4 when you roll a 4. After 500 rolls, you have approximately equal, approximately uniform distributions:

This is why we must assume nd, ns >> 1 (many rolls of the die). Only then do the empirical distributions approximate the theoretical (uniform) distribution (and therefore each other). We can imagine these distributions as the distribution of widgets supplied and the distribution of widgets demanded. They are not equal for nd, ns ~ 1 and you have cases of too many goods supplied for one firm and too few goods supplied to another.

We can also see that the empirical information entropies of the two distributions are 1) not exactly equal to each other and 2) not equal to the theoretical entropy of log₂ 6 (per widget, gray line):

However all three of these become approximately equal when nd, ns >> 1 (e.g. after 500 rolls). The KL divergence also gets smaller (note: log scale) when nd, ns >> 1:
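The example above can be run in a few lines of Python (a sketch, not the original code; the random seed is an arbitrary choice for reproducibility):

```python
import math
import random

random.seed(42)  # arbitrary seed so the run is reproducible
SIGMA = 6        # six-sided die
N_ROLLS = 500    # rolls each for "supply" and "demand"

def empirical_entropy(rolls):
    """Shannon entropy (bits) of the empirical distribution of rolls."""
    n = len(rolls)
    probs = [rolls.count(face) / n for face in range(1, SIGMA + 1)]
    return -sum(p * math.log2(p) for p in probs if p > 0)

def kl_to_uniform(rolls):
    """KL divergence (bits) from the empirical distribution to uniform."""
    n = len(rolls)
    probs = [rolls.count(face) / n for face in range(1, SIGMA + 1)]
    return sum(p * math.log2(p * SIGMA) for p in probs if p > 0)

supply = [random.randint(1, SIGMA) for _ in range(N_ROLLS)]
demand = [random.randint(1, SIGMA) for _ in range(N_ROLLS)]

# After 10 rolls the two empirical distributions look very different;
# after 500 rolls both entropies approach the theoretical log2(6) ≈ 2.585
# bits per widget, and the KL divergence to uniform shrinks.
print(empirical_entropy(supply[:10]), empirical_entropy(demand[:10]))
print(empirical_entropy(supply), empirical_entropy(demand))
print(kl_to_uniform(supply[:10]), kl_to_uniform(supply))
```

Note the identity H = log₂ σ − D_KL for the empirical distribution, so the entropy approaching log₂ 6 and the KL divergence shrinking are the same statement.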

Now the 4x4 boards I drew in the previous post represent a 16-sided die roll (σ = 16) and I showed only about 6 rolls. It looked like this:

In the limit of a large number of rolls nd, ns >> 1 (where information equilibrium becomes a good model), it should look more like this:

Saturday, October 24, 2015

Info EQ 101

Also on my flight yesterday, I started writing up what I would say in a chalkboard lecture (or brown bag seminar) about information equilibrium.
Update: see the addendum for a bit more on some issues glossed over on the first segment relating to comments from Ken Duda below.


At its heart, information equilibrium is about matching up probability distributions so that the probability distribution of demand matches up with the probability distribution of supply. More accurately, we'd say the information revealed by samples from one distribution is equal to the information revealed by samples from another. Let's say we have nd demand widgets on one board and ns supply widgets on another. The probability of a widget appearing on a square is 1/σ, so the information in revealing a widget on a square is - log 1/σ = log σ. The information in n of those widgets is n log σ.

Let's say the information in the two boards is equal so that nd log σd = ns log σs. Take the number of demand widgets to be large so that a single widget is an infinitesimal dD; in that case we can write nd = D/dD and ns = S/dS.


Let's substitute these new infinitesimal relationships and rearrange to form a differential equation. Let's call the ratio of the information due to the number of board positions log σd/log σs the information transfer index k.

We say the derivative defines an abstract price P.

Note the key properties of this equation: it's a marginal relationship and it satisfies homogeneity of degree zero.

We'll call this an information equilibrium relationship and use the notation P : D ⇄ S.
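Collecting the steps above into one place (my transcription of the derivation, using the notation already introduced):

```latex
n_d \log \sigma_d = n_s \log \sigma_s
\quad\xrightarrow{\;n_d = D/dD,\;\; n_s = S/dS\;}\quad
\frac{dD}{dS} = k\,\frac{D}{S},
\qquad k \equiv \frac{\log \sigma_d}{\log \sigma_s},
\qquad P \equiv \frac{dD}{dS}
% Homogeneity of degree zero: D -> lambda D, S -> lambda S
% leaves the right-hand side k D/S unchanged.
```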


Note that the distributions on our boards don't exactly have to match up. But you don't sell a widget if there's no demand and you don't sell as many widgets as you can (with no wasted widgets) unless you match the supply distribution with the demand distribution.

We can call the demand distribution the source distribution, or information source, and the supply distribution the destination distribution. The latter functions as an approximation to the Platonic source distribution.

You could measure the information loss using the Kullback-Leibler divergence. However, information loss has a consequence for our differential equation.


Since the information in the source is the best a destination (receiver) can receive, the information in the demand distribution is in general greater than the information in the supply distribution (or more technically it takes extra bits to decode a D signal using S than it does using D). When these differ, we call this non-ideal information transfer.

Non-ideal information transfer changes our differential equation into a differential inequality.

Which means (via Gronwall's inequality) that our solutions to the Diff Eq are just bounds on the non-ideal case.

What are those solutions?


The first solution is where you take both supply and demand to vary together. This corresponds to general equilibrium. The solution is just a power law.

If we say our supply variable is an exponential as a function of time with supply growth rate σ, then demand and price are also exponentials with growth rates δ ~ k σ and π ~ (k - 1) σ, respectively.
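In equations, the general equilibrium solution and its growth rates (my transcription of the solution described above):

```latex
\frac{dD}{dS} = k\,\frac{D}{S}
\;\Longrightarrow\;
\frac{D}{D_{\text{ref}}} = \left(\frac{S}{S_{\text{ref}}}\right)^{k},
\qquad
P = k\,\frac{D_{\text{ref}}}{S_{\text{ref}}}
\left(\frac{S}{S_{\text{ref}}}\right)^{k-1}
% With S ~ e^{sigma t}: D ~ e^{k sigma t} and P ~ e^{(k-1) sigma t},
% i.e. delta ≈ k sigma and pi ≈ (k - 1) sigma.
```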


There are two other solutions we can get out of this equation. If we take supply to adjust more quickly than demand when demand deviates from some initial value D0 (representing partial equilibrium in economics -- in thermodynamics, we'd say we're in contact with a "supply bath"), then we get a different exponential solution. The same goes for a demand bath and supply adjusting slowly.

Use ΔS and ΔD for S - S0 and D - D0, respectively.


If we relate our partial equilibrium solutions to the definition of price we come up with relationships that give us supply and demand curves. 

These should be interpreted in terms of price changes (these are shifts along the supply and demand curves). If price goes down, demand goes up. If price goes up, supply goes up.

Shifts of the curves involve changing the values of D0 and S0.
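Integrating the differential equation in each bath gives the two curves explicitly (my reconstruction; the signs match the slopes described above):

```latex
% Supply bath (demand moves slowly, D ≈ D_0): the demand curve
\frac{dD}{dS} \approx k\,\frac{D_0}{S}
\;\Longrightarrow\;
P \approx k\,\frac{D_0}{S_{\text{ref}}}\,
\exp\left(-\frac{\Delta D}{k\,D_0}\right)

% Demand bath (supply moves slowly, S ≈ S_0): the supply curve
\frac{dD}{dS} \approx k\,\frac{D}{S_0}
\;\Longrightarrow\;
P \approx k\,\frac{D_{\text{ref}}}{S_0}\,
\exp\left(+\frac{k\,\Delta S}{S_0}\right)
```

Price falls exponentially along the demand curve (in ΔD) and rises exponentially along the supply curve (in ΔS), as stated above.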


From this we recover the basic Marshallian supply and demand diagram with information transfer index k relating to the price elasticities.

Our general solution also appears on this graph, but for that one ΔS = 0 and ΔD = 0 since we're in general, not partial equilibrium. We'll relate this to the long run aggregate supply curve in a minute.


Note that if we have non-ideal information transfer, these solutions all become bounds on the market price, so the price can appear anywhere in this orange triangle.

If we take information equilibrium to hold approximately, we could get a price path (green) that has a bound (black). Normal growth here is punctuated by both a bout of more serious non-ideal information transfer (a recession?) and then a fast (and brief) change in supply or demand (a big discovery of oil or a carbon tax, respectively).


Since we really haven't specified anything about the widgets, we could easily take these to be aggregate demand widgets and aggregate supply widgets and P to be the price level.

We have the same solutions to the info eq diff eq again, with the supply curve representing the short run aggregate supply (SRAS) curve and the general equilibrium solution representing the long run aggregate supply (LRAS) curve.


What if we have a more realistic system where aggregate demand is in information equilibrium with money and money is in info eq with aggregate supply?

Using the chain rule, we can show that the model is encompassed in a new information equilibrium relationship between AD and money (the AS relationship drops out in equilibrium) with a new information transfer index.

And we have the same general eq solution to the information equilibrium condition where AD grows with the money supply and so does the price level.

A generic "quantity theory of money"


Let's say the money supply grows exponentially (as we did earlier) at a rate μ, inflation (price level growth) is π and nominal (AD) growth is ν.

Then π ~ (k - 1) μ and ν ~ k μ

Note that if k = 2, inflation equals the money supply growth rate.
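A quick numerical check of these growth-rate relations (a sketch; the parameter values are arbitrary and the numerical log-derivative is just a convenience):

```python
import math

k = 2.0    # information transfer index
mu = 0.05  # money supply growth rate

def growth_rate(f, t=1.0, dt=1e-6):
    """Numerical log-derivative d(log f)/dt at time t."""
    return (math.log(f(t + dt)) - math.log(f(t))) / dt

# Exponential money supply M(t) = exp(mu t); the general equilibrium
# solution then gives AD ~ M^k and price level P ~ M^(k-1).
M = lambda t: math.exp(mu * t)
AD = lambda t: M(t) ** k
P = lambda t: M(t) ** (k - 1)

print(growth_rate(M))   # mu
print(growth_rate(AD))  # nu ~ k * mu
print(growth_rate(P))   # pi ~ (k - 1) * mu; with k = 2 this equals mu
```

With k = 2, the printed inflation rate equals the money growth rate, as claimed.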

What else?


Let's say nominal growth is ν = ρ + π, where ρ is real growth, and look at the ratio ν/π, writing it in terms of the information transfer index and the growth rate of the money supply (which drops out).

If k is very large, then ν ≈ π, which implies that real growth ρ is small compared to inflation. That means large information transfer index is a high inflation limit.

Conversely, if the information transfer index is about 1, then the price level is roughly constant (the time dependence drops out to leading order). A low inflation limit.
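Both limits follow directly from the growth-rate relations (my transcription):

```latex
\frac{\nu}{\pi} = \frac{k\mu}{(k-1)\mu} = \frac{k}{k-1}
% k >> 1: nu/pi -> 1, so rho = nu - pi is small relative to inflation
%         (the high inflation limit)
% k -> 1: pi = (k-1) mu -> 0, a roughly constant price level
%         (the low inflation limit)
```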

Is the endowment effect rational?

In comments on the previous post, I entertained the possibility of writing a book -- but what would it be about? Equations kill sales (or so I hear), so after some thought I came up with the idea of something like an inverse Freakonomics: a discussion of the many things that economists think are irrational behavior or cognitive biases, but actually make sense from an information transfer standpoint. Unfortunately I had only one example so far (money illusion), so on my flight home last night I tried to come up with another.

It turns out the endowment effect makes perfect sense if you are dealing with non-ideal information transfer.

In the case of non-ideal information transfer, you end up with a price that falls below the supply and demand curves. Let's take it to be the maximum entropy point (see e.g. here). This is illustrated in the graph labeled I below:

The endowment effect in non-ideal information transfer. The initial transaction at price p₁ (I) establishes a lower bound (II) so that a future price p₂ (III) is greater than p₁ (IV).

So the initial sale price p₁ that you buy at creates a new bound for how non-ideal that transaction should be in graph II. You know that the price shouldn't fall below p₁, therefore your asking price might appear near the maximum entropy point in the new smaller space above the red line at p₂ as shown in graph III. It's below the equilibrium -- the intersection of the supply and demand curves -- and so could potentially be accepted. Therefore (IV), we find that the price at which you're willing to sell something is higher than the price at which you purchased it: p₂ > p₁.

Interestingly, this would lead to gradually more and more ideal information transfer as more and more market transactions took place (asking prices above the equilibrium wouldn't get accepted). Therefore maybe the endowment effect is not a problematic cognitive bias, but rather the reason for the existence of markets. If people were more willing to take bad deals (didn't have an endowment effect, less loss aversion), you could get a feedback that goes the other way leading to zero prices and broken markets. 

Thursday, October 22, 2015

Rejection #1

So it was a desk rejection from the Economics e-journal for the preprint. Probably more of those are to be expected given I'm not officially an economist by degree ... but I am a member of the AEA -- doesn't that count? It probably would have gone to my head (in a bad way) had it made it through peer review without a hitch.

It's on the arXiv and also apparently at EconPapers for those that are interested (and/or want to spread it around ... ):

I am somewhat partial to Igor Carron's idea of a public peer review process ... as I mentioned in my first blog post:
Instead of trying (and probably failing) to publish [information transfer model of supply and demand] as a paper, I was inspired by Igor Carron to just think out loud with a blog. This blog will be focused on determining if the framework established here is good for anything or just an interesting toy model. Or if it is completely wrong!

Draghi's open mouth operations

Another Draghi shock on the order of 1% in the exchange rate between the Euro and the dollar:

The Euro had already drifted higher than it was before the last Draghi shock and this fall basically brings it back to the level of September 3rd. And of course Scott Sumner is again saying this is evidence that monetary policy is effective in a liquidity trap. Here's my post from the last time -- basically everything still applies. The variance in the exchange rate is so large that it swamps any impact of "open mouth operations".

Could Draghi perform an open mouth operation every day for the next few weeks and depreciate the Euro by 50%? Could he even say something again tomorrow and make the Euro fall a total of 4%?

In the information transfer model, the long run exchange rate depends on the relative NGDPs of the two countries in comparison, something that is largely out of the control of the central bank when those countries have low inflation (when the IT index is near 1).

Wednesday, October 21, 2015

The paradigm dependence of the productivity slowdown

I mentioned theory-dependent 'facts' the other day, and here is a good example using a Brad DeLong post from yesterday. Here is RGDP and the trends DeLong sees:

There is a 'fact' of two cases of RGDP/productivity slowdown. The information transfer (IT) model trend tells a different story with different 'facts' (this graph is the same result from this post):

In this view, there hasn't been any falling productivity, just a trend towards a lower average RGDP growth rate. However, the more fundamental (and therefore better) measure in the IT model is NGDP:

I don't want to leave the impression that I am saying: Ha, ha, DeLong is wrong! Maybe the IT model is wrong (although the NGDP path is pretty excellent, I must say). I just wanted to illustrate how different theoretical models lead to different interpretations of data -- different 'facts'. And different theoretical models lead to different questions (and therefore research). Why has productivity slowed down? is a question you'd ask if you drew the trend lines DeLong drew; it's not a question that follows naturally from the IT model trend.

PS The NGDP path includes a demography slowdown component as changes in the growth rate of the labor force are the primary source of non-monetary changes. See here.

Monday, October 19, 2015

Interest rate dynamics!

John Cochrane has a new working paper where he looks at interest rate dynamics. I thought I'd do some simulations with the information equilibrium model -- in particular the DSGE form of it. First, as noticed by Ken Duda in comments at Cochrane's post (and Cochrane concedes), we don't really have an interest rate peg in the US today. In fact, I think pegged interest rates are important to whether or not interest rate targets can generate inflation (see here and here). It's actually fairly obvious in the (short term) interest rate data when rates were pegged and when they weren't:

With that out of the way, let's get to the results of the simulation. I looked at the effect of increasing the monetary base on (nominal) output, inflation and (nominal) interest rates using the DSGE form of the IT model. That means that we're taking the IT index k = 1/κ to be constant. Therefore we're neglecting the impact of a changing index, which is the primary reason interest rates start off rising with the monetary base, but then start to fall with the inflection point some time in the 1980s. I normalized all the values to 1 at the first time step t = 1. And one last thing -- I took the nominal shocks (σ at the DSGE form link) to be zero.

Here's what happens when k is 'large' (or κ is 'small') as it was before the 1980s:

Expanding the base causes interest rates, output and inflation to rise. The rise in inflation is proportional to the rise in the base -- recall log P ~ (k - 1) log M. And output rises more than inflation -- expansionary monetary policy causes economic expansion. When k > 1, there is a tendency for k to fall since the increase of log n is smaller than the increase of log m if n > m. Which leads us to our next scenario.

When k is near 1 (meaning κ is near 1), like the situation in the US today, we have the following behavior:

Monetary expansion leads to lower interest rates and only a small amount of inflation and output. This is not quite the neo-Fisherite result since we still have inflation. In fact, this is the typical IS-LM result: monetary expansion lowers interest rates and raises output.

Finally, we have the case that may represent Japan with k < 1:

Monetary expansion leads to falling inflation, falling interest rates and falling output.  This is the neo-Fisherite result where inflation follows interest rates. Remember, this is output measured in money, not actual widgets. Also note that if we include nominal shocks, that puts a floor on the growth of n roughly equal to the growth of the labor force. If the labor force is growing at a rate of 1%, then n won't go below 1% on average (the same for 0% or -1%, as a shrinking labor force may be more relevant for some countries).

Those are cases I've talked about before (in much more confusing ways [1], [2]), but with the same basic results.

PS Here is the (very simple) code (onset of monetary expansion at t = 2, well, ii = 2). To get the other results, I used k = 1.1 and k = 0.9
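For those who don't want to follow the link, the qualitative behavior can be sketched in a few lines of Python. This is not the original code: it uses log N ~ k log M and log P ~ (k - 1) log M with zero nominal shocks, and proxies the interest rate with r ∝ (N/M)^c -- an assumption standing in for the full interest rate model -- so it only captures the direction of the price level and interest rate responses (rising for large k, falling for k < 1), not the full output dynamics:

```python
# Stripped-down sketch (not the original code): monetary expansion
# begins after the first time step, nominal shocks are zero, and the
# interest rate is proxied by r ~ (N/M)^c (an assumption).

def simulate(k, steps=10, expansion=0.05, c=1.0):
    """Return normalized paths of base M, output N, price level P, rate r."""
    M = [1.0]
    for _ in range(steps - 1):
        M.append(M[-1] * (1.0 + expansion))  # base expands after t = 1
    N = [m ** k for m in M]            # log N ~ k log M
    P = [m ** (k - 1) for m in M]      # log P ~ (k - 1) log M
    r = [(n / m) ** c for n, m in zip(N, M)]  # proxy: r ~ (N/M)^c
    return M, N, P, r

for k in (1.5, 1.0, 0.9):
    M, N, P, r = simulate(k)
    print(k, "price level rises:", P[-1] > 1.0, "rate rises:", r[-1] > 1.0)
```

For k > 1 the expansion raises the price level and the rate; for k < 1 both fall (the neo-Fisherite direction); at k = 1 the price level is flat to leading order.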

Utility maximization and crystal spheres

Atrios gave Justin Wolfers one of his world's worst humans 'awards' for his rather stunning lack of reflection. My co-worker and I happened to discuss Kuhn the other day and it is particularly relevant here. I'll just quote from this summary instead of Kuhn directly because it's shorter:
The theory-dependence of observation, by rejecting the role of observation as a theory-neutral arbiter among theories, provides another source of incommensurability. Methodological incommensurability ... denies that there are universal methods for making inferences from the data. The theory-dependence of observation means that even if there were agreed methods of inference and interpretation, incommensurability could still arise since scientists might disagree on the nature of the observational data themselves.
In a sense, there is disagreement between humanities and social science/economics over what constitutes fact. And that's because you need theory in order to make sense of facts. There is no such thing as theory-neutral facts.

For example, a sociologist or historian might consider an economic system in terms of institutions. In that case the existence of institutions (law, marriage and family, religion, media, ...) counts as an important fact in moving towards understanding the system. And in fact, there are institutions!

Now contrast this with an economist considering an economic system in terms of utility maximization with (possibly bounded) rationality. We don't really know if this is true -- many experiments and 'stylized facts' seem to show we're not terribly rational as humans: money illusion, endowment effect, hyperbolic discounting, etc [1]. Therefore the construct you are attempting to understand the system with may not actually exist.

As an analogy, Wolfers was potentially saying: if other people made more convincing arguments -- prizing crystal spheres over rhetoric -- maybe we'd listen to you about astronomy.

We don't know if utility maximizing agents exist (much like how Aristotle didn't know if crystal spheres existed), but we do know institutions exist.

Now it is entirely possible (even likely) institutions are not the best way to understand economics. But I think it is more likely utility maximizing agents (even including perturbations around rationality) are the crystal spheres of economics -- and that utility maximization represents an unscientific approach to economics.

Like crystal spheres, utility maximizing agents have not been observed.

Even experiments in economics (that won a Nobel prize) don't really observe utility maximizing agents. They observe that given well-ordered preferences (i.e. defined by real numbers), agents maximize utility. But that's a bit like saying that agents, given utility and told to maximize it, maximize utility. The best predictive result in economics that Noah Smith is fond of pointing out used a random utility model (that isn't terribly different in structure from the partition function approach on this blog). Some of the most robust findings in economics are deviations from utility maximizing agents (money illusion, hyperbolic discounting, endowment effect). Utility maximizing agents do not seem to have been observed -- if you can think of any experiments that observe them, let me know in comments!

Ok, what about atoms and statistical mechanics?

Yes, atoms were not observed when they were postulated. However, when atoms were postulated, they were considered too small to be observed. Crystal spheres are actually more scientific than utility maximizing agents because crystal spheres were thought too far away to be observed. Utility maximizing agents are a model of human beings -- they are readily observable.

Like crystal spheres, utility maximizing agents are motivated by a specific philosophy.

Utility maximizing agents are motivated both by utilitarianism (Bentham, Mill) and how self-interest can self-regulate (Adam Smith). The crystal spheres are motivated by the idea that circles are perfect and that the universe itself must be perfect.

Like crystal spheres, utility maximizing agents make calculations tractable given mathematical tools at the time.

Sure, many epicycles and deferents were added to the crystal sphere model to make it more accurate -- regressions hadn't been invented, but they would have greatly simplified the process. In the end, it's just geometry, and Anaximander was a contemporary of Thales. Utility maximizing agents solve Lagrange multiplier problems invented in the 1700s (Lagrange was a contemporary of Adam Smith).

I just thought that last one was interesting. Most mathematical solutions to problems use mathematics available at the time. An exception is string theory, which is not really tractable except in maybe the most symmetric of cases.

Utility maximizing agents: not observed (when they should be) and motivated by a specific philosophy (only) equals unscientific (in my view) [2].

But then, it is the standard approach in economics, so I will dutifully express information equilibrium results in terms of epicycles, I mean, utility maximizing agents and perturbations from rationality.



[1] Note that these things exist as stylized facts because of the utility maximizing framework. They might not be deviations in the 'correct' economic theory. It's like having a frame where all swans are black and saying: except that white one ... and that white one ... and that white one.

[2] You might ask: how does the information equilibrium approach fare under these criticisms? Well, information has been observed (it underlies communication theory) and it's not motivated by a specific philosophy. There are general philosophical motivations (more for the methodological approach -- Kuhn's differing weights of simplicity, scope, accuracy, fruitfulness and consistency) but it doesn't really depend on a particular world-view. Unless you count nihilism -- I am motivated by a nihilistic approach to economics, the view that macroeconomic aggregates are as meaningless as the universe. We largely can't know the future, and in the cases where we can, we can't change it.

Sunday, October 18, 2015

What is real growth?

Commenter John Handley points out (correctly) that Paul Romer is talking about real output, while in my previous post, I talk about the price level. The problematic extrapolation from RGDP growth rates doesn't really depend on using real output or the price level (specifically, the physical analogy I wrote down); I was attempting to connect the result to a previous result about the price revolution of the 1500s (or so).

So I went back and re-worked the result in terms of real output (R) with real growth rate ρ (i.e. R ~ exp ρ t). I also made use of some stuff from the section on the AD-AS model in the paper. Using the market (information equilibrium relationship) P : N ⇄ M we can show

k M ≤ R ≡ N/P

And the picture we produce is very similar to the price level one and has the same interpretation -- simply extrapolating back with constant real growth rate ρ doesn't capture the fact that there probably wasn't a monetary economy in the past:

What's also interesting is if we assume information equilibrium and use the "money mediated AD-AS model" from the paper, i.e.

N ⇄ M ⇄ S

we can show that, with k = kn/ks (where kn is the IT index of the market N ⇄ S and ks is the IT index of the market M ⇄ S), we have the information equilibrium relationship R ⇄ S with IT index ks. That means

R ~ exp ρ t ~ exp ks σ t

where σ is the growth rate of aggregate supply. Therefore ρ = ks σ. The real rate of growth ρ is only proportional to the rate of growth σ of aggregate supply widgets. See the derivation in the postscript below.
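This result can be sanity-checked numerically. The sketch below (with made-up values for kn, ks and σ, not fit to data) assumes the ideal information equilibrium solutions N ~ S^kn and M ~ S^ks, with the price detector P = dN/dM = k N/M:

```python
import math

# Hypothetical IT indices (illustrative values, not fit to data):
# N ~ S^kn (market N <-> S) and M ~ S^ks (market M <-> S)
kn, ks = 2.0, 1.5
k = kn / ks          # implied IT index of the market N <-> M, since N ~ M^(kn/ks)

sigma = 0.02         # growth rate of aggregate supply S

def R(t):
    """Real output R = N/P with P = dN/dM = k N/M, so R = M/k."""
    S = math.exp(sigma * t)   # S ~ exp(sigma t)
    M = S**ks                 # from M <-> S
    N = M**k                  # from N <-> M
    P = k * N / M             # price detector
    return N / P

# Measured real growth rate rho from log R over a finite interval
t0, t1 = 0.0, 50.0
rho = (math.log(R(t1)) - math.log(R(t0))) / (t1 - t0)
assert abs(rho - ks * sigma) < 1e-12   # rho = ks * sigma
```

The measured growth rate of R comes out as ks σ, not σ, matching the derivation.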

We don't know if ks is changing or even what its value is (except that it must be of order kn so that k ~ 1 empirically). It could be greater than one or less than one. Therefore so-called real output doesn't necessarily represent real widgets, but rather some conversion of physical widget units into money units. We have N/P = N/(dN/dM), so the units of real output are the units of M (i.e. dollars).

An argument about real growth isn't an argument about physical widgets, so saying 2% real growth extrapolated backwards to the year 1000 is a small number doesn't tell you anything about the number of physical widgets. If ks > 1, then σ < 2% and the number of physical widgets in the year 1000 could be much greater than would be surmised from a tiny value for R.
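To put rough numbers on this (with a purely illustrative value of ks), compare extrapolating measured real output back 1000 years with extrapolating the underlying widget count:

```python
import math

rho = 0.02        # measured real growth rate (2% per year)
years = 1000      # extrapolating back to the year 1000

# Romer-point style extrapolation of real output R ~ exp(rho t)
R_then = math.exp(-rho * years)    # relative to R = 1 today

# If ks > 1, widgets S only grow at sigma = rho / ks
ks = 2.0                           # hypothetical value, for illustration only
sigma = rho / ks
S_then = math.exp(-sigma * years)  # relative widget count in the year 1000

# Output measured in money shrinks much faster than physical widgets
assert S_then / R_then > 1e4
```

With this (made-up) ks, the implied widget count in the year 1000 is over four orders of magnitude larger than the tiny extrapolated value of R would suggest.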

Of course, this is a model dependent result. And that is the point -- Romer's argument that real growth has been accelerating is also model dependent.


PS Here are more details of the derivation (in long hand):

Saturday, October 17, 2015

Can we extrapolate growth into the distant past?

Paul Romer on economic growth [1]:
My conviction that the rate of growth in GDP per capita at the technological frontier had to be increasing over time sprang from a simple calculation. Suppose the modern rate of growth of real GDP per capita (that is the growth rate after taking out the effects of inflation) is equal to 2% per year and that income per capita in year 2000 is €40,000. If this rate had prevailed for the last 1000 years, then in the year 1000, income per capita measured in the purchasing power of dollars today would have been €0.0001, or 0.01 cents. This is way too small to sustain life. If the growth rate had been falling over time instead of remaining constant, then the implied measure of GDP per capita in the year 1000 would have been even lower.
Emphasis mine. That is one possible conclusion. The other is that an RGDP per capita of €0.0001 (let's call this the Romer point) implies that there wasn't a functioning monetary economy in the year 1000. This second view can be described in the information transfer framework where it also makes sense of the price revolution of the 1500s (or so). Here is the picture I have in mind:

I use the price level P instead of RGDP per capita [2] not only because it is more relevant in the information equilibrium picture, but also because it better illustrates what is going on in my view. It's not that real output was zero, but rather that the price of everything was effectively zero: there wasn't a functioning monetary economy. The market (information equilibrium relationship) P : N ⇄ M hadn't been established; we had non-ideal information transfer I(N) > I(M), so that P < k Mᵏ⁻¹. Real output R = N/P would be undefined. The gray line indicates ideal information equilibrium while the blue line is the realized economy (non-ideal at first, ideal later). There was a transition to a monetary economy in Europe, historically identified with the price revolution [3], followed by economic growth.

This picture is similar to what happens in the p-V diagram in the liquid-gas transition (discussed before here). And that brings up a characterization of Romer's argument that I'll admit is unfair -- but not incorrect.

Romer's argument that proceeding back in time assuming a growth rate of 2% per year implies income too small to sustain life in the year 1000 is analogous to saying:
Looking at the p-V diagram for an isothermal process, it appears you can eventually compress an ideal gas into a volume smaller than an atom therefore the number of degrees of freedom in p = (2/f) U/V must be shrinking as a function of volume.
That is a plausible guess, but what really happens is that the gas condenses into a liquid and you go from ideal information equilibrium to non-ideal information transfer between the process variables U and V. Inter-molecular forces begin to matter. You undergo a phase transition. The correct conclusion to draw from the argument is that the gas should behave differently, but the argument is not evidence that you know in what way it behaves differently.

Romer should have drawn the conclusion that the economy in the year 1000 must have been different from a modern economy, not that the rate of growth must be increasing over time.

In terms of the price level picture (note: Romer is arguing about RGDP growth) the solution he proposes to the Romer point (real output isn't worth anything in the year 1000) above looks like this:

That is to say, we increase the estimated value of real output in the year 1000 by changing how we calculate it -- using an accelerating growth model instead of constant growth. But maybe real output wasn't worth anything in terms of money in the year 1000 [4].

As the title asks: can we extrapolate growth into the distant past? The real question is: can we extrapolate a monetary economy into the distant past?

Update 10/18/2015

John Handley points out (correctly) that Romer is talking about real growth. I noted that above, but here's a less hand-waving version of the idea that real output measures output in terms of money, but not necessarily physical widgets:



[1] I changed the dollar signs to € to prevent problems with my mathjax script, but even there we see why this might be a bit strange. Why would you measure income in dollars in the year 1000? Who would take fiat currency in the year 1000 except possibly in China?

[2] As I mention here, the real rate of growth may not be a well-founded concept.

[3] I think this price revolution might refer to the gold standard and there might be a different one associated with fiat currency.

[4] I was under the impression that I said something on the blog along the lines of if we measured output in terms of widgets we probably wouldn't have the decelerating growth of secular stagnation ... but I can't find it. The thing is that the information transfer model is a model of stuff measured in terms of money. The aggregate output of bread in terms of money has a different history than the aggregate output of bread in terms of loaves.

The SMD theorem and ... oh, no ... not another physics analogy

The SMD theorem says that the aggregate market demand curve only inherits a few properties from the individual demand curves: continuity, homogeneity of degree zero, Walras' law and a boundary condition that demand is large as prices go to zero.

This basically says that the aggregate theory doesn't necessarily have much to do with the individual agents ... that the SMD theorem is the "anything goes theorem" (at least from the Wikipedia article).

I've said before that homogeneity of degree zero seems to be a symmetry principle for economics. And Walras' law (that I've looked into before: here and here) is something like a conservation law. In physics, we'd say that there is a macro scale and a micro scale instead of aggregate market and individual agents, respectively. Translating the SMD theorem into these physics terms, we have the statement:
The theory at the macro scale only inherits conservation laws and symmetry principles from the theory at the micro scale.
That is exactly the idea behind constructing an effective field theory!

See the link for the definition of effective as it's used here. Let's say we start with the fundamental theory of quarks and gluons, QCD. It has an SU(3) symmetry (color), an (approximate) SU(2) symmetry (weak isospin/chiral) and an SO(3,1) symmetry (Lorentz), and obeys the basic conservation laws of physics (momentum, energy). Now QCD is impossible (well, almost impossible -- nice job!) to solve at the "macro" scale, so what do we think it looks like?

Well, if we put the theory in a Lagrangian, that takes care of the conservation laws [1]. Do we have a theory of a representative quark? That doesn't make sense, really. Quarks need to come in pairs or triplets to cancel their color charge. In economics, agents actually need to sell stuff to each other, so you can't really have just one agent [2].

So what do we observe at the macro scale? Protons, neutrons, pions, ... hadrons. In macroeconomics, people have constructed aggregates like the price level and NGDP. Macro is primarily concerned with these aggregates. So hadrons on one hand and macro aggregates on the other are our "particle content".

In physics, we then put these together into terms in the Lagrangian in ways that maintain the symmetries [1]. It's mostly a process of index matching where you make sure there aren't any hanging spacetime indices (Lorentz symmetry) or isospin indices (chiral symmetry), etc. The key thing to understand is that an effective field theory (like chiral perturbation theory [3, 5], with empirical parameters like the pion decay constant fπ) can look totally different from its microscopic (fundamental) theory (like QCD with only one parameter gs [4]).

I think this is the heart of the SMD theorem. The effective theory of macroeconomics can (and probably does) look entirely different from the fundamental theory of microeconomics.

The call for microfoundations was basically a call for a quark and gluon mechanism behind every chiral perturbation theory result. That would be tedious. I can calculate a pion scattering cross section more easily using fπ and mπ with values measured empirically than I can by setting up a lattice calculation to compute them from QCD. Additionally, there is no fπ for a single quark! The pion decay constant only makes sense for a pion.

And there we can make a connection to the information equilibrium approach: we measure the information transfer (IT) index k (or kappa) empirically. You could potentially calculate k from a large simulation of individual agents ... and more power to those that want to try. But k does not exist for an individual agent -- it is a property of aggregate agents. [5]

The bottom line is that representative agents (at the micro scale) and microfoundations assume the structure of the effective macroeconomic theory. They represent massive limitations of the possibilities -- and from what I've seen, may even exclude the real theory of macroeconomics.


[1] I'd like to think of information equilibrium relationships as the equivalent to Lagrangians that essentially guarantees Walras' law and homogeneity of degree zero.

[2] Is there macro money confinement? At the macro scale we never see money outside a transaction? Potentially interesting ... but total speculation.

[3] Chiral perturbation theory was such a good paradigm in the development of this whole idea in physics that sometimes people say effective field theory (the general term) to mean chiral perturbation theory (a specific effective field theory of QCD).

[4] Chiral QCD has massless quarks.

[5] As I mention in a footnote in the paper, as well as at the effective field theory post mentioned above, the information equilibrium model I present may only be the leading order effective theory. We have 

dD/dS = k D/S + c (D/S) (dD/dS) + ...
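Dropping the higher-order terms, the leading-order condition dD/dS = k D/S has the general solution D = D_ref (S/S_ref)^k, which can be checked with a quick numerical derivative (the parameter values here are arbitrary):

```python
# Check that D(S) = D_ref (S/S_ref)^k solves the leading-order
# information equilibrium condition dD/dS = k D/S
# (D_ref, S_ref and k are arbitrary illustrative values)
D_ref, S_ref, k = 2.0, 1.0, 1.5
D = lambda S: D_ref * (S / S_ref)**k

S0, h = 3.0, 1e-6
dD = (D(S0 + h) - D(S0 - h)) / (2 * h)   # central difference at S0
assert abs(dD - k * D(S0) / S0) < 1e-6
```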

For chiral perturbation theory, the Lagrangian gets pretty unwieldy pretty fast:

Friday, October 16, 2015

Emergent representative agents: a means to an end

David Glasner responds to Tom Brown:
The problem with an emergent representative agent is that you need to explain emergence before you know what the emergent representative agent actually looks like, so I don’t see that arguing in terms of an emergent representative agent actually accomplishes anything. It still seems to me like a form of question begging.
I'd completely agree with David's sentiment in the case where you are actually trying to understand how something works. That is to say my recent set of posts talking about an emergent representative agent are actually meta-arguments. Updating my language a bit to use Gary Becker's paper, I am trying to argue three things:
  • The use of rational agents is not an immediate reason for mainstream economic theory to be wrong on its face. There's a lot of what Noah Smith calls lazy econ criticism that makes arguments like 'economists assume rational people, but people aren't rational ... LOL'. If a seemingly rational agent can emerge on average from irrational behavior, then that takes the wind out of the sails of behavioral theories.
  • Microfoundations are probably irrelevant to macroeconomics. Macroeconomics can have very different properties from its microfoundations. This should have been well understood after the SMD theorem, but as Kirman [pdf] says the representative agent tries to sneak around it. Showing that the emergent representative agent relevant to macro can have very different properties from the individual micro-agents should put a stop to the sneaking.
  • Since rational agents can emerge from solely the properties of opportunity sets, let's skip the middleman (middle-agent?) and just use the mathematics of opportunity sets. The mathematics of opportunity sets is information theory (what messages could be constructed "given the opportunity" to use x bits). This information theory leads to basic supply and demand logic (but allows for specific failure modes) and is the impetus behind my paper.
Only the third one would be directed at David Glasner since he agrees with the second and doesn't make the first argument.

The first argument amounts to a defense of a swath of traditional economics. The second is a swipe at a different swath. The third is a potential third way -- or more like a rethink of the traditional diagram approaches.

Thursday, October 15, 2015

When is an intertemporal budget constraint a true budget constraint?

David Glasner cautioned me about the use of an intertemporal budget constraint (since it is based on expectations that could be thwarted) in my emergent representative agent argument (that parallels Gary Becker's argument that even irrational agents can behave rationally). If you have no idea what I am talking about, you should start here. At that link, I said I would address this issue in a future post -- this is that post.

The question is whether we can pretend the intertemporal budget constraint is a true budget constraint analogous to a single period budget constraint for the purpose of constraining the (intertemporal) opportunity set (as Becker puts it). I called the opportunity set the simplex. The issue is that recessions happen and output falls below its expected value (plans are thwarted). That means the expected value isn't a true constraint (actual output can be much less) and therefore the most likely state of the economy might not:
  • Manifest consumption smoothing
  • Saturate the intertemporal budget constraint
  • Have downward sloping demand curves
  • (Approximately) maximize utility
The crux of the argument showing these hold (if we ignore recessions) with random consumption is that the most likely point of a high-dimensional simplex is on its surface, rather than in the interior. The downward sloping demand curves hold as long as the centroid of the simplex isn't close to any particular axis.

Let's take the opportunity set defined by the intertemporal budget constraint Σ ci ≤ M (see the picture at the top of this post). For d dimensions (time periods) we have

Σ ci ~ M d/(d + 1)

If d >> 1, then we have Σ ci ~ M and the above properties hold: we have emergent rational agents.
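A Monte Carlo sketch of this claim: sample points uniformly from the solid simplex Σ ci ≤ M (via normalized exponentials; the dimension and budget values below are arbitrary) and check that the average total consumption approaches M d/(d + 1):

```python
import random

def random_budget_point(d, M):
    """Uniform random point in the solid simplex {c_i >= 0, sum c_i <= M}."""
    e = [random.expovariate(1.0) for _ in range(d + 1)]
    s = sum(e)
    return [M * x / s for x in e[:d]]  # drop the last coordinate -> interior points

random.seed(0)
d, M, trials = 50, 100.0, 20000
mean_spent = sum(sum(random_budget_point(d, M)) for _ in range(trials)) / trials

# E[sum c_i] = M d/(d+1): nearly the whole budget is spent for large d
assert abs(mean_spent - M * d / (d + 1)) < 0.5
```

For d = 50 the random (irrational) consumption points spend about 98% of the budget on average, which is the sense in which the most likely point sits near the surface of the simplex.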

What happens if we introduce recessions? Let's take a set of {ci} of size n to be zero: all plans are thwarted in those periods and there is no income ... a 100% recession. Note that for even a bad recession, you're really looking at 20%, not 100%, so this will be a conservative calculation. We can show

Σ ci ~ M (d - n)/(d + n + 1)

However, if d >> n and d >> 1, we still recover Σ ci ~ M. In fact, we still recover all of the properties listed above (if d >> n, then the centroid isn't close to a typical axis and you have downward sloping demand curves almost always). Basically, as long as the economy isn't in a recession most of the time, the intertemporal budget constraint can be treated as an ordinary budget constraint.

Note that the properties hold even if the consumption sets are of different size (i.e. there is economic growth) as long as there is no cj >> ck for most k (in which case Σ ci ~ cj). That case is a lopsided simplex (the opportunity set would be more like a spike along the dimension j). Can growth make our simplex lopsided? The Euler equation says that successive consumption periods are related by the discount factor β and the rate of interest R, and we have:

ci+1 ~ β (1 + R) ci

This means that the rate of interest can't be too large. Large here though is a bit extraordinary; we'd need

β c0 (1 + R)^d >> c0 (1 + R)^(d - 1)


i.e.

β (1 + R) >> 1

to make a truly lopsided simplex.
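A small numerical illustration of how extraordinary β(1 + R) >> 1 is: with ci growing like r^i where r = β(1 + R) (the values below are illustrative), the share of total consumption in the final period stays small for modest r and only spikes when r is far above one:

```python
# Share of total consumption in the last period when c_i ~ r^i,
# with r = beta * (1 + R)  (values chosen purely for illustration)
def last_period_share(r, d):
    total = sum(r**i for i in range(d + 1))
    return r**d / total

# Modest r: consumption is spread out -- no lopsided simplex
assert last_period_share(1.02, 50) < 0.05
# r >> 1: consumption concentrates in the final period -- a spike
assert last_period_share(2.0, 50) > 0.49
```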

So the final takeaway is that we can treat the intertemporal budget constraint like a true budget constraint in order to demonstrate emergent rationality from random behavior. The caveats are:

  • The economy is not in a recession in most time periods
  • Consumption isn't concentrated in a few time periods

It is true that when you look at time periods near recessions, things are more interesting. However that is the point of non-ideal information transfer. In the information equilibrium model, information equilibrium is taken to hold most (but only most) of the time.