Friday, January 30, 2015

Is the demand curve shaped by human behavior? How can we tell?

I'm in the process of writing (yet another) simple introduction to the information equilibrium view of supply and demand, but stumbled onto an issue that -- while I may be wrong about it -- really seems to fly in the face of basic economics. I asked myself the question: how would we go about determining the shape of the demand curve -- especially in a way that lets us see human behavior at work?

You might think experiments would help here. However e.g. Vernon Smith's approach assumes utility. If you give your agents utility functions that obey the conditions of the Arrow-Debreu theorem, then an equilibrium must result from just the pure mathematics of it, along with the mechanics of supply and demand -- regardless of human behavior. This is basically just a restatement of the idea that assuming homo economicus (by giving people well-defined utility functions) effectively implies ideal markets.

I started to look at some classroom experiments ... and saw that they don't actually demonstrate what they set out to demonstrate.

Take this experiment [1] for example (it is not unusual). The idea is that students write down a reservation price and the instructor collects the cards and tallies up the number that would buy at a given price (or, in this version [2], they all stand up and sit down as the called-out price gets too high). As the price goes up, the number of students willing to pay goes down. Makes sense.

But is this measuring a demand curve (i.e. things like diminishing marginal utility)? No. And it is especially clear in [1] if you look at their graph. It's not a demand curve, it's an inverse survival curve for a normal distribution:

That is to say it's a cumulative distribution function of a normal distribution turned on its side (see the second graph here). What this is measuring is the (normal) distribution of price guesses from the students:

It is especially telling that in experiment [2] above, they leave off the last few students -- i.e. the last piece of the CDF where it stops being linear.
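This point can be made concrete with a quick simulation (a toy sketch; the normal distribution, the $1.00 mean price and the 200 students are assumptions for illustration only):

```python
import numpy as np

# Toy model: students' reservation prices for a bag of M&M's, assumed
# normally distributed around a 'known' average price of $1.00
rng = np.random.default_rng(0)
reservation = rng.normal(loc=1.00, scale=0.25, size=200)

# At each called-out price, count how many students would still buy; this
# traces out the survival function of the distribution, not 'demand'
prices = np.linspace(0.25, 1.75, 31)
buyers = np.array([(reservation >= p).sum() for p in prices])
```

Plotting `prices` against `buyers` reproduces the classroom "demand curve" -- a normal CDF turned on its side -- with no marginal utility anywhere in the model.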

This doesn't have anything to do with human behavior. Why is that? Because I can get the exact same "demand curve" using brainless atoms. The contribution to the pressure from one atom is based on the force it exerts against the container -- the change in momentum as it reflects off the wall. That change in momentum is proportional to its velocity, and in an ideal gas, the atoms follow a Maxwell distribution:

If we asked atoms to sit down as velocities were called out -- each atom sitting once the called-out velocity exceeded its own -- we'd get the following "demand curve":

I used this example since both price vs demand and pressure (velocity) vs volume come from the same derivation in the information transfer model and neither require human behavior to explain.
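The atom version is the same calculation with a Maxwell distribution (a sketch; speeds are sampled via the standard construction as the norm of a 3D Gaussian velocity vector, in arbitrary units):

```python
import numpy as np

# Speeds of 'brainless atoms': the norm of a 3D Gaussian velocity
# vector follows a Maxwell distribution (ideal gas)
rng = np.random.default_rng(0)
v = rng.normal(0.0, 1.0, size=(200, 3))
speed = np.linalg.norm(v, axis=1)

# 'Ask the atoms to sit down' as called-out velocities rise: the number
# still standing traces the same kind of inverse survival curve
calls = np.linspace(0.0, 4.0, 41)
standing = np.array([(speed >= c).sum() for c in calls])
```

The shape of `standing` vs `calls` is qualitatively the same as the students' curve -- and no atom is making a utility judgment.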

Now you might say that students' knowledge of the average price of M&M's (in the example) shows how human behavior enters the equation; they see the value of other goods and make utility judgments. But! Atoms also seem to 'know' the average velocity of the ideal gas in the analogous experiment -- set by the thermodynamic temperature. The students know the value of a packet of M&M's because it is set by the value of money (perhaps set by an economic temperature) -- something controlled by the central bank in most economic models.

So how do we see a demand curve in a way that incorporates human behavior?

In the MR University video, after using 'Black Friday' as an example (which is actually the experiment discussed above), they move on to describing it in terms of substitution. That is definitely human behavior, right? We decide to buy other things with our money!

Well, actually ... how is that different from the experiment above? When you have an estimate of the price of M&M's and the price goes above that reservation price you are effectively making the statement "I'd rather spend my money on something else" (or "I don't have that much money" in some cases). Something being "too expensive" and opting to save the money for something else are logically equivalent statements.

Now you might say that in the case of atoms when things get "too expensive" (too high a velocity) it's because they "can't afford it" (their velocity is all they have), not because they've decided to keep their "money" (velocity) for something else.

And that would be true ... for a single container with an ideal gas. But multiple markets is like multiple containers (with the same number of atoms**, i.e. students) at different temperatures (i.e. prices). So while 20% of atoms have a velocity of at least 1 in one market, that same fraction will clear a higher or lower velocity threshold in another, corresponding to 'money' they'd 'spend' on something else.

So, again, how do we see a demand curve in a way that incorporates human behavior?

I'm probably missing something. There could be other experiments*** that show human behavior shining through. Just because I don't know what they are doesn't mean they don't exist.

There is an ulterior motive here, and it's not just that I think starting with humans as optimizing agents is likely not only intractable, but unnecessary. It's that in writing that simple introduction I mentioned at the top of this post I realized that the information transfer model, in an ideal market, has literally nothing to do with the behavior of the agents. Supply and demand are a property of two quantities that are in information equilibrium ... and the mechanics follow**** from D = κ P S. Hold D constant and as S goes up, P must fall (a demand curve). Hold S constant and as D goes up, P must go up (supply curve).
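As a toy illustration of those mechanics (the numbers are arbitrary):

```python
# A minimal sketch of the mechanics of D = κ P S: the price is
# P = D / (κ S), with κ treated as a constant here
kappa = 1.0

def price(D, S, kappa=kappa):
    return D / (kappa * S)

# Hold D constant: as S goes up, P falls (a demand curve)
assert price(100, 10) > price(100, 20)
# Hold S constant: as D goes up, P rises (a supply curve)
assert price(100, 10) < price(200, 10)
```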

That's all there is ... and if that's all there is ...


** We are glossing over the fact that we have the capability to distinguish different people and e.g. assign a particular price estimate in each market to a particular person -- something we can't really do for identical atoms. However, there is the question of whether the market can see people as distinguishable ... using money makes transactions anonymous.

*** You might think of an experiment where you reduce the supply, watch the price go up, and then survey people about why they decided not to buy. However 1) the mechanism you are using assumes supply and demand, and 2) since you are reducing the supply, some people will have to buy less regardless of how they feel about it. (Humans are subject to post-hoc rationalization, so the survey would be suspect, anyway.)

**** You can get different shaped demand curves from this equation -- it's actually just an instantaneous equation and P is a derivative (dD/dS).
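Making that footnote concrete: combining $P = dD/dS$ with $D = \kappa P S$ (for constant $\kappa$) gives a differential equation whose solution fixes the functional form,

\frac{dD}{dS} = \frac{1}{\kappa} \frac{D}{S} \;\;\Rightarrow\;\; D \sim S^{1/\kappa}

so constant $\kappa$ yields power-law curves, and other shapes follow when $\kappa$ varies.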

Tuesday, January 27, 2015

This is why sociologists think economists are arrogant

Tyler Cowen cites a study that makes claims about inflation and tolerance of the LGBT community
On the other hand, the data shows that when a society has impressive scores on property rights security and low inflation — two other components of economic freedom indexes — these characteristics are strongly and positively correlated with tolerance of gays. It’s possible that low inflation, and the behavior of a central bank, are stand-ins for the general trustworthiness of a nation’s government and broader institutions, and such trustworthiness helps foster tolerance.
That conclusion kinda depends on low inflation being the result of the actions of institutions, doesn't it? That implies a monetarist view, but even if you allow the word 'institutions' to be more inclusive of e.g. a 'fiscal theory of the price level', that still implies national elites exerting some sort of control that produces the low inflation.

Countries mired in liquidity traps are on average more tolerant? That makes the question seem even weirder. And for market monetarists, a liquidity trap is a sign of incompetent institutions (namely the central bank).

In the information transfer model, low inflation is generally a result of a monetary policy regime being around for a long time. All economies tend toward lower inflation in the long run (see the graph at the top of this post).

That leads us to the conclusion that, basically, long periods of economic stability lead to tolerance.

Which is something well known in sociology.

Simon Wren-Lewis has a good piece on economists and sociologists. There was also this from Stephen Williamson. But it's the armchair sociology that derives from economic analyses that is problematic.

Lee Smolin's take on Arrow-Debreu

I've been reading Lee Smolin's (of loop quantum gravity fame) take on Arrow-Debreu equilibrium (mentioned by Tyler Cowen awhile ago), and I was struck by how much we are looking at the economic problem the same way. It's probably just our shared backgrounds in particle physics. His take is a lot more sympathetic to the idea that gauge symmetries and other mathematics may be of use, something I would probably share if the information transfer model didn't seem to point to non-ideal information transfer or spontaneous drops in entropy from time to time [1]. Anyway, I've put together (mostly just for my own notes) a collection of quotes from Smolin's paper with something similar I've said on this blog. I follow the quotes with square brackets that get at the differences between what we are saying.

Let's go ...

Smolin: We are then interested in the simplifications that may happen in limits of large numbers, such as a large number of agents, or of goods. In these limits there may be universality classes that only depend on a few parameters that characterize the behaviors of the human actors that comprise the economy

Me: In reality, there may be a detailed balance that keeps the equilibria in an equivalence class described by e.g. a given NGDP growth rate. But that's the rub! Macroeconomics is the study of the behavior of those equivalence classes, not the instances of them!
[This is basically the same thing -- universality class is a particular kind of equivalence class.]
Smolin: The observables of economics are accounting and other records. One should then try to construct a theory of economics that involves only observables. The importance of this kind of operational principle in physics and other sciences have been paramount. Let’s see what it can do for economics. This restricts attention to what is actually measured by companies, individuals and governments. And it removes from consideration fictional elements that have nothing to do with how real economies work such as fixed spaces of products, fixed production plans, utility functions etc

Me: Game theory is a representation of economic microfoundations; the information transfer framework makes as few assumptions about microfoundations as possible. We are assuming we don't know any game theory. ... The following is a rough sketch, but one way to think about the information transfer model is as an ensemble of games with random payoff matrices (zero-sum and not) with players repeating games, switching between games and possessing some correct and/or potentially incorrect information about the payoff matrix (or its probability distribution). The only "constraint" is that all of the realized payoffs sum up to a macroeconomic observable like NGDP.
[Where Smolin says leave out unobservable things like utility, I say assume you know nothing about them -- in this case a payoff matrix in game theory is essentially a set of contingent utility functions. Expectations also seem unobservable in this sense.]
Smolin: An analogy to physics might be helpful here. Just like there is micro and macro economics, there is micro and macro physics. The former is atomic physics, the latter includes thermodynamics and the description of bulk matter in different phases. Macrophysics mostly deals with matter in equilibrium. The bridge between them is a subject called statistical mechanics, which is a general study of the behavior of large numbers of atoms, both in and out of equilibrium. Indeed, even though there is not a close analogy between the notions of equilibrium in economics and physics, there is clearly a need for a subject that might be called statistical economics. It would be based on a microscopic model of the basic agents and operations or processes that make up an economy and study the behavior of large numbers of them in interaction.

Me: ... assume the principle of indifference: given the macrostate information you know (NGDP, price level, MB, unemployment, etc), assume the system could be in any microstate consistent with that information with equal probability [2]. In Bayesian language, this is the simplest non-informative prior. This way lies statistical mechanics, thermodynamics and information theory.
[This is the same idea.]
Smolin: Furthermore, since equilibria are in general non-unique, there is no mechanism in the theory to explain why one rather than another of these equilibria could be chosen by the market mechanism. All we know about the market mechanism in the theory is that it looks for Pareto efficient states, but if there are many the market mechanism cannot choose among them.

Me: ... imagine a world where fuel was slightly more expensive and cars were slightly less expensive. Depending on the relative price this could still clear the market with the same amount of money being spent in aggregate on cars and fuel. ... However! If there is less fuel and more cars, then there might be fewer ways to associate gallons of fuel with cars (lower entropy since the fuel is fungible) and you could select the actual equilibrium based on maximum entropy production.
[Maximum entropy selects which Arrow-Debreu equilibrium obtains; this quote represents a possible solution to the problem in the quote from Smolin.]
Smolin: Time must be incorporated in a way that recognizes the irreversibility of most actions taken, as well as the asymmetry of the present, past and future.

Me: Basically, because the market is made of people, we can violate the second law of thermodynamics (ΔS > 0) by coordinating ourselves in a way that atoms or particles can't. There is no second law of econo-dynamics because of human behavior -- which is unfortunate because otherwise (if human nature didn't matter) ΔS > 0 would imply ΔNGDP > 0 -- the economy would always grow (absent real shocks like natural disasters or resources running out).
[The second law holds most of the time as there is usually economic growth; a recession represents moving backwards. My arrow of time in economics is essentially the thermodynamic arrow of time, with some exceptions during recessions. Entropy producing processes are irreversible processes. This is another case where what I am saying is a solution to a problem stated by Smolin.]
Smolin: Markets with large numbers of agents have very large approximate symmetries, expressing the fact that there are many individuals with similar educations, interests or aspirations and many firms competing to offer similar products and services. In the steady states reached by real economies these symmetries are usually broken. This leads to multiple equilibria or steady states. The representation theory of the broken symmetries is then relevant to the distribution of equilibria.

Me: If your macro system appears to be described by n << N degrees of freedom, then it seems highly likely that among the total number of microstates, large subsets of the microstates are going to be described by a given macro state -- i.e. the equilibrium (the microstate satisfying macro constraints) is not going to be unique. For example, in an ideal gas, you can reverse the direction of the particle velocities and obtain another equilibrium (actually, all spatial, rotational and time-reversal symmetries lead you to other equilibria).
[There is a slight difference here in that Smolin is being much more accurate from the physics perspective. Equilibria related to each other by a symmetry transformation are not actually distinct (e.g. reversing the particle velocities) in physics. The sense here is that you could exchange iPads for Nexus tablets in theory, but the actual dominance of iPads breaks the symmetry leading to two equilibria: one where the Nexus dominates and one where the iPad dominates.]

[1] Smolin writes: ... we want to know how far from ideal a real economy may be, and still count as evidence for the theory. For example, let's take the prediction that all markets clear in equilibrium. There are clearly lots of markets in the real world that do not perfectly clear. In the information transfer model, we have a theory that tells us the 'neoclassical' view holds if the system is in information equilibrium (we don't have episodes of non-ideal information transfer). Essentially, if the information transfer model holds assuming ideal information transfer, we have evidence for the ideal neoclassical theory -- except for the emergent aspects of macro like liquidity traps and nominal rigidity.

Monday, January 26, 2015

How do you measure the price level?

econ: PCE or CPI? Core or headline?
info: [Sigh] ...
econ: What?
info: They're barely different from each other!
econ: Isn't there a joke about a physicist and a spherical chicken?
That's from me a few days ago. I've been on an anti-utility kick lately, and in the process I've been looking at a bunch of stuff on preferences, total orderings and ... differential geometry. Smolin's paper mentioned Malaney and Weinstein's gauge theory approach to price indices. I plan on doing a bit more thinking about what Smolin says -- he thinks a non-equilibrium statistical mechanics approach to economics may be useful (like the one I am advocating on this blog), saying: Nonetheless, once one gets past this confusion [about thermodynamic equilibrium], there is still a cogent claim that non-equilibrium statistical mechanics may be a basis for a model of an economy. But for now, I'm going to concentrate on the gauge theory approach to the so-called index number problem.

I seriously dislike the name, because it makes me think of some kind of deep problem in number theory or the Atiyah-Singer index theorem. Oh, well. I don't have much of a choice there. Let's start with Wikipedia:
The "index number problem" refers to the difficulty of constructing a valid index when both price and quantity change over time. For instance, in the construction of price indices for inflation, the nature of goods in the economy changes over time as well as their prices.
Wikipedia continues: "There is no theoretically ideal solution to this problem."

In fact there may be, based on Malaney and Weinstein's differential geometry approach. In her thesis, she motivates an 'economic derivative' that is a covariant derivative. You may remember that this gauge theory approach was part of Chris House's rant about physicists in economics. I opined on this blog, and asked the question that should be asked of any mathematical approach to a problem: what is it good for?

Well, Malaney's approach gives you a way to solve the index number problem ... assuming you buy into the other assumptions. The idea is that a salary $S(t)$ that is always proportional to the 'correct' price index $P(t)$ (with constant of proportionality $\alpha$) should have constant purchasing power, which allows the construction of a covariant derivative

\frac{D}{Dt} = \frac{d}{dt} - \frac{d}{dt} \log P 

So that

\frac{D}{Dt}S = \frac{D}{Dt} \alpha P = 0
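Expanding the covariant derivative confirms this; with $S = \alpha P$,

\frac{D}{Dt} \alpha P = \alpha \frac{dP}{dt} - \alpha P \left( \frac{d}{dt} \log P \right) = \alpha \frac{dP}{dt} - \alpha \frac{dP}{dt} = 0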

This choice of what you mean by 'constant' (i.e. derivative equal to zero) solves the index number problem, because it essentially makes all of the price indices equivalent to a so-called Divisia index. Don't let the Divisia terminology worry you. Essentially, various chained indices or indices with changing baskets of goods are approximations to a Divisia price index.

The problem economic theorists would have with this is that a Divisia price index is path dependent, with the path dependence brought about by changing preferences. Basically, what makes the Divisia price index problematic is the issue of mapping it to utility in the sense that John Kay discusses here.

Would you rather be the richest person in history or live in a time of antibiotics? Divisia answers with the former if you equate the 'inflation adjusted' value of goods with utility as defined in ethical philosophy. The path dependence is obvious here -- because Rothschild would likely value antibiotics today. Malaney argues that this path dependence may be a desirable property in her thesis, but overall, the question of how money equates to utility will probably always be a philosophical one.

As an aside, Noah Smith talks about unstable preferences here. However, the mapping of money and price indices to information lacks these philosophical issues. I've talked about this before -- inflation is better understood as an information theoretical construct rather than a utilitarian philosophical one.

The new bit here is that I want to show that the Divisia price index is the proper measure in the information transfer model, just like in the differential geometry approach of Malaney and Weinstein. The underlying assumption is different, though. In our case, a price index isn't measuring 'constancy' (as in Malaney's thesis), but rather 'meaningful aggregation'.

The question we ask is whether there exists a consistent price level $P$ that allows us to treat the economy as aggregate supply $Q$ and aggregate demand $N$ when $P$ is made up of individual prices $p_{i}$ (and likewise $q_{i}$ and $n_{i}$).

If we start with (taking $\kappa = 1$ WLOG)

N = Q P

Then taking the logarithmic time derivative (this gives a percentage rate when multiplied by 100), we get

\frac{d}{dt} \log QP = \frac{d}{dt} \log Q + \frac{d}{dt} \log P

= \frac{1}{Q}\frac{dQ}{dt} + \frac{1}{P}\frac{dP}{dt} \;\;\;\text{ (1)}

However, if we take the individual markets (I'll use vector notation to simplify the equation a bit)

q \cdot p \equiv \sum_{i} q_{i} p_{i}

\frac{d}{dt} \log q \cdot p = \frac{1}{q \cdot p} \frac{d}{dt} q \cdot p

\frac{d}{dt} \log q \cdot p = \frac{1}{q \cdot p}  \left( \frac{dq}{dt} \cdot p + q \cdot \frac{dp}{dt}\right)

\frac{d}{dt} \log q \cdot p = \frac{1}{q \cdot p} \sum_{i}  \left( \frac{dq_{i}}{dt} p_{i} + q_{i} \frac{dp_{i}}{dt} \right) \;\;\;\text{ (2)}

The Divisia price index $P$ is defined by separately equating the two terms of (1) and (2) and solving the differential equations:

\frac{1}{Q}\frac{dQ}{dt} = \frac{1}{q \cdot p} \sum_{i}  \frac{dq_{i}}{dt} p_{i}

\frac{1}{P}\frac{dP}{dt} = \frac{1}{q \cdot p} \sum_{i}  q_{i} \frac{dp_{i}}{dt}
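A numerical sketch of this construction (the price and quantity paths are random toy data; the index is built by discretely integrating the $dP/P$ equation just given):

```python
import numpy as np

# Toy data: price and quantity paths for three goods over 50 periods
rng = np.random.default_rng(0)
T, n = 50, 3
p = np.cumprod(1 + 0.01 + 0.02 * rng.standard_normal((T, n)), axis=0)
q = np.cumprod(1 + 0.01 * rng.standard_normal((T, n)), axis=0)

# Divisia price index: integrate dP/P = (q . dp) / (q . p) step by step
P = np.ones(T)
for t in range(1, T):
    dlogP = q[t - 1] @ (p[t] - p[t - 1]) / (q[t - 1] @ p[t - 1])
    P[t] = P[t - 1] * (1 + dlogP)
```

The resulting `P` is the chained (path-dependent) index: each step weights the individual price changes by current expenditure shares.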

In the information transfer model, that means $P$ allows you to convert between informationally equivalent baskets of goods. That is to say that a basket from 2014 that includes an iPad and one from 1980 that does not can potentially be equivalent in information.

However, an intervening redefinition of money or a monetary regime change will break the direct equivalence between $P$ for different times. That is to say Rothschild's and Kay's utility cannot be compared because they lived in different monetary regimes (a gold standard and fiat currency regime), even if both used 'pounds sterling'.

Interestingly, non-ideal information transfer where $N \geq Q P$ does not affect this derivation -- we are equating the RHS of equations when we equate (1) and (2); both are on the same side of that inequality.

Anyway, more on the Smolin article to come.


I kind of glossed over the role of the changing information transfer (IT) index (κ) in the more accurate model of the price level in terms of the money supply. In this post, we used an aggregate supply (Q) rather than the money supply (M), so I don't think there should be a time changing IT index in that case. In a sense, PQ = N represents both the expenditure method and income method of calculating NGDP which should be equal (and should have κ = 1).

Saturday, January 24, 2015

Keynesian economics in three graphs

In its most general form, Keynesian economics is the idea that government spending can buck the trend of other forms of spending without causing changes in other forms of spending (like crowding out). Essentially, it can be a counter-cyclical force for growth.

This comes to the forefront in the information transfer model. If we represent the economy as made up of many random markets (as I discussed here and originally modeled here), we have a picture like this:

Each square represents an "economic growth state" of a particular firm or market. The distribution is centered around the average growth rate of the macroeconomy. In this picture, the government (G) represents a large one of these squares (about 10-30% of the economy, depending on the country), and most of the time, its growth rate is about the average:

As a single entity (in the simplest model) it isn't subject to the entropic forces maintaining the distribution and so can cause the average growth rate to increase by moving right towards increased growth by political fiat:

The crowding out argument is that the other boxes will move left toward lower growth. However, that represents a coordination problem. The remaining smaller boxes can't organize themselves to over-represent the lower growth rates -- that would require the thermodynamic equivalent of a spontaneous decrease in entropy. Now the move could trigger a panic, which can produce a spontaneous decrease in entropy, but that seems like a strange model (it is similar to the idea of Ricardian equivalence ... but in this model, it is a psychologically irrational force, not a result of rational expectations of future tax increases).

Also, the idea of the Keynesian multiplier is that not only does G go up, the new distribution moves to restore a maximum entropy (equilibrium) state. However, moving G does move the average even if the other markets don't follow it.
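A minimal numerical sketch of that arithmetic (the 20% government share and the growth rates are assumptions for illustration):

```python
import numpy as np

# Toy economy: one government 'box' at 20% of the economy, the rest
# split among many small markets with growth states near 2%
rng = np.random.default_rng(1)
n_small = 80
w = np.concatenate([[0.20], np.full(n_small, 0.80 / n_small)])
g = rng.normal(0.02, 0.01, size=n_small + 1)

before = w @ g          # average growth rate of the macroeconomy
g_after = g.copy()
g_after[0] += 0.05      # move G right by political fiat (+5 points)
after = w @ g_after

# The average rises by (G's weight) x (G's move) even if no other
# box moves -- no response from the small boxes is required
```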

Here's a really great analogy: heating leftovers in the microwave. When your food comes out of the refrigerator, it's about 55 °F and the molecules have an average kinetic energy of order kT, where T is the temperature (in Kelvin), but are distributed according to a Maxwell-Boltzmann distribution (the analog of the distributions above). When you microwave food, you excite rotational modes of water molecules -- effectively picking them out of the distribution for additional energy (like raising G). The other molecules eventually come into thermal equilibrium at a new average kinetic energy of order kT', where T' > T.

Ricardian equivalence would be the rather silly argument that the other molecules decide to get colder on seeing the water molecules heat up (out of spite, I guess).

Scott Sumner: Data mangler.

Scott Sumner has apparently chosen to die on the hill of budget sequestration. He claims that the lack of a recession is vindication of market monetarism. His calculation is flawed, as I've mentioned before. And his new defense, same as the old defense, is this:
Economists generally agree that when you have a shock that occurs at the beginning of a calendar year, you should look at growth over the course of the year (say Q4 to Q4), not the average GDP in 2013 compared with the average GDP in 2012.

So Sumner wants to get into it, huh? Well, anyone who looks at data generally agrees that choosing only two points (regardless of whether they are Q4-to-Q4 or Q1-to-Q1) to measure your effect vastly increases the influence of statistical fluctuations in the data, especially when looking at growth rates. Additionally, Q4 to Q4 measures from Q4 of 2012 to Q4 of 2013, including a full quarter of data from before the shock supposedly hit (at the beginning of CY 2013). But vague assertions that economists generally agree are apparently good enough to outweigh a real econometric argument (the same one made by Mike Konczal, by the way).

Now maybe expectations of the sequester allow you to include GDP data from October of 2012 in your 2013 growth rate -- but if you recall, the political process had gone up to these fiscal cliffs before and averted them at the last minute. Only 50% of the sequester could have been already priced in (assuming it had a 50-50 chance of happening; your percentages will vary).
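The two-point problem can be illustrated with a quick simulation (toy numbers: a 0.5%-per-quarter trend with iid quarterly noise):

```python
import numpy as np

# Toy GDP: 0.5% per-quarter trend plus iid noise, eight quarters
# (2012Q1 .. 2013Q4), repeated over many trials
rng = np.random.default_rng(42)
trials = 10_000
trend = 100 * 1.005 ** np.arange(8)
gdp = trend + 0.5 * rng.standard_normal((trials, 8))

# Two-point (Q4-to-Q4) growth vs annual-average over annual-average
q4q4 = gdp[:, 7] / gdp[:, 3] - 1
avg = gdp[:, 4:].mean(axis=1) / gdp[:, :4].mean(axis=1) - 1

# Across trials, the two-point measure is substantially noisier
noisier = q4q4.std() > avg.std()
```

In this toy setup the two-point growth estimate has roughly twice the standard deviation of the average-over-average estimate, purely because it throws away six of the eight data points.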

As an aside, if people generally believed in the Keynesian model, wouldn't expectations of a fiscal cliff hurt regardless of whether the "concrete steppes" of actual cuts happened? And the more credible the fiscal authority was in making changes, the smaller the actual changes have to be, right? The market monetarist model depends not only on the model being right, but that the market thinks the model is right. If you rely on expectations without the need for concrete steps, then whatever model the market believes is the actual economic model.

But the biggest problem with Sumner's argument is that the biggest percentage cuts from the sequester came in FY 2014, not FY 2013 [1]:

Now FY 2014 runs from CY 2013 Q4 through CY 2014 Q3, which means that the big negative growth shock we saw in Q1 of CY 2014 could be attributed to the sequester.

The additional kicker is that the cuts were only specified at a top line level. That means that affected government agencies had to figure out how to allocate the cuts (and get approval from their leadership) -- the result was that many of the cuts did not go into effect until the end of 2013 (Q3). Cuts also couldn't be made to contracts for work already delivered, so by the time the agencies figured out how to allocate the cuts, pretty much all of it had to be in the future relative to when the plan was agreed on. That means we'd be looking for a shock somewhere inside the yellow box in this diagram:

A big shock somewhere inside the yellow box. Hmm.

The red line is the smoothed version of the "first order Keynesian model". The original model discussion is here. Sumner's faulty calculation is shown as the orange line. Like I said before, it goes from a negative fluctuation to a positive fluctuation making it maximally misleading.

What is interesting is that this calculation reinforces the information transfer model in several different ways:

  • The shock occurs not when the cuts were announced, but around the time they actually happened ("concrete steppes"). This invalidates an expectations view of NGDP. Under that model, the cuts should have had impact as soon as they were expected.
  • The shock is way too large to be just due to the concrete steps. It probably represents a panic among government contractors and the associated non-ideal information transfer.
  • You could never see this shock without a model that includes the capability to decompose monetary and fiscal shocks. It's hard to see in the data (it's still there) without a model for the path of inflation and NGDP in the absence of shocks.

As a final note, I'm not sure why Sumner is focusing on this particular episode other than ego (Krugman said it was going to be a test of market monetarism). This one data point is among hundreds for different countries. Sumner does approvingly cite a purported "takedown" of Krugman's austerity graph, but that "takedown" assumes the market monetarist model in order to throw out data (basically, all the liquidity trap countries) that make up the bulk of the correlation. Even in the information transfer model, if I throw out most of the liquidity trap countries the correlation between expansionary fiscal policy and increased growth disappears. That's true of Paul Krugman's model as well! If a country's not in a liquidity trap, monetary policy will offset fiscal changes.

It's a pattern of selecting the data to get the result you want. Now that Sumner is on his way to Mercatus, I can only assume it will get worse. We'll start seeing things like this (from my old blog). At least he didn't join Freakonomics.


[1] FY is US Government Fiscal Year, where FY 2014 runs from CY 2013 Q4 (October 2013) through CY 2014 Q3 (September 2014), and CY is calendar year (January to December, Q1 to Q4). I have to deal with this all the time at work because my company uses calendar years, but most of my work is with government contracts that go by fiscal years.

Friday, January 23, 2015

I'm not sure economists understand supply and demand

I had put off watching the new Economics 101 videos from "Marginal Revolution University" until I had a sufficient contiguous chunk of time available. My reaction to the first video is pretty much summed up by this blog post titled Why people hate economics, in one lesson. And I have some philosophical objections to starting with the axiom that incentives matter -- at least as a useful methodology for producing tractable models.

The first few videos after the introduction look at supply and demand curves and then at equilibrium. Now I am completely behind the idea of supply and demand curves and the equilibrium price; I actually derived them from information theory here. But the logic behind the supply and demand curves MR University shows -- to be honest, it's not really any different from how they were taught in my high school economics class, or how they are explained on Wikipedia -- is way too dependent on the specific relationships of a particular good (they use oil). More oil is produced at a higher price because it becomes profitable for companies to look for and extract harder-to-reach oil. But this isn't true for software (additional units have almost zero marginal cost, so it gets cheaper as upfront development costs are spread over more units). And the fact that the demand curve slopes down seems, for some goods, to be more a result of inequality (more people would buy iPads if they could afford them) than of marginal utility. And in real life, people get upset at Uber or 'price gougers' who raise their prices when supply is scarce, which makes the mechanism seem brutal.

Yes, yes: barriers to entry, patents, and upfront capital costs. However, my point isn't that the supply and demand curves are wrong because specific examples fail or seem heartless. As I said in the previous paragraph, I believe in supply and demand. My issue is that the way MR University teaches it assumes the marginal utility interpretation of supply and demand -- that "incentives matter", and specifically pecuniary incentives. More oil is produced because it is profitable. I buy more of something because it is cheaper.

This becomes more apparent in the equilibrium price video (linked above) where Alex Tabarrok tells us about an experiment by Vernon Smith and concludes that the supply and demand model works. However, the Smith experiment assumes the utility version of the supply and demand model! He hands out pieces of paper with different utilities (measured in money [1]) and different marginal unit costs of production, forcing the structure of the supply and demand curves given in the model. It is a bit like asserting that all two-player games result in ties, then designing tic-tac-toe and claiming it proves your assertion.

It is not clear that the result of the Smith experiment is anything other than an asynchronous poll [2] of the prices on the students' cards -- and that has nothing to do with supply and demand [4].

Now I have my own interpretation here (succinctly: supply and demand curves are models of the entropic forces that enforce information equilibrium), but you don't have to take my word for it. The information aggregation interpretation of markets [2] is a less restricted approach, though I have my disagreements with it. I'm sure there are other ways to teach this. You could treat the market as an easy-to-use algorithm for solving a linear programming problem, for example. Or just posit supply and demand curves as axioms in themselves -- why do they even need a specific interpretation? That freedom will help when you have different slopes for the supply and demand curves later in the class [3].


[1] In this setup, you assume the social welfare function of the market -- wealthy people are similar to utility monsters: it is better for you to give up your stuff for them to enjoy because they enjoy it much more than you do.

[2] Markets are really strange as polls or information aggregation devices. The aggregation view essentially says that if you are good at predicting things in one domain (say, business), you should be rewarded with money and hence a greater ability to try to predict things in unrelated domains (say, politics). If you make a lot of money in sporting goods, you have been granted the means to try your hand at farming or producing records. I don't really take being good at one thing as evidence that you are good at something else.

[3] In the information transfer model, you do have the freedom to make different sign choices: a negative change in one quantity, rather than a positive one, is what keeps the two quantities (supply and demand) in information equilibrium. You could look at this as measuring the price of land in terms of the quantity of land left. The ordinary supply and demand model doesn't really allow you the freedom to look at the problem in this alternate way.
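A rough sketch of the sign freedom, using the information equilibrium condition from elsewhere on this blog (my notation; only the sign flip is new here):

```latex
% Information equilibrium condition and its solution (k > 0):
p = \frac{dD}{dS} = k\,\frac{D}{S}
\quad\Longrightarrow\quad D \sim S^{k}

% Flipping the sign choice:
p = -\frac{dD}{dS} = k\,\frac{D}{S}
\quad\Longrightarrow\quad D \sim S^{-k}
```

With the flipped sign, $D$ rises as $S$ falls -- e.g., the price of land measured against the quantity of land remaining.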

[4] It nags at me that Tabarrok doesn't get this. It's not like he doesn't have a PhD in economics or anything. Am I missing something? Or is it just motivated reasoning (from me or Tabarrok)? Tabarrok makes a point of saying that Smith thought the experiment wouldn't work. I have no idea why Smith would think that, and that's not just the curse of knowledge talking. Even if the prices announced in the auction were random, the result would be the same equilibrium price. The only two kinds of equilibria I can think of are the ones where, on average, the equilibrium price is given by the cards, and the ones where the market doesn't clear (not all of the cards are traded), which should generally have an observed price below the equilibrium price (according to the information transfer model).
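The claim that random announced prices land at the same equilibrium can be checked with a toy simulation (my construction, not Smith's actual protocol): hand out induced values and costs on "cards", announce random prices, and execute a trade whenever a price sits between some remaining seller's cost and some remaining buyer's value. The accepted prices average out near the crossing point of the induced curves.

```python
import numpy as np

rng = np.random.default_rng(42)

# Induced values and costs on the "cards": buyers' willingness to
# pay and sellers' costs, both uniform on [0, 1], so the induced
# supply and demand curves cross at a price of about 0.5.
buyers = list(rng.uniform(0, 1, 100))
sellers = list(rng.uniform(0, 1, 100))

accepted = []
for price in rng.uniform(0, 1, 10000):  # random announced prices
    # A trade happens if some remaining buyer values the good at
    # >= price and some remaining seller can produce it at <= price.
    b = next((v for v in buyers if v >= price), None)
    s = next((c for c in sellers if c <= price), None)
    if b is not None and s is not None:
        buyers.remove(b)
        sellers.remove(s)
        accepted.append(price)

print(np.mean(accepted))  # clusters near the 0.5 crossing point
```

No agent in this sketch optimizes anything -- prices are pure noise -- yet the average accepted price sits near the "equilibrium" set by the distribution of cards, which is the sense in which the experiment reads more like a poll of the cards than a test of marginal reasoning.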