Wednesday, December 9, 2015


Nick Rowe has an interesting incremental argument asking: when does the cash disappear? It's kind of another Sorites paradox, except it asks which properties you must take away for something to stop being "cash". In one sense, I agree: money seems to be better defined by symmetry properties, so whatever it is, it doesn't stop being money unless you take away those symmetry properties ... which means you've turned the economy into a non-ideal information processor and lost a lot of output.

But that's the key -- Nick Rowe's properties are mostly extraneous properties of money, not extraneous properties of "cash".

In the information transfer model, the key property that distinguishes what we call "cash" (bits of paper with e.g. Benjamin Franklin on them in the US) from other types of "money" like M1 or MZM is that "cash" (aka M0) is what anchors the core inflation rate as well as the NGDP path. That is to say the inflation rate (the change in the price level P) is set in the market:

P : NGDP ⇄ M0

M1 and M2 don't set the inflation rate (there are large fluctuations), but M1 and M2 are money in the sense that they can be used for transactions.

What is it about M0 that makes it have this property?

I don't know for certain. I think it may be the fact that central banks can't get hold of your M0 to pay (positive or negative) interest on it. The components of M1 and M2? Those they can reach -- yes, definitely. I have some M1 in my bank right now earning interest.

So when Nick says:
2. Muggers become a problem, so people put their currency in a box at the central bank with their name on it. ...

3. The bank notices it is now administratively easier to pay (positive or negative) interest on currency than it was when people kept their currency in their pockets. So the bank now has a second monetary policy instrument, in addition to Open Market Operations.
I think these steps are a bigger deal than Nick makes them out to be. The information entropy in the distribution of currency would suddenly vanish, since the state of every dollar would now be known (so that interest could be paid). Basically, I have just acquired log2 C(n, k) ~ n H(k/n) ~ 4 billion bits of information [originally misstated as 1 trillion; see the correction in the comments], with k = 300 million people in the country and n = 1.3 trillion dollars. That may not sound like a lot in terms of electronic storage, but the entire annual economy (n = 18 trillion dollars) represents only about 5 billion bits [originally: 2 trillion]. (Assuming indistinguishability.)
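As a rough check on those figures, here is a short sketch (mine, not from the post) of the binary-entropy approximation log2 C(n, k) ≈ n H(k/n), using the same round population and currency numbers as above:

```python
from math import log2

def log2_binom_entropy(n: float, k: float) -> float:
    """Approximate log2 C(n, k) by n * H(k/n), where H is the
    binary entropy function -- valid for n, k >> 1."""
    p = k / n
    return n * (-p * log2(p) - (1 - p) * log2(1 - p))

people = 3e8        # k: roughly the US population
currency = 1.3e12   # n: dollars of currency (M0)
ngdp = 18e12        # n: dollars of NGDP

print(log2_binom_entropy(currency, people))  # ~ 4 billion bits
print(log2_binom_entropy(ngdp, people))      # ~ 5 billion bits
```

This is a back-of-the-envelope number; the exact log-gamma computation discussed later in the comments agrees to several digits.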

In a thermodynamic system, knowing the state of every atom in an ideal gas would allow you to extract useful work out of the system. However, knowing each bit of information about the state of every atom costs kT log 2 in energy, so this is an impossible task.

There is no energy restriction in the economic case, but this still represents a major change in the properties of the macroeconomy.

Basically, labeling those boxes at the central bank is what "cash" does.


  1. Check out Nick Edmonds' comment and Rowe's response. Edmonds echoes your idea to some extent.

  2. What's H?

    Can you expand a bit on "Assuming indistinguishability?" Do you mean indistinguishability between one cash dollar and another?

    "...this still represents a major change in the properties of the macroeconomy."

    What do you think would happen?

    1. H(p) = binary entropy function for probability p
      C(n, k) = binomial coefficient "n choose k"

      Indistinguishability means you can use C(n, k) instead of P(n, k) = n!/(n-k)! ... one dollar of NGDP is no different from another, nor is one dollar of M0 different from another.

      "What do you think would happen?"

      No idea! The electronic units might take over as the new "currency" (which ones? good question!), or inflation might go away completely. Monetary regime change? Who knows.

    2. So you're saying that in going from Nick's step 2 to Nick's step 3 we have a monetary regime change, and "who knows?" what happens at that point?

      Also, you're saying log2 C(18e12, 3e8) ~ 2e12

      I'm familiar with C(n, k) and P(n, k), but is C(n, k) supposed to be a count of all the ways n indistinguishable currency dollars (or dollars of NGDP) can be allocated to k unique people? Shoot I feel dumb, but I'm not quite seeing it.

      Previously we counted 42 ways to partition 10 (indistinguishable?) sheep, with between 1 and 10 partitions. And used that to say we'd need 5.4 bits (5.4 = log2(42)).

      Now why aren't we counting ways to partition 1.3e12 dollars (of currency), with up to 300e6 partitions?

      Returning briefly to the sheep, you wrote:

      "What has happened is that the extra information in the probability distribution of 15 sheep (or 90 sheep) relative to 10 sheep means there is more information in determining the allocation of a single sheep -- more information = higher price.

      Aside: there are 176 different allocations of 15 sheep versus 42 allocations of 10 sheep. Assuming a uniform distribution over the allocations, we need 7.5 bits to specify an allocation of 15 sheep versus 5.4 bits for 10 sheep."

      I went through the exercise of counting up those allocations for up to 10 sheep.

      1 partition: 10 sheep. 1 way.

      2 partitions: 9+1, 8+2, 7+3, 6+4, 5+5 (we don't count 4+6, 3+7, etc, since they aren't unique I guess). 5 ways then, with 2 partitions.


      Total ways, for everything from 1 to 10 partitions: 42 ways.

      So the idea was we were partitioning (dividing?) indistinguishable sheep to what? People (sheep buyers)? Dividing the sheep between up to 10 people? Indistinguishable or unique people? Seems they are indistinguishable if we only count one of these two ways to partition (for example): 7+3 and 3+7. We toss out 3+7 because 7+3 covered it. 7+3 means person 1 gets 7 sheep and person 2 gets 3, and everyone else gets 0. But 3+7 means person 1 gets 3 sheep and person 2 gets 7 and everyone else 0.

      Sorry! Ultimately I'm trying to understand why we used the log2 of the "integer ways to partition" function with the sheep to count bits of information, but now we're using the log2 of the C(n, k) function. I just realized that I never fully understood the sheep example and what we were counting (even though I know how to count it).

      Also, I don't see how the bits per sheep goes up as we increase the number of sheep.

      5.4 bits / 10 sheep = 0.54 bits per sheep

      7.5 bits / 15 sheep = 0.50 bits per sheep

      Seems like the bits per sheep went down.

      I shouldn't be doing this this time of night. )c:
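For what it's worth, the sheep arithmetic in this exchange checks out; a short script (a sketch using a standard recursive partition counter, not anything from the post) reproduces both totals and both bit counts:

```python
from math import log2
from functools import lru_cache

@lru_cache(maxsize=None)
def partitions(n: int, k: int) -> int:
    """Number of integer partitions of n into parts of size at most k
    (by conjugation, the same as partitions into at most k parts)."""
    if n == 0:
        return 1
    if n < 0 or k == 0:
        return 0
    return partitions(n - k, k) + partitions(n, k - 1)

print(partitions(10, 10), log2(partitions(10, 10)))  # 42, ~5.4 bits
print(partitions(15, 15), log2(partitions(15, 15)))  # 176, ~7.5 bits

# The per-sheep ratio does go down (5.4/10 = 0.54 vs 7.5/15 = 0.50),
# even though the total bits per allocation goes up.
```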

    3. Simple example: the number of ways to distribute four indistinguishable dollars to 3 unique people:

      4+0+0 (3 orderings), 3+1+0 (6), 2+2+0 (3), 2+1+1 (3)

      15 ways. log2(15) = 3.9 bits of information storage required to record which scenario exists.

      That's a lot different from C(4,3) = 4 (two bits for that). It's also different from the "integer partition function" of 4:

      4, 3+1, 2+2, 2+1+1, 1+1+1+1

      So there are five integer partitions. 2.3 bits.
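The three counts in this comment can be reproduced in a few lines (a sketch; stars and bars gives the 15, and collapsing the ordering gives the partition count):

```python
from itertools import product
from math import comb, log2

# Weak compositions: 4 indistinguishable dollars among 3 *unique* people.
ways = [c for c in product(range(5), repeat=3) if sum(c) == 4]
print(len(ways), comb(4 + 3 - 1, 3 - 1))  # 15 15  (stars and bars: C(6, 2))
print(round(log2(len(ways)), 1))          # 3.9 bits

# If the people are indistinguishable too, order stops mattering and the
# 15 ways collapse to partitions of 4 into at most 3 parts.
collapsed = {tuple(sorted(c, reverse=True)) for c in ways}
print(len(collapsed))                     # 4: (4,0,0) (3,1,0) (2,2,0) (2,1,1)
```

The fifth integer partition of 4, namely 1+1+1+1, is missing from the collapsed set because it needs a fourth person.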

    4. There are a couple of things here ...

      First, integer partitions of 4 can be distributed among up to 4 people instead of 3 (the number of terms in a partition isn't limited): 1+1+1+1 needs four people.

      So the integer partitions of 4 limited to 3 people would give you:

      4, 3+1, 2+2, 2+1+1

      And in the combinations, since I assumed indistinguishable people and money, you have C(4, 3) = 4.

      Another thing to note is that the integer partitions by themselves do not determine the probability distribution over the different partitions (the binomial count assumes uniform). 3+1 could be the most likely, or 1+1+1+1 could be, depending on the model.

    5. Also, I made a math error: instead of 1 trillion and 2 trillion bits it should be 4 billion and 5 billion bits. Accidentally wrote trillion as 10^9 instead of 10^12.

    6. So ...

      log2 C(1.3e12, 3e8) ~ 4e9
      log2 C(18e12, 3e8) ~ 5e9

    7. Thanks Jason. Good point about limiting the partitions in my simple example to just three people. Is it just a coincidence then that the integer partition function of 4, limited to a maximum of 3 parts (one for each person), equals C(4,3)?
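A quick check (a sketch, not from the thread) suggests it is a coincidence: partitions of n into at most 3 parts agree with C(n, 3) at n = 4 and then diverge immediately:

```python
from math import comb
from functools import lru_cache

@lru_cache(maxsize=None)
def p_at_most(n: int, k: int) -> int:
    """Partitions of n into at most k parts (counted via the conjugate
    formulation: partitions into parts of size at most k)."""
    if n == 0:
        return 1
    if n < 0 or k == 0:
        return 0
    return p_at_most(n - k, k) + p_at_most(n, k - 1)

for n in range(4, 9):
    print(n, p_at_most(n, 3), comb(n, 3))
# n=4: 4 vs 4 -- equal
# n=5: 5 vs 10, n=6: 7 vs 20 -- they diverge right away
```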

    8. Great, thanks for the link. I'd never heard of "stars and bars."

      Also, you could get ~6e9 and ~7e9 bits respectively if you went with pennies instead of dollars.

    9. BTW, I used Matlab's gammaln. I couldn't find an online calculator (even the free version of WolframAlpha) that could handle those large arguments.
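A Python equivalent of the gammaln trick, for anyone wanting to reproduce the numbers without Matlab (a sketch of mine, not the commenter's code; math.lgamma handles these magnitudes fine):

```python
from math import lgamma, log

def log2_comb(n: float, k: float) -> float:
    """log2 C(n, k) via log-gamma:
    log C(n, k) = lgamma(n+1) - lgamma(k+1) - lgamma(n-k+1)."""
    return (lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)) / log(2)

print(log2_comb(1.3e12, 3e8))  # ~ 4e9 bits (currency)
print(log2_comb(18e12, 3e8))   # ~ 5e9 bits (NGDP)
```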

    10. "I assumed indistinguishable people"


    11. I used indistinguishable people for the same reason you can't find an online calculator. The brute force calculation would require gigabytes of memory just to store the numbers.

      log2 C(n, k) has a nice asymptotic expansion for n >> 1, rendering it tractable. The distinguishable-people case, log2 P(n, k), might have such an expansion too, but I am not immediately aware of one -- my experience is with physics, where e.g. photons are indistinguishable.

      In truth there should actually be a scale of some kind that goes in the calculation that would tell you whether to use pennies or dollars. As your unit goes to 0, the two numbers actually approach each other.

      However, it was a back of the envelope calculation and the important point is the two numbers are not that different from each other.

    12. That link I give above to the integer partition function also gives an asymptotic formula:
      P(n) = exp(pi*sqrt(2*n/3)) / (4*n*sqrt(3))

      so that

      log2 p(n) = (pi*sqrt(2*n/3) - ln(4*n*sqrt(3))) / ln(2)

      For n = 1.3e12 that gives 4.2e6 bits (42e6 with pennies); for n = 18e12, it's 15.7e6 (157e6 with pennies).

      I'm a little surprised by how small those are in comparison to log2(C)
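The arithmetic above packages into a one-liner (a sketch; the formula is the Hardy–Ramanujan asymptotic quoted in the comment):

```python
from math import pi, sqrt, log

def log2_p(n: float) -> float:
    """log2 of the integer partition function via the asymptotic
    p(n) ~ exp(pi*sqrt(2n/3)) / (4*n*sqrt(3))."""
    return (pi * sqrt(2 * n / 3) - log(4 * n * sqrt(3))) / log(2)

print(log2_p(1.3e12))  # ~ 4.2e6 bits (currency in dollars)
print(log2_p(1.3e14))  # ~ 42e6 with pennies
print(log2_p(18e12))   # ~ 15.7e6 (NGDP in dollars)
```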

    13. You'd have to limit it to partitions of size k or less ...

    14. "k or less"... true, but that should give even smaller numbers, right? We'd be taking away some ways to partition it.

    15. Sorry, that wasn't meant as an explanation of the size differential. For n and k >> 1

      log C(n,k) ~ n H(k/n)

      log p(n) ~ sqrt(n)

      Note P(n, k) in the comments above is permutations -- i.e. C where order matters. The p(n) directly above is integer partitions.


      log2 C(n, n/2) ~ n - (1/2) log2 n

      For C we're talking about the factorial function n!, which grows much faster.

      Table[Length[IntegerPartitions[ii]], {ii, 2, 10}]

      {2, 3, 5, 7, 11, 15, 22, 30, 42}

      Table[Binomial[ii, Round[ii/2]], {ii, 2, 10}]

      {2, 3, 6, 10, 20, 35, 70, 126, 252}

      With stars and bars:

      3 = 0 + 2 + 1 : ||**|*| counts for binomial, but not for partitions (the partition form would be |**|*|, missing the initial "|")
      3 = 1 + 1 + 1 : |*|*|*| counts for both (every partition is a binomial arrangement, but not every binomial arrangement is a partition)
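The growth-rate gap is easy to see numerically; here is a sketch (exact counts, not the asymptotics) comparing log2 of the central binomial coefficient against log2 of the partition function:

```python
from math import comb, log2
from functools import lru_cache

@lru_cache(maxsize=None)
def p(n: int, k: int) -> int:
    """Partitions of n into parts of size at most k; p(n, n) = p(n)."""
    if n == 0:
        return 1
    if n < 0 or k == 0:
        return 0
    return p(n - k, k) + p(n, k - 1)

for n in (20, 50, 100):
    print(n, round(log2(p(n, n)), 1), round(log2(comb(n, n // 2)), 1))

# log2 C(n, n/2) grows ~ n while log2 p(n) grows ~ sqrt(n):
# the gap widens quickly (at n = 100: ~27.5 bits vs ~96.3 bits).
```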

