... this guy.
From the WSJ (via Tim Duy, via Mark Thoma) we get a nice compilation of central bank forecasts from 2011 to 2015 along with actual inflation. Although the information equilibrium model didn't exist in 2011, I thought I'd run the projections using the data that would have been available at the time (the series does include later revisions, but these shouldn't perturb the results much since they're relatively stable) and overlay the results on the nice WSJ graph:
For reference, the various forecasts are here.
Go Jason!! BTW did you see Noah's recent article about barriers to entry in academia? Very interesting phenomenon, I think. 100 years ago PhDs were so rare that they could write their own ticket almost anywhere they went. Now they are so common that academics erect artificial signaling barriers to exclude potential competitors.
The more I think about it, the more I think it's not so much barriers to entry as it is developing a common operating picture among practitioners.
I think that's why, e.g., I usually understand what Noah Smith writes right away, whereas David Glasner takes me a couple of reads. Noah was an undergrad physics major, so we share a bit of a common operating picture, and I get what he's saying faster.
Practitioners in a field need to get a lot more information across quickly than amateurs and can do so with jargon, salient literature references and a library of common phrases or equations.
Even the way mathematicians, physicists, economists and engineers write equations is different. Example:
exp(j k x) vs exp(i k x)
Electrical engineers use "j" instead of "i" for the imaginary unit.
Economists use π for inflation (log derivative of price level).
All that "busy work" is drilled into the heads of students so that you don't have to question every symbol that gets written down.
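The notational split even shows up in programming languages. Python, for instance, adopts the electrical engineers' convention and writes complex literals with "j" (this snippet is just an illustration of the convention, not from the post):

```python
import cmath

# Python writes the imaginary unit as "j", the EE convention:
# a plane wave exp(i k x) becomes cmath.exp(1j * k * x).
k, x = 2.0, 0.5
wave = cmath.exp(1j * k * x)

# Same quantity via Euler's formula: cos(kx) + i*sin(kx)
euler = complex(cmath.cos(k * x), cmath.sin(k * x))

assert abs(wave - euler) < 1e-12  # identical up to rounding
```

A physicist reading `1j` for the first time has to stop and ask what it means; an engineer doesn't.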
For example, if I were to write:
L = ϕ∂²ϕ
Every physicist out there would know I am probably talking about a massless scalar field, bringing up every association from the Higgs boson to renormalization.
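To connect that shorthand with the textbook form: up to a sign convention and an integration by parts (a sketch, dropping the boundary term), it matches the standard massless scalar field Lagrangian:

```latex
\mathcal{L} \;=\; \tfrac{1}{2}\,\partial_\mu \phi\, \partial^\mu \phi
\;\sim\; -\tfrac{1}{2}\,\phi\, \partial^2 \phi
```

A physicist fills in the factor of one-half and the sign automatically; that's exactly the shared "common operating picture" at work.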
Jason, I can't see the truth curve for Japan very well... I assume it's the light gray curve?... It appears to be climbing where I lose sight of it.
It's at the origin of the purple forecasts. It falls back down to zero after that rise -- check the original picture at the links.
Nice. If you use the ALFRED data (https://alfred.stlouisfed.org/) you can use the appropriate real-time values to check for revision sensitivity (at least with US data). I'd be curious to see a series of pseudo-real-time forecasts for the US. Do you think revision magnitude contains independent information (expectation errors), or do you think fitting to the "best current" data on its own captures everything salient?
Thanks for the link; I will try to re-compute these forecasts with the archive data for the US.
My opinion is that there is way too much measurement noise in the data to get an additional signal out ... but I could be wrong.
I believe revisions are due to more data arriving over time, which means there's a strong possibility that uncontrollable factors like the vagaries of work schedules and computer systems determine their timing.
Agree about the signal-to-noise ratio. While revisions occur as better data arrive over time, I think there could be some small residual information in them, which would work like this: the first release is based on partial data; missing data that arrives with a lag is initially estimated; and that estimate is somewhat biased by the forecaster's priors. So to the extent that the distribution of revisions is biased away from zero or skewed, it could help calibrate the uncertainty on the first release in a real-time forecasting application. Probably too small an effect to be worth incorporating given the noise, though.
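That calibration idea can be sketched in a few lines. This is a hypothetical illustration with simulated data, not an analysis of actual vintages (real first releases and revised values would come from ALFRED); it just shows how a biased revision distribution could adjust a new first release:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated stand-ins for real vintage data: "latest" plays the role of
# the best current estimates, and first releases undershoot them by a
# small systematic bias (0.1) plus noise -- both values are assumptions.
latest = rng.normal(2.0, 0.5, size=200)
first_release = latest - (0.1 + rng.normal(0.0, 0.2, size=200))

revisions = latest - first_release

# If the revision distribution is biased away from zero, its mean and
# spread can be folded into the uncertainty of a fresh first release.
bias = revisions.mean()            # systematic part of the revision
spread = revisions.std(ddof=1)     # noise part of the revision

adjusted = first_release[-1] + bias          # bias-corrected reading
band = (adjusted - spread, adjusted + spread)  # rough one-sigma band
```

With only a small bias relative to the measurement noise, the correction is tiny compared to the band width, which is the "probably too small an effect" point above.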