Comments on Information Transfer Economics: "Ecological fallacy and emergent dynamics"

Jason Smith (2017-03-03 14:18):

That's a great question.
Let's just call D/dD = n_d and S/dS = n_s: the number of draws from the single-event distributions with information entropies H(d) and H(s). When you say H(d) = H(s), you are implicitly assuming the same number of draws on each side (n_d = n_s = 1), and in that special case you are right that it implies k = 1. In general, though, we allow not just different distributions but different numbers of draws from them. So the information content of some number of demand events, I(d) = n_d H(d), is set equal to the information content of some number of supply events, I(s) = n_s H(s).

Let's take demand to be made up of die rolls (6-sided) and supply to be made up of coin flips. For the information entropy to be the same, you'd need about 2.6 coin flips (each coin flip is one bit) for every die roll (about 2.6 bits per roll). So if n_d goes up by one, n_s needs to go up by about 2.6.

In this case we have k = log 6/log 2 ≈ 2.6. A "supply event" is 2.6 coin flips (2.6 draws from an alphabet of 2 supply symbols), while a "demand event" is one die roll (one draw from an alphabet of 6 demand symbols).

The thing is: if the information theory derivation isn't intuitive, there's always the path of simply generalizing Fisher's marginal utility equation (http://informationtransfereconomics.blogspot.com/2014/08/fishers-proto-information-transfer.html), where k is just the utility of an element of D (i.e. satisfying the need, purchasing the item) relative to the utility of an element of S (i.e. the seller's utility derived from selling the item).

— Jason Smith

SilasLock (2017-03-03 12:28):

Hey Jason, I have a question about the basics of information transfer economics.
Like, "the very foundations of this blog" basics.

In your post "Information theory 101", you say that the Shannon information entropy of a single supply event is equal to the Shannon information entropy of a single demand event. Much of your blog is focused on substituting different variables for demand and supply into this simple case and seeing how the results fit the data.

I'm a little confused about the jump you make from saying H(demand symbols) = H(supply symbols) to (D/dD)H(demand symbols) = (S/dS)H(supply symbols). If a single sample of the supply distribution carries the same (expected) information as a single sample of the demand distribution, then it makes sense that repeated sampling should create the relationship (D/dD)H(demand symbols) = (S/dS)H(supply symbols). But the original equation still holds. Can't we divide both sides of the second equation by the corresponding sides of the first, so that D/dD = S/dS, and wouldn't this imply that k is always equal to 1?

I think a lot of this confusion stems from a non-rigorous definition of a "supply event" or "demand event". The fact that supply and demand have units makes them strikingly different phenomena from discrete, dimensionless symbols. I'm sure I'm missing something obvious, but I feel like only one of the two equations, (D/dD)H(demand symbols) = (S/dS)H(supply symbols) or H(demand symbols) = H(supply symbols), can be legitimate.

Could you help me out? This has been bothering me for a while.

— SilasLock
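[Editor's note: the die-and-coin example in Jason's reply above can be sketched numerically. This is a minimal illustration, not code from the blog; the names n_d and n_s simply stand for the draw counts in the comment.]

```python
import math

def entropy_bits(p):
    """Shannon entropy, in bits, of a discrete distribution p."""
    return -sum(q * math.log2(q) for q in p if q > 0)

H_d = entropy_bits([1/6] * 6)  # one fair die roll: a "demand event"
H_s = entropy_bits([1/2] * 2)  # one fair coin flip: one "supply symbol", 1 bit

k = H_d / H_s  # = log 6 / log 2 ≈ 2.585 (Jason's "about 2.6")

# With n_d die rolls, n_s = k * n_d coin flips carry the same total
# information, i.e. n_d * H_d == n_s * H_s even though H_d != H_s.
n_d = 10
n_s = k * n_d
assert abs(n_d * H_d - n_s * H_s) < 1e-9

print(round(k, 3))  # prints 2.585
```

Dividing n_d H(d) = n_s H(s) by H(d) = H(s) would force n_d = n_s, which is exactly the special case k = 1; here the two distributions differ, so only the totals are equated.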