Oct 04, 2008 16:14

A thing that's bugged me for a long time is the apparent arbitrariness with which we have to define inflation. The CPI, for instance, picks some ol' basket of goods and measures how its price changes. Depending on how that representative basket is chosen, you get a different answer. In particular, if one basket has twice as many, say, Xboxes in it as another, then the inflation you measure is twice as sensitive to changes in the price of Xboxes as it would be with the other basket.
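To make the sensitivity concrete, here's a toy calculation with made-up prices and two hypothetical baskets that differ only in how many Xboxes they contain. The goods, prices, and quantities are all invented for illustration.

```python
# Hypothetical prices for two goods, in year 0 and year 1.
p0 = {"bread": 2.0, "xbox": 300.0}
p1 = {"bread": 2.2, "xbox": 270.0}  # bread up 10%, xbox down 10%

def inflation(basket):
    """Measured inflation: relative change in the cost of the basket."""
    cost0 = sum(qty * p0[good] for good, qty in basket.items())
    cost1 = sum(qty * p1[good] for good, qty in basket.items())
    return cost1 / cost0 - 1.0

basket_a = {"bread": 100, "xbox": 1}  # one Xbox
basket_b = {"bread": 100, "xbox": 2}  # two Xboxes

print(inflation(basket_a))  # about -2%
print(inflation(basket_b))  # about -5%: same prices, different "inflation"
```

Same price changes, two different inflation numbers, purely because of how the basket was counted out.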

Is there a notion of average that is robust against this arbitrariness of counting?

Try this: take a collection v_1, ..., v_m of vectors in R^n. Compute the m × m matrix D_ij = ||v_i − v_j|| of distances between them. Find a vector μ in R^m such that Dμ = k(1, 1, ..., 1)^T for some constant k, and Σ_i μ_i = 1. If D happens to be invertible, we can take μ to be the appropriate multiple of D^{−1}(1, 1, ..., 1)^T. Now, if I did my math right, Σ_i μ_i v_i serves as a kind of average of the v_i, but one that is invariant under rotation, uniform scaling (but not nonuniform scaling!), and, most importantly, duplication of items in the list of vectors. If we literally duplicate an entry, the matrix D is no longer invertible, but the equation Dμ = k(1, 1, ..., 1)^T admits exactly those reweightings that assign the two duplicated vectors a total weight equal to the weight of the original, unduplicated one.
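The construction above is easy to try out numerically. A minimal sketch with NumPy, using a least-squares solve so that a rank-deficient D (the duplicated-entry case) still yields a valid μ; the function name and the example points are my own:

```python
import numpy as np

def distance_average(vs):
    """Average of the rows of vs via the distance-matrix weighting:
    solve D mu = (1, ..., 1)^T, then normalize so sum(mu) = 1."""
    vs = np.asarray(vs, dtype=float)
    # Pairwise distance matrix D_ij = ||v_i - v_j||.
    D = np.linalg.norm(vs[:, None, :] - vs[None, :, :], axis=-1)
    ones = np.ones(len(vs))
    # Least squares instead of inverse: when entries are duplicated, D is
    # singular but the system is still consistent, and lstsq picks the
    # minimum-norm solution, which splits weight evenly among duplicates.
    mu, *_ = np.linalg.lstsq(D, ones, rcond=None)
    mu = mu / mu.sum()  # rescale so the weights sum to 1 (changes k, not mu's direction)
    return mu @ vs

pts = [[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]]
pts_dup = pts + [[1.0, 0.0]]  # duplicate one entry

print(distance_average(pts))
print(distance_average(pts_dup))  # same result: duplication-invariant
```

Note that this differs from the ordinary centroid: duplicating a point drags the centroid toward it, while here the two copies just share the original's weight.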

Anyhow, this seems like an utterly elementary operation. I wonder why I haven't heard of it before. Is it because it doesn't actually work the way I think it does?

average, math, unbiased, inflation
