The voluntary transparent society

Apr 17, 2008 17:17

pmb was talking recently about how the taboo on talking about salary hurts workers at the negotiating table, because the employers have more data. That reminded me of an idea I had a while ago (Read more...)

Tags: transparency, ideas


Comments 11

amoken April 18 2008, 01:09:59 UTC
One other application: research. For example, a lot of groups, like medical researchers, have access to a portion of some people's medical history, probably anonymized if they have any sizable grouping. But they don't always have the full picture. If you did this in such a way that there were lots of structured fields, it would be a cinch to grab all of that stuff and correlate it. With unstructured data it's still doable, but the more free-form it gets, the more reliability you sacrifice.
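To make the structured-fields point concrete, here is a minimal sketch of the kind of correlation that becomes trivial once records share a schema. The field names and values are purely illustrative, not from any real dataset:

```python
# Hypothetical anonymized records with structured fields.
# Field names are made up for illustration.
records = [
    {"age": 34, "daily_caffeine_mg": 180, "avg_sleep_hours": 7.1},
    {"age": 52, "daily_caffeine_mg": 320, "avg_sleep_hours": 6.0},
    {"age": 29, "daily_caffeine_mg": 90,  "avg_sleep_hours": 7.8},
    {"age": 45, "daily_caffeine_mg": 250, "avg_sleep_hours": 6.4},
]

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

caffeine = [r["daily_caffeine_mg"] for r in records]
sleep = [r["avg_sleep_hours"] for r in records]
r = pearson(caffeine, sleep)  # negative in this toy sample
```

With free-form text, each of those fields would first have to be extracted and normalized, which is where the reliability loss comes in.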

Reply

amoken April 18 2008, 01:11:22 UTC
Oh, and people like me could make interactive visualization applications to let you peruse and discover your own correlations. :D

Reply


triath April 18 2008, 03:01:48 UTC
I would be very interested to see this information, but fairly wary of its accuracy, since people will selectively self-report the things they want to share (e.g. share a high salary, but not a low one).

Note: pmb's post that you linked to is friends-locked so probably not everyone on your list can view it.

Reply

mbrubeck April 18 2008, 03:36:06 UTC
Yes, the voluntary nature means that you couldn't make valid generalizations from the (self-selected) sample to any broader population. It would be more useful for rough, qualitative analysis. For example, you might learn that few of your friends' salaries fall into the range you had thought was "normal."

[I changed the intro to paraphrase Peter's entry instead of linking; thanks.]

Reply


neonelephant April 18 2008, 03:52:22 UTC
I know the horse is (like one of those animals in a relativity problem) mostly out of the barn already as I'm trying to close the door here. Personally, I think I have enough paranoia to give me all of the stress, but not enough to actually secure myself when it comes to these things. Still: who would operate such a thing, and what assurances would users have that (a) the owner/operators would not exploit the data, and (b) more relevantly, I would hope, any future owners would not exploit it?

Being able to compartmentalize and selectively share personal information with ease is interesting (and I'm sure the list of applications you mention is nowhere near complete), but the model you present does seem to require trust in those running the site.

Reply

mbrubeck April 18 2008, 15:38:55 UTC
That's the sort of "difficult problems of anonymity and confidentiality" I'm talking about. :) I think good security engineering can mitigate those concerns, but it might impose other constraints on the design ( ... )

Reply


paperclippy April 18 2008, 13:33:03 UTC
I think it would be really interesting. There are already sites that do some of this -- I can't remember whether it's salary.com or something like that, but there is at least one where you enter your salary, years of experience, job title, education, and location, and then you can see average salaries of other people, with no identifying information.

For what it's worth, I'm happy to share my salary with anyone who wants to know. I am not a fan of salaries being kept secret. I would also be willing to share a lot of those other details, but either anonymously or only to my friends (for example, I don't think my coworkers need to know how often I have sex with my husband or how much I websurf at work).
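A rough sketch of how such a site might aggregate reports while keeping individuals unidentifiable: only release an average when enough people fall into a bucket. The field names and the five-report minimum are assumptions for illustration, not any real site's rules:

```python
# Hypothetical salary submissions; all values are made up.
submissions = [
    {"title": "engineer", "location": "Seattle", "salary": 85000},
    {"title": "engineer", "location": "Seattle", "salary": 92000},
    {"title": "engineer", "location": "Seattle", "salary": 78000},
    {"title": "engineer", "location": "Seattle", "salary": 88000},
    {"title": "engineer", "location": "Seattle", "salary": 95000},
    {"title": "teacher",  "location": "Seattle", "salary": 48000},
]

MIN_GROUP = 5  # suppress buckets too small to hide an individual

def average_salary(title, location, data):
    """Average salary for a bucket, or None if the bucket is too small."""
    group = [s["salary"] for s in data
             if s["title"] == title and s["location"] == location]
    if len(group) < MIN_GROUP:
        return None  # releasing this average could identify someone
    return sum(group) / len(group)

avg = average_salary("engineer", "Seattle", submissions)   # 87600.0
hidden = average_salary("teacher", "Seattle", submissions)  # None: one report
```

The minimum-bucket rule is the simplest version of the confidentiality measures mbrubeck alludes to above; real deployments would need more.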

Reply


(The comment has been removed)

mbrubeck April 18 2008, 16:59:04 UTC
An Advogato-style trust metric might mitigate problems of disinformation. Input to the trust metric could include both the explicit social graph of friend/contact relationships and the implicit graph formed by Gmail-style invitations. (Carefully-metered invitations would also limit users' ability to submit bad information en masse.) You could achieve some level of confidence in your view of the data as long as you trust your friends to some degree, their friends somewhat less, and so on.
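The "trust friends, their friends somewhat less, and so on" idea can be sketched very simply. Advogato's actual metric is computed with network flow; the following is only a distance-decay approximation over a toy friend graph, with hypothetical usernames:

```python
from collections import deque

# Toy explicit social graph; names are made up. Advogato's real metric
# uses network flow -- this sketch just halves trust at each hop out
# from the viewer.
friends = {
    "you":     ["alice", "bob"],
    "alice":   ["you", "carol"],
    "bob":     ["you"],
    "carol":   ["alice", "mallory"],
    "mallory": ["carol"],
}

def trust_weights(viewer, graph, decay=0.5, max_hops=3):
    """Breadth-first walk outward from viewer; weight decays per hop."""
    weights = {viewer: 1.0}
    queue = deque([(viewer, 0)])
    while queue:
        user, hops = queue.popleft()
        if hops == max_hops:
            continue
        for friend in graph.get(user, []):
            if friend not in weights:
                weights[friend] = decay ** (hops + 1)
                queue.append((friend, hops + 1))
    return weights

w = trust_weights("you", friends)
# direct friends get 0.5, friends-of-friends 0.25, and so on
```

Data submitted by each user would then be discounted by that user's weight when you view aggregate results, so a distant stranger's disinformation counts for little.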

Reply

(The comment has been removed)

mbrubeck April 18 2008, 20:42:35 UTC
Ed Felten points out that privacy promises are difficult to rely on:
Even though a company might make a contractual promise to honor some privacy rules, customers won’t have the time or training to verify that the promise is enforceable and free of loopholes. [...] But even if the contract is legally bulletproof, the company might still violate it.

That's one reason I favor technical measures that minimize the opportunity for abuse.

Reply


