The open privacy problem
On Friday I went along to a talk by Cory Doctorow and Charles Stross organised by the Open Rights Group. The topic was "Resisting the all-seeing eye - anti-surveillance 101", but in the end it turned out to be more of a general discussion of privacy-related issues than specific tips for avoiding or confounding surveillance.
I left the talk feeling a bit at odds with the direction it had taken. The speakers both seemed to agree that we should always be looking to minimise the amount of information that we expose about ourselves, particularly online. Cory repeatedly made specific points about the problem of 'websites which reward us for giving away our personal information', meaning social networking sites such as Facebook, MySpace, et al.
I can see his point, but I have a problem here... I like social networking sites. I like the interesting new people you can meet via common interests, I like the near-constant stream of updates about the activities and well-being of your friends, I like putting my thoughts out there and seeing how they resonate with other people. I don't want to hide all my information away - I want it 'to be free', in the words of the great cliché.
Unfortunately, as things stand, if you put that info out there for your friends, it's going to be gathered by other parties too. Stalkers are the most obvious example, but as Cory pointed out, actually the least concerning in many ways - for a start they're rare, and secondly they're focused on one person. My concerns lie more with large-scale data-gathering and data-mining expeditions by companies and (most particularly) by governments, which unfortunately are neither rare nor focused - they're looking to gather as much data about as many people as possible, and then cross-correlate it until something interesting falls out. In the case of the companies, that something is usually money. In the case of the government, who knows? Today's buzzword is 'terrorism', but what that means is that they're looking for people who don't agree with the government. And lately they seem to be having some trouble drawing the line between what is and isn't acceptable for them to keep tabs on.
After the talk, Helen and I grabbed dinner and chatted about the problem with sharing your personal information on sites such as Facebook. Basically Helen's theory (as I understood it) was that it would be great if we could all just share our information freely, and people would stop being judgemental about the things contained in that info. She thought this would to a large extent be an inevitable consequence of the ever-increasing amount of personal data 'smog' that our society is now generating around every individual - when everyone shares all of their info, there are no secrets left to be leveraged by malicious observers. My response was that it could never work that way, because there will always be some people who react negatively to things that others find unremarkable, there will always be some people who have the power to affect your life adversely, and there will always be some overlap between these two sets of people.
(It turns out that most of what we said and thought has already been covered by David Brin and Bruce Schneier, in Brin's book The Transparent Society and Schneier's article The Myth of the 'Transparent Society'.)
We came to the conclusion that what we really wanted was a body of law that reflected the way people actually think about these things - drawing a clear distinction between data use by individuals and data use by organisations such as companies or government departments. I'm happy to share my data with other individuals, but I'm reluctant to share it with companies unless they're offering me suitable incentives to do so, and I think their use of it should be heavily constrained. I'm even more reluctant to share my data with government bodies unless they can give me a very good reason for needing it - for instance, the taxman needs to know how much I earn, but nobody else in the government does. My doctor needs to know about any health problems I have, but staff at the local council offices do not. At one point I suggested that perhaps what I wanted was some sort of inversely proportional law, whereby the larger an organisation is, the less personal data it is allowed to retain about any individual person.
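Purely as a back-of-the-envelope illustration of that 'inversely proportional' idea, it might look something like the sketch below; the headcount measure, the scaling constant and the function name are all invented for the sake of the example, not part of any real proposal.

```python
# Hypothetical sketch: the larger the organisation, the fewer pieces of
# personal data it may retain about any one individual. All numbers here
# are invented purely to illustrate the inverse relationship.

def max_records_per_person(organisation_headcount: int,
                           scaling_constant: int = 10_000) -> int:
    """Allowed personal-data records per individual, inversely
    proportional to the size of the organisation holding them."""
    if organisation_headcount < 1:
        raise ValueError("headcount must be at least 1")
    return max(1, scaling_constant // organisation_headcount)

# A sole trader could hold plenty; a huge government department very little.
print(max_records_per_person(1))        # 10000
print(max_records_per_person(500))      # 20
print(max_records_per_person(100_000))  # 1
```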
A related idea mentioned during the talk was that it would be nice if there were an embargo on the use of personal data once it falls within a certain age range. So, for instance, once a piece of personal data is over 2 years old it can't be accessed any more, but once it's over 200 years old it can be again. This would stop organisations keeping/using people's personal data beyond a reasonable time limit now, but still leave a fascinating cultural resource for future historians - who are welcome to pick over the digital debris of my life once I'm safely out of the picture (along with everybody I know).
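To make the embargo idea a little more concrete, here's a rough Python sketch of how such a rule might be evaluated, using the 2-year and 200-year figures from above; the function name, the date handling and the example dates are just illustrative assumptions.

```python
# Rough sketch of the embargo rule described above: personal data is usable
# while it's recent, locked away once it passes a lower age limit, and opened
# up again for historians after an upper limit. The 2-year and 200-year
# figures come from the post; everything else is illustration.

from datetime import date

EMBARGO_STARTS_AFTER_YEARS = 2
EMBARGO_ENDS_AFTER_YEARS = 200

def data_is_accessible(created, today=None):
    """True if a piece of personal data may be used on the given date."""
    today = today or date.today()
    age_in_years = (today - created).days / 365.25
    if age_in_years <= EMBARGO_STARTS_AFTER_YEARS:
        return True   # recent data: normal, time-limited use
    if age_in_years >= EMBARGO_ENDS_AFTER_YEARS:
        return True   # ancient data: fair game for future historians
    return False      # everything in between is embargoed

print(data_is_accessible(date(2008, 1, 1), today=date(2009, 6, 1)))  # True
print(data_is_accessible(date(2000, 1, 1), today=date(2009, 6, 1)))  # False
print(data_is_accessible(date(1800, 1, 1), today=date(2009, 6, 1)))  # True
```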
Unfortunately I can't see us ever reaching a point where a majority of people care enough about their personal data and their privacy for ideas such as these to gain any traction towards becoming law. Most particularly, I can't see us ever having a government that would legislate into place these kinds of constraints on their own ability to use our personal data. Nor would I particularly trust them to honour the law even if they did pass it.
So where do we go, with our conflicting desires for privacy and sharing, for liberty vs control, for the right to determine who gets to know what about us - and who doesn't? I don't know. And that bothers me. If I don't have a clear idea of where I want to end up, how am I supposed to try to get there?
With reference to your suggestions, are you talking about something similar to software's personal vs commercial licensing, but applied to personal info? No idea how you'd enforce this though. Maybe you'd have to show the licences under which you obtained the info whenever you use it for commercial purposes?