The traditional privacy protection model focuses on user control at the point of data collection, but Malcolm Crompton says it is time to change our thinking on privacy to better serve individuals.

When will we shift from outdated thinking on privacy?

2012 is the 10-year anniversary of Microsoft’s Trustworthy Computing initiative (TwC), which began when Bill Gates sent around a companywide memo emphasising security, privacy and availability as the keys to instilling trust in computing. To mark the occasion, on February 28 the Corporate Vice-President of Microsoft TwC, Scott Charney, delivered a paper entitled Trustworthy Computing Next, which reflected on the past 10 years and the new challenges that lie ahead. His observations on the latest developments in technology and their consequences for privacy are particularly noteworthy.

I would like to touch on something addressed in the paper that is in urgent need of reconsideration: the traditional privacy protection model which focuses on user control at the point of data collection.

Emphasis on user control is not working
For years, privacy advocates and policymakers have touted empowerment of the user — it is a fixture in the privacy frameworks of most jurisdictions worldwide. At the time of collection, data collectors should spell out what information they’re collecting and what they’re using it for. This will enable the individual to make an informed decision about whether to share their information. Or so the thinking goes.

The reality, however, is very different. As early as 2006, Fred Cate, a leading privacy authority and author of Privacy in the Information Age (1997), was pointing out the shortcomings of this procedural approach. The problems are manifold:

  1. The goal of notice rarely accords with actual practice — Individuals simply don’t bother to read page after page of dense, technical information, often written in legalese.
  2. The current procedures impose substantial costs — Firms are burdened with the cost of publishing and distributing notices. Individuals are burdened by the time and effort required to read the notices and make an informed decision.
  3. User control is often illusory — If the individual doesn’t read the notice, it is questionable whether subsequent consent to the company’s use and disclosure of their information is meaningful. Alternatively, consent is often a required condition of service. How many of us will click “I do not accept” and elect not to use the program we went to the trouble of purchasing or downloading?
  4. The benefits of no choice — There may be contexts in which consent or control is undesirable, for example in medical research and credit reporting, as well as cases in which the individual does not realise the potential value of the data.

The problems will not be addressed by placing more responsibility on individuals and expecting them to be fully capable of making the right decisions.

Reframing the issue
Finding an approach that works involves understanding two truths. Firstly, as humans we are subject to cognitive behaviours that limit our ability to optimally negotiate issues of privacy in the modern information society. Secondly, as the volume of data generated and used by society grows at a blistering pace, attempting to control everything at an individual level becomes impossible. It follows that we have to move beyond user control, and we must zoom out to examine how the data is being used.

Fred Cate was among the first to propose this — looking beyond collection to see what is actually being done with the data, and the potential harms involved. I have also long argued for this shift, noting in 2007 that the effectiveness of a privacy framework is better tested by asking “are individuals actually well served?” rather than “have certain processes been met?”.

It is significant that Scott Charney of Microsoft is also advocating such an approach.

So far I have spoken about the need for a shift away from the traditional privacy protection model, with its emphasis on user notice and consent. I believe that data usage is the name of the game now. What might this look like in practice? I will suggest a possible way forward in my next post; stay tuned for part 2.