Can we reap the enormous economic benefits of Big Data while maintaining privacy? Is it time for an ethical approach to Search and Personalisation?

We have a choice in front of us. Big Data is emerging as one of THE Big Issues.

It has immense potential to provide us with economic gain, offer individuals free and made-just-for-them services, drive innovation and much, much more.

So where is the catch?

And yes, there is a catch or two.  Like so many 'too good to be true' stories, we need to make sure this one doesn't end up that way.

Here are three evidence points.

The first is a Roundtable on The Economics of Personal Data and Privacy: 30 Years after the OECD Privacy Guidelines, hosted by the OECD in Paris in December 2010.  I was privileged to attend.  The report on the Roundtable is available and should be read widely.  The arguments put forward in favour of Big Data were impressive.  The most famous is the oft-quoted story of Google's study of how user Search terms could be used to 'crowd-source' the detection of flu outbreaks even before the authorities could spot them - Google Flu Trends.

The second is Will a Crackdown on Privacy Kill Big Data Innovation, a blog post by Derrick Harris on 16 May.  He quotes Big Data: The next frontier for innovation, competition, and productivity by the McKinsey Global Institute as identifying six issues facing policymakers:

  1. Build human capital for big data.
  2. Align incentives to promote data sharing for the greater good.
  3. Develop policies that balance the interests of companies wanting to create value from data and citizens wanting to protect their privacy and security.
  4. Establish effective intellectual property frameworks to ensure innovation.
  5. Address technology barriers and accelerate R&D in targeted areas.
  6. Ensure investments in underlying information and communication technology infrastructure.

Of these, he identifies the third as the critical issue.  I would agree.

And the third evidence point?

Have a look at What Is The Internet Hiding, in which Eli Pariser gives a TED presentation showing how tailored experiences on the internet are leading to censorship and filtering so subtle that most of us don't even notice it.

But importantly, Eli goes on to show that this is not a new phenomenon.  It is just an expression of previous behaviour in a new medium.

In particular, he points out that the media barons used to (still do?) have ways of filtering what we read.  He notes that before World War I, there was an emerging realisation that journalism and the practices of editors were influencing a product of great social importance and influence.

He notes that this was a human process for which a human solution was developed: an ethical framework.  We may argue about its efficacy, but at least it has made things better.

So, he asks: when will we develop an ethical framework for the algorithms that deliver our made-just-for-us services based on Big Data?

And like the best thinkers, he goes on to put up his proposed solution:

Eli proposes that algorithms in Search, Social Networks and elsewhere should ensure that we see results that are:

  • relevant
  • important
  • uncomfortable
  • challenging
  • representative of other points of view
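To make the idea concrete, here is a minimal sketch of how such a criterion might be folded into a ranking algorithm. Everything here is a hypothetical illustration - the scores, the `viewpoint` labels, and the blending weight are assumptions of mine, not anything Pariser or any real search engine specifies: relevance alone no longer decides the order; results from perspectives absent from the user's recent history get a small boost.

```python
# Hypothetical sketch of a diversity-aware re-ranker in the spirit of
# Pariser's proposal. All names, scores, and weights are illustrative
# assumptions, not any real search engine's API.

from dataclasses import dataclass

@dataclass
class Result:
    title: str
    relevance: float   # 0..1: how well the result matches the query
    viewpoint: str     # coarse label for the perspective it represents

def rerank(results, user_history, diversity_weight=0.3):
    """Blend relevance with a bonus for viewpoints missing from the
    user's recent reading history - the 'challenging' and 'other
    points of view' criteria. The weight is an arbitrary assumption."""
    seen = set(user_history)

    def score(r):
        bonus = diversity_weight if r.viewpoint not in seen else 0.0
        return r.relevance + bonus

    return sorted(results, key=score, reverse=True)

results = [
    Result("Familiar take", 0.80, "viewpoint_a"),
    Result("Opposing analysis", 0.65, "viewpoint_b"),
    Result("Same-bubble rehash", 0.70, "viewpoint_a"),
]

# A user who has only been reading viewpoint_a sees the opposing
# analysis promoted above the purely relevance-ranked favourites.
ranked = rerank(results, user_history=["viewpoint_a"])
print([r.title for r in ranked])
```

The design point is that the filter is not removed, only rebalanced: relevance still dominates, but the algorithm deliberately surfaces what the pure personalisation loop would have hidden.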

This is an important contribution: it suggests that there is more to the appropriate and inappropriate use of Big Data than privacy and the 'right to be let alone' made popular by Warren & Brandeis in 1890 in The Right to Privacy.  There is also a huge risk that we won't even know what we don't know as we live our lives in a Filter Bubble.  For more, read his book The Filter Bubble: What the Internet Is Hiding from You.

Pariser was given a standing ovation for his talk.  Deservedly so.

Can we reap the enormous economic benefits of Big Data while maintaining privacy and avoiding its pitfalls? Is it time for an ethical approach to the algorithms behind Search and Personalisation?