Data Power conference (June 22-23), part 1: Disconnect

I recently attended and presented at “Data Power”, which turned out to be an excellent conference organized by the University of Sheffield. The conference called upon academics to submit papers that approached the question of big data from a societal (and critical) perspective. That being said, the conference papers were more often than not empirically grounded, and the presenters refrained from adopting a conspiratorial mindset, as can sometimes be the case when discussing big data.

Here are some of the key points that I picked up from attending the different panels:

Disconnect & resignation / the tradeoff fallacy

Stefan Larsson (Lund University) and Mark Andrejevic (Pomona College) both stressed that there is a disconnect between commercial claims that people happily trade their privacy for discounts and services and how people actually feel. In reality, people feel that they are “forced or bribed” to give up their data in order to access a service. Joseph Turow, Michael Hennessy and Nora Draper have recently published a survey on what they call the “tradeoff fallacy” which supports the disconnect and resignation hypothesis put forth by Larsson and Andrejevic.

Access rights are rarely respected

Clive Norris (University of Sheffield) and Xavier L’Hoiry (University of Leeds) investigated whether companies and public-sector bodies (data controllers) actually respect people’s right, under current data protection legislation, to access their own data. It turns out they don’t:

• “20 % of data controllers cannot be identified before submitting an access request;
• 43 % of requests did not obtain access to personal data;
• 56 % of requests could not get adequate information regarding third party data sharing;
• 71 % of requests did not get adequate information regarding automated decision making processes.”

Instead, the controllers consulted applied what Norris & L’Hoiry call “discourses of denial”: questioning the rights themselves (we do not recognize them), falsely claiming that only law enforcement would have access to the data, or even suggesting that the researchers were insane to make such a request (why would you possibly want this information?). The most common response, however, was none at all. Deafening silence is an effective way to tackle unpopular requests.

Self-management of data is not a workable solution

Jonathan Obar (University of Ontario Institute of Technology & Michigan State University) showed that data privacy cannot feasibly be protected through individuals auditing how companies and officials use their personal data, calling this approach a “romantic fallacy”.

Even if data controllers respected the so-called ARCO rights (access to data, rectification of data, cancellation of data & objection to data processing), it would be far too difficult and time-consuming for ordinary citizens to manage their own data. Instead, Obar suggests that either data protection authorities (DPAs) or private companies could oversee how our data is used, a form of representative data management. The problem with this solution is, of course, the significant resources it would require.

There is no such thing as informed consent in a big data environment

Mark Andrejevic emphasized that data protection regulation and big data practice are based on opposing principles: big data on data maximization, data protection on data minimization. The notion of relevance does not work as a limiting factor for collecting data, since the relevance of data is only determined afterwards, by aggregating it and looking for correlations. This makes informed consent increasingly difficult: what are we consenting to if we do not know what the collected data will be used for?

Behavioural advertising – Always Be Creeping

There’s a new business logic that permeates most of today’s online commerce. The ABC is no longer Always Be Closing; it’s Always Be Creeping.

But even as behavioural advertising evolves and targeting becomes more sophisticated, companies may sometimes wish to be subtler when offering targeted ads to consumers. In a much-cited New York Times article from 2012, a former employee of Target said:

[W]e started mixing in all these ads for things we knew pregnant women would never buy, so the baby ads looked random. We’d put an ad for a lawn mower next to diapers. We’d put a coupon for wineglasses next to infant clothes. That way, it looked like all the products were chosen by chance. And we found out that as long as a pregnant woman thinks she hasn’t been spied on, she’ll use the coupons. She just assumes that everyone else on her block got the same mailer for diapers and cribs. As long as we don’t spook her, it works. 
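The mechanism described in the quote is simple enough to sketch in code. Below is a toy Python illustration (entirely hypothetical; the function name, parameters and product lists are mine, not Target’s actual system) of padding targeted offers with random decoys so that no positional or thematic pattern gives the targeting away:

```python
import random

def camouflage_mailer(targeted_ads, decoy_pool, n_decoys=None, seed=None):
    """Mix decoy ads in with targeted offers and shuffle the result,
    so the targeted offers look like part of a generic mailer.

    Purely illustrative: all names and parameters here are hypothetical.
    """
    rng = random.Random(seed)
    if n_decoys is None:
        n_decoys = len(targeted_ads)  # one decoy per targeted offer
    mailer = list(targeted_ads) + rng.sample(decoy_pool, n_decoys)
    rng.shuffle(mailer)  # remove any positional pattern
    return mailer

# Example: baby-related offers hidden among unrelated coupons
targeted = ["diapers", "infant clothes", "crib"]
decoys = ["lawn mower", "wineglasses", "garden hose", "barbecue grill"]
print(camouflage_mailer(targeted, decoys, seed=42))
```

The point, as the quote makes clear, is not better targeting but better concealment of targeting.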

Tene and Polonetsky (2013) argue that it’s not the data collection itself which is creepy, but how statistical analysis is used to come to certain conclusions about you.

This is especially the case when “offline” purchases are combined with information on online behaviour, a practice referred to as “onboarding”. We have grown accustomed to personalised ads based on web browsing or Facebook likes, but today’s marketers want a complete picture of our everyday transactions as well.

Whether or not one sees this as invasive is for each of us to decide, but bear in mind that one of the industry’s leading data brokers, Acxiom, has “information [on] about 700 million consumers worldwide with over 3000 data segments for nearly every U.S. consumer” (FTC report, 2014). Combined, the biggest data brokers have billions and billions of records on people and businesses.

In their defence, the Digital Advertising Alliance does offer consumers a choice to opt out of data tracking. Whether consumers know that such an option exists is another question entirely, and the registry only covers companies that have agreed to participate. In the end, such self-regulatory measures directed towards consumers are ineffective: the most privacy-conscious are likely to use other means to conceal their actions online, whereas the vast majority of people are unaware that such options exist.

References

Federal Trade Commission, 2014: Data Brokers: A Call for Transparency and Accountability.

Tene, Omer and Polonetsky, Jules, 2013: A Theory of Creepy: Technology, Privacy and Shifting Social Norms (September 16, 2013). Yale Journal of Law & Technology, 2013. Available at SSRN: http://ssrn.com/abstract=2326830.