Is sensitive personal information becoming desensitized?

With all the hype surrounding big data, it should come as no surprise that people are worried about how their data is used.

"How concerned are you about the unnecessary disclosure of personal information?" Source: Eurobarometer 359, p. 59.

Some data points have always been part of the modern bureaucratic state (see Giddens, 1985, for example). These are usually referred to as objective data points, which indicate facts such as births, deaths, age and income. Many of these data points are simply necessary to run a well-functioning state apparatus.

The reason why smartphones and social media are changing our relation to personal data is that subjective data points — what people read, what they search for, who they secretly stalk on Facebook — are much easier to come by than before. What’s more, people readily submit information about themselves on social networks that used to be hidden deep under the surface. There is a huge gap between what data protection officials think of as sensitive information and what’s actually happening.

For example, the UK’s Information Commissioner’s Office defines sensitive personal data as information concerning, among other things, a) the racial or ethnic origin of a person, b) his or her political opinions or religious beliefs, and c) his or her sexual life.

Facebook sees your sexual preferences, religious beliefs and political views as “basic info”. Not even “details about you”, but “basic”; information which is, undeniably, potentially very sensitive in many parts of the world.

The gist is that while marketers and companies are hoping to gather ever more sensitive information on potential customers, they really, REALLY don’t want their customer databases defined as collections of “sensitive data”. Because when that happens, they are suddenly forced to follow strict rules regarding what they can do with the information. Funnily enough, the best way to avoid that is not to refrain from collecting sensitive information, but rather to claim that the information gathered is not about “real, identifiable people” but just “profiles”.

Giddens, Anthony (1985): The Nation-State and Violence: Volume Two of a Contemporary Critique of Historical Materialism. Cambridge: Polity Press.

Eurobarometer survey 359 on data protection


Manipulating Facebook’s filter bubble

The other week, articles emerged about manipulating Facebook’s News Feed. First out was Mat Honan, a Wired journalist who made the conscious choice of liking everything he saw on Facebook for 48 hours. Obvious risks include liking someone’s funeral or endorsing political fundamentalism, but minor incidents aside, Honan completed the experiment without being unfriended by his peers. He did, however, succeed in altering his News Feed, and not for the better:

My News Feed took on an entirely new character in a surprisingly short amount of time. After checking in and liking a bunch of stuff over the course of an hour, there were no human beings in my feed anymore. It became about brands and messaging, rather than humans with messages.

This has partly to do with Facebook’s changed “Page” policy. A couple of years ago brands could still count on reaching their fans through their Facebook pages as long as the fans liked the page. Although posts were not shown to everyone who liked the page, a significant percentage saw them. In a highly controversial move, Facebook decreased the organic reach of Page posts and introduced “boosting”. By boosting, brands get increased reach for a post by paying a one-off fee, which depends on the size of the audience the promoter wishes to reach. In other words, by paying for posts, brands can guarantee that their updates show up in the News Feed. If you, like Honan, decide to like everything you see on Facebook, your News Feed will fill up with branded content, since brands can boost posts and people can’t.

You might have noticed that your News Feed sometimes looks very different on your phone and on your desktop. This is partly because Facebook needs the same number of ad impressions on mobile, yet there is less space to display posts. The result? More ads on mobile:

I was also struck by how different my feeds were on mobile and the desktop, even when viewed at the same time. By the end of day one, I noticed that on mobile, my feed was almost completely devoid of human content. … Yet on the desktop—while it’s still mostly branded content—I continue to see things from my friends.

A third consequence of Honan’s test was that he received political messages from both sides of the spectrum, but not in a balanced way. Rather, his News Feed was simultaneously both on the far-right and the far-left. In a way, Honan succeeded in bursting the filter bubble, but the result was still an alarmingly narrow worldview.

In a different experiment, Elan Morgan quit liking things on Facebook for two weeks. Instead of liking, Morgan would simply comment on posts she thought were worthy of, well, a like. According to Morgan, her News Feed became both better and more humane. One of her conclusions was that feeding the algorithm is actually worse than abstaining entirely.

You would think that liking certain updates on Facebook would teach the algorithm to give you more of what you want to see, but Facebook’s algorithm is not human. The algorithm does not understand the psychological nuances of why you might like one thing and not another even though they have comparatively similar keywords and reach similar audiences.

The “Like” is a blunt instrument created for crude profiling that has, to a large extent, become a victim of what it tried to eradicate: the nuances of communication. By depriving us of the option to dislike posts, Facebook leaves us repurposing the Like in ways its algorithms cannot grasp. If the Like is useless in shaping our News Feed according to our interests, what good does it do? Some (anecdotal) evidence suggests that our Page likes might make the Facebook algorithm better at targeting ads. By liking your favourite authors, movies and musicians, you might just get more relevant advertising. The downside? A News Feed consisting of nothing but branded content.