Manipulating Facebook’s filter bubble

The Like

The other week, articles about manipulating Facebook’s News Feed appeared. First out was Mat Honan, a Wired journalist who made the conscious choice of liking everything he saw on Facebook for 48 hours. Obvious risks include liking someone’s funeral or endorsing political fundamentalism, but, some minor incidents aside, Honan completed the experiment without being unfriended by his peers. He did, however, succeed in altering his News Feed, and not for the better:

My News Feed took on an entirely new character in a surprisingly short amount of time. After checking in and liking a bunch of stuff over the course of an hour, there were no human beings in my feed anymore. It became about brands and messaging, rather than humans with messages.

This has partly to do with Facebook’s changed “Page” policy. A couple of years ago, brands could still count on reaching their fans through their Facebook Pages as long as the fans had liked the Page. Posts were not shown to everyone who had liked the Page, but a significant percentage did see them. In a highly controversial move, Facebook decreased the organic reach of Page posts and introduced “boosting”. By boosting, a brand pays a one-off fee, priced according to the size of the audience it wants to reach, and in return the post gets increased reach. In other words, by paying, brands can guarantee that their updates show up in the News Feed. If you, like Honan, decide to like everything you see on Facebook, your News Feed will fill up with branded content, since brands can boost posts and people can’t.
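To make the mechanism concrete, here is a deliberately simplified, hypothetical sketch of a feed ranker in Python. It is not Facebook’s actual algorithm; the field names, the scoring weights and the idea that boosted posts bypass engagement scoring are my own illustrative assumptions. It only shows why indiscriminate liking plus paid boosting pushes branded content to the top.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    is_brand: bool
    boosted: bool          # the brand paid a one-off fee for extra reach
    likes_from_user: int   # how often the viewer has liked this author's content

def score(post: Post) -> float:
    """Toy ranking score; purely illustrative, not Facebook's algorithm."""
    if post.boosted:
        # A paid boost effectively guarantees placement, regardless of affinity.
        return 100.0
    # Organic posts rely on the viewer's past engagement with the author.
    return 1.0 + post.likes_from_user

feed = [
    Post("A friend", is_brand=False, boosted=False, likes_from_user=2),
    Post("Some brand", is_brand=True, boosted=True, likes_from_user=0),
    Post("Another brand", is_brand=True, boosted=False, likes_from_user=50),
]

# Liking everything inflates likes_from_user for brands too, so even unboosted
# brand posts climb above friends, and boosted ones sit at the top by default.
for post in sorted(feed, key=score, reverse=True):
    print(post.author, score(post))
```

Under these toy assumptions, the friend’s post ends up last, which is roughly the effect Honan describes.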

You might have noticed that your News Feed looks very different on your phone and on your desktop. This is partly because Facebook needs the same number of ad impressions on mobile, yet there is less space to display posts. The result? More ads on mobile:

I was also struck by how different my feeds were on mobile and the desktop, even when viewed at the same time. By the end of day one, I noticed that on mobile, my feed was almost completely devoid of human content. … Yet on the desktop—while it’s still mostly branded content—I continue to see things from my friends.

A third consequence of Honan’s test was that he received political messages from both ends of the spectrum, but not in a balanced way. Rather, his News Feed veered simultaneously to the far right and the far left. In a way, Honan succeeded in bursting the filter bubble, but the result was still an alarmingly narrow worldview.

In a different experiment, Elan Morgan quit liking things on Facebook for two weeks. Instead of liking, Morgan would simply comment on issues she thought were worthy of, well, a like. According to Morgan, her News Feed became both better and more humane. One of her conclusions was that feeding the algorithm is actually worse than abstaining entirely.

You would think that liking certain updates on Facebook would teach the algorithm to give you more of what you want to see, but Facebook’s algorithm is not human. The algorithm does not understand the psychological nuances of why you might like one thing and not another even though they have comparatively similar keywords and reach similar audiences.

The “Like” is a blunt instrument created for crude profiling, and it has, to a large extent, become a victim of what it tried to eradicate: the nuances of communication. By denying us the choice to dislike posts, Facebook leaves us to repurpose the Like in ways the algorithm cannot grasp. If the Like is useless for shaping our News Feed according to our interests, what good does it do? Some (anecdotal) evidence suggests that our Page likes might make the Facebook algorithm better at showing ads. By liking your favourite authors, movies and musicians, you might just get more relevant advertising. The downside? A News Feed consisting of nothing but branded content.
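Put differently, the Like collapses very different intentions into a single bit. The hypothetical snippet below only illustrates that loss of information; the listed intentions are my own examples, not anything Facebook has published.

```python
# Very different intentions behind a Like...
intentions = [
    "genuine enthusiasm",
    "sympathy for bad news",
    "acknowledging that I saw this",
    "ironic amusement",
]

def record_like(intention: str) -> int:
    # ...are all stored as the same binary signal; the nuance is
    # discarded before any ranking model ever sees it.
    return 1

signals = [record_like(i) for i in intentions]
print(signals)  # [1, 1, 1, 1] -- indistinguishable to the algorithm
```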
