The Filter Bubble – Facebook’s Booster For Radicalism

This article about the Filter Bubble has been a long time in the making. It took a lot of thought to hopefully find the right words. The problem with this one is that it touches on politics, and this blog shouldn’t be political. Sometimes, you just can’t help it…

Social media gave everyone a voice – sometimes with a viral boosting mechanism. This is all good and great as long as the right stuff gets boosted… ok, that came across a little strange. It’s not about the right stuff being boosted.


It’s about the crap not getting boosted. Like racism, hate and illegal content for instance.

Why This Is Relevant

Wars and unbearable situations have led to an increase in refugees all over Europe. And with the refugees, criticism (which is ok!) and pure hate (which is never ok!) grow within the population.

A long time ago this type of hate wasn’t as visible – but today we’ve got social media sites. Facebook and Twitter – the most popular networks for personal opinions here in Germany – are suddenly crowded with expressions of racism, anti-democratic statements and manifestations of pure hate.

This is not a German-only problem – quite the contrary. This is happening all over Europe and all over the world. Whenever challenging situations require whole nations to find working solutions, radicalism and hate surface. And elections are always just around the corner – leading politicians to believe they can gather a few votes by making public statements close to those that some (few!) people want to hear. Whole new parties get founded and start collecting a lot of votes in a very short time.

This post is not about saying which opinions are legitimate and which are not. That’s not for me to decide. Everyone needs to decide what he wants to believe and what not.

Fear leads to hate – but fear is also a good way to gather people around you. Nothing new – but a little more dangerous today than in times without social networks.

What is new in today’s world is that everyone also needs to make a decision about what to say publicly. Social Media gave everyone a public voice. But on social media, radicals often have a louder voice than they should have.

The Filter Bubble

That Facebook uses an algorithmic approach to showing content and posts to people isn’t anything new. Deciding what to show to a user based on his preferences, his surroundings, and his everyday actions seems to be a very sensible approach to the problem – at least usually. But there are a few problems with this sort of approach.

The problem is that all of us live in a sort of bubble on Facebook. Facebook shows us what we want to see and what is (or seems to be) popular around us.

(I didn’t come up with the term Filter Bubble – Eli Pariser did. And he is someone who knows how viral content works – he is a cofounder of a site called Upworthy – which made a business model out of producing viral content consistently. You can listen to his TED talk on the phenomenon here: Beware Of Filter Bubbles.)

For Facebook, this may work out pretty well. The problem with this approach, which is currently surfacing, is that it also makes content and interactions on Facebook somewhat automatically filtered down to just what provokes reactions.
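The dynamic described above can be sketched in a few lines of code. This is purely illustrative – it is not Facebook’s actual algorithm, and all names and numbers are made up – but it shows how ranking a feed by nothing but a user’s predicted affinity automatically drops exactly the posts that would challenge them:

```python
# A toy illustration (NOT Facebook's real ranking system): score each
# post by the user's predicted affinity for its stance, then show only
# the top k. The hypothetical stances "A" and "B" are made up.

def rank_feed(posts, affinity, k=3):
    """Return the k posts the user is most likely to react to."""
    return sorted(posts, key=lambda p: affinity[p["stance"]], reverse=True)[:k]

# A hypothetical user who engages with stance "A" and ignores stance "B".
affinity = {"A": 0.9, "B": 0.1}

posts = [
    {"id": 1, "stance": "A"},
    {"id": 2, "stance": "B"},   # the challenging view
    {"id": 3, "stance": "A"},
    {"id": 4, "stance": "A"},
    {"id": 5, "stance": "B"},
]

feed = rank_feed(posts, affinity)
print([p["id"] for p in feed])  # prints [1, 3, 4] – the "B" posts never surface
```

No matter how many dissenting posts exist, the user’s feed only contains the agreeable ones – which is the bubble in miniature.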

This starts at a harmless level – short, provocative content provokes more reactions than an in-depth piece. Therefore, the type of content that you find on BuzzFeed or Upworthy provokes a lot of reactions, while in-depth pieces on the same topics are usually far less popular on social media than a funny video on BuzzFeed. But that is ok.

The filter bubble becomes a problem when it boosts radical political opinions – or suppresses the counter reactions.

A quick story from my own surroundings: A couple of weeks ago someone I know published a borderline racist post about the refugee crisis, linking to a certain article. It wasn’t conscious racism – more like stupidity mixed with seemingly simple solutions to refugee problems that were – well, too simple to be real. I won’t go into detail here.

(This is not an excuse!)

The post wasn’t nice to see, but strangely it had a couple of Likes and even positive comments. Without these I would have let it go – so I wrote a comment and went away.

It took about half an hour – then I had two answers. The original poster, now provoked, posted exceedingly stupid bullshit. And someone else commented about how sad it was that “one wasn’t able to state obvious things anymore without being called a Nazi”. I hadn’t called anyone a Nazi. I hadn’t even pointed out the right-wing nature of the original post. I had made a comment about stupidity.

I got angry – I wrote several drafts of a reply, then I gave up.

The Filter Bubble had me: the post hadn’t been in my feed originally (I had found it while killing time, looking up what friends were doing lately) – because Facebook knew I wouldn’t like it. It was shown to people who would probably like it – with the linked article providing enough base material for Facebook’s algorithmic estimates.

Why The Filter Bubble Makes Societies More Radical

Facebook “estimated” I wouldn’t react to this post, or at least not react positively. Others would see it – because Facebook estimated they would. And the original poster got instant positive feedback.

If you find a comment like this on the social web, you will have a hard time discussing it. Which is why many people decide that the discussion isn’t worth it.

Like I did. But I was wrong! Not challenging these posts and comments means legitimizing the original posts.

One of the problems is that Facebook makes it very easy to state agreement (press Like), while stating disagreement is hard – you have to write a comment, which means you have to put in thought. Facebook might have emoticons now, but it doesn’t provide a dislike button. There is no vomiting emoticon. One of the reasons why Facebook is such a rewarding experience for many people is that it almost exclusively provides positive reactions.

Which leads people to believe that a majority of people agrees with their opinions or posts. Which in turn makes them continue to post – with more enthusiasm.

And then others see this as well. Some will be disgusted; some will agree, and some will change their opinions seeing the seemingly popular opinions spreading through the social web.

Social Media Marketers have used these mechanisms for years… Now politicians have joined the crowd. In Germany, as well as in other countries. Donald Trump has huge success on social media – and whatever you think of him, you cannot deny that this is to a high degree due to posting simple but provocative comments to complex problems and discussions.

These radical right-wing politicians use the same principles as terror organisations and Islamist groups – they grow a filter bubble around their political views.

So, It’s All Facebook’s Fault, Right?

No – it’s not. Facebook needs filtering – there is no way around it. And no filter is perfect.

Facebook creates and develops exactly the social network that we as a society demand. We might not say that we want a filtered newsfeed – but Facebook has measured our reactions, and it constantly develops its network in the way that we react best to.

We could also demand that Facebook deletes certain posts – and sometimes this may make sense. Obviously illegal posts can be deleted. Hate could be removed. Would this really help?

It would. Sometimes. In Germany, there is a discussion about this. Truly illegal content in media outlets is actively prosecuted – and illegal includes offensive right-wing radical political views, the promotion of banned symbols (like the swastika) or the promotion of illegal activities like drug abuse. (Not included are nipples.)

So a lot of Germans, including lawyers and politicians, demand that Facebook be treated as a traditional media outlet – required to remove illegal content and prosecuted for hosting it. I can see that this would sometimes make the situation less dire.

But it is only healing a few symptoms; it’s not a cure.

This might prevent certain comments from spreading virally – but not all of them, because a lot of hateful comments are not, strictly speaking, illegal. It also gives the impression that whoever posts illegal content on Facebook is personally safe – at worst, Facebook will be prosecuted, not the poster.

So, in some ways, this makes the problem worse, not better.

Part of the problem is that people have somehow come to believe they are safe on social media. In the beginning, this sort of thing would only get posted from fake and anonymous profiles. Today, hate and racism get posted from real profiles under real identities. Prosecuting Facebook wouldn’t change this – instead, it would feed the feeling of being censored unfairly. Censorship and the “lying” media are already cornerstones of the current rise of right-wing radicalism, worldwide. The Filter Bubble makes this trend worse.

Don’t Believe Me? Here Is Facebook’s Own Research:

And it’s not that Facebook hasn’t done anything to investigate the Filter Bubble: in 2015, it published a study about the phenomenon – and found that it is not as dire as some have feared (or at least it would seem so at first sight). On average, you only find around 6 percent more “hard news” content on Facebook favored by your own political “side” than by the other “side”. You can find Eli Pariser’s first take on this here and, more importantly, his main arguments against the conclusions here.

But that is only part of the story. If you dig deeper into the study, you will find that the problem is very pressing: while the 6% figure may hold true for the average Facebook user, the number gets significantly higher the more radical people get. If you’re the kind of person who posts “Refugees should be shot”, you probably live in a Filter Bubble – which means you are far less likely to receive a counter-argument than if you held more moderate views.

Also, the shape of your personal Filter Bubble is directly related to the amount of training you put into Facebook – the more you click on posts, like posts and promote posts, the more your feed gets filtered. Your bubble gets smaller. If you don’t believe me, read this Wired story from 2014: I Liked Everything On Facebook For 2 Days – Here Is What It Did To Me by Mat Honan… Pretty scary.
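The “training” feedback loop can be simulated in miniature. Again, this is a made-up sketch, not Facebook’s real system – the topics, weights and mechanics are all hypothetical – but it shows how “like what you see, see what you like” steadily shrinks the set of topics you are ever shown:

```python
# A toy feedback-loop sketch (purely illustrative, NOT Facebook's real
# algorithm): every "like" raises a topic's weight, the feed shows only
# the highest-weighted topics, and the user likes something from the
# feed - so the topics they ever see shrink to a small, fixed set.
import random

def simulate_bubble(topics, rounds=50, feed_size=3, seed=42):
    rng = random.Random(seed)              # deterministic for illustration
    weights = {t: 1.0 for t in topics}
    for _ in range(rounds):
        # Feed = the currently highest-weighted topics only.
        feed = sorted(topics, key=lambda t: weights[t], reverse=True)[:feed_size]
        liked = rng.choice(feed)           # the user likes what they are shown...
        weights[liked] += 1.0              # ...which boosts that topic further
    return weights

weights = simulate_bubble(["politics", "cats", "sports", "tech", "news"])
# Topics that never make the first feed ("tech", "news") keep their
# starting weight forever - they can never climb back into view.
```

After fifty rounds, all the accumulated weight sits on the topics that happened to be in the very first feed; everything else is locked out. That is the bubble getting smaller.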

Also – Facebook stressed the fact that your circle of friends already shields you, to a high degree, from content you don’t want to see. But is that good? Or is that part of the problem? Social behaviour means that many people will either hide content from friends with radical or offensive views – or unfriend them. But this creates even more limited filter bubbles. The more people unfriend a radical person, the less likely this person is to receive countering or challenging comments. Which means that the share of agreement this person sees from his circle of friends rises.

People will also unfriend people who don’t agree with their posts – and I’ve been on the receiving end of this behaviour.

The Real Problem – We’ve Come To Rely On Filtering Too Much

Remember when I said that Facebook is not solely responsible? So who else is responsible – at least to a certain degree?

We are.

Humans want filtered content. We’ve always been that way. If you went to a bookshop in the old days, you asked about a book’s contents to find out whether you were going to enjoy it. Similarly, in the modern age, Amazon recommends books and products that you are probably going to enjoy.

The same happens with news, political views and offensive content.

By relying on machines to give us the best content, we rely on mechanical processes. Which means that we need to apply some human thought and feeling to the selection process ourselves. Social media sites and search engines are tools – and tools can always be used for good or bad. As a society, we need to show we are adult enough to use these tools.

This is also not just a Facebook related problem – personalisation of search results on Google can also lead to Filter Bubbles. Internet based companies are obsessed with giving you the content you want – not the content you need. If we don’t wake up and demand the content we need, we are not going to get it.

That means we need to still do some filtering ourselves: We need to start reacting to posts that offend. We need to state disagreement. We need to right some wrongs once in a while. Like in real life, we are responsible for what we do on social media – and that includes what we don’t do. Whenever we keep quiet, we strengthen the Filter Bubble.

Facebook – and others – adapt to our behaviour, not the other way around. We need to show Facebook that we want to interact even with the stuff we would rather not see at all. Because not seeing it is not a solution to the core problem.

That means we cannot keep quiet.

There is that quote by Edmund Burke: “The only thing necessary for the triumph of evil is for good men to do nothing.”

Maybe today it’s more like: … is for good people to stay quiet.



  • Ken

    Thank you for the article. I was subconsciously aware of the filter bubble but this puts it all in perspective. As individuals, we can choose to seek out only those who agree with us or seek engaged conversation with intelligent people with diverse viewpoints. Obviously, the filter bubble plays into this and makes it more difficult to seek out those who have diverse viewpoints. From the standpoint of a content creator, there is a huge issue here that you did not touch on. I recently had a friend who created a video and I shared it exclusively when it was created. The video later went viral, primarily on sites created by the far right. Am I now stuck in the bubble no matter what I post? How long until I break out of that bubble? I need to read the linked articles to learn more. Thanks for giving me yet another task for the day!

    • TheSocialMarketers

      Thank you for your comment.

      That is an interesting thought – Facebook should actually have a pretty good idea about your political preference, a single video you shared shouldn’t break the algorithm… but who knows?


  • laura routh

That was an eye opener. I’m shy about disagreeing with friends from the past on facebook. I usually remain silent and move on to reading another post. Their racism is subtle, but present, nonetheless. I’ve made the excuse that I can’t change their views, but I see now what that does to content. I guess I figured that if no one clicked like, the person would stop posting this type of content. These posts must be in my feed because I’ve clicked like on the animal videos. What I thought I was doing was rewarding her good qualities. I haven’t wanted to spend a lot of time on my personal facebook page as it distracts me from my work on my blog. If someone were racist on my blog facebook page, I would be quick to act, though. On twitter, I choose carefully who I follow back as I don’t want to see racist or hate content of any kind or ever be mistakenly affiliated with this kind of thinking. But you’ve given me something to think about regarding friends from the past. I’m going to re-read this post, and I must confess, I’m squirming a bit. I’ll need to sort out the difference between racism and opinion. Is she racist because she doesn’t agree completely with me on the issues? Sometimes the racism is implied as she’s against certain activist groups. So I will need to choose carefully when to respond, and when to let it go. Because my politics have changed drastically from when I was growing up with all of these people, I’ve considered not participating in my personal facebook page at all because I get so annoyed, and truthfully, I don’t know how important many of these “old friends” are to me anymore. So thank you for giving me food for thought. I’m most comfortable living in my more liberal leftish bubble. But is this really what’s best for promoting equality and the changes we need to make to solve the world’s problems? Isn’t this why politics has become a battleground in the US for extremists, too, without much middle ground? I have a voice; maybe I need to use it.

    • TheSocialMarketers

      Thank you for your comment!

      You bring up an interesting point which I avoided on purpose in the original article – the difference between racism and opinion. I guess this is a decision that everyone has to make on his own.

The Filter Bubble prevents us from having discussions, though. And even when you simply disagree with an opinion – sometimes stating that may be worth it as well.


      • laura routh

        Yes, I appreciate your insight. I probably shouldn’t have even attempted to tackle the subject in a comment. I’m sure this person, and others, don’t think that they are racist. Thanks for the tip on disagreeing.

  • Colin Bielckus

    Well worth reading. Thank you. Part of it is a function of the busy lives we now lead – if I stop to negatively comment about a post I disagree with I may not have time to positively react to a post that ought to be shared far and wide… Not sure how you can code something like that into an algorithm!

  • Camp Full Monte

Great Post. I think we often suffer from the impacts of the filter bubble when trying to promote our off-grid, clothing optional, eco-campsite in Montenegro (google Camp Full Monte). It seems social media messages promoting our commitment and interest in: environmental issues; alternative technologies; body confidence and acceptance of non-sexual nudity; are not gaining the “exposure” (excuse the pun) we would hope for. Our various social media feeds have become dominated by posts from others in those areas which is great. We’ve been able to access some quality information but passing on that information seems to be increasingly about preaching to the converted. The majority of our guests find us because they are looking for (googling) *any* campsite in Montenegro and not necessarily our unique selling points. A guest survey last year revealed that almost 60% of our guests were experiencing a clothing optional holiday for the first time. Of those over 80% said they would be likely or very likely to visit a clothing optional resort again. Despite this, it’s likely our social media promotions never reached them. Obviously we are adapting our social media presence accordingly but it does serve to reinforce your point. Of course, in relation to facebook, their tendency to ban anything related to nudity (breast feeding, naturism, body acceptance, naked charity events) doesn’t help. This policy, when compared to their willingness to promote violence & hate is particularly annoying.

    • TheSocialMarketers

      That may be – I noticed that my own Facebook feed often puts me into a bubble surrounding social media and tech. Which is sad – even though I do a lot of stuff in that area, and most of my professional life surrounds these topics, I would still like to see everything else!