The Filter Bubble – Facebook’s Booster For Radicalism


This article about the Filter Bubble has been a long time in the making. It took a lot of thought to hopefully find the right words. The problem with this one is that it touches on politics, and this blog shouldn’t be political. Sometimes, you just can’t help it…

Social media gave everyone a voice – sometimes with a viral boosting mechanism. This is all good and great as long as the right stuff gets boosted… ok, this came across a little strange. It’s not about the right stuff being boosted.

It’s about the crap not getting boosted – racism, hate, and illegal content, for instance.

Why This Is Relevant

Wars and unbearable situations have led to an increase in refugees all over Europe. And with the refugees, criticism (which is ok!) and pure hate (which is never ok!) grow within the population.


A long time ago, this type of hate wasn’t as visible – but today we’ve got social media sites. Facebook and Twitter – the most popular networks for personal opinions here in Germany – are suddenly crowded with expressions of racism, anti-democratic statements and manifestations of pure hate.

This is not a German-only problem – quite the contrary. This is happening all over Europe and all over the world. Whenever challenging situations require whole nations to find working solutions, radicalism and hate surface. And elections are always just around the corner – leading politicians to believe they can gather a few votes by making public statements close to what some (few!) people want to hear. Whole new parties get founded and collect a lot of votes in a very short time.

This post is not about saying which opinions are legitimate and which are not. That’s not for me to decide. Everyone needs to decide for themselves what to believe and what not to.

Fear leads to hate – but fear is also a good way to gather people around you. Nothing new – but a little more dangerous today than in times without social networks.

What is new in today’s world is that everyone also needs to make a decision about what to say publicly. Social Media gave everyone a public voice. But on social media, radicals often have a louder voice than they should have.

The Filter Bubble

That Facebook uses an algorithmic approach to decide which content and posts to show to people isn’t anything new. Choosing what a user sees based on his preferences, his surroundings, and his everyday actions seems very sensible – at least usually. But there are a few problems with this sort of approach.
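To make the idea concrete, here is a minimal sketch of what engagement-based feed ranking could look like. This is not Facebook’s actual algorithm – the weights, field names and numbers are invented for illustration – but it shows the basic mechanic: score every candidate post by the predicted reaction of this particular user, then show the highest-scoring ones first.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    topic: str
    likes: int  # reactions the post has already collected

def feed_score(post: Post, author_affinity: dict, topic_interest: dict) -> float:
    """Toy score: how likely is this user to react to this post?

    author_affinity: how often the user interacted with this author before (0..1)
    topic_interest:  how often the user engaged with this topic before (0..1)
    """
    affinity = author_affinity.get(post.author, 0.0)
    interest = topic_interest.get(post.topic, 0.0)
    popularity = post.likes / (post.likes + 10)  # dampened "popular around you" signal
    # Invented weights: posts the user is predicted to react to float to the top,
    # everything else quietly drops out of sight.
    return 0.5 * affinity + 0.3 * interest + 0.2 * popularity

# Hypothetical data for one user:
posts = [
    Post("friend_a", "refugee_crisis", likes=42),
    Post("friend_b", "cat_videos", likes=230),
]
author_affinity = {"friend_a": 0.1, "friend_b": 0.9}
topic_interest = {"refugee_crisis": 0.05, "cat_videos": 0.8}

ranked = sorted(posts, key=lambda p: feed_score(p, author_affinity, topic_interest), reverse=True)
for p in ranked:
    print(f"{feed_score(p, author_affinity, topic_interest):.2f}", p.author, p.topic)
```

Notice that nothing in this sketch asks whether a post is true, fair, or good for you – only whether you will react to it. That is the whole point of the Filter Bubble.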

The problem is that all of us live in a sort of bubble on Facebook. Facebook shows us what we want to see – and what is (or seems to be) popular around us.

(I didn’t come up with the term Filter Bubble – Eli Pariser did. And he is someone who knows how viral content works – he is a cofounder of a site called Upworthy – which made a business model out of producing viral content consistently. You can listen to his TED talk on the phenomenon here: Beware Of Filter Bubbles.)

For Facebook, this may work out pretty well. The problem that is currently surfacing is that this approach also narrows content and interactions on Facebook, almost automatically, down to whatever provokes reactions.


This starts at a harmless level – short, provocative content provokes more reactions than an in-depth piece. The type of content you find on BuzzFeed or Upworthy therefore gets a lot of reactions, while in-depth pieces on the same topics are usually far less popular on social media than a funny video on BuzzFeed. But that is ok.

The filter bubble becomes a problem when it boosts radical political opinions – or suppresses the counter reactions.

A quick story from my own surroundings: A couple of weeks ago, someone I know posted a borderline racist post about the refugee crisis, linking to a certain article. It wasn’t conscious racism – more like stupidity mixed with seemingly simple solutions to the refugee problem that were – well, too simple to be real. I won’t go into detail here.

(This is not an excuse!)

The post wasn’t nice to see, but strangely it had a couple of likes and even positive comments. Without these, I would have let it go – but as it was, I wrote a comment and went away.

It took about half an hour – then I had two answers. The original poster, now provoked, posted exceedingly stupid bullshit. And someone else commented on how sad it was that “one wasn’t able to state obvious things anymore without being called a Nazi”. I hadn’t called anyone a Nazi. I hadn’t even pointed out the right-wing nature of the original post. I had made a comment about stupidity.

I got angry – I wrote several drafts of a reply, then I gave up.

The Filter Bubble had me: I didn’t have the post in my feed originally (I had found it while killing time, looking up what friends had been doing lately) – because Facebook knew I wouldn’t like it. It was shown to people who would probably like it – with the linked article providing enough base material for Facebook to make algorithmic estimates.

Why The Filter Bubble Makes Societies More Radical

Facebook “estimated” I wouldn’t react to this post, or at least not react positively. Others would see it – because Facebook estimated they would. And the original poster got instant positive feedback.

If you find a comment like this on the social web, you will have a hard time arguing against it – which is why many people decide that the discussion isn’t worth it.

Like I did. But I was wrong! Not challenging these posts and comments means legitimizing the original posts.

But one of the problems is that Facebook makes it very easy to state agreement (press like), while stating disagreement is always hard – you have to write a comment, which means you have to put in thought. Facebook may have reaction emoticons now – but it doesn’t provide a dislike button. There is no vomiting emoticon. One of the reasons Facebook is such a rewarding experience for many people is that it almost exclusively provides positive reactions.

Which leads people to believe that a majority of people agrees with their opinions or posts. Which in turn makes them continue to post – with more enthusiasm.

And then others see this as well. Some will be disgusted; some will agree, and some will change their opinions seeing the seemingly popular opinions spreading through the social web.

Social media marketers have used these mechanisms for years… Now politicians have joined the crowd – in Germany as well as in other countries. Donald Trump has huge success on social media – and whatever you think of him, you cannot deny that this is to a high degree due to posting simple but provocative answers to complex problems and discussions.

These radical right-wing politicians use the same principles as terror organizations and Islamist groups – they are growing a filter bubble around their political views.

So, It’s All Facebook’s Fault, Right?

No – it’s not. Facebook needs filtering – there is no way around it. And no filter is perfect.

Facebook creates and develops exactly the social network that we as a society demand. We might not say that we want a filtered newsfeed – but Facebook has measured our reactions, and it constantly develops its network in the way we react best to.

We could also demand that Facebook deletes certain posts – and sometimes this may make sense. Obviously illegal posts can be deleted. Hate could be removed. Would this really help?

It would. Sometimes. In Germany, there is a discussion about this. Truly illegal content in media outlets is actively prosecuted – and illegal means offensive, radical right-wing political statements, promotion of banned symbols (like the swastika), or promotion of illegal activities like drug abuse. (Not included are nipples.)

So, a lot of Germans, including lawyers and politicians, demand that Facebook be treated like a traditional media outlet – required to remove illegal content and prosecuted when it doesn’t. I can see that this would sometimes make the situation less dire.

But it is only healing a few symptoms; it’s not a cure.

This might prevent certain comments from spreading virally – but not all of them, because a lot of hateful comments are not, strictly speaking, illegal. It also gives the impression that someone who posts illegal content on Facebook is personally safe – if anyone gets prosecuted, it will be Facebook.

So, in some ways, this makes the problem worse, not better.

Part of the problem is: people have somehow come to believe that they are safe on social media. In the beginning, this sort of thing would only get posted from fake and anonymous profiles. Today, hate and racism get posted from real profiles under real identities. Prosecuting Facebook wouldn’t change that – instead, it would feed the feeling of being censored unfairly. Claims of censorship and a “lying” media are already among the cornerstones of the current rise of right-wing radicalism, worldwide. The Filter Bubble makes this trend worse.

Don’t Believe Me? Here Is Facebook’s Own Research:

And it’s not that Facebook doesn’t do anything to investigate the Filter Bubble: in 2015, it published a study about the phenomenon – which suggested that it is not as dire as some have feared (or at least it would seem so at first sight). On average, you only find around 6 percent more “hard news” content on Facebook favoring your own political “side” than the other “side”. You can find Eli Pariser’s fun findings on this here and, more importantly, his main arguments against the conclusions here.

But that is only part of the story. If you go into the details of this study, you will find that the problem is very pressing: while the 6% figure may hold true for the average Facebook user, the number is significantly higher the more radical people get. If you’re the kind of guy who will post “Refugees should be shot”, you probably live in a Filter Bubble – which means you are far less likely to receive a counterargument than if you held more liberal views.

The shape of your personal Filter Bubble is also directly related to the amount of training you put into Facebook – the more you click on posts, like posts and promote posts, the more your feed gets filtered. Your bubble gets smaller. If you don’t believe me, read this Wired story from 2014: I Liked Everything On Facebook For 2 Days – Here Is What It Did To Me by Mat Honan… Pretty scary.
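To illustrate that training effect, here is another hypothetical sketch (invented names and numbers again, not Facebook’s real system): every like nudges your interest profile toward the liked topic, and the updated profile weights the next batch of posts – so the feed narrows with every interaction.

```python
import random

def update_interest(interest: dict, liked_topic: str, rate: float = 0.2) -> None:
    """Each like pulls the profile toward the liked topic (simple moving average)."""
    for topic in interest:
        target = 1.0 if topic == liked_topic else 0.0
        interest[topic] += rate * (target - interest[topic])

def next_feed(topics: list, interest: dict, size: int = 5) -> list:
    """Sample the next batch of posts, weighted by the current profile."""
    weights = [interest[t] + 0.01 for t in topics]  # tiny floor so nothing vanishes entirely
    return random.choices(topics, weights=weights, k=size)

topics = ["politics", "sports", "cats", "science"]
interest = {t: 0.25 for t in topics}  # start with a perfectly balanced feed

for _ in range(20):
    feed = next_feed(topics, interest)
    update_interest(interest, liked_topic=feed[0])  # the user likes one post per batch

print({t: round(v, 2) for t, v in interest.items()})
# After a handful of rounds, one topic dominates the profile – the bubble has shrunk.
```

The loop is self-reinforcing: the more one topic dominates the feed, the more likely the next like lands on that topic, which makes it dominate even more. Replace “cats” with a radical political stance and you have the problem this post is about.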

Also – Facebook stressed the fact that your circle of friends already shields you, to a high degree, from content you don’t want to see. But is that good? Or is that part of the problem? Social behaviour means that many people will either hide content from friends with radical or offensive views – or unfriend them. But this creates even more limited filter bubbles. The more people unfriend a radical person, the less likely that person is to receive countering or challenging comments – which means that the share of agreement within their circle of friends keeps rising.

People will also unfriend those who don’t agree with their posts – and I’ve been on the receiving end of this behaviour.

The Real Problem – We’ve Come To Rely On Filtering Too Much

Remember when I said that Facebook is not solely responsible? So who else is responsible – at least to a certain degree?

We are.

Humans want filtered content. We’ve always been that way. If you went to a bookshop in the old days, you asked about the contents to find out whether you were going to enjoy the book. In the modern age, Amazon similarly recommends books and products that you are probably going to enjoy.

The same happens with news, political views and offensive content.

By relying on machines to give us the best content, we rely on mechanical processes. Which means that we ourselves need to apply some human thought and feeling to the selection process. Social media sites and search engines are tools – and tools can always be used for good and bad. As a society, we need to show we are adult enough to use these tools.

This is also not just a Facebook-related problem – personalisation of search results on Google can also lead to Filter Bubbles. Internet-based companies are obsessed with giving you the content you want – not the content you need. If we don’t wake up and demand the content we need, we are not going to get it.

That means we still need to do some filtering ourselves: we need to start reacting to posts that offend. We need to state disagreement. We need to right some wrongs once in a while. As in real life, we are responsible for what we do on social media – and that includes what we don’t do. Whenever we keep quiet, we strengthen the Filter Bubble.

Facebook – and others – adapt to our behavior, not the other way around. We need to show Facebook that we want to interact even with the stuff we would rather not see at all. Because not seeing it is not a solution to the core problem.

That means we cannot keep quiet.

There is that quote by Edmund Burke: “The only thing necessary for the triumph of evil is for good men to do nothing.”

Maybe today it’s more like: … is for good people to stay quiet.
