
Are You Living in a Filter Bubble? Here Is the Test

Photo by Clem Onojeghuo on Unsplash

I live in a filter bubble.
That doesn’t surprise me.

Many people don’t know what to make of the term; it gets thrown around far too often. A filter bubble is, at its core, a digital space where unpleasant opinions, facts, and ideas are filtered out.

All that remains is what you already agree with.

A filter bubble can arise anywhere: on YouTube, on Facebook, on Twitter. Even your circle of friends can become one.
On the Internet, it’s possible thanks to algorithms.

Digital platforms identify what you like — and show you more of what you might like.

To be honest, it seems like I live in such a filter bubble.
I noticed it through the little test below.

Do you talk about groups you don’t know personally?

Never have words spread so quickly as they do today.

We speak of complex groups in one word. We say “feminists,” “liberals,” “Nazis,” “the rich,” and so on — of course, that’s a bit superficial.

I always notice a funny thing about this: only those who speak negatively about a group use the labels I just mentioned. The conservatives or feminists inside their own filter bubble don’t even have to call themselves that; their affiliation is clear to everyone there.

Conversely, conservatives talk about “the feminists,” and feminists talk about “the conservatives.”

I also like to talk superficially about whole groups. Often I don’t know a single person who belongs to the group. I only judge based on what I know from the filter bubble.

This creates a simplified image of huge, complex groups, and sometimes it stokes hatred.

If hearing one of these labels instantly conjures up a negative, homogeneous group in your mind, you’re probably stuck in a filter bubble.

Name three political influencers.

Many of our political ideas stem from individuals—politicians, authors, speakers, presenters, etc. The political landscape is vast, with many prominent names on all political sides.

Thanks to social media, we can connect with them everywhere. Short messages on Twitter, videos on YouTube, or whole discussions in the form of podcasts.

This is where the algorithms come into play again.
Once we find people we like, we will see them again and again.
Suddenly, our only source of information is a small circle of people.

When I name three political influencers, I notice one thing: they almost all share the same opinions. A telling sign that I live in a filter bubble.

Have you ever shared fake news?

The filter bubble not only ensures that we see the same ideas over and over again. It also ensures that we are presented with supposed facts without critical examination.

Time and again, I notice the same striking pattern: in one political camp, a “fact” makes the rounds.

Immediately, the opposing parties jump on it and try to refute it.
Often they succeed — what is circulating is fake news.

Nevertheless, this false information continues to spread on the Internet for days or even weeks — at least in the one-sided spaces on the Internet.

It seems that each of the parties has its own Internet.

I have shared ridiculously false information myself.
This happened because I was trapped in my filter bubble and believed what I kept seeing. A cognitive bias is probably responsible for this.

The culprit is the repetition bias¹: we are more likely to believe what we hear repeatedly, especially when it seems to come from different sources.

The problem: in the age of fast information, what looks like many sources is often just one. A single newspaper article gets thousands of people tweeting, yet only one real source stands behind it.

If you’ve ever spread false information, that’s a strong sign that you’re living in a filter bubble.

The way out of the filter bubble

You may have just realized that you’re living in a filter bubble.
The good news is that there is a way out.

Pull media instead of push media.

The kind of digital media we use most is push media: apps that show us tailored feeds and notifications. They thrive on delivering what they think we should see, even when we weren’t looking for it. Twitter and YouTube are examples of this; they suggest content according to their algorithms.

When it comes to forming opinions, we shouldn’t leave anything to the algorithms. I’ve gotten out of the habit of checking my Twitter feed. If I want to find out something about a topic, I search for it myself and try to look at several different media sites.

The opposite side.

I have tried an exciting experiment.

On YouTube, Twitter, Instagram, and elsewhere, I have subscribed to content creators whose opinions I don’t like. Now, content I disagree with is suggested to me.

I noticed that my view is often too one-sided. The other side also has good arguments.

By adding a bit of the opposing side to your preferences, you confuse the algorithms. The chances are now good that you’ll get a more balanced picture.

Think, research, share.

The filter bubble has often left me unable to believe my eyes.

Intelligent people have shared things that are moronic and plainly wrong. Statements, graphics, and photos get passed on blindly, without context and without checking whether they are correct.

It could all be so simple.
Thank God there are many fact-checkers on the Internet. If you find something provocative, google it. Often you’ll discover that what you saw is complete nonsense.

Sources

[1]: https://psychology.wikia.org/wiki/Repetition_bias

