Beating the filter bubble

Humans love having their opinions confirmed. It makes us feel part of a community, understood, even loved. Combined with digitalisation and the automated filtering of the enormous amount of information available to us online, this instinct becomes a driving force behind what’s referred to as a digital filter bubble.

Information consumption in 2020

If there’s one area where digitalisation and globalisation have made a truly profound impact, it’s in media. Newspapers have gone from analogue to digital. Journalists have gone from local to global. And news consumers have gone from reading a handful of publications to reading thousands. With the Internet, we have access to everything, everywhere. In fact, there is so much information at hand that we’re having a hard time figuring out how we’re going to process it.

The most obvious, and most often cited, effect of this development is a shrinking attention span, which seems to track digitalisation closely. Bombarded with information 24/7, online users have shortened the time they focus on any one thing (reportedly from 12 seconds down to 8 between 2000 and 2013) as a kind of defence mechanism. And when people focus for shorter periods, it has a serious impact on how news is created and distributed. So what do online platforms need to do to make sure people still read what they’re putting out there? Filter it.

What is a filter bubble?

In March 2011, Eli Pariser coined the term ‘filter bubble’ at the TED2011 event. He raised the question of what happens when the news consumer is no longer responsible for the information they consume; when the digital news feed we see isn’t of our own making, but a computer’s, based on personal relevance rather than objective importance. In his talk, Pariser identified algorithms as the main culprit in today’s filter bubble problem.

Algorithms are pieces of computer code put in place on various digital platforms. Their objective is to analyse the behaviour of a website’s visitors, determine which information is most relevant to each individual, and serve up a personalised output accordingly. The problem with this is that we’re no longer in control of which information we see. And perhaps more importantly: a filter bubble also means we don’t know which information we don’t see. The algorithm filters it out for us, leaving us none the wiser.
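To make that mechanism concrete, here is a minimal, purely illustrative sketch in Python of how an engagement-based ranker might work. The item and history data are invented for the example, and this is not the code any real platform runs.

```python
from collections import Counter

def personalised_feed(items, engagement_history, top_n=10):
    """Rank items by how closely their topic matches the topics the
    user has engaged with before; everything else sinks to the bottom."""
    topic_counts = Counter(engagement_history)   # past engagement per topic
    total = sum(topic_counts.values()) or 1

    def score(item):
        # Relevance = share of past engagement devoted to this item's topic.
        return topic_counts.get(item["topic"], 0) / total

    return sorted(items, key=score, reverse=True)[:top_n]

# Hypothetical user who mostly watches animal videos.
history = ["animals"] * 40 + ["recipes"] * 3 + ["politics"] * 2
items = [
    {"id": 1, "topic": "animals"},
    {"id": 2, "topic": "recipes"},
    {"id": 3, "topic": "politics"},
    {"id": 4, "topic": "animals"},
]
print(personalised_feed(items, history, top_n=2))  # only the animal posts surface
```

Note that nothing in the sketch tells the user what was ranked away; the less-engaged topics simply never make the cut.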

Your own subjective information feed

It’s easiest to understand filter bubbles in the context of social media feeds. Most of us have experienced it in practice on platforms like Facebook or Instagram. Let’s say you really like videos of animals. As you browse your social media account, you spend a lot more time looking at this content than you do on, say, recipes. After some time, the platform’s algorithm picks up on your behaviour and uses it to start filtering a personalised output for you.

Fast forward a few weeks, and you’re loving your social media feed. It’s packed with great animal videos that are exactly the kind you like, interspersed with updates from your best and funniest friends. The thing about a filter bubble is that the consumer often doesn’t realise or think about what happened to the content they used to see. In the case of animal videos and recipes, it’s also a fairly harmless phenomenon. But what happens when you only see political posts that support your own view? When you’re only shown the newspaper articles that focus on your three favourite topics – and nothing else? 

Personal filter bubble efforts

How to break the personal filter bubble

Pariser made an important point back in 2011: algorithms certainly contribute to creating filter bubbles. But on its own, that’s an overly simplified explanation. By blaming the computers, and the computers alone, we also remove all personal responsibility for how we consume information online. In reality, you can do a whole lot to free yourself from the digital echo chamber that’s built for you. The first step is recognising that there is no way to consume all the information that’s out there, and that, by extension, you’re probably seeing only a very small part of the bigger whole in your news and information feeds.

One of the best ways to break the filter bubble is to actively seek out source material that’s different from what you’d usually look for. Let’s say you lean towards liberal newspapers when you want to follow political developments. Try having a look at some conservative ones. If you’re following a current conflict, make sure to read reporting from both sides of the story. We are all biased; it’s part of what makes us human. But as humans, we’re also able to recognise this bias and do something about it.

Bigger-scale efforts to pre-empt echo chambers

Algorithms are used practically everywhere on digital platforms, and in recent years the spotlight has turned to what this means for data privacy. If Google, Facebook or a media outlet is tracking your every move, doesn’t that somehow infringe on your right to privacy? The EU’s General Data Protection Regulation came into effect in 2018 as an effort to curb the runaway growth of personal data collection by large tech companies. The idea was to give power back to online users, allowing them to control what information digital platforms can collect about them.

By extension, this actually helps pre-empt digital filter bubbles. When you can turn off personalisation, you give the algorithms less material to work with. It may mean you see a lot of content that feels irrelevant to you. But the point is that you do see it; it isn’t removed from your information feed before you’re even made aware of its existence. International initiatives like this, which question large-scale data collection and its effects, will be important in forestalling the creation of filter bubbles.
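As a rough, hypothetical illustration of that idea (not how any particular platform implements its settings), a feed with personalisation switched off can simply fall back to a neutral, newest-first ordering in which nothing is hidden:

```python
from datetime import datetime

def build_feed(items, topic_counts, personalisation_enabled=True):
    """With personalisation on, rank by past engagement per topic;
    with it off, show everything in a neutral newest-first order."""
    if personalisation_enabled and topic_counts:
        return sorted(items, key=lambda i: topic_counts.get(i["topic"], 0), reverse=True)
    # Nothing is filtered out before the user gets a chance to see it.
    return sorted(items, key=lambda i: i["published"], reverse=True)

items = [
    {"id": 1, "topic": "animals",  "published": datetime(2020, 5, 2)},
    {"id": 2, "topic": "politics", "published": datetime(2020, 5, 3)},
]
print(build_feed(items, {"animals": 40}, personalisation_enabled=False))
```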

Ansofy’s tools for making a difference

Another approach to actively preventing echo chambers is source- and fact-checking. The Ansofy news feeds, generated by sophisticated AI technology, do exactly that. The AI journalist compiles the facts, and only the facts, from numerous sources reporting on a specific event or topic. As the reportage grows, the AI adds to the articles, giving readers an increasingly comprehensive perspective on events around the globe.

Furthermore, Ansofy gives its users freedom of choice. You can build your news feeds from an incredible range of publications. In this manner, the app becomes a tool you can use to actively break down your own filter bubble. Dare to challenge your views by subscribing to material that doesn’t necessarily agree with your own opinions. The more you understand, the more objective you can be. And objectivity and self-insight are qualities we certainly need more of in today’s media sphere. 
