Why We Self-Censor When We Read Online

Getting information off the Internet is like taking a drink from a fire hydrant. There is so much to read, watch, post, tweet -- a person can easily be overcome by facts and figures, making it incredibly difficult to fully comprehend an issue or arrive at a truly informed decision.

Rather paradoxically, this abundance of information, a.k.a. infobesity, can end up costing more productivity than it creates. Any regular, honest Internet user can attest to this. Check Gmail, update LinkedIn, troll Reddit, upload to Flickr, pin to Pinterest, browse the HuffPost -- I could go on and on, but this clip sums it up nicely.

Media theorist Marshall McLuhan predicted the suffocating effects of information overload back in the 1960s. He argued that media are far from passive channels of information: the medium we use to engage with content -- in this case the Internet -- actively determines how we interact with that content.

In short, the Internet not only changes what media we view, but how we view it. The "how": short blocks of conversational text injected with hyperlinks, blinking ads, infographics, pop-ups, videos, and sound bites. Not only is the amount of content available online mentally stifling -- an estimated 14.18 billion pages as of January 3 -- but the process by which we are relentlessly bombarded by this stifling content is over-stimulating in itself.

In order to avoid being overwhelmed by informational anxiety as so much data prattles uncontrollably at us from our screens, we have subconsciously become our own content editors and censorship committees, determining for ourselves which sites are worth frequenting and which ones are not, what content is good and what content is bad.

These forms of self-censorship are, perhaps justifiably, a defence mechanism, a barrier between our brains and the vast recesses of cyberspace. Yet there are many negative repercussions when users engage in self-censoring practices on the Web.

First off, a problem created by the lack of any approving authority online: we, the self-appointed editors-in-chief of our cyberspace adventures, still tend to believe far too much of what we read on the Internet, merely because our brains are lazy.

According to researchers at the University of Western Australia, this laziness is due in part to the fact that weighing the source and credibility of a message is cognitively more difficult than merely accepting the message as true -- it requires additional motivation and cognitive resources that many readers subconsciously decide not to employ.

Thus, when met with the option of fact-checking statements or figures, our lethargic brains prefer to make a snap decision. If the argument does a good job of convincing us, we tend to believe it. If not, we tend to reject it. "Obama is a Muslim," "global warming is a hoax" -- last year's U.S. Presidential election was rife with such unfounded allegations, many of which continue to stick thanks to our intuitive disdain for factual scrutiny.

But how does an argument do a good job of convincing us in the first place? What's more, why does the persuasiveness of an argument have much more to do with our preconceived notions than it does with some lofty concept of truthfulness?

Well, for starters, every person's cognitive reasoning is plagued by something psychologists refer to as confirmation bias, meaning that we only trust an expert, a data point, a study, or an opinion if its conclusions are in line with our previously held beliefs. Hence, we tend to spend our time online skimming articles, blog posts, and websites that confirm our own ideals, accepting comfortable arguments with little or no scrutiny because we want to believe them.

In doing so, we censor ourselves from a whole breadth of stimulating and informative content online because it may not tell us exactly what we want to hear, opting instead to have our opinions regurgitated back to us by like-minded individuals in media safe spaces that reinforce the validity of our personal ideologies.

Take a hypothetical labour activist and an investment banker. The activist is about as unlikely to stray from the angsty embrace of the message boards at rabble.ca as the banker is from the monetarist cliques in the comments section of the Financial Post. And the longer both parties refuse to engage with opposing sources, the more confident they become in their one-sided world-views.

Eventually, these one-sided world-views can become so interwoven with the underpinnings of the activist's or the banker's self-image that they reach a point where they are afraid to scrutinize their own beliefs, because doing so would mean questioning the core foundations upon which they have based their lives. Once it has taken hold, this transformation from opinion to dogma is hard to reverse.

So how do we get around the cognitive laziness and the confirmation biases that hinder our experiences online in order to better challenge our accepted beliefs this year?

We need to read with more self-awareness. This starts by familiarizing yourself with your personal biases and seeking evidence contrary to every opinion -- especially the ones you have adopted as your own. A well-formulated guide to online research, such as the one created by Lifehacker, should help get you started.

Moreover, it is important to remember that everyone has the same problems with confirmation bias and cognitive laziness that you do. This means that whenever you consider someone else's opinions, you must simultaneously consider their biases. At the risk of sounding cliché, be sure to put yourself in the advice-giver's shoes, aware that a person's history, personality, and experiences weigh heavily on whatever information they're imparting to you.

After all, as many of us have experienced first-hand, the Internet can be a great tool of knowledge, allowing a curious citizen, researcher, student, journalist, or this Huffington Post blogger to discover in hours, from the comfort of their own home, what used to take days buried in paperwork down at the city archives or university library.

Yet with great access to information comes great cognitive responsibility. And it is up to us, as our own personal editors, to fight self-censorship in all its forms by always pushing ourselves to seek out evidence contrary to what we believe. There are enough people out there attempting to censor online activity already; let's try not to make their jobs any easier.
