
Why Do Facebook And Google Show Us What They Show Us?

(FILES) This February 25, 2013 photo taken in Washington, DC, shows the splash page for the Internet social media giant Facebook. Those Facebook 'likes' can reveal a lot more than you think. Research released March 11, 2013 shows patterns from these Facebook preferences can provide surprisingly accurate estimates of the user's race, age, IQ, sexuality and other personal information. The researchers developed an algorithm which uses Facebook likes -- which are publicly available unless a user chooses stronger privacy settings -- to create personality profiles, potentially revealing a user's intimate details. These mathematical models proved 88 percent accurate for differentiating males from females and 95 percent accurate distinguishing African-Americans from whites. The algorithms were also able to extrapolate information such as sexual orientation, whether the user was a substance abuser, or even whether their parents had separated. (Photo credit should read KAREN BLEIER/AFP/Getty Images)
KAREN BLEIER via Getty Images
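(The caption above mentions models that turn public "likes" into personality profiles. For the curious, the general technique is easy to sketch: below is a toy logistic regression over a fabricated user-by-like matrix. It is emphatically not the researchers' actual model; every name and number is invented for illustration.)

```python
# Toy illustration of predicting a binary trait from which pages a
# user has "liked". NOT the published researchers' model -- just a
# minimal logistic-regression sketch over made-up data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

n_users, n_likes = 1000, 50
X = rng.integers(0, 2, size=(n_users, n_likes))  # 1 = user liked page j
# Fabricated ground truth: the trait loosely correlates with a handful of likes.
signal = X[:, :5].sum(axis=1)
y = (signal + rng.normal(0, 1, n_users) > 2.5).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")
```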

What we have here are two seemingly disparate stories involving two of the most dominant content distributors on the planet.

In one, our friend Mathew Ingram discusses Mark Zuckerberg's video initiative, which, he says, undercuts the Faceborg supremo's insistence that he's not running a media company. In a subsequent post, he highlights Faceborg's ostensible commitment to fighting fake news.

It's not hard to see why Zuckerberg resisted the characterization as long and as hard as he did. We've talked about that previously ourselves. And as Ingram points out:

Facebook likes things that are neat and tidy, like algorithms -- not things that are all muddy and gray and complicated, like defining what constitutes fake news.

Well, we've all seen how effective the algorithms are at distinguishing genuine, authentic content from bullshit. And we've already talked about how those algorithms are shaped by your ultimate goal: do you want engagement, or do you want veracity? Do you want to be clicky, or do you want to be authentic? Can't always have both.

And which one you prioritize is going to determine what floats to the top of your menu.
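To make that concrete, here's a toy ranker -- invented stories, invented scores -- where a single veracity weight decides whether the clicky item or the credible one tops the feed. It's a sketch of the trade-off, not of Facebook's actual News Feed algorithm.

```python
# Toy feed ranker: each story has an engagement score (clicks, shares)
# and a veracity score (how well it survives fact-checking). One weight
# decides which signal "floats to the top". All names and numbers are
# invented for illustration.

stories = [
    {"title": "Shocking claim goes viral", "engagement": 0.95, "veracity": 0.10},
    {"title": "Careful investigative report", "engagement": 0.40, "veracity": 0.95},
    {"title": "Celebrity gossip roundup", "engagement": 0.80, "veracity": 0.50},
]

def rank(stories, veracity_weight):
    """Blend the two signals; weight 0 is pure clickiness, 1 is pure truth."""
    def score(s):
        return (1 - veracity_weight) * s["engagement"] + veracity_weight * s["veracity"]
    return sorted(stories, key=score, reverse=True)

for w in (0.0, 1.0):
    top = rank(stories, veracity_weight=w)[0]["title"]
    print(f"veracity_weight={w}: top story is {top!r}")
```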

There's no great insight in observing that this is going to get a lot messier before it gets any neater. The accusations of bias, censorship, lack of transparency, and hidden agendas are going to be deafening, and they're going to be coming from all sides. The language is going to be heated and ugly. If there's any small comfort to be drawn from this, and it's a big "if," it'll be in Facebook's acceptance of responsibility for the content it serves up.

(In any event, it might all be academic anyway. As our friend Jonathan Albright argues, fake news is soon to become the least of our problems.)

The second is a disturbing piece in the Guardian by Carole Cadwalladr. When she began typing a Holocaust-related query into Google, the search bar auto-completed it to "did the Holocaust happen?"
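Why would a search bar volunteer that question? A large part of the answer is that autocomplete systems lean heavily on query popularity. Here's a minimal sketch -- a made-up query log and a frequency-based completer, nothing like Google's real system -- showing how a popularity signal, on its own, has no notion of truth or decency:

```python
# Minimal sketch of frequency-based autocomplete: suggest the logged
# queries that most often extend what the user has typed so far.
# The query log and counts below are entirely fabricated.
from collections import Counter

query_log = Counter({
    "did the Holocaust happen": 5000,
    "did the titanic really sink": 1200,
    "did the dinosaurs have feathers": 300,
})

def autocomplete(prefix, log, k=3):
    """Return the k most frequent logged queries starting with the prefix."""
    matches = [(q, n) for q, n in log.items() if q.startswith(prefix)]
    return [q for q, _ in sorted(matches, key=lambda qn: -qn[1])[:k]]

print(autocomplete("did the", query_log))
```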

And there, at the top of the list, was a link to Stormfront, a neo-Nazi white supremacist website, and its article entitled "Top 10 reasons why the Holocaust didn't happen".

She then recounts Google's insistence that it would not rewrite its search algorithm* or remove the results, despite its declaration that it did not endorse those views. Eventually Cadwalladr did an end run around the organic search results by buying a paid Google ad that bumped Wikipedia's entry about the Holocaust to the top of the page. For now, at least.

The rest of the piece examines how and why such a self-evidently repugnant outcome becomes possible -- not so much why Google won't edit the results, but why Stormfront ranks so highly. Unsurprisingly, it comes down to money:

" ... empirically speaking, people tend to treat Google like an authority. So this is an appalling shirking of responsibility. It's about money. It always is. The commercial imperative trumps all other aims at the company, including moral ones."

Why this content and not that?

So, a few revealing insights into what motivates two of the most powerful content platforms on the planet. These entities control what we see, what we read, what we're exposed to, and what we consume; between them, they control the vast majority of the information available to us. If they don't want to show us something, chances are we're not going to see it.

What lessons do we draw from this? Once again: the importance of critical thinking. Why is Facebook serving up this story and burying that one? Why is Google ranking this at the top of its search results, and not that? What are we not seeing here? Why is our attention being directed to this thing at this time? There's no need to go full-bore conspiracy theory here -- just a healthy skepticism and willingness to do the work.

*In the spirit of disclosure, there are times when one doesn't necessarily want Google to rewrite its algorithms.
