What we’re talking about here is that little box in the upper right of your Facebook page — the short list of news topics that are being discussed on Facebook at the moment. They’re clearly tailored to the user. Because Facebook has one-sixth of the world using it every day, pretty much everything is being talked about to some extent. The company uses an automatic system (an algorithm) to surface what’s currently popular, and a team of staffers then further curates the list so that it meets particular standards.

And there’s the problem. Gizmodo quotes several former curators suggesting that conservative news stories would be booted from the automatically generated list of trending stories for two reasons. One was if the story came from a conservative-leaning site, such as Breitbart.com or Newsmax.com, in which case curators were told to find the same story on a mainstream media site, if possible. The other was if the curator didn’t want to include the story or didn’t recognize the story as important. It’s hard to know the extent to which the latter judgments took place, but one of the former curators — a conservative — told Gizmodo, “I believe it had a chilling effect on conservative news.”

That’s problematic, for obvious reasons. (Gizmodo notes that it’s not clear whether this is still happening, because the trending news algorithm is constantly being tweaked, and that it’s not clear whether liberal news was similarly affected.) The bigger question is the extent to which Facebook overlays another filter on top of what you see — and the extent to which that can influence political decisions.

We already knew (even if we sometimes forget) that there are a lot of layers of filtration that occur before you see anything on Facebook. There’s the filtering that you yourself do, picking friends, clicking links, posting stuff. There’s the main Facebook algorithm that puts things in your feed. That’s based in large part on what you tell the system you like. Two years ago, journalist Mat Honan clicked “like” on everything in his feed, in effect telling Facebook that he liked it all. Within 48 hours, his feed was a garbage dump. His human curation had failed.

So this manipulation of the trending news is another layer. But it’s significant in part because it’s the most obvious manifestation of what Facebook wants you to see. Facebook slips ads into your feed and highlights some posts over others, but the trending news is Facebook itself sharing content with you. And as Gizmodo reports, its employees are deliberate in doing so. For example:

In other instances, curators would inject a story – even if it wasn’t being widely discussed on Facebook – because it was deemed important for making the network look like a place where people talked about hard news. “People stopped caring about Syria,” one former curator said. “(And) if it wasn’t trending on Facebook, it would make Facebook look bad.”


Facebook was also criticized for not having a trending topic on the Black Lives Matter movement, one former curator claimed. So they “injected” it into the feed. “This particular injection is especially noteworthy because the #BlackLivesMatter movement originated on Facebook, and the ensuing media coverage of the movement often noted its powerful social media presence,” Gizmodo’s Michael Nuñez writes. Black Lives Matter existed without Facebook, but this injection could only have helped.

In April, Nuñez reported that Facebook employees were advocating for chief executive Mark Zuckerberg to explain during a company meeting what responsibility Facebook had to block Donald Trump’s candidacy. (The question doesn’t appear to have been answered.) If it wanted to block Trump from appearing on the site, an expert told Nuñez, it was within its legal rights to do so, just as it can block other forms of content. The report resulted in assurances from the company that it would never interfere with people’s voting choices. “We as a company are neutral,” a spokesman told The Hill. “We have not and will not use our products in a way that attempts to influence how people vote.”

Any news organization, including The Washington Post, is subject to bias introduced by the people who work for it. Hand-tailoring what the trending-news algorithm spits out introduces bias (not that the algorithm itself is without any bias, given that it, too, is cobbled together by humans). But that bias affects an audience of a size that The Post could only dream about.

This is a company that wants to create a system to bring the Internet to the entire world — so that the entire world can use Facebook. It’s a company whose chief executive, Zuckerberg, led a recent effort to reform immigration policies in the United States. If Facebook wanted to, it could put a message in support of immigration at the top of every user’s news feed, completely legally — though risking huge backlash.

Or it could use its influence more quietly. In 2010, Facebook conducted a social experiment, introducing a tool letting people tell friends when they’d voted in that year’s elections. People who saw that message were 0.4 percent more likely to vote — resulting in an estimated 300,000 more people getting to the polls. This prompted a lot of questions about how Facebook could influence turnout, either at its own whim or as a product offered to political campaigns.

That’s the issue at the heart of the question over what Facebook is suppressing or promoting. This is a media company at a scale that’s without precedent in the world. Nearly three-quarters of American adults who use the Internet use Facebook. And those adults didn’t see stories about political topics in their trending news feeds because a human who works at Facebook decided not to show them.
