After years of optimizing its products for engagement, no matter the costs, Facebook announced today it will “test” changes to its News Feed focused on reducing the distribution of political content. The company qualified that these tests will be temporary, will impact only a small percentage of people, and will run only in select markets, including the U.S., Canada, Brazil, and Indonesia.
The point of the experiments, Facebook says, is to explore a variety of ways it can rank political content in the News Feed using different signals, in order to decide on what approach it may take in the future.
It also notes that COVID-19 information from authoritative health organizations like the CDC and WHO, as well as national and regional health agencies and services, will be exempt from being downranked in the News Feed during these tests. Similarly, content from official government agencies will not be impacted.
The tests may also include a survey component, where Facebook asks impacted users about their experience.
Facebook’s announcement of the tests is deliberately understated, because any large-scale change would amount to an admission of guilt, of sorts. Facebook has the capacity to make far greater changes — when it wanted to downrank publisher content, it did so, decimating a number of media businesses along the way. In previous years, it also took harder action against low-quality sites, scrapers, clickbait, spam, and more.
The news of Facebook’s tests comes at a time when people are questioning social media’s influence and direction. A growing number of social media users now believe tech platforms have been playing a role in radicalizing people, as their algorithms promote unbalanced views of the world, isolate people into social media bubbles, and allow dangerous speech and misinformation to go viral.
In a poll this week, reported by Axios, a majority of Americans said they now believe social media radicalizes, with 74% also saying misinformation is an extremely or very serious problem. Another 76% believe social media was at least partially responsible for the Capitol riot, and 7 in 10 think the riot was the result of unchecked extreme behavior online, the report noted.
Meanwhile, a third of Americans regularly get their news from Facebook, according to a study from Pew Research Center — which means many are now encountering more extreme viewpoints from fringe publishers, a related Pew study found.
Elsewhere in the world, Facebook has been accused of exacerbating political unrest, including the deadly riots in Indonesia, genocide in Myanmar, the spread of misinformation in Brazil during elections, and more.
Facebook, however, today argues that political content makes up only a small portion of the News Feed (e.g. 6% of what people in the U.S. see) — an attempt to deflect any blame for the state of the world, while positioning the downranking change as simply something user feedback demands that Facebook explore.