Virginia Review of Politics

Facebook is Failing the World—And They Know It

Photo by Book Catalog, licensed under CC BY 2.0.

Everybody knows Facebook. The social media platform, owned by what is now known as Meta Platforms, has over 2.9 billion monthly active users and raked in $85.9 billion in revenue in 2020. And Facebook is only continuing to grow. Mark Zuckerberg, co-founder and CEO of Meta Platforms, Inc., has assured users and governments that Facebook will be an agent for positive change in the world. Facebook's mission is "to give people the power to build community and bring the world closer together." In reality, Facebook has done the exact opposite.

In early October 2021, Frances Haugen, a former product manager in Facebook's Civic Integrity Department, leaked thousands of internal documents showing Facebook's negligence toward the problems it has created. Haugen worked with Jeff Horwitz of The Wall Street Journal to expose the findings from these documents to the public. The series of articles that followed is known as The Facebook Files. The files reveal, among many other things, that Facebook's algorithm promotes radical content and rewards misinformation, toxicity, and hate speech. They also reveal that Facebook lacks the technology and human resources to regulate its platform. And the most shocking revelation of all: Facebook knows about all of these problems in sharp detail.

Facebook's algorithm floods people's feeds with radical content. Internal memos at Facebook admit, "Misinformation, toxicity, and violent content are inordinately prevalent among reshares." Facebook developed a point system to measure a post's success and, thus, how often it will appear in other people's feeds. The algorithm awards one point for a like; five points for a reaction, a reshare without text, or a reply to an invite; and 30 points for a significant comment, message, reshare with text, or RSVP. Under this system, the more outrageous a post is, the further it spreads, regardless of its content. Moreover, Facebook's Civic Integrity Department concluded that a story with negative comments will be shared more than a story with positive comments. Political parties have learned to abuse this system to gain attention. Two internal researchers at Facebook wrote that "One party's social media management team estimates that they have shifted the proportion of their posts from 50/50 positive/negative to 80% negative, explicitly as a function of the change to the algorithm." The result is that people consistently see negative political content that drives them toward more polarized views.
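To make the weighting concrete, here is a minimal sketch of how a post's score under the point system described above might be computed. Only the point values come from the reporting; the function name, event labels, and data structure are hypothetical illustrations.

```python
# Hypothetical sketch of the engagement point system described in
# The Facebook Files. Only the point values are from the reporting;
# the event names and structure are illustrative assumptions.

POINTS = {
    "like": 1,
    "reaction": 5,
    "reshare_without_text": 5,
    "invite_reply": 5,
    "significant_comment": 30,
    "message": 30,
    "reshare_with_text": 30,
    "rsvp": 30,
}

def engagement_score(events):
    """Sum the point values for a post's engagement events."""
    return sum(POINTS.get(event, 0) for event in events)

# An outrage-bait post with a handful of heated comments and reshares
# outscores a post with far more quiet likes.
print(engagement_score(["like"] * 100))                  # 100
print(engagement_score(["significant_comment"] * 3 +
                       ["reshare_with_text"] * 2))       # 150
```

Under these weights, a post that provokes three angry comments and two reshares outranks one that a hundred people merely liked, which is exactly the dynamic the internal researchers describe.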

To test this, a Facebook researcher made a fake account under the alias Carol Smith. Smith was a conservative mom from North Carolina whose interests included "young children, parenting, Christianity, Civics and Community." By the second day, Facebook was recommending almost exclusively right-wing content to Smith. By day five, Smith was being presented with QAnon content, including videos promoting hate groups and false claims of white genocide. The experiment demonstrated the speed at which Facebook guides users toward the most radical content possible. The effect is that Americans are pushed further to the left or the right while the middle ground quickly disappears.

In addition to radical and polarizing content, Facebook's algorithm is remarkably effective at distributing misinformation. For example, misinformation about the safety of vaccinations ran rampant across Facebook. In March 2021, Zuckerberg revealed Facebook's plan to help get people vaccinated by connecting users to trustworthy information on the safety and availability of vaccines. Despite this proclaimed plan, Facebook's inability to regulate itself turned the platform into a cesspool of misinformation and anti-vaccine content.

A study conducted by Avaaz, an advocacy group, determined that the top ten accounts producing health misinformation garnered four times as many views as the top ten sources of authoritative health information. A Facebook employee randomly sampled comments related to COVID-19 and vaccines and determined that two-thirds were "anti-vax." In reality, only about a quarter of people hold anti-vaccination views, meaning that anti-vaccination content is vastly overrepresented on Facebook relative to public opinion. Much of this content is untrue, unverified, or conspiratorial.

As bad as these problems are within the United States, they are even worse abroad. Even though more than 90 percent of Facebook users reside outside the United States and Canada, only 13 percent of Facebook's budget for classifying misinformation is spent outside the United States. Facebook does not even make a serious attempt to regulate content posted internationally. Documents show that, in some countries with millions of users, Facebook has few or no employees who speak the dialects needed to identify dangerous or criminal uses of the platform.

In Afghanistan, for example, a Facebook employee estimated that the company took action on just 0.23 percent of hate speech posts in the country. This inability to regulate has helped Facebook become a hub for groups to organize crime and terror. In one case, Facebook discovered an extensive sex trafficking operation that used the platform to lure in women from Thailand and other countries. The internal investigation revealed that these women were "held captive, denied access to food and forced to perform sex acts in Dubai massage parlors." Facebook is a bystander to organized crime worldwide, failing to stop the abuse of its platform.

Facebook has also kept politics at the forefront of its enforcement decisions. In June 2020, some Facebook employees argued that Breitbart should be removed from the "News Tab" because of its role in a "concerted effort at Breitbart and similarly hyperpartisan sources (none of which belong in News Tab) to paint Black Americans and Black-led movements in a very negative way." After all, a Facebook study had determined that Breitbart was the least trusted news source in both the United States and Great Britain. Despite these concerns, other employees were hesitant to remove Breitbart from the News Tab because of political blowback. One employee explained, "We're scared of political backlash if we enforce our policies without exemptions." Ultimately, Breitbart remained on the News Tab, primarily because of Facebook's fear of being called biased by former President Donald Trump and other prominent conservatives.

Just weeks later, in July 2020, Trump reposted a Breitbart video claiming that masks were unnecessary and that hydroxychloroquine was a cure for COVID-19. After the video circulated to millions of viewers, Facebook and other social media sites eventually removed it. However, Facebook did not punish Breitbart for the infraction.

In the end, Facebook is tearing down democracy. It has knowingly driven people apart and made it harder for anyone to agree on anything, including basic facts. Misinformation and hate speech run rampant on the platform, and Facebook lacks the capacity to stop them. Since Facebook continually fails to self-regulate, individual governments should force it to leave their countries if it cannot provide the resources to properly monitor its content. Facebook must also abandon engagement-based ranking in favor of feeds ordered by recency, as sketched below. And Haugen suggested that placing public officials inside Facebook to hold the company accountable would bring more transparency and urgency to fixing these problems.
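As a rough illustration of what that change would mean in practice, the sketch below contrasts the two ranking approaches. The post structure and function names are hypothetical; the point is simply that a chronological feed ignores engagement signals entirely, removing the reward for outrage.

```python
# Hypothetical sketch contrasting engagement-based ranking with the
# recency-based ordering Haugen and others have advocated.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    timestamp: float         # Unix time when the post was created
    engagement_score: float  # e.g., the weighted points shown earlier

def rank_by_engagement(posts):
    """Current model: the most provocative posts rise to the top."""
    return sorted(posts, key=lambda p: p.engagement_score, reverse=True)

def rank_by_recency(posts):
    """Proposed model: newest first, with no reward for outrage."""
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)
```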

If Mark Zuckerberg is sincere in his goal of connecting the world and effecting positive change, Facebook must reinvent itself. If he is not, and is content to keep raking in profits while his platform tears us apart, governments and individuals must pressure the company to change the way it works. Until sweeping revisions are made, Facebook will continue to watch itself fail the world in devastating fashion.