
Lessons from the Facebook Files: PR Fire and Political Repression

Addressing the Harvard Institute of Politics’ JFK Jr. Forum, the Wall Street Journal’s Jeff Horwitz observed, “Facebook is deeply partisan … in favor of Facebook.”
In September of this year, Horwitz spearheaded the Facebook Files, the Wall Street Journal’s investigation into leaked internal documents provided by Facebook whistleblower Frances Haugen. A former Facebook data scientist and Harvard Business School graduate, Haugen disclosed internal documents to the WSJ and the US Securities and Exchange Commission detailing Facebook’s failures on free speech, mental health, political violence, and quality control.
This exposé substantiates growing concern over Facebook’s lack of data transparency and the rise of political extremism on its platforms. Haugen, Horwitz, and other digital activists have focused especially on Facebook’s purported lack of agency over its black-boxed network regulations. These issues pervade its platforms with real-life consequences that management has, frustratingly, disregarded.
The Facebook PR fire has continued to rage, with politics at the epicenter of corporate decision-making in a company that aspires to ‘bring the world closer together.’ As Facebook becomes Meta, its roots in social responsibility and governance extend further still. At this watershed, we must reflect on our own role in digital accountability and set precedent for a new social world order.
Information Mismanagement with Real-Life Consequences
To give context to Facebook’s pressing issues, it is important to consider its historical precedents. In 2016, the company was accused of amplifying Russian interference in the US presidential election. In 2018, it was revealed that the data of over 87 million profiles had been harvested without user consent. Since then, the PR fire has continued to blaze, fanned by a climate of growing political polarization.
The first major issue concerns the international spread of political violence built on Facebook’s artificial flow of information. India, currently Facebook’s largest market, has been swept by Hindu nationalist zeal, with persecution and outright political violence originating on the platform. According to Facebook’s own integrity researchers, management recognizes that its services have been used to amplify hate speech and radical agendas. Nevertheless, it has tried to sweep this under the rug, rebranding the societal violence team, or the “genocide team” as Horwitz calls it, as the social cohesion team and reducing human verification checks.
About 5 out of every 10,000 content views contain hate speech. Facebook’s solution has focused increasingly on AI filtering mechanisms. In 2019, Facebook drastically reduced the number of human hours spent reviewing hate-speech complaints, placing greater emphasis on AI enforcement. Zuckerberg notably claimed at a Senate committee hearing in 2018 that, “over the long term, building AI tools is going to be the scalable way to identify and root out most of this harmful content.”
Nevertheless, this March, an internal review by Facebook’s own employees estimated that its AI systems effectively removed only 3 to 5 percent of all hate speech on the platform. Despite the rhetoric of “bringing in AI overlords,” the move to AI filtering appears to have had a more economic than moral impetus: over the past two years, scaling back human content review operations cut expenditures by 15%.
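To see how automated enforcement can remove so little, consider a minimal sketch of threshold-based moderation; the classifier scores and thresholds below are illustrative assumptions, not Facebook’s actual system. Only posts the model is nearly certain about are removed automatically, while the gray zone depends on shrinking human review hours.

```python
def moderate(posts, auto_remove_at=0.95, review_at=0.80):
    """Auto-remove only high-confidence hits; queue the gray zone for humans."""
    removed, review_queue = [], []
    for text, score in posts:          # score: model's hate-speech probability
        if score >= auto_remove_at:
            removed.append(text)
        elif score >= review_at:
            review_queue.append(text)  # fewer human hours -> growing backlog
    return removed, review_queue

# Ten posts, all assumed to be genuine hate speech, with plausible model scores:
posts = [(f"post{i}", s) for i, s in enumerate(
    [0.97, 0.91, 0.88, 0.85, 0.81, 0.76, 0.70, 0.62, 0.55, 0.40])]
removed, queued = moderate(posts)
print(f"{len(removed)} of {len(posts)} auto-removed")  # 1 of 10 auto-removed
```

A cautious threshold keeps false positives down, but it also means most genuine hate speech is never auto-removed, which is consistent with takedown rates in the single digits.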
In more extreme cases, errors in content filtering have permitted illicit trafficking, including indentured servitude. Internal documents show that the integrity team was aware of indentured servitude on Facebook as far back as 2019, yet only under pressure from Apple did Facebook begin to crack down on the abuse. Reaching for convenient fixes to persistent issues has been a pattern: following the 2020 election, Facebook disbanded the Civic Integrity team that monitored election interference and political extremism.
Hate speech has plagued Facebook since its inception. In Zuckerberg’s early days at Harvard building Facemash, he was widely criticized for exacerbating cyberbullying among its users. The irony is that Facebook and Instagram currently profit from a concealed yet similar premise, distorting body image and promoting harmful echo chambers. Haugen characterizes this as profiting from the “addict’s narrative.”
Externalities Left Unchecked
This neglect and lack of self-awareness characterize Facebook’s continued mismanagement. While the company regularly conducts internal diligence and submits to purported third-party investigations, management rarely turns the findings into progressive measures. Amid growing bureaucratic red tape, accountability has been a key failing of the Zuckerberg-led team.
Facebook’s self-regulation and recommendation systems are especially messy. During the 2020 election, low-quality content was topping the charts, driving massive misinformation. It turned out that a system meant to demote such content had accidentally been promoting fake news. Horwitz notes that Facebook’s development teams often lose track of the scores such recommendation systems assign; the sketch below shows how easily a demotion signal can invert.
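The following is a hedged illustration, not Facebook’s code: a ranking function in which a misapplied demotion factor boosts exactly the posts it was meant to bury. The function names and weights are assumptions.

```python
def rank_score(engagement: float, misinfo_prob: float, buggy: bool = False) -> float:
    """Scale engagement down by a demotion factor for likely misinformation."""
    if buggy:
        # Inverted penalty: higher misinformation probability RAISES the score.
        return engagement * (1.0 + misinfo_prob)
    return engagement * (1.0 - misinfo_prob)   # intended demotion

posts = [("credible", 100.0, 0.05), ("fake", 100.0, 0.90)]

for flag in (False, True):
    ranked = sorted(posts, key=lambda p: rank_score(p[1], p[2], flag), reverse=True)
    print("buggy" if flag else "intended", [name for name, *_ in ranked])
# intended ['credible', 'fake']
# buggy    ['fake', 'credible']
```

The bug is a one-character class of error, yet without monitoring the scores themselves, the feed looks healthy while fake news climbs the rankings.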
One especially criticized platform filter is XCheck, also known as cross-check, Facebook’s quality-control program for VIP accounts. Through a tiered system, it shields VIPs from normal enforcement of community guidelines, with some tiers exempting users from the guidelines entirely. This, in turn, has raised doubts about the legitimacy and fairness of speech on the platform, as well as concerns about selective bias and echo chambers.
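As a rough mental model, cross-check can be read as a tier lookup in front of the normal enforcement path; the tier names and routing below are assumptions made for illustration, based on the behavior the Facebook Files describe.

```python
from enum import Enum

class Tier(Enum):
    ORDINARY = "ordinary"   # full automated enforcement
    DEFERRED = "deferred"   # violations queued for extra review; content stays up
    EXEMPT = "exempt"       # shielded from the guidelines entirely

VIP_TIERS = {"celebrity": Tier.DEFERRED, "head_of_state": Tier.EXEMPT}

def enforce(account: str, violates_guidelines: bool) -> str:
    tier = VIP_TIERS.get(account, Tier.ORDINARY)
    if not violates_guidelines or tier is Tier.EXEMPT:
        return "allow"
    if tier is Tier.DEFERRED:
        return "queue_for_review"   # a backlog that can take months to clear
    return "remove"                 # ordinary users face immediate enforcement

print(enforce("ordinary_user", True))   # remove
print(enforce("head_of_state", True))   # allow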
Many of Facebook’s key externalities have been identified by its own internal diligence teams, yet management continues to turn a blind eye. Horwitz says he is most appalled by how solvable many of Facebook’s problems are; the platform is simply unwilling to accept the painful tradeoffs that solving them would require.
Yet these tradeoffs are not as straightforward as dollars and cents. It is difficult to craft objective solutions surrounded by people whose livelihoods depend on the company you built, and more difficult still to motivate change within a platform at the backbone of contemporary media and communication. Last year, Patagonia and other companies doubled down on advertising boycotts of Facebook; nevertheless, most of them returned to the platform by the end of the fiscal year, in need of its advertising services. Collective action is required to break the social customs that sustain Facebook’s apathy.
Efficient Regulatory and Market Solutions
Horwitz has stated, “Counting on whistleblowers like Frances is not a functional system to know about what’s going on in the background of the world’s largest interface.” Facebook needs to be more transparent about its company policy, as well as its market-based user retention tactics, if it is to remain both relevant and accountable.
Primarily, Facebook must address how its encrypted systems tie into its unencrypted ones. For private communication, most people would favor a more absolutist approach to encryption; but in the commons, when you are broadcasting to millions of people, there should be more clarity about who the speaker is and what their biases may be.
User education about algorithmic recommendations, along with accessible literature on censorship models, is a good start toward solving this issue. Yet it must be complemented by procedural changes. Killing or limiting resharing beyond a friend of a friend is, on its own, a very effective tool that avoids hard censorship choices, cutting the speed of misinformation to the 85th percentile. On WhatsApp, the implemented forward limit of five has proven to slow the spread of content dramatically; a sketch of both mechanisms follows.
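Here is a minimal sketch of the two caps just described, a WhatsApp-style forward limit and a friend-of-a-friend reshare-depth cutoff; the data model and constants are illustrative assumptions, not either platform’s implementation.

```python
from dataclasses import dataclass

MAX_FORWARDS = 5        # WhatsApp's publicized limit on forwarding a message
MAX_RESHARE_DEPTH = 2   # depth 1 = friend, depth 2 = friend of a friend

@dataclass
class Message:
    forward_count: int = 0
    reshare_depth: int = 0   # 0 = original post

def forward(msg: Message) -> Message:
    if msg.forward_count >= MAX_FORWARDS:
        raise PermissionError("forward limit reached")
    return Message(msg.forward_count + 1, msg.reshare_depth)

def reshare(msg: Message) -> Message:
    if msg.reshare_depth >= MAX_RESHARE_DEPTH:
        raise PermissionError("resharing beyond a friend of a friend is blocked")
    return Message(msg.forward_count, msg.reshare_depth + 1)

msg = reshare(reshare(Message()))   # a friend shares, then a friend of a friend
print(msg.reshare_depth)            # 2: the chain stops here
```

Neither rule inspects what the message says; both simply make every additional hop cost more, which is why they sidestep the hard questions of censorship.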
Facebook has built itself to have as little friction as possible, trapping users in a propaganda-ridden slip-and-slide. In particular, extremely heavy users tend to promote the most extremist content on the platform. By enforcing content-agnostic caps on links that are popular with that crowd, more latency can be built into the spread of misinformation.
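One way to read this proposal is as a velocity throttle that never inspects content, only how fast a link is spreading among the heaviest users. The thresholds and the definition of a “heavy user” below are assumptions for illustration.

```python
from collections import Counter

HEAVY_USER_HOURS = 4.0     # assumed daily usage that marks a "heavy" user
SHARE_VELOCITY_CAP = 100   # shares per hour among heavy users before friction

shares_by_heavy_users: Counter = Counter()

def record_share(url: str, user_daily_hours: float) -> None:
    if user_daily_hours >= HEAVY_USER_HOURS:
        shares_by_heavy_users[url] += 1

def share_delay_seconds(url: str) -> int:
    """Added latency once a link spikes among heavy users; content is never read."""
    excess = shares_by_heavy_users[url] - SHARE_VELOCITY_CAP
    return 0 if excess <= 0 else min(300, excess * 2)   # capped slowdown

for _ in range(150):
    record_share("example.com/viral-claim", user_daily_hours=6.0)
print(share_delay_seconds("example.com/viral-claim"))   # 100 seconds of friction
```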
The Future of Facebook
Facebook’s former mantra, “Move fast and break things,” rewarded a culture of disruptive innovation. As Horwitz put it, Facebook is “a tremendously successful platform, but, in a typical Silicon Valley way, did not consider its externalities.”
Now, fundamental questions of truth and ownership are at stake, and the Facebook Files raise concern about a future shaped by a company that struggles to govern its own protocols. As Meta positions itself to lead the burgeoning metaverse space, the stakes have never been higher.
Meta will confront, to an even greater degree, the same issues of privacy, data security, and misinformation that plagued its unembodied platforms. With platform power and encrypted profiles as catalysts of those prior issues, it must learn from the research failures of web2-era Facebook. If its management policies resemble those of the past, we should be concerned about the kind of faddish regulation and governance Meta will prescribe for our new virtual world.
Digital activists like Haugen and Horwitz will continue to fight the good fight for truth in our post-truth world. Nonetheless, as the gears of our utopian digital double begin to grind, we must question platform transparency and personal accountability ourselves. Many of the problems outlined in the Facebook Files were evident concerns even before they were substantiated, and it should not take whistleblowers to galvanize action.