One presidential election cycle after Facebook "reduced" the distribution of The New York Post's reporting on Hunter Biden's laptop and suspended former President Donald Trump's Facebook and Instagram accounts, a member of Meta's oversight board says the Big Tech platform "had not done enough" to control users' speech.
In an interview with Wired published Friday, board member Pamela San Martín claimed that as the tech platform enters 2024, “even though we’re addressing the problems that arose in prior elections as a starting point, it is not enough.”
“Between the U.S. election [in 2020] to the Brazilian election [in 2022], Meta had not done enough to address the potential misuse of its platforms through coordinated campaigns, people organizing, or using bots on the platforms to convey a message to destabilize a country, to create a lack of trust or confidence on electoral processes,” she added.
Really? With the encouragement of intel agencies, Facebook engaged in plenty of election interference in 2020.
On the same day The New York Post published bombshell emails recovered from a laptop Hunter Biden left at a Delaware repair shop, Facebook’s policy communications director Andy Stone tweeted, “While I will intentionally not link to the New York Post, I want be clear that this story is eligible to be fact checked by Facebook’s third-party fact checking partners. In the meantime, we are reducing its distribution on our platform.”
NPR admitted “that means the platform’s algorithms won’t place posts linking to the story as highly in people’s news feeds, reducing the number of users who see it.” The platform had also removed a Trump campaign ad earlier that year, and would suspend Trump’s account in January 2021.
After a Republican staff report from the House Oversight and Judiciary Committees described "How Democrats Are Attempting to Sow Uncertainty, Inaccuracy, and Delay in the 2020 Election," a subsequent staff report noted weeks later that Facebook "flagged the Judiciary Committee Republicans' post about the report, and linked to a website that Facebook describes as containing 'official election resources.'" However, as the latter report insisted, "the content Facebook presents as 'official' is not always neutral. Instead it amplifies certain points of view and undermines others."
Ahead of the 2020 election, Facebook also promised to ban political ads that it deemed to be making false claims about such things as “voter fraud.”
Two months before the election, bemoaning North Carolina’s mail-in voting systems, Trump said absentee voters were “going to have to go and check their vote by going to the poll and voting that way,” out of a concern that absentee votes might not be tabulated. “Let them send it in and let them go vote, and if the system is as good as they say it is, then obviously they won’t be able to vote,” Trump said.
Soon after, Facebook told USA Today that it "will remove any videos supporting the president's suggestion, as well as any videos without captions or context." The Big Tech platform would, however, allow posts from "those who share the video criticizing the suggestion or noting that voting twice is illegal."
Aside from meddling with election-related content to control voters’ access to information, Facebook also spent the months and years leading up to the election censoring conservative voices, including the sitting president.
In August 2020, Facebook took down a clip Trump posted of himself saying, in reference to Covid-19, that children were "almost immune from this disease." But it was true that children were far less likely to become seriously sick from Covid, with the Centers for Disease Control noting that pediatric hospitalizations were "much lower" for Covid than for the regular flu.
After race riots ravaged the country in the summer of 2020, Facebook nuked any "praise and support" of Kyle Rittenhouse, the young man who shot three men in Wisconsin in what a jury agreed was self-defense, as well as links to donation pages for him.
Since then, the platform has worked with the Biden White House to censor the administration’s dissenters. Facebook removed posts sharing heterodox beliefs about Covid-19 because, as a Facebook VP put it in an internal email, “we were under pressure from the administration and others to do more.” The platform also nuked a page belonging to an organization run by Robert F. Kennedy Jr., who is running for president against Biden.
But San Martín told Wired for last week’s story that Facebook needs to do more. “Social media platforms need to learn from past mistakes to be able to address them better this year,” she said, acknowledging that since 2020 “we’ve seen an advance in Meta using more tools to address election-related issues.”
She also listed “election-specific initiatives” — read: censorship techniques — that Meta has tested out “in different countries.” These have included “working with electoral authorities, adding labels to posts that are related to elections, directing people to reliable information, prohibiting paid advertisement when it calls into question the legitimacy of elections, and implementing WhatsApp forward limits,” San Martín casually explained to Wired.
San Martín acknowledged "how [Meta's] own algorithms, their own newsfeeds, their own recommendation systems, their own political ads can play a part" in what she euphemistically called "protection" of "electoral processes." And as she told Wired, that kind of intervention is something she wants to see more of, not less.