Facebook’s continued lack of urgency surrounding voter suppression, misinformation and algorithmic bias could have “dangerous (and life threatening) real-world consequences,” according to a new audit of the company.
Facebook’s Civil Rights Audit, released today, July 8, is an independent survey of the company led by Laura W. Murphy, a civil rights lawyer and former American Civil Liberties Union executive, and a team from the law firm Relman Colfax. It’s actually the third and final report after two preliminary releases; in all, the two-year audit encompassed interviews with “100 civil rights organizations, hundreds of advocates and several members of Congress.”
The report comes at a time when over 900 companies have threatened or started a boycott of the social media site — and when a Zoom meeting this week between civil rights groups and Facebook’s top executives Mark Zuckerberg and Sheryl Sandberg was considered “very disappointing.”
The auditors note that their report was not intended as a comparison with Facebook’s tech industry peers (“In some areas [Facebook] may outperform peers with respect to civil rights, and in other areas, it may not”) and concentrates only on the “core” Facebook app as it relates to the U.S. The report also wasn’t limited to racial justice; it touched upon unfair treatment and discrimination in education, employment, housing, credit, voting, public accommodations and more.
The official @Facebook #CivilRights audit report was published today. In sum, Facebook has made "vexing and heartbreaking decisions . . . that represent significant setbacks for civil rights." They also selectively enforce their content moderation policies.
— Elizabeth M. (@hackylawyER) July 8, 2020
While the audit does suggest Facebook made some improvements over the course of the two-year study, it also notes that any progress is merely a “start, not a destination.”
Five startling conclusions:
- “Ironically, Facebook has no qualms about reining in speech by the proponents of the anti-vaccination movement, or limiting misinformation about COVID-19, but when it comes to voting, Facebook has been far too reluctant to adopt strong rules to limit misinformation and voter suppression.”
- “… politicians are free to mislead people about official voting methods (by labeling ballots illegal or making other misleading statements that go unchecked, for example) and are allowed to use not-so-subtle dog whistles with impunity to incite violence against groups advocating for racial justice.”
- Regarding the establishment of a civil rights infrastructure, the report is somewhat positive about Facebook’s newest initiatives but notes, “Currently, all but one of the product review processes these [civil rights] questions will be inserted into are voluntary, rather than mandated reviews required of all products.”
- Regarding inflammatory advertisements as they pertain to race, the auditors note that expanded ad restrictions are limited to physical threats, and advertisers can still run ads “that paint minority groups as a threat to things like our culture or values (e.g., claiming a religious group poses a threat to the ‘American way of life’).”
- Facebook’s plan to attach a “neutral”-worded label and a link to a Voting Information Center to all voting-related posts received a mixed response. As the report suggests, “There is concern that labeling all voting-related posts (both those that are accurate and those that are spreading misinformation) with neutral language will ultimately be confusing to users and make it more difficult for them to discern accurate from misleading information.”
The report did laud Facebook’s census interference policy, its handling of COVID-19-related voting misinformation and the site’s increased capacity to combat coordinated inauthentic behavior (e.g., foreign actors trying to sway an election from fake or banned accounts). While the report featured an extensive overview of Facebook’s commitment to weeding out algorithmic and machine-learning biases, those programs were either too new or the auditors did not have enough access to draw conclusions; they did suggest the company approach these issues with “a greater sense of urgency.”
As Sandberg, Facebook’s COO, noted in a blog post today, “What has become increasingly clear is that we have a long way to go. As hard as it has been to have our shortcomings exposed by experts, it has undoubtedly been a really important process for our company.”