These sorts of events are insidious because it’s hard to form a response that isn’t a bad one. Talking about the video just gives its concocted message more oxygen. Ignoring it risks surrendering truth to the ignorant whims of tech companies. The problem is, a business like Facebook doesn’t believe in fakes. For it, a video is real so long as it’s content. And everything is content.


The trouble starts when journalists assume that the problem with fakes is obvious. As The Washington Post, CNN, and other outlets covering the video have been careful to note, doctored materials are nothing new, especially online. But those outlets have also gone on to claim that something is novel about videos like this one. At the Post, Drew Harwell wrote that “the outright altering of sound and visuals signals a concerning new step for falsified news,” especially as the 2020 campaigns heat up. At CNN, Donie O’Sullivan likewise argued that the situation was unique: it is unprecedented, he claimed, for a “fake video” to reach millions of viewers so quickly, and for official political operatives, such as Giuliani, to promote it.

YouTube removed the video, but Twitter and Facebook did not. Facebook did deprioritize the content, making it appear less often. That step also triggered a warning shown to users before they share the video, although those warnings, which read in part, “Before sharing this content, you might want to know that there is additional reporting on this,” might be incomprehensible to an average person. There’s additional reporting on everything these days.

Normally, tech companies don’t offer much in the way of comment about controversies of truth online. But this time, Facebook has gone on the record, in a direct and high-profile way, to explain its decision to retain the video. The company’s vice president for product policy and counterterrorism, Monika Bickert, spoke with Anderson Cooper the day after the doctored Pelosi clip started spreading to explain why Facebook hadn’t removed it.

Introducing the television segment, Cooper called the video “fake” and “manipulated,” noting that “Facebook knows it’s fake,” since the company decided to make the material less prominent. How then, Cooper asked Bickert, can Facebook claim that it’s committed to fighting fake news while still hosting and amplifying a doctored video?

Bickert’s response is instructive. She clarified that Facebook doesn’t have a policy against misinformation as such. Outside fact-checkers review controversial material like this, she explained, and then “we dramatically reduce the distribution of that content.”

Cooper asked the obvious question: Why keep it up at all once you know it’s false? So long as it doesn’t invite immediate harm, Bickert explained, “we think it’s important for people to make their own informed choice about what to believe.”