NBC News: Facebook spread rumors about arsonists setting fires in Oregon. It’s part of its business model.
As wildfires burned across Oregon and California this week, conspiracy theories about how the fires started moved nearly as rapidly on Facebook. Posts falsely blaming members of antifa or Black Lives Matter spread across the platform nearly unchecked, causing calls about “antifa arsonists” to clog emergency phone lines. Local and national law enforcement had to spend precious time and resources rebutting the false claims instead of rescuing residents and aiding in evacuations.
Facebook said last Saturday that it was banning fire-related conspiracy talk from the platform. But, according to research by the German Marshall Fund of the United States, the misinformation continued to circulate for days afterward, eluding whatever mechanisms Facebook had put in place to end it.
Facebook had time to prepare for such a contingency; it is certainly not the first time the company has been called upon — but unable — to quell conspiracy-mongering around major national events. For example, following the killing of George Floyd in May and the ensuing protests, Facebook posts falsely alleging that Floyd’s death had been faked, or that the entire protest movement was organized by the CIA, were being spread on the platform. Facebook pledged to crack down on the spread of vile nonsense, but its efforts, never made fully transparent, were similarly ineffective.
Why does Facebook find itself, over and over, unable to cope with the exploitation of its platform to spread conspiracies, misinformation and propaganda? Because sensationalized content is how Facebook makes money. So, until its business model changes, the problems it aids won’t stop.