Amid Life-or-Death Crises, Facebook Struggles to Keep Users Well-Informed


Image courtesy of The Guardian

Ana Clara Monaco, S&G Editor

Facebook, a social media platform with 2.7 billion users worldwide, is attempting to control the spread of misinformation concerning the election, the coronavirus, and the Oregon fires, among other topics. The platform houses a multitude of mainstream pro-Trump groups tied to the QAnon phenomenon, which distribute false information and, according to the Federal Bureau of Investigation, pose a potential domestic threat to American democracy, as well as “Re-Open” groups protesting the worldwide lockdowns imposed in response to the coronavirus. Since early September, Facebook has been applying warnings and removing accounts and posts that violate its Misleading Information Policy, but the sluggish pace at which the site acts against rumors and conspiracies has sown confusion among first responders battling the Portland-area fires and has given like-minded wrongdoers a means to congregate and gather misinformed supporters.

Members of the QAnon movement promote a conspiracy theory declaring that the world is run by a cabal of Satanic pedophiles, ranging from politicians like Barack Obama and Hillary Clinton to Hollywood celebrities like Ellen DeGeneres and Tom Hanks and religious figures like Pope Francis and the Dalai Lama, who extract the life-extending chemical Adrenochrome from the blood of children. President Donald Trump and his military allies, supposedly on direct instructions from the Christian God, are working to unmask these mainly Democratic criminals. This information is said to come from a government insider, “Q,” who has made absurd claims regarding 9/11, the assassination of JFK, the 2012 Sandy Hook shooting, and COVID-19. Amid a 190% increase in the average number of daily tweets containing Q-related hashtags, Facebook has shut down 7,000 accounts linked to the cult since March and has repeatedly updated its plan of action against such threats; Facebook spokesperson Liz Bourgeois reported that workers have been advancing the website’s technology to “identify and take down posts that include violating key words, images and videos.” On October 5, the platform announced it would ban all accounts representing QAnon, even those that do not include violent content, although these efforts lag behind the constant creation and spread of rumors throughout the site.

These falsehoods also engulf news about the Oregon fires: beginning on September 8, online users crafted rumors blaming Black Lives Matter groups and Antifa for the disasters. As firefighters struggled to put out the blazes last month, the attention of law enforcement and Portland personnel was diverted to wild goose chases created by fascists on social media. Emergency dispatchers in Douglas County described an inundation of calls driven by the false belief that anti-fascists were setting the fires, the majority of which were actually caused by downed power lines.
Facebook accounts promoting the re-opening of the United States (known as “Re-Open” groups), some with as many as 171,000 followers, were found to be distributing the inaccurate fire-related news before the site removed the content, much of which remains visible and circulating. Despite the platform’s September announcement, the pace at which such posts and accounts emerge has not slowed, prompting researchers such as Karen Kornbluh of the German Marshall Fund to urge the platform to focus on the “risk of widespread harm, as opposed to imminent harm.” In a society preparing for a chaotic presidential election, attempting to curtail an unceasing global pandemic, and suffering the consequences of decades of environmental disregard, it is vital that a source so widespread and accessible promote accurate, honest information to its users. Facebook has implemented new strategies and policies to combat the spread of false information, yet only time will tell the efficacy of such practices. The public has spoken, and it is evident that a once-trusted site has lost credibility and appeal in the digital world, doing too little, too late to protect its platform and its users.