Amid increasing political pressure from the EU and the US government, Facebook has had its internal rulebook leaked, fueling new controversy about the social network's policies on questionable user content.
Internal training manuals, spreadsheets and flowcharts obtained by The Guardian show what appear to be conflicting rules on the appropriateness of posts – including Facebook Live posts – involving child or animal abuse, revenge porn, war and terrorism, hate speech, and more.
In one example, the newspaper reveals that Facebook views live-streamed attempts at self-harm as okay to leave online because it "doesn't want to censor or punish people in distress." Earlier this year, the company faced a backlash for failing to pull videos of murders in the US and Thailand posted on the social network. In contrast, it removed an iconic Vietnam War photo simply because there was a naked girl in the picture, prompting public outcry.
The leak further reveals that Facebook considers remarks such as "Someone shoot Trump" grounds for deletion. Facebook's explanation? Trump falls into a protected category as a head of state. At the same time, the rulebook allows remarks like "F*** off and die" because Facebook regards them as non-credible threats.
"Keeping people on Facebook safe is the most important thing we do," Facebook's Head of Global Policy Management Monica Bickert said in a statement. "We work hard to make Facebook as safe as possible while enabling free speech. This requires a lot of thought into detailed and often difficult questions, and getting it right is something we take very seriously."
Facebook is reportedly hiring some 3,000 additional moderators, as existing moderators say they are drowning in requests. Often they are left with "just 10 seconds" to make a decision. One source told the UK publication: "Facebook cannot keep control of its content. It has grown too big, too quickly." Moderators are said to be at odds with inconsistent and peculiar policies, chiefly on sexual content.
"It's one thing when you're a small online community with a group of people who share principles and values, but when you have a large percentage of the world's population and say 'share yourself', you are going to be in quite a muddle. Then when you monetise that practice you are entering a disaster situation," said Sarah T Roberts, an expert on content moderation.
To Facebook's credit, equally disturbing posts have helped uncover police killings and other abuses that would otherwise have gone unpunished. But when it comes to appropriateness, a lot of content falls into a gray area shaped by context, culture, ethnicity and even age. And while Facebook promises to do everything in its power to keep users safe, the social network still has a way to go to meet this goal.