
Facebook explains why it ‘gets things wrong’ as it’s trying to moderate content


In an op-ed column for the newspaper, the social network responded to the Guardian’s exclusive reporting on how it handles revenge porn, sexual and graphic violence, and harassment.

Chiranut Trairat (second from left), mother of an 11-month-old baby girl, stands next to her daughter’s portrait before her funeral in Phuket, Thailand, Saturday, April 29, 2017. Her distraught husband hanged their daughter, broadcasting it on Facebook Live, and then killed himself, police said. CREDIT: AP Photo

The Guardian published a series of articles, videos, and guides this week analyzing a cache of official documents that detail the policies Facebook moderators use to govern content and users.

The “Facebook Files” series takes a comprehensive look at the materials and guidelines Facebook requires its moderators to follow in an effort to make the platform safer, and at what happens when, despite the company’s best efforts to parse the complexities of sex, art, racism, humor, harassment, and sadism, it gets things wrong.

Now, Facebook has responded to the Guardian’s reporting — not through a company statement, but with an op-ed published in the same newspaper, penned by the company’s head of global policy, Monika Bickert.

Bickert, who formerly worked as a criminal prosecutor in the U.S. and Thailand, opens by addressing the violent videos of Syrian children dead and injured from a chemical weapons attack that circulated on the platform last month.

“The images were deeply shocking — so much so that we placed a warning screen in front of them and made sure they were only visible to adults. But the images also prompted international outrage and renewed attention on the plight of Syrians.”


Bickert goes on to compliment the Guardian’s reporting, saying it “gets a lot of things right” and “show[s] just how hard it can be to identify what is harmful — and what is necessary to allow people the ability to share freely.”

The column goes on to outline Facebook’s rationale for determining what’s artistic, what’s offensive, and what’s inexcusable content online. And, as Bickert emphasized, context is everything.

It’s hard to judge the intent behind one post, or the risk implied in another. Someone posts a graphic video of a terrorist attack. Will it inspire people to emulate the violence, or speak out against it?

Someone with a dark sense of humour posts a joke about suicide. Are they just being themselves, or is it a cry for help?

Cultural context is part of it too. In the UK, being critical of the monarchy might be acceptable. In some parts of the world it will get you a jail sentence. It’s easy to comply with a clear-cut law, but most of the time what’s acceptable is more about norms and expectations. Social attitudes are constantly evolving, and every society has its flash points. New ways to tell stories and share images can bring these tensions to the surface faster than ever.

Bickert raises examples of how, even with the best intentions, Facebook’s policies can be counter-intuitive and result in offensive content being left up. For example, the platform doesn’t immediately take down livestreamed self-harm videos, because experts advise that leaving them up creates a chance for viewers to offer help and for the person to receive it. The video is taken down later, once the opportunity to help has passed. Facebook also tells moderators to ignore suicide threats “expressed through hashtags and emoticons” if the context suggests the threat is unlikely to be carried out, or if it is set more than five days in the future.


As our collective lives migrated and melded online, Facebook became the de facto arbiter of what’s acceptable to post. And across its platforms, including Instagram, the company has long struggled with where to draw the lines — deciphering the difference between breastfeeding and gratuitous nudity, for example, and hesitating to ban users for harassment out of free speech concerns. A recent spate of murders caught on video and broadcast on the platform caused Facebook to increase its moderation staff, hoping to prevent the next graphic video from going viral.

Bickert stressed that while the company may not always get things right, it does as much as it can to make the online space safe, including hiring experts in rape crisis, terrorism, and education.

“We face criticism from people who want more censorship and people who want less. We see that as a useful signal that we are not leaning too far in any one direction. The alternative — doing nothing and allowing anything to be posted — is not what our community wants,” she wrote.

“We get things wrong, and we’re constantly working to make sure that happens less often. We put a lot of detailed thought into trying to find right answers, even when there aren’t any.”


Facebook explains why it ‘gets things wrong’ as it’s trying to moderate content was originally published by ThinkProgress on Medium.
