Facebook unveils new way for users to appeal content removal

Posted: Nov 19, 2018 11:41 AM
Updated: Nov 19, 2018 11:41 AM

Facebook is admitting it needs oversight with content moderation.

CEO Mark Zuckerberg said Thursday the company plans to establish an independent group to oversee users' appeals of content policy decisions, starting next year.

Critics have long urged the company to be more transparent about what content Facebook's human moderators and artificial intelligence take down, what they leave up, and why.

The news, which was announced on a call with reporters, came one day after a report from The New York Times revealed how Facebook execs, including Zuckerberg and COO Sheryl Sandberg, navigated the company's recent crises, ranging from Russian interference to data privacy. The company's response included digging up dirt on competitors and critics, according to the report.

"I think people want to trust our intention," said Zuckerberg, who said the press call announcement was scheduled before the report published. "People expect companies to learn and not keep making the same mistake."

Facebook's decisions have come back to bite the company in recent years. For example, human rights groups have slammed Facebook for not properly cracking down on the hate and misinformation that fueled political division and violence in Myanmar. Facebook admitted it was "too slow" to act.

"I've increasingly come to believe that Facebook should not make so many important decisions about free expression and safety on our own," Zuckerberg wrote on Thursday in a blog post, which provided updates on the company's content governance and enforcement processes. "The purpose of this body would be to uphold the principle of giving people a voice while also recognizing the reality of keeping people safe."

Zuckerberg noted that while the details of the independent group are not yet determined -- including how it will select members and ensure its independence -- it'll aim to provide more transparency into decisions.

Earlier this year, the company made public its detailed internal community standards, which spell out what content is and isn't acceptable, including hate speech, nudity, and bullying. Facebook also began rolling out a content appeals process this year for users asking about their own content that was removed. The company said Thursday it will expand on this concept so users can appeal decisions on reports filed about other people's content.

In addition, Facebook said it is training AI to detect and reduce the spread of "borderline content," which it describes as "click-bait and misinformation."

"People naturally engage with more sensational content," said Zuckerberg, adding the company is trying to de-incentivize that.

By late next year, the company expects to release transparency and enforcement reports on a quarterly basis, he said.
