
Facebook removes 8.7 million content items violating child nudity and exploitation rules


Facebook announced on Wednesday that it took down 8.7 million pieces of content violating child nudity and exploitation rules. File Photo by Sascha Steinbach/EPA-EFE/UPI

By Ed Adamczyk, UPI

Facebook announced Wednesday that it removed 8.7 million pieces of content that violated its child nudity or child sexual exploitation policies in a three-month span.

About 99 percent of the affected content was removed before any users reported it, the company said.

The company said it has used artificial intelligence and machine-learning software over the past year to flag the images as they were uploaded. The figure it gave, of 8.7 million pieces of content found worldwide, covers actions taken between July and September. The technique is an improvement on photo matching, which Facebook has used for years to stop the sharing of known child exploitation images.

The software is able to "get in the way of inappropriate actions with children, review them and if it looks like there's something problematic, take action," Antigone Davis, Facebook's global head of safety, said.

The use of artificial intelligence can quickly identify content and notify the National Center for Missing and Exploited Children, and close the accounts of Facebook users promoting inappropriate actions with children, CNET reported on Wednesday.

"We have specially trained teams with backgrounds in law enforcement, online safety, analytics, and forensic investigations, which review content and report findings to NCMEC," the company said on Wednesday.

Facebook has historically erred on the side of caution in deleting and reporting inappropriate photos of children. That approach has led to the removal of photographs of emaciated children taken at Nazi concentration camps, as well as a Pulitzer Prize-winning war photo of a naked Vietnamese girl fleeing a napalm attack.

Until now, though, the company had relied largely on users to flag and report inappropriate images.

The new system allows Facebook to "proactively detect child nudity and previously unknown child exploitative content when it's uploaded," it said.
