San Francisco: Facebook has removed 8.7 million images of child nudity in just three months, before they were reported by users.
The social network says it has developed new software that automatically flags possible sexual images of children for its moderators, according to a report by the BBC.
Facebook was heavily criticised last year by Damian Collins, chairman of the Commons Media Committee, over the prevalence of child sexual abuse material on the network. The criticism followed a 2016 BBC investigation that found evidence of paedophiles sharing obscene images of children via secret Facebook groups.
Antigone Davis, Facebook's head of global safety, said the company is considering rolling out its systems for spotting child nudity and grooming to Instagram as well. A separate system is used to block child sexual abuse imagery that has previously been reported to the authorities.
'Recently, our engineers have been focused on classifiers to actually prevent unknown images, new images,' Ms Davis said in an online video about the technology.
Such newly discovered material is reported by Facebook to the National Center for Missing and Exploited Children (NCMEC). (UNI)