Facebook released its biannual Community Standards Enforcement Report, which details how the company is dealing with fake accounts, harassment, spam, and other content that violates its policies. The report came just a day after a New York Times investigation into how Facebook handled Russian interference on its platform.
According to the social media company, it took down approximately 1.5 billion fake accounts over the past six months. Facebook also claims that the majority were taken down before users reported them.
However, the company does admit that fake accounts still represented around 3 to 4 percent of monthly active users during the second and third quarters of 2018, a figure consistent with previous reports.
Facebook released the first such report in May of this year, and in the current edition the company claims to have "taken action" against many violators engaged in harassment, nudity, child exploitation, and bullying of other users.
The company’s VP of Product, Guy Rosen, also commented that on more than one occasion Facebook has had a difficult time determining whether a poster is commenting in a negative way or just bantering with friends. For example, if someone comments “you’re crazy” on a picture, Facebook needs time to understand whether it is a cutting remark or just a joke between friends.
Although Facebook is doing a praiseworthy job of flagging harassment content on its platform, CEO Mark Zuckerberg has repeatedly admitted that the company still has further to go in developing an artificial intelligence system that detects hate speech more efficiently.