YouTube is often criticized for not doing enough to remove objectionable and harmful content from its site. Recently, however, the platform took an admirable step in that direction. According to the Google-owned firm’s first quarterly moderation report, it took down a massive 8.3 million videos in the last quarter of 2017.
In a recent blog post, YouTube said it removed more than 8 million videos between October and December 2017. “The majority of these 8 million videos were spam or people attempting to upload adult content and represent a fraction of a percent of YouTube’s total views during this time period,” the post said. Of those 8 million deleted videos, 6.7 million were flagged by an algorithm; the rest were flagged by human reviewers who judged them to be in violation of YouTube’s community guidelines.
Here is a full breakdown of the deleted videos by category:
- Sexual: 30.01%
- Spam or misleading: 26.4%
- Hateful or abusive: 15.6%
- Violent or repulsive: 13.5%
- Harmful or dangerous acts: 7.6%
- Child abuse: 5.2%
- Promotes terrorism: 1.6%
The video-sharing platform said in a blog post that it would continue to release reports on the volume of videos removed for violating its guidelines.
The company said:
“By the end of the year, we plan to refine our reporting systems and add additional data, including data on comments, the speed of removal, and policy removal reasons.”
According to The Guardian, YouTube is one of several internet-based companies facing pressure from “national governments” to remove videos containing extremist or abusive content.
No doubt, this is a great step by YouTube to combat harmful content and address the concerns of governments and the public. Still, considerable effort will be needed to ensure that content uploaded to the platform meets ethical standards and complies with YouTube’s community guidelines.
The post This is why YouTube removed more than 8 million videos appeared first on TechJuice.