Google Inc.'s YouTube is stepping up efforts to review and remove inappropriate content on its platform by hiring more content moderators.
In a blog post, YouTube CEO Susan Wojcicki said Google aims in 2018 to have more than 10,000 people reviewing content that might violate its policies. The company also plans to work with more academics, industry groups and subject matter experts to help understand emerging issues.
Regarding advertising, Wojcicki said YouTube plans to hire more advertisement reviewers to ensure that the ads are "only running where they should." The platform will also implement stricter criteria and manually curate more ads, details of which will be revealed to advertisers and content creators over the coming weeks.
Furthermore, YouTube has begun using machine-learning technology to flag content that promotes hate speech or is harmful to children, Wojcicki said.
Starting in 2018, the platform will also regularly report the amount of content flagged and the actions taken to remove inappropriate content.
"Our goal is to stay one step ahead of bad actors, making it harder for policy-violating content to surface or remain on YouTube," Wojcicki said.
Several major advertisers earlier stopped advertising on YouTube following allegations that it profited from videos that exploit children and attract pedophiles. In March, YouTube drew criticism for placing ads alongside content promoting terrorism and hate.
Google is a unit of Alphabet Inc.
