Google outlines four steps it's taking to combat extremism on YouTube

"In previous deployments of this system, potential recruits have clicked through on the ads at an unusually high rate, and watched over half a million minutes of video content that debunks terrorist recruiting messages", Walker said. Flaggers, the company said, are better at distinguishing violent propaganda from news coverage, and are accurate more than 90 percent of the time.

Alphabet Inc.'s Google says it is creating new policies and practices to suppress terrorism-related videos, a response to United Kingdom lawmakers who have said the internet is a petri dish for radical ideology.

"While we and others have worked for years to identify and remove content that violates our policies, the uncomfortable truth is that we, as an industry, must acknowledge that more needs to be done".

These videos will be prefaced by a warning, and will have comments, recommendations, user endorsement and monetisation blocked, which Google says will make them less engaging and harder to find. "We will now devote more engineering resources to apply our most advanced machine learning research to train new 'content classifiers' to help us more quickly identify and remove extremist and terrorism-related content", he added. Google has, for some time now, been using image-matching technology to prevent people from reloading content that was previously flagged and removed.

Not relying on video models alone, Google has vowed to add more human reviewers through YouTube's Trusted Flagger programme to identify problematic videos.

"There should be no place for terrorist content on our services", Google's General Counsel Kent Walker wrote. He said the company is working through its Jigsaw initiative to expand use of the "Redirect Method" across Europe. This includes partnering with other major tech firms to work together on tackling terror, as well as using targeted advertising to steer potential ISIS recruits towards de-radicalising content.

The company, for instance, already engages thousands of people from around the world to inspect and review potentially extremist and terrorist-related videos.

Google has also previously committed to working with other tech giants such as Facebook, Microsoft, and Twitter to establish an international forum to tackle terrorism online. "Google and YouTube are committed to being part of the solution. It is a sweeping and complex challenge", Walker wrote. Last week, Facebook also announced a two-pronged approach to fighting terrorist content with both artificial intelligence and human experts.

Germany, France and Britain - where civilians have been killed and wounded in bombings and shootings by Islamist militants in recent years - have pressed social media platforms to do more to remove militant content and hate speech.

By Arturo Norris