

Many breathed a sigh of relief after ISIS was defeated in Mosul and Raqqa in the course of 2017. Yet, while its territorial caliphate has shrunk significantly, the online caliphate is still alive. And that is partly due to the negligence of YouTube, Facebook, and Twitter.

The promise of social media as the ultimate democratization of the internet is gradually falling apart. What began with Facebook’s alleged role in Donald Trump’s election victory culminated in the Cambridge Analytica scandal.

Yet, there is a third problem, which until now has largely remained under the radar. YouTube, Facebook, and Twitter also provide a platform to extremism – from ISIS-related videos to white nationalism. Even though this content clearly violates these companies’ Terms of Service and Community Guidelines, (too) little is being done to permanently remove it from their platforms.

To alleviate this problem, the Counter Extremism Project (CEP) – an international non-profit focused on targeting online extremism – reached out to Professor Hany Farid. Farid is an authority in the field of digital forensics, which investigates whether photos, videos, and audio files have been tampered with.

A technological solution exists

For a very long time, social media companies proclaimed that a technological solution to the problem simply did not exist. To counter this narrative, Professor Farid developed eGLYPH, an extension of his earlier project PhotoDNA. While PhotoDNA was developed to curb the spread of child pornography images online, eGLYPH specifically targets online extremist content, whether images, audio, or video.

In essence, the program assigns a unique fingerprint – a so-called hash – to each piece of known extremist content. If that footage is later re-uploaded, the program identifies the match and ensures the copy is taken down immediately.
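The general mechanism can be sketched in a few lines of Python. To be clear, this is not eGLYPH itself: Farid’s system uses a robust hash that also matches re-encoded or slightly altered copies, whereas the ordinary SHA-256 digest used below only catches byte-identical re-uploads, and the class and method names are purely illustrative.

```python
import hashlib


class HashRegistry:
    """Toy registry of known extremist content, keyed by content hash."""

    def __init__(self) -> None:
        self._known_hashes: set[str] = set()

    @staticmethod
    def fingerprint(content: bytes) -> str:
        # SHA-256 stands in here for eGLYPH's robust hash; unlike a robust
        # hash, it only matches byte-identical copies of a file.
        return hashlib.sha256(content).hexdigest()

    def register(self, content: bytes) -> None:
        """Add a piece of known extremist content to the blocklist."""
        self._known_hashes.add(self.fingerprint(content))

    def should_block(self, upload: bytes) -> bool:
        """Check a new upload against the blocklist before it goes live."""
        return self.fingerprint(upload) in self._known_hashes


registry = HashRegistry()
registry.register(b"bytes of a known extremist video")

# A later re-upload of the same file is flagged immediately.
assert registry.should_block(b"bytes of a known extremist video")
```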

Even though Professor Farid offers his program free of charge, YouTube, Facebook, and Twitter opt to use their own technology instead. To that end, they joined forces in the so-called ‘Global Internet Forum to Counter Terrorism’. Yet, Professor Farid remains sceptical about its efficacy. According to him, this is just a “throwaway project” that was literally “brushed off the shelf” in order to convince politicians and the general public that they were taking responsibility.

To substantiate his claims, Professor Farid provides some examples. During a six-week period, from March 8 until April 18, 2018, the CEP applied his eGLYPH technology to YouTube. With a database of 256 registered extremist hashes, they found no fewer than 853 ISIS-related videos, which garnered a total of 99,361 views. Of those videos, 221 (26%) remained online for more than two hours, and no fewer than 84% were uploaded more than once.
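As a quick sanity check, the snippet below is a back-of-the-envelope calculation that rederives the 26% share and the average number of views per video, using only the figures reported above.

```python
videos_found = 853        # ISIS-related videos identified on YouTube by the CEP
total_views = 99_361      # combined views across those videos
online_over_2h = 221      # videos that stayed online for more than two hours

share_over_2h = online_over_2h / videos_found      # ~0.259, i.e. roughly 26%
avg_views_per_video = total_views / videos_found   # ~116 views per video

print(f"{share_over_2h:.0%} stayed up over two hours; "
      f"about {avg_views_per_video:.0f} views per video on average")
```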

Or to make it even more concrete: On March 10, 2018, a video titled “Hunt Them O, Monotheist” was uploaded to YouTube, calling for firearm and vehicular attacks in Western Europe. The video was originally uploaded by a Somali ISIS-affiliate on December 25, 2017.

On March 10, it was available for 1 day, 5 hours, and 3 minutes before it was removed, amassing 405 views. The following day, the same video was re-uploaded and stayed online for 1 day, 15 hours, and 29 minutes, receiving 113 views. It was then re-uploaded once again and remained available for 21 hours and 34 minutes, gathering 226 views. In the following days and weeks, the video was re-uploaded four more times, garnering hundreds of additional views. And according to Professor Farid, this is not an isolated incident.
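To put the cumulative exposure in perspective, the sketch below simply tallies the three documented upload windows; the four later re-uploads are left out because the article gives no exact durations or view counts for them.

```python
from datetime import timedelta

# The three documented upload windows of "Hunt Them O, Monotheist" on YouTube,
# as (time online before removal, views amassed).
windows = [
    (timedelta(days=1, hours=5, minutes=3), 405),
    (timedelta(days=1, hours=15, minutes=29), 113),
    (timedelta(hours=21, minutes=34), 226),
]

total_online = sum((t for t, _ in windows), timedelta())
total_views = sum(v for _, v in windows)

# Roughly 3 days and 18 hours online and 744 views, before even counting
# the four further re-uploads mentioned in the article.
print(f"Online for {total_online}, {total_views} views in total")
```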

Fundamental problem: Big Tech’s unwillingness

According to Professor Farid, the problem fundamentally boils down to Big Tech’s unwillingness to tackle the matter head-on. Their reluctance to monitor their content stems from the fear that they would no longer benefit from their protected status as a ‘platform’ under Section 230 of the Communications Decency Act. As long as these companies are regarded as platforms, they are not responsible for the content on their websites. Once they actively started to monitor that content, however, they could be seen as ‘publishers’ and carry editorial responsibility.

Nevertheless, in the last couple of years, the companies have gradually started to acknowledge the problem and are making efforts to fix the situation. In addition, the general public seems to have woken up to the dangers of these platforms.

Could legislation provide a solution?

Taking the above into consideration, perhaps legislation could offer a way out. I therefore discussed the EU Commission’s recommendation on illegal terrorism-related content with Professor Farid.

While he applauds the Commission’s sentiment and intent to tackle the subject, he sees two critical weaknesses. First of all, the recommendation calls for removal within one hour of notification, not of upload. Content that nobody flags could thus literally stay online for weeks or months. And while a takedown two hours after uploading already results in hundreds of views, a removal one hour after notification could mean thousands or even tens of thousands of views.

Secondly, this kind of regulation offers YouTube and the other social media companies a way out: they could stop aggressively monitoring their content themselves and simply wait until content gets flagged. Hence, Professor Farid remains wary of legislation, since it always entails unintended consequences.

Advertisers hold the key

Since social media companies rely on the financial contributions of their advertisers, it is the advertisers who can effect real change, according to Professor Farid. If they agreed to boycott social media until a fundamental solution is in place, Silicon Valley would be forced to deal with the problem.

Only a few days ago, CNN published a report showing that ads from companies like Adidas, Amazon, and Netflix ran on YouTube channels promoting white nationalists and neo-Nazis. This has happened before and will happen again. That is why Professor Farid proposes bringing the CEOs of the five biggest advertisers around the table. According to him, this could really be the “game changer” we are waiting for, as they have the leverage to get Silicon Valley to act.

To substantiate his claim, Professor Farid points to the dispute between the music and film industry and the social media platforms. Because people could freely share songs and movies, the industry lost much of its revenue. Yet, after a successful lobbying campaign, Congress passed the Digital Millennium Copyright Act. As a result, videos with possible copyright infringements are now taken down very quickly.

Hence, the ball is in the advertisers’ court. If they step up and demand that Silicon Valley take responsibility, real change could be achieved. Until then, the situation reminds Professor Farid of his youth: “When I was young, my mother would always make me apologise. I didn’t want to apologise, I didn’t mean it, but I would do it anyway, right. It’s sort of like this: If I’m forcing you to do something you don’t want to do, you’re going to find ways out of it.”

1 Comment

  • Peter Steinfeld
    Posted 29/05/2018 14:47

    Is it not too easy to consider the caliphate as an event of the past, as something that is over?
    Think of Turkey and its support for ISIS/DAESH, then think of the north of Syria, where Idlib and
    Jarabulus are becoming Turkish fortresses in which many DAESH/ISIS people live under Turkish protection. It is not over; it is just that the Islamist rebels wear Turkish uniforms now, so just wait and you will have them in NATO sooner or later, as well as in Europe.
    And yes, to your question: it is indeed hard to comprehend why those leftovers of DAESH/ISIS are still available on the web, even if, perhaps in military terms, DAESH is over. That does not mean much. Given that nothing is being done to rebuild cities such as Raqqa and Mosul, it is only a question of time and money before the Turkish army becomes the next DAESH/ISIS. They are already in the
    right area, right?
