New research shows that social media is still falling short on the terrorist crackdown
The Counter Extremism Project has uncovered new evidence that terrorist content still thrives on major social media platforms.
Terrorism Content Online

[vc_row][vc_column][insikt_heading title="New research shows that social media is still falling short on the terrorist crackdown" title_color="#ffffff"][/vc_column][/vc_row][vc_row css=".vc_custom_1530005615579{margin-right: 15px !important;margin-left: 15px !important;}"][vc_column css=".vc_custom_1530005637267{margin-right: 15px !important;margin-left: 15px !important;}"][vc_column_text css=".vc_custom_1534229955945{padding-right: 15px !important;padding-left: 15px !important;}"]

As we reported before, major social media platforms such as Facebook, YouTube and Twitter set ambitious goals this year for removing terrorist propaganda. Recent reports from the Counter Extremism Project (CEP) show that these platforms are still falling short of those goals.

[/vc_column_text][/vc_column][/vc_row][vc_row][vc_column width="1/2"][vc_column_text]

CEP recently published a study showing that extremists are still reaching their target audience on YouTube and Facebook with few restrictions. Between March and June, ISIS supporters uploaded a total of 1,348 YouTube videos, garnering 163,391 views. Hany Farid, a senior CEP adviser, said that this is a lot of eyes and that YouTube needs to be more rigorous in its banning policy. 24% of those videos stayed on YouTube for more than two hours, which gave them enough time to be copied and redistributed on Facebook and other video streaming services. CEP traced the videos to 278 different accounts, and a staggering 60% of those accounts remained active on the platform even after the videos were taken down.

“It’s discouraging that accounts caught posting terrorist material are allowed to continue uploading videos even after they’ve had their videos removed for violating YouTube’s terms of service. We know these videos are being created for propaganda purposes to incite and encourage violence, and I find those videos dangerous in a very real way,” said Farid.

CEP used 183 keywords for this study, including Arabic words for “crusader” and “jihad,” along with the names of ISIS-controlled provinces and of media outlets that support the group. The study relied on sophisticated software that breaks videos down “to their essence” and recognizes extremist content, similar to INVISO.
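To illustrate the first half of that approach, here is a minimal keyword-flagging sketch. The watchlist and matching rules below are invented for illustration only; they are not CEP's actual 183 terms or its software.

```python
import re

# Hypothetical watchlist -- illustrative only, NOT CEP's actual keyword set.
WATCHLIST = {"crusader", "jihad"}

def flag_video(title: str, description: str) -> bool:
    """Return True if any watchlist term appears as a whole word
    in the video's title or description (case-insensitive)."""
    text = f"{title} {description}".lower()
    tokens = set(re.findall(r"\w+", text))
    return not WATCHLIST.isdisjoint(tokens)

# A title containing a watchlist term is flagged; an unrelated one is not.
print(flag_video("History of the Crusader states", ""))  # True
print(flag_video("Cooking pasta at home", ""))           # False
```

A production system would of course go far beyond exact word matching, handling transliteration variants, Arabic script, and the video content itself.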

[/vc_column_text][/vc_column][vc_column width="1/2"][vc_single_image image="612" img_size="900x400"][/vc_column][/vc_row][vc_row][vc_column css=".vc_custom_1530005173202{padding-right: 20px !important;padding-left: 20px !important;}"][vc_column_text]

The organization behind this study has been a vocal critic of social media companies for years over their approach to extremist content. Still, Farid says they are making progress, albeit slowly and with room for improvement. “Clearly Google is doing something about the amount of terrorism-related content posted to its platform. I’m encouraged because they’ve gone from not acknowledging the problem to actively pursuing it.”

We also have to keep in mind that the study likely missed many videos: some relevant keywords may not have been covered, and the sheer volume of daily uploads makes detection even harder. We have also covered how terrorists find new ways to disguise their content and trick recognition software and AI.
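Robust-hashing systems counter such disguises by matching content fingerprints approximately rather than exactly, so a re-encoded or lightly edited copy still matches. The toy sketch below illustrates the idea only; the 16-bit fingerprints and the 6-bit threshold are invented for this example, not taken from any real system.

```python
def hamming(a: int, b: int) -> int:
    """Count the bits that differ between two integer fingerprints."""
    return bin(a ^ b).count("1")

def is_near_duplicate(fp_a: int, fp_b: int, max_bits: int = 6) -> bool:
    """Treat fingerprints within max_bits of each other as the same content."""
    return hamming(fp_a, fp_b) <= max_bits

# A fingerprint perturbed by a minor edit (2 flipped bits) still matches,
# while unrelated content (many differing bits) does not.
known   = 0b1011010011001010
tweaked = known ^ 0b101          # simulate a small modification
print(is_near_duplicate(known, tweaked))  # True
print(is_near_duplicate(known, ~known & 0xFFFF))  # False
```

The design point is the threshold: exact hash matching breaks as soon as a single byte changes, whereas a Hamming-distance tolerance absorbs small perturbations while still rejecting unrelated content.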

These numbers alone aren’t alarming, but it is now clear that all of the big social media networks and video streaming services need third-party help. INVISO and CEP are good examples of platforms operating outside these social media giants that are more successful at discovering extremist content than the websites themselves.



[/vc_column_text][/vc_column][/vc_row][vc_row][vc_column width="1/4"][icon_box_content image="569" img_pos="center" title="EUROPEAN SECURITY SECRETARY ENTHUSIASTIC ABOUT INSIKT’S EC-FUNDED RESULTS" slink=""][/icon_box_content][/vc_column][vc_column width="1/4"][icon_box_content image="570" img_pos="center" title="THIS IS HOW EXTREMISTS TRY TO TRICK YOUTUBE" slink=""][/icon_box_content][/vc_column][vc_column width="1/4"][icon_box_content image="571" img_pos="center" title="TERRORIST CONTENT REMOVAL IN 60 SECONDS" slink=""][/icon_box_content][/vc_column][vc_column width="1/4"][icon_box_content image="156" img_pos="center" title="WHAT ARE SOCIAL MEDIA COMPANIES REALLY DOING TO REMOVE CRIMINAL CONTENT?" slink=""][/icon_box_content][/vc_column][/vc_row][vc_row][vc_column][vc_btn title="CONTACT US" size="lg" align="center" link="||target:%20_blank|"][/vc_column][/vc_row]


Copyright © 2021 INSIKT AI All rights reserved


Our technology has been co-funded by the European Union’s Horizon 2020 research and innovation programme under grant agreement number 767542.