Research Note

Hyperlinked Sympathizers: URLs and the Islamic State

Pages 239–257 | Received 17 Jan 2018, Accepted 10 Mar 2018, Published online: 18 Apr 2018
 

ABSTRACT

The self-proclaimed Islamic State of Iraq and Syria (ISIS) and its supporters take measured steps to ensure the group's survival in the virtual sphere, despite continued efforts to undercut the organization. This study examines a time-bound sample of 240,158 Uniform Resource Locators (URLs) shared among English-language ISIS sympathizers on Twitter to better understand how networks in the jihadisphere insulate radical materials and communities online. A thematic but descriptive analysis of the results illustrates the dynamic apparatus of digital communications leveraged by ISIS. Findings suggest that a more comprehensive strategy to undercut ISIS's web of online information requires a similarly networked response by counterextremism practitioners.

Acknowledgments

The authors thank the George Washington University's Program on Extremism and the Applied Research Lab at Penn State University for their support of this collaborative research project. In particular, Pete Forster, Julia Erdley, Alexander Meleagrou-Hitchens, Seamus Hughes, and Lorenzo Vidino were instrumental in making this article possible. Sarah Metz and Tanner Wrape, research assistants at the Program, also played a critical role in processing the dataset. Last, the authors thank Daniel Kerchner and the team at the Scholarly Technology Group of the George Washington University Libraries for their continued support and immeasurable contributions to this dataset and initiative.

Notes

1. Alexander Meleagrou-Hitchens, Audrey Alexander, and Nick Kaderbhai, “Literature Review: The Impact of Digital Communication Technology on Radicalization and Recruitment,” International Affairs 93(5) (2017).

2. The United Kingdom, United States, Germany, Italy, and France serve as prime examples.

3. Bethan McKernan, “Isis Launches Full-Scale Propaganda Offensive as It Loses Battles in Syria and Iraq,” The Independent (29 September 2017); Charlie Winter, “How ISIS Survives the Fall of Mosul,” The Atlantic (3 July 2017).

4. J. M. Berger and Heather Perez, “The Islamic State's Diminishing Return on Twitter: How Suspensions Are Limiting the Social Networks of English-Speaking ISIS Supporters” (Washington, DC: George Washington University, Program on Extremism, 2016); Maura Conway, Moign Khawaja, Suraj Lakhani, Jeremy Reffin, Andrew Robertson, and David Weir, “Disrupting Daesh: Measuring Takedown of Online Terrorist Material and Its Impacts” (Dublin, Ireland: Dublin City University, VOX-Pol, 2017); Martyn Frampton, Ali Fisher, and Nico Prucha, “The New Netwar: Countering Extremism” (London: Policy Exchange, 2017).

5. Audrey Alexander, “Digital Decay? Tracing Change Over Time Among English-Language Islamic State Sympathizers on Twitter” (Washington, DC: George Washington University, Program on Extremism, 2017).

6. Ibid.

7. Ibid.

8. This dataset contains 845,646 tweets by 1,782 unique accounts of English-language ISIS sympathizers on Twitter. Accounts were selected between 15 February 2016 and 1 May 2017 using snowball sampling methods to identify accounts that met predetermined selection criteria. The key requirements for inclusion in the study were: (1) at least half of the account's tweets were English-language, and (2) the account shared pro-ISIS sentiment via organic tweets or retweets.

9. Craig Whiteside, “Lighting the Path: The Evolution of the Islamic State Media Enterprise (2003–2016)” (The Hague: International Centre for Counter-Terrorism, 2016); Aaron Zelin, “The State of Global Jihad Online” (Washington, DC: New America Foundation, 2013); Martyn Frampton, Ali Fisher, and Nico Prucha, “The New Netwar: Countering Extremism” (London: Policy Exchange, 2017).

10. Daniel Milton, “Communication Breakdown: Unraveling the Islamic State's Media Efforts” (West Point, NY: Combating Terrorism Center, 2016), p. V.

11. Charlie Winter, “Documenting the Virtual ‘Caliphate’: Understanding Islamic State's Propaganda” (London: Quilliam Foundation, 2015); Brendan Koerner, “Why ISIS Is Winning the Social Media War—And How to Fight Back,” WIRED (April 2016).

12. Craig Whiteside, “Lighting the Path: The Evolution of the Islamic State Media Enterprise (2003–2016)” (The Hague: International Centre for Counter-Terrorism, 2016), p. 26.

13. Aaron Zelin, “ICSR Insight—The Decline in Islamic State Media Output” (London: International Centre for the Study of Radicalisation, 2015); Daniel Milton, “Communication Breakdown: Unraveling the Islamic State's Media Efforts” (West Point, NY: Combating Terrorism Center, 2016), p. 21.

14. “Radical Preacher Anjem Choudary Jailed for Five Years,” BBC News (2016).

15. Cecilia Kang and Matt Apuzzo, “U.S. Asks Tech and Entertainment Industries Help in Fighting Terrorism,” The New York Times (24 February 2018).

16. In the official “Twitter Rules,” the company identifies “Violent threats (direct or indirect)” as grounds for temporarily locking and/or permanently suspending accounts, explaining, “You may not make threats of violence or promote violence, including threatening or promoting terrorism.” Twitter, “The Twitter Rules” (2017). Available at https://help.twitter.com/articles/18311?lang=en (accessed 16 January 2018).

17. Please note that Twitter's efforts to combat violent extremism focus predominantly, but not exclusively, on IS. As a result, the suspension figure provided by Twitter also includes extremism-related suspensions that do not pertain to IS. Twitter, “Government TOS Reports—January to June 2017” (2017). Available at https://transparency.twitter.com/en/gov-tos-reports.html (accessed 16 January 2018). Additionally, in a 2016 blogpost, the company claims to investigate “accounts similar to those reported” and “leverage[s] proprietary spam-fighting tools to surface other potentially violating accounts for review.” Twitter, “Combating Violent Extremism” (2016). Available at https://blog.twitter.com/2016/combating-violent-extremism (accessed 16 January 2018).

18. Twitter, “Guidelines for Law Enforcement,” (2017). Available at https://help.twitter.com/articles/41949?lang=en (accessed 16 January 2018). See also, Twitter, “An Update on Our Efforts to Combat Violent Extremism” (2016). Available at https://blog.twitter.com/2016/an-update-on-our-efforts-to-combat-violent-extremism (accessed 16 January 2018).

19. Twitter, “Partnering to Help Curb the Spread of Terrorist Content Online” (2016). Available at https://blog.twitter.com/2016/partnering-to-help-curb-the-spread-of-terrorist-content-online (accessed 16 January 2018).

20. Facebook, “Global Internet Forum to Counter Terrorism to Hold First Meeting in San Francisco” (2017). Available at https://newsroom.fb.com/news/2017/07/global-internet-forum-to-counter-terrorism-to-hold-first-meeting-in-san-francisco/ (accessed 16 January 2018).

21. Ali Fisher, “Eye of the Swarm: The Rise of ISIS and the Media Mujahedeen” (Los Angeles: USC Center on Public Diplomacy, 2014).

22. Nico Prucha, “IS and the Jihadist Information Highway—Projecting Influence and Religious Identity via Telegram,” Perspectives on Terrorism 10(6) (2016), p. 54.

23. Ibid.

24. For more information regarding the collection of this dataset, refer to the methodology section of this report: Audrey Alexander, “Digital Decay? Tracing Change Over Time Among English-Language Islamic State Sympathizers on Twitter” (Washington, DC: George Washington University, Program on Extremism, 2017).

25. Ibid.

26. Alberto Fernandez, “Here to Stay and Growing: Combating ISIS Propaganda Networks” (Brookings Institution, 2015).

27. The dataset included tweets containing anywhere between one and five hyperlinks.

28. For retweets, full URLs were parsed from the metadata associated with the particular tweet, as well as from the message that was retweeted. This step was taken to capture the intended digital destination of each share. Notably, retweets can themselves be retweeted; this study considered only the two most recent iterations of retweet history when isolating URLs.
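The retweet-unwrapping described in note 28 can be sketched in Python. This is an illustrative reconstruction, not the authors' actual pipeline: it assumes the Twitter REST API v1.1 JSON layout, in which shared links appear under `entities.urls` (each with an `expanded_url` field) and a retweet embeds the original message as `retweeted_status`.

```python
def extract_urls(tweet, depth=2):
    """Collect expanded URLs from a tweet and its embedded retweet(s).

    `depth=2` mirrors the study's rule of considering only the two most
    recent iterations of retweet history. Field names assume the Twitter
    REST API v1.1 tweet object.
    """
    urls = []
    while tweet is not None and depth > 0:
        # Shared links live under entities.urls in the v1.1 payload.
        for entity in tweet.get("entities", {}).get("urls", []):
            expanded = entity.get("expanded_url")
            if expanded:
                urls.append(expanded)
        # Walk into the original message that was retweeted, if any.
        tweet = tweet.get("retweeted_status")
        depth -= 1
    return urls
```

Applied to a retweet, the function returns the link metadata of both the retweet itself and the underlying original message.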

29. The broader coding framework accounted for three variables: language, directional flow of information, and domain type.

30. Three coders performed a coding trial of the first 150 domains. Fleiss's kappa statistic was used to assess the interrater reliability (IRR) for this sample. With an IRR score of kappa = 0.759 (p = .0000), 95% CI (.712, .806), the coding scheme was deemed acceptable. Owing to limited resources, however, the trial sample does not adhere to traditional standards for piloting studies, a limitation the authors acknowledge.

31. Research assistants at the Program on Extremism served as coders for this project. The authors trained the coders as a group and provided each with a copy of the codebook, as well as detailed instructions outlining the process. In order to preserve the integrity of IRR, coders were instructed not to consult one another.

32. Additional research included basic Web searches, as well as consultation of archived Web data. Coders were advised against navigating to any domain that posed potential security risks. The “inaccessible” and “blocked” domain categories were also available options for the Secondary Code.

33. For more detail on the logistics of this process, please contact the authors.

34. Interrater reliability tests used the PC values from each coder for domains without consensus.

35. Joseph L. Fleiss, “Measuring Nominal Scale Agreement Among Many Raters,” Psychological Bulletin 76(5) (1971), pp. 378–382. Furthermore, due to methodological limitations and an overarching emphasis on qualitative discussions supported by quantitative findings, the authors did not conduct separate tests of validity for the coding scheme.

36. Indexed by base domain.

37. Consensus was achieved for 4,336 of the 4,375 domains with an interrater reliability score of Fleiss's kappa = 0.709 (p = .0000), 95% CI (0.701, 0.717). This kappa value demonstrates substantial agreement among the coders, and the p-value indicates that the observed agreement is highly unlikely to be due to chance alone (α = 0.05). The threshold for substantial agreement was accepted as kappa > 0.61, in accordance with Landis and Koch: J. Richard Landis and Gary G. Koch, “The Measurement of Observer Agreement for Categorical Data,” Biometrics 33(1) (1977), pp. 159–174.
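For readers unfamiliar with the statistic, Fleiss's kappa can be computed directly from a table of per-domain category counts. The sketch below is a generic implementation of the standard formula (Fleiss 1971), not the Minitab routine the authors used; the `ratings` matrix is hypothetical.

```python
def fleiss_kappa(ratings):
    """Fleiss's kappa for a list of per-subject category counts.

    `ratings[i][j]` is the number of raters who assigned subject i
    (here, a domain) to category j; every row must sum to the same
    number of raters n.
    """
    N = len(ratings)      # number of subjects (e.g., domains coded)
    n = sum(ratings[0])   # raters per subject
    k = len(ratings[0])   # number of categories

    # Observed agreement per subject, averaged over all subjects.
    P_bar = sum(
        (sum(c * c for c in row) - n) / (n * (n - 1)) for row in ratings
    ) / N

    # Chance agreement from the marginal category proportions.
    p_j = [sum(row[j] for row in ratings) / (N * n) for j in range(k)]
    P_e = sum(p * p for p in p_j)

    return (P_bar - P_e) / (1 - P_e)
```

Perfect agreement among coders yields kappa = 1, while agreement no better than chance yields kappa ≤ 0.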

38. Researchers relied on a number of tools for data processing, analysis, and visualization. Minitab® 18 Statistical Software was used to determine the degree of agreement between coders: Minitab® 18 Statistical Software (2010), State College, PA: Minitab Inc. (www.minitab.com); MINITAB® and all other trademarks and logos for the company's products and services are the exclusive property of Minitab Inc.; all other marks referenced remain the property of their respective owners; see minitab.com for more information. Python and Pandas were used to clean, analyze, and visualize data: Python Core Team (2015), Python: A Dynamic, Open Source Programming Language, Python Software Foundation, available at https://www.python.org/; Wes McKinney, “Data Structures for Statistical Computing in Python,” Proceedings of the 9th Python in Science Conference (2010), pp. 51–56, available at http://conference.scipy.org/proceedings/scipy2010/mckinney.html (accessed 16 January 2018). Tableau and Inkscape were used to create visualizations: Tableau Desktop (2003), Mountain View, CA: Tableau Software; Inkscape (version 0.92), open source, 2003. The ELK Stack is a collection of three open-source products: Elasticsearch, Logstash, and Kibana; for more information, see https://www.elastic.co/products/kibana (accessed 16 January 2018).

39. The metadata presents retweets as URLs directed at Twitter.

40. Amarnath Amarasingam, “What Twitter Really Means for Islamic State Supporters,” War on the Rocks (30 December 2015).

41. Throughout the following discussion, percentages round to the nearest whole number. As a result, they do not always add up to 100.

42. See Additional Observations section for more detail.

43. It is interesting to note that one article was shared multiple times even though it was originally published before the period covered by this study. Azad Essa, “Muslims Being ‘Erased’ from Central African Republic,” Al Jazeera (31 July 2015).

44. “Syria War: Dozens Killed in ‘US-Led Strikes’ on Manbij,” Al Jazeera (20 July 2016).

45. Ned Parker and Jonathan Landay, “Special Report: Massacre Reports Show U.S. Inability to Curb Iraq Militias,” Reuters (23 August 2016).

46. Aymenn Jawad Al-Tamimi, “The Archivist: Media Fitna in the Islamic State” (28 September 2017). Available at http://www.aymennjawad.org/20357/the-archivist-media-fitna-in-the-islamic-state (accessed 16 January 2018).

47. In the dataset, At-Tamkin link shares were especially common, garnering more attention than the Amaq and Al-Bayan base domains. Western efforts to remove extremist content may affect this outcome, but further examination is necessary. For more information about At-Tamkin, see “Islamic State Set to Release New Video on Bangladesh,” Dhaka Tribune (23 September 2016). Available at http://www.dhakatribune.com/bangladesh/2016/09/23/17691/ (accessed 16 January 2018).

48. For a more expansive take on this discussion, see Nico Prucha, “IS and the Jihadist Information Highway—Projecting Influence and Religious Identity via Telegram,” Perspectives on Terrorism 10(6) (2016), p. 51.

49. Telegram, “Shared Files and Fast Mute” (1 February 2015). Available at https://telegram.org/blog/shared-files (accessed 16 January 2018).

50. Twitter, “How to Share and Watch Videos on Twitter” (11 December 2017). Available at http://help.twitter.com/en/using-twitter/twitter-videos (accessed 16 January 2018).

51. The top 20th and 21st base domains had the same number of shares, so both were included in the list. After combining domains with the same end destination (such as justpaste.it and jpst.it), the final list of top “file-sharing” domains contained nineteen base domains.

52. This conceptualization of social networking sites includes applications, as well as plug-ins supported by existing sites that further optimize the dissemination of information.

53. The Global Internet Forum to Counter Terrorism was initially built on a partnership to develop a shared industry database of “hashes,” or virtual fingerprints. Such tools make it easier to detect problematic content, and possibly users, flowing across the platforms engaged in the partnership. Google, “Partnering to Help Curb the Spread of Terrorist Content Online” (5 December 2016). Available at https://blog.google/topics/google-europe/partnering-help-curb-spread-terrorist-content-online/ (accessed 16 January 2018).

54. To identify the leading “social network” sites, the authors fused instances where unique domains appeared to lead to the same destination.

55. In this study, base domains include only the top- and second-level domains.
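Under the definition in note 55, reducing a URL to its base domain can be sketched as follows. The stripping of a leading `www.` label and the caveat about multi-part public suffixes are illustrative assumptions beyond the note itself.

```python
from urllib.parse import urlsplit

def base_domain(url):
    """Reduce a URL to its top- and second-level domain labels."""
    host = (urlsplit(url).hostname or "").lower()
    if host.startswith("www."):
        host = host[4:]  # drop a leading "www." label
    # Keep only the last two labels (second-level + top-level domain).
    # Caution: this naive two-label rule misfires on multi-part public
    # suffixes such as .co.uk; a production version would consult the
    # Public Suffix List instead.
    return ".".join(host.split(".")[-2:])
```

For example, `https://www.justpaste.it/abc` and `https://justpaste.it/xyz` both index to the base domain `justpaste.it`.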

56. Audrey Alexander and William Braniff, “Marginalizing Violent Extremism Online,” Lawfare, 21 January 2018.

57. Twitter Public Policy Blog, “Global Internet Forum to Counter Terrorism” (26 June 2017). Available at https://blog.twitter.com/official/en_us/topics/company/2017/Global-Internet-Forum-to-Counter-Terrorism.html (accessed 16 January 2018).

58. Tech Against Terrorism, “Knowledge Sharing Platform.” Available at http://ksp.techagainstterrorism.org/ (accessed 16 January 2018).
