The Information Society
An International Journal
Volume 39, 2023 - Issue 1

Social network dynamics, bots, and community-based online misinformation spread: Lessons from anti-refugee and COVID-19 misinformation cases

Pages 17-34 | Received 06 Jan 2021, Accepted 15 Feb 2022, Published online: 08 Nov 2022

References

  • Bennett, W. L., and S. Livingston. 2018. The disinformation order: Disruptive communication and the decline of democratic institutions. European Journal of Communication 33 (2):122–39. doi: 10.1177/0267323118760317.
  • Beskow, D. M., and K. M. Carley. 2018. Bot-hunter: A tiered approach to detecting & characterizing automated activity on Twitter. Accessed October 16, 2022. http://www.casos.cs.cmu.edu/publications/papers/LB_5.pdf.
  • Bessi, A., and E. Ferrara. 2016. Social bots distort the 2016 U.S. Presidential election online discussion. First Monday 21 (11). https://firstmonday.org/ojs/index.php/fm/article/view/7090.
  • Block, P. 2015. Reciprocity, transitivity, and the mysterious three-cycle. Social Networks 40:163–73. doi: 10.1016/j.socnet.2014.10.005.
  • Brainard, L. A. 2009. Cyber-communities. In International encyclopedia of civil society, ed. H.K. Anheier and S. Toepler, 587–600. New York: Springer.
  • Burt, R. S. 2005. Brokerage and closure: An introduction to social capital. Oxford, UK: Oxford University Press.
  • Centers for Disease Control and Prevention. 2020. Reducing stigma. Accessed October 16, 2020. https://www.cdc.gov/coronavirus/2019-ncov/daily-life-coping/reducing-stigma.html.
  • Chalmers, A. W., and P. A. Shotton. 2016. Changing the face of advocacy? Explaining interest organizations’ use of social media strategies. Political Communication 33 (3):374–91. doi: 10.1080/10584609.2015.1043477.
  • Chan, C., and K. Fu. 2017. The relationship between cyberbalkanization and opinion polarization: Time-series analysis on Facebook pages and opinion polls during the Hong Kong Occupy Movement and the associated debate on political reform. Journal of Computer-Mediated Communication 22 (5):266–83. doi: 10.1111/jcc4.12192.
  • Chen, E., K. Lerman, and E. Ferrara. 2020. Tracking social media discourse about the COVID-19 pandemic: Development of a public coronavirus Twitter data set. JMIR Public Health and Surveillance 6 (2):e19273. doi: 10.2196/19273.
  • Cinelli, M., W. Quattrociocchi, A. Galeazzi, C. M. Valensise, E. Brugnoli, A. L. Schmidt, P. Zola, F. Zollo, and A. Scala. 2020. The COVID-19 social media infodemic. Scientific Reports 10 (1):16598. doi: 10.1038/s41598-020-73510-5.
  • Colleoni, E., A. Rozza, and A. Arvidsson. 2014. Echo chamber or public sphere? Predicting political orientation and measuring political homophily in Twitter using big data. Journal of Communication 64 (2):317–32. doi: 10.1111/jcom.12084.
  • Dechêne, A., C. Stahl, J. Hansen, and M. Wänke. 2010. The truth about the truth: A meta-analytic review of the truth effect. Personality and Social Psychology Review 14 (2):238–57. doi: 10.1177/1088868309352251.
  • Del Vicario, M., A. Bessi, F. Zollo, F. Petroni, A. Scala, G. Caldarelli, H. E. Stanley, and W. Quattrociocchi. 2016. The spreading of misinformation online. Proceedings of the National Academy of Sciences of the United States of America 113 (3):554–9. doi: 10.1073/pnas.1517441113.
  • DeMarzo, P. M., D. Vayanos, and J. Zwiebel. 2003. Persuasion bias, social influence, and unidimensional opinions. The Quarterly Journal of Economics 118 (3):909–68. doi: 10.1162/00335530360698469.
  • Dixon, G. N., B. W. McKeever, A. E. Holton, C. Clarke, and G. Eosco. 2015. The power of a picture: Overcoming scientific misinformation by communicating weight-of-evidence information with visual exemplars. Journal of Communication 65 (4):639–59. doi: 10.1111/jcom.12159.
  • Doran, D., H. Alhazmi, and S. S. Gokhale. 2013. Triads, transitivity, and social effects in user interactions on Facebook. In 2013 Fifth International Conference on Computational Aspects of Social Networks, 68–73. New York: IEEE.
  • Dvir-Gvirsman, S. 2017. Media audience homophily: Partisan websites, audience identity and polarization processes. New Media & Society 19 (7):1072–91. doi: 10.1177/1461444815625945.
  • Ferrara, E. 2017. Disinformation and social bot operations in the run up to the 2017 French presidential election. First Monday 22 (8). https://firstmonday.org/ojs/index.php/fm/article/view/8005.
  • Freelon, D., and C. Wells. 2020. Disinformation as political communication. Political Communication 37 (2):145–56. doi: 10.1080/10584609.2020.1723755.
  • Guo, L., J. A. Rohde, and H. D. Wu. 2020. Who is responsible for Twitter’s echo chamber problem? Evidence from 2016 U.S. election networks. Information, Communication & Society 23 (2):234–51. doi: 10.1080/1369118X.2018.1499793.
  • Guo, L., and C. Vargo. 2020. “Fake news” and emerging online media ecosystem: An integrated intermedia agenda-setting analysis of the 2016 U.S. presidential election. Communication Research 47 (2):178–200. doi: 10.1177/0093650218777177.
  • Hameleers, M., and T. G. L. A. van der Meer. 2020. Misinformation and polarization in a high-choice media environment: How effective are political fact-checkers? Communication Research 47 (2):227–50. doi: 10.1177/0093650218819671.
  • Haustein, S., T. D. Bowman, K. Holmberg, A. Tsou, C. R. Sugimoto, and V. Larivière. 2016. Tweets as impact indicators: Examining the implications of automated “bot” accounts on Twitter. Journal of the Association for Information Science and Technology 67 (1):232–8. doi: 10.1002/asi.23456.
  • Hjorth, F., and R. Adler-Nissen. 2019. Ideological asymmetry in the reach of pro-Russian digital disinformation to United States audiences. Journal of Communication 69 (2):168–92. doi: 10.1093/joc/jqz006.
  • Hochschild, J. L., and K. L. Einstein. 2015. Do facts matter? Information and misinformation in American politics. Norman, OK: University of Oklahoma Press.
  • Holbert, R. L., R. K. Garrett, and L. S. Gleason. 2010. A new era of minimal effects? A response to Bennett and Iyengar. Journal of Communication 60 (1):15–34. doi: 10.1111/j.1460-2466.2009.01470.x.
  • Holland, P. W., and S. Leinhardt. 1971. Transitivity in structural models of small groups. Comparative Group Studies 2 (2):107–24. doi: 10.1177/104649647100200201.
  • Ibarra, H. 1993. Network centrality, power, and innovation involvement: Determinants of technical and administrative roles. Academy of Management Journal 36 (3):471–501. doi: 10.2307/256589.
  • Iribarren, J. L., and E. Moro. 2011. Affinity Paths and information diffusion in social networks. Social Networks 33 (2):134–42. doi: 10.1016/j.socnet.2010.11.003.
  • Jang, S. M., T. Geng, J.-Y Q. Li, R. Xia, C.-T. Huang, H. Kim, and J. Tang. 2018. A computational approach for examining the roots and spreading patterns of fake news: Evolution tree analysis. Computers in Human Behavior 84:103–13. doi: 10.1016/j.chb.2018.02.032.
  • Jerit, J., and Y. Zhao. 2020. Political misinformation. Annual Review of Political Science 23 (1):77–94. doi: 10.1146/annurev-polisci-050718-032814.
  • Jost, J. T., S. van der Linden, C. Panagopoulos, and C. D. Hardin. 2018. Ideological asymmetries in conformity, desire for shared reality, and the spread of misinformation. Current Opinion in Psychology 23:77–83.
  • Kasprak, A. 2020. The origins and scientific failings of the COVID-19 “bioweapon” conspiracy theory. Accessed October 16, 2022. https://www.snopes.com/news/2020/04/01/covid-19-bioweapon.
  • Koch, T., and T. Zerback. 2013. Helpful or harmful? How frequent repetition affects perceived statement credibility. Journal of Communication 63 (6):993–1010. doi: 10.1111/jcom.12063.
  • Krishna, A. 2017. Motivation with misinformation: Conceptualizing lacuna individuals and publics as knowledge-deficient, issue-negative activists. Journal of Public Relations Research 29 (4):176–93. doi: 10.1080/1062726X.2017.1363047.
  • Lai, G., and O. Wong. 2002. The tie effect on information dissemination: The spread of a commercial rumor in Hong Kong. Social Networks 24 (1):49–75. doi: 10.1016/S0378-8733(01)00050-8.
  • Lazer, D. M. J., M. A. Baum, Y. Benkler, A. J. Berinsky, K. M. Greenhill, F. Menczer, M. J. Metzger, B. Nyhan, G. Pennycook, D. Rothschild, M. Schudson, S. A. Sloman, C. R. Sunstein, E. A. Thorson, D. J. Watts, and J. L. Zittrain. 2018. The science of fake news. Science 359 (6380):1094–6. doi: 10.1126/science.aao2998.
  • Lee, T. K., Y. Kim, and K. Coe. 2018. When social media become hostile media: An experimental examination of news sharing, partisanship, and follower count. Mass Communication and Society 21 (4):450–72. doi: 10.1080/15205436.2018.1429635.
  • Lombard, M., J. Snyder-Duch, and C. C. Bracken. 2002. Content analysis in mass communication: Assessment and reporting of intercoder reliability. Human Communication Research 28 (4):587–604. doi: 10.1111/j.1468-2958.2002.tb00826.x.
  • Magelinski, T., D. Beskow, and K. M. Carley. 2019. Graph-Hist: Graph classification from latent feature histograms with application to bot detection. Accessed October 16, 2022. http://arxiv.org/abs/1910.01180.
  • McMillan, C., D. Felmlee, and D. W. Osgood. 2018. Peer influence, friend selection, and gender: How network processes shape adolescent smoking, drinking, and delinquency. Social Networks 55:86–96. doi: 10.1016/j.socnet.2018.05.008.
  • McPherson, M., L. Smith-Lovin, and J. M. Cook. 2001. Birds of a feather: Homophily in social networks. Annual Review of Sociology 27 (1):415–44. doi: 10.1146/annurev.soc.27.1.415.
  • Mejias, U. A., and N. E. Vokuev. 2017. Disinformation and the media: The case of Russia and Ukraine. Media, Culture & Society 39 (7):1027–42. doi: 10.1177/0163443716686672.
  • Milman, O. 2020. A quarter of all tweets about climate change are produced by bots. Accessed October 16, 2022. https://grist.org/climate/a-quarter-of-all-tweets-about-climate-change-are-produced-by-bots/.
  • Mocanu, D., L. Rossi, Q. Zhang, M. Karsai, and W. Quattrociocchi. 2015. Collective attention in the age of (mis)information. Computers in Human Behavior 51:1198–204. doi: 10.1016/j.chb.2015.01.024.
  • Monge, P. R., and N. S. Contractor. 2003. Theories of communication networks. Oxford, UK: Oxford University Press.
  • Pasek, J., G. Sood, and J. A. Krosnick. 2015. Misinformed about the Affordable Care Act? Leveraging certainty to assess the prevalence of misperceptions. Journal of Communication 65 (4):660–73. doi: 10.1111/jcom.12165.
  • Robins, G., P. Pattison, Y. Kalish, and D. Lusher. 2007. An introduction to exponential random graph (p*) models for social networks. Social Networks 29 (2):173–91. doi: 10.1016/j.socnet.2006.08.002.
  • Roth, Y., and N. Pickles. 2020. Bot or not? The facts about platform manipulation on Twitter. Accessed August 22, 2021. https://blog.twitter.com/en_us/topics/company/2020/bot-or-not.
  • Shin, J., and K. Thorson. 2017. Partisan selective sharing: The biased diffusion of fact-checking messages on social media. Journal of Communication 67 (2):233–55. doi: 10.1111/jcom.12284.
  • Shin, J., L. Jian, K. Driscoll, and F. Bar. 2017. Political rumoring on Twitter during the 2012 U.S. presidential election: Rumor diffusion and correction. New Media & Society 19 (8):1214–35. doi: 10.1177/1461444816634054.
  • Shin, J., L. Jian, K. Driscoll, and F. Bar. 2018. The diffusion of misinformation on social media: Temporal pattern, message, and source. Computers in Human Behavior 83:278–87. doi: 10.1016/j.chb.2018.02.008.
  • Shulman, S. 2011. DiscoverText: Software training to unlock the power of text. In Proceedings of the 12th Annual International Digital Government Research Conference: Digital Government Innovation in Challenging Times, 373. New York: ACM.
  • Song, H., and H. G. Boomgaarden. 2017. Dynamic spirals put to test: An agent-based model of reinforcing spirals between selective exposure, interpersonal networks, and attitude polarization. Journal of Communication 67 (2):256–81. doi: 10.1111/jcom.12288.
  • Southwell, B. G., and E. A. Thorson. 2015. The prevalence, consequence, and remedy of misinformation in mass media systems. Journal of Communication 65 (4):589–95. doi: 10.1111/jcom.12168.
  • Sunstein, C. R. 2007. Neither Hayek nor Habermas. Public Choice 134 (1-2):87–95. doi: 10.1007/s11127-007-9202-9.
  • Thorson, E. 2016. Belief echoes: The persistent effects of corrected misinformation. Political Communication 33 (3):460–80. doi: 10.1080/10584609.2015.1102187.
  • Vargo, C. J., L. Guo, and M. A. Amazeen. 2018. The agenda-setting power of fake news: A big data analysis of the online media landscape from 2014 to 2016. New Media & Society 20 (5):2028–49. doi: 10.1177/1461444817712086.
  • Vosoughi, S., D. Roy, and S. Aral. 2018. The spread of true and false news online. Science 359 (6380):1146–51. doi: 10.1126/science.aap9559.
  • Vraga, E. K., and L. Bode. 2017. Using expert sources to correct health misinformation in social media. Science Communication 39 (5):621–45. doi: 10.1177/1075547017731776.
  • Waisbord, S. 2018. Why populism is troubling for democratic communication. Communication, Culture and Critique 11 (1):21–34. doi: 10.1093/ccc/tcx005.
  • Wang, L., A. Yang, and K. Thorson. 2021. Serial participants of social media climate discussion as a community of practice: A longitudinal network analysis. Information, Communication & Society 24 (7):941–59. doi: 10.1080/1369118X.2019.1668457.
  • Wang, P., G. Robins, P. Pattison, and E. Lazega. 2013. Exponential random graph models for multilevel networks. Social Networks 35 (1):96–115. doi: 10.1016/j.socnet.2013.01.004.
  • Wang, X., and Y. Song. 2020. Viral misinformation and echo chambers: The diffusion of rumors about genetically modified organisms on social media. Internet Research 30 (5):1547–64. doi: 10.1108/INTR-11-2019-0491.
  • Wardle, C. 2017. Fake news. It’s complicated. Accessed October 16, 2022. https://firstdraftnews.org:443/latest/fake-news-complicated/.
  • Watts, D. J. 2002. A simple model of global cascades on random networks. Proceedings of the National Academy of Sciences of the United States of America 99 (9):5766–71. doi: 10.1073/pnas.082090499.
  • Weeks, B. E. 2015. Emotions, partisanship, and misperceptions: How anger and anxiety moderate the effect of partisan bias on susceptibility to political misinformation. Journal of Communication 65 (4):699–719. doi: 10.1111/jcom.12164.
  • Weeks, B. E., and H. Gil de Zúñiga. 2021. What’s next? Six observations for the future of political misinformation research. American Behavioral Scientist 65 (2):277–89. doi: 10.1177/0002764219878236.
  • Willson, M. 2010. Technology, networks and communities: An exploration of network and community theory and technosocial forms. Information, Communication & Society 13 (5):747–64. doi: 10.1080/13691180903271572.
  • Wojcik, S., S. Messing, A. Smith, L. Rainie, and P. Hitlin. 2018. Bots in the Twittersphere. Accessed October 16, 2022. https://www.pewresearch.org/internet/2018/04/09/bots-in-the-twittersphere/.
  • World Health Organization. 2020. How to report misinformation online. Accessed December 27, 2020. https://www.who.int/campaigns/connecting-the-world-to-combat-coronavirus/how-to-report-misinformation-online.
