
Impacts of Attitudes Toward Government and Corporations on Public Trust in Artificial Intelligence


References

  • Anderson, A. A., Scheufele, D. A., Brossard, D., & Corley, E. A. (2011). The role of media and deference to scientific authority in cultivating trust in sources of information about emerging technologies. International Journal of Public Opinion Research, 24(2), 225–237. https://doi.org/10.1093/ijpor/edr032
  • Boyd, R., & Holton, R. J. (2018). Technology, innovation, employment and power: Does robotics and artificial intelligence really mean social transformation? Journal of Sociology, 54(3), 331–345. https://doi.org/10.1177/1440783317726591
  • Broussard, M., Diakopoulos, N., Guzman, A. L., Abebe, R., Dupagne, M., & Chuan, C. H. (2019). Artificial intelligence and journalism. Journalism & Mass Communication Quarterly, 96(3), 673–695. https://doi.org/10.1177/1077699019859901
  • Camporesi, S., Vaccarella, M., & Davis, M. (2017). Investigating public trust in expert knowledge: Narrative, ethics, and engagement. Journal of Bioethical Inquiry, 14(1), 23–30. https://doi.org/10.1007/s11673-016-9767-4
  • Carlson, M. (2015). The robotic reporter: Automated journalism and the redefinition of labor, compositional forms, and journalistic authority. Digital Journalism, 3(3), 416–431. https://doi.org/10.1080/21670811.2014.976412
  • Carter, D. (2018). How real is the impact of artificial intelligence? The business information survey 2018. Business Information Review, 35(3), 99–115. https://doi.org/10.1177/0266382118790150
  • Castelvecchi, D. (2016). Can we open the black box of AI? Nature News, 538(7623), 20. https://doi.org/10.1038/538020a
  • Chang, R. C. S., Lu, H. P., & Yang, P. (2018). Stereotypes or golden rules? Exploring likable voice traits of social robots as active aging companions for tech-savvy baby boomers in Taiwan. Computers in Human Behavior, 84, 194–210. https://doi.org/10.1016/j.chb.2018.02.025
  • Cui, D., & Wu, F. (2019). The influence of media use on public perceptions of artificial intelligence in China: Evidence from an online survey. Information Development. Advance online publication. https://doi.org/10.1177/0266666919893411
  • Deng, B. (2015). Machine ethics: The robot’s dilemma. Nature News, 523(7558), 24. https://doi.org/10.1038/523024a
  • Dörr, K. N. (2016). Mapping the field of algorithmic journalism. Digital Journalism, 4(6), 700–722. https://doi.org/10.1080/21670811.2015.1096748
  • Dunlap, R. E., & McCright, A. M. (2011). Organized climate change denial. In J. S. Dryzek, R. B. Norgaard, & D. Schlosberg (Eds.), The Oxford handbook of climate change and society (pp. 144–160). Oxford University Press. https://doi.org/10.1093/oxfordhb/9780199566600.003.0010
  • Edwards, A., Edwards, C., Spence, P. R., Harris, C., & Gambino, A. (2016). Robots in the classroom: Differences in students’ perceptions of credibility and learning between “teacher as robot” and “robot as teacher”. Computers in Human Behavior, 65, 627–634. https://doi.org/10.1016/j.chb.2016.06.005
  • Feldman, L., Maibach, E. W., Roser-Renouf, C., & Leiserowitz, A. (2012). Climate on cable: The nature and impact of global warming coverage on Fox News, CNN, and MSNBC. The International Journal of Press/Politics, 17(1), 3–31. https://doi.org/10.1177/1940161211425410
  • Ferrara, E., Varol, O., Davis, C., Menczer, F., & Flammini, A. (2016). The rise of social bots. Communications of the ACM, 59(7), 96–104. https://doi.org/10.1145/2818717
  • Freimuth, V. S., Musa, D., Hilyard, K., Quinn, S. C., & Kim, K. (2013). Trust during the early stages of the 2009 H1N1 pandemic. Journal of Health Communication, 19(3), 321–339. https://doi.org/10.1080/10810730.2013.811323
  • Gauchat, G. (2012). Politicization of science in the public sphere: A study of public trust in the United States, 1974 to 2010. American Sociological Review, 77(2), 167–187. https://doi.org/10.1177/0003122412438225
  • Gherheș, V., & Obrad, C. (2018). Technical and humanities students’ perspectives on the development and sustainability of artificial intelligence (AI). Sustainability, 10(9), 3066. https://doi.org/10.3390/su10093066
  • Giddens, A. (1991). The consequences of modernity. Stanford University Press.
  • Graefe, A., Haim, M., Haarmann, B., & Brosius, H. B. (2018). Readers’ perception of computer-generated news: Credibility, expertise, and readability. Journalism, 19(5), 595–610. https://doi.org/10.1177/1464884916641269
  • Gulrez, T., & Nefti-Meziani, S. (2015). Loneliness kills: Can autonomous systems and robotics assist in providing solutions? International Journal of Swarm Intelligence and Evolutionary Computation, 5(1). https://doi.org/10.4172/2090-4908.1000e113
  • Gulson, K. N., & Webb, P. T. (2017). Mapping an emergent field of ‘computational education policy’: Policy rationalities, prediction and data in the age of Artificial Intelligence. Research in Education, 98(1), 14–26. https://doi.org/10.1177/0034523717723385
  • Gunkel, D. J. (2012). Communication and artificial intelligence: Opportunities and challenges for the 21st century. Communication +1, 1(1), 1–25. https://doi.org/10.7275/R5QJ7F7R
  • Guzman, A. L. (2019). Voices in and of the machine: Source orientation toward mobile virtual assistants. Computers in Human Behavior, 90, 343–350. https://doi.org/10.1016/j.chb.2018.08.009
  • Guzman, A. L., & Lewis, S. C. (2020). Artificial intelligence and communication: A Human–Machine Communication research agenda. New Media & Society, 22(1), 70–86. https://doi.org/10.1177/1461444819858691
  • Hengstler, M., Enkel, E., & Duelli, S. (2016). Applied artificial intelligence and trust—The case of autonomous vehicles and medical assistance devices. Technological Forecasting and Social Change, 105, 105–120. https://doi.org/10.1016/j.techfore.2015.12.014
  • Hmielowski, J. D., Feldman, L., Myers, T. A., Leiserowitz, A., & Maibach, E. (2014). An attack on science? Media use, trust in scientists, and perceptions of global warming. Public Understanding of Science, 23(7), 866–883. https://doi.org/10.1177/0963662513480091
  • Jho, H., Yoon, H. G., & Kim, M. (2014). The relationship of science knowledge, attitude and decision making on socio-scientific issues: The case study of students’ debates on a nuclear power plant in Korea. Science & Education, 23(5), 1131–1151. https://doi.org/10.1007/s11191-013-9652-z
  • Jones, S. (2014). People, things, memory and human-machine communication. International Journal of Media & Cultural Politics, 10(3), 245–258. https://doi.org/10.1386/macp.10.3.245_1
  • Kahan, D. M. (2013). Ideology, motivated reasoning, and cognitive reflection: An experimental study. Judgment and Decision Making, 8, 407–424. https://doi.org/10.2139/ssrn.2182588
  • Kahan, D. M., Peters, E., Wittlin, M., Slovic, P., Ouellette, L. L., Braman, D., & Mandel, G. (2012). The polarizing impact of science literacy and numeracy on perceived climate change risks. Nature Climate Change, 2(10), 732. https://doi.org/10.1038/nclimate1547
  • Khakurel, J., Penzenstadler, B., Porras, J., Knutas, A., & Zhang, W. (2018). The rise of artificial intelligence under the lens of sustainability. Technologies, 6(4), 100. https://doi.org/10.3390/technologies6040100
  • Krosnick, J. A., & MacInnis, B. (2010). Frequent viewers of Fox News are less likely to accept scientists’ views of global warming. Stanford University. Report for The Woods Institute for the Environment. http://woods.stanford.edu/docs/surveys/Global-Warming-Fox-News.pdf
  • Leiserowitz, A. A., Smith, N., & Marlon, J. R. (2010). Americans’ knowledge of climate change. Yale Project on Climate Change Communication.
  • Lewis, S. C., Guzman, A. L., & Schmidt, T. R. (2019). Automation, journalism, and human–machine communication: Rethinking roles and relationships of humans and machines in news. Digital Journalism, 7(4), 409–427. https://doi.org/10.1080/21670811.2019.1577147
  • Lewis, S. C., Sanders, A. K., & Carmody, C. (2019). Libel by algorithm? Automated journalism and the threat of legal liability. Journalism & Mass Communication Quarterly, 96(1), 60–81. https://doi.org/10.1177/1077699018755983
  • Lewis, S. C., & Westlund, O. (2015). Actors, actants, audiences, and activities in cross-media news work: A matrix and a research agenda. Digital Journalism, 3(1), 19–37. https://doi.org/10.1080/21670811.2014.927986
  • McCright, A. M., Dentzman, K., Charters, M., & Dietz, T. (2013). The influence of political ideology on trust in science. Environmental Research Letters, 8(4), 044029. https://doi.org/10.1088/1748-9326/8/4/044029
  • Nam, H. H., Jost, J. T., & Van Bavel, J. J. (2013). “Not for all the tea in China!” Political ideology and the avoidance of dissonance-arousing situations. PLoS ONE, 8(4), e59837. https://doi.org/10.1371/journal.pone.0059837
  • Neri, H., & Cozman, F. (2019). The role of experts in the public perception of risk of artificial intelligence. AI & Society, 1–11. https://doi.org/10.1007/s00146-019-00924-9
  • Nisbet, E. C., Cooper, K. E., & Ellithorpe, M. (2015). Ignorance or bias? Evaluating the ideological and informational drivers of communication gaps about climate change. Public Understanding of Science, 24(3), 285–301. https://doi.org/10.1177/0963662514545909
  • Nisbet, E. C., Cooper, K. E., & Garrett, R. K. (2015). The partisan brain: How dissonant science messages lead conservatives and liberals to (dis) trust science. The Annals of the American Academy of Political and Social Science, 658(1), 36–66. https://doi.org/10.1177/0002716214555474
  • Noble, S. U. (2018). Algorithms of oppression: How search engines reinforce racism. New York University Press. https://doi.org/10.2307/j.ctt1pwt9w5
  • Papacharissi, Z. (Ed.). (2019). A networked self and human augmentics, artificial intelligence, sentience. Routledge. https://doi.org/10.4324/9781315202082
  • Pechar, E., Bernauer, T., & Mayer, F. (2018). Beyond political ideology: The impact of attitudes towards government and corporations on trust in science. Science Communication, 40(3), 291–313. https://doi.org/10.1177/1075547018763970
  • Peter, J., & Kühne, R. (2018). The new frontier in communication research: Why we should study social robots. Media and Communication, 6(3), 73–76. https://doi.org/10.17645/mac.v6i3.1596
  • Primo, A., & Zago, G. (2015). Who and what do journalism? An actor-network perspective. Digital Journalism, 3(1), 38–52. https://doi.org/10.1080/21670811.2014.927987
  • Reeves, J. (2016). Automatic for the people: The automation of communicative labor. Communication and Critical/Cultural Studies, 13(2), 150–165. https://doi.org/10.1080/14791420.2015.1108450
  • Rosenthal-von der Pütten, A. M., Straßmann, C., Yaghoubzadeh, R., Kopp, S., & Krämer, N. C. (2019). Dominant and submissive nonverbal behavior of virtual agents and its effects on evaluation and negotiation outcome in different age groups. Computers in Human Behavior, 90, 397–409. https://doi.org/10.1016/j.chb.2018.08.047
  • Sandry, E. (2015). Robots and communication. Palgrave Pivot. https://doi.org/10.1057/978113746
  • Scheufele, D. A. (2014). Science communication as political communication. Proceedings of the National Academy of Sciences, 111(Suppl. 4), 13585–13592. https://doi.org/10.1073/pnas.1317516111
  • Sharon, A. J., & Baram-Tsabari, A. (2014). Measuring mumbo jumbo: A preliminary quantification of the use of jargon in science communication. Public Understanding of Science, 23(5), 528–546. https://doi.org/10.1177/0963662512469916
  • Spence, P. R. (2019). Searching for questions, original thoughts, or advancing theory: Human-machine communication. Computers in Human Behavior, 90, 285–287. https://doi.org/10.1016/j.chb.2018.09.014
  • Su, K. P. (2019, May 22). Preventing AI abuse requires cooperation. Taipei Times, p. 8. http://www.taipeitimes.com/News/editorials/archives/2019/05/22/2003715563
  • Suchman, L. A. (2009). Human-machine reconfigurations: Plans and situated actions. Cambridge University Press. https://doi.org/10.1017/CBO9780511808418
  • Taber, C. S., Cann, D., & Kucsova, S. (2009). The motivated processing of political arguments. Political Behavior, 31(2), 137–155. https://doi.org/10.1007/s11109-008-9075-8
  • Tussyadiah, I., & Miller, G. (2019). Perceived impacts of artificial intelligence and responses to positive behaviour change intervention. In J. Pesonen & J. Neidhardt (Eds.), Information and communication technologies in tourism 2019 (pp. 359–370). Springer, Cham. https://doi.org/10.1007/978-3-030-05940-8
  • Wilner, A. S. (2018). Cybersecurity and its discontents: Artificial intelligence, the Internet of Things, and digital misinformation. International Journal, 73(2), 308–316. https://doi.org/10.1177/0020702018782496
