Research Article

Trust in Online Search Results During Uncertain Times

