Research Article

Preventing satisficing: A narrative review

References

  • Anduiza, E., & Galais, C. (2017). Answering without reading: IMCs and strong satisficing in online surveys. International Journal of Public Opinion Research, 29(3), 497–519. https://doi.org/10.1093/ijpor/edw007
  • Arthur, W., Hagen, E., & George, F. (2021). The lazy or dishonest respondent: Detection and prevention. Annual Review of Organizational Psychology and Organizational Behavior, 8(1), 105–137. https://doi.org/10.1146/annurev-orgpsych-012420-055324
  • Barge, S., & Gehlbach, H. (2012). Using the theory of satisficing to evaluate the quality of survey data. Research in Higher Education, 53(2), 182–200. https://doi.org/10.1007/s11162-011-9251-2
  • Berry, K., Rana, R., Lockwood, A., Fletcher, L., & Pratt, D. (2019). Factors associated with inattentive responding in online survey research. Personality and Individual Differences, 149, 157–159. https://doi.org/10.1016/j.paid.2019.05.043
  • Bowling, N. A., Huang, J. L., Bragg, C. B., Khazon, S., Liu, M., & Blackmore, C. E. (2016). Who cares and who is careless? Insufficient effort responding as a reflection of respondent personality. Journal of Personality and Social Psychology, 111(2), 218–229. https://doi.org/10.1037/pspp0000085
  • Brehm, J. W. (1966). A theory of psychological reactance. Academic Press.
  • Brosnan, K., Grün, B., & Dolnicar, S. (2018). Identifying superfluous survey items. Journal of Retailing and Consumer Services, 43, 39–45. https://doi.org/10.1016/j.jretconser.2018.02.007
  • Callegaro, M., Murakami, M. H., Tepman, Z., & Henderson, V. (2015). Yes–no answers versus check-all in self-administered modes: A systematic review and analyses. International Journal of Market Research, 57(2), 203–224. https://doi.org/10.2501/IJMR-2015-014a
  • Chandler, J., Sisso, I., & Shapiro, D. (2020). Participant carelessness and fraud: Consequences for clinical research and potential solutions. Journal of Abnormal Psychology, 129(1), 49–55. https://doi.org/10.1037/abn0000479
  • Clifford, S., & Jerit, J. (2015). Do attempts to improve respondent attention increase social desirability bias? Public Opinion Quarterly, 79(3), 790–802. https://doi.org/10.1093/poq/nfv027
  • Conrad, F. G., Couper, M. P., Tourangeau, R., & Zhang, C. (2017). Reducing speeding in web surveys by providing immediate feedback. Survey Research Methods, 11(1), 45–61. https://doi.org/10.18148/srm/2017.v11i1.6304
  • Costa, P. T., Jr., & McCrae, R. R. (1997). Stability and change in personality assessment: The revised NEO personality inventory in the year 2000. Journal of Personality Assessment, 68(1), 86–94. https://doi.org/10.1207/s15327752jpa6801_7
  • Couper, M. P., Antoun, C., & Mavletova, A. (2017). Mobile web surveys: A total survey error perspective. In P. P. Biemer, E. D. de Leeuw, S. Eckman, B. Edwards, F. Kreuter, L. E. Lyberg, C. Tucker, & B. T. West (Eds.), Total survey error in practice (pp. 133–154). Wiley. https://doi.org/10.1002/9781119041702.ch7
  • Couper, M. P., & Peterson, G. J. (2017). Why do web surveys take longer on smartphones? Social Science Computer Review, 35(3), 357–377. https://doi.org/10.1177/0894439316629932
  • Couper, M. P., Tourangeau, R., Conrad, F. G., & Crawford, S. D. (2004). What they see is what we get: Response options for web surveys. Social Science Computer Review, 22(1), 111–127. https://doi.org/10.1177/0894439303256555
  • Couper, M. P., Tourangeau, R., Conrad, F. G., & Zhang, C. (2013). The design of grids in web surveys. Social Science Computer Review, 31(3), 322–345. https://doi.org/10.1177/0894439312469865
  • Credé, M. (2010). Random responding as a threat to the validity of effect size estimates in correlational research. Educational and Psychological Measurement, 70(4), 596–612. https://doi.org/10.1177/0013164410366686
  • Credé, M., Harms, P., Niehorster, S., & Gaye-Valentine, A. (2012). An evaluation of the consequences of using short measures of the big five personality traits. Journal of Personality and Social Psychology, 102(4), 874–888. https://doi.org/10.1037/a0027403
  • DeRouvray, C., & Couper, M. P. (2002). Designing a strategy for reducing “no opinion” responses in web-based surveys. Social Science Computer Review, 20(1), 3–9. https://doi.org/10.1177/089443930202000101
  • DeSimone, J. A., DeSimone, A. J., Harms, P. D., & Wood, D. (2018). The differential impacts of two forms of insufficient effort responding. Applied Psychology: An International Review, 67(2), 309–338. https://doi.org/10.1111/apps.12117
  • DeSimone, J. A., Harms, P. D., & DeSimone, A. J. (2015). Best practice recommendations for data screening. Journal of Organizational Behavior, 36(2), 171–181. https://doi.org/10.1002/job.1962
  • Dillman, D. A., Smyth, J. D., & Christian, L. M. (2009). Internet, mail, and mixed-mode surveys: The tailored design method. Wiley.
  • Fazio, R. H., & Williams, C. J. (1986). Attitude accessibility as a moderator of the attitude-perception and attitude-behavior relations: An investigation of the 1984 presidential election. Journal of Personality and Social Psychology, 51(3), 505–514. https://doi.org/10.1037/0022-3514.51.3.505
  • Fleischer, A., Mead, A. D., & Huang, J. (2015). Inattentive responding in MTurk and other online samples. Industrial and Organizational Psychology, 8(2), 196–202. https://doi.org/10.1017/iop.2015.25
  • Francavilla, N. M., Meade, A. W., & Young, A. L. (2019). Social interaction and internet-based surveys: Examining the effects of virtual and in-person proctors on careless response. Applied Psychology, 68(2), 223–249. https://doi.org/10.1111/apps.12159
  • Galesic, M., & Bosnjak, M. (2009). Effects of questionnaire length on participation and indicators of response quality in a web survey. Public Opinion Quarterly, 73(2), 349–360. https://doi.org/10.1093/poq/nfp031
  • Galesic, M., Tourangeau, R., Couper, M. P., & Conrad, F. G. (2008). Eye-tracking data: New insights on response order effects and other cognitive shortcuts in survey responding. Public Opinion Quarterly, 72(5), 892–913.
  • Gao, Z., House, L., & Xiang, B. (2016). Impact of satisficing behavior in online surveys on consumer preference and welfare estimates. Food Policy, 64, 26–36. https://doi.org/10.1016/j.foodpol.2016.09.001
  • Gehlbach, H., & Barge, S. (2012). Anchoring and adjusting in questionnaire responses. Basic and Applied Social Psychology, 34(5), 417–433. https://doi.org/10.1080/01973533.2012.711691
  • Gibson, A. M., & Bowling, N. A. (2019). The effects of questionnaire length and behavioral consequences on careless responding. European Journal of Psychological Assessment, 36(2), 410–420. https://doi.org/10.1027/1015-5759/a000526
  • Goldammer, P., Annen, H., Stöckli, P. L., & Jonas, K. (2020). Careless responding in questionnaire measures: Detection, impact, and remedies. The Leadership Quarterly, 31(4), 101384. https://doi.org/10.1016/j.leaqua.2020.101384
  • Grau, I., Ebbeler, C., & Banse, R. (2019). Cultural differences in careless responding. Journal of Cross-Cultural Psychology, 50(3), 336–357. https://doi.org/10.1177/0022022119827379
  • Hamby, T., & Taylor, W. (2016). Survey satisficing inflates reliability and validity measures: An experimental comparison of college and Amazon Mechanical Turk samples. Educational and Psychological Measurement, 76(6), 912–932. https://doi.org/10.1177/0013164415627349
  • He, Y. (2023). An exponentially weighted moving average procedure for detecting back random responding behavior. Journal of Educational Measurement, 60(2), 282–317. https://doi.org/10.1111/jedm.12351
  • Heggestad, E. D., Scheaf, D. J., Banks, G. C., Hausfeld, M. M., Tonidandel, S., & Williams, E. B. (2019). Scale adaptation in organizational science research: A review and best-practice recommendations. Journal of Management, 45(6), 2596–2627. https://doi.org/10.1177/0149206319850280
  • Herzog, A. R., & Bachman, J. G. (1981). Effects of questionnaire length on response quality. Public Opinion Quarterly, 45(4), 549–559. https://doi.org/10.1086/268687
  • Holland, J. L., & Christian, L. M. (2009). The influence of topic interest and interactive probing on responses to open-ended questions in web surveys. Social Science Computer Review, 27(2), 196–212. https://doi.org/10.1177/0894439308327481
  • Hong, M., Steedle, J. T., & Cheng, Y. (2020). Methods of detecting insufficient effort responding: Comparisons and practical recommendations. Educational and Psychological Measurement, 80(2), 312–345. https://doi.org/10.1177/0013164419865316
  • Huang, J. L., Curran, P. G., Keeney, J., Poposki, E. M., & DeShon, R. P. (2012). Detecting and deterring insufficient effort responding to surveys. Journal of Business and Psychology, 27(1), 99–114. https://doi.org/10.1007/s10869-011-9231-8
  • Huang, J. L., Liu, M., & Bowling, N. A. (2015). Insufficient effort responding: Examining an insidious confound in survey data. Journal of Applied Psychology, 100(3), 828–845. https://doi.org/10.1037/a0038510
  • Johnson, J. A. (2005). Ascertaining the validity of individual protocols from web-based personality inventories. Journal of Research in Personality, 39(1), 103–129. https://doi.org/10.1016/j.jrp.2004.09.009
  • Kam, C. C. (2019). Careless responding threatens factorial analytic results and construct validity of personality measure. Frontiers in Psychology, 10, 1–10. https://doi.org/10.3389/fpsyg.2019.01258
  • Kim, Y., Dykema, J., Stevenson, J., Black, P., & Moberg, D. P. (2019). Straightlining: Overview of measurement, comparison of indicators, and effects in mail–web mixed-mode surveys. Social Science Computer Review, 37(2), 214–233. https://doi.org/10.1177/0894439317752406
  • Krosnick, J. A. (1991). Response strategies for coping with the cognitive demands of attitude measures in surveys. Applied Cognitive Psychology, 5(3), 213–236. https://doi.org/10.1002/acp.2350050305
  • Krosnick, J. A., Holbrook, A. L., Berent, M. K., Carson, R. T., Hanemann, W. M., Kopp, R. J., Mitchell, R. C., Presser, S., Rudd, P. A., Smith, V. K., Moody, W. R., Green, M. C., & Conaway, M. (2002). The impact of “no opinion” response options on data quality: Non-attitude reduction or an invitation to satisfice? Public Opinion Quarterly, 66(3), 371–403. https://doi.org/10.1086/341394
  • Krosnick, J. A., Narayan, S., & Smith, W. R. (1996). Satisficing in surveys: Initial evidence. New Directions for Evaluation, 1996(70), 29–44. https://doi.org/10.1002/ev.1033
  • Kunz, T., & Fuchs, M. (2019). Dynamic instructions in check-all-that-apply questions. Social Science Computer Review, 37(1), 104–118. https://doi.org/10.1177/0894439317748890
  • Lenzner, T. (2012). Effects of survey question comprehensibility on response quality. Field Methods, 24(4), 409–428. https://doi.org/10.1177/1525822X12448166
  • Lenzner, T., Kaczmirek, L., & Lenzner, A. (2010). Cognitive burden of survey questions and response times: A psycholinguistic experiment. Applied Cognitive Psychology, 24(7), 1003–1020. https://doi.org/10.1002/acp.1602
  • Lu, J., Wang, F., Wang, X., Lin, L., Wang, W., Li, I., & Zhou, X. (2019). Inequalities in the health survey using validation question to filter insufficient effort responding: Reducing overestimated effects or creating selection bias? International Journal for Equity in Health, 18(1), 131. https://doi.org/10.1186/s12939-019-1030-2
  • Lugtig, P., & Toepoel, V. (2015). The use of PCs, smartphones, and tablets in a probability-based panel survey: Effects on survey measurement error. Social Science Computer Review, 34(1), 1–17. https://doi.org/10.1177/0894439315574248
  • Maniaci, M. R., & Rogge, R. D. (2014). Caring about carelessness: Participant inattention and its effect on research. Journal of Research in Personality, 48, 61–83. https://doi.org/10.1016/j.jrp.2013.09.008
  • Mavletova, A., Couper, M. P., & Lebedev, D. (2018). Grid and item-by-item formats in PC and mobile web surveys. Social Science Computer Review, 36(6), 647–668. https://doi.org/10.1177/0894439317735307
  • Meade, A. W., & Craig, S. B. (2012). Identifying careless responses in survey data. Psychological Methods, 17(3), 437–455. https://doi.org/10.1037/a0028085
  • Necka, E. A., Cacioppo, S., Norman, G. J., Cacioppo, J. T., & Wicherts, J. M. (2016). Measuring the prevalence of problematic respondent behaviors among MTurk, campus, and community participants. PLoS ONE, 11(6), Article e0157732. https://doi.org/10.1371/journal.pone.0157732
  • Nichols, A. L., & Edlund, J. E. (2020). Why don’t we care more about carelessness? Understanding the causes and consequences of careless participants. International Journal of Social Research Methodology, 23(6), 1–14. https://doi.org/10.1080/13645579.2020.1719618
  • Nichols, E. M., Falcone, B., & Figueroa, I. (2018). Redesigning grids in the mobile-optimized National Survey of College Graduates: Results of a usability evaluation. Center for Survey Measurement. https://www.census.gov/content/dam/Census/library/working-papers/2018/adrm/rsm2018-07.pdf
  • Oppenheimer, D. M., Meyvis, T., & Davidenko, N. (2009). Instructional manipulation checks: Detecting satisficing to increase statistical power. Journal of Experimental Social Psychology, 45(4), 867–872. https://doi.org/10.1016/j.jesp.2009.03.009
  • Palaniappan, K., & Kum, I. Y. S. (2019). Underlying causes behind research study participants’ careless and biased responses in the field of sciences. Current Psychology, 38(6), 1737–1747. https://doi.org/10.1007/s12144-017-9733-2
  • Roßmann, J., Gummer, T., & Silber, H. (2018). Mitigating satisficing in cognitively demanding grid questions: Evidence from two web-based experiments. Journal of Survey Statistics and Methodology, 6(3), 376–400. https://doi.org/10.1093/jssam/smx020
  • Schmidt, K., Gummer, T., & Rossmann, J. (2020). Effects of respondent and survey characteristics on the response quality of an open-ended attitude question in web surveys. Methods, Data, Analyses, 14(1), 3–34. https://doi.org/10.12758/mda.2019.05
  • Schmitt, N., & Stults, D. M. (1985). Factors defined by negatively keyed items: The result of careless respondents? Applied Psychological Measurement, 9(4), 367–373.
  • Schonlau, M., & Toepoel, V. (2015). Straightlining in web survey panels over time. Survey Research Methods, 9(2), 125–137. https://doi.org/10.18148/srm/2015.v9i2.6128
  • Silber, H., Roßmann, J., & Gummer, T. (2018). When near means related: Evidence from three web survey experiments on inter-item correlations in grid questions. International Journal of Social Research Methodology: Theory & Practice, 21(3), 275–288. https://doi.org/10.1080/13645579.2017.1381478
  • Simon, H. A. (1957). Models of man. Wiley.
  • Smith, G. T., McCarthy, D. M., & Anderson, K. G. (2000). On the sins of short-form development. Psychological Assessment, 12(1), 102–111. https://doi.org/10.1037/1040-3590.12.1.102
  • Smyth, J. D., Dillman, D. A., Christian, L. M., & Stern, M. J. (2006). Comparing check-all and forced-choice question formats in web surveys. Public Opinion Quarterly, 70(1), 66–77. https://doi.org/10.1093/poq/nfj007
  • Struminskaya, B., Weyandt, K., & Bosnjak, M. (2015). The effects of questionnaire completion using mobile devices on data quality. Evidence from a probability-based general population panel. Methods, Data, Analyses, 9(2), 261–292.
  • Toepoel, V., & Lugtig, P. (2022). Modularization in an era of mobile web: Investigating the effects of cutting a survey into smaller pieces on data quality. Social Science Computer Review, 40(1), 150–164. https://doi.org/10.1177/0894439318784882
  • Tourangeau, R. (1984). Cognitive sciences and survey methods. In T. Jabine, M. Straf, J. Tanur, & R. Tourangeau (Eds.), Cognitive aspects of survey methodology: Building a bridge between disciplines (pp. 73–100). National Academy Press.
  • Tourangeau, R., Couper, M. P., & Conrad, F. G. (2004). Spacing, position, and order: Interpretive heuristics for visual features of survey questions. Public Opinion Quarterly, 68(3), 368–393. https://doi.org/10.1093/poq/nfh035
  • Tourangeau, R., Rips, L. J., & Rasinski, K. (2000). The psychology of survey response. Cambridge University Press. https://doi.org/10.1017/CBO9780511819322
  • Tourangeau, R., & Yan, T. (2007). Sensitive questions in surveys. Psychological Bulletin, 133(5), 859–883. https://doi.org/10.1037/0033-2909.133.5.859
  • Truebner, M. (2021). The dynamics of “neither agree nor disagree” answers in attitudinal questions. Journal of Survey Statistics and Methodology, 9(1), 51–72. https://doi.org/10.1093/jssam/smz029
  • Vannette, D. L., & Krosnick, J. A. (2014). Answering questions: A comparison of survey satisficing and mindlessness. In A. Ie, C. T. Ngnoumen, & E. J. Langer (Eds.), The Wiley Blackwell handbook of mindfulness (Vols. I–II, pp. 312–327). Wiley Blackwell. https://doi.org/10.1002/9781118294895.ch17
  • Ward, M. K., & Meade, A. W. (2018). Applying social psychology to prevent careless responding during online surveys. Applied Psychology: An International Review, 67(2), 231–263. https://doi.org/10.1111/apps.12118
  • Ward, M. K., & Pond, S. B. (2015). Using virtual presence and survey instructions to minimize careless responding on internet-based surveys. Computers in Human Behavior, 48, 554–568. https://doi.org/10.1016/j.chb.2015.01.070
  • Zhang, C., & Conrad, F. G. (2018). Intervening to reduce satisficing behaviors in web surveys: Evidence from two experiments on how it works. Social Science Computer Review, 36(1), 57–81. https://doi.org/10.1177/0894439316683923
