Research Article

The impact of indirect questioning: asking about you versus your friends

Pages 785-800 | Received 28 Sep 2021, Accepted 05 Aug 2022, Published online: 30 Aug 2022

ABSTRACT

Indirect questioning attempts to overcome social desirability bias in survey research. However, to analyze the resulting data properly, it is crucial to understand how indirect questioning affects responses. This study analyzes results from a randomized experiment that tests whether direct versus indirect questioning methods lead to different results in a sample of 8,426 youths in Kenya and Pakistan. Through an examination of differential item functioning and regression analyses, we find that question wording leads to differences in how scales should be scored. We conclude that the use of indirect questioning as a replacement for direct questioning should be undertaken with caution.

Acknowledgement

The authors would like to graciously acknowledge and thank the study participants who shared their experiences through our survey. In addition, we thank Simon Grinsted, Jonah Ondieki, Faiza Mushtaq, Mariam Vadria, and Haider Fancy for their assistance. This work was supported by the Lyle Spencer Foundation under Grant 201700045.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Supplementary material

Supplemental data for this article can be accessed online at https://doi.org/10.1080/13645579.2022.2117452

Additional information

Notes on contributors

Daphna Harel

Daphna Harel is Associate Professor of Applied Statistics and co-Director of the A3SR MS Program at New York University. She received her PhD at McGill University and her research interests include survey design and analysis, differential item functioning, and the impact of model misspecification.

Dorothy Seaman

Dorothy Seaman received her master’s degree in Applied Statistics for Social Science Research at New York University. Her research experience centers on policy and program evaluation using quasi-experimental and exploratory statistical methods.

Jennifer Hill

Jennifer Hill is Professor of Applied Statistics, Director of the PRIISM Center, and co-Director of the A3SR MS Program at New York University. She works to build robust, easy-to-use causal inference software that is accessible to researchers from a variety of backgrounds.

Elisabeth King

Elisabeth King is Professor of International Education and Politics at New York University. She was the Principal Investigator for the Kenya side of Project THINK and has a broader research agenda focused on inclusive identities and institutions in diverse and conflict-affected contexts.

Dana Burde

Dana Burde is Associate Professor of International Education and Politics at New York University and Director of the International Education Program. She was the Principal Investigator for the Pakistan side of Project THINK and has a broader research agenda focused on the relationship between education and political violence in countries affected by conflict.

