Research Article

The impact of indirect questioning: asking about you versus your friends

Daphna Harel, Dorothy Seaman, Jennifer Hill, Elisabeth King & Dana Burde
Pages 785-800 | Received 28 Sep 2021, Accepted 05 Aug 2022, Published online: 30 Aug 2022
ABSTRACT

Indirect questioning attempts to overcome social desirability bias in survey research. However, to properly analyze the resulting data, it is crucial to understand how indirect questioning impacts responses. This study analyzes results from a randomized experiment that tests whether direct and indirect questioning methods lead to different results in a sample of 8,426 youths in Kenya and Pakistan. Through an examination of differential item functioning and regression analyses, we find that question wording leads to differences in how scales should be scored. We conclude that the use of indirect questioning as a replacement for direct questioning should be undertaken with caution.

Acknowledgement

The authors would like to graciously acknowledge and thank the study participants who shared their experiences through our survey. In addition, we thank Simon Grinsted, Jonah Ondieki, Faiza Mushtaq, Mariam Vadria, and Haider Fancy for their assistance. This work was supported by the Lyle Spencer Foundation under Grant 201700045.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Supplementary material

Supplemental data for this article can be accessed online at https://doi.org/10.1080/13645579.2022.2117452

Additional information

Notes on contributors

Daphna Harel

Daphna Harel is Associate Professor of Applied Statistics and co-Director of the A3SR MS Program at New York University. She received her PhD at McGill University and her research interests include survey design and analysis, differential item functioning, and the impact of model misspecification.

Dorothy Seaman

Dorothy Seaman received her master’s degree in Applied Statistics for Social Science Research at New York University. Her research experience centers on policy and program evaluation using quasi-experimental and exploratory statistical methods.

Jennifer Hill

Jennifer Hill is Professor of Applied Statistics, Director of the PRIISM Center, and co-Director of the A3SR MS Program at New York University. She works to build robust, easy-to-use causal inference software that is accessible to researchers from a variety of backgrounds.

Elisabeth King

Elisabeth King is Professor of International Education and Politics at New York University. She was the Principal Investigator for the Kenya side of Project THINK and has a broader research agenda focused on inclusive identities and institutions in diverse and conflict-affected contexts.

Dana Burde

Dana Burde is Associate Professor of International Education and Politics at New York University and Director of the International Education Program. She was the Principal Investigator for the Pakistan side of Project THINK and has a broader research agenda focused on the relationship between education and political violence in countries affected by conflict.
