References
- Best, H., & Wolf, C. (2015). Logistic regression. In H. Best & C. Wolf (Eds.), The SAGE handbook of regression analysis and causal inference (Chap. 8, pp. 153–171). Los Alamitos, CA: Sage.
- Bohlender, A., & Glemser, A. (2016). SOEP-IS 2014 – Methodenbericht zum Befragungsjahr 2014 des SOEP-Innovationssamples. SOEP survey papers 339: Series B. Berlin, DE: DIW/SOEP.
- Conrad, F. G., Couper, M. P., & Sakshaug, J. W. (2016). Classifying open-ended reports: Factors affecting the reliability of occupation codes. Journal of Official Statistics, 32, 75–92.
- Couper, M. P., Tourangeau, R., Conrad, F. G., & Crawford, S. D. (2004). What they see is what we get: Response options for web surveys. Social Science Computer Review, 22, 111–127.
- Couper, M. P., & Zhang, C. (2016). Helping respondents provide good answers in web surveys. Survey Research Methods, 10, 49–64.
- Döring, N., & Bortz, J. (1995). Forschungsmethoden und Evaluation für Sozialwissenschaftler. Berlin, DE: Springer.
- Fowler, F. J. (1995). Improving survey questions: Design and evaluation. Applied social research methods series. Thousand Oaks, CA, London, New Delhi: Sage Publications.
- Funke, F., & Reips, U.-D. (2007). Dynamic form: Online surveys 2.0. Paper presented at the General Online Research Conference (GOR 2007), Leipzig, Germany.
- Galesic, M., Tourangeau, R., Couper, M. P., & Conrad, F. G. (2008). Eye-tracking data: New insights on response order effects and other cognitive shortcuts in survey responding. Public Opinion Quarterly, 72, 892–913.
- Groves, R. M., Fowler, F. J., Couper, M. P., Lepkowski, J. M., Singer, E., & Tourangeau, R. (2011). Survey Methodology (2nd ed.). Wiley series in survey methodology. Hoboken, NJ: John Wiley & Sons.
- Healey, B. (2007). Drop downs and scroll mice: The effect of response option format and input mechanism employed on data quality in web surveys. Social Science Computer Review, 25, 111–128.
- Heerwegh, D. (2003). Explaining response latencies and changing answers using client-side paradata from a web survey. Social Science Computer Review, 21, 360–373.
- Heerwegh, D., & Loosveldt, G. (2002). An evaluation of the effect of response formats on data quality in web surveys. Social Science Computer Review, 20, 471–484.
- Hoffmeyer-Zlotnik, J. H. P. (2016). Standardisation and harmonisation of socio-demographic variables. GESIS survey guidelines. Mannheim, DE: GESIS – Leibniz Institute for the Social Sciences. doi:10.15465/gesis-sg_en_012
- Keusch, F. (2014). The influence of answer box format on response behavior on list-style open-ended questions. Journal of Survey Statistics and Methodology, 2, 305–322.
- Krosnick, J. A., & Alwin, D. F. (1987). An evaluation of a cognitive theory of response-order effects in survey measurement. Public Opinion Quarterly, 51, 201–219.
- Krosnick, J. A., & Presser, S. (2009). Question and questionnaire design. In J. D. Wright & P. V. Marsden (Eds.), Handbook of survey research (Chap. 9, pp. 263–315). San Diego, CA: Elsevier.
- Lachin, J. M. (1981). Introduction to sample size determination and power analysis for clinical trials. Controlled Clinical Trials, 2, 93–113.
- Lenzner, T., Kaczmirek, L., & Lenzner, A. (2010). Cognitive burden of survey questions and response times: A psycholinguistic experiment. Applied Cognitive Psychology, 24, 1003–1020.
- Malhotra, N. (2008). Completion time and response order effects in web surveys. Public Opinion Quarterly, 72, 914–934.
- Olson, K., & Parkhurst, B. (2013). Collecting paradata for measurement error evaluations. In F. Kreuter (Ed.), Improving surveys with paradata: Analytic uses of process information (Chap. 3, pp. 43–72). Wiley series in survey methodology. Hoboken, NJ: John Wiley & Sons.
- Ratcliff, R. (1993). Methods for dealing with reaction time outliers. Psychological Bulletin, 114, 510–532.
- Redline, C. D., Tourangeau, R., Couper, M. P., Conrad, F. G., & Ye, C. (2009). The effects of grouping response options in factual questions with many options. In JPSM research paper. Washington, DC: Annual Conference of the Federal Committee on Statistical Methodology. Retrieved from http://www.fcsm.gov/09papers/Redline_IX-B.pdf
- Richter, D., & Schupp, J. (2012). SOEP innovation sample (SOEP-IS) – description, structure and documentation. SOEP papers on multidisciplinary panel data research. Berlin, DE: Deutsches Institut für Wirtschaftsforschung, DIW.
- Richter, D., & Schupp, J. (2015). The SOEP innovation sample (SOEP IS). Schmollers Jahrbuch, 135, 389–399.
- Schierholz, M., Gensicke, M., Tschersich, N., & Kreuter, F. (2018). Occupation coding during the interview. Journal of the Royal Statistical Society: Series A (Statistics in Society), 181, 379–407.
- Schneider, S. L. (2008). Suggestions for the cross-national measurement of educational attainment: Refining the ISCED-97 and improving data collection and coding procedures. In S. L. Schneider (Ed.), The International Standard Classification of Education (ISCED-97): An evaluation of content and criterion validity for 15 European countries (Chap. 17, pp. 311–330). Mannheim, DE: MZES.
- Schneider, S. L. (2013). The international standard classification of education 2011. In E. Birkelund (Ed.), Class and stratification analysis (Vol. 30, pp. 365–379). Comparative Social Research. Bradford, UK: Emerald Group Publishing Limited.
- Schneider, S. L., Briceno-Rosas, R., Herzing, J. M. E., & Ortmanns, V. (2016). Overcoming the shortcomings of long list showcards: Measuring education with an adaptive database lookup. Paper presented at the 9th International Conference on Social Science Methodology (RC33 Conference), Leicester, UK.
- Schneider, S. L., Briceno-Rosas, R., Ortmanns, V., & Herzing, J. M. E. (2018). Measuring migrants’ educational attainment: The CAMCES tool in the IAB-SOEP migration sample. In D. Behr (Ed.), Surveying the migrant population: Consideration of linguistic and cultural issues. Cologne, DE: GESIS Schriftenreihe. Retrieved from https://nbn-resolving.org/urn:nbn:de:0168-ssoar-58550-6
- Schneider, S. L., Joye, D., & Wolf, C. (2016). When translation is not enough: Background variables in comparative surveys. In C. Wolf, D. Joye, T. W. Smith, & Y.-C. Fu (Eds.), The SAGE handbook of survey methodology (Chap. 20, pp. 288–307). London, UK: Sage.
- Schneider, S. L., & Ortmanns, V. (2019). Database of educational attainment, with explanatory note (Deliverable 8.8 of the SERISS project funded under the European Union’s Horizon 2020 research and innovation programme GA No: 654221). Retrieved from www.seriss.eu/resources/deliverables
- SOEP Group. (2014). SOEP 2013 – Documentation of person-related status and generated variables in PGEN for SOEP v30. SOEP survey papers 250: Series D. Berlin, DE: DIW/SOEP.
- SOEP-IS Group. (2017). SOEP-IS 2011 – Questionnaire for the SOEP Innovation Sample. SOEP survey papers 456: Series A – Survey instruments (Erhebungsinstrumente). Berlin, DE: DIW Berlin/SOEP.
- SOEP-IS Group. (2018). SOEP-IS 2014 – Fragebogen für die SOEP-Innovations-Stichprobe (update soep.is.2016.1). SOEP survey papers 519: Series A – Survey instruments (Erhebungsinstrumente). Berlin, DE: DIW Berlin/SOEP.
- Stern, M. J. (2008). The use of client-side paradata in analyzing the effects of visual layout on changing responses in web surveys. Field Methods, 20, 377–398.
- Tijdens, K. (2014). Dropout rates and response times of an occupation search tree in a web survey. Journal of Official Statistics, 30, 23–43.
- Tijdens, K. (2015). Self-identification of occupation in web surveys: Requirements for search trees and look-up tables. Survey Methods: Insights from the Field. Retrieved from https://surveyinsights.org/?p=6967
- Turner, G., Sturgis, P., & Martin, D. (2014). Can response latencies be used to detect survey satisficing on cognitively demanding questions? Journal of Survey Statistics and Methodology, 3, 89–108.
- UNESCO-UIS. (2014). ISCED 2011 International Standard Classification of Education. Montreal: Author. Retrieved from http://uis.unesco.org/en/isced-mappings
- Yan, T., & Tourangeau, R. (2008). Fast times and easy questions: The effects of age, experience and question complexity on web survey response times. Applied Cognitive Psychology, 22, 51–68.