Research Article

Reliability of Perceived Usability Assessment via Crowdsourcing Platform: Retrospective Analysis and Novel Feedback Quality Inspection Method


References

  • Alghannam, B. A., Albustan, S. A., Al-Hassan, A. A., & Albustan, L. A. (2018). Towards a standard Arabic system usability scale: Psychometric evaluation using communication disorder app. International Journal of Human–Computer Interaction, 34(9), 799–804.
  • Alharbi, A., & Mayhew, P. (2015). Users' performance in lab and non-lab environments through online usability testing: A case of evaluating the usability of digital academic libraries. 2015 Science and Information Conference (SAI), London, UK.
  • Andreasen, M. S., Nielsen, H. V., & Stage, J. (2007). What happened to remote usability testing?: An empirical study of three methods. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, ACM, San Jose, CA.
  • Assila, A., De Oliveira, K. M., & Ezzedine, H. (2016). Standardized usability questionnaires: Features and quality focus. Electronic Journal of Computer Science & Information Technology, 6(1), 15–31.
  • Atterer, R., Wnuk, M., & Schmidt, A. (2006, May 23–26). Knowing the user’s every move: User activity tracking for website usability evaluation and implicit interaction. Proceedings of the 15th International Conference on World Wide Web, WWW 2006, Edinburgh, Scotland, UK.
  • Bangor, A., Kortum, P., & Miller, J. (2009). Determining what individual SUS scores mean: Adding an adjective rating scale. Journal of Usability Studies, 4(3), 114–123.
  • Bangor, A., Kortum, P., & Miller, J. A. (2008). The system usability scale (SUS): An empirical evaluation. International Journal of Human-Computer Interaction, 24(6), 574–594. doi:10.1080/10447310802205776
  • Berkman, M. I., & Karahoca, D. (2016). Re-assessing the usability metric for user experience (UMUX) scale. Journal of Usability Studies, 11(3), 89–109.
  • Blažica, B., & Lewis, J. R. (2015). A Slovene translation of the system usability scale: The SUS-SI. International Journal of Human-Computer Interaction, 31(2), 112–117. doi:10.1080/10447318.2014.986634
  • Borsci, S., Federici, S., & Lauriola, M. (2009). On the dimensionality of the system usability scale: A test of alternative measurement models. Cognitive Processing, 10(3), 193–197. doi:10.1007/s10339-009-0268-9
  • Bosley, J. J. (2013). Creating a short usability metric for user experience (UMUX) scale. Interacting with Computers, 25(4), 317–319. doi:10.1093/iwc/iwt007
  • Bruun, A., & Stage, J. (2015). New approaches to usability evaluation in software development: Barefoot and crowdsourcing. Journal of Systems and Software, 105, 40–53. doi:10.1016/j.jss.2015.03.043
  • Castillo, J. C., Hartson, H. R., & Hix, D. (1998). Remote usability evaluation: Can users report their own critical incidents? Proceedings of CHI 1998 (pp. 253–254). Los Angeles, CA: ACM Press.
  • Chalil Madathil, K., & Greenstein, J. S. (2017). An investigation of the efficacy of collaborative virtual reality systems for moderated remote usability testing. Applied Ergonomics, 65, 501–514.
  • Chandler, J., Mueller, P., & Paolacci, G. (2014). Nonnaïveté among Amazon mechanical turk workers: Consequences and solutions for behavioral researchers. Behavior Research Methods, 46(1), 112–130. doi:10.3758/s13428-013-0365-7
  • Clarke, M. A., Belden, J. L., & Kim, M. S. (2014). Determining differences in user performance between expert and novice primary care doctors when using an electronic health record (EHR). Journal of Evaluation in Clinical Practice, 20(6), 1153–1161.
  • Doan, A. H., Ramakrishnan, R., & Halevy, A. Y. (2011). Crowdsourcing systems on the world-wide web. Communications of the ACM, 54(4), 86–96. doi:10.1145/1924421
  • Downs, J. S., Holbrook, M. B., Sheng, S., & Cranor, L. F. (2010). Are your participants gaming the system?: Screening Mechanical Turk workers. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Atlanta, GA.
  • Dyson, M., & Haselgrove, M. (2000). The effects of reading speed and reading patterns on the understanding of text read from screen. Journal of Research in Reading, 23(2), 210–223. doi:10.1111/1467-9817.00115
  • Finstad, K. (2006). The system usability scale and non-native English speakers. Journal of Usability Studies, 1(4), 185–188.
  • Furnham, A., Chamorro-Premuzic, T., & McDougall, F. (2002). Personality, cognitive ability, and beliefs about intelligence as predictors of academic performance. Learning & Individual Differences, 14(1), 47–64. doi:10.1016/j.lindif.2003.08.002
  • Furnham, A., Hyde, G., & Trickey, G. (2013). On-line questionnaire completion time and personality test scores. Personality and Individual Differences, 54(6), 716–720. doi:10.1016/j.paid.2012.11.030
  • Gomide, V. H. M., Valle, P. A., & Ferreira, J. O. (2014). Affective crowdsourcing applied to usability testing. International Journal of Computer Science & Information Technology, 5(1), 575–579.
  • Gosling, S. D., & Mason, W. (2015). Internet research in psychology. Annual Review of Psychology, 66(1), 877–902. doi:10.1146/annurev-psych-010814-015321
  • Harms, C., Jackel, L., & Montag, C. (2017). Reliability and completion speed in online questionnaires under consideration of personality. Personality and Individual Differences, 111, 281–290. doi:10.1016/j.paid.2017.02.015
  • Howe, J. (2006). The rise of crowdsourcing. Wired Magazine, 14(6), 1–4.
  • Ivcevic, Z., & Brackett, M. (2014). Predicting school success: Comparing conscientiousness, grit, and emotion regulation ability. Journal of Research in Personality, 52, 29–36. doi:10.1016/j.jrp.2014.06.005
  • Ivory, M. Y., & Hearst, M. A. (2001). The state of the art in automating usability evaluation of user interfaces. ACM Computing Surveys, 33(4), 470–516. doi:10.1145/503112.503114
  • Kittur, A., Chi, E. H., & Suh, B. (2008). Crowdsourcing user studies with Mechanical Turk. CHI ’08: Proceedings of the Twenty-Sixth SIGCHI Conference on Human Factors in Computing Systems, Florence, Italy.
  • Kortum, P., & Acemyan, C. Z. (2019). The impact of geographic location on the subjective assessment of system usability. International Journal of Human–Computer Interaction, 35(2), 123–130.
  • Kortum, P., & Peres, S. C. (2015). Evaluation of home health care devices: Remote usability assessment. JMIR Human Factors, 2(1), e10. doi:10.2196/humanfactors.4570
  • Kortum, P., & Sorber, M. (2015). Measuring the usability of mobile applications for phones and tablets. International Journal of Human-Computer Interaction, 31(8), 518–529. doi:10.1080/10447318.2015.1064658
  • Kortum, P. T., & Bangor, A. (2013). Usability ratings for everyday products measured with the system usability scale. International Journal of Human Computer Interaction, 29(2), 67–76. doi:10.1080/10447318.2012.681221
  • Kujala, S., & Miron-Shatz, T. (2013). Emotions, experiences and usability in real-life mobile phone use. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 1061–1070). Paris, France.
  • Lah, U., & Lewis, J. R. (2016). How expertise affects a digital-rights-management-sharing application’s usability. IEEE Software, 33(3), 76–82. doi:10.1109/MS.52
  • Lewis, J. R. (1994). Sample sizes for usability studies: Additional considerations. Human Factors, 36(2), 368–378. doi:10.1177/001872089403600215
  • Lewis, J. R. (2018a). Measuring perceived usability: The CSUQ, SUS, and UMUX. International Journal of Human–Computer Interaction, 34(12), 1148–1156. doi:10.1080/10447318.2017.1418805
  • Lewis, J. R. (2018b). The system usability scale: Past, present, and future. International Journal of Human–Computer Interaction, 34(7), 577–590.
  • Lewis, J. R., & Sauro, J. (2017). Can I leave this one out?: The effect of dropping an item from the SUS. Journal of Usability Studies, 13(1), 38–46.
  • Lewis, J. R., Utesch, B., & Maher, D. E. (2013). UMUX-LITE: When there’s no time for the SUS. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, ACM, Paris, France.
  • Lievens, F., Reeve, C. L., & Heggestad, E. D. (2007). An examination of psychometric bias due to retesting on cognitive ability tests in selection settings. Journal of Applied Psychology, 92(6), 1672. doi:10.1037/0021-9010.92.6.1672
  • Lin, L., Robertson, T., & Lee, J. (2009). Reading performances between novices and experts in different media multitasking environments. Computers in the Schools, 26(3), 169–186. doi:10.1080/07380560903095162
  • Liu, D., Bias, R. G., Lease, M., & Kuipers, R. (2012). Crowdsourcing for usability testing. Proceedings of the American Society for Information Science and Technology, 49(1), 1–10.
  • Lund, A. (2001). Measuring usability with the USE questionnaire. Usability and User Experience Newsletter, STC Usability SIG, 8(2), 1–4.
  • MacCann, C., Duckworth, A. L., & Roberts, R. D. (2009). Empirical identification of the major facets of conscientiousness. Learning & Individual Differences, 19(4), 451–458. doi:10.1016/j.lindif.2009.03.007
  • MacDorman, K. F., Whalen, T. J., Ho, C.-C., & Patel, H. (2011). An improved usability measure based on novice and expert performance. International Journal of Human-Computer Interaction, 27(3), 280–302.
  • Madathil, K. C., & Greenstein, J. (2011). Synchronous remote usability testing: A new approach facilitated by virtual worlds. ACM.
  • Mao, K., Capra, L., Harman, M., & Jia, Y. (2017). A survey of the use of crowdsourcing in software engineering. Journal of Systems and Software, 126, 57–84.
  • Mason, W. A., & Watts, D. J. (2010). Financial incentives and the “performance of crowds”. ACM SIGKDD Explorations Newsletter, 11(2), 100–108. doi:10.1145/1809400
  • McLellan, S., Muddimer, A., & Peres, S. C. (2012). The effect of experience on system usability scale ratings. Journal of Usability Studies, 7(2), 56–67.
  • Molich, R., Chattratichart, J., Hinkle, V., Jensen, J. J., Kirakowski, J., Sauro, J., ... Traynor, B. (2010). Rent a car in just 0, 60, 240 or 1,217 seconds? Comparative usability measurement (CUE-8). Journal of Usability Studies, 6(1), 8–24.
  • Montag, C., & Reuter, M. (2008). Does speed in completing an online questionnaire have an influence on its reliability? CyberPsychology & Behavior, 11(6), 719–721. doi:10.1089/cpb.2007.0258
  • Nebeling, M., Speicher, M., Grossniklaus, M., & Norrie, M. C. (2012, July). Crowdsourced web site evaluation with crowdstudy. In International Conference on Web Engineering (pp. 494–497). Berlin, Heidelberg: Springer.
  • Nebeling, M., Speicher, M., & Norrie, M. C. (2013, June). Crowdstudy: General toolkit for crowdsourced evaluation of web interfaces. In Proceedings of the 5th ACM SIGCHI Symposium on Engineering Interactive Computing Systems (pp. 255–264). London, UK: ACM.
  • Nielsen, J., & Landauer, T. (1993). A mathematical model of the finding of usability problems. Proceedings of the SIGCHI conference on Human factors in Computing Systems - CHI ’93. (pp. 206–213). Amsterdam, The Netherlands.
  • Ryu, Y. S., & Smith-Jackson, T. L. (2005). Usability questionnaire items for mobile products and content validity. Proceedings of HCI International (pp. 22–27). Las Vegas, NV.
  • Sauro, J., & Lewis, J. R. (2009). Correlations among prototypical usability metrics: Evidence for the construct of usability. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 1609–1618). Boston, MA: ACM.
  • Sauro, J., & Lewis, J. R. (2011, May 7–12). When designing usability questionnaires, does it hurt to be positive? Proceedings of the International Conference on Human Factors in Computing Systems, CHI 2011, Vancouver, BC, Canada.
  • Sauro, J., & Lewis, J. R. (2012). Quantifying the user experience: practical statistics for user research. Waltham, MA: Morgan Kaufmann.
  • Sauro, J., & Lewis, J. R. (2014). Quantifying the user experience. Beijing, China: China Machine Press.
  • Schaie, K. W. (2005). Developmental influences on adult intelligence. Oxford, UK: Oxford University Press.
  • Schneider, C., & Cheung, T. (2013). The power of the crowd: performing usability testing using an on-demand workforce. In Proceedings of the Information Systems Development (pp. 551–560). New York: Springer-Verlag.
  • Stieger, S., & Reips, U. D. (2010). What are participants doing while filling in an online questionnaire: A paradata collection tool and an empirical study. Computers in Human Behavior, 26(6), 1488–1495. doi:10.1016/j.chb.2010.05.013
  • Teitcher, J. E., Bockting, W. O., Bauermeister, J. A., Hoefer, C. J., Miner, M. H., & Klitzman, R. L. (2015). Detecting, preventing, and responding to “fraudsters” in internet research: Ethics and tradeoffs. Journal of Law, Medicine & Ethics, 43(1), 116–133. doi:10.1111/jlme.12200
  • Thompson, K. E., Rozanski, E. P., & Haake, A. R. (2004). Here, there, anywhere: Remote usability testing that works. Conference on Information Technology Education, ACM, Salt Lake City, UT.
  • Troyer, A. K. (2011). Serial position effect. In Encyclopedia of clinical neuropsychology (pp. 2263–2264). New York: Springer.
  • Tullis, T., & Albert, B. (2013). Measuring the user experience. Beijing, China: Publishing House of Electronics Industry.
  • Waterson, S., Landay, J. A., & Matthews, T. (2002). In the lab and out in the wild: Remote web usability testing for mobile devices. Extended Abstracts of the Conference on Human Factors in Computing Systems, Minneapolis, MN.
  • Wenjing, W. (2007). Eye movement study on information extraction time and word frequency effect in Chinese reading. Tianjin Normal University.
  • Wozney, L. M., Baxter, P., Fast, H., Cleghorn, L., Hundert, A. S., & Newton, A. S. (2016). Sociotechnical human factors involved in remote online usability testing of two ehealth interventions. JMIR Human Factors, 3(1), e6. doi:10.2196/humanfactors.4602
  • Yang, T., Linder, J., & Bolchini, D. (2012). Deep: Design-oriented evaluation of perceived usability. International Journal of Human Computer Interaction, 28(5), 308–346. doi:10.1080/10447318.2011.586320
