Research Article

Making more in crowdsourcing contests: a choice model of idea generation and feedback type

Rambod Dargahi & Aidin Namin
Pages 607-630 | Received 14 Jul 2019, Accepted 24 Mar 2020, Published online: 02 Apr 2020
 

ABSTRACT

A crowdsourcing contest is the process of inviting the general public or a targeted group of individuals to submit their ideas or solutions to a specific problem or challenge within a predefined period of time. In this study, we design and run a field experiment with nearly 500 participants to model two types of feedback: rated and ranked. We also measure the effect of each on the likelihood of revising the generated ideas and on their subsequent quality, assessed against novelty, feasibility, and value criteria, in a crowdsourcing contest. Our major findings indicate that providing any type of feedback, compared to no feedback, improves the quality of the ideas generated. Furthermore, we show that, on average, ranked feedback increases idea quality more than rated feedback. Rated feedback, in contrast, increases the likelihood of revising and resubmitting ideas more for top participants (75th percentile) than for lower-performing participants (25th percentile). Managers and entrepreneurs may use our findings to improve the effectiveness of their online crowdsourcing contests and to enhance the quality of ideas collected in a real setting.
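To make the modeling idea concrete, below is a minimal, hypothetical sketch in Python of a binary choice (logit) model of the revise-or-not decision as a function of feedback condition. The data, variable names (e.g., the "skill" covariate), and coefficients are simulated purely for illustration; this is not the authors' specification, data, or estimation procedure.

```python
# Hypothetical sketch only: a binary logit of the revise/resubmit decision on
# feedback condition (none / rated / ranked), estimated on simulated data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 480  # roughly the scale of the reported field experiment

# Simulated design: each participant is assigned one feedback condition.
feedback = rng.choice(["none", "rated", "ranked"], size=n)
skill = rng.normal(size=n)  # assumed participant-quality covariate (illustrative)

# Assumed data-generating process for illustration only.
logit_p = -0.5 + 0.6 * (feedback == "rated") + 0.4 * (feedback == "ranked") + 0.8 * skill
revised = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p)))

df = pd.DataFrame({"revised": revised, "feedback": feedback, "skill": skill})

# Fit the choice model: P(revise) as a function of feedback condition and skill,
# with "no feedback" as the reference category.
model = smf.logit("revised ~ C(feedback, Treatment(reference='none')) + skill", data=df).fit()
print(model.summary())
```

In such a specification, the coefficients on the rated and ranked indicators capture how each feedback type shifts the log-odds of revision relative to receiving no feedback; interacting feedback type with a participant-quality measure would be one way to probe the kind of heterogeneity (top vs. lower-performing participants) described in the abstract.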

Disclosure statement

No potential conflict of interest was reported by the author(s).

Correction Statement

This article has been republished with minor changes. These changes do not impact the academic content of the article.

Additional information

Notes on contributors

Rambod Dargahi

Rambod Dargahi is an Assistant Professor of Marketing at the College of Business, Southeastern Louisiana University. With a background in Electrical Engineering and an MBA, Dr. Dargahi joined the Ph.D. program at the Bauer College of Business as an analytical and quantitative researcher. After earning his Ph.D. from the University of Houston, he joined the Freeman School of Business as a Visiting Assistant Professor of Marketing. Commended for teaching excellence at the University of Houston and Tulane University, he has over seven years of teaching experience at the undergraduate and graduate levels. His research projects focus on designing crowdsourcing contests and tapping into external sources of knowledge and innovation. Through field experiments and scraping of online data, Dr. Dargahi studies the motivations and behaviors of potential participants joining crowdsourcing contests.

Aidin Namin

Aidin Namin is an Assistant Professor of Marketing at Loyola Marymount University. He earned his Ph.D. in Quantitative Marketing (i.e., Marketing Analytics) from the University of Texas at Dallas. A modeler by training and passion, Dr. Namin's main area of research is analytics and big data. He has received multiple teaching, research, grant, and best paper awards from different institutions, including the Ascending Scholar Award for Excellence in Research from the President of Loyola Marymount University, the Paul R. Lawrence Award from the Case Research Foundation, the Junior Faculty Research Award from the Western Decision Sciences Institute, Thought Leadership in Retailing Research Recognition sponsored by the AMA Retailing and Pricing SIG, Faculty Fellowship Awards in Research from Loyola Marymount University and the University of Idaho, and Outstanding Reviewer Recognition from the Journal of Business Research and the Journal of Retailing and Consumer Services. Dr. Namin has also received the 2020 Teaching Innovation Award from ACME and the Ph.D. Student Teacher of the Year Award from the University of Texas at Dallas. He currently serves on the editorial boards of the Journal of Business Research and the Journal of Marketing Analytics, and has recently been awarded the AMS-AFM Grant from the Academy of Marketing Science.

