Research Article

Effect of Crowd Voting on Participation in Crowdsourcing Contests

Liang Chen, Pei Xu & De Liu
Pages 510-535 | Published online: 16 Jun 2020
 

ABSTRACT

While expert rating is still the dominant approach for selecting winners in contests for creative works, a few crowdsourcing platforms have recently used “crowd voting” for winner selection – that is, letting users of the crowdsourcing community publicly vote for contest winners. We investigate how a contest’s reliance on crowd voting for winner selection, defined as the percentage of crowd-voted prize money in the total prize sum (in dollar amounts), affects contest participation. Drawing upon expectancy theory and tournament theory, we develop a theoretical understanding of this relationship. Using a novel dataset of contests employing both crowd voting and expert rating, we find that a contest’s reliance on crowd voting is positively associated with participation. Specifically, every 10% increase in crowd-voting reliance can boost users’ odds of participation by about 7%. Moreover, crowd voting is more appealing to users whose expertise is not high and whose status in the crowdsourcing community is high.
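The headline estimate (“every 10% increase in crowd-voting reliance boosts the odds of participation by about 7%”) is the standard odds-ratio reading of a logit coefficient. As a minimal sketch (not the authors’ actual model or coefficient), the arithmetic linking a hypothetical coefficient `beta` to that interpretation looks like this:

```python
import math

# Hypothetical logit coefficient for crowd-voting reliance (measured on a 0-1
# scale), back-solved so that a 0.10 increase multiplies the odds by ~1.07.
# This value is illustrative only; the paper reports the actual estimates.
beta = math.log(1.07) / 0.10  # ~0.677

# Odds ratio implied by a 10-percentage-point increase in reliance:
odds_ratio_10pt = math.exp(beta * 0.10)

print(round(odds_ratio_10pt, 2))  # 1.07, i.e., ~7% higher odds
```

Note that a 7% increase in odds is smaller than a 7-percentage-point increase in the participation probability itself; the two coincide only when the baseline probability is low.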

Supplemental Material

Supplemental data for this article can be accessed online.

Notes

1. According to Magallanes et al. [Citation52], a creative work is a manifestation of creative effort as in the formation of concepts, artwork, literature, music, paintings, software, and architectural designs. Creative works have in common a degree of arbitrariness, such that it is improbable that two people would independently create the same work.

2. One may combine crowd voting and expert rating in other ways (e.g., using crowd voting for initial screening and expert rating for selecting final winners). We focus on the side-by-side use of crowd voting and expert rating.

3. Haan et al. [Citation32] rely on comparing contests judged by experts with a different set of contests judged by televoters. Mollick and Nanda [Citation59] obtain expert opinions through separate surveys outside of the crowdfunding platform. Holbrook and Addis [Citation36] have both expert and audience ratings of the same movies, but these two types of ratings typically influence each other.

4. The wisdom of crowds has also been applied in organizational designs, in the form of “organizational democracy,” where employees are empowered to collectively make decisions on workplace issues through direct or representative joint consultation, dialogue, voting, co-determination, or other democratic processes [Citation42, Citation78]. This literature considers a broader set of decisions (e.g., a buyout deal) and processes (e.g., dialogue) whereas we focus on the application of crowd voting as a winner-selection mechanism in contests for creative works.

5. Crowdsourcing research shows that participants in crowdsourcing contests may also be driven by intrinsic motivations such as self-enhancement, enjoyment, and autonomy [Citation16, Citation84]. We focus on tangible rewards because, as we will argue, the choice of winner-selection mechanism has a direct impact on the expectancy of tangible rewards, but the same cannot be said about intrinsic rewards.

6. For example, contests “Smile You’re at Red Robin” and “The UK’s Fastest Network” have four equally weighted criteria: engaging storytelling, positive representation of the brand, originality, and production quality. “Squeeze More Out!” has these four: the functional benefits of the new bottle are communicated effectively (40%), the delicious-looking sandwiches make us drool (20%), videos are creative and unique (20%), videos are of high production quality (20%).

7. Contest sponsors own the copyright of winning entries and can use them on their media channels without paying use fees. Contest sponsors can also use the work of any other member, in addition to the work of the award-winning member, on their media channels at any time after an award is granted. Members whose work is selected for use by the sponsor receive a use fee of $500.

8. After August 2013, Zooppa redesigned its website and rarely used crowd voting.

9. The user fixed-effect estimation further removes users who had no variation in participation, retaining a set of 2,635 distinct users in the final estimation.

10. Each registered user receives an email notification when a new contest is announced. Thus, it is reasonable to interpret a lack of video entries from this user as the user choosing not to participate in this contest.

11. Although a fixed-effects estimator has been proposed for the negative binomial model for count data [Citation33], Allison and Waterman [Citation4] pointed out that it is not a true fixed-effects estimator.

12. Because the lifetime hit rate is calculated on a per-entry basis, it is orthogonal to the decision of whether to participate, our dependent variable of interest.

13. We also evaluate a random sample of 5% of all comments using Amazon Mechanical Turk and find that only 2.4% of comments are negative, and 88.7% are positive.

14. The Status variable is log-transformed in data analysis as it is highly skewed and has a large value range.

17. Our results include direct evidence that the number of submissions is affected by winner-selection mechanisms, but we can only indirectly infer that characteristics (e.g., quality) of submissions may also change, based on the intuition that high- and low-expertise contestants, as well as high- and low-status contestants, may submit systematically different solutions. The latter implication is worthy of future investigation.

Additional information

Notes on contributors

Liang Chen

Liang Chen ([email protected]) is an Assistant Professor of Computer Information Systems in the Paul and Virginia Engler College of Business at West Texas A&M University. He received his Ph.D. in Decision Science and Information Systems from the University of Kentucky. His research has focused on crowdsourcing, data analytics in business, knowledge management, and supplier development. Dr. Chen’s work has been published in Decision Support Systems, International Journal of Operations & Production Management, and Journal of the Association for Information Science and Technology, among others.

Pei Xu

Pei Xu ([email protected]) is an Assistant Professor of Business Analytics in the Harbert College of Business at Auburn University. She received her Ph.D. in Decision Science and Information Systems from the University of Kentucky. Her research focuses on understanding the economic and social impact of emerging information technology in areas including social media for business, crowdsourcing, healthcare, and shadow banking. Dr. Xu’s work has been published in such journals as Decision Support Systems, Information & Management, Journal of the Academy of Marketing Science, and others.

De Liu

De Liu ([email protected]; corresponding author) is an Associate Professor of Information and Decision Sciences and 3M Fellow in Business Analytics at the Carlson School of Management, University of Minnesota. He received his Ph.D. degree from the University of Texas at Austin. His research interests include design issues in digital auctions, gamification, crowdsourcing, crowdfunding, and online platforms. His research has appeared in such journals as MIS Quarterly, Information Systems Research, Journal of Marketing, Management Science, and others. He serves as an associate editor for Information Systems Research and Journal of Organizational Computing and Electronic Commerce.
