Abstract
In previous articles in this series, the author explored metadata quality issues related to the parsing of enumeration information, as well as strategies for creating a cloud-based data warehouse solution to enable quick and responsive querying of harmonized metadata. Building on this theme, the current article explores the role that crowdsourcing can play in improving metadata quality. The author reviews an “experiment” in using crowdsourcing platforms to improve metadata quality; the platforms were tested on the task of validating technical metadata on proxy configuration. The platforms explored demonstrated that crowdsourced metadata work is feasible, but subject to the expected limitations of such work (e.g., the limited expertise of workers, the need to break work down into bite-sized tasks, and the resulting difficulty of crowdsourcing complex tasks).
Notes
1. https://www.zooniverse.org/
2. http://pybossa.com/
3. http://www.historypin.org/en/
4. http://oclc.org/ezproxy
5. https://chrome.google.com/webstore/detail/formranger/faepkjkcpnnghgdhiobglpppbfdnaehc?hl=en
6. https://chrome.google.com/webstore/detail/choice-eliminator-2/mnhoinjhhhoneafgieggnhjekliaodnkigj?hl=en-US
7. https://www.operationwardiary.org/?_ga=1.116844677.246495004.1464711820#/classify