Crowdsourcing technology to support academic research

Matthias Hirth, Jason Jacques, Peter Rodgers, Ognjen Scekic, Michael Wybrow

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research

6 Citations (Scopus)

Abstract

Current crowdsourcing platforms typically concentrate on simple microtasks and do not meet the needs of academic research well, where more complex, time-consuming studies are required. This has led to the development of specialised software tools to support academic research on such platforms. However, the loose coupling of this software with the crowdsourcing site means that there is only limited access to the features of the platform. In addition, the specialised nature of these tools means that technical knowledge is needed to operate them. Hence there is great potential to enrich the features of crowdsourcing platforms from an academic perspective. In this chapter we discuss the possibilities for practical improvement of academic crowdsourced studies through the adaptation of technological solutions.

Original language: English
Title of host publication: Evaluation in the Crowd
Subtitle of host publication: Crowdsourcing and Human-Centered Experiments
Editors: Daniel Archambault, Helen Purchase, Tobias Hoßfeld
Place of Publication: Cham, Switzerland
Publisher: Springer
Pages: 70-95
Number of pages: 26
ISBN (Electronic): 9783319664354
ISBN (Print): 9783319664347
DOIs
Publication status: Published - 2017
Event: Evaluation in the Crowd: Crowdsourcing and Human-Centered Experiments 2015 (Dagstuhl Seminar 15481) - Dagstuhl Castle, Germany
Duration: 22 Nov 2015 - 27 Nov 2015

Publication series

Name: Lecture Notes in Computer Science
Publisher: Springer
Volume: 10264
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: Evaluation in the Crowd
Country/Territory: Germany
City: Dagstuhl Castle
Period: 22/11/15 - 27/11/15