Borgo, Rita ; Lee, Bongshin ; Bach, Benjamin ; Fabrikant, Sara ; Jianu, Radu ; Kerren, Andreas ; Kobourov, Stephen ; McGee, Fintan ; Micallef, Luana ; Landesberger von Antburg, Tatiana ; Ballweg, Kathrin ; Diehl, Stephan ; Simonetto, Paolo ; Zhou, Michelle (2017):
Crowdsourcing for Information Visualization: Promises and Pitfalls.
In: Evaluation in the Crowd, pp. 96–138.
Springer, Berlin. 15481. Dagstuhl-Seminar, Dagstuhl Castle, Germany, November 22–27, 2015. ISBN 978-3-319-66434-7 (print), 978-3-319-66435-4 (online),
DOI: 10.1007/978-3-319-66435-4_5,
[Conference or Workshop Item]
Abstract
Crowdsourcing offers great potential to overcome the limitations of controlled lab studies. To guide future designs of crowdsourcing-based studies for visualization, we review visualization research that has attempted to leverage crowdsourcing for empirical evaluations of visualizations. We discuss six core aspects for the successful employment of crowdsourcing in empirical visualization studies: participants, study design, study procedure, data, tasks, and metrics & measures. We then present four case studies, discussing potential mechanisms to overcome common pitfalls. This chapter will help the visualization community understand how to effectively and efficiently take advantage of the exciting potential crowdsourcing has to offer to support empirical visualization research.
Item Type: | Conference or Workshop Item |
---|---|
Published: | 2017 |
Creators: | Borgo, Rita ; Lee, Bongshin ; Bach, Benjamin ; Fabrikant, Sara ; Jianu, Radu ; Kerren, Andreas ; Kobourov, Stephen ; McGee, Fintan ; Micallef, Luana ; Landesberger von Antburg, Tatiana ; Ballweg, Kathrin ; Diehl, Stephan ; Simonetto, Paolo ; Zhou, Michelle |
Title: | Crowdsourcing for Information Visualization: Promises and Pitfalls |
Language: | English |
Abstract: | Crowdsourcing offers great potential to overcome the limitations of controlled lab studies. To guide future designs of crowdsourcing-based studies for visualization, we review visualization research that has attempted to leverage crowdsourcing for empirical evaluations of visualizations. We discuss six core aspects for the successful employment of crowdsourcing in empirical visualization studies: participants, study design, study procedure, data, tasks, and metrics & measures. We then present four case studies, discussing potential mechanisms to overcome common pitfalls. This chapter will help the visualization community understand how to effectively and efficiently take advantage of the exciting potential crowdsourcing has to offer to support empirical visualization research. |
Title of Book: | Evaluation in the Crowd |
Place of Publication: | Berlin |
Publisher: | Springer |
ISBN: | 978-3-319-66434-7 (print); 978-3-319-66435-4 (online) |
Divisions: | 20 Department of Computer Science; 20 Department of Computer Science > Interactive Graphics Systems |
Event Location: | 15481. Dagstuhl-Seminar, Dagstuhl Castle, Germany |
Event Dates: | November 22–27, 2015 |
Date Deposited: | 04 May 2020 09:05 |
DOI: | 10.1007/978-3-319-66435-4_5 |
Additional Information: | 15481. Dagstuhl-Seminar. Lecture Notes in Computer Science, vol. 10264 |