
Crowdsourcing for Information Visualization: Promises and Pitfalls

Borgo, Rita ; Lee, Bongshin ; Bach, Benjamin ; Fabrikant, Sara ; Jianu, Radu ; Kerren, Andreas ; Kobourov, Stephen ; McGee, Fintan ; Micallef, Luana ; Landesberger von Antburg, Tatiana ; Ballweg, Kathrin ; Diehl, Stephan ; Simonetto, Paolo ; Zhou, Michelle (2017)
Crowdsourcing for Information Visualization: Promises and Pitfalls.
Dagstuhl Seminar 15481, Dagstuhl Castle, Germany (22.11.2015-27.11.2015)
doi: 10.1007/978-3-319-66435-4_5
Conference publication, Bibliography

Abstract

Crowdsourcing offers great potential to overcome the limitations of controlled lab studies. To guide future designs of crowdsourcing-based studies for visualization, we review visualization research that has attempted to leverage crowdsourcing for empirical evaluations of visualizations. We discuss six core aspects for the successful employment of crowdsourcing in empirical visualization studies: participants, study design, study procedure, data, tasks, and metrics & measures. We then present four case studies, discussing potential mechanisms to overcome common pitfalls. This chapter will help the visualization community understand how to effectively and efficiently take advantage of the potential crowdsourcing offers to support empirical visualization research.

Entry type: Conference publication
Published: 2017
Author(s): Borgo, Rita ; Lee, Bongshin ; Bach, Benjamin ; Fabrikant, Sara ; Jianu, Radu ; Kerren, Andreas ; Kobourov, Stephen ; McGee, Fintan ; Micallef, Luana ; Landesberger von Antburg, Tatiana ; Ballweg, Kathrin ; Diehl, Stephan ; Simonetto, Paolo ; Zhou, Michelle
Type of entry: Bibliography
Title: Crowdsourcing for Information Visualization: Promises and Pitfalls
Language: English
Year of publication: 2017
Place of publication: Berlin
Publisher: Springer
Book title: Evaluation in the Crowd
Event location: Dagstuhl Seminar 15481, Dagstuhl Castle, Germany
Event date: 22.11.2015-27.11.2015
DOI: 10.1007/978-3-319-66435-4_5

Additional information:

Dagstuhl Seminar 15481; Lecture Notes in Computer Science, vol. 10264

Department(s)/Field(s): 20 Department of Computer Science
20 Department of Computer Science > Graphisch-Interaktive Systeme (Interactive Graphics Systems)
Date deposited: 04 May 2020 09:05
Last modified: 22 Jul 2021 18:31