Lowering the Barrier for Successful Replication and Evaluation

Lücke-Tieke, Hendrik ; Beuth, Marcel ; Schader, Philipp ; May, Thorsten ; Bernard, Jürgen ; Kohlhammer, Jörn (2018)
Lowering the Barrier for Successful Replication and Evaluation.
2018 IEEE Workshop on Evaluation and Beyond - Methodological Approaches for Visualization (BELIV). Berlin, Germany (21 Oct 2018)
doi: 10.1109/BELIV.2018.8634201
Conference publication, Bibliography

Abstract

Evaluation of a visualization technique is complex and time-consuming. We present a system that aims at easing the design, creation, and execution of controlled experiments for visualizations on the web. We include parameterizable visualization generation services, thus separating the visualization implementation from study design and execution. This enables experimenters to design and run multiple experiments on the same visualization service in parallel, replicate experiments, and compare different visualization services quickly. The system supports the range from simple questionnaires to visualization-specific interaction techniques, as well as automated task generation based on dynamic sampling of parameter spaces. We feature two examples to demonstrate our service-based approach: one demonstrates how a suite of successive experiments can be conducted, while the other includes an extended replication study.
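The abstract describes two architectural ideas: visualization generation services that accept parameters and return a rendered view, and task generation by sampling a parameter space. The TypeScript sketch below illustrates how such a contract could look in a web setting; it is not taken from the paper, and all names (VisService, ParamSpace, sampleTasks, the scatterplot demo and its parameters) are illustrative assumptions.

// Hypothetical sketch of the service contract implied by the abstract:
// a parameterizable visualization generation service that a study
// platform can call, plus task generation by sampling a parameter space.
// Names and parameters are illustrative assumptions, not the paper's API.

type Params = Record<string, number>;

// A visualization generation service: parameters in, rendered view out.
interface VisService {
  name: string;
  render(params: Params): string; // e.g. an SVG document
}

// A parameter space: per-parameter numeric ranges to sample from.
type ParamSpace = Record<string, { min: number; max: number }>;

// Draw n task configurations by uniform sampling of the space.
function sampleTasks(space: ParamSpace, n: number): Params[] {
  const tasks: Params[] = [];
  for (let i = 0; i < n; i++) {
    const p: Params = {};
    for (const [key, { min, max }] of Object.entries(space)) {
      p[key] = min + Math.random() * (max - min);
    }
    tasks.push(p);
  }
  return tasks;
}

// Toy service: a scatterplot whose point count and size are parameters.
const scatterplotService: VisService = {
  name: "scatterplot-demo",
  render(params) {
    const n = Math.round(params.pointCount ?? 50);
    const r = params.pointSize ?? 3;
    const points = Array.from({ length: n }, () =>
      `<circle cx="${Math.random() * 400}" cy="${Math.random() * 300}" r="${r}"/>`
    ).join("");
    return `<svg xmlns="http://www.w3.org/2000/svg" width="400" height="300">${points}</svg>`;
  },
};

// An experiment definition only references a service and a parameter space.
const space: ParamSpace = {
  pointCount: { min: 20, max: 200 },
  pointSize: { min: 2, max: 6 },
};

for (const task of sampleTasks(space, 3)) {
  const svg = scatterplotService.render(task);
  console.log(scatterplotService.name, task, `${svg.length} bytes of SVG`);
}

Because an experiment only references a service and a parameter space in this sketch, several experiments could reuse or swap services, which mirrors the separation of visualization implementation from study design that the abstract describes.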

Type of entry: Conference publication
Published: 2018
Author(s): Lücke-Tieke, Hendrik ; Beuth, Marcel ; Schader, Philipp ; May, Thorsten ; Bernard, Jürgen ; Kohlhammer, Jörn
Kind of entry: Bibliography
Title: Lowering the Barrier for Successful Replication and Evaluation
Language: English
Year of publication: 2018
Event title: 2018 IEEE Workshop on Evaluation and Beyond - Methodological Approaches for Visualization (BELIV)
Event location: Berlin, Germany
Event date: 21 Oct 2018
DOI: 10.1109/BELIV.2018.8634201
URL / URN: https://doi.org/10.1109/BELIV.2018.8634201
Uncontrolled keywords: Human-computer interaction (HCI), User study, Evaluation
Department(s)/Field(s): 20 Department of Computer Science
20 Department of Computer Science > Graphisch-Interaktive Systeme
Date deposited: 29 Aug 2019 05:54
Last modified: 29 Aug 2019 05:54