From Detectables to Inspectables: Understanding Qualitative Analysis of Audiovisual Data
Paper at ACM CHI '21
Presented on May 11, 09:00–11:00 and 17:00–19:00 CEST; https://programs.sigchi.org/chi/2021/program/content/47454
This paper received an Honorable Mention Award, a distinction awarded to the top 5% of all CHI paper submissions that year.
Abstract
Audiovisual recordings of user studies and interviews provide important data in qualitative HCI research. Even when a textual transcription is available, researchers frequently turn to these recordings because of their rich information content. However, the temporal, unstructured nature of audiovisual recordings makes them less efficient to work with than text. Through interviews and a survey, we explored how HCI researchers work with audiovisual recordings. We investigated researchers' transcription and annotation practices, their overall analysis workflow, and the prevalence of direct analysis of audiovisual recordings. We found that a key task was locating and analyzing inspectables, interesting segments in recordings. Since locating inspectables can be time-consuming, participants looked for detectables, visual or auditory cues that indicate the presence of an inspectable. Based on our findings, we discuss the potential for automation in locating detectables in qualitative audiovisual analysis.
Video
Publications
- Krishna Subramanian, Johannes Maas, Jan Borchers, and James Hollan. From Detectables to Inspectables: Understanding Qualitative Analysis of Audiovisual Data. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems (CHI '21), ACM, May 2021.
- Johannes Maas. Understanding and Supporting Analysis of Audio and Video. Master's Thesis, RWTH Aachen University, Aachen, September 2020.