Final Thesis: Intercoder Evaluation Metrics in QDAcity 

Abstract: In qualitative research, Intercoder Agreement (ICA) represents the extent to which two or more researchers share the same interpretation of the data. Assessing ICA adds investigator triangulation to a project and helps researchers validate their findings. QDAcity is a cloud-based platform for qualitative data analysis that allows researchers to collaborate by jointly defining a code system and then coding the same documents independently. Researchers can assess their coding using three ICA metrics: F-Measure, Krippendorff’s Alpha and Fleiss’ Kappa, which can be computed with ‘paragraph’ as the evaluation unit for selected documents. In this thesis, we extend the existing metrics by providing ‘sentence’ as an additional evaluation unit and an option to calculate agreement on a subset of the code system, which enables collaboration on larger projects. Additionally, we present the design and implementation of a new ICA metric called Agreement Queries, which generates results based on agreement types such as Code Occurrence, Code Frequency and Code Intersection Percentage.
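To give a concrete sense of one of the ICA metrics named above, the following is a minimal sketch of Fleiss’ Kappa for a fixed number of raters per item. The input format (a per-item list of category counts) and the function name are illustrative assumptions, not QDAcity’s actual implementation.

```python
def fleiss_kappa(ratings):
    """Fleiss' Kappa for `ratings`, a list where each entry gives the
    number of raters who assigned each category to one item.
    All items must have the same total number of raters."""
    n_items = len(ratings)
    n_raters = sum(ratings[0])
    # Observed agreement per item: fraction of rater pairs that agree.
    p_i = [(sum(c * c for c in row) - n_raters) / (n_raters * (n_raters - 1))
           for row in ratings]
    p_bar = sum(p_i) / n_items
    # Expected agreement from the marginal category proportions.
    total = n_items * n_raters
    p_j = [sum(row[j] for row in ratings) / total
           for j in range(len(ratings[0]))]
    p_e = sum(p * p for p in p_j)
    return (p_bar - p_e) / (1 - p_e)
```

For example, two raters agreeing on every item yields a kappa of 1.0, while chance-level agreement drives the value toward 0 or below.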

Keywords: QDA, qualitative data analysis, intercoder agreement, interrater agreement, QDAcity

PDF: Master Thesis

Reference: Vishwas Anavatti. Intercoder Evaluation Metrics in QDAcity. Master Thesis. Friedrich-Alexander-Universität Erlangen-Nürnberg: 2022.