Year
1997
Type
Report
Title
Evaluating the interrater agreement of process capability rating
Abstract
The reliability of process assessments has received some study in the recent past, much of it conducted within the context of the SPICE trials. In this paper we build upon that work by evaluating the reliability of ratings for each of the practices that make up the SPICE capability dimension. The type of reliability we evaluate is interrater agreement: the agreement among independent assessors' capability ratings. Interrater agreement was found to be generally high. We also identify one practice whose ratings exhibit low agreement.
Publishing Place
Kaiserslautern