Intraclass Correlation (ICC) Calculator
Enter ratings from multiple raters on multiple subjects to calculate three forms of the intraclass correlation coefficient. ICC measures how consistently raters assign similar scores to the same subjects.
What is the ICC?
The intraclass correlation coefficient (ICC) quantifies reliability or agreement between two or more raters. Unlike the Pearson correlation, which is insensitive to systematic differences between raters (e.g. one rater scoring consistently higher than another), the absolute-agreement forms of the ICC penalize such differences as well as random disagreement.
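A quick worked contrast, as a minimal standard-library sketch with made-up example data: rater 2 scores every subject exactly 2 points higher than rater 1, so the Pearson correlation is a perfect 1.0, while ICC(2,1), computed from the two-way ANOVA formulas given below, drops to about 0.56 because it measures absolute agreement.

```python
from statistics import mean

# Two raters score five subjects; rater 2 is a constant 2 points higher.
r1 = [1.0, 2.0, 3.0, 4.0, 5.0]
r2 = [x + 2 for x in r1]

# Pearson correlation ignores the constant offset entirely.
m1, m2 = mean(r1), mean(r2)
pearson = (sum((a - m1) * (b - m2) for a, b in zip(r1, r2))
           / (sum((a - m1) ** 2 for a in r1)
              * sum((b - m2) ** 2 for b in r2)) ** 0.5)

# ICC(2,1) (absolute agreement) penalizes the systematic shift.
n, k = len(r1), 2
grand = mean(r1 + r2)
msr = k * sum((mean(pair) - grand) ** 2 for pair in zip(r1, r2)) / (n - 1)
msc = n * ((m1 - grand) ** 2 + (m2 - grand) ** 2) / (k - 1)
ss_total = sum((x - grand) ** 2 for x in r1 + r2)
mse = (ss_total - msr * (n - 1) - msc * (k - 1)) / ((n - 1) * (k - 1))
icc21 = (msr - mse) / (msr + (k - 1) * mse + (k / n) * (msc - mse))

print(round(pearson, 2), round(icc21, 2))  # 1.0 0.56
```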
ICC Types:
- ICC(1,1) -- One-way random: Each subject is rated by a different random set of raters. Appropriate when raters are sampled from a larger population.
- ICC(2,1) -- Two-way random, single measures: Both subjects and raters are random effects. Measures absolute agreement.
- ICC(3,1) -- Two-way mixed, single measures: Subjects are random effects, but raters are fixed: they are the only raters of interest, and results do not generalize to other raters. Measures consistency (systematic differences between raters are ignored).
Formulas (from two-way ANOVA):
- ICC(1,1) = (MSR - MSW) / (MSR + (k-1) × MSW)
- ICC(2,1) = (MSR - MSE) / (MSR + (k-1) × MSE + (k/n) × (MSC - MSE))
- ICC(3,1) = (MSR - MSE) / (MSR + (k-1) × MSE)
Where MSR = mean square between subjects (rows), MSC = mean square between raters (columns), MSE = residual mean square, MSW = within-subjects mean square (pooling MSC and MSE), n = number of subjects, and k = number of raters.
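The formulas above can be sketched as a small pure-Python function (the function name and structure are my own, not a fixed API). It derives the mean squares from an n-subjects × k-raters matrix and is checked here against the classic worked example from Shrout & Fleiss (1979), whose published values are 0.17, 0.29, and 0.71:

```python
def icc(ratings):
    """Compute ICC(1,1), ICC(2,1), ICC(3,1) from an n-subjects x k-raters matrix."""
    n = len(ratings)        # subjects (rows)
    k = len(ratings[0])     # raters (columns)
    grand = sum(sum(row) for row in ratings) / (n * k)
    row_means = [sum(row) / k for row in ratings]
    col_means = [sum(ratings[i][j] for i in range(n)) / n for j in range(k)]

    ss_total = sum((x - grand) ** 2 for row in ratings for x in row)
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)   # between subjects
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)   # between raters
    ss_err = ss_total - ss_rows - ss_cols                    # residual

    msr = ss_rows / (n - 1)
    msc = ss_cols / (k - 1)
    mse = ss_err / ((n - 1) * (k - 1))
    msw = (ss_total - ss_rows) / (n * (k - 1))               # within subjects

    icc1 = (msr - msw) / (msr + (k - 1) * msw)
    icc2 = (msr - mse) / (msr + (k - 1) * mse + (k / n) * (msc - mse))
    icc3 = (msr - mse) / (msr + (k - 1) * mse)
    return icc1, icc2, icc3

# Shrout & Fleiss (1979) example: 6 subjects rated by 4 judges
data = [[9, 2, 5, 8], [6, 1, 3, 2], [8, 4, 6, 8],
        [7, 1, 2, 6], [10, 5, 6, 9], [6, 2, 4, 7]]
icc1, icc2, icc3 = icc(data)
print(round(icc1, 2), round(icc2, 2), round(icc3, 2))  # 0.17 0.29 0.71
```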
Interpretation (Cicchetti, 1994):
| ICC | Reliability |
|---|---|
| < 0.40 | Poor |
| 0.40 - 0.59 | Fair |
| 0.60 - 0.74 | Good |
| 0.75 - 1.00 | Excellent |
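The table above maps directly to a lookup function; this sketch (a hypothetical helper, not part of the calculator) applies Cicchetti's cutoffs with half-open intervals so every ICC value falls into exactly one category:

```python
def cicchetti_label(icc_value):
    """Map an ICC value to Cicchetti's (1994) reliability category."""
    if icc_value < 0.40:
        return "Poor"
    if icc_value < 0.60:
        return "Fair"
    if icc_value < 0.75:
        return "Good"
    return "Excellent"

print(cicchetti_label(0.71))  # Good
```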