Image Quality and Reader Agreement*
* κ-values for inter-reader agreement were 0.71 for reader 1 vs. reader 2, 0.76 for reader 1 vs. reader 3, and 0.69 for reader 2 vs. reader 3.