%0 Generic
%A Nisbet, D.
%A McLennan, A.
%A Robertson, A.
%A Schluter, P.J.
%A Hyett, J.
%D 2011
%T Supplementary Material for: Reducing Inter-Rater Variability in the Assessment of Nuchal Translucency Image Quality
%U https://karger.figshare.com/articles/dataset/Supplementary_Material_for_Reducing_Inter-Rater_Variability_in_the_Assessment_of_Nuchal_Translucency_Image_Quality/5121790
%R 10.6084/m9.figshare.5121790.v1
%2 https://ndownloader.figshare.com/files/8706487
%K Nuchal translucency
%K Ultrasound
%K Audit
%K Inter-rater variability
%K Assessment
%K Intervention
%X Introduction: Standardization of first-trimester nuchal translucency (NT) image acquisition is crucial to the success of screening for Down syndrome. Rigorous audit of operator performance and constructive feedback from assessors maintain standards. This process relies on good inter-rater agreement on image assessment. We describe the Australian approach to NT image assessment and evaluate the impact of a targeted intervention on inter-rater agreement. Methods: Between 2002 and 2008, a group of experienced practitioners met nine times to compare their assessments of a series of NT images. Each assessor had previously scored the images according to a system described in 2002. Inter-rater agreement was evaluated before and after an intervention in which the assessors were required to refer to a detailed resource manual designed to reduce the subjectivity inherent in image assessment. Results: There was a statistically significant improvement in inter-rater agreement for all elements of image assessment (original scores and individual component scores) after the intervention, apart from horizontal fetal position. However, even after the intervention, inter-rater agreement levels generally remained moderate (kappa range: 0.14–0.58). Conclusions: This study has shown that providing detailed resource documentation to experienced assessors can significantly improve inter-rater agreement in all facets of NT image assessment. It also highlights areas of image assessment that require critical review. It is recommended that all audit bodies regularly review their inter-rater agreement to ensure consistent feedback to operators who submit images for expert peer review.
%I Karger Publishers