The Infection Prevention and Control Department (IPC) at an acute care pediatric facility
is required by state law to report all healthcare-associated infections (HAIs) using
the National Healthcare Safety Network (NHSN) surveillance definitions. Because these
definitions leave room for subjective interpretation, inconsistent application can
compromise data integrity. In 2018, a novel surveillance validation program was created
as an internal approach to case validation and continuous surveillance education. The
aim of this project was to assess whether the quality assurance program supports
consistent identification of HAIs among Infection Preventionists (IPs) across IPC.
Validators, who are expert IPs, re-reviewed randomly sampled infection events that had
already been investigated by an IP in the department. They performed chart review and
documented in a data collection tool whether each infection was reportable to NHSN,
along with supporting rationale. Discrepancies were identified and cases were adjudicated.
Each month, IPC used a statistical method, Cohen's kappa, to determine inter-rater
reliability between the validator's response and the original documentation. Monthly
inter-rater reliability values were plotted to show trends in surveillance competency.
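
As a minimal sketch of this monthly calculation, assuming each validated case records
two binary determinations of reportability (the original IP's and the validator's),
Cohen's kappa can be computed per month as shown below. The column and variable names
are illustrative, not those of the department's data collection tool.

    import pandas as pd
    from sklearn.metrics import cohen_kappa_score

    # Hypothetical extract from the data collection tool (names are placeholders).
    cases = pd.DataFrame({
        "month": ["2020-07", "2020-07", "2020-07", "2020-08", "2020-08"],
        "ip_reportable":        [1, 0, 1, 1, 0],  # original IP determination
        "validator_reportable": [1, 0, 0, 1, 0],  # validator determination
    })

    # Cohen's kappa per month: observed agreement between the two raters,
    # corrected for the agreement expected by chance.
    monthly_kappa = {
        month: cohen_kappa_score(group["ip_reportable"],
                                 group["validator_reportable"])
        for month, group in cases.groupby("month")
    }
    print(monthly_kappa)  # e.g. {'2020-07': 0.4, '2020-08': 1.0}
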
One hundred thirty-one infections were selected for validation between July 2020 and
April 2021, and Cohen's kappa for the two raters was calculated for each month. The
median monthly kappa was 1.00 [interquartile range, 0.83-1.00], above the 0.61-0.80
range conventionally interpreted as substantial agreement. The minimum and maximum
monthly kappa values were 0.76 and 1.00, respectively.
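
As an illustration of how these summary statistics follow from the series of monthly
kappa values, a short sketch is shown below; the input values are placeholders, not
the study's actual monthly results.

    import numpy as np

    # Placeholder monthly kappa values for illustration only,
    # not the study's actual monthly results.
    monthly_kappas = np.array([0.76, 0.83, 1.00, 1.00, 0.90,
                               1.00, 1.00, 0.85, 1.00, 1.00])

    median = np.median(monthly_kappas)
    q1, q3 = np.percentile(monthly_kappas, [25, 75])
    print(f"median {median:.2f} [IQR {q1:.2f}-{q3:.2f}], "
          f"min {monthly_kappas.min():.2f}, max {monthly_kappas.max():.2f}")
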
Overall, this study demonstrates that Cohen's kappa is a valuable asset to the
department's surveillance validation program, allowing inter-rater reliability to be
assessed easily and on an ongoing basis. IPC programs could benefit from applying
statistical analyses to the evaluation of data integrity and data management.