Friday September 6, 2024 5:30pm - 6:20pm EDT
Counseling researchers often use behavioral observation as a data collection strategy. Cohen's Kappa (or weighted Kappa) is a common coefficient for demonstrating interrater reliability with categorical data between two independent raters/coders. However, researchers may encounter high agreement between the two raters yet low or even negative Kappa values. This session will discuss the reasons for this paradox and demonstrate two alternative coefficients, prevalence-adjusted bias-adjusted Kappa (PABAK) and Gwet's AC1, as recommended solutions.
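A minimal sketch (not part of the session materials) of the paradox the abstract describes, using a hypothetical 2×2 agreement table with highly imbalanced marginals: observed agreement is 90%, yet Cohen's Kappa comes out negative, while PABAK and Gwet's AC1 (binary forms) remain high.

```python
# Hypothetical cell counts for two raters coding 100 observations:
# a = both "yes", b = rater1 yes / rater2 no,
# c = rater1 no / rater2 yes, d = both "no".
a, b, c, d = 90, 5, 5, 0
n = a + b + c + d

# Observed proportion of agreement
p_o = (a + d) / n

# Cohen's Kappa: chance agreement from each rater's marginal proportions
p_e_kappa = ((a + b) * (a + c) + (c + d) * (b + d)) / n**2
kappa = (p_o - p_e_kappa) / (1 - p_e_kappa)

# PABAK fixes chance agreement at 0.5, so it reduces to 2*p_o - 1
pabak = 2 * p_o - 1

# Gwet's AC1 (two-category case): chance agreement is 2*q*(1-q),
# where q is the average proportion of "yes" across the two raters
q = ((a + b) / n + (a + c) / n) / 2
p_e_ac1 = 2 * q * (1 - q)
ac1 = (p_o - p_e_ac1) / (1 - p_e_ac1)

print(f"Observed agreement: {p_o:.2f}")    # 0.90
print(f"Cohen's Kappa:      {kappa:.3f}")  # negative despite 90% agreement
print(f"PABAK:              {pabak:.2f}")
print(f"Gwet's AC1:         {ac1:.3f}")
```

With these counts, chance agreement for Kappa is 0.905 (driven almost entirely by the dominant "yes" category), pushing Kappa below zero even though the raters agree on 90 of 100 observations.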
Butler East
