JSM 2005 - Minneapolis

Abstract #303275

This is the preliminary program for the 2005 Joint Statistical Meetings in Minneapolis, Minnesota. The program currently includes the technical program (the schedule of invited, topic-contributed, regular contributed, and poster sessions); Continuing Education courses (August 7-10, 2005); and committee and business meetings. This online program will be updated frequently to reflect the most current revisions.

To View the Program:
You may choose to view all activities of the program or just parts of it at any one time. All activities are arranged by date and time.



The views expressed here are those of the individual authors
and not necessarily those of the ASA or its board, officers, or staff.


The program labels each meeting room with a letter code preceding the room name, designating the facility in which the room is located:

Minneapolis Convention Center = “MCC”
Hilton Minneapolis Hotel = “H”
Hyatt Regency Minneapolis = “HY”




Activity Number: 32
Type: Contributed
Date/Time: Sunday, August 7, 2005 : 2:00 PM to 3:50 PM
Sponsor: Biometrics Section
Abstract - #303275
Title: Assessment on Agreement Studies Using Adjusted Kappa and Yule's Index
Author(s): Jun-mo Nam*+
Companies: National Cancer Institute
Address: 6120 Executive Boulevard, Rockville, MD, 20852, United States
Keywords: Adjusted kappa; Yule's coefficient of colligation; Validity studies; Sensitivity; Specificity
Abstract:

Cohen's chance-corrected kappa has been widely applied as a measure of agreement between two raters. Because its value is influenced by prevalence and by bias between raters, use of the kappa may not be sound unless the corresponding marginal frequencies are comparable. The prevalence-adjusted, bias-adjusted kappa is more appropriate for such comparisons. We present interval estimation of the adjusted kappa. Alternatively, Yule's Y statistic, which is less dependent on the marginal frequencies, may also be applied as an index of agreement for comparisons. The Y value is greater than the adjusted kappa statistic. We investigate validity studies using the adjusted kappa and Y. When one rating method is better than the other by both coefficients of agreement, we may confirm the finding. However, it is not easy to determine which method is better if the preference under the adjusted kappa differs from that under the Y index. This may occur when one rating method is better than the other in sensitivity but not in specificity. In this case, we cannot decide a preference unless the relative importance of sensitivity and specificity is given.
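As a concrete illustration (not part of the abstract itself), the three agreement indices discussed above can be computed from a 2x2 table of two raters' yes/no ratings using their standard textbook formulas; the function name and the example counts below are hypothetical.

```python
import math

def agreement_indices(a, b, c, d):
    """Agreement indices for a 2x2 rater table:
    a = both raters say yes, b = rater 1 yes / rater 2 no,
    c = rater 1 no / rater 2 yes, d = both say no."""
    n = a + b + c + d
    po = (a + d) / n  # observed proportion of agreement
    # chance-expected agreement from the marginal frequencies
    pe = ((a + b) * (a + c) + (c + d) * (b + d)) / n**2
    kappa = (po - pe) / (1 - pe)  # Cohen's chance-corrected kappa
    pabak = 2 * po - 1            # prevalence-adjusted, bias-adjusted kappa
    # Yule's Y (coefficient of colligation), built from the odds ratio
    y = (math.sqrt(a * d) - math.sqrt(b * c)) / (math.sqrt(a * d) + math.sqrt(b * c))
    return kappa, pabak, y

# Hypothetical counts: 40 both-positive, 9 and 6 disagreements, 45 both-negative
kappa, pabak, y = agreement_indices(40, 9, 6, 45)
```

With these hypothetical counts, Y exceeds the adjusted kappa (PABAK), consistent with the ordering stated in the abstract.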


  • The address information is for the authors that have a + after their name.
  • Authors who are presenting talks have a * after their name.


JSM 2005 For information, contact jsm@amstat.org or phone (888) 231-3473. If you have questions about the Continuing Education program, please contact the Education Department.
Revised March 2005