JSM 2003 Abstract #301857
Activity Number: 352
Type: Invited
Date/Time: Wednesday, August 6, 2003, 10:30 AM to 12:20 PM
Sponsor: Section on Health Policy Statistics
Abstract - #301857
Title: Is a Risk-Adjusted Bootstrap a Better Way to Find "Bad Apple" Providers?
Author(s): Dina Alper*+, Yang Zhao, John Haughton, and Arlene S. Ash
Companies: DxCG, Inc.; DxCG, Inc.; DxCG; and Boston University School of Medicine
Address: 25 Kingston St., Boston, MA 02111
Keywords: provider profiling; DCG; report cards
Abstract:

Health care oversight requires comparing the outcomes that providers achieve with the outcomes expected given their patients' health burden. Provider "profiling" involves calculating an observed and an expected outcome (O and E) for each provider, summarizing the discrepancy as, say, O/E, and determining which providers' O/Es are "unacceptably" different from 1.0. When a good risk adjuster is used to determine the Es and "chance" discrepancies are ignored, "outlier" O/E values can point to practices that really are worse (or better) than typical. We use a DCG model to predict annual costs for each patient from demographics and diagnoses (R² ≈ 0.50) and to calculate providers' Es in a large health insurance database. For each provider, we construct a bootstrapped distribution of O/E for patient panels with the same n and the same DCG distribution. We use these distributions to construct "real" 95% acceptance intervals, compare them to analytically calculated intervals, and summarize how well the analytic intervals conform to reality. We also explore conceptual and practical issues in provider profiling.
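
As a rough illustration of the bootstrap the abstract describes (not the authors' implementation), the Python sketch below builds, for a single provider, a null distribution of O/E from synthetic patient panels with the same size and DCG case mix drawn from the full population, and takes its central 95% as an acceptance interval. The column names (cost, dcg_expected, dcg_category), the stratified resampling scheme, and the bootstrap settings are all assumptions made for the example.

    # Rough sketch of a risk-adjusted bootstrap acceptance interval for one
    # provider's O/E ratio. All column names and parameter choices are
    # hypothetical; the resampling scheme is one plausible reading of
    # "panels with the same n and DCG distribution."

    import numpy as np
    import pandas as pd

    def acceptance_interval(population: pd.DataFrame,
                            panel: pd.DataFrame,
                            n_boot: int = 2000,
                            alpha: float = 0.05,
                            seed: int = 0):
        """Bootstrap (1 - alpha) acceptance interval for O/E under the null
        that the provider is typical, holding panel size and DCG mix fixed."""
        rng = np.random.default_rng(seed)
        # Pool of all patients, grouped by DCG risk category, for stratified draws
        strata = {cat: grp for cat, grp in population.groupby("dcg_category")}
        mix = panel["dcg_category"].value_counts()  # provider's case-mix counts

        ratios = np.empty(n_boot)
        for b in range(n_boot):
            obs_total = 0.0
            exp_total = 0.0
            for cat, n_k in mix.items():
                pool = strata[cat]
                idx = rng.integers(0, len(pool), size=n_k)  # resample with replacement
                obs_total += pool["cost"].to_numpy()[idx].sum()          # observed costs
                exp_total += pool["dcg_expected"].to_numpy()[idx].sum()  # DCG-predicted costs
            ratios[b] = obs_total / exp_total  # O/E for a synthetic "typical" panel

        lo, hi = np.quantile(ratios, [alpha / 2.0, 1.0 - alpha / 2.0])
        return lo, hi

    # Usage: a provider is flagged only if its actual O/E falls outside the interval.
    # lo, hi = acceptance_interval(all_patients, provider_panel)
    # oe = provider_panel["cost"].sum() / provider_panel["dcg_expected"].sum()
    # flagged = not (lo <= oe <= hi)

Resampling within DCG categories is only one way to hold the case mix fixed; the abstract does not specify exactly how the authors matched the DCG distribution of each provider's panel.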


  • The address information is for the authors who have a + after their name.
  • Authors who are presenting talks have a * after their name.

