Abstract:
|
Algorithmic fairness is top of mind for companies and data professionals. But when they try to measure fairness or correct outputs that unfairly discriminate against groups of people, they quickly run into practical obstacles. For example, demographic data is needed to even measure how fair an algorithm's outputs are, yet obtaining that data is a major challenge due to legal barriers, privacy concerns, and more. In this panel, we will hear from a wide range of researchers and practitioners about the obstacles they faced and how they worked through the practical issues of implementing fairness checks and corrections. Practitioners who want to implement fairness checks in their own organizations will come away with concrete strategies for doing so.
|