A recap of “Rights and Ethics: When technology knows how you feel” at RightsCon 2019, with insights shared by technologists, lawyers, policymakers, and human rights activists. Quite a few comments concern the false positive and false negative errors of these algorithms, and the harmful outcomes that could result from them.
This paper raises the problem of side-stepping fairness checks in black-box model auditing. Since the deployed model itself is unavailable, black-box audits may rely on surrogate models. A company could therefore build a surrogate model that is fairer than the original model specifically for the audit, creating the appearance of using a fair algorithm while deploying an unfair one.
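A minimal sketch of the loophole, on synthetic data (all names, thresholds, and the demographic-parity metric are illustrative assumptions, not from the paper): a deployed "black box" secretly applies a stricter threshold to one group, while the audit-facing surrogate uses a single threshold for everyone. The surrogate agrees with the black box on most inputs, so it looks like a faithful proxy, yet its demographic-parity gap is near zero while the black box's is large.

```python
import random

random.seed(0)

# Synthetic applicants: (score in [0, 1], group in {0, 1});
# group is the sensitive attribute.
applicants = [(random.random(), random.randint(0, 1)) for _ in range(10_000)]

def black_box(score, group):
    # Deployed model: secretly penalizes group 1 with a higher threshold.
    threshold = 0.5 if group == 0 else 0.7
    return int(score >= threshold)

def surrogate(score, group):
    # Audit-facing surrogate: one threshold for everyone, fair by
    # construction, yet it agrees with the black box on most inputs.
    return int(score >= 0.5)

def demographic_parity_gap(model):
    # |P(accept | group 0) - P(accept | group 1)|
    rates = []
    for g in (0, 1):
        members = [(s, grp) for s, grp in applicants if grp == g]
        rates.append(sum(model(s, grp) for s, grp in members) / len(members))
    return abs(rates[0] - rates[1])

# Fidelity: fraction of inputs where surrogate and black box agree.
fidelity = sum(black_box(s, g) == surrogate(s, g)
               for s, g in applicants) / len(applicants)

print(f"fidelity of surrogate to black box: {fidelity:.2f}")
print(f"parity gap, black box: {demographic_parity_gap(black_box):.2f}")
print(f"parity gap, surrogate: {demographic_parity_gap(surrogate):.2f}")
```

The point is that high fidelity on audit queries is compatible with a large fairness gap in deployment, so an audit that only sees the surrogate can be fooled.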