Maserejian, N. N. et al. Disparities in physicians' interpretations of heart disease symptoms by patient gender: Results of a video vignette factorial experiment. J. Womens Health 18, 1661–1667 (2009).
Zack, T. et al. Assessing the potential of GPT-4 to perpetuate racial and gender biases in health care: A model evaluation study. Lancet Digit. Health 6, e12–e22 (2024).
Gonen, H. & Goldberg, Y. Putting lipstick on a pig: Debiasing techniques mask, but do not remove, systematic gender bias in word embeddings. in Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers) (Burstein, J. et al., eds.) 609–614 (Association for Computational Linguistics, 2019).
Schröder, S. et al. Measuring fairness with biased data: A case study on the impact of unsupervised data on fairness assessments. in Advances in Computational Intelligence. IWANN 2023 Vol. 14134 (Rojas, I. et al., eds.) 134–145 (Springer, 2023).
Ktena, I. et al. Generative models improve fairness of medical classifiers under distribution shifts. Nat. Med. 30, 1166–1173 (2024).
Obermeyer, Z. et al. Dissecting racial bias in an algorithm used to manage the health of populations. Science 366, 447–453 (2019).
Caruana, R. et al. Intelligible models for healthcare: Predicting pneumonia risk and hospital 30-day readmission. in Proceedings of the 21st ACM SIGKDD International Conference on Knowledge Discovery and Data Mining 1721–1730 (ACM, 2015).
Srivastava, M., Hashimoto, T. & Liang, P. Robustness to spurious correlations via human annotations. in Proceedings of the 37th International Conference on Machine Learning Vol. 119, 9109–9119 (PMLR, 2020).
Yang, Y. et al. The limits of fair medical imaging AI in real-world generalization. Nat. Med. 30, 2838–2848 (2024).
Schrouff, J. et al. Diagnosing failures of fairness transfer across distribution shifts in real-world medical settings. in Advances in Neural Information Processing Systems Vol. 35, 19304–19318 (NeurIPS, 2022).