Publication: Attention IoU: Examining Biases in Image Classification Models using Attention Maps
Abstract
Computer vision models have been shown to exhibit and amplify biases across a wide array of datasets and tasks. Existing methods for quantifying bias in classification models primarily focus on dataset distributions and model performance on subgroups, overlooking a model's internal workings. We introduce the Attention-IoU (Attention Intersection over Union) metric and related scores, which use attention maps to reveal biases within a model's internal representations and to identify the image features potentially causing those biases. First, we validate Attention-IoU on the synthetic Waterbirds dataset, showing that the metric accurately measures model bias. We then investigate the CelebA dataset, finding that Attention-IoU uncovers correlations beyond accuracy disparities. By examining individual attributes with respect to the protected attribute Male, we analyze the distinct ways biases are represented in CelebA. We further explore contextual biases with Attention-IoU on the COCO dataset, highlighting the challenges of measuring contextual bias. Lastly, we analyze distribution shifts in iWildCam, revealing the impact of background environments in camera-trap images on model performance. Altogether, Attention-IoU reveals aspects of bias beyond dataset labels and model accuracies, enabling deeper insight into how bias is represented within computer vision models and supporting the development of better debiasing methods and fairer models.
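For intuition, an IoU-style score between two real-valued attention heatmaps can be generalized from binary-mask IoU using element-wise minimum and maximum. The sketch below illustrates this idea only; the function name `soft_iou` and the assumption of non-negative, comparably scaled maps are illustrative, and the paper's exact Attention-IoU definition may differ.

```python
import numpy as np

def soft_iou(map_a: np.ndarray, map_b: np.ndarray) -> float:
    """Soft intersection-over-union between two non-negative heatmaps.

    Element-wise minimum plays the role of intersection and element-wise
    maximum the role of union, reducing to standard IoU for binary masks.
    """
    intersection = np.minimum(map_a, map_b).sum()
    union = np.maximum(map_a, map_b).sum()
    # Guard against two all-zero maps.
    return float(intersection / union) if union > 0 else 0.0

# Two toy 2x2 attention maps: identical maps score 1.0,
# partially overlapping maps score between 0 and 1.
a = np.array([[0.5, 0.0], [0.5, 0.0]])
b = np.array([[0.5, 0.5], [0.0, 0.0]])
```

A score of 1 indicates the model attends to the same regions in both maps (e.g. a predicted attention map versus a ground-truth feature mask), while lower scores indicate diverging attention.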
This thesis partially contains material to be presented at the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) in Nashville, Tennessee, USA in June 2025 as "Attention IoU: Examining Biases in CelebA using Attention Maps", in collaboration with Tyler Zhu, Olga Russakovsky, and Vikram V. Ramaswamy. Our code and data are available at https://github.com/aaronserianni/attention-iou.