Global Explanations for Bias Identification.
With our proposed method, we identified four distinct clusters. Each cluster reveals unique visual characteristics of the analyzed dataset, related to skin tone and skin lesion type, but also to the presence of unwanted artifacts. The first and second clusters appear to group images by lesion similarity, which is a welcome result in this case.
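The clustering step described above can be sketched as follows. This is a minimal illustration, not the authors' exact pipeline: it assumes image embeddings have already been extracted (here replaced by random vectors standing in for real CNN features) and groups them with k-means into four clusters, each of which can then be inspected for shared traits such as skin tone or artifacts.

```python
import numpy as np
from sklearn.cluster import KMeans


def cluster_embeddings(embeddings: np.ndarray,
                       n_clusters: int = 4,
                       seed: int = 0) -> np.ndarray:
    """Assign each image embedding to one of `n_clusters` groups.

    Each resulting cluster can be visually inspected to reveal what its
    images have in common (lesion type, skin tone, artifacts, ...).
    """
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=seed)
    return km.fit_predict(embeddings)


# Placeholder embeddings standing in for real features of dataset images.
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(200, 64))
labels = cluster_embeddings(embeddings)
```

In practice the embeddings would come from the model under analysis (or a pretrained feature extractor), so that images landing in the same cluster share traits the model is actually sensitive to.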
1. Discover bias with global explanations. [code]
2. Insert the bias into all images in the dataset. [code]
3. Check how the predictions changed.
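Steps 2 and 3 can be sketched together. This is a hypothetical, self-contained example, not the exact experiment: a black square stands in for a suspected artifact (e.g., a ruler or frame), it is pasted into every image, and the fraction of images whose predicted class flips is measured. The toy classifier keying on mean intensity exists only to make the sketch runnable end to end.

```python
import numpy as np


def add_artifact(images: np.ndarray, size: int = 8) -> np.ndarray:
    """Paste a black square (a stand-in for a suspected bias artifact)
    into the top-left corner of every image (N, H, W, C)."""
    biased = images.copy()
    biased[:, :size, :size, :] = 0.0
    return biased


def prediction_shift(predict, images: np.ndarray,
                     biased: np.ndarray) -> float:
    """Fraction of images whose predicted class changes after biasing."""
    before = predict(images)
    after = predict(biased)
    return float(np.mean(before != after))


# Hypothetical stand-ins: random "images" and a toy classifier that
# thresholds mean intensity -- only so the sketch executes.
rng = np.random.default_rng(1)
images = rng.uniform(size=(32, 64, 64, 3))
toy_predict = lambda x: (x.mean(axis=(1, 2, 3)) > 0.5).astype(int)

biased = add_artifact(images)
shift = prediction_shift(toy_predict, images, biased)
```

A large `shift` would indicate that the model relies on the inserted artifact rather than on lesion appearance, confirming the bias discovered by the global explanations.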