taesiri committed on
Commit
4d63e18
1 Parent(s): b894da5

Upload abstract/2210.10769.txt with huggingface_hub

Files changed (1)
  1. abstract/2210.10769.txt +5 -0
abstract/2210.10769.txt ADDED
@@ -0,0 +1,5 @@
+ Machine learning models frequently experience performance drops under distribution shifts. Such shifts can arise from multiple simultaneous factors, such as changes in data quality, differences in specific covariate distributions, or changes in the relationship between labels and features. When a model does fail during deployment, attributing the performance change to these factors is critical for the model developer to identify the root cause and take mitigating actions.
+
+ In this work, we introduce the problem of attributing performance differences between environments to distribution shifts in the underlying data-generating mechanisms. We formulate the problem as a cooperative game in which the players are distributions. We define the value of a set of distributions to be the change in model performance when only this set of distributions has changed between environments, and we derive an importance-weighting method for computing the value of an arbitrary set of distributions.
+
+ The contribution of each distribution to the total performance change is then quantified as its Shapley value. We demonstrate the correctness and utility of our method on synthetic, semi-synthetic, and real-world case studies, showing its effectiveness in attributing performance changes to a wide range of distribution shifts.
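The abstract describes a two-step recipe: a value function v(S), the change in model performance when only the distributions in S are shifted to the new environment (estimated in the paper via importance weighting), and a Shapley value over that game to credit each distribution, phi_i = sum over S subset of N\{i} of |S|!(n-|S|-1)!/n! * [v(S U {i}) - v(S)]. Below is a minimal, illustrative Python sketch of the Shapley step only. It is not the paper's implementation: the player names P(X) and P(Y|X) and the additive toy value function are assumptions standing in for the importance-weighting estimator.

# Illustrative sketch only (assumptions, not the paper's code): the players
# are candidate distributions/mechanisms, and value(S) is the change in model
# performance when only the distributions in S are shifted to the target
# environment. In the paper this value is estimated via importance weighting;
# here a toy additive value function stands in for that estimator.
from itertools import combinations
from math import factorial
from typing import Callable, Dict, FrozenSet, Sequence


def shapley_attribution(
    players: Sequence[str],
    value: Callable[[FrozenSet[str]], float],
) -> Dict[str, float]:
    """Exact Shapley values: each player's average marginal contribution
    to value(S), averaged over all coalitions of the other players."""
    n = len(players)
    phi = {p: 0.0 for p in players}
    for p in players:
        others = [q for q in players if q != p]
        for k in range(n):
            for subset in combinations(others, k):
                s = frozenset(subset)
                # Standard Shapley weight for a coalition of size k.
                w = factorial(k) * factorial(n - k - 1) / factorial(n)
                phi[p] += w * (value(s | {p}) - value(s))
    return phi


if __name__ == "__main__":
    # Hypothetical example: shifting the covariate distribution P(X) costs
    # 1 point of accuracy, shifting the concept P(Y|X) costs 3 points, with
    # no interaction. The Shapley values then recover exactly those costs.
    costs = {"P(X)": 1.0, "P(Y|X)": 3.0}

    def value(shifted: FrozenSet[str]) -> float:
        return sum(costs[p] for p in shifted)

    print(shapley_attribution(list(costs), value))
    # {'P(X)': 1.0, 'P(Y|X)': 3.0}

Exact enumeration over all subsets is exponential in the number of candidate distributions, so with many mechanisms one would typically approximate the Shapley values, for example by sampling permutations, rather than enumerating every coalition as this sketch does.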