Reproducible error: SHAP ExplainerError: Additivity check failed in TreeExplainer #873

Open
yanisvdc opened this issue Apr 5, 2024 · 4 comments

Comments


yanisvdc commented Apr 5, 2024

This is related to issue #866

import numpy as np
import pandas as pd
import shap
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import MinMaxScaler
from sklearn.ensemble import RandomForestClassifier
from sklearn.dummy import DummyClassifier
from econml.dml import CausalForestDML

# Create a sample DataFrame X
np.random.seed(0)  # for reproducibility
data = {
    'Category': np.random.randint(2, size=100),
    'Value': np.random.randint(100, size=100)
}
X = pd.DataFrame(data)

# Creating binary arrays T and y
T = np.random.randint(2, size=100)
y = np.random.randint(2, size=100)

# Splitting the data into training and testing sets
X_train, X_test, T_train, T_test, y_train, y_test = train_test_split(X, T, y, test_size=0.2, random_state=42)

to_scale = ["Value"]
scaler = MinMaxScaler()
# Fit scaler on training data and transform both training and testing data
X_train[to_scale] = scaler.fit_transform(X_train[to_scale])
X_test[to_scale] = scaler.transform(X_test[to_scale])

est = CausalForestDML(
    model_y=RandomForestClassifier(),
    model_t=DummyClassifier(strategy='uniform'),
    random_state=123,
    discrete_outcome=True,
    discrete_treatment=True
)
est.fit(y_train, T_train, X=X_train)
shap_values = est.shap_values(X_train[:20])
shap.plots.beeswarm(shap_values['Y0']['T0_1'])

If you try to comment out or remove the lines:

to_scale = ["Value"]
scaler = MinMaxScaler()
# Fit scaler on training data and transform both training and testing data
X_train[to_scale] = scaler.fit_transform(X_train[to_scale])
X_test[to_scale] = scaler.transform(X_test[to_scale])

the code runs correctly.

It seems that the scaler induces the issue; the same happens with StandardScaler(). My guess is that floating-point rounding errors are responsible for this behavior. I believe there is a PR to add check_additivity=False, which seems to be the only way to resolve this, unless I am missing something.
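For reference, here is what that escape hatch looks like when calling SHAP directly on a plain scikit-learn tree model rather than through est.shap_values() (a minimal sketch on toy data; it does not go through EconML, it only shows the check_additivity=False flag that the PR would expose):

import numpy as np
import shap
from sklearn.ensemble import RandomForestRegressor

# Toy regression data; the values themselves are arbitrary.
rng = np.random.default_rng(0)
X = rng.random((100, 2))
y = rng.random(100)

model = RandomForestRegressor(random_state=0).fit(X, y)

# TreeExplainer normally verifies that the SHAP values sum to the model
# output; check_additivity=False skips that verification when small
# floating-point discrepancies would otherwise raise an ExplainerError.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X[:20], check_additivity=False)
print(shap_values.shape)  # (20, 2)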

jcreinhold (Contributor) commented Apr 5, 2024

Thanks for the code. I was trying to write a consistent unit test for PR #872 (which fixes this issue and #866); however, I'm unable to reproduce the error with the following code (structured as a unit test in test_shap.py):

def test_check_additivity_handling(self):
    np.random.seed(0)
    n = 100
    X = np.hstack([np.random.randint(2, size=(n, 1)), np.random.randint(100, size=(n, 1))])
    T = np.random.randint(2, size=(n,))
    Y = np.random.randint(2, size=(n,))
    X_train, _, T_train, _, Y_train, _ = train_test_split(X, T, Y, test_size=0.2, random_state=42)
    scaler = MinMaxScaler()
    X_train[:, 1:2] = scaler.fit_transform(X_train[:, 1:2])
    est = CausalForestDML(
        model_y=RandomForestClassifier(),
        model_t=DummyClassifier(strategy="uniform"),
        random_state=123,
        discrete_outcome=True,
        discrete_treatment=True
    )
    est.fit(Y_train, T_train, X=X_train)
    shap_values = est.shap_values(X_train[:20])
    assert shap_values is not None

As discussed in my second comment in #866, I don't believe there's a consistent way to reproduce this error across machines—for various reasons. However, #872 will fix this issue regardless.

jcreinhold (Contributor) commented Apr 5, 2024

@yanisvdc Do you have the literal values for X_train, T_train, and y_train that create the error (after scaling)? I can try hard-coding those values in and seeing if that reproduces the error. Given they're size 80, this seems feasible.

Could you please try the following (on the set of variables that cause the error):

arrays = {
    "X": X_train,  # after scaling (potentially call .to_numpy() on it if it's a df)
    "T": T_train.squeeze(),
    "Y": y_train.squeeze(),
}
for name, arr in arrays.items():
    mlw = 20 if name != "X" else 50
    arr_str = np.array2string(arr, separator=", ", floatmode="maxprec", precision=16, max_line_width=mlw)
    print(f"{name}:\n\n{arr_str}")

and copy the results here in a markdown codeblock for each variable?

yanisvdc (Author) commented Apr 11, 2024

Thanks for your comments.
Your test_check_additivity_handling() test fails once .astype(float) is appended to X, and passes without it. The problem is that without .astype(float), X_train is an integer array, so assigning the scaled column back into it casts the values to int and the MinMaxScaler output collapses to zeros (except for the maximum). Your current test therefore passes only because X_train ends up holding binary integer values. In the more general case, with .astype(float) applied to X, MinMaxScaler produces floats between 0 and 1 and the error shown below is raised (see the error message after the code with .astype(float) added to X).
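As a side demonstration of the dtype point (a minimal sketch with made-up values, independent of the reproduction below):

import numpy as np
from sklearn.preprocessing import MinMaxScaler

X_int = np.array([[0, 20], [1, 40], [1, 99]])  # integer dtype
X_float = X_int.astype(float)

scaler = MinMaxScaler()
# Assigning the float output back into an int array truncates toward zero,
# so every scaled value below 1.0 becomes 0.
X_int[:, 1:2] = scaler.fit_transform(X_int[:, 1:2])
X_float[:, 1:2] = scaler.fit_transform(X_float[:, 1:2])

print(X_int[:, 1])    # [0 0 1]
print(X_float[:, 1])  # [0.         0.25316456 1.        ]

The full reproduction, with .astype(float) applied to X, follows: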

import numpy as np
import shap
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import MinMaxScaler
from sklearn.ensemble import RandomForestClassifier
from sklearn.dummy import DummyClassifier
from econml.dml import CausalForestDML

np.random.seed(0)
n = 100
X = np.hstack([np.random.randint(2, size=(n, 1)), np.random.randint(100, size=(n, 1))]).astype(float)
T = np.random.randint(2, size=(n,))
Y = np.random.randint(2, size=(n,))
X_train, _, T_train, _, Y_train, _ = train_test_split(X, T, Y, test_size=0.2, random_state=42)
scaler = MinMaxScaler()
X_train[:, 1:2] = scaler.fit_transform(X_train[:, 1:2])
est = CausalForestDML(
    model_y=RandomForestClassifier(),
    model_t=DummyClassifier(strategy="uniform"),
    random_state=123,
    discrete_outcome=True,
    discrete_treatment=True
)
est.fit(Y_train, T_train, X=X_train)
shap_values = est.shap_values(X_train[:20])
shap.plots.beeswarm(shap_values['Y0']['T0_1'])

Error message:

ExplainerError: Additivity check failed in TreeExplainer! Please ensure the data matrix you passed to the explainer is the same shape that the model was trained on. If your data shape is correct then please report this on GitHub. This check failed because for one of the samples the sum of the SHAP values was -0.061160, while the model output was -0.059755. If this difference is acceptable you can set check_additivity=False to disable this check.

Please let me know if you experience the same behavior.
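For what it's worth, the mismatch reported above is small, which is consistent with the floating-point-rounding explanation (a quick check using only the numbers from the error message; SHAP's exact tolerance formula may differ between versions):

shap_sum = -0.061160      # sum of the SHAP values, as reported by TreeExplainer
model_output = -0.059755  # model output, as reported by TreeExplainer

abs_diff = abs(shap_sum - model_output)
print(abs_diff)                      # ~0.001405
print(abs_diff / abs(model_output))  # ~0.0235, i.e. about 2.4% relative error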

yanisvdc (Author) commented


To answer your question, here are the exact values (after scaling):

X:

[[0.                , 0.202020202020202 ],
 [1.                , 0.4040404040404041],
 [1.                , 0.696969696969697 ],
 [0.                , 0.4343434343434344],
 [0.                , 0.5151515151515152],
 [0.                , 0.2323232323232323],
 [0.                , 0.                ],
 [0.                , 0.6767676767676768],
 [1.                , 0.5252525252525253],
 [0.                , 0.5454545454545455],
 [0.                , 0.1515151515151515],
 [1.                , 0.98989898989899  ],
 [1.                , 0.9494949494949496],
 [1.                , 0.                ],
 [0.                , 0.2626262626262627],
 [1.                , 0.686868686868687 ],
 [0.                , 0.4343434343434344],
 [0.                , 0.8686868686868687],
 [1.                , 0.98989898989899  ],
 [0.                , 0.797979797979798 ],
 [1.                , 0.3535353535353536],
 [1.                , 0.0303030303030303],
 [1.                , 0.0303030303030303],
 [0.                , 0.5252525252525253],
 [1.                , 0.9494949494949496],
 [1.                , 0.4848484848484849],
 [1.                , 0.303030303030303 ],
 [1.                , 0.9595959595959597],
 [1.                , 0.6565656565656566],
 [1.                , 1.                ],
 [0.                , 0.6464646464646465],
 [0.                , 0.686868686868687 ],
 [0.                , 0.1313131313131313],
 [1.                , 0.2121212121212121],
 [1.                , 0.7676767676767677],
 [1.                , 0.5656565656565657],
 [1.                , 0.4747474747474748],
 [1.                , 0.101010101010101 ],
 [0.                , 0.4242424242424243],
 [0.                , 0.7272727272727273],
 [1.                , 0.8181818181818182],
 [0.                , 0.9696969696969697],
 [1.                , 1.                ],
 [1.                , 0.5858585858585859],
 [0.                , 0.9494949494949496],
 [0.                , 0.1111111111111111],
 [1.                , 0.0202020202020202],
 [0.                , 0.0202020202020202],
 [0.                , 0.5858585858585859],
 [1.                , 0.6161616161616162],
 [0.                , 0.6060606060606061],
 [1.                , 0.101010101010101 ],
 [0.                , 0.2727272727272728],
 [0.                , 0.6262626262626263],
 [1.                , 0.1414141414141414],
 [0.                , 0.5050505050505051],
 [1.                , 0.1919191919191919],
 [1.                , 0.4848484848484849],
 [1.                , 0.6666666666666667],
 [1.                , 0.1414141414141414],
 [0.                , 0.3636363636363636],
 [1.                , 0.0303030303030303],
 [1.                , 0.7777777777777778],
 [0.                , 0.5050505050505051],
 [1.                , 0.8484848484848485],
 [1.                , 0.8282828282828284],
 [1.                , 0.696969696969697 ],
 [1.                , 0.7575757575757577],
 [0.                , 0.3535353535353536],
 [1.                , 0.1313131313131313],
 [1.                , 0.9595959595959597],
 [0.                , 0.3838383838383839],
 [1.                , 0.98989898989899  ],
 [0.                , 0.2424242424242424],
 [0.                , 0.494949494949495 ],
 [1.                , 0.4141414141414142],
 [1.                , 0.3232323232323233],
 [0.                , 0.5858585858585859],
 [0.                , 0.7272727272727273],
 [1.                , 0.6767676767676768]]
T:

[0, 1, 0, 0, 1, 0,
 0, 1, 1, 0, 0, 1,
 1, 1, 1, 0, 1, 0,
 1, 1, 1, 1, 1, 1,
 0, 1, 0, 0, 0, 1,
 1, 1, 0, 1, 1, 1,
 1, 0, 0, 0, 1, 1,
 1, 1, 1, 1, 1, 1,
 0, 1, 0, 0, 1, 1,
 1, 0, 0, 1, 1, 1,
 1, 1, 0, 1, 1, 0,
 1, 1, 1, 1, 1, 0,
 1, 0, 0, 0, 1, 1,
 1, 0]
Y:

[1, 1, 1, 0, 1, 1,
 0, 1, 1, 1, 1, 1,
 1, 1, 1, 1, 1, 0,
 1, 0, 0, 0, 1, 0,
 0, 0, 0, 0, 1, 1,
 0, 0, 0, 0, 0, 0,
 1, 0, 1, 1, 0, 0,
 1, 0, 1, 1, 1, 0,
 0, 1, 1, 1, 1, 1,
 1, 1, 0, 0, 1, 0,
 1, 1, 1, 0, 0, 0,
 1, 0, 1, 0, 0, 1,
 1, 1, 1, 1, 1, 1,
 0, 0]
