Describe the bug
When using XGBoost ≥ 3.1, the call
`shap.explainers.Tree.supports_model_with_masker(model, None)`
returns False, whereas it returned True with earlier XGBoost versions.
This happens because `learner_model_param["base_score"]` in `XGBTreeModelLoader.__init__` is now a list instead of a scalar, causing a type mismatch in SHAP's XGBoost model loader.
As a result, Shapash currently pins `xgboost < 3.1`, which prevents compatibility with recent versions.
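The failure mode can be illustrated in isolation. The values below are hypothetical stand-ins (in the serialized model config, `base_score` is a numeric string such as `"5E-1"`); they are not SHAP internals, only a sketch of why a scalar-to-list change breaks a plain `float()` conversion:

```python
# Hypothetical illustration of the type change, not SHAP internals:
# older XGBoost exposed base_score as a single scalar value, newer
# versions wrap it in a list, so a bare float() conversion fails.
base_score_old = "5E-1"    # scalar form (pre-3.1 style)
base_score_new = ["5E-1"]  # list-wrapped form (3.1+ style)

print(float(base_score_old))  # 0.5

try:
    float(base_score_new)  # float() does not accept a list
except TypeError as exc:
    print(f"TypeError: {exc}")
```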
Minimal reproducible example
```python
import numpy as np
import pandas as pd
import xgboost as xgb
import shap
from shapash.backend.shap_backend import ShapBackend  # optional, same issue occurs with shap directly

# Both model types are affected:
model_list = [
    xgb.XGBRegressor(n_estimators=1),
    xgb.XGBClassifier(n_estimators=1),
]

df = pd.DataFrame(range(0, 21), columns=["id"])
df["y"] = df["id"].apply(lambda x: 1 if x < 10 else 0)
df["x1"] = np.random.randint(1, 123, df.shape[0])
df["x2"] = np.random.randint(1, 3, df.shape[0])
df = df.set_index("id")
x_df = df[["x1", "x2"]]
y_df = df["y"].to_frame()

model = xgb.XGBRegressor(n_estimators=1)
model.fit(x_df, y_df)

print(shap.explainers.Tree.supports_model_with_masker(model, None))
# Expected: True
# Actual (with xgboost>=3.1): False
```

Environment
| Package | Version |
|---|---|
| SHAP | 0.49.1 |
| XGBoost | 3.1.0+ |
| Python | 3.10+ |
| OS | Ubuntu 22.04 |
Additional context
The next SHAP release includes a fix for this issue, but it is available only for Python ≥ 3.11 (see shap/shap#4202).
It would therefore be helpful to relax the strict `xgboost < 3.1` pin in Shapash and ensure internal compatibility with both XGBoost 3.0 and newer versions.
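Until the fixed SHAP release can be depended on across all supported Python versions, one option on the Shapash side is a small normalization shim that accepts both shapes of `base_score`. The helper below is a hypothetical sketch (`normalize_base_score` is not existing Shapash or SHAP code), assuming the only change is the list wrapping:

```python
# Hypothetical compatibility helper, not Shapash or SHAP code: unwrap a
# possibly list-wrapped base_score before converting it to float, so the
# same code path handles XGBoost < 3.1 and >= 3.1 serializations.
def normalize_base_score(raw):
    """Accept both the pre-3.1 scalar form and the >= 3.1 list form."""
    if isinstance(raw, (list, tuple)):
        if len(raw) != 1:
            raise ValueError(f"expected a single base_score, got {raw!r}")
        raw = raw[0]
    return float(raw)

print(normalize_base_score("5E-1"))    # 0.5 (scalar form)
print(normalize_base_score(["5E-1"]))  # 0.5 (list-wrapped form)
```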