
Commit a83698f

Merge pull request #11 from christinaexyou/update-readme
Update README and other minor edits
2 parents: 403bac9 + e37522b

File tree

3 files changed: +72, -28 lines

README.md

Lines changed: 66 additions & 22 deletions
@@ -12,15 +12,15 @@ The TrustyAI KServe integration provides explanations for predictions made by AI
 
 The TrustyAI explainer can be added to KServe `InferenceServices`. Here are YAML configurations to deploy explainers with LIME and SHAP:
 
-### LIME Explainer `InferenceService`
+### LIME and SHAP Explainer `InferenceService`
 
-By default, the TrustyAI KServe explainer will use the LIME explainer. You can deploy the explainer using the following YAML configuration:
+By default, the TrustyAI KServe explainer will use **both the LIME and SHAP explainers**. You can deploy the explainers using the following YAML configuration:
 
 ```yaml
 apiVersion: "serving.kserve.io/v1beta1"
 kind: "InferenceService"
 metadata:
-  name: "explainer-test-lime"
+  name: "explainer-test-all"
   annotations:
     sidecar.istio.io/inject: "true"
     sidecar.istio.io/rewriteAppHTTPProbers: "true"
@@ -39,41 +39,85 @@ spec:
         image: quay.io/trustyai/trustyai-kserve-explainer:latest
 ```
 
-### Example: Using the LIME Explainer
+### Example: Using both the LIME and SHAP Explainers
 
-You can interact with the LIME explainer using the following `curl` command:
+You can interact with the LIME and SHAP explainers using the following `curl` command:
 
 ```bash
 payload='{"data": {"ndarray": [[1.0, 2.0]]}}' # Adjust payload as per your input requirements
 curl -s -H "Host: ${HOST}" \
   -H "Content-Type: application/json" \
-  "http://${GATEWAY}/v1/models/explainer-test-lime:explain" -d $payload
+  "http://${GATEWAY}/v1/models/explainer-test-all:explain" -d "$payload"
 ```
 
-This command sends a JSON payload to the `:explain` endpoint and retrieves an explanation for the prediction. The response structure includes the saliencies of each feature contributing to the prediction, as shown below:
+This command sends a JSON payload to the `:explain` endpoint and retrieves an explanation for the prediction. The response includes the explainer type and the saliencies of each feature contributing to the prediction, as shown below:
 
 ```json
 {
-  "saliencies": {
-    "value": {
-      "output": {"value": {"underlyingObject": 1}, "type": "NUMBER", "score": 1.0, "name": "value"},
-      "perFeatureImportance": [
-        {
-          "feature": {"name": "f", "type": "NUMBER", "value": {"underlyingObject": 0.9}},
-          "score": 0.7474712680313286
+  "timestamp": "2024-05-06T21:42:45.307+00:00",
+  "LIME": {
+    "saliencies": {
+      "outputs-0": [
+        {
+          "name": "inputs-12",
+          "score": 0.8496797810357467,
+          "confidence": 0
+        },
+        {
+          "name": "inputs-5",
+          "score": 0.6830766647546147,
+          "confidence": 0
+        },
+        {
+          "name": "inputs-7",
+          "score": 0.6768475400887952,
+          "confidence": 0
+        },
+        // Additional features
+      ]
     }
-        // Additional features...
-      ]
   }
-  },
-  "availableCFs": [],
-  "sourceExplainer": "LIME"
+  "SHAP": {
+    "saliencies": {
+      // Additional features
+    }
+  }
 }
 ```
 
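As an editorial aside: the combined response above is straightforward to consume from client code. The following Python sketch is illustrative only (the `top_features` helper is hypothetical, the values are the README's example, and a comma is added after the `LIME` object to make the JSON strictly valid):

```python
import json

# Hypothetical ":explain" response, shaped like the README example above.
response_text = """
{
  "timestamp": "2024-05-06T21:42:45.307+00:00",
  "LIME": {
    "saliencies": {
      "outputs-0": [
        {"name": "inputs-12", "score": 0.8496797810357467, "confidence": 0},
        {"name": "inputs-5", "score": 0.6830766647546147, "confidence": 0},
        {"name": "inputs-7", "score": 0.6768475400887952, "confidence": 0}
      ]
    }
  },
  "SHAP": {"saliencies": {}}
}
"""

def top_features(response: dict, explainer: str, output: str, k: int = 2):
    """Return the k feature names with the largest absolute saliency score."""
    saliencies = response[explainer]["saliencies"][output]
    ranked = sorted(saliencies, key=lambda s: abs(s["score"]), reverse=True)
    return [s["name"] for s in ranked[:k]]

response = json.loads(response_text)
print(top_features(response, "LIME", "outputs-0"))  # ['inputs-12', 'inputs-5']
```

The same lookup works for the `"SHAP"` key when the SHAP explainer is enabled.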

+### LIME Explainer `InferenceService`
+
+To use the **LIME explainer only**, specify it as an environment variable and deploy with the following YAML configuration (the initial part is identical to the previous `InferenceService`):
+
+```yaml
+apiVersion: "serving.kserve.io/v1beta1"
+kind: "InferenceService"
+metadata:
+  name: "explainer-test-lime"
+  annotations:
+    sidecar.istio.io/inject: "true"
+    sidecar.istio.io/rewriteAppHTTPProbers: "true"
+    serving.knative.openshift.io/enablePassthrough: "true"
+spec:
+  predictor:
+    model:
+      modelFormat:
+        name: sklearn
+      protocolVersion: v2
+      runtime: kserve-sklearnserver
+      storageUri: https://github.com/trustyai-explainability/model-collection/raw/main/credit-score/model.joblib
+  explainer:
+    containers:
+      - name: explainer
+        image: quay.io/trustyai/trustyai-kserve-explainer:latest
+        env:
+          - name: EXPLAINER_TYPE # <- specify LIME here
+            value: "LIME"
+```
+
 ### SHAP Explainer `InferenceService`
 
-To use the SHAP explainer, you can deploy the explainer by specifying it as an environment variable and using the following YAML configuration (initial part will be identical to the previous `InferenceService`):
+To use the **SHAP explainer only**:
 
 
 ```yaml
@@ -102,15 +146,15 @@ spec:
             value: "SHAP"
 ```
 
-The explanation request will be identical to the LIME explainer case.
+The explanation request for the LIME-only or SHAP-only explainer is identical to the combined LIME and SHAP case.
 
 ## Configuration
 
 The following environment variables can be used in the `InferenceService` to customize the explainer:
 
 | Name             | Description                                            | Default |
 |------------------|--------------------------------------------------------|---------|
-| `EXPLAINER_TYPE` | `LIME` or `SHAP`, the explainer to use.                | `LIME`  |
+| `EXPLAINER_TYPE` | `ALL`, `LIME`, or `SHAP`, the explainer to use.        | `ALL`   |
 | `LIME_SAMPLES`   | The number of samples to use in LIME                   | `200`   |
 | `LIME_RETRIES`   | Number of LIME retries                                 | `2`     |
 | `LIME_WLR`       | Use LIME Weighted Linear Regression, `true` or `false` | `true`  |
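The table's defaults can be mirrored in client-side tooling that prepares these environment variables. The real explainer is a Java (Quarkus) service, so the following Python reader is only an illustrative sketch: the variable names and defaults come from the table above, while `load_config` itself is hypothetical:

```python
import os

# Defaults mirroring the README's configuration table (illustrative only).
DEFAULTS = {
    "EXPLAINER_TYPE": "ALL",
    "LIME_SAMPLES": "200",
    "LIME_RETRIES": "2",
    "LIME_WLR": "true",
}
VALID_EXPLAINERS = {"ALL", "LIME", "SHAP"}

def load_config(env=os.environ):
    """Read explainer settings from the environment, falling back to defaults."""
    cfg = {key: env.get(key, default) for key, default in DEFAULTS.items()}
    if cfg["EXPLAINER_TYPE"] not in VALID_EXPLAINERS:
        raise ValueError(f"EXPLAINER_TYPE must be one of {sorted(VALID_EXPLAINERS)}")
    return {
        "explainer_type": cfg["EXPLAINER_TYPE"],
        "lime_samples": int(cfg["LIME_SAMPLES"]),
        "lime_retries": int(cfg["LIME_RETRIES"]),
        "lime_wlr": cfg["LIME_WLR"].lower() == "true",
    }

print(load_config({}))  # defaults: ALL explainer, 200 samples, 2 retries, WLR on
```

Note that `ALL` as the fallback matches the `defaultValue = "ALL"` added to `ConfigService` in this commit.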

src/main/java/org/kie/trustyai/ConfigService.java

Lines changed: 1 addition & 1 deletion

@@ -8,7 +8,7 @@
 @ApplicationScoped
 public class ConfigService {
 
-    @ConfigProperty(name = "explainer.type")
+    @ConfigProperty(name = "explainer.type", defaultValue = "ALL")
     ExplainerType explainerType;
     @ConfigProperty(name = "lime.samples", defaultValue = "200")
     int limeSamples;

src/main/java/org/kie/trustyai/payloads/SaliencyExplanationResponse.java

Lines changed: 5 additions & 5 deletions

@@ -37,12 +37,13 @@ public void setSaliencies(Map<ExplainerType, Map<String, List<FeatureSaliency>>>
     @Override
     public String toString() {
         return "SaliencyExplanationResponse{" +
-                "timestamp=" + timestamp +
-                ", type='" + type + '\'' +
-                ", saliencies=" + saliencies +
-                '}';
+                "timestamp=" + timestamp +
+                ", LIME=" + saliencies.get(ExplainerType.LIME) +
+                ", SHAP=" + saliencies.get(ExplainerType.SHAP) +
+                '}';
     }
 
+
     public static class FeatureSaliency {
 
         private String name;

@@ -146,5 +147,4 @@ public static SaliencyExplanationResponse fromSaliencyResults(@Nonnull SaliencyR
 
         return new SaliencyExplanationResponse(combinedMap);
     }
-
 }
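The reworked `toString` drops the removed `type` field and reports the LIME and SHAP saliency maps separately. A minimal Python mirror of that formatting, purely illustrative (the `to_string` helper is hypothetical, not from the repo):

```python
# Illustrative mirror of the updated SaliencyExplanationResponse.toString:
# saliencies are keyed by explainer type, and absent explainers render as None.
def to_string(timestamp, saliencies):
    return ("SaliencyExplanationResponse{"
            + "timestamp=" + str(timestamp)
            + ", LIME=" + str(saliencies.get("LIME"))
            + ", SHAP=" + str(saliencies.get("SHAP"))
            + "}")

print(to_string("2024-05-06T21:42:45Z", {"LIME": {"outputs-0": []}}))
```

When only one explainer is enabled, the other key simply prints as `None` (in Java, `null`).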
