* ShapleyCAM
Weighting the activation maps using Gradient and Hessian-Vector Product.
* name
* ReST example
* comments
* Update README.md
* Update README.md
* Update README.md
* update a simpler version
* comments
* forward function in shapley_cam.py still needed
This is because computing the Hessian-vector product (HVP) requires the computation graph to be retained; see the comments around lines 37-38 of shapley_cam.py (a minimal sketch of this pattern follows the commit list).
* delete forward function in shapley_cam.py
* comments
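The note about the forward function comes down to a double-backward requirement: the first gradient has to be computed with the graph kept alive so that a second `torch.autograd.grad` call can form a Hessian-vector product. Below is a minimal, self-contained sketch of that pattern with a toy quadratic score standing in for the class logit; it is not the repository's forward code. In the library itself the first pass uses the `loss.backward()` call with `create_graph=True` referenced in the comment quoted further down, which is what emits the "reference cycle" warning.

```python
import torch

# Toy activation tensor and a toy "score" (stand-in for the class logit):
# gradient = 2*A, Hessian = 2*I, so the HVP with v = A should equal 2*A.
activations = torch.randn(1, 8, 4, 4, requires_grad=True)
score = (activations ** 2).sum()

# First derivative, kept differentiable: create_graph=True retains the graph
# so the gradients themselves can be differentiated again.
grads = torch.autograd.grad(score, activations, create_graph=True)[0]

# Hessian-vector product H @ v with v = activations, mirroring ShapleyCAM's call.
hvp = torch.autograd.grad(
    outputs=grads,
    inputs=activations,
    grad_outputs=activations,   # the vector v
    retain_graph=False,
    allow_unused=True,
)[0]

assert torch.allclose(hvp, 2 * activations.detach())  # H @ A = 2*A for this toy score
```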
README.md (+6 −1)
```diff
@@ -47,6 +47,7 @@ The aim is also to serve as a benchmark of algorithms and metrics for research o
 | Deep Feature Factorizations | Non Negative Matrix Factorization on the 2D activations |
 | KPCA-CAM | Like EigenCAM but with Kernel PCA instead of PCA |
 | FEM | A gradient free method that binarizes activations by an activation > mean + k * std rule. |
+| ShapleyCAM | Weight the activations using the gradient and Hessian-vector product. |

 ## Visual Examples

 | What makes the network think the image label is 'pug, pug-dog' | What makes the network think the image label is 'tabby, tabby cat' | Combining Grad-CAM with Guided Backpropagation for the 'pug, pug-dog' class |
```
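To put the new README row in context, here is a hedged usage sketch in the same style as the library's other CAM methods. It assumes ShapleyCAM is exported from the `pytorch_grad_cam` package like `GradCAM`; the model, target layer, input tensor, and class index 281 are placeholders rather than anything prescribed by this commit.

```python
import torch
from torchvision.models import resnet50
from pytorch_grad_cam import ShapleyCAM                      # assumed export, like GradCAM
from pytorch_grad_cam.utils.model_targets import ClassifierOutputTarget

model = resnet50(weights="IMAGENET1K_V1").eval()              # placeholder model
target_layers = [model.layer4[-1]]                            # last conv block, as in the README examples

input_tensor = torch.randn(1, 3, 224, 224)                    # use a preprocessed image in practice
targets = [ClassifierOutputTarget(281)]                       # 281 = 'tabby, tabby cat' in ImageNet

with ShapleyCAM(model=model, target_layers=target_layers) as cam:
    grayscale_cam = cam(input_tensor=input_tensor, targets=targets)

print(grayscale_cam.shape)                                    # (1, 224, 224) heatmap per image
```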
shapley_cam.py:

```python
import torch

from pytorch_grad_cam.base_cam import BaseCAM

# When using the following loss.backward() method, a warning is raised:
# "UserWarning: Using backward() with create_graph=True will create a reference cycle"

"""
Weights the activation maps using the gradient and Hessian-vector product.
This method (https://arxiv.org/abs/2501.06261) reinterprets CAM methods (including
GradCAM, HiResCAM and the original CAM) from a Shapley value perspective.
"""


class ShapleyCAM(BaseCAM):
    def __init__(self, model, target_layers,
                 reshape_transform=None):
        super(
            ShapleyCAM,
            self).__init__(
            model=model,
            target_layers=target_layers,
            reshape_transform=reshape_transform,
            compute_input_gradient=True,
            uses_gradients=True,
            detach=False)

    def get_cam_weights(self,
                        input_tensor,
                        target_layer,
                        target_category,
                        activations,
                        grads):
        # Hessian-vector product H @ v with v = activations: differentiate the
        # (non-detached) gradients a second time, seeding the backward pass with
        # the activations themselves. allow_unused=True returns None if the
        # gradients do not depend on the activations.
        hvp = torch.autograd.grad(
            outputs=grads,
            inputs=activations,
            grad_outputs=activations,
            retain_graph=False,
            allow_unused=True
        )[0]
        # print(torch.max(hvp[0]).item())  # check if hvp is not all zeros
```
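One PyTorch detail worth calling out in the excerpt above: with `allow_unused=True`, `torch.autograd.grad` returns `None` for any input that the outputs do not depend on, so code that takes `[0]` from the result has to be prepared for a `None` Hessian-vector product (for example, when the captured gradients are detached from the activations). A small, self-contained illustration of that behaviour, unrelated to the repository's code:

```python
import torch

x = torch.randn(3, requires_grad=True)
y = torch.randn(3, requires_grad=True)

out = (2 * x).sum()          # depends on x only, not on y

gx, gy = torch.autograd.grad(out, inputs=[x, y], allow_unused=True)
print(gx)                    # tensor([2., 2., 2.])
print(gy)                    # None: 'out' is unused with respect to y
```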