Captum v0.6.0 Release
The Captum v0.6.0 release introduces a new feature, `StochasticGates`. This release also enhances Influential Examples and includes a series of other improvements and bug fixes.
Stochastic Gates
Stochastic Gates is a technique to enforce sparsity by approximating L0 regularization. It can be used for network pruning and feature selection. Because directly optimizing the L0 norm is a non-differentiable combinatorial problem, Stochastic Gates approximates it by using certain continuous probability distributions (e.g., Concrete, Gaussian) as smoothed Bernoulli distributions, so the optimization can be reparameterized into the distributions' parameters. See the original papers for more details.
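To make the reparameterization idea concrete, here is a minimal pure-Python sketch of the binary concrete ("hard concrete") relaxation of a Bernoulli gate. This is a toy illustration of the technique, not Captum's actual implementation; the parameter names and default values are assumptions for the sketch.

```python
import math
import random

def sample_binary_concrete_gate(log_alpha, temperature=0.1,
                                lower=-0.1, upper=1.1):
    """Sample one stochastic gate via the binary concrete reparameterization.

    log_alpha is the learnable location parameter. The randomness comes from
    a parameter-free uniform draw, so the sample stays differentiable w.r.t.
    log_alpha, unlike a raw Bernoulli draw.
    """
    u = random.uniform(1e-6, 1.0 - 1e-6)        # noise, independent of parameters
    logistic = math.log(u) - math.log(1.0 - u)  # Logistic(0, 1) sample
    s = 1.0 / (1.0 + math.exp(-(logistic + log_alpha) / temperature))
    s = s * (upper - lower) + lower             # stretch slightly beyond [0, 1]
    return min(1.0, max(0.0, s))                # clamp, so exact 0s and 1s occur

def expected_l0_penalty(log_alpha, temperature=0.1, lower=-0.1, upper=1.1):
    """Differentiable surrogate for the L0 norm: P(gate != 0)."""
    shift = temperature * math.log(-lower / upper)
    return 1.0 / (1.0 + math.exp(-(log_alpha - shift)))

gate = sample_binary_concrete_gate(log_alpha=2.0)
print(0.0 <= gate <= 1.0)  # gates always land in [0, 1]
```

Summing `expected_l0_penalty` over all gates gives the sparsity regularizer that is added to the task loss; a larger `log_alpha` pushes a gate toward staying open.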
Captum provides two Stochastic Gates implementations using different distributions as the smoothed Bernoulli: `BinaryConcreteStochasticGates` and `GaussianStochasticGates`. They are available under `captum.module`, a new subpackage collecting neural network building blocks that are useful for model understanding. A usage example:
```python
import torch
from captum.module import GaussianStochasticGates

n_gates = 5  # number of gates
stg = GaussianStochasticGates(n_gates, reg_weight=0.01)

inputs = torch.randn(3, n_gates)  # mock inputs with a batch size of 3
gated_inputs, reg = stg(inputs)  # gate the inputs
loss = model(gated_inputs)  # use gated inputs in the downstream network

# optimize the sparsity regularization together with the model loss
loss += reg
...

# inspect the learned gate values to see how the model uses the inputs
print(stg.get_gate_values())
```
Influential Examples
Influential Examples is a new functional pillar introduced in the last release. This release continues to focus on it and brings many improvements to the existing `TracInCP` family. Some of the changes are incompatible with the previous version. Details below:
- Supported loss functions with `reduction` of `mean` in `TracInCPFast` and `TracInCPFastRandProj` (#913)
- `TracInCP` classes add a new argument `show_progress` to optionally display progress bars for the computation (#898, #1046)
- `TracInCP` provides a new public method `self_influence`, which computes the self-influence scores of the examples in the given data. `influence` can no longer compute self-influence scores, and its `inputs` argument cannot be `None` (#994, #1069, #1087, #1072)
- The previous constructor argument `influence_src_dataset` in `TracInCP` is renamed to `train_dataset` (#994)
- Added GPU support to `TracInCPFast` and `TracInCPFastRandProj` (#969)
- `TracInCP` and `TracInCPFastRandProj` provide a new public method `compute_intermediate_quantities`, which computes "embedding" vectors for examples in the given data (#1068)
- `TracInCP` classes support a new optional argument `test_loss_fn` for use cases where different losses are used for training and testing examples (#1073)
- Revised the interface of the method `influence`: removed the arguments `unpack_inputs` and `target`. Now the `inputs` argument must be a `tuple` where the last element is the label (#1072)
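For readers new to the `TracInCP` family, the underlying idea can be sketched in a few lines of plain Python. This is a toy, framework-free illustration (not Captum's API): TracIn approximates the influence of a training example on a test example by summing, over saved checkpoints, the learning rate times the dot product of the two examples' loss gradients.

```python
def tracin_influence(train_grads, test_grads, lrs):
    """Toy TracInCP: sum over checkpoints i of
    lr_i * <grad_loss(train), grad_loss(test)> at checkpoint i.

    train_grads / test_grads: per-checkpoint gradient vectors (lists of floats).
    lrs: learning rate in effect at each checkpoint.
    """
    total = 0.0
    for g_train, g_test, lr in zip(train_grads, test_grads, lrs):
        total += lr * sum(a * b for a, b in zip(g_train, g_test))
    return total

# Two checkpoints; aligned gradients contribute positive (helpful) influence.
train_grads = [[1.0, 0.0], [0.5, 0.5]]
test_grads = [[1.0, 0.0], [0.5, -0.5]]
print(tracin_influence(train_grads, test_grads, lrs=[0.1, 0.1]))  # 0.1
```

Self-influence, as exposed by the new `self_influence` method, corresponds to scoring a training example against itself (`tracin_influence(train_grads, train_grads, lrs)` in this sketch); unusually high self-influence often flags mislabeled or atypical examples.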
Notable Changes
- LRP now throws an error when it detects that the model reuses any modules (#911)
- Fixed a bug where the concept order changes in `TCAV`'s output (#915, #909)
- Fixed a data type issue when using Captum's built-in SGD linear models in `Lime` (#938, #910)
- All submodules are now accessible under the top-level `captum` module, so users can `import captum` and access everything underneath it, e.g., `captum.attr` (#912, #992, #680)
- Added a new attribution visualization utility for time series data (#980)
- Improved version detection to fix some compatibility issues caused by dependencies' versions (#940, #999)
- Fixed an index bug in the tutorial "Interpret regression models using Boston House Prices Dataset" (#1014, #1012)
- Refactored `FeatureAblation` and `FeaturePermutation` to verify the output type of `forward_func` and its shape when `perturbation_per_eval > 1` (#1047, #1049, #1091)
- Reworked the Housing Regression tutorial to use the California housing dataset (#1041)
- Improved the error message for invalid input types when the required data type is `tensor` or `tuple[tensor]` (#1083)
- Switched from module `backward_hook` to tensor `forward_hook` for many attribution algorithms that need tensor gradients, like `DeepLift` and `LayerLRP`, so those algorithms can now support models with in-place modules (#979, #914)
- Added an optional `mask` argument to the `FGSM` and `PGD` adversarial attacks under `captum.robust` to specify which elements are perturbed (#1043)
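The effect of the new `mask` argument can be illustrated with a toy, framework-free version of a single masked FGSM step (the real attacks in `captum.robust` operate on tensors and model gradients; the function below is only a sketch of the masking behavior):

```python
def sign(x):
    """Return -1, 0, or 1 depending on the sign of x."""
    return (x > 0) - (x < 0)

def masked_fgsm_step(x, grad, epsilon, mask):
    """One FGSM step that perturbs only the elements where mask == 1:
    x_adv = x + epsilon * sign(grad) * mask
    """
    return [xi + epsilon * sign(gi) * mi for xi, gi, mi in zip(x, grad, mask)]

x = [0.0, 0.0, 0.0]
grad = [1.0, -2.0, -3.0]
adv = masked_fgsm_step(x, grad, epsilon=0.5, mask=[1, 0, 1])
print(adv)  # [0.5, 0.0, -0.5]; the masked-out middle element is untouched
```

With `mask=None` (or all ones), this reduces to the standard unmasked attack; a zero in the mask pins the corresponding element to its original value.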