Conversation
The results do not exactly match those reported in the paper, but they appear broadly consistent. I believe the method itself is implemented correctly; differences in how the datasets are preprocessed and the models are trained can lead to different results.
@codex review
💡 Codex Review
Here are some automated review suggestions for this pull request.
ℹ️ About Codex in GitHub
Your team has set up Codex to review pull requests in this repo. Reviews are triggered when you
- Open a pull request for review
- Mark a draft as ready
- Comment "@codex review".
If Codex has suggestions, it will comment; otherwise it will react with 👍.
Codex can also answer questions or update the PR. Try commenting
```python
optimizer.zero_grad()
# x_new_enc = reconstruct_encoding_constraints(
#     x_new, cat_feature_indices, binary_cat_features
# )
# use x_new_enc for prediction results to ensure constraints
# f_x_new = softmax(torch_model(x_new))[:, 1]
f_x_new_binary = torch_model(x_new).squeeze(axis=0)
```
Replace unsupported axis argument to squeeze
Inside the optimization loop the code calls torch_model(x_new).squeeze(axis=0) before computing the loss. PyTorch tensors only accept the keyword dim (or a positional argument) for squeeze; passing axis raises a TypeError, so Probe.get_counterfactuals will fail as soon as it is executed. Use squeeze(0) or squeeze(dim=0) instead so the loop can run.
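A minimal sketch of the suggested fix (the tensor shape here is illustrative, not taken from the PR): `torch.Tensor.squeeze` takes its dimension positionally or via the `dim` keyword, so either form below avoids the `TypeError` raised by `axis=0`.

```python
import torch

x = torch.randn(1, 3)  # leading dimension of size 1, as after a batched forward pass

# Both forms remove the size-1 leading dimension
y = x.squeeze(0)        # positional argument
z = x.squeeze(dim=0)    # keyword argument

assert y.shape == (3,)
assert z.shape == (3,)
```

The same applies to the functional form, `torch.squeeze(x, 0)`.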
@codex just to clarify, even with Python 3.7 and PyTorch 1.7.0, this syntax would still be invalid?
To use Codex here, create an environment for this repo.
@codex even with Python 3.7 and PyTorch 1.7.0, this syntax would still be invalid?
Summary
- Replaced the unsupported `axis` keyword in the Wachter RIP probe's call to `torch.Tensor.squeeze` with the positional argument to maintain PyTorch compatibility. (`methods/catalog/probe/library/wachter_rip.py`, L170-L199)
Testing
⚠️ No automated tests were run (not requested).