Xai #123
base: development
Conversation
To see reason for failure
Removed changes
change back loss for dl prediction wrapper
reverted loss to mse loss again
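For reference, a minimal sketch of the revert, assuming the wrapper exposes its criterion as a `loss` attribute; `DLPredictionWrapper` and the attribute name are assumptions, not the repository's actual API:

```python
import torch.nn as nn

class DLPredictionWrapper(nn.Module):  # hypothetical wrapper name
    def __init__(self):
        super().__init__()
        # reverted from the experimental loss back to plain MSE
        self.loss = nn.MSELoss()
```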
```python
'''
@gin.configurable
```
why is it commented out?
This is the model built from scratch in the beginning. I kept it because I thought it could be handy at some point; however, I only used the model from pytorch-forecasting in the end. Tell me if you think it is better to remove it.
```python
}
)

random_model_dir = args.random_model
```
what is this?
This is the path of the model trained on random labels, which is needed when calculating the data randomization distance.
Perhaps we can integrate this into a .gin file binding? Then we don't clutter up the run command for people who do not necessarily want to use this.
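A sketch of that idea, with hypothetical names; the point is that the path would only appear in the .gin files of runs that actually compute the data randomization distance:

```python
import gin
import torch

# Hypothetical: bind the random-label model path in a .gin file, e.g.
#   XAI.gin:  load_random_model.random_model_dir = "runs/random_labels/model.pt"
@gin.configurable
def load_random_model(random_model_dir: str = None):
    if random_model_dir is None:
        return None  # metric is simply skipped when no path is bound
    return torch.load(random_model_dir)
```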
```python
debug: bool = False,
verbose: bool = False,
wandb: bool = False,
pytorch_forecasting: bool = False,
```
I am not sure about putting this here as it makes the code quite inflexible. Are you able to use isinstance for example?
The thing is, train_common doesn't take the model class at the beginning; it only has the name and then creates a model, so isinstance wouldn't work. Here it gets the flag from the run command, so it doesn't come from each function: it is defined once by the person running the code and then passed through everything.
Let's see if we can change this for the "more integrated approach"
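For the record, a sketch of the isinstance alternative, assuming the dispatch happens after the model object has been created; the helper name is hypothetical:

```python
from pytorch_forecasting.models.base_model import BaseModel as PFBaseModel

def uses_pytorch_forecasting(model) -> bool:
    """Hypothetical helper: detect pytorch-forecasting models by type
    instead of threading a pytorch_forecasting flag through every call."""
    return isinstance(model, PFBaseModel)
```

As discussed above, this only works once train_common holds an actual model instance rather than just its name, which is what the more integrated approach would address.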
```diff
  - demo_data/mortality24/eicu_demo
  - demo_data/mortality24/mimic_demo
- - demo_data/aki/eicu_demo
+ #fails for some reason - demo_data/aki/eicu_demo
```
Would warrant further investigation
I think this was before I edited any code, but I can look into it for sure.
```diff
  - ignite=0.4.11
  - pytorch=2.0.1
- - pytorch-cuda=11.8
+ # - pytorch-cuda=11.8
```
I think this is because the other packages already install a certain CUDA version?
I think this was before I edited any code, but I can look into it for sure, same as the comment before.
```yaml
- LSTM
- TCN
- Transformer
- TFT
```
Perhaps we can create one (or a couple of) yml files specifically for the XAI experiments?
I think that would make it cleaner for sure, once we have the setup for the models.
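Something like the following (file name and keys hypothetical) could then hold the XAI setup in one place:

```yaml
# xai_experiments.yml -- hypothetical layout
models:
  - LSTM
  - TCN
  - Transformer
  - TFT
datasets:
  - demo_data/mortality24/eicu_demo
  - demo_data/mortality24/mimic_demo
explanation_methods:
  - IntegratedGradients
  - Saliency
```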
```diff
  verbose: bool = False,
  wandb: bool = False,
- complete_train: bool = False
+ complete_train: bool = False,
```
would it be possible to use inheritance here? I.e., we have a default CV and then one specifically for your implementation.
I think that would work, yes.
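A rough sketch of the inheritance idea with hypothetical class names: a default cross-validation runner, and a subclass that adds the complete-train behaviour on top:

```python
class DefaultCV:
    """Hypothetical default cross-validation runner."""

    def __init__(self, debug: bool = False, verbose: bool = False, wandb: bool = False):
        self.debug = debug
        self.verbose = verbose
        self.wandb = wandb

    def run_fold(self, fold: int):
        ...  # standard per-fold train/validate logic


class CompleteTrainCV(DefaultCV):
    """Variant that additionally trains a final model on all training data."""

    def run_fold(self, fold: int):
        super().run_fold(fold)
        self._fit_on_complete_data(fold)

    def _fit_on_complete_data(self, fold: int):
        ...  # the complete_train-specific behaviour
```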
```python
pass


def Faithfulness_Correlation(
```
I would clearly indicate that the code comes from Quantus.
I already reference Quantus in the function; however, the code here is adapted from the Quantus implementation to get it to work with captum and the model.
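For context, the metric (Bhatt et al., 2020, as implemented in Quantus) repeatedly perturbs random feature subsets and correlates the resulting drop in model output with the summed attributions of the perturbed features. A compressed sketch under those assumptions, not the adapted code from this PR:

```python
import numpy as np
import torch

def faithfulness_correlation(model, x, attributions, subset_size=20, n_runs=100, baseline=0.0):
    """Sketch of faithfulness correlation for a single input `x` with
    precomputed (e.g. captum) `attributions`; assumes a scalar model output."""
    x_flat, attr_flat = x.flatten(), attributions.flatten()
    deltas, attr_sums = [], []
    with torch.no_grad():
        base_pred = model(x.unsqueeze(0)).item()
        for _ in range(n_runs):
            idx = torch.randperm(x_flat.numel())[:subset_size]
            x_pert = x_flat.clone()
            x_pert[idx] = baseline  # replace a random subset with the baseline value
            pred = model(x_pert.view_as(x).unsqueeze(0)).item()
            deltas.append(base_pred - pred)
            attr_sums.append(attr_flat[idx].sum().item())
    # high correlation = attributions track the features the model relies on
    return float(np.corrcoef(deltas, attr_sums)[0, 1])
```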
```python
parser.add_argument("-sn", "--source-name", type=Path, help="Name of the source dataset.")
parser.add_argument("--source-dir", type=Path, help="Directory containing gin and model weights.")
parser.add_argument("-sa", "--samples", type=int, default=None, help="Number of samples to use for evaluation.")
parser.add_argument("--explain", default=False, action=BOA, help="Provide explaintations for predictions.")
```
some typos
will fix these
```python
plt.xlabel("Feature")
plt.ylabel("{} Attribution".format(method_name))
plt.title("{} Attribution Values".format(method_name))
plt.xticks(
```
This is quite hardcoded. Can we take this from a gin file?
For sure, that can be one of the next steps for plotting.
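A possible direction, with hypothetical names: pull the labels and ticks into a gin-configurable plotting function so they live in the experiment's .gin file instead of the code:

```python
import gin
import matplotlib.pyplot as plt

@gin.configurable
def plot_attributions(values, method_name, feature_names=None, xlabel="Feature"):
    """Hypothetical gin-configurable version of the attribution plot;
    bind e.g.  plot_attributions.xlabel = "Clinical variable"  in a .gin file."""
    plt.bar(range(len(values)), values)
    plt.xlabel(xlabel)
    plt.ylabel("{} Attribution".format(method_name))
    plt.title("{} Attribution Values".format(method_name))
    if feature_names is not None:
        plt.xticks(range(len(feature_names)), feature_names, rotation=90)
    plt.tight_layout()
```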