
Conversation

@rvandewater
Owner

No description provided.

Owner Author

Change back the loss for the DL prediction wrapper.

Collaborator

Reverted the loss to MSE loss again.
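A minimal sketch of what the revert presumably amounts to; the class name DLPredictionWrapper and the loss attribute are assumptions based on the discussion:

    import torch.nn as nn

    class DLPredictionWrapper(nn.Module):
        """Hypothetical DL prediction wrapper with the loss reverted to MSE."""

        def __init__(self):
            super().__init__()
            # Reverted back to MSE loss, per the comment above
            self.loss = nn.MSELoss()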



'''
@gin.configurable
Owner Author

Why is this commented out?

Collaborator

This is the model I built from scratch in the beginning. I kept it because I thought it could be handy at some point, but in the end I only used the model from PyTorch Forecasting. Tell me if you think it is better to remove it.

}
)

random_model_dir = args.random_model
Owner Author

What is this?

Collaborator

This is the path to the model trained on random labels, which is needed when calculating the data randomization distance.
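For context, a rough sketch of how such a distance could be computed, given attribution arrays from the normally trained model and from the random-label model (the function name is hypothetical; the idea follows the randomization tests of Adebayo et al., 2018):

    import numpy as np

    def data_randomization_distance(attr_trained, attr_random):
        """Dissimilarity between attributions of the trained model and of a
        model trained on shuffled labels; a high value suggests the
        explanation actually depends on what the model learned."""
        a = np.asarray(attr_trained).ravel()
        b = np.asarray(attr_random).ravel()
        # 1 - Pearson correlation as a simple distance
        return 1.0 - np.corrcoef(a, b)[0, 1]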

Owner Author

Perhaps we can integrate this into a .gin file binding? That way we don't clutter up the run for people who do not necessarily want to use this.
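A minimal sketch of what that binding could look like, assuming a hypothetical evaluate entry point (the function, parameter, and path are illustrative):

    import gin

    @gin.configurable
    def evaluate(random_model_dir=None):
        # Only compute the data randomization distance if a random-label
        # model was bound in the .gin file; skip it otherwise.
        if random_model_dir is not None:
            ...

    # In the experiment's .gin file:
    # evaluate.random_model_dir = 'path/to/random_label_model'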

debug: bool = False,
verbose: bool = False,
wandb: bool = False,
pytorch_forecasting: bool = False,
Owner Author

I am not sure about putting this here, as it makes the code quite inflexible. Would you be able to use isinstance, for example?

Collaborator

The thing is, train_common doesn't take the model class; at the beginning it only has the name and then creates the model, so isinstance wouldn't work. Here, the flag comes from the run command, so it isn't defined by each function: the person running the code sets it once at the beginning, and it is pushed through everything.

Owner Author

Let's see if we can change this to the "more integrated approach".
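As a sketch of that more integrated approach: once the model object has been created, the code could dispatch on its type instead of threading a pytorch_forecasting flag through every function. The wrapper classes below are hypothetical; pytorch_forecasting does expose a common BaseModel:

    from pytorch_forecasting.models.base_model import BaseModel as PFBaseModel

    class DefaultWrapper:
        def __init__(self, model):
            self.model = model

    class PytorchForecastingWrapper(DefaultWrapper):
        pass  # pytorch-forecasting-specific handling would live here

    def wrap_model(model):
        # Dispatch on the instantiated model's type rather than on a bool
        # that has to be passed down from the run command.
        if isinstance(model, PFBaseModel):
            return PytorchForecastingWrapper(model)
        return DefaultWrapper(model)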

- demo_data/mortality24/eicu_demo
- demo_data/mortality24/mimic_demo
- demo_data/aki/eicu_demo
#fails for some reason - demo_data/aki/eicu_demo
Owner Author

This would warrant further investigation.

Collaborator

I think this was there before I edited any code, but I can look into it for sure.

- ignite=0.4.11
- pytorch=2.0.1
- pytorch-cuda=11.8
# - pytorch-cuda=11.8
Owner Author

I think this is because the other packages already install a certain CUDA version?

Collaborator

Same as the comment before: I think this was there before I edited any code, but I can look into it for sure.

- LSTM
- TCN
- Transformer
- TFT
Owner Author

Perhaps we can create one (or a couple of) yml files specifically for the XAI experiments?

Collaborator

I think that would definitely make it cleaner once we have the setup for the models.

verbose: bool = False,
wandb: bool = False,
complete_train: bool = False
complete_train: bool = False,
Owner Author

Would it be possible to use inheritance here? I.e., we have a default CV class and then one specifically for your implementation.

Collaborator

I think that would work, yes.
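A minimal sketch of that inheritance structure, with illustrative class and method names:

    class DefaultCrossValidation:
        def __init__(self, verbose: bool = False, wandb: bool = False):
            self.verbose = verbose
            self.wandb = wandb

        def run(self):
            ...  # default CV loop

    class CompleteTrainCrossValidation(DefaultCrossValidation):
        """CV variant that can also train on the full data at the end."""

        def __init__(self, complete_train: bool = False, **kwargs):
            super().__init__(**kwargs)
            self.complete_train = complete_train

        def run(self):
            super().run()
            if self.complete_train:
                ...  # extra training pass on the complete dataset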

pass


def Faithfulness_Correlation(
Owner Author

It would be good to clearly indicate that this code comes from Quantus.

Collaborator

I already reference Quantus in the function; however, the code here is adapted from the Quantus implementation to get it to work with Captum and the model.
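For reference, the core idea behind Quantus's faithfulness correlation (Bhatt et al., 2020) can be sketched as follows; model is any callable returning a scalar prediction, and all names here are illustrative rather than the adapted code itself:

    import numpy as np

    def faithfulness_correlation(model, x, attributions, n_runs=100,
                                 subset_size=10, baseline=0.0, seed=0):
        rng = np.random.default_rng(seed)
        pred = model(x)
        drops, attr_sums = [], []
        for _ in range(n_runs):
            # Perturb a random feature subset and record the prediction drop...
            idx = rng.choice(x.size, size=subset_size, replace=False)
            x_pert = x.copy()
            x_pert.flat[idx] = baseline
            drops.append(pred - model(x_pert))
            # ...and the attribution mass assigned to that subset
            attr_sums.append(attributions.flat[idx].sum())
        # Pearson correlation between prediction drops and attribution mass
        return np.corrcoef(drops, attr_sums)[0, 1]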

parser.add_argument("-sn", "--source-name", type=Path, help="Name of the source dataset.")
parser.add_argument("--source-dir", type=Path, help="Directory containing gin and model weights.")
parser.add_argument("-sa", "--samples", type=int, default=None, help="Number of samples to use for evaluation.")
parser.add_argument("--explain", default=False, action=BOA, help="Provide explaintations for predictions.")
Owner Author

Some typos here.

Collaborator

Will fix these.
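For reference, the corrected help string would presumably read (keeping BOA as whatever boolean action alias the file already defines):

    parser.add_argument("--explain", default=False, action=BOA,
                        help="Provide explanations for predictions.")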

plt.xlabel("Feature")
plt.ylabel("{} Attribution".format(method_name))
plt.title("{} Attribution Values".format(method_name))
plt.xticks(
Owner Author

This is quite hardcoded. Can we take this from a gin file?

Collaborator

For sure, that can be one of the next steps for plotting.
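A sketch of pulling the labels out of the code via a gin binding; the function name and defaults are assumptions:

    import gin
    import matplotlib.pyplot as plt

    @gin.configurable
    def plot_attributions(values, method_name,
                          xlabel="Feature",
                          title_fmt="{} Attribution Values"):
        plt.bar(range(len(values)), values)
        plt.xlabel(xlabel)
        plt.ylabel("{} Attribution".format(method_name))
        plt.title(title_fmt.format(method_name))

    # In a .gin file:
    # plot_attributions.xlabel = 'Clinical variable'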
