Jieming changes #4

Open
zkhotanlou wants to merge 14 commits into main from jieming-changes

Conversation

@zkhotanlou (Owner)

No description provided.

@zkhotanlou (Owner, Author)

@codex review

@chatgpt-codex-connector (bot) left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.

ℹ️ About Codex in GitHub

Your team has set up Codex to review pull requests in this repo. Reviews are triggered when you:

  • Open a pull request for review
  • Mark a draft as ready
  • Comment "@codex review"

If Codex has suggestions, it will comment; otherwise it will react with 👍.

Codex can also answer questions or update the PR. Try commenting "@codex address that feedback".

Comment on lines +126 to +151
        # Load trained Transformer
        self._transformer = self._load_transformer(self._params["model_path"])

        # Store device
        self._device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
        self._transformer = self._transformer.to(self._device)
        self._transformer.eval()

    def _load_transformer(self, model_path: str) -> GenReTransformer:
        """
        Load trained GenRe Transformer.

        Args:
            model_path: Path to model checkpoint (relative to genre/ directory)

        Returns:
            Loaded GenRe Transformer model
        """
        # Construct absolute path
        # model_path is relative to methods/catalog/genre/
        base_path = Path(__file__).parent

        # If the path doesn't start with "saved_models/", prepend it
        if not model_path.startswith("saved_models/"):
            model_path = f"saved_models/{model_path}"

P1: Fail fast when GenRe checkpoint does not match dataset

The new GenRe recourse method always loads a single checkpoint from saved_models/genre_transformer.pth (lines 126-151) and never uses the data_name that run_experiment injects into the hyperparameters. As a result, any CLI invocation that runs --recourse_method genre on a dataset whose feature dimensionality differs from the COMPAS checkpoint will crash when self._transformer.generate receives inputs of the wrong size (the linear embeddings were constructed with checkpoint['n_features']). The method should either select a dataset-specific model based on data_name or raise a clear error before loading; otherwise the experiments fail for every dataset except the one the checkpoint was trained on.
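
For illustration, a minimal sketch of the fail-fast option as a standalone helper. The data_name and n_features arguments stand in for values run_experiment injects into the hyperparameters (their names here are assumptions); only checkpoint['n_features'] and the saved_models/ layout are taken from the code under review.

from pathlib import Path

import torch


def load_genre_checkpoint(model_path: str, data_name: str, n_features: int) -> dict:
    """Resolve a GenRe checkpoint and fail fast on a dataset mismatch.

    Hypothetical helper; argument names are illustrative stand-ins for
    values assumed to be available in the hyperparameters.
    """
    base_path = Path(__file__).parent
    if not model_path.startswith("saved_models/"):
        model_path = f"saved_models/{model_path}"
    full_path = base_path / model_path

    if not full_path.exists():
        raise FileNotFoundError(
            f"No GenRe checkpoint for dataset '{data_name}' at {full_path}; "
            "train one or pass an explicit 'model_path' hyperparameter."
        )

    checkpoint = torch.load(full_path, map_location="cpu")

    # The transformer's linear embeddings are sized from checkpoint['n_features'],
    # so a mismatch would otherwise only surface later inside generate().
    if checkpoint["n_features"] != n_features:
        raise ValueError(
            f"Checkpoint {model_path} expects {checkpoint['n_features']} input "
            f"features, but dataset '{data_name}' has {n_features}."
        )
    return checkpoint

The same check could instead live at the top of _load_transformer; the point is to validate the checkpoint against the requested dataset before the model is constructed.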

Useful? React with 👍 / 👎.
