
probabilistic reparameterization tutorial #1534

Open
wants to merge 4 commits into base: main

Conversation

sdaulton (Contributor)

Summary: see title

Differential Revision: D41629553

@facebook-github-bot facebook-github-bot added CLA Signed Do not delete this pull request or issue due to inactivity. fb-exported labels Nov 30, 2022
@facebook-github-bot (Contributor)

This pull request was exported from Phabricator. Differential Revision: D41629553

facebook-github-bot pushed a commit that referenced this pull request Dec 27, 2022
Summary:

## Motivation

As I am currently refactoring our internal codebase, I had a look at sdaulton's PR on probabilistic reparameterization.
From my understanding, one has to use it by representing the categoricals with a one-hot encoding for the reparameterized ACQF and then eventually transforming the input to a numerical representation via `OneHotToNumeric`, especially when one wants to use it together with `MixedSingleTaskGP`. Currently, `MixedSingleTaskGP` is very strict about which input transforms are allowed. This PR lifts those restrictions to make it usable with `OneHotToNumeric`.

Note that the transform also has to be instantiated with `transform_on_train=False`, and `train_X` has to be transformed before it is passed to the constructor of `MixedSingleTaskGP`; otherwise the indices for the different kernels get mixed up.
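For context, the behavior this relies on can be sketched in plain Python (no BoTorch imports; the function name and the `categorical_features` dict, which maps the starting column of each one-hot block to its cardinality, are illustrative stand-ins for what `OneHotToNumeric` does, not the actual implementation):

```python
def one_hot_to_numeric(row, categorical_features):
    """Collapse one-hot encoded blocks in `row` into single numeric labels.

    `row` is a flat list of feature values; `categorical_features` maps the
    starting index of each one-hot block to its cardinality. Continuous
    features outside the blocks pass through unchanged.
    """
    out = []
    i = 0
    while i < len(row):
        if i in categorical_features:
            card = categorical_features[i]
            block = row[i:i + card]
            # The numeric label is the argmax of the one-hot block.
            out.append(float(block.index(max(block))))
            i += card
        else:
            out.append(row[i])
            i += 1
    return out

# One continuous feature followed by a 3-way one-hot block starting at index 1:
x = [0.25, 0.0, 1.0, 0.0]
print(one_hot_to_numeric(x, {1: 3}))  # -> [0.25, 1.0]
```

This also illustrates why `train_X` should be transformed before constructing the model: the categorical-kernel indices refer to the compact numeric layout, not the wider one-hot one.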

### Have you read the [Contributing Guidelines on pull requests](https://github.com/pytorch/botorch/blob/main/CONTRIBUTING.md#pull-requests)?

Yes.

Pull Request resolved: #1568

Test Plan:
Unit tests.

## Related PRs

#1534

Reviewed By: esantorella

Differential Revision: D42230252

Pulled By: Balandat

fbshipit-source-id: b6a0a12d926fbab9890a75438eb60ef849441149
@facebook-github-bot (Contributor)

This pull request was exported from Phabricator. Differential Revision: D41629553

sdaulton added a commit to sdaulton/botorch that referenced this pull request Feb 4, 2023
Summary:
Pull Request resolved: pytorch#1534

see title

Differential Revision: D41629553

fbshipit-source-id: 2ecd5870ecbb769c7157a29f68cfbafb6a76dd04
@facebook-github-bot (Contributor)

This pull request was exported from Phabricator. Differential Revision: D41629553

sdaulton added a commit to sdaulton/botorch that referenced this pull request Feb 8, 2023
Summary:
Pull Request resolved: pytorch#1534

see title

Differential Revision: D41629553

fbshipit-source-id: 19856a6df6f88eb0493b4eedc6d53ec5ea141b72
@codecov
codecov bot commented Feb 8, 2023

Codecov Report

Merging #1534 (2a4554f) into main (63dd0cd) will decrease coverage by 2.71%.
The diff coverage is 12.71%.

❗ Current head 2a4554f differs from pull request most recent head 9412010. Consider uploading reports for the commit 9412010 to get more accurate results

@@             Coverage Diff             @@
##              main    #1534      +/-   ##
===========================================
- Coverage   100.00%   97.29%   -2.71%     
===========================================
  Files          169      171       +2     
  Lines        14518    14949     +431     
===========================================
+ Hits         14518    14544      +26     
- Misses           0      405     +405     
Impacted Files Coverage Δ
...ch/acquisition/probabilistic_reparameterization.py 0.00% <0.00%> (ø)
botorch/models/transforms/input.py 67.79% <5.42%> (-32.21%) ⬇️
botorch/models/transforms/factory.py 71.79% <8.33%> (-28.21%) ⬇️
botorch/acquisition/fixed_feature.py 100.00% <100.00%> (ø)
botorch/acquisition/penalized.py 100.00% <100.00%> (ø)
botorch/acquisition/proximal.py 100.00% <100.00%> (ø)
botorch/acquisition/utils.py 100.00% <100.00%> (ø)
botorch/acquisition/wrapper.py 100.00% <100.00%> (ø)


sdaulton and others added 4 commits February 14, 2023 16:32
Summary:
Pull Request resolved: pytorch#1532

Add a wrapper for modifying inputs/outputs. This is useful not only for probabilistic reparameterization, but will also simplify other integrated AFs (e.g. MCMC) as well as fixed-feature AFs and things like prior-guided AFs.

Differential Revision: https://internalfb.com/D41629186

fbshipit-source-id: 51b84765e58c17cda63bc582bfe30d0ca13955b5
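The wrapper pattern described above can be sketched in a few lines of plain Python (a hypothetical illustration, not the actual BoTorch class): the wrapper holds a base acquisition function and applies an input transform before delegating evaluation.

```python
class AcquisitionWrapper:
    """Wrap a base acquisition function, transforming inputs before evaluation."""

    def __init__(self, base_acqf, input_transform=None):
        self.base_acqf = base_acqf
        self.input_transform = input_transform

    def __call__(self, X):
        # Modify the input (e.g. fix a feature, sample a reparameterization),
        # then delegate to the wrapped acquisition function.
        if self.input_transform is not None:
            X = self.input_transform(X)
        return self.base_acqf(X)

# A toy base AF and a transform that fixes the second feature to 0.5:
base = lambda X: sum(X)
fix_feature = lambda X: [X[0], 0.5]
wrapped = AcquisitionWrapper(base, input_transform=fix_feature)
print(wrapped([1.0, 9.0]))  # -> 1.5
```

The same shell works for output modification (e.g. averaging over MCMC hyperparameter samples) by post-processing the return value instead of the argument.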
Summary: Creates a new helper method for checking whether a given AF is an instance of a class, or whether the given AF wraps a base AF that is an instance of that class.

Differential Revision: D43127722

fbshipit-source-id: 13b9d54b05de09d2b2ed86406a921a38fcedab13
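A plain-Python sketch of such a helper (all names here are hypothetical, assuming wrappers expose their wrapped AF as `base_acqf`): unwrap nested wrappers level by level, then check `isinstance` on the innermost AF.

```python
class Wrapper:
    """Minimal stand-in for an acquisition-function wrapper."""
    def __init__(self, base_acqf):
        self.base_acqf = base_acqf

class ExpectedImprovement:
    """Stand-in for a concrete acquisition function class."""
    pass

def is_instance_or_wraps(acqf, cls, wrapper_cls=Wrapper):
    """True if `acqf` is a `cls`, or is a (possibly nested) wrapper around one."""
    while isinstance(acqf, wrapper_cls):
        acqf = acqf.base_acqf  # descend one wrapping level
    return isinstance(acqf, cls)

nested = Wrapper(Wrapper(ExpectedImprovement()))
print(is_instance_or_wraps(nested, ExpectedImprovement))  # -> True
```

This lets downstream code that special-cases certain AF types keep working when those AFs are wrapped, e.g. by the input/output wrapper from the previous commit.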
Summary:
Pull Request resolved: pytorch#1533

Probabilistic reparameterization

Differential Revision: https://internalfb.com/D41629217

fbshipit-source-id: f0719b974a8b9de4a1fe8fb62a9c73e9a1fbb551
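The core idea of probabilistic reparameterization, sketched in plain Python (a simplified illustration, not the BoTorch implementation): replace a discrete input with a probability distribution over its values, and optimize the expected acquisition value, which is a weighted sum and hence differentiable in the probabilities.

```python
def expected_acqf(acqf, probs, values):
    """Expectation of `acqf` over a categorical parameter.

    `probs[i]` is the probability that the categorical takes `values[i]`;
    the expectation is a weighted sum, so it is smooth in `probs` even
    though `acqf` is only defined on the discrete values.
    """
    return sum(p * acqf(v) for p, v in zip(probs, values))

# Toy acquisition values over a 3-value categorical:
acqf = lambda v: {0: 0.1, 1: 0.7, 2: 0.4}[v]
print(round(expected_acqf(acqf, [0.2, 0.5, 0.3], [0, 1, 2]), 2))  # -> 0.49
```

Gradient-based optimizers can then move the probabilities toward the best category (here, value 1); at evaluation time a concrete category is recovered, e.g. by sampling or taking the argmax of the optimized probabilities.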
Summary:
Pull Request resolved: pytorch#1534

see title

Differential Revision: D41629553

fbshipit-source-id: 522f1fc245c268b4de33524c8a7addd1a8bf15b7
@facebook-github-bot (Contributor)

This pull request was exported from Phabricator. Differential Revision: D41629553

@facebook-github-bot (Contributor)

Hi @sdaulton!

Thank you for your pull request.

We require contributors to sign our Contributor License Agreement, and yours needs attention.

You currently have a record in our system, but the CLA is no longer valid, and will need to be resubmitted.

Process

In order for us to review and merge your suggested changes, please sign at https://code.facebook.com/cla. If you are contributing on behalf of someone else (e.g. your employer), the individual CLA may not be sufficient and your employer may need to sign the corporate CLA.

Once the CLA is signed, our tooling will perform checks and validations. Afterwards, the pull request will be tagged with CLA signed. The tagging process may take up to 1 hour after signing. Please give it that time before contacting us about it.

If you have received this in error or have any questions, please contact us at [email protected]. Thanks!
