Feature requests marked as 'wishlist' will be gathered here going forward, in order to:
- improve discoverability of issues that report bugs or ask questions,
- help us easily assess these requests when roadmapping.
Please still feel free to open new issues for feature requests (or comment with them here if they are short and clear), and we will take care of adding them to this post.
**Status: will likely be addressed in the short term**
- [DONE] Obtaining best parameters (parameters on the Pareto frontier) for multi-objective optimization in Service API (Implement `AxClient.get_best_parameters`, returning the parameter configurations on the Pareto frontier #656)
- Support for Ax viz in Colab (Visualization in tutorial not working in colab and remote setup #306)
- Adding a conda distribution (Why is pip recommended over conda? #608, Request: add `requirement.txt` for conda-forge compatibility #614)
- [DONE] Hierarchical search spaces (Hierarchical search spaces #140)
- Support specifying fixed features in Service API ([Feature Request] Specifying fixed features to `AxClient.get_next_trial` (to conduct contextual BO using Service API) #746)
- Support preference learning in Ax (Preference BO with Ax #754)
- Early-stopping tutorial in Ax (Tutorial for early-stopping in Ax #851)
**Status: will likely be addressed in the long term**
- TensorBoard support (TensorBoard Support #248)
- [IN PROGRESS] Optimization algorithm suitable for combinatorial search spaces in Ax ([Feature Request] Optimization over high-dim discrete / combinatorial search spaces #477)
- Include performance benchmarks in Ax documentation beyond the BoTorch benchmarks published in https://arxiv.org/abs/1910.06403 ([Feature Request] Performance Benchmarks in Documentation #496)
- Exact equality parameter constraints ([Feature Request] Support Exact Equality Parameter Constraints #510)
- Automate selection of appropriate parameters for BoTorch components in Ax based on experiment and data size (Automate selection of appropriate parameters for BoTorch components in Ax based on experiment and data size #674)
- `random_seed` setting in Loop and Service APIs guaranteeing full reproducibility (Issues with `optimize`: needs more documentation, maybe does not return the correct best parameters? #605)
- Constraints on ordered choice parameters (Feature request: allow for constraints on ordered choice parameters #710)
- Make more common functions available with just `import ax` (Add commonly used plots to `__init__.py`, so they can be available with just `import ax` #774)
- TuRBO in Ax (TuRBO support in Ax #474)
- Add getters/setters for `Experiment._properties` and `Trial._properties` (Wishlist: Tracking Issue #566 (comment))
- Support for MOO for hierarchical search spaces without flattening the search space (Does botorch optimizer in mo setting such as qehvi, qnehvi or qnparego support hierarchical space? #1042)
- Gaussian priors for BoTorch models in Ax (How to use custom Gaussian prior for Bayesian optimization? #1647)
- Slice and contour plots for choice parameters (plot_slice with choiceParameter #1577, Plot_contour doesn't work with a Choice parameter #1624)
- Storage of `Data` for trials without DataFrames (associating potentially larger data, e.g. image data, with trials, and passing that data to the model directly; Question: Best way to include non-scalar types in experiment results? #880)
- Official release of `DeterministicModel` and documentation for it (Request for documentation: example with `DeterministicModel` #1192, Optimization of analytic functions #935)
- Multi-task GP tutorial (Example (tutorial) for ST_MTGP_NEHVI #2086)
- Plotting compatibility with Hierarchical Search Spaces (Plotting functions do not work with HierarchicalSearchSpace #2481)
- SQLAlchemy 2.0 support (Ax is incompatible with SQLAlchemy 2.0+; incompatibility errors arise when using only JSON storage, too #1697)
- Fix RayTune Tutorial ("Hyperparameter Optimization via Raytune" link in website is broken. #2090)
- Add properties to more easily expose acquisition function details re: optimization (How to expose the default acquisition function being used by AxClient() #2411)
**Status: uncertain**
- Nonlinear parameter constraints (Nonlinear parameter constraints #153)
- Adding "projects" to group Ax experiments (Hierarchy of experiments #189)
- Integration with Nevergrad (Population based optimization #163)
- Returning optimal ranges rather than single values for best parameters (`best_parameters` as optimal ranges instead of single values #320)
- Ax on ARM / Raspberry Pi (Installing Ax on a Raspberry Pi (Model 3B+) #412, blocked on official PyTorch support for ARM)
- ALEBO support for parameter constraints (Parameter Constraint at ALEBO #424)
- Selective noise inference for some observations in a field experiment (Selective noise inference for some observations in field experiment #491)
- Off-the-shelf models that do not use Gaussian noise (Selective noise inference for some observations in field experiment #491)
- Support sampling individual parameters from specified distributions (Feature request: specifying distributions, from which individual parameters in search space should be sampled #702)
- Ability to specify input uncertainties ([Feature Request] Ability to specify input uncertainties (uncertainties of measured parameters) #751)
- `PAUSED` trial status (Request for a feature. Please introduce the TrialStatus.PAUSED #862)
- Discrete fidelity parameter support (Multi-fidelity optimization with discrete fidelities #979)
- Accept `numpy` values as well as Ax primitives (Parameter type checking #996)
- Exposing more settings for ALEBO (How to set the batch_size and replace the acquisition function with qUCB for ALEBO? #953)
- Enable intersphinx mapping (intersphinx mapping #1227)
- PyCharm support for viz (Feature Request: PyCharm support for viz #1729)
- Add annealing parameter to Models.THOMPSON (Extending Models.THOMPSON with an extra parameter #2334)
**Suggestions for setup changes:**
- Migration of website to Docusaurus 2 (migration to docusaurus 2 #466, status: blocked)