Ask-tell tutorial (#3390)
Summary:
Pull Request resolved: #3390

Tutorial demonstrating ask-tell usage for minimizing hartmann6. Advanced features (constraints, MOO, storage, partial data, etc) not covered.

Differential Revision: D69322316
mpolson64 authored and facebook-github-bot committed Feb 20, 2025
1 parent f5441f1 commit d284758
Showing 2 changed files with 340 additions and 1 deletion.
333 changes: 333 additions & 0 deletions tutorials/ask_tell/ask_tell.ipynb
Original file line number Diff line number Diff line change
@@ -0,0 +1,333 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Ask-tell Optimization with Ax\n",
"\n",
"We often encounter complex optimization problems in which we wish to tune the values of some parameters to improve metric performance, but have little to no knowledge of their effect on each other.\n",
"We call this class of problems “black-box optimization” and it appears across many disciplines including machine learning, robotics, materials science, and chemistry.\n",
"These problems present a unique challenge, and can be made even more challenging if evaluations are expensive to conduct, time-consuming, or noisy.\n",
"\n",
"We can use Ax to efficiently conduct an experiment in which we \"ask\" for candidate points to evaluate, \"tell\" Ax the results, and repeat.\n",
"We'll use Ax's `Client`, a tool for managing the state of our experiment, and learn how to define an optimization problem, configure an experiment, run trials, analyze results, and persist the experiment for later use.\n",
"\n",
"Because Ax is a black-box optimizer, we can use it to optimize any function. In this example we will minimize the [Hartmann6 function](https://www.sfu.ca/~ssurjano/hart6.html), a complicated 6-dimensional function with multiple local minima -- a challenging test case for optimization algorithms that is commonly used in the global optimization literature.\n",
"Looking at its analytic form, we can see that it would be incredibly challenging to efficiently find the global minimum either by manual trial-and-error or by traditional design-of-experiments methods like grid search or random search.\n",
"\n",
"$$\n",
"f(\\mathbf{x})=-\\sum_{i=1}^4 \\alpha_i \\exp \\left(-\\sum_{j=1}^6 A_{i j}\\left(x_j-P_{i j}\\right)^2\\right)\n",
"$$\n",
"\n",
"where\n",
"\n",
"$$\\alpha=(1.0,1.2,3.0,3.2)^T$$\n",
"\n",
"$$\n",
"\\mathbf{A}=\\left(\\begin{array}{cccccc}10 & 3 & 17 & 3.50 & 1.7 & 8 \\\\ 0.05 & 10 & 17 & 0.1 & 8 & 14 \\\\ 3 & 3.5 & 1.7 & 10 & 17 & 8 \\\\ 17 & 8 & 0.05 & 10 & 0.1 & 14\\end{array}\\right)\n",
"$$\n",
"\n",
"\n",
"$$\n",
"\\mathbf{P}=10^{-4}\\left(\\begin{array}{cccccc}1312 & 1696 & 5569 & 124 & 8283 & 5886 \\\\ 2329 & 4135 & 8307 & 3736 & 1004 & 9991 \\\\ 2348 & 1451 & 3522 & 2883 & 3047 & 6650 \\\\ 4047 & 8828 & 8732 & 5743 & 1091 & 381\\end{array}\\right)\n",
"$$\n",
"\n",
"\n",
"### Learning Objectives\n",
"- Understand the basic concepts of black box optimization\n",
"- Learn how to define an optimization problem using Ax\n",
"- Configure and run an experiment using Ax's `Client`\n",
"- Analyze the results of the optimization\n",
"\n",
"### Prerequisites\n",
"\n",
"* Familiarity with Python and basic programming concepts\n",
"* Understanding of adaptive experimentation and Bayesian optimization (see [Introduction to Adaptive Experimentation](#) and [Introduction to Bayesian Optimization](#))\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Step 1: Import Necessary Modules\n",
"\n",
"First, ensure you have all the necessary imports:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import numpy as np\n",
"from ax.preview.api.client import Client\n",
"from ax.preview.api.configs import (\n",
" ExperimentConfig,\n",
" RangeParameterConfig,\n",
" ParameterType,\n",
")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Step 2: Initialize the Client\n",
"\n",
"Create an instance of the `Client` to manage the state of your experiment."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"client = Client()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Step 3: Configure the Experiment\n",
"\n",
"The `Client` instance can be configured with a series of `Config`s that define how the experiment will be run.\n",
"\n",
"The Hartmann6 problem is usually evaluated on the hypercube $x_i \\in (0, 1)$, so we will define six identical `RangeParameterConfig`s with these bounds and add these to an `ExperimentConfig` along with other metadata about the experiment.\n",
"\n",
"You may specify additional features like parameter constraints to further refine the search space and parameter scaling to help navigate parameters with nonuniform effects.\n",
"For more on configuring experiments, see this [recipe](#)."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Define six float parameters x1, x2, x3, ... for the Hartmann6 function\n",
"parameters = [\n",
" RangeParameterConfig(\n",
" name=f\"x{i + 1}\", parameter_type=ParameterType.FLOAT, bounds=(0, 1)\n",
" )\n",
" for i in range(6)\n",
"]\n",
"\n",
"# Create an experiment configuration\n",
"experiment_config = ExperimentConfig(\n",
" name=\"hartmann6_experiment\",\n",
" parameters=parameters,\n",
" # The following arguments are optional\n",
" description=\"Optimization of the Hartmann6 function\",\n",
" owner=\"developer\",\n",
")\n",
"\n",
"# Apply the experiment configuration to the client\n",
"client.configure_experiment(experiment_config=experiment_config)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Step 4: Configure Optimization\n",
"Now, we must configure the objective for this optimization, which we do using `Client.configure_optimization`.\n",
"This method expects a string `objective`, an expression containing either a single metric to maximize, a linear combination of metrics to maximize, or a tuple of multiple metrics to jointly maximize.\n",
"These expressions are parsed using [SymPy](https://www.sympy.org/en/index.html). For example:\n",
"* `\"score\"` would direct Ax to maximize a metric named score\n",
"* `\"-loss\"` would direct Ax to Ax to minimize a metric named loss\n",
"* `\"task_0 + 0.5 * task_1\"` would direct Ax to maximize the sum of two task scores, downweighting task_1 by a factor of 0.5\n",
"* `\"score, -flops\"` would direct Ax to simultaneously maximize score while minimizing flops\n",
"\n",
"For more information, see the [string parsing recipe](#)."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"metric_name = \"hartmann6\" # this name is used during the optimization loop in Step 5\n",
"objective = f\"-{metric_name}\" # minimization is specified by the negative sign\n",
"\n",
"client.configure_optimization(objective=objective)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Step 5: Run Trials\n",
"Here, we will configure the ask-tell loop.\n",
"\n",
"We begin by defining our Hartmann6 function as written above.\n",
"Remember, this is just an example problem and any Python function can be substituted here.\n",
"\n",
"Then we will iteratively call `client.get_next_trials` to \"ask\" Ax for a parameterization to evaluate, call `hartmann6` using those parameters, and \"tell\" Ax the result using `client.complete_trial`.\n",
"\n",
"This loop will run multiple trials to optimize the function.\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Hartmann6 function\n",
"def hartmann6(x1, x2, x3, x4, x5, x6):\n",
" alpha = np.array([1.0, 1.2, 3.0, 3.2])\n",
" A = np.array([\n",
" [10, 3, 17, 3.5, 1.7, 8],\n",
" [0.05, 10, 17, 0.1, 8, 14],\n",
" [3, 3.5, 1.7, 10, 17, 8],\n",
" [17, 8, 0.05, 10, 0.1, 14]\n",
" ])\n",
" P = 10**-4 * np.array([\n",
" [1312, 1696, 5569, 124, 8283, 5886],\n",
" [2329, 4135, 8307, 3736, 1004, 9991],\n",
" [2348, 1451, 3522, 2883, 3047, 6650],\n",
" [4047, 8828, 8732, 5743, 1091, 381]\n",
" ])\n",
"\n",
" outer = 0.0\n",
" for i in range(4):\n",
" inner = 0.0\n",
" for j, x in enumerate([x1, x2, x3, x4, x5, x6]):\n",
" inner += A[i, j] * (x - P[i, j])**2\n",
" outer += alpha[i] * np.exp(-inner)\n",
" return -outer\n",
"\n",
"hartmann6(0.1, 0.45, 0.8, 0.25, 0.552, 1.0)"
]
},
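{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a quick aside (not part of the Ax workflow itself), we can sanity-check our implementation before optimizing: the Hartmann6 global minimum is known from the global optimization literature to be approximately $-3.32237$, attained near $x^* = (0.20169, 0.150011, 0.476874, 0.275332, 0.311652, 0.6573)$."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Known global minimizer of Hartmann6 (values from the global optimization literature)\n",
"x_star = (0.20169, 0.150011, 0.476874, 0.275332, 0.311652, 0.6573)\n",
"\n",
"# Should print a value close to the known global minimum of about -3.32237\n",
"print(hartmann6(*x_star))"
]
},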
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Optimization Loop\n",
"\n",
"Having configured the experiment and optimization settings and defined the Python function, proceed with the optimization loop, where several trials are run sequentially."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Number of trials to run\n",
"num_trials = 30\n",
"\n",
"# Run trials\n",
"for _ in range(num_trials):\n",
" trials = client.get_next_trials(\n",
" maximum_trials=1\n",
" ) # We will request just one trial at a time in this example\n",
" for trial_index, parameters in trials.items():\n",
" x1 = parameters[\"x1\"]\n",
" x2 = parameters[\"x2\"]\n",
" x3 = parameters[\"x3\"]\n",
" x4 = parameters[\"x4\"]\n",
" x5 = parameters[\"x5\"]\n",
" x6 = parameters[\"x6\"]\n",
"\n",
" result = hartmann6(x1, x2, x3, x4, x5, x6)\n",
"\n",
"        # Set raw_data as a dictionary with metric names as keys and results as values\n",
"        raw_data = {metric_name: result}\n",
"\n",
"        # Complete the trial with the result\n",
" client.complete_trial(trial_index=trial_index, raw_data=raw_data)\n",
" print(f\"Completed trial {trial_index} with {raw_data=}\")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Step 6: Analyze Results\n",
"\n",
"After running trials, you can analyze the results.\n",
"Most commonly this means extracting the parameterization from the best-performing trial you conducted."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"best_parameters, prediction, index, name = client.get_best_parameterization()\n",
"print(\"Best Parameters:\", best_parameters)\n",
"print(\"Prediction (mean, variance):\", prediction)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Step 7: Compute Analyses\n",
"\n",
"Ax can also produce a number of analyses to help interpret the results of the experiment via `client.compute_analyses`.\n",
"These analyses can be anything from dataframes to markdown messages to Plotly figures, and they help us gain a deeper understanding of our experiment.\n",
"Users can manually select which analyses to run, or can allow Ax to select which would be most relevant.\n",
"In this case Ax selects the following:\n",
"* **Parallel Coordinates Plot** shows which parameterizations were evaluated and what metric values were observed -- this is useful for getting a high-level overview of how thoroughly the search space was explored and which regions tend to produce which outcomes\n",
"* **Interaction Analysis Plot** shows which parameters have the largest effect on the function and plots the most important parameters as 1- or 2-dimensional surfaces\n",
"* **Summary** lists all trials generated along with their parameterizations, observations, and miscellaneous metadata"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"client.compute_analyses(display=True) # By default Ax will display the AnalysisCards produced by compute_analyses"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Conclusion\n",
"\n",
"This tutorial demonstrated how to use Ax's `Client` for ask-tell optimization of Python functions, using the Hartmann6 function as an example.\n",
"You can adjust the function and parameters to suit your specific optimization problem."
]
}
],
"metadata": {
"fileHeader": "",
"fileUid": "9dfaed34-de4d-42ed-8755-25343d677ef0",
"isAdHoc": false,
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.12.4"
}
},
"nbformat": 4,
"nbformat_minor": 5
}
8 changes: 7 additions & 1 deletion website/tutorials.json
Original file line number Diff line number Diff line change
@@ -1,2 +1,8 @@
{
"Basic usage": [
{
"id": "ask_tell",
"title": "Ask-tell Optimization with Ax"
}
]
}
