Merged
12 changes: 0 additions & 12 deletions config_explorer/README.md
@@ -84,16 +84,6 @@ The Streamlit frontend includes the following pages:

1. **Capacity Planner** - Analyze GPU memory requirements and capacity planning for LLM models
2. **GPU Recommender** - Get optimal GPU recommendations based on model and workload requirements
3. **Sweep Visualizer** - Visualize benchmark results and configuration sweeps
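To give a feel for the kind of arithmetic a capacity planner performs, here is a small, self-contained sketch of estimating GPU memory for model weights and KV cache. This is an illustrative example, not the app's actual implementation; the architecture figures used below (80 layers, 8 KV heads, 128-dim heads for a Llama-3.1-70B-class model) are approximate assumptions.

```python
def model_weight_gib(num_params: float, bytes_per_param: int = 2) -> float:
    """Approximate memory for model weights (fp16/bf16 = 2 bytes per parameter)."""
    return num_params * bytes_per_param / 2**30

def kv_cache_gib(num_layers: int, num_kv_heads: int, head_dim: int,
                 seq_len: int, batch_size: int = 1,
                 bytes_per_elem: int = 2) -> float:
    """Approximate KV-cache memory: two tensors (K and V) per layer."""
    elems = 2 * num_layers * num_kv_heads * head_dim * seq_len * batch_size
    return elems * bytes_per_elem / 2**30

# Rough numbers for a 70B-parameter model in fp16 (assumed architecture values)
weights = model_weight_gib(70e9)            # ~130 GiB for weights alone
kv = kv_cache_gib(num_layers=80, num_kv_heads=8,
                  head_dim=128, seq_len=8192)  # KV cache for one 8K-token sequence
```

Note how grouped-query attention (fewer KV heads than attention heads) keeps the per-sequence KV cache small relative to the weights; this is why batch size and context length, not just parameter count, drive capacity planning.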

### Using the Sweep Visualizer

The Sweep Visualizer page visualizes a collection of `llm-d-benchmark` report files. To get started quickly, download the sample data from the [public llm-d-benchmark community Google Drive](https://drive.google.com/drive/u/0/folders/1r2Z2Xp1L0KonUlvQHvEzed8AO9Xj8IPm). Preset options are provided for each scenario; for example, we recommend viewing

- `qwen-qwen-3-0-6b` with the Chatbot application and the Inference Scheduling highlight
- `meta-llama/Llama-3.1-70B-Instruct` with the Document Summarization application and the PD Disaggregation highlight

Default values are populated once those options are selected. Advanced users can customize the configuration further.

### Using the GPU Recommender

@@ -131,6 +121,4 @@ The GPU Recommender displays cost information to help you find cost-effective GPUs

## Library

Configuration exploration and benchmark sweep performance comparison are best demonstrated in the Jupyter notebook [analysis.ipynb](../analysis/analysis.ipynb). The notebook supports interactive analysis of benchmark results and uses the same core functions as the web app's "Sweep Visualizer" page. For instructions on using the notebook, see [../analysis/README.md](../analysis/README.md).

For GPU Recommender API usage, see [./examples/gpu_recommender_example.py](./examples/gpu_recommender_example.py).