
Commit f2ddf12

Update README and the pyproject.toml of vl2l benchmark

1 parent bd161fd commit f2ddf12

3 files changed: +19 −50 lines changed

loadgen/README_BUILD.md

Lines changed: 9 additions & 5 deletions

@@ -3,18 +3,22 @@
 ## Prerequisites
 
 sudo apt-get install libglib2.0-dev python-pip python3-pip
-pip2 install absl-py numpy
-pip3 install absl-py numpy
 
 ## Quick Start
 ### Installation - Python
+If you need to clone the repo (e.g., because you are a MLPerf Inference developer), you
+can build and install the `mlperf-loadgen` package via:
 
-pip install absl-py numpy
 git clone --recurse-submodules https://github.com/mlcommons/inference.git mlperf_inference
 cd mlperf_inference/loadgen
-CFLAGS="-std=c++14 -O3" python -m pip install .
+pip install .
 
-This will fetch the loadgen source, build and install the loadgen as a python module, and run a simple end-to-end demo.
+If you don't need to clone the repo (e.g., you just want to install `mlperf-loadgen`
+from the latest commit of the `master` branch):
+
+pip install git+https://github.com/mlcommons/inference.git#subdirectory=loadgen
+
+This will fetch the loadgen source, then build and install the loadgen as a python module.
 
 Alternatively, we provide wheels for several python versions and operating system that can be installed using pip directly.

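The new README text above offers two install routes. A rough sketch of both (the install commands themselves come from the diff; the final `echo` assembling the VCS URL is illustrative only, and network access is assumed for the real installs):

```shell
# Route 1 (from the diff): clone, then build/install from the working tree.
#   git clone --recurse-submodules https://github.com/mlcommons/inference.git mlperf_inference
#   cd mlperf_inference/loadgen && pip install .
#
# Route 2 (from the diff): no clone needed; pip fetches the repo and builds
# only the subdirectory named by the #subdirectory= URL fragment.
#   pip install "git+https://github.com/mlcommons/inference.git#subdirectory=loadgen"
#
# Illustration of how that VCS URL is assembled:
repo="https://github.com/mlcommons/inference.git"
subdir="loadgen"
echo "git+${repo}#subdirectory=${subdir}"
# → git+https://github.com/mlcommons/inference.git#subdirectory=loadgen
```

Route 2 is the one the diff adds for users who only want the published package, not the source tree.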
multimodal/vl2l/README.md

Lines changed: 9 additions & 44 deletions

@@ -2,20 +2,6 @@
 
 ## Quick Start
 
-### Get the source code
-
-Clone the MLPerf Inference repo via:
-
-```bash
-git clone --recurse-submodules https://github.com/mlcommons/inference.git mlperf-inference
-```
-
-Then enter the repo:
-
-```bash
-cd mlperf-inference/
-```
-
 ### Create a Conda environment
 
 Follow [this link](https://www.anaconda.com/docs/getting-started/miniconda/install#quickstart-install-instructions)
@@ -26,53 +12,32 @@ environment via:
 conda create -n mlperf-inf-mm-vl2l python=3.12
 ```
 
-### Install LoadGen
-
-Update `libstdc++` in the conda environment:
+### (Optionally) Update `libstdc++` in the conda environment:
 
 ```bash
 conda install -c conda-forge libstdcxx-ng
 ```
 
-Install `absl-py` and `numpy`:
-
-```bash
-conda install absl-py numpy
-```
-
-Build and install LoadGen from source:
-
-```bash
-cd loadgen/
-CFLAGS="-std=c++14 -O3" python -m pip install .
-cd ../
-```
-
-Run a quick test to validate that LoadGen was installed correctly:
-
-```bash
-python loadgen/demos/token_metrics/py_demo_server.py
-```
+This is only needed when your local environment doesn't have `libstdc++`.
 
 ### Install the VL2L benchmarking CLI
 
 For users, install `mlperf-inf-mm-vl2l` with:
 
 ```bash
-pip install multimodal/vl2l/
+pip install git+https://github.com/mlcommons/inference.git#subdirectory=multimodal/vl2l
 ```
 
 For developers, install `mlperf-inf-mm-vl2l` and the development tools with:
-
-- On Bash
+1. Clone the MLPerf Inference repo.
 ```bash
-pip install multimodal/vl2l/[dev]
-```
-- On Zsh
-```zsh
-pip install multimodal/vl2l/"[dev]"
+git clone --recurse-submodules https://github.com/mlcommons/inference.git mlperf-inference
 ```
 
+2. Install in editable mode with the development tools.
+   - Bash: `pip install mlperf-inference/multimodal/vl2l/[dev]`
+   - Zsh: `pip install mlperf-inference/multimodal/vl2l/"[dev]"`
+
 After installation, you can check the CLI flags that `mlperf-inf-mm-vl2l` can take with:
 
 ```bash

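The Bash/Zsh split in the developer-install step above comes from shell globbing, not from pip. A small demonstration (the `mktemp` directory change is only there to guarantee the glob has nothing to match):

```shell
# [dev] is a glob character class (it matches a single 'd', 'e', or 'v').
# Bash (and POSIX sh) pass an unmatched glob through as a literal string,
# so pip still receives 'multimodal/vl2l/[dev]' even unquoted. Zsh instead
# aborts with "no matches found", which is why the Zsh variant quotes "[dev]".
cd "$(mktemp -d)"   # empty directory: the glob cannot match any file
printf '%s\n' multimodal/vl2l/[dev]
# → multimodal/vl2l/[dev]
```

Quoting the extras marker, as the Zsh line does, is safe in every shell, so it is a reasonable habit regardless of which shell you use.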
multimodal/vl2l/pyproject.toml

Lines changed: 1 addition & 1 deletion

@@ -13,7 +13,7 @@ dependencies = [
     "datasets",
     "loguru",
     "matplotlib",
-    "mlcommons_loadgen",
+    "mlcommons_loadgen @ file://../../loadgen",
     "openai[aiohttp]",
     "pydantic",
     "pydantic-typer @ git+https://github.com/CentML/pydantic-typer.git@wangshangsam/preserve-full-annotated-type",

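The pyproject change above uses PEP 508 direct-reference syntax (`name @ url`), the same form already used by the `pydantic-typer` entry. A minimal sketch of what the edited entry expresses; note that relative `file://` URLs like this are not uniformly supported across installers (pip generally expects absolute file URLs), so treat the path form as specific to this repo's tooling:

```toml
[project]
dependencies = [
    # Direct reference: resolve mlcommons_loadgen from a URL instead of PyPI.
    # Here the file:// URL points two directories up, at the in-repo loadgen/.
    "mlcommons_loadgen @ file://../../loadgen",
]
```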