Commit e8a7f68

Add more commands to the latency benchmark
1 parent 8baaff8 commit e8a7f68

File tree

2 files changed: +8 −2 lines changed


.github/workflows/bencher.yml

Lines changed: 1 addition & 1 deletion

@@ -42,4 +42,4 @@ jobs:
 --err \
 --adapter json \
 --github-actions '${{ secrets.GITHUB_TOKEN }}' \
-python bench/latency_benchmark_web.py mini_bencher -1 del
+python bench/latency_benchmark_web.py mini_bencher 10 del

bench/latency_benchmark_web.py

Lines changed: 7 additions & 1 deletion

@@ -88,6 +88,12 @@ def benchmark(test_config):
 subprocess.run(["fgfa", "-I", test_file_name, "extract", "-n", "3", "-c", "3"], stdout=devnull,
                stderr=devnull,
                check=True)
+subprocess.run(["fgfa", "-I", test_file_name, "chop", "-c", "3", "-l"], stdout=devnull,
+               stderr=devnull,
+               check=True)
+subprocess.run(["fgfa", "-I", test_file_name, "depth"], stdout=devnull,
+               stderr=devnull,
+               check=True)
 end_time = time.time()
 total_time += (end_time - start_time) * 1000
 subprocess.run(["rm", "-rf", results], check = True)
@@ -113,5 +119,5 @@ def benchmark(test_config):
 else:
     print(f"Average latency: {round(benchmark(test_config), 2)} ms")

-# Command format: python latency_benchmark_web.py [size](_bencher) -[run_count] (del)
+# Command format: python latency_benchmark_web.py [size](_bencher) [run_count] (del)
 # () = optional, [] = replace with value
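For context, the timing pattern this diff extends can be sketched as a small standalone helper. This is an illustrative sketch, not code from the repository: `time_command` is a hypothetical name, and it mirrors the benchmark loop visible in the hunks above (wall-clock timing around `subprocess.run`, output suppressed, `check=True` so a failing command aborts the run instead of silently skewing the average).

```python
import subprocess
import time


def time_command(cmd, runs=3):
    """Average wall-clock latency of a shell command over several runs, in ms.

    Each run discards the command's output and raises if the command fails,
    matching the structure of the benchmark() loop in latency_benchmark_web.py.
    """
    total_ms = 0.0
    for _ in range(runs):
        start = time.time()
        subprocess.run(
            cmd,
            stdout=subprocess.DEVNULL,
            stderr=subprocess.DEVNULL,
            check=True,  # a non-zero exit aborts the benchmark
        )
        total_ms += (time.time() - start) * 1000
    return total_ms / runs


if __name__ == "__main__":
    # Example: average startup latency of a no-op Python process
    print(f"Average latency: {round(time_command(['python3', '-c', 'pass']), 2)} ms")
```

Each new `fgfa` subcommand added by this commit slots into the timed region the same way, so the reported average covers the whole command sequence rather than a single invocation.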
