
Env flag affects the peak memory in e2e benchmark. #4937

Open
qjia7 opened this issue Apr 15, 2021 · 3 comments

qjia7 commented Apr 15, 2021

Steps to reproduce:

  1. Open https://tensorflow.github.io/tfjs/e2e/benchmarks/local-benchmark/index.html
  2. Keep the default wasm backend and click Run benchmark.
  3. Note the peak memory in the left table: 19.63 MB.
  4. Switch to the webgl backend and change any flag, for example webgl pack. Then switch back to wasm.
  5. Click Run benchmark again. The peak memory increases to 33.51 MB.
  6. Repeat steps 4 and 5: the peak memory keeps growing. If you don't change a flag, the peak memory stays the same. Changing a flag affects the peak memory on every backend, which looks like a bug.
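The UI steps above can also be approximated from the browser devtools console on the benchmark page. This is only a sketch, not the benchmark's own measurement code: it assumes `tf` (from `@tensorflow/tfjs`) with the wasm and webgl backends is already loaded by the page, and it uses `tf.memory().numBytes` as a rough stand-in for the page's peak-memory metric:

```javascript
// Sketch: flip a flag on another backend, switch back, and compare memory,
// mirroring repro steps 4-5. Run in the console on the benchmark page.
// Assumes the wasm and webgl backends are registered.
async function memoryAfterFlagFlip() {
  await tf.setBackend('wasm');
  const before = tf.memory().numBytes; // rough proxy, not the UI's peak-memory number

  // Change a webgl flag (here: toggle WEBGL_PACK), then return to wasm.
  await tf.setBackend('webgl');
  tf.env().set('WEBGL_PACK', !tf.env().getBool('WEBGL_PACK'));
  await tf.setBackend('wasm');

  const after = tf.memory().numBytes;
  console.log({before, after, delta: after - before});
}
```

If the bug is reproduced, repeated calls would show `after` drifting upward even though no new tensors were created between runs.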
@qjia7 qjia7 added the type:bug Something isn't working label Apr 15, 2021

qjia7 commented Apr 15, 2021

FYI @lina128. This is the issue I mentioned in yesterday's meeting.

@gbaned gbaned assigned gbaned and rthadur and unassigned gbaned Apr 15, 2021
@rthadur rthadur assigned lina128 and unassigned rthadur Apr 16, 2021
gaikwadrahul8 (Contributor) commented:

Hi @qjia7,

Apologies for the delayed response. We are revisiting our older issues and checking whether they have been resolved. I tried from my end and this issue still exists, so may I know whether you are still looking for a solution? Thank you!

For your reference, I have added a screenshot below:

[screenshot: benchmark page showing the increased peak memory]


qjia7 commented May 16, 2023

Yes, I expect it to be resolved, since it's a bug.

6 participants