Issues: apache/beam, labeled run-inference
- #31173 [Feature Request]: Vertex AI Triton Inference Server Support. Labels: ml, new feature, P3, python, run-inference. Opened May 3, 2024 by damccorm. 1 of 16 tasks.
- #25286 [Bug]: Model loading repeatedly fails for large models in RunInference. Labels: bug, dataflow, ml, P3, python, run-inference. Opened Feb 2, 2023 by damccorm. 2 of 15 tasks.
- #24340 [Feature Request][Tracking]: Use accelerate from Hugging Face to optimize loading Pytorch models. Labels: ml, new feature, P3, python, run-inference. Opened Nov 23, 2022 by AnandInguva.
- #24334 [Task]: Update RunInference Notebooks to use Custom Inference Functions. Labels: ml, P2, python, run-inference, task. Opened Nov 23, 2022 by jrmccluskey.
- #23874 [Task]: Tracking task for Benchmark RunInference frameworks on CPUs and GPUs. Labels: ml, P2, python, run-inference, task. Opened Oct 27, 2022 by AnandInguva.
- #23853 [Feature Request]: Add optional thread safety to RunInference. Labels: ml, new feature, P2, python, run-inference. Opened Oct 26, 2022 by damccorm.
- #23835 [Feature Request]: More granular treatment of failed inferences. Labels: ml, new feature, P2, python, run-inference. Opened Oct 25, 2022 by BjornPrime.
- #23467 [Feature Request]: Python RunInference transform should be Schema-Aware. Labels: ml, new feature, P2, run-inference. Opened Oct 3, 2022 by chamikaramj.
- #23258 [Task]: Expand PyTorch tested version for RunInference. Labels: ml, P2, python, run-inference, task. Opened Sep 15, 2022 by yeandy.
- #23142 [Feature Request]: Add metrics definitions to RunInference documentation. Labels: ml, new feature, P2, python, run-inference. Opened Sep 9, 2022 by yeandy.
- #22117 RunInference Fit and Finish. Labels: core, ml, new feature, P2, python, run-inference. Opened Jun 30, 2022 by yeandy.
- #21863 RunInference: investigate adding optional batching flag. Labels: core, ml, new feature, P3, python, run-inference. Opened Jun 14, 2022 by yeandy.
- #21822 Remove mypy ignore line from apache_beam/ml/inference/base.py once Dataclass is replaced with NamedTuple. Labels: core, ml, P3, python, run-inference, task. Opened Jun 13, 2022 by AnandInguva.
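Issue #21822 above concerns replacing a dataclass with a typing.NamedTuple in apache_beam/ml/inference/base.py. As background only, a minimal sketch of how the two declarations differ; the classes below are hypothetical stand-ins, not Beam's actual code:

```python
from dataclasses import dataclass
from typing import Any, NamedTuple

# Hypothetical result containers, for illustration only.
@dataclass
class ResultDataclass:
    example: Any
    inference: Any

class ResultNamedTuple(NamedTuple):
    example: Any
    inference: Any

# NamedTuple instances are real tuples: they index, unpack, and
# compare like tuples, which a dataclass instance does not.
r = ResultNamedTuple(example=[1.0], inference=0.7)
example, inference = r
```

One practical consequence of the tuple behavior is that a NamedTuple result can flow through code that expects plain tuples without any changes.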
- #21454 RunInference Benchmarking tests. Labels: core, ml, P2, python, run-inference, sub-task. Opened Jun 4, 2022 by damccorm.
- #21450 Investigate load state_dict vs loading whole model. Labels: core, ml, P3, python, run-inference, sub-task. Opened Jun 4, 2022 by damccorm.
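Issue #21450 above asks whether to load a PyTorch state_dict or a whole pickled model. A minimal, Beam-independent sketch of the two serialization styles (assumes a reasonably recent PyTorch, where torch.load takes a weights_only argument):

```python
import os
import tempfile

import torch
import torch.nn as nn

# Toy model for illustration; not from the Beam codebase.
model = nn.Linear(4, 2)
tmpdir = tempfile.mkdtemp()
weights_path = os.path.join(tmpdir, "weights.pt")
model_path = os.path.join(tmpdir, "model.pt")

# Style 1: persist only the parameter tensors (the state_dict).
# Loading requires instantiating the model class first, but the
# file is decoupled from the module layout used when saving.
torch.save(model.state_dict(), weights_path)
restored = nn.Linear(4, 2)
restored.load_state_dict(torch.load(weights_path))

# Style 2: pickle the entire model object. Simpler to load, but the
# file is tied to the exact class definitions present at save time.
torch.save(model, model_path)
whole = torch.load(model_path, weights_only=False)
```

The trade-off is the crux of the issue: for large models, each style has different memory and startup behavior when workers load the model repeatedly.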
- #21448 Add resource location hints to base RunInference Implementation. Labels: core, ml, P2, python, run-inference, sub-task. Opened Jun 4, 2022 by damccorm.
- #21447 Add batch_size back off in RunInferenceBase. Labels: core, ml, P2, python, run-inference, sub-task. Opened Jun 4, 2022 by damccorm.
- #21446 Explore versions of pytorch to test in Tox. Labels: core, ml, P2, python, run-inference, sub-task. Opened Jun 4, 2022 by damccorm.
- #21445 Correctly comment or remove Metrics Cache. Labels: core, ml, P3, python, run-inference, sub-task. Opened Jun 4, 2022 by damccorm.
- #21444 Add flag to drop example from PredictionResult. Labels: core, ml, P2, python, run-inference, sub-task. Opened Jun 4, 2022 by damccorm.
- #21443 Investigate releasing models in inference base class. Labels: core, ml, P2, python, run-inference, sub-task. Opened Jun 4, 2022 by damccorm.
- #21441 Figure out how type hints should work. Labels: core, ml, P2, python, run-inference, sub-task. Opened Jun 4, 2022 by damccorm.
- #21440 Hook In Batching DoFn Apis to RunInference. Labels: core, ml, P2, python, run-inference, sub-task. Opened Jun 4, 2022 by damccorm.
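Several items above (#21863, #21447, #21440) revolve around batching elements before invoking a model, since most frameworks run far faster on batches than on single examples. A framework-free sketch of the core grouping step; the helper below is illustrative only, not Beam's API:

```python
from typing import Iterable, Iterator, List, TypeVar

T = TypeVar("T")

def batch(elements: Iterable[T], batch_size: int) -> Iterator[List[T]]:
    """Group a stream of elements into fixed-size batches.

    The final batch may be smaller than batch_size. This mirrors the
    grouping a batching flag would perform before each model call.
    """
    buf: List[T] = []
    for element in elements:
        buf.append(element)
        if len(buf) == batch_size:
            yield buf
            buf = []
    if buf:  # flush the partial final batch
        yield buf
```

A back-off scheme like the one proposed in #21447 would additionally shrink batch_size when a batch fails (for example, on out-of-memory errors) and retry with the smaller size.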