Can sahi do slice predictions without torch dependencies? #1006
zhaoruibing started this conversation in General
Replies: 2 comments
I have the same requirement, so I manually modified the source code to change everything torch-based to numpy-based (just for YOLO detection; I also removed some unnecessary code). The modification (based on v0.11.20) can be checked at this commit:
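For context, the torch dependency in sahi's postprocessing mostly comes down to box-overlap suppression. Below is a minimal sketch of greedy NMS in pure numpy, assuming boxes in xyxy format; the function name `nms_numpy` is hypothetical and is not the actual code from the linked commit.

```python
import numpy as np

def nms_numpy(boxes: np.ndarray, scores: np.ndarray, iou_threshold: float = 0.5) -> list:
    """Greedy non-maximum suppression on (N, 4) xyxy boxes, torch-free.

    Returns the indices of the boxes that survive suppression,
    in descending score order.
    """
    x1, y1, x2, y2 = boxes[:, 0], boxes[:, 1], boxes[:, 2], boxes[:, 3]
    areas = (x2 - x1) * (y2 - y1)
    order = scores.argsort()[::-1]  # indices sorted by descending score
    keep = []
    while order.size > 0:
        i = order[0]
        keep.append(int(i))
        # intersection of the current top box with the remaining boxes
        xx1 = np.maximum(x1[i], x1[order[1:]])
        yy1 = np.maximum(y1[i], y1[order[1:]])
        xx2 = np.minimum(x2[i], x2[order[1:]])
        yy2 = np.minimum(y2[i], y2[order[1:]])
        inter = np.clip(xx2 - xx1, 0, None) * np.clip(yy2 - yy1, 0, None)
        iou = inter / (areas[i] + areas[order[1:]] - inter)
        # drop boxes that overlap the kept box too much
        order = order[1:][iou <= iou_threshold]
    return keep
```

A drop-in like this only needs the detections as numpy arrays, so the container no longer has to ship torch for postprocessing.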
The next release will not require torch: the whole process can be done via numpy, though it can use torch or numba as well. I have made sure torch is now optional. Please try it and let us know, thank you.
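A common way to make torch optional, as described above, is a guarded import with a numpy fallback. The helper below is a hypothetical sketch of that pattern, not sahi's actual code.

```python
import numpy as np

# Guarded import: torch becomes an optional accelerator, not a hard dependency.
try:
    import torch
    HAS_TORCH = True
except ImportError:
    HAS_TORCH = False

def to_float_array(x) -> np.ndarray:
    """Normalize inputs to a numpy float32 array regardless of backend.

    Accepts lists, numpy arrays, or torch tensors (when torch is installed),
    so downstream postprocessing can be written once, against numpy.
    """
    if HAS_TORCH and isinstance(x, torch.Tensor):
        return x.detach().cpu().numpy().astype(np.float32)
    return np.asarray(x, dtype=np.float32)
```

With this shape of API, environments without torch simply take the numpy path, and nothing breaks at import time.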
I am trying to deploy a YOLOv8 model with `get_sliced_prediction()` and pack everything into a small container. `get_sliced_prediction()` calls postprocess functions, and these postprocess functions in `sahi.postprocess.combine` depend on torch. sahi is said to be "lightweight" and, in this discussion, independent of torch.
The question is: could sahi do slice predictions without torch dependencies?
Or is this something sahi plans to do in the future?
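For what it's worth, the slicing step itself needs nothing beyond plain Python. Below is a hypothetical sketch of computing overlapping slice windows; the function name and defaults are illustrative and do not reproduce sahi's actual slicing implementation.

```python
def compute_slice_boxes(image_w, image_h, slice_w=256, slice_h=256,
                        overlap_w_ratio=0.2, overlap_h_ratio=0.2):
    """Compute xyxy windows covering the image with the given overlap.

    Hypothetical helper; sahi's own slicing logic differs in details.
    """
    # stride between consecutive slices, shrunk by the overlap ratio
    step_w = int(slice_w * (1 - overlap_w_ratio))
    step_h = int(slice_h * (1 - overlap_h_ratio))
    boxes = []
    y = 0
    while y < image_h:
        y2 = min(y + slice_h, image_h)
        x = 0
        while x < image_w:
            x2 = min(x + slice_w, image_w)
            boxes.append((x, y, x2, y2))
            if x2 == image_w:  # reached the right edge of the image
                break
            x += step_w
        if y2 == image_h:  # reached the bottom edge of the image
            break
        y += step_h
    return boxes
```

Only the per-slice inference and the postprocess merge touch the model framework, which is why replacing the torch-based merge (as in the reply above) is enough to drop the dependency.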