[InferenceSlicer] - allow batch size inference #781
Comments
Hi, @inakierregueab 👋🏻 That is something we were considering but didn't implement due to time restrictions. Let me add some details to this issue. Maybe someone will pick it up.
Hi @SkalskiP, can I work on this issue if it is suitable for beginners?
Hi, @Bhavay-2001 👋🏻 Do you already have experience with running model inference at different batch sizes?
Hi @SkalskiP, yes, I think I can manage that. Can you please let me know how to proceed with this? Thanks
Great! Do you have any specific questions?
Hi @SkalskiP, how do I add the batch_size feature to the InferenceSlicer class? How can I test it in Google Colab? Any starting point that helps me get on track would be appreciated.
I outlined the vital steps that need to be taken to add batch_size support in the issue description.
Hi @SkalskiP, can you please point me to a code sample that has already been implemented and provides the batch_size functionality?
@Bhavay-2001, I'm afraid we do not have a code sample. Implementing batch inference was supposed to be executed in this task. :/
@SkalskiP, what I am thinking of doing is implementing a for loop over a batch of images. Each image is then passed to the model, the detections are collected, and at the end the detections for the whole batch are returned.
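For illustration, a minimal sketch of that loop-based callback, assuming an Ultralytics-style model (the model choice, weights file, and batched_callback name are placeholders):

```python
from typing import List

import numpy as np
import supervision as sv
from ultralytics import YOLO

model = YOLO("yolov8n.pt")  # placeholder weights

def batched_callback(image_batch: List[np.ndarray]) -> List[sv.Detections]:
    # Collect detections for every slice in the batch. This loops over
    # the images one by one, so it satisfies a list-based callback
    # signature but does not yet exploit true GPU batching.
    detections_batch = []
    for image in image_batch:
        result = model(image)[0]
        detections_batch.append(sv.Detections.from_ultralytics(result))
    return detections_batch
```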
Hi @SkalskiP, can you please review this PR?
Hi @SkalskiP, can you please review it and let me know? Thanks
SkalskiP and I had a conversation about this; I'll take over for now.
Intermediate results:
Testing more broadly, however, provides mixed results.
Still checking; Colab coming soon.
https://colab.research.google.com/drive/1j85QErM74VCSLADoGliM296q4GFUdnGM?usp=sharing
As you can see, in these tests batching only helped the Ultralytics case. Known insufficiencies:
PR: #1108
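For context, a minimal sketch of what the Ultralytics case looks like with true batching, i.e. passing the whole list of slices to the model in a single call (the model choice and weights file are placeholders):

```python
from typing import List

import numpy as np
import supervision as sv
from ultralytics import YOLO

model = YOLO("yolov8n.pt")  # placeholder weights

def batch_callback(image_batch: List[np.ndarray]) -> List[sv.Detections]:
    # Ultralytics accepts a list of images and runs them through the
    # network as a single forward pass, unlike a per-image loop.
    results = model(image_batch)
    return [sv.Detections.from_ultralytics(result) for result in results]
```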
This batched inference slicer does not write the detections to file. Also, a drawback of the initial InferenceSlicer is that it assumes the entire image can be read into memory. This may not be the case when dealing with large satellite images. A solution to this is windowed reading and writing; the rasterio package offers windowed reads and writes.
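For illustration, a minimal sketch of that windowed pattern with rasterio (the file names, the 1024-pixel tile size, and the inference step are placeholders):

```python
import rasterio
from rasterio.windows import Window

TILE = 1024  # illustrative window size

# Process a large GeoTIFF tile by tile, never loading the full raster
# into memory; results are written back to the matching output window.
with rasterio.open("large_satellite_image.tif") as src:
    profile = src.profile
    with rasterio.open("output.tif", "w", **profile) as dst:
        for row_off in range(0, src.height, TILE):
            for col_off in range(0, src.width, TILE):
                window = Window(
                    col_off,
                    row_off,
                    min(TILE, src.width - col_off),
                    min(TILE, src.height - row_off),
                )
                tile = src.read(window=window)  # shape: (bands, h, w)
                # ... run inference on `tile` here ...
                dst.write(tile, window=window)
```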
Hi @linas,
Thanks for your speedy response. What I meant was that the proposed batched version assumes that each batch contains independent samples (seemingly the same with sinks). I was mentioning the case whereby you read a patch from a large image and then write to the same window location in the output.
I look forward to the outcomes of your discussions :) I have struggled to find anything like this and have resorted to implementing my own version using rasterio windows.
Thanks,
Geethen
On Thu, Jun 20, 2024 at 3:20 PM Linas Kondrackis wrote:
Hi @Geethen,
You brought up very good points. Indeed, when dealing with very large images, this would hog all available memory. I'll loop this idea into our internal discussions.
As for saving the results, that's decoupled. Check out Sinks: https://supervision.roboflow.com/develop/how_to/save_detections/
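A minimal sketch of that sink pattern, assuming the sv.CSVSink API from the linked guide; the file name and the detections_per_slice iterable are illustrative:

```python
import supervision as sv

# Persist detections incrementally instead of accumulating them in memory.
# detections_per_slice is a hypothetical iterable of sv.Detections.
with sv.CSVSink("detections.csv") as sink:
    for i, detections in enumerate(detections_per_slice):
        sink.append(detections, custom_data={"slice_index": i})
```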
Description
Currently, sv.InferenceSlicer processes each slice in a separate callback call, hindering inference with a batch size larger than 1. We can change this by:
- adding a new batch_size parameter to the InferenceSlicer class;
- changing the callback signature from callback: Callable[[np.ndarray], Detections] to callback: Callable[[List[np.ndarray]], List[Detections]].
A sketch of the proposed interface is shown below.
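This is a minimal, hedged sketch assuming an Ultralytics model inside the callback; the weights file, image path, and batch_size value of 8 are illustrative, and PR #1108 carries the actual implementation:

```python
from typing import List

import cv2
import numpy as np
import supervision as sv
from ultralytics import YOLO

model = YOLO("yolov8n.pt")  # placeholder weights

def callback(image_batch: List[np.ndarray]) -> List[sv.Detections]:
    results = model(image_batch)  # one forward pass over the whole batch
    return [sv.Detections.from_ultralytics(r) for r in results]

slicer = sv.InferenceSlicer(
    callback=callback,
    slice_wh=(640, 640),  # existing parameter
    batch_size=8,         # the new parameter proposed here
)
detections = slicer(cv2.imread("image.jpg"))
```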
Additional