Search before asking
I have searched the Pytorch-Wildlife issues and found no similar bug report.
Description
I have a directory of about 17,000 camera trap images, averaging a handful of detections per image. When I try to run the batch MegaDetector on that directory from within a notebook, the machine runs out of memory (32 GB) about halfway through the batch.
If the high memory usage is unavoidable, a nice option would be the ability to run the detector on lists of images rather than directories; that way, large directories could be broken up more easily.
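As a workaround until such an option exists, the directory can be enumerated manually and fed to the detector in fixed-size chunks, flushing results to disk between chunks. This is only a sketch: `run_detector` below is a hypothetical stand-in for whatever batch-inference call you use, not an actual Pytorch-Wildlife API.

```python
from pathlib import Path

def chunked(paths, size):
    """Yield successive lists of at most `size` items from `paths`."""
    for i in range(0, len(paths), size):
        yield paths[i:i + size]

def process_directory(image_dir, size=1000, run_detector=None):
    """Run a detector over a large directory in memory-friendly chunks.

    `run_detector` is a hypothetical callable taking a list of image paths;
    substitute your actual batch-inference function.
    """
    exts = {".jpg", ".jpeg", ".png"}
    paths = sorted(p for p in Path(image_dir).rglob("*")
                   if p.suffix.lower() in exts)
    for batch in chunked(paths, size):
        results = run_detector(batch)
        # Persist `results` (e.g. append to a JSON/CSV file) here, then drop
        # the reference so memory is reclaimed before the next chunk.
        del results
```

With a chunk size of 1,000, a 17,000-image directory becomes 17 independent batches, so peak memory is bounded by one chunk's detections rather than the whole run.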
Thanks for everything!
Use case
Large directories of images to be detected
Hello @davidwhealey, thank you so much for reporting this. We have also seen this issue on our end and already have a solution for it. We are working on integrating it into the codebase and will give you an update as soon as the new inference function is released!
Hello @davidwhealey, we just pushed a new version with a fix for the batch detection memory issue. Could you try updating the package and see if that fixes your issue?