Does the library support batched inference, i.e. passing multiple images at once to take advantage of GPU parallelization and speed up inference over many images? I see references to a "pipe" being created, and I was wondering whether any parallels can be drawn between this library and neural nets, which are much more efficient when running batched inference. The example seems to process images sequentially, and I couldn't find any examples of batched inference. Any leads or help would be much appreciated, thanks!
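To illustrate what batched inference generally means (this is not this library's API, just a generic PyTorch sketch with a hypothetical model): instead of one forward pass per image, the images are stacked into a single `(N, C, H, W)` tensor so the GPU processes all of them in one pass.

```python
import torch

# Hypothetical stand-in model: any torch.nn.Module mapping
# (N, C, H, W) -> (N, num_outputs) would work the same way.
model = torch.nn.Sequential(
    torch.nn.Flatten(),
    torch.nn.Linear(3 * 32 * 32, 10),
)
model.eval()

images = [torch.rand(3, 32, 32) for _ in range(8)]

# Sequential inference: one forward pass per image (what the example does).
with torch.no_grad():
    seq_out = [model(img.unsqueeze(0)) for img in images]

# Batched inference: stack into one (8, 3, 32, 32) tensor, single forward pass.
with torch.no_grad():
    batch_out = model(torch.stack(images))

print(batch_out.shape)  # one output row per input image
```

The outputs are numerically equivalent; the batched call simply lets the GPU parallelize across the `N` dimension, which is where the speedup comes from.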