So far, Adlik serving requires the Adlik compiler to convert the model format and to generate the config file needed at inference time.
However, if users already have a runtime model they converted themselves, such as an OpenVINO IR or a TensorRT engine file, it is hard to run inference with Adlik serving because that config file is missing.
Therefore, Adlik serving should support running an original runtime model directly, without the Adlik compiler.
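For context, the gap is the metadata the compiler would normally emit alongside the converted model. A minimal sketch of the kind of information such a config carries is below; the field names and values are illustrative assumptions for this issue, not Adlik's actual schema:

```json
{
  "serving_type": "openvino",
  "model_name": "resnet50",
  "inputs": [
    { "name": "input", "dims": [1, 3, 224, 224], "data_type": "FP32" }
  ],
  "outputs": [
    { "name": "prob", "dims": [1, 1000], "data_type": "FP32" }
  ]
}
```

If serving could either infer this metadata from the runtime model itself (both OpenVINO IR and TensorRT engines expose input/output names and shapes) or accept a hand-written config like the one above, users could skip the compiler step entirely.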