paddle_infer::CreatePredictor(config) crashes #64032
Comments
Please provide the detailed reproduction environment and steps to help us locate and resolve the issue.
CUDA 11.6, cuDNN 8.4, VS2019, TensorRT not selected. I downloaded avx_mkl_cuda11.6_cudnn8.4_avx_mkl-trt8.4.1.5 directly. I have also built the Paddle source myself, with the same result. I will keep looking for the cause.
It crashes right at this point. I traced it: everything inside config is empty, the parameters are not being passed in, and config.SetModel has no effect.

Could you please share the specific code, to help our internal RD locate and resolve the problem?
I am using PaddleOCR-release-2.7\deploy\cpp_infer. When execution reaches config.EnableUseGpu, watching the values inside config shows essentially no change. I use the SetModelBuffer function, so I modified the code slightly:

```cpp
progFile.Read(pProgBuf, dwFileLen);  // read the contents of the program file into memory
config.SetModelBuffer(pProgBuf, dwFileLen, pParamsBuf, dwFileLen1);
```
Thanks for providing the Paddle source code; this problem should be solvable. I will trace it when I have time.
Please ask your question
It crashes right at this point. I traced it: everything inside config is empty and the parameters cannot be passed in. config.SetModel has no effect, and config.EnableUseGpu(this->gpu_mem_, this->gpu_id_); has no effect either. I am using avx_mkl_cuda11.6_cudnn8.4_avx_mkl-trt8.4.1.5. Someone said config.SetModelBuffer works; I tested it, and it does not work either.
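For comparison, this is the standard configuration sequence from the Paddle Inference C++ API. The file paths and GPU settings are placeholders, and this is only a sketch of the documented call order, not the exact PaddleOCR code:

```cpp
#include "paddle_inference_api.h"  // Paddle Inference C++ API

int main() {
    paddle_infer::Config config;

    // Point the config at the serialized model and its parameters.
    config.SetModel("inference.pdmodel", "inference.pdiparams");

    // Optionally enable GPU: initial memory pool size in MB, device id.
    config.EnableUseGpu(500, 0);

    // The model is actually loaded here, so missing files or a CUDA/cuDNN
    // version mismatch with the prebuilt library tends to surface at
    // CreatePredictor rather than at SetModel or EnableUseGpu.
    auto predictor = paddle_infer::CreatePredictor(config);
    return predictor != nullptr ? 0 : 1;
}
```

Note that `SetModel` and `EnableUseGpu` only record settings; whether they "took effect" is easier to verify with `config.model_program_path()` and `config.use_gpu()` than by watching the object's internals in the debugger, which can show misleading values when the application and the prebuilt library use different VS runtimes.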