
paddle_infer::CreatePredictor(config) crashes #64032

Closed
sbing2000 opened this issue May 4, 2024 · 5 comments
Labels
status/close (closed) · type/question (user question)

Comments

@sbing2000

Please ask your question

It crashes when execution reaches this point. I traced it and found that config is completely empty: the parameters cannot be passed in, config.SetModel has no effect, and config.EnableUseGpu(this->gpu_mem_, this->gpu_id_) has no effect either. I am using avx_mkl_cuda11.6_cudnn8.4_avx_mkl-trt8.4.1.5. Someone said config.SetModelBuffer would work; I tested it and it does not work either.

@xuxinyi389
Contributor

Please provide the detailed environment and the steps to reproduce, so we can locate and resolve the issue.

@sbing2000
Author

CUDA 11.6, cuDNN 8.4, VS2019, TensorRT not enabled; I downloaded avx_mkl_cuda11.6_cudnn8.4_avx_mkl-trt8.4.1.5 directly. I have also built Paddle from source myself, with the same result. I will keep looking into the cause.

@xuxinyi389
Contributor

"It crashes when execution reaches this point. I traced it and found that config is completely empty: the parameters cannot be passed in, and config.SetModel has no effect." Could you share the specific code to help our internal RD locate and fix the problem?

@sbing2000
Author

I am using PaddleOCR-release-2.7\deploy\cpp_infer. When execution reaches config.EnableUseGpu, watching config in the debugger shows essentially no change in its values. I am using the SetModelBuffer function, so I modified the code slightly:

LoadModel(const std::string &model_dir) {
  paddle_infer::Config config;
  // config.SetModel(model_dir + "\\inference.pdmodel",
  //                 model_dir + "\\inference.pdiparams");
  CString str1 = model_dir.c_str();
  CString progStr = str1 + "\\inference.pdmodel";    // backslash must be escaped
  CString paramsStr = str1 + "\\inference.pdiparams";

  // Read the contents of progStr into memory.
  CFile progFile;
  progFile.Open(progStr, CFile::modeRead);
  DWORD dwFileLen = progFile.GetLength();
  char *pProgBuf = new char[dwFileLen];
  progFile.Read(pProgBuf, dwFileLen);
  progFile.Close();

  // Read the contents of paramsStr into memory.
  CFile paramsFile;
  paramsFile.Open(paramsStr, CFile::modeRead);
  DWORD dwFileLen1 = paramsFile.GetLength();
  char *pParamsBuf = new char[dwFileLen1];
  paramsFile.Read(pParamsBuf, dwFileLen1);
  paramsFile.Close();

  // Note: the buffers must stay valid at least until the predictor
  // is created; they are never delete[]d in this snippet.
  config.SetModelBuffer(pProgBuf, dwFileLen, pParamsBuf, dwFileLen1);

  std::cout << "In PP-OCRv3, default rec_img_h is 48,"
            << "if you use other model, you should set the param rec_img_h=32"
            << std::endl;

  if (this->use_gpu_) {
    this->gpu_id_ = 1;
    this->gpu_mem_ = 1000;
    config.EnableUseGpu(this->gpu_mem_, this->gpu_id_);
  }
}
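For reference, the same buffer loading can be written with the standard library instead of MFC, which avoids the CString conversions and the manual new[]/delete[] management; a minimal sketch (read_file is my own helper, not part of Paddle or PaddleOCR):

```cpp
#include <fstream>
#include <sstream>
#include <stdexcept>
#include <string>

// Read a whole file into a std::string in binary mode, so the
// .pdmodel / .pdiparams bytes are not altered by text-mode
// newline translation on Windows.
std::string read_file(const std::string& path) {
    std::ifstream in(path, std::ios::binary);
    if (!in) throw std::runtime_error("cannot open " + path);
    std::ostringstream buf;
    buf << in.rdbuf();
    return buf.str();
}
```

The two returned strings then own their memory, and they only need to outlive the call sequence config.SetModelBuffer(prog.data(), prog.size(), params.data(), params.size()) followed by creating the predictor.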

@sbing2000
Author

Thanks for providing the Paddle source code. This problem can be solved; I will trace it further when I have time.

@paddle-bot paddle-bot bot added status/close (closed) and removed status/new-issue (new issue) labels May 13, 2024