
How to proactively release an already-loaded model #2393

Open
ThinkWD opened this issue Mar 4, 2024 · 7 comments
ThinkWD commented Mar 4, 2024

A beginner question: why do all the models only have an interface for loading, and none for releasing?

If a release interface does exist, please point me to it. Thanks.


ThinkWD commented Mar 5, 2024

I tried using a smart pointer, but after the object is destructed, the memory/VRAM it occupied is not released (Jetson, TensorRT backend).

// load
auto det_model = std::make_unique<detection::PPYOLOE>(model, params, config, option);
if (!det_model->Initialized())
    std::exit(EXIT_FAILURE);  // std::exit requires a status argument (<cstdlib>)

// free
det_model.reset();

Could the maintainers please take a look? @jiangjiajun @rainyfly


ThinkWD commented Mar 8, 2024

Another day of waiting for a reply...


ThinkWD commented Mar 11, 2024

Another day of waiting for a reply...


ThinkWD commented Mar 25, 2024

Try calling destroy() on the context_ and engine_ inside https://github.com/PaddlePaddle/FastDeploy/blob/develop/fastdeploy/runtime/backends/tensorrt/trt_backend.cc

Added to fastdeploy/runtime/backends/tensorrt/trt_backend.h:

~TrtBackend() {
  if (parser_) {
    parser_.reset();
  }
  if (context_) {
    context_.reset();
    engine_.reset();
    builder_.reset();
    network_.reset();
  }
}

Added to fastdeploy/runtime/runtime.h:

  ~Runtime() {
    if (backend_)
      backend_.reset();
  }

Added to fastdeploy/fastdeploy_model.h:

  virtual void Release() {
    if (runtime_)
      runtime_.reset();
  }

In my tests, none of these released the VRAM.

Is my modification incorrect? Are there any other suggestions that might help? @rainyfly


ThinkWD commented Mar 28, 2024

Still waiting for a reply...


ThinkWD commented Apr 3, 2024

......
