OOM when running with valgrind #101

Closed
Jimmy316 opened this issue May 6, 2019 · 11 comments

Comments

Jimmy316 commented May 6, 2019

I tried to run valgrind on the detection algorithm. I see the memory consumption go up continuously; it reaches 15 GB and kills my workstation. Has anyone encountered similar issues?

Jimmy316 commented May 6, 2019

This is the command I am running
valgrind --leak-check=yes --track-origins=yes --log-file=cctag.out --leak-check=full --show-leak-kinds=all ./detection -n 3 -i sample01.png
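To see where the memory actually goes (rather than checking for leaks), valgrind's massif heap profiler may be more useful; memcheck with --track-origins also adds a large memory overhead of its own. A minimal sketch, reusing the detection arguments from the command above:

valgrind --tool=massif --massif-out-file=cctag.massif ./detection -n 3 -i sample01.png
ms_print cctag.massif   # print the heap snapshots recorded by massif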

Jimmy316 commented May 6, 2019

I am running within an Ubuntu 16.04 Docker container.

simogasp commented May 6, 2019

Just to know, are you using the CUDA version or the CPU version? Do you see the same behaviour if you run it with --use-cuda?

Jimmy316 commented May 6, 2019

I am using the standalone version (i.e. the non-CUDA version); I didn't build the library with CUDA support. I can give it a shot next.
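For reference, a sketch of how the CUDA-enabled library would typically be built with CMake; the option name CCTAG_WITH_CUDA is an assumption and should be checked against the project's CMakeLists.txt:

cmake -DCCTAG_WITH_CUDA=ON ..   # assumed option name, verify before use
make -j$(nproc)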

simogasp commented May 7, 2019

I was asking because the CUDA version does a lot of dynamic allocations, but that should not be the case for the CPU version.
Can you post the log or the result of valgrind?

Jimmy316 commented May 7, 2019

Hi

I got onto a server that has 128 GB of memory and ran the test. I see the total memory consumption go up to 25 GB (that explains why it didn't run on my laptop).

Please find the valgrind output attached:
valgrind_cctag.log
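For comparison with the valgrind numbers, the native (non-instrumented) peak memory can be measured with GNU time, since memcheck's shadow memory and --track-origins state inflate the footprint considerably. A minimal sketch, reusing the sample invocation from this thread:

/usr/bin/time -v ./detection -n 3 -i sample01.png   # -v reports "Maximum resident set size (kbytes)"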

@github-actions

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

@github-actions github-actions bot added the stale label May 27, 2021
@simogasp simogasp removed the stale label May 27, 2021
@github-actions

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

@github-actions github-actions bot added the stale label May 23, 2022
@simogasp simogasp removed the stale label May 23, 2022
@github-actions

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

@github-actions github-actions bot added the stale label May 19, 2023
@simogasp simogasp added type:bug and removed stale labels May 19, 2023
@github-actions

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
