Endurance Test! #23
I tested with a set of 1483 files on (32 GB, 4 CPUs) and used
I ran another test of 1483 files on a podman VM with different memory configurations. The results are below.
@shahrokhDaijavad, @shivdeep-singh-ibm This is an important piece of info not only for us but also for potential users. Can we please:
I agree, @blublinsky. I will create a Markdown file called memorytest under doc with this information and link to it from the mac.md file.
@shahrokhDaijavad Great. Should it be memory or endurance?
@blublinsky It's a combination of testing for a memory leak (usage peaks and flattens around 4 GB, i.e., no leak) and endurance testing, which shows that with smaller memory (4 GB and 6 GB total), it is still possible to process 500 or 900 files successfully before a crash. I will explain this in the readme.
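For reference, one way to make the "flattens" judgment programmatic rather than visual is to check whether the tail of a series of sampled RSS values stays within a small band of its own mean. This is only an illustrative sketch, not part of the framework; the function name and thresholds are made up:

```python
def looks_flat(samples: list[tuple[float, int]],
               tail_fraction: float = 0.5,
               tolerance: float = 0.05) -> bool:
    """Heuristic plateau check on (elapsed_seconds, rss_bytes) samples.

    Takes the last `tail_fraction` of the samples and reports True if
    every RSS value in that tail stays within `tolerance` (here 5%)
    of the tail's mean, i.e. memory usage has stopped growing.
    """
    tail = [rss for _, rss in samples[-max(1, int(len(samples) * tail_fraction)):]]
    mean = sum(tail) / len(tail)
    return all(abs(rss - mean) / mean <= tolerance for rss in tail)
```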
As we discussed in issue #388 in the internal repo, we want to test whether the framework has a memory leak. To do this, we use the noop transform and a large dataset (like test set 3, which has 1500 zip files): 1) run ingest2parquet on it first, and 2) run the noop transform while monitoring the laptop's memory usage to see whether it reaches a flat plateau.
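A minimal way to capture that memory curve, assuming the transform runs in a separate process whose PID you know, is to sample its resident set size with psutil. The helper name, sampling interval, and time limit below are illustrative assumptions, not project code:

```python
import time

import psutil  # third-party: pip install psutil


def sample_rss(pid: int, interval_s: float = 5.0,
               duration_s: float = 3600.0) -> list[tuple[float, int]]:
    """Record (elapsed_seconds, rss_bytes) for process `pid` every
    `interval_s` seconds until the process exits or `duration_s` elapses."""
    proc = psutil.Process(pid)
    samples: list[tuple[float, int]] = []
    start = time.monotonic()
    while time.monotonic() - start < duration_s:
        if not proc.is_running():
            break
        samples.append((time.monotonic() - start, proc.memory_info().rss))
        time.sleep(interval_s)
    return samples
```

Feeding these samples to a plateau check like the sketch above (or just plotting them) makes the leak/no-leak call reproducible instead of eyeballed.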