Issue description
When run on a machine with limited resources (e.g. a GitHub Codespace with 2 vCPUs and 4 GB of RAM), tests/test_cyto_utils/test_DeepProfiler_processing.py fails without an informative error message.
Expected behavior
To enable local debugging on cloud development instances, these tests should be able to run to completion even on machines with smaller resource allocations.
Additional information
A smaller DeepProfiler test dataset would also enable much faster GitHub Actions pipeline runtimes, as this is by far the longest-running test in the pytest suite.
@michaelbornholdt @roshankern I am currently working on refactoring test_DeepProfiler_processing.py, and it is challenging to debug in my usual development environment because of the long runtime and failures due to insufficient memory. Would one of you be able to help me create a minimal set of data files or dataframes that can be used to reduce the runtime of this set of tests?
I think a set of test data similar to the ones here would also make it easier to debug potential future bugs, since you wouldn't need to dig through output files that are quite so large.
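As a starting point for that discussion, a minimal stand-in dataset could be built in-memory rather than shipped as large files. The sketch below is purely illustrative and assumes nothing about the real DeepProfiler test fixtures: the column names (`efficientnet_*`, `Metadata_Plate`, `Metadata_Well`), the plate/well values, and the median aggregation are all hypothetical placeholders for whatever the actual tests exercise.

```python
import numpy as np
import pandas as pd

# Hypothetical minimal DeepProfiler-style feature table: a handful of
# single cells across two wells, small enough to run in seconds on a
# 2-vCPU / 4 GB machine. Column names are illustrative assumptions.
rng = np.random.default_rng(0)
n_cells, n_features = 10, 5

df = pd.DataFrame(
    rng.normal(size=(n_cells, n_features)),
    columns=[f"efficientnet_{i}" for i in range(n_features)],
)
df.insert(0, "Metadata_Plate", "plate_1")
df.insert(1, "Metadata_Well", ["A01"] * 5 + ["A02"] * 5)

# Aggregate per well (median here, as a stand-in for whatever
# aggregation the processing utilities under test perform).
agg = (
    df.groupby(["Metadata_Plate", "Metadata_Well"])
    .median()
    .reset_index()
)
print(agg.shape)  # (2, 7): two wells, two metadata columns + five features
```

A fixture like this could be parameterized (number of cells, wells, features) so individual tests only generate as much data as they actually need.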