A recent change in test_perf (now worked around, but present in d7617d3, for example) disabled the gc when doing multiple runs on the same vb (just in case). Through usage, this appears to cause memory exhaustion for large N, which is an indication that cyclic refs are being created somewhere. Invoking gc.collect() clears the memory (so it's not the malloc.trim issue from #2659).
It might be another dependency, not pandas, that's causing this, but I need to check.
Maybe a reference is kept around in the test? (There IS copying of the blocks, so if a ref is held, memory will increase.)

Repro:
import gc
from time import sleep

import pandas as pd
import numpy as np

df = pd.DataFrame(np.random.randn(10000, 25))
df['foo'] = 'bar'
df['bar'] = 'baz'
df = df._consolidate()  # consolidate() was public API at the time

for i in range(1000):
    print(i)
    df._get_numeric_data()
    sleep(0.1)
    # this is optional; collecting periodically frees the leaked memory
    if i % 100 == 0:
        gc.collect()
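As a quick diagnostic for the "cyclic refs are being created" suspicion: `gc.collect()` returns the number of unreachable objects it found, so a nonzero count right after a call is direct evidence of cycles. A minimal sketch, not pandas-specific (`count_cycles` is a hypothetical helper, and `make_cycle` is a deliberately cyclic stand-in):

```python
import gc

def count_cycles(fn):
    """Run fn and return how many unreachable (cyclic) objects it left behind."""
    gc.collect()          # start from a clean slate
    gc.disable()          # nothing gets collected mid-run
    try:
        fn()
    finally:
        n = gc.collect()  # number of objects only the cycle collector could free
        gc.enable()
    return n

def make_cycle():
    a = []
    a.append(a)           # self-referential list: refcounting alone can't free it

print(count_cycles(make_cycle))  # nonzero: a cycle was created
```

Wrapping the `df._get_numeric_data()` call in something like this would show whether the cycles come from that call itself or from the surrounding harness.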
This was a long time ago and I don't remember the details.
If you reset to the commit mentioned and run the test, doesn't the memory grow without bound?
I just came across the comments in #4491 on cycles and remembered I opened an issue
on it and so dropped in the xref. afaict, those comments are still valid and there are memory
cycles hiding somewhere.
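For what it's worth, here is a self-contained way to check whether a specific object is trapped in a cycle (a sketch; the dummy `Node` class stands in for whatever pandas internals might be involved): take a weakref, drop the last strong reference, and see whether only the cycle collector can reclaim it.

```python
import gc
import weakref

class Node:
    pass

# Object with a deliberate self-reference.
obj = Node()
obj.me = obj
ref = weakref.ref(obj)

del obj
print(ref() is not None)  # True: refcounting alone could not free it
gc.collect()
print(ref() is None)      # True: only the cycle collector reclaimed it
```

If a weakref to the DataFrame (or to one of its blocks) stays alive after `del` but dies after `gc.collect()`, that pins the leak on a cycle rather than on the allocator.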
The problem here may be with the test, though, as you say.
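A minimal sketch of the "ref held in the test" hypothesis (`run_once` and the sizes here are made up, not the actual test_perf code): if the harness keeps each run's result alive, usage grows linearly even with no cycles at all.

```python
import tracemalloc

# Hypothetical stand-in for one benchmark run that copies blocks.
def run_once(n=50_000):
    return list(range(n))

def traced(keep):
    """Return bytes still allocated after 10 runs."""
    tracemalloc.start()
    held = []
    for _ in range(10):
        result = run_once()
        if keep:
            held.append(result)  # a harness that keeps every result
    current, _peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return current

leaky, clean = traced(keep=True), traced(keep=False)
print(leaky > clean)  # True: holding refs alone makes usage grow
```

That pattern would explain unbounded growth without implicating pandas, and `gc.collect()` would not help in that case, so the fact that it does clear the memory here still points at cycles.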
See also comments in #4491