I have a 2-hour, 426 GB PsiStore. As part of my post-study process, I wanted to remove a stream and save the store to another location for deep storage. The code is basically as follows:
```csharp
var outputStorePath = Path.Combine(@"D:\Hallway-Recording-Clean", datasetID, sessionID);
PsiStore.Copy((partition.StoreName, partition.StorePath), (partition.StoreName, outputStorePath), null, s => s.Name.Contains("ir"), false);
```
When I run the code, the memory usage slowly climbs until my system runs out of memory (I have 64 GB of RAM) and the application crashes. I think I could just change the pipeline to replay in real time, which should work, but that means waiting 2 hours for it to finish.
Is there any way to make the system run as fast as possible without reading the store so quickly that it runs out of memory? Thanks!
I'm looking into this. Unlimited delivery policies are indeed the likely culprit, with ever-growing queues. Setting to SynchronousOrThrottle in CopyStream, as you did, is a good idea to try. Additionally, could you try setting in Exporter.cs:332: source.PipeTo(connector, DeliveryPolicy.SynchronousOrThrottle);
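For context, the throttling idea above can also be sketched as a hand-rolled copy pipeline, without patching the library. This is a minimal sketch, not the actual fix: the store/stream names and the `byte[]` stream type are placeholders, and depending on the \psi version the exporter's internal connector may still use an unlimited policy internally (which is exactly the `Exporter.cs` change discussed above).

```csharp
using Microsoft.Psi;

// Hypothetical sketch: copying a single stream by hand with a throttled
// delivery policy so the store reader cannot outrun the writer.
// "MyStore" / "Audio" and the byte[] type are illustrative placeholders.
using (var pipeline = Pipeline.Create())
{
    var input = PsiStore.Open(pipeline, "MyStore", @"D:\Hallway-Recording");
    var output = PsiStore.Create(pipeline, "MyStore", @"D:\Hallway-Recording-Clean");

    // SynchronousOrThrottle applies back pressure instead of letting
    // queues grow without bound.
    var audio = input.OpenStream<byte[]>("Audio");
    audio.Write("Audio", output, deliveryPolicy: DeliveryPolicy.SynchronousOrThrottle);

    // Replay as fast as possible; the throttled policy keeps memory bounded.
    pipeline.Run(ReplayDescriptor.ReplayAll);
}
```

The trade-off is the one noted in this thread: throttling slows the upstream reader to the writer's pace, which bounds memory but sacrifices some of the speed of an unconstrained replay.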
Hey Ashley, I actually just got time to work on this. You are right, changing source.PipeTo(connector, DeliveryPolicy.SynchronousOrThrottle) in Exporter.cs:227 worked. I could push a small pull request with this fix, but I'm worried there might be a lot of unintended side effects.
Hey Zhi, glad that works for you. You're right that the effects of throttling and back pressure may not be appropriate for all (most?) scenarios. I believe we'll at least make this configurable, though. Thanks much for reporting!
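One shape the configurability mentioned above could take (purely hypothetical, not the actual \psi API) is an optional delivery-policy parameter threaded through the copy path, defaulting to the current unthrottled behavior so existing callers are unaffected:

```csharp
// Hypothetical sketch of a configurable delivery policy on the copy path;
// the extra deliveryPolicy parameter is illustrative, not the real signature.
public static void Copy(
    (string Name, string Path) input,
    (string Name, string Path) output,
    TimeInterval cropInterval = null,
    Predicate<IStreamMetadata> includeStreamPredicate = null,
    bool createSubdirectory = false,
    DeliveryPolicy deliveryPolicy = null)
{
    // Internally, the exporter connection would use the supplied policy,
    // e.g.:
    //   source.PipeTo(connector, deliveryPolicy ?? DeliveryPolicy.Unlimited);
    // so callers copying huge stores could opt in to
    // DeliveryPolicy.SynchronousOrThrottle without changing the default.
}
```

This keeps the fast, unthrottled path as the default while letting memory-constrained scenarios like the 426 GB copy above opt in to back pressure.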