There is no such limit, but you might run into problems with less than 16 GB of RAM when you have a lot of neural frames. Another possibility is that all your frames are in a single file (we don't have incremental TIFF loading in the MATLAB version yet, but we do have it in the Python version).
I do have sufficient RAM, but all my frames are in a single file. I suppose the best way to fix this for now in MATLAB is to separate the single file into a couple of smaller ones and save them in the same folder?
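If you end up splitting the file outside of MATLAB, a minimal sketch of the chunking logic is below. This is not part of suite2p; the frame counts and the 60000-frames-per-file choice are just illustrative assumptions to stay under the ~64000-frame limit discussed here. Writing the actual TIFF parts could then be done with any multi-page TIFF library (e.g. tifffile in Python), reading and saving one range at a time.

```python
def frame_chunks(n_frames, max_per_file):
    """Yield (start, stop) frame ranges, each covering at most
    max_per_file frames, so every output file stays under the limit."""
    for start in range(0, n_frames, max_per_file):
        yield (start, min(start + max_per_file, n_frames))

# Hypothetical example: a 150000-frame recording split into parts
# of at most 60000 frames each (below the reported 64000 limit).
parts = list(frame_chunks(150000, 60000))
# → [(0, 60000), (60000, 120000), (120000, 150000)]
```

Each `(start, stop)` pair can then drive a loop that reads those pages from the big TIFF and writes them to `file_part0.tif`, `file_part1.tif`, etc., in the same folder.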
Hi @annie2013, I had the same problem recently (the TIFFs were apparently getting too big; suite2p didn't emit a warning, and I realized much later that files had been truncated during processing). Splitting the big TIFF into multiple small parts helped, as you suggested.
I am currently running suite2p in MATLAB. There seems to be a limit that caps processing at 64000 frames. Is there a way to increase that?