
Out-of-memory error when analyzing large SEM images #16

Open
ahnydd opened this issue Jan 11, 2022 · 0 comments

Hi Justin Blaber,

I am Dongdi Yin from Southwest Jiaotong University. We are working on HRDIC based on SEM. We have obtained very nice strain maps using Ncorr for smaller SEM images (up to 6749x5229 pixels). However, when we tried to analyze larger SEM images, such as 8818x6871 pixels, an error occurred that appears to be an out-of-memory error. A screenshot of the error message is attached below.
We are using MATLAB R2021b, with 20 cores set for OpenMP multithreading.
The computer we used: Intel® Xeon® Gold 6230R processor (35.75 MB cache, 2.10 GHz, 26 cores) with 192 GB DDR4 memory.
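For scale, MATLAB stores images converted to its default double type at 8 bytes per pixel, so even a single full-resolution copy of the 8818x6871 image is sizable, and a DIC analysis typically holds several derived arrays (gradients, interpolation coefficients) per image. A rough back-of-envelope sketch in Python — the 8-bytes-per-pixel figure is MATLAB's double type, while the number of copies Ncorr keeps internally is only a guess for illustration:

```python
# Back-of-envelope memory estimate for one double-precision image copy.
# Dimensions are taken from the issue report; 8 bytes/pixel assumes
# MATLAB's default double type. n_copies is a hypothetical multiplier
# for derived arrays (gradients, spline coefficients), not a measured
# figure from Ncorr's source.
width, height = 8818, 6871
bytes_per_double = 8

one_copy = width * height * bytes_per_double
n_copies = 10  # hypothetical: several derived arrays per image

print(f"One double copy: {one_copy / 2**20:.0f} MiB")
print(f"Estimate with {n_copies} derived arrays: {n_copies * one_copy / 2**30:.1f} GiB")
```

Even under generous assumptions this is well within 192 GB of RAM, which suggests the failure may come from a single allocation limit (e.g. a 32-bit index or size variable in the MEX/OpenMP code) rather than total system memory.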

Many thanks, and Happy New Year!
Looking forward to hearing from you.
Kind regards,
Dongdi Yin

[Screenshot of error message: error-20220110183910]
