When trying to run the `SNODASDaily_Interactive.py` file over a year-and-a-half range of dates, a critical error occurs: "Could not create AF_NETLINK socket (Too many open files)". This appears to be related to the limit on how many file descriptors a process is allowed to have open at one time, which can be checked by running the `ulimit -n` command. The error occurs roughly 308 days, or 44 weeks, after the starting date.
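For reference, a quick way to inspect (and, for the current shell session, raise) the file-descriptor limit mentioned above; the value `4096` here is just an example:

```shell
# Show the current soft limit on open file descriptors for this shell.
ulimit -n

# Show the hard limit (the ceiling the soft limit can be raised to
# without root privileges).
ulimit -Hn

# Raise the soft limit for this shell session only (example value).
ulimit -n 4096
```

Note that `ulimit` changes apply only to the current shell and its children; a permanent change would normally go through `/etc/security/limits.conf` or the service manager.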
The default limit is usually 1024, though it can be increased to a much higher number if desired. For SNODAS Tools purposes, the range of dates a user can query will be limited to one year at a time to resolve this issue. I will create a bash script that runs each year separately for the entire dataset, starting at 2003 and ending at the present day.
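A minimal sketch of such a year-by-year wrapper script. The `--start`/`--end` arguments are hypothetical; substitute whatever date interface `SNODASDaily_Interactive.py` actually exposes:

```shell
#!/bin/bash
# Process the SNODAS archive one calendar year at a time, so that no
# single run accumulates enough open file descriptors to hit the limit.

START_YEAR=2003
END_YEAR=$(date +%Y)   # run through the current year

for year in $(seq "$START_YEAR" "$END_YEAR"); do
    echo "Processing SNODAS data for ${year}..."
    # Hypothetical invocation; adjust to the script's real arguments.
    python SNODASDaily_Interactive.py --start "${year}-01-01" --end "${year}-12-31"
done
```

Because each year runs in a fresh Python process, any descriptors left open by one year's run are released by the OS before the next year begins.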
This code worked before (and still works) on smaller datasets, so I can't say for certain whether the issue lies in the Python conversion code I'm changing right now or whether it was always present in the code; I'll keep an eye out for anything that looks off/wrong as I keep working.
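One way to tell whether descriptors are actually leaking while the script runs is to watch the process's open-descriptor count over time (Linux-specific, via `/proc`); the `pgrep` pattern below is an assumption about the process name:

```shell
# Count open file descriptors for a running process.
# Here $$ (this shell) is used just for illustration; for the real run,
# find the PID with something like: pgrep -f SNODASDaily_Interactive.py
PID=$$
ls "/proc/${PID}/fd" | wc -l
```

If that count climbs steadily toward the `ulimit -n` value during a long run, files or sockets are being opened without being closed.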