Value Error when using bilinear interpolation method #197
I just wanted to chime in and say that this could be very helpful for the broader community. Since I found the workaround at some point in the past, I have interacted with many folks who experienced this issue. Also many thanks for raising this issue @jdldeauna!
Hi @jdldeauna, this problem is caused by the data having been published on a grid which still contains the grid halo (rows or columns of duplicated grid cells), which is required during the model run. The halo cells are often not properly defined: they collapse to lines or points and share only the cell centers with their originals, which then leads to this `ValueError`. This problem exists for quite a few CMIP ocean models. What you can usually do (besides using `ignore_degenerate=True`) is remove or mask the halo before regridding, for example with CDO (Climate Data Operators) or with xarray. Here is an xarray example for MPI-ESM1-2-LR/HR (see also #109 or this notebook):

```python
import xarray as xr

# Define paths to the data
path_mpiesmlr = "./tos_Omon_MPI-ESM1-2-LR_historical_r1i1p1f1_gn_201001-201412.nc"
path_mpiesmhr = "./tos_Omon_MPI-ESM1-2-HR_historical_r1i1p1f1_gn_201001-201412.nc"

# Read the data with halo
ds_mpiesmlr_halo = xr.open_dataset(path_mpiesmlr).isel(time=0)
ds_mpiesmhr_halo = xr.open_dataset(path_mpiesmhr).isel(time=0)

# Read the data without halo
# (LR: omit first and last column; HR: omit first and last column and first 2 rows)
ds_mpiesmlr = xr.open_dataset(path_mpiesmlr).isel(time=0).isel(i=slice(1, 255))
ds_mpiesmhr = xr.open_dataset(path_mpiesmhr).isel(time=0).isel(i=slice(1, 801), j=slice(2, 404))
```
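As a toy illustration of what this slicing does (synthetic coordinates, not real CMIP output), a halo column is simply a duplicated column of the coordinate arrays, and the slice drops it:

```python
import numpy as np

# Toy 3x5 grid whose last longitude column duplicates the first (a halo column)
lon = np.array([[0.0, 90.0, 180.0, 270.0, 0.0]] * 3)

# The first and last columns are identical...
print(np.array_equal(lon[:, 0], lon[:, -1]))  # True

# ...so trimming with a slice (as with .isel above) removes the duplicate
lon_no_halo = lon[:, :-1]
print(np.array_equal(lon_no_halo[:, 0], lon_no_halo[:, -1]))  # False
```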
Thank you for the comprehensive reply @sol1105! Maybe the error message could be updated to point users to this workaround?
This is a really helpful explanation @sol1105. I was not actually aware of this.
@jbusecke Please do submit a PR for this, we should be able to give it a quick review.
Happy to work on this next week. Could you assign me to this? Thx
There is no way to infer that information from the metadata. To see whether duplicate cells are present, one would have to look into the cell coordinates, for example (from #109):

```python
import numpy as np

# 2D latitude and longitude arrays - create an array of (lat, lon) tuples
latlon_halo = np.array(
    list(zip(ds["latitude"].values.ravel(), ds["longitude"].values.ravel())),
    dtype=("double,double"),
).reshape(ds["longitude"].values.shape)

# Use numpy.unique to identify unique columns
latlon_no_halo, indices = np.unique(latlon_halo, axis=1, return_index=True)
```

Alternatively, one could remap an array of ones with the generated weights and check whether any values deviate from one.
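A runnable variant of this duplicate-column check on a small synthetic grid (stacking lat/lon instead of building structured tuples; the coordinate values are toy assumptions for illustration):

```python
import numpy as np

# Toy 3x5 lat/lon grid with a duplicated (halo) last column
lat = np.tile(np.array([[-60.0], [0.0], [60.0]]), (1, 5))
lon = np.array([[0.0, 90.0, 180.0, 270.0, 0.0]] * 3)

# Stack lat and lon so each grid column becomes a slice along the last axis
latlon = np.stack([lat, lon])  # shape (2, ny, nx)

# Unique columns along the x axis: 4 instead of 5 reveals the halo column
unique_cols, indices = np.unique(latlon, axis=2, return_index=True)
print(latlon.shape[2], unique_cols.shape[2])  # 5 4
```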
So far I have not found a way to perform this trimming automatically. The problem is that the halo cells are often improperly defined (with collapsing or wrong bounds), so one has to select "the right" cells to remove or mask. So the best solution would probably be to notify users about duplicated cells in their array and their options: either use adaptive masking, or manually remove/mask the cells.
Thanks for that explanation @sol1105!
Hi! I was using the `bilinear` method and kept encountering a `ValueError`. Fortunately @jbusecke guided me to a solution. I was wondering if the error message could be edited to reflect that adding the kwarg `ignore_degenerate=True` resolves it? It's difficult to work out how to resolve the issue from the existing `ValueError` message. Thanks!
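For intuition on what `ignore_degenerate=True` skips: a degenerate cell is one whose corner coordinates collapse onto a line or point, giving it zero area. A minimal sketch with toy coordinates (the shoelace formula here is just an illustration, not the library's internal check):

```python
import numpy as np

def shoelace_area(x, y):
    """Signed area of a polygon from its corner coordinates (shoelace formula)."""
    return 0.5 * np.sum(x * np.roll(y, -1) - np.roll(x, -1) * y)

# A normal quadrilateral grid cell: nonzero area
normal_x = np.array([0.0, 1.0, 1.0, 0.0])
normal_y = np.array([0.0, 0.0, 1.0, 1.0])

# A degenerate halo cell: all four corners collapse onto a line -> zero area
degen_x = np.array([0.0, 1.0, 1.0, 0.0])
degen_y = np.array([0.0, 0.0, 0.0, 0.0])

print(abs(shoelace_area(normal_x, normal_y)))  # 1.0
print(abs(shoelace_area(degen_x, degen_y)))    # 0.0
```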