jgrss/store time series #315
Conversation
I am going to try to take a look at this.

DF = gpd.GeoDataFrame(geometry=[shape(geojson)], crs=4326)
This should be updated to shape(search_geojson), I think.
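For context, ``shape`` converts a GeoJSON-like mapping into a shapely geometry, which can then seed the GeoDataFrame. A minimal sketch (the ``search_geojson`` polygon below is a made-up example, not the PR's actual search geometry):

```python
# Minimal sketch: shapely's `shape` turns a GeoJSON-like mapping into a
# geometry object. The polygon here is an illustrative unit square.
from shapely.geometry import shape

search_geojson = {
    "type": "Polygon",
    "coordinates": [[[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0], [0.0, 0.0]]],
}
geom = shape(search_geojson)
print(geom.area)  # 1.0
```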
@@ -28,15 +35,47 @@
    ],
}

EPSG = 8857
It's failing without this line.
The client save seems to be working well. It looks like you inadvertently removed some lines, which breaks the STAC tests. I added comments to address them.
)

if hasattr(self.data, "_FillValue"):
    nodata = self.data.attrs["_FillValue"]
This is no longer a user argument.
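To illustrate why the ``hasattr`` check works here: xarray falls back to the ``attrs`` mapping when normal attribute lookup fails, so ``hasattr(data, "_FillValue")`` succeeds whenever the key is in ``attrs``. A plain-Python stand-in (``FakeDataArray`` is a hypothetical class for illustration, not part of geowombat or xarray):

```python
# Stand-in for an xarray.DataArray demonstrating attribute forwarding to
# `attrs`: `hasattr` succeeds exactly when the key is present in the mapping.
class FakeDataArray:
    def __init__(self, attrs):
        self.attrs = attrs

    def __getattr__(self, name):
        # Invoked only when normal attribute lookup fails.
        try:
            return self.attrs[name]
        except KeyError:
            raise AttributeError(name)

data = FakeDataArray({"_FillValue": -9999})
if hasattr(data, "_FillValue"):
    nodata = data.attrs["_FillValue"]
print(nodata)  # -9999
```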
if self.scatter is None:
    band_count = self.data.gw.nbands
else:
    if self.scatter == 'band':
File scattering is new.
What is file scattering?
    memory_limit="1GB",
) as cluster:
    with Client(cluster) as client:
        time_mean.gw.save(
Here is the test using a client.
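A self-contained sketch of the cluster/client pattern the test uses (assumes ``dask.distributed`` is installed; the geowombat ``save`` call is replaced here by a trivial submitted task so the snippet runs on its own):

```python
# Sketch of the LocalCluster/Client pattern: start an in-process cluster,
# attach a client, and run a task. The real test calls `time_mean.gw.save`
# inside the inner `with` block instead of `client.submit`.
from dask.distributed import Client, LocalCluster

with LocalCluster(
    processes=False,
    n_workers=1,
    threads_per_worker=1,
    memory_limit="1GB",
) as cluster:
    with Client(cluster) as client:
        result = client.submit(sum, [1, 2, 3]).result()
print(result)  # 6
```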
client (Optional[Client object]): A ``dask.distributed.Client`` client object to persist data.
    Default is None.
compute (Optional[bool]): Whether to compute and write to ``filename``. If ``True``, compute
    and write to ``filename``. If ``False``, return the ``dask`` task graph instead.
    Default is ``True``.
tags (Optional[dict]): Metadata tags to write to file. Default is None.
compress (Optional[str]): The file compression type. Default is 'none', or no compression.

.. note::
def test_client_save(self):

    with LocalCluster(
        processes=False,
Necessary to write to compressed file.
    filename=out_path,
    overwrite=True,
    tags={"TEST_METADATA": "TEST_VALUE"},
    compress="lzw",
Compression should be okay.
I merged this into #319.
What is this PR changing?
save()