In my project we have a couple of raster layers, a locked vector layer (a GeoPackage set to Directly access data source), and a field collection vector layer (also a GeoPackage).
When updating anything in the field collection layer and pushing the updates to the cloud, the Delta Apply job uploads (or attempts to upload) the raster layers and the locked vector layer as well.
Looking at the Python SDK and the logs, it seems that:
- The raster layers have the exact same checksum but are still being uploaded. Based on the SDK code, this would imply that their name attributes in the local and remote projects are not identical, which is odd.
- The locked vector layer has a different checksum (though the exact same size), even though I can confirm it has not been changed.

Since our locked vector layer is quite large, it can also exceed the 10-minute processing time and result in a failed job.
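For illustration, here is a minimal sketch of the kind of delta decision described above: a file should only be re-uploaded when its checksum or name differs between the local and remote project. This is a hypothetical reconstruction, not the actual SDK code; the `needs_upload` function and the `dict` shape of the file entries are assumptions.

```python
import hashlib
from pathlib import Path


def sha256sum(path: Path) -> str:
    """Compute the SHA-256 checksum of a file, streamed to handle large layers."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()


def needs_upload(local: dict, remote: dict) -> bool:
    """Re-upload only if the file is new on the remote, or its checksum
    or name differs -- mirroring the behavior inferred from the SDK logs."""
    if remote is None:
        return True
    return (
        local["checksum"] != remote["checksum"]
        or local["name"] != remote["name"]
    )
```

Under this logic, an unchanged raster layer being uploaded despite an identical checksum would point at a `name` mismatch, which is the oddity observed above.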
Reproduction steps
Steps to reproduce the behavior:
1. Create a new project with raster layers and vector layers.
2. Set the raster layers and all vector layers except the data layer to Directly access data source.
3. Set the data vector layer to Offline editing.
4. Modify the data vector layer and inspect the logs output.
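To confirm that the locked layer really is unchanged across a sync, one can record its checksum before pushing and compare afterwards. A small helper, assuming a hypothetical filename for the locked GeoPackage:

```python
import hashlib


def file_checksum(path: str) -> str:
    """SHA-256 of a file, streamed so large GeoPackages don't exhaust memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()


# Hypothetical usage around a sync:
# before = file_checksum("locked_layer.gpkg")
# ... push changes to the cloud ...
# after = file_checksum("locked_layer.gpkg")
# assert before == after, "locked layer was modified by the sync"
```

If the checksums match locally but the job still reports a different checksum, the discrepancy would lie on the server side rather than in the file itself.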
Expected behavior
I'd expect only the modified layer to be uploaded.
Observed behavior
It seems that all raster and vector layers are, at the very least, reported as uploading.
Screenshots and GIFs
A bit redacted, but the color highlights match the filenames.
Thanks for the nice and detailed report! Highly appreciated.
We are aware of this issue and discussed this morning that we should prioritize it in our development plan for the next few weeks. We will keep you updated here in this issue.
Are there any updates on this front? I believe some of our users cannot sync the project because of this.
Specifically, the Package job fails when it attempts to upload larger files (i.e. the orange one in the image above).
Describe the issue
(See original comment here)