@loicland @VSainteuf Hello, thank you for your work. I want to test the trained model on my own region. How should my input data be formatted? For example, I obtained all cloud-free Sentinel-2 images from 2020-2021 for my area of interest, then stacked the different dates with numpy. I now have an image array of size 10800 (h) × 10800 (w) × 13 (ch) × 34 (dates). Should I save it as 256 (h) × 256 (w) × 13 (ch) × 34 (dates) pieces with numpy.save and use these npy files for inference?
Hi @oguzhannysr, you can have a look at the documentation of the dataset here.
The model has been trained on patches of size 128x128x10xT (we don't use bands B01, B09, and B10 of Sentinel-2).
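To illustrate, a rough sketch of how you could cut your (h, w, ch, T) array into such patches while dropping B01, B09, and B10 (indices 0, 9, 10 in a 13-band stack). The T×C×H×W output layout and the file naming are assumptions on my side, so check them against the original dataset files:

```python
import numpy as np

# Bands to keep out of the 13-band Sentinel-2 stack:
# drop B01 (index 0), B09 (index 9), B10 (index 10) -> 10 bands remain.
KEEP_BANDS = [1, 2, 3, 4, 5, 6, 7, 8, 11, 12]

def tile_patches(cube, patch=128):
    """Yield ((row, col), patch_array) from a (H, W, 13, T) cube.

    Each patch_array has shape (T, 10, patch, patch); the transpose to
    T x C x H x W is an assumed layout, verify against the dataset's npy files.
    """
    h, w = cube.shape[:2]
    for i in range(0, h - patch + 1, patch):
        for j in range(0, w - patch + 1, patch):
            block = cube[i:i + patch, j:j + patch, KEEP_BANDS, :]
            yield (i, j), block.transpose(3, 2, 0, 1)

# Example on a small dummy cube (256x256 -> four 128x128 patches):
dummy = np.zeros((256, 256, 13, 34), dtype=np.float32)
for (i, j), patch_arr in tile_patches(dummy):
    # e.g. np.save(f"S2_{i}_{j}.npy", patch_arr)  # hypothetical naming
    assert patch_arr.shape == (34, 10, 128, 128)
```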
In addition to the numpy files you will need to provide the metadata file ('metadata.geojson' in the original dataset). It specifies the ID of each patch and the observation dates of each image for that patch. Have a look at the original metadata file to see how it's formatted.
If I understand correctly, all patches in your dataset share the same observation dates. In that case you don't need the metadata file, but you will need to slightly modify the dataloader accordingly, always returning the same date sequence in the getitem.
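For instance, a minimal dataset sketch along those lines, where every item returns the same fixed date sequence. The class name, the plain-numpy interface, and the "days since a reference date" encoding are assumptions for illustration, not the repo's actual API:

```python
import numpy as np

class FixedDatesDataset:
    """Hypothetical sketch: all patches share one observation-date sequence."""

    def __init__(self, npy_paths, dates):
        self.paths = list(npy_paths)
        # One integer per observation, e.g. days since a reference date
        # (assumed encoding; check how the original dataloader encodes dates).
        self.dates = np.asarray(dates, dtype=np.int64)

    def __len__(self):
        return len(self.paths)

    def __getitem__(self, idx):
        x = np.load(self.paths[idx])   # assumed shape: (T, 10, 128, 128)
        return x, self.dates           # same date sequence for every patch
```

A torch `Dataset` would look the same apart from the base class and tensor conversion.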
Hope this helps !