Hi everyone, I'm having one heck of a time trying to get `predict` to work as a proof of concept to show my supervisor. I have experience with predictions using TorchGeo, but the workflow is quite different when using TerraTorch's CLI, so I must just be doing something wrong. If anyone has any working examples that I could mull over, I would greatly appreciate it!

After both a successful `fit` and `test`, I'm now trying to use `predict` on test images to see how well it segments. I started with the default `burn_scars.yaml` in TerraTorch's `test/conf` directory. I've gotten this to run in a notebook for about 90 seconds, but I can't seem to get past the following error when running `!terratorch predict`:

```
File "..\Lib\site-packages\terratorch\cli_tools.py", line 167, in write_on_batch_end
```

I've tried several variations of explicitly adding callbacks, as well as looking for the right built-in one, since it seems that either the names or the images are not being passed to the `write_on_batch_end` function. In `firescars.yaml` I added trainer variations of this, and I also tried removing all references to writer callbacks in the hope that a default function would handle it.

The only documentation I can find on how to use `predict` is on the getting-started page:

```
terratorch predict -c <path_to_config_file> --ckpt_path <path_to_checkpoint> --predict_output_dir <path_to_output_dir> --data.init_args.predict_data_root <path_to_input_dir> --data.init_args.predict_dataset_bands <all bands in the predicted dataset, e.g. [BLUE,GREEN,RED,NIR_NARROW,SWIR_1,SWIR_2,0]>
```

I'm using the standard test case from https://github.com/isaaccorley/prithvi-pytorch for the HLS Burn Scars dataset, as well as their yaml file for the `fit` and `test`:

```
wget https://huggingface.co/datasets/ibm-nasa-geospatial/hls_burn_scars/resolve/main/hls_burn_scars.tar.gz?download=true -O hls_burn_scars.tar.gz
```

Thank you!
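In case it helps, my understanding is that the same flags from the getting-started page can also live in the yaml config. A rough sketch of what I think that would look like (placeholder paths, and I'm not certain the nesting matches TerraTorch's actual config schema):

```yaml
# Sketch only: yaml equivalent of the CLI flags above.
# Paths are placeholders; the exact nesting may differ in TerraTorch.
predict_output_dir: <path_to_output_dir>
data:
  init_args:
    predict_data_root: <path_to_input_dir>
    predict_dataset_bands:
      - BLUE
      - GREEN
      - RED
      - NIR_NARROW
      - SWIR_1
      - SWIR_2
      - 0
```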
---
Hi, @FogDrip. What happens when you add a flag like:
---
Hi, @FogDrip. Take a look at the examples.
---
Hi @Joao-L-S-Almeida, thank you so much for your quick response! I was able to fix my problem. The config I was using is actually in the repo that you're referencing.

To fix the first issue, I just had to add this line at the root level of the `firescars.yaml`:

```yaml
predict_output_dir: "../../../Prithvi/datasets_temp/hls_burn_scars/hls_burn_scars/predict_test/"
```

The second issue with my workflow was that, for some reason, the `__getitem__` method of the `FireScarsNonGeo` class inside `fire_scars.py` was not returning the input filename somewhere during model inference. Ultimately, the `prediction` variable being passed to `write_on_batch_end` inside the `CustomWriter` in `cli_tools.py` did not include the filenames that this branch expects:

```python
elif isinstance(prediction, tuple):
    pred_batch, filename_batch = prediction
    for prediction, file_name in zip(torch.unbind(pred_batch, dim=0), filename_batch, strict=False):
        save_prediction(prediction, file_name, output_dir, dtype=trainer.out_dtype)
```

To fix this, inside `__getitem__`, right after

```python
if self.transform:
    output = self.transform(**output)
```

I added:

```python
output["filename"] = self.image_files[index]
```

The predictions look pretty good, thanks for your help!