Paolo-Fraccaro committed on
Commit
30b8ce4
1 Parent(s): 6704465

Update README.md

Files changed (1):
  1. README.md +1 -1
README.md CHANGED
@@ -37,7 +37,7 @@ The Prithvi-100m model was initially pretrained using a sequence length of 3 tim
 ### Code
 Code for Finetuning is available through [github](https://github.com/NASA-IMPACT/hls-foundation-os/)
 
-Configuration used for finetuning is available through [config](https://github.com/NASA-IMPACT/hls-foundation-os/blob/main/fine-tuning-examples/configs/multi_temporal_crop_classification.py).
+Configuration used for finetuning is available through [config](https://github.com/NASA-IMPACT/hls-foundation-os/blob/main/configs/multi_temporal_crop_classification.py).
 
 ### Results
 The experiment by running the mmseg stack for 80 epochs using the above config led to the following result:
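The README text in the diff refers to an mmseg fine-tuning run with the linked config. As a rough sketch only (the exact entry point, flags, and dataset setup are assumptions based on typical OpenMMLab/mmsegmentation workflows, not details confirmed by this commit), such a run might be launched like this:

```shell
# Assumed workflow: clone the repo and start training via OpenMMLab's `mim`
# tool with the config path as corrected by this commit. All flags and paths
# here are illustrative; consult the repo's own README for the real commands.
git clone https://github.com/NASA-IMPACT/hls-foundation-os.git
cd hls-foundation-os

# Launch fine-tuning on the multi-temporal crop classification task.
# The epoch count (80 in the experiment above) is expected to live in the
# config file itself rather than on the command line.
mim train mmsegmentation configs/multi_temporal_crop_classification.py
```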