palubad committed · verified
Commit 14b052a · 1 Parent(s): 85554bc

Update README.md

Files changed (1):
  1. README.md +2 -2
README.md CHANGED
@@ -55,12 +55,12 @@ To implement this model:
 The training data is available from the [SAR-based-VIs GitHub repository](https://github.com/palubad/SAR-based-VIs).

 ![image/png](https://cdn-uploads.huggingface.co/production/uploads/6798c936ece6b7910c55d1e5/V49cLxspCqdoN_aURaD_c.png)
-Figure 2. Used areas for training and validation (training and validation data are not differentiated in this figure)
+Figure 2. Used areas for training and testing (training and testing data are not differentiated in this figure)

 ### Training Procedure

 - Feature Selection: Using permutation feature importance analysis to identify key predictors.
-- Data Splitting: Training and validation sets created with a balanced representation of healthy and disturbed forests.
+- Data Splitting: Training and testing sets created with a balanced representation of healthy and disturbed forests.
 - Hyperparameter Optimization:
   - RFR: Fine-tuned for maximum depth, number of trees, and minimum samples per split.
   - XGB: Optimized learning rate, tree depth, and number of boosting rounds.
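The training-procedure steps listed in the diff (permutation feature importance, a train/test split, and a hyperparameter grid for the random forest regressor) can be sketched roughly as follows. This is a minimal illustration using scikit-learn with synthetic data, not the authors' actual pipeline: the real study uses SAR-derived features with a balanced split of healthy and disturbed forest samples, and all grid values here are illustrative placeholders (the XGB tuning step is omitted to keep the sketch dependency-free).

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance
from sklearn.model_selection import GridSearchCV, train_test_split

# Synthetic stand-in for the SAR-based vegetation-index features.
X, y = make_regression(n_samples=300, n_features=6, n_informative=3, random_state=0)

# Data splitting: hold out a testing set. (The study balances healthy vs.
# disturbed forest samples; a plain random split is shown here.)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0
)

# Feature selection: permutation feature importance on a baseline model.
baseline = RandomForestRegressor(random_state=0).fit(X_train, y_train)
pfi = permutation_importance(baseline, X_test, y_test, n_repeats=10, random_state=0)
keep = pfi.importances_mean.argsort()[::-1][:3]  # keep the top-3 predictors

# Hyperparameter optimization for the RFR: maximum depth, number of trees,
# minimum samples per split (grid values are illustrative, not the paper's).
grid = GridSearchCV(
    RandomForestRegressor(random_state=0),
    param_grid={
        "max_depth": [5, 10, None],
        "n_estimators": [50, 100],
        "min_samples_split": [2, 5],
    },
    cv=3,
)
grid.fit(X_train[:, keep], y_train)
print(grid.best_params_)
```

An XGB model would slot into the same `GridSearchCV` call with a grid over learning rate, tree depth, and number of boosting rounds, as the diff describes.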