Karzan committed on
Commit
d28f8db
·
verified ·
1 Parent(s): 12040ac

update the training code url

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -187,7 +187,7 @@ The details of the masking procedure for each sentence are the following:
 ### Pretraining
 
 The model was trained on 8 16 GB V100 for 90 hours. See the
-[training code](https://github.com/huggingface/transformers/tree/main/examples/research_projects/distillation) for all hyperparameters
+[training code](https://github.com/huggingface/transformers-research-projects/tree/main/distillation) for all hyperparameters
 details.
 
 ## Evaluation results