Small correction for self-contained code

#3
Files changed (1): README.md (+1 −1)
@@ -16,7 +16,7 @@ To use the pre-trained model for masked language modeling, use the following sni
 from transformers import AutoModelForMaskedLM, AutoTokenizer
 
 # See the `MDLM` collection page on the hub for list of available models.
-tokenizer = transformers.AutoTokenizer.from_pretrained('gpt2')
+tokenizer = AutoTokenizer.from_pretrained('gpt2')
 model = AutoModelForMaskedLM.from_pretrained('s-sahoo/duo')
 ```
 For a hands-on example, check out this [Colab notebook](https://colab.research.google.com/drive/1Sf7R-dqdR6gq-H8nyZ9E3ZkyvqMTqcwq?usp=sharing).
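
The one-line fix matters because `from transformers import AutoTokenizer` binds only the name `AutoTokenizer`, not the module name `transformers`, so the old line would raise `NameError` in a fresh session. A minimal sketch of that behaviour, using the stdlib `json` module as a stand-in so it runs without `transformers` installed:

```python
# `from X import Y` binds Y in the current namespace, but not X itself.
from json import dumps

def old_style():
    # Mirrors the old README line `transformers.AutoTokenizer.from_pretrained(...)`:
    # the module name `json` was never bound here, so this raises NameError.
    return json.dumps({"a": 1})

def new_style():
    # Mirrors the corrected line `AutoTokenizer.from_pretrained(...)`:
    # use the name that the from-import actually bound.
    return dumps({"a": 1})
```

With the correction applied, the snippet needs no separate `import transformers` line, which is the point of the PR title: the README example becomes self-contained.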