Davlan committed
Commit 857b63f · 1 parent: ebc14a1

Update README.md

Files changed (1): README.md (+2, -2)
README.md CHANGED
@@ -5,7 +5,7 @@ language:
 ---
 
 
-# oyo-bert
+# oyo-bert-base
 
 OYO-BERT (or Oyo-dialect-of-Yoruba BERT) was created by pre-training a [BERT model with token dropping](https://aclanthology.org/2022.acl-long.262/) on Yoruba language texts for about 100K steps. It was trained using the BERT-base architecture.
@@ -16,7 +16,7 @@ A mix of WURA, Wikipedia and MT560 Yoruba data
 You can use this model with the Transformers *pipeline* for masked token prediction.
 ```python
 >>> from transformers import pipeline
->>> unmasker = pipeline('fill-mask', model='Davlan/oyo-bert')
+>>> unmasker = pipeline('fill-mask', model='Davlan/oyo-bert-base')
 >>> unmasker("Ọjọ kẹsan-an, [MASK] Kẹjọ ni wọn ri oku Baba")
 ```
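For context on what the fill-mask pipeline in the diff above does: it scores every vocabulary token at the `[MASK]` position and returns the highest-probability candidates. The ranking step is just a softmax over that position's logits. The sketch below illustrates this mechanism with a made-up five-token vocabulary and made-up logits — it does not download or run the real `Davlan/oyo-bert-base` model.

```python
import math

# Toy illustration of the fill-mask ranking step. A real model produces
# logits over its full vocabulary for the [MASK] position; here both the
# vocabulary and the logits are invented for demonstration only.
vocab = ["osu", "oṣu", "ọdun", "ọjọ", "ni"]
mask_logits = [2.0, 4.5, 1.0, 0.5, 3.0]

def softmax(logits):
    # Subtract the max for numerical stability before exponentiating.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Rank tokens by probability, highest first — this is what the pipeline
# returns as its top-k fill-mask candidates.
probs = softmax(mask_logits)
ranked = sorted(zip(vocab, probs), key=lambda pair: pair[1], reverse=True)
for token, score in ranked[:3]:
    print(f"{token}: {score:.3f}")
```

The real pipeline does the same ranking, but over the model's full tokenizer vocabulary, and returns each candidate with its score and the sequence with the mask filled in.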