namhkoh committed · d41adc7 · Parent(s): 098d52a
Update README.md

README.md CHANGED
@@ -2,7 +2,7 @@
 Fine-tuned RoBERTa MLM model for the [`Microsoft Sentence Completion Challenge`](https://www.microsoft.com/en-us/research/wp-content/uploads/2016/02/MSR_SCCD.pdf). This model is case-sensitive, following the `roberta-base` model.
 
-# Model description (taken from
+# Model description (taken from [here](https://huggingface.co/roberta-base))
 
 RoBERTa is a transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of
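The masked-language-modeling objective the description refers to can be sketched in a few lines: some input tokens are replaced with a mask token, and the model is trained to predict the originals at exactly those positions. This is a minimal, dependency-free illustration of how such training pairs are built, not RoBERTa's actual preprocessing (the token list, `<mask>` string, and `mask_prob` value here are illustrative assumptions):

```python
import random

MASK = "<mask>"  # RoBERTa's mask token string, used here only as a placeholder

def mask_tokens(tokens, mask_prob=0.15, seed=0):
    """Build an MLM training pair: randomly replace tokens with <mask>,
    keeping the original token as the label at masked positions and
    None (i.e. "not scored") everywhere else."""
    rng = random.Random(seed)
    inputs, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            inputs.append(MASK)
            labels.append(tok)    # the model must recover this token
        else:
            inputs.append(tok)
            labels.append(None)   # position contributes no loss
    return inputs, labels

# Toy example with an aggressive masking rate so the effect is visible:
inp, lab = mask_tokens("the cat sat on the mat".split(), mask_prob=0.5, seed=1)
```

Because no human labels are involved, the "labels" are just the original tokens themselves, which is what lets MLM pretraining scale to raw text.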