moussaKam committed
Commit ddc70a3 · 1 Parent(s): c89de60
Add README.md

Files changed:
- .-tokenizer.json +0 -0
- README.md +30 -0
.-tokenizer.json DELETED
The diff for this file is too large to render. See raw diff.
README.md ADDED (+30 lines)
---
tags:
- summarization

language:
- fr

pipeline_tag: "fill-mask"
---
A pretrained French sequence-to-sequence model based on [BART](https://huggingface.co/facebook/bart-large). <br>
BARThez is pretrained by learning to reconstruct corrupted input sentences, using a corpus of 66 GB of raw French text. <br>
Unlike existing BERT-based French language models such as CamemBERT and FlauBERT, BARThez is particularly well suited for generative tasks (such as abstractive summarization), since not only its encoder but also its decoder is pretrained.
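Since the pretraining objective is denoising (reconstructing masked input), the base checkpoint can be queried directly for mask filling. A minimal sketch, assuming the `transformers` library and its `fill-mask` pipeline; the French example sentence is illustrative, not from this card:

```python
from transformers import pipeline

# Load BARThez into the fill-mask pipeline; the BARThez
# sentencepiece tokenizer uses "<mask>" as its mask token.
fill_mask = pipeline("fill-mask", model="moussaKam/barthez")

# Each prediction is a dict with the filled sequence, the predicted
# token string, and its score.
for prediction in fill_mask("Paris est la <mask> de la France."):
    print(prediction["token_str"], round(prediction["score"], 3))
```
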
In addition to BARThez, which is pretrained from scratch, we continued the pretraining of a multilingual BART, [mBART](https://huggingface.co/facebook/mbart-large-cc25), which boosted its performance on both discriminative and generative tasks. We call the French-adapted version [mBARThez](https://huggingface.co/moussaKam/mbarthez).

| Model | Architecture | #layers | #params |
| ----- | :----------: | :-----: | :-----: |
| [BARThez](https://huggingface.co/moussaKam/barthez) | BASE | 12 | 165M |
| [mBARThez](https://huggingface.co/moussaKam/mbarthez) | LARGE | 24 | 458M |
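Because the decoder is pretrained as well, these checkpoints are intended to be fine-tuned for generation. As a hedged sketch only: using a summarization fine-tune through the `summarization` pipeline, where the checkpoint name `moussaKam/barthez-orangesum-abstract` is an assumption and not part of this card:

```python
from transformers import pipeline

# Assumed checkpoint name (hypothetical here): a BARThez model
# fine-tuned for French abstractive summarization.
summarizer = pipeline("summarization", model="moussaKam/barthez-orangesum-abstract")

article = (
    "Le modèle BARThez est préentraîné sur 66 Go de texte français brut "
    "et peut ensuite être affiné pour des tâches génératives telles que "
    "le résumé automatique."
)

# The pipeline returns a list of dicts with a "summary_text" field.
print(summarizer(article, max_length=40, min_length=5)[0]["summary_text"])
```
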
<br>
```bibtex
@article{eddine2020barthez,
  title={BARThez: a Skilled Pretrained French Sequence-to-Sequence Model},
  author={Eddine, Moussa Kamal and Tixier, Antoine J-P and Vazirgiannis, Michalis},
  journal={arXiv preprint arXiv:2010.12321},
  year={2020}
}
```