Update README.md
# AraT5-base

# AraT5: Text-to-Text Transformers for Arabic Language Generation

<img src="https://huggingface.co/UBC-NLP/AraT5-base/resolve/main/AraT5_CR_new.png" alt="AraT5" width="45%" height="35%" align="right"/>

This is the repository accompanying our paper [AraT5: Text-to-Text Transformers for Arabic Language Understanding and Generation](https://arxiv.org/abs/2109.12068). In this repository, we introduce:
* **AraT5<sub>MSA</sub>**, **AraT5<sub>Tweet</sub>**, and **AraT5**: three powerful Arabic-specific text-to-text Transformer-based models;
* **ARGEN**: a new benchmark for Arabic language generation and evaluation covering seven Arabic NLP tasks, namely ```machine translation```, ```summarization```, ```news title generation```, ```question generation```, ```paraphrasing```, ```transliteration```, and ```code-switched translation```;
* An evaluation of the ```AraT5``` models on ```ARGEN```, comparing them against available language models.

Our models establish new state-of-the-art (SOTA) results on several publicly available datasets.
Our language models are publicly available for research (see below).

The rest of this repository provides more information about our new language models, benchmark, and experiments.

---
# How to use AraT5 models

AraT5 PyTorch and TensorFlow checkpoints are available on the Hugging Face website.
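
A minimal usage sketch with the ```transformers``` library (the checkpoint name ```UBC-NLP/AraT5-base``` matches the Hugging Face Hub repository above; the input sentence and generation settings are illustrative, not taken from the paper):

```python
# Minimal sketch: load an AraT5 checkpoint and run seq2seq generation.
# The generation settings and example input below are illustrative assumptions.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

MODEL_NAME = "UBC-NLP/AraT5-base"  # or AraT5-msa-base, AraT5-tweet-base, ...

def generate(model, tokenizer, text: str, max_new_tokens: int = 64) -> str:
    """Encode `text`, generate with the seq2seq model, and decode the output."""
    inputs = tokenizer(text, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

if __name__ == "__main__":
    # Downloading the checkpoint requires network access.
    tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
    model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_NAME)
    print(generate(model, tokenizer, "عنوان الخبر: فاز المنتخب الوطني بالمباراة"))
```

Note that the released checkpoints are pretrained models; for downstream ARGEN tasks they would typically be fine-tuned first.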

If you use our models (AraT5-base, AraT5-msa-base, AraT5-tweet-base, AraT5-msa-small, or AraT5-tweet-small) for your scientific publication, or if you find the resources in this repository useful, please cite our paper as follows (to be updated):
```bibtex
@inproceedings{nagoudi-2022-arat5,
    title = "{AraT5}: Text-to-Text Transformers for {A}rabic Language Generation",
    author = "Nagoudi, El Moatez Billah and
      Elmadany, AbdelRahim and
      Abdul-Mageed, Muhammad",
    booktitle = "Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics",
    month = may,
    year = "2022",
    address = "Online",
    publisher = "Association for Computational Linguistics",
}
```

## Acknowledgments