---
license: apache-2.0
---

# Overview

<p align="center">
<img src="https://avatars.githubusercontent.com/u/12619994?s=200&v=4" width="150">
</p>

<!-- -------------------------------------------------------------------------------- -->
AT5B is an Arabic T5-base model. It is **only compatible** with the code in [this GitHub repo](https://github.com/huawei-noah/Pretrained-Language-Model/tree/master/JABER-PyTorch); it is not supported by the [Transformers](https://github.com/huggingface/transformers) library.

## Citation

Please cite the following [paper](https://arxiv.org/pdf/2205.10687.pdf) when using our code and model:
```bibtex
@article{ghaddar2022revisiting,
  title={Revisiting Pre-trained Language Models and their Evaluation for Arabic Natural Language Understanding},
  author={Ghaddar, Abbas and Wu, Yimeng and Bagga, Sunyam and Rashid, Ahmad and Bibi, Khalil and Rezagholizadeh, Mehdi and Xing, Chao and Wang, Yasheng and Xinyu, Duan and Wang, Zhefeng and others},
  journal={arXiv preprint arXiv:2205.10687},
  year={2022}
}
```