Instructions for using moussaKam/AraBART with the Transformers library.
How to use moussaKam/AraBART with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="moussaKam/AraBART")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("moussaKam/AraBART")
model = AutoModel.from_pretrained("moussaKam/AraBART")
```
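For example, the pipeline can be called on a sentence containing the tokenizer's mask token. A minimal sketch (the Arabic example sentence is illustrative, not from the model card):

```python
# Minimal sketch: predict the masked token in an Arabic sentence.
from transformers import pipeline

pipe = pipeline("fill-mask", model="moussaKam/AraBART")

# Build the input with the tokenizer's own mask token ("<mask>" for
# BART-style tokenizers). Example: "The capital of Lebanon is <mask>."
masked = f"عاصمة لبنان هي {pipe.tokenizer.mask_token}."

for prediction in pipe(masked):
    print(prediction["token_str"], round(prediction["score"], 3))
```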
AraBART is the first Arabic model in which the encoder and the decoder are pretrained end-to-end, based on BART. AraBART follows the architecture of BART-Base, which has 6 encoder layers, 6 decoder layers, and a hidden dimension of 768. In total, AraBART has 139M parameters.
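These figures can be checked against the checkpoint's configuration; a quick sketch:

```python
# Sketch: inspect AraBART's architecture hyperparameters and parameter count.
from transformers import AutoConfig, AutoModel

config = AutoConfig.from_pretrained("moussaKam/AraBART")
print(config.encoder_layers, config.decoder_layers, config.d_model)  # 6, 6, 768

model = AutoModel.from_pretrained("moussaKam/AraBART")
total = sum(p.numel() for p in model.parameters())
print(f"{total / 1e6:.0f}M parameters")  # roughly the reported 139M
```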
AraBART achieves the best performance on multiple abstractive summarization datasets, outperforming strong baselines including pretrained Arabic BERT-based models and the multilingual mBART and mT5 models.
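Since the released checkpoint is only pretrained, it must be fine-tuned on a summarization dataset before it produces useful summaries. The sketch below only illustrates the seq2seq generation interface you would use after fine-tuning; the generation settings are illustrative assumptions, not values from the paper:

```python
# Illustrative sketch of the seq2seq summarization interface.
# moussaKam/AraBART itself is a pretrained (not summarization-fine-tuned)
# checkpoint; fine-tune it first, or load a fine-tuned variant.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("moussaKam/AraBART")
model = AutoModelForSeq2SeqLM.from_pretrained("moussaKam/AraBART")

article = "..."  # an Arabic article to summarize

# Truncate to the encoder's maximum input length
# (1024 tokens for BART-Base-sized models).
inputs = tokenizer(article, return_tensors="pt", truncation=True, max_length=1024)

summary_ids = model.generate(
    **inputs,
    num_beams=4,       # beam search is a common choice for summarization
    max_length=128,    # cap the generated summary length
    early_stopping=True,
)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```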