BARThez: a Skilled Pretrained French Sequence-to-Sequence Model
How to use moussaKam/barthez-orangesum-title with Transformers:
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("summarization", model="moussaKam/barthez-orangesum-title")

# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
tokenizer = AutoTokenizer.from_pretrained("moussaKam/barthez-orangesum-title")
model = AutoModelForSeq2SeqLM.from_pretrained("moussaKam/barthez-orangesum-title")

Finetuning: examples/seq2seq/ (as of Nov 06, 2020)
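Once loaded, the tokenizer and model can be used together to generate a headline-style summary. The sketch below shows one way to do this; the French input text and the generation parameters (`num_beams`, `max_length`) are illustrative choices, not values from the model card:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("moussaKam/barthez-orangesum-title")
model = AutoModelForSeq2SeqLM.from_pretrained("moussaKam/barthez-orangesum-title")

# Illustrative French news snippet (not taken from the model card)
article = (
    "Le gouvernement a annoncé mardi de nouvelles mesures pour soutenir "
    "les entreprises touchées par la crise sanitaire."
)

# Encode the article, generate a short title, and decode it back to text
inputs = tokenizer(article, return_tensors="pt", truncation=True, max_length=512)
summary_ids = model.generate(**inputs, max_length=32, num_beams=4)
title = tokenizer.decode(summary_ids[0], skip_special_tokens=True)
print(title)
```

Since the model was fine-tuned on OrangeSum titles, the output is a short, headline-like French sentence rather than a multi-sentence summary.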
Metrics: ROUGE-2 > 23
paper: https://arxiv.org/abs/2010.12321
github: https://github.com/moussaKam/BARThez
@article{eddine2020barthez,
  title={BARThez: a Skilled Pretrained French Sequence-to-Sequence Model},
  author={Eddine, Moussa Kamal and Tixier, Antoine J-P and Vazirgiannis, Michalis},
  journal={arXiv preprint arXiv:2010.12321},
  year={2020}
}