---
license: mit
library_name: transformers
language: en
datasets:
- your-dataset-name
metrics:
- rouge
base_model: facebook/bart-large
tags:
- text2text-generation
- summarization
- fine-tuned
pipeline_tag: summarization
model-index:
- name: fine-tuned-bart-large
  results:
  - task:
      type: summarization
    dataset:
      name: your-dataset-name
      type: your-dataset-name
    metrics:
    - type: rouge1
      value: 0.45
    - type: rouge2
      value: 0.22
    - type: rougeL
      value: 0.40
---
# Fine-tuned BART Large Model
This repository contains a fine-tuned BART large model for text summarization tasks.
## Model Details
- Base model: facebook/bart-large
- Fine-tuned on: your-dataset-name
- License: MIT
## Usage
You can load this model using the Hugging Face Transformers library:
```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer
model_name = "ArchCoder/fine-tuned-bart-large"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)
```
Replace `"ArchCoder/fine-tuned-bart-large"` with your actual model repo name.
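Once loaded, the model can be used to generate summaries. A minimal sketch, assuming the repo name above; the input text and generation parameters (beam count, length limits) are illustrative defaults, not tuned values:

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_name = "ArchCoder/fine-tuned-bart-large"  # replace with your actual repo name
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

text = "Your long input document goes here."
# BART-large accepts up to 1024 tokens; truncate longer inputs.
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=1024)
summary_ids = model.generate(
    **inputs, max_length=128, num_beams=4, early_stopping=True
)
summary = tokenizer.decode(summary_ids[0], skip_special_tokens=True)
print(summary)
```

The same can be done in one step with `pipeline("summarization", model=model_name)`.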
## Evaluation
The model was evaluated on your-dataset-name with the following scores:
- ROUGE-1: 0.45
- ROUGE-2: 0.22
- ROUGE-L: 0.40
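For reference, ROUGE-1 is the F1 score over unigram overlap between a generated summary and a reference. A minimal self-contained illustration of that computation (in practice, use the `rouge_score` or `evaluate` package, which also handles stemming and ROUGE-2/ROUGE-L):

```python
from collections import Counter

def rouge1_f1(prediction: str, reference: str) -> float:
    """ROUGE-1 F1: harmonic mean of unigram precision and recall."""
    pred_counts = Counter(prediction.lower().split())
    ref_counts = Counter(reference.lower().split())
    # Clipped overlap: each unigram counts at most as often as in the reference.
    overlap = sum((pred_counts & ref_counts).values())
    if overlap == 0:
        return 0.0
    precision = overlap / sum(pred_counts.values())
    recall = overlap / sum(ref_counts.values())
    return 2 * precision * recall / (precision + recall)

print(rouge1_f1("the cat sat on the mat", "the cat was on the mat"))
```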
## License
This model is licensed under the MIT License.