---
datasets:
- fajrikoto/id_liputan6
language:
- id
pipeline_tag: summarization
tags:
- summarization
- t5
metrics:
- bertscore
- rouge
---

# Indonesian T5 Abstractive Summarization Base Model
Hello everyone! We are the SumText Group from Bina Nusantara University: Stevan Pohan, Joseph Vincent Liem, and Yongky Alexander Tristan. This model card presents a T5 model that we fine-tuned for abstractive summarization of Indonesian text.


# Load the Fine-Tuned Model
```python
import torch
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Use a GPU if one is available, otherwise fall back to CPU
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model_path = "migz117/T5-Abstractive"
model = AutoModelForSeq2SeqLM.from_pretrained(model_path).to(device)
tokenizer = AutoTokenizer.from_pretrained(model_path)
```
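
# Generate a Summary

Once the model and tokenizer are loaded, summaries can be produced with the standard `generate` API. The sketch below is illustrative: the generation settings (`num_beams`, `max_length`) and the sample sentence are our assumptions, not values prescribed by the model authors, so tune them for your own inputs.

```python
import torch
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model_path = "migz117/T5-Abstractive"
model = AutoModelForSeq2SeqLM.from_pretrained(model_path).to(device)
tokenizer = AutoTokenizer.from_pretrained(model_path)

# Replace with the Indonesian article you want to summarize
article = "Liputan6.com, Jakarta: Pemerintah mengumumkan kebijakan baru hari ini."

# Tokenize, truncating long articles to the model's input limit
inputs = tokenizer(
    article, return_tensors="pt", truncation=True, max_length=512
).to(device)

# Beam search tends to give more fluent summaries than greedy decoding
summary_ids = model.generate(
    **inputs, max_length=128, num_beams=4, early_stopping=True
)
summary = tokenizer.decode(summary_ids[0], skip_special_tokens=True)
print(summary)
```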