---
language: en
tags:
- text-simplification
- bart
license: apache-2.0
---

# Text Simplification Model (H100 Trained)

## Training Results
- **Training Loss**: 0.2796
- **Training Time**: 22:39 (3 epochs)
- **Dataset**: GEM/wiki_auto_asset_turk (483,801 samples)
- **GPU**: NVIDIA H100 80GB
- **Batch Size**: 64
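
From the numbers above we can sketch the implied optimizer step count. This assumes a per-step batch of 64 with no gradient accumulation and that the final partial batch is kept; those details are not stated in this card.

```python
import math

# Figures reported in the Training Results section
NUM_SAMPLES = 483_801   # GEM/wiki_auto_asset_turk
BATCH_SIZE = 64
EPOCHS = 3

# Assumes one optimizer step per batch and the last partial batch is not dropped
steps_per_epoch = math.ceil(NUM_SAMPLES / BATCH_SIZE)  # 7560
total_steps = steps_per_epoch * EPOCHS                 # 22680
print(steps_per_epoch, total_steps)
```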

## Usage
```python
from transformers import BartTokenizer, BartForConditionalGeneration

# Download the fine-tuned checkpoint and its tokenizer from the Hub
tokenizer = BartTokenizer.from_pretrained("Lorobert/text-simplification-runpod")
model = BartForConditionalGeneration.from_pretrained("Lorobert/text-simplification-runpod")

text = "Complex sentence here."
inputs = tokenizer(text, return_tensors="pt", max_length=128, truncation=True)

# Beam search tends to give more fluent simplifications than greedy decoding
outputs = model.generate(**inputs, max_length=128, num_beams=4)
simplified = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(simplified)
```