    medical parole hearing or medical parole release is scheduled for an inmate receiving
    medical parole consideration, regardless of whether the inmate is sentenced either
    determinately or indeterminately.'
---

# Longformer Encoder-Decoder (LED) fine-tuned on Billsum

This model is a fine-tuned version of led-base-16384 on the billsum dataset.

As described in Longformer: The Long-Document Transformer by Iz Beltagy, Matthew E. Peters, and Arman Cohan, led-base-16384 was initialized from bart-base, since both models share the exact same architecture. To be able to process 16K tokens, bart-base's position embedding matrix was simply copied 16 times.
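The position-embedding extension described above can be sketched in a few lines. This toy example is an illustration of the idea only, not the actual conversion script: bart-base's 1024-position matrix was copied 16 times to cover 16K positions.

```python
# Toy illustration (not the real conversion code): extend a position-embedding
# matrix by concatenating copies of it end-to-end, as was done to turn
# bart-base's 1024 positions into led-base-16384's 16K positions.

def tile_position_embeddings(embeddings, times):
    """Repeat the rows of a position-embedding matrix `times` times."""
    return embeddings * times  # list repetition concatenates the rows in order

# A 4-position, 2-dim toy matrix extended 16x -> 64 positions.
toy = [[0.0, 0.1], [1.0, 1.1], [2.0, 2.1], [3.0, 3.1]]
extended = tile_position_embeddings(toy, 16)
assert len(extended) == 64
assert extended[4] == toy[0]  # each copy starts over with position 0's row
```

Because the copies are identical, positions 0, 1024, 2048, … initially share the same embedding; fine-tuning then adapts them to the longer context.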
# Use In Transformers

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("Artifact-AI/led_base_16384_billsum_summarization")

model = AutoModelForSeq2SeqLM.from_pretrained("Artifact-AI/led_base_16384_billsum_summarization")
```