Summarization
Transformers
PyTorch
Safetensors
English
led
text2text-generation
Eval Results (legacy)
Instructions to use AlgorithmicResearchGroup/led_large_16384_billsum_summarization with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use AlgorithmicResearchGroup/led_large_16384_billsum_summarization with Transformers:
```python
# Use a pipeline as a high-level helper
# Warning: Pipeline type "summarization" is no longer supported in transformers v5.
# You must load the model directly (see below) or downgrade to v4.x with:
# pip install "transformers<5.0.0"
from transformers import pipeline

pipe = pipeline("summarization", model="AlgorithmicResearchGroup/led_large_16384_billsum_summarization")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("AlgorithmicResearchGroup/led_large_16384_billsum_summarization")
model = AutoModelForSeq2SeqLM.from_pretrained("AlgorithmicResearchGroup/led_large_16384_billsum_summarization")
```
- Notebooks
- Google Colab
- Kaggle
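Once the tokenizer and model are loaded as shown above, a summary can be generated with `model.generate`. The sketch below is a minimal, hedged example: the input text, the `max_length`/`num_beams` generation settings, and the use of a `global_attention_mask` with global attention on the first token (a common convention for LED-based models) are illustrative assumptions, not settings prescribed by this model card.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "AlgorithmicResearchGroup/led_large_16384_billsum_summarization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Illustrative input; in practice this would be a long bill text (up to 16384 tokens).
text = (
    "The bill establishes a grant program to support rural broadband deployment, "
    "directs the agency to prioritize underserved areas, and requires an annual "
    "report to Congress on program outcomes."
)
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=16384)

# LED models typically place global attention on the first token (an assumption here).
global_attention_mask = torch.zeros_like(inputs["input_ids"])
global_attention_mask[:, 0] = 1

summary_ids = model.generate(
    inputs["input_ids"],
    attention_mask=inputs["attention_mask"],
    global_attention_mask=global_attention_mask,
    max_length=256,
    num_beams=4,
)
summary = tokenizer.decode(summary_ids[0], skip_special_tokens=True)
print(summary)
```

Beam search (`num_beams=4`) is a common default for summarization; greedy decoding also works but tends to produce shorter, more repetitive summaries.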
Commit History
Update README.md 1de5f30
ArtifactAI committed on
Add evaluation results on the default config and test split of billsum (#2) 5e23461
Update README.md a209969
ArtifactAI committed on
Update README.md e88b4b4
ArtifactAI committed on
Update README.md b958628
ArtifactAI committed on
Update README.md e0babc4
ArtifactAI committed on
Create README.md b92259c
ArtifactAI committed on
first commit 101c25f
Artifact-AI committed on
initial commit 64da35f
ArtifactAI committed on