Instructions for using allenai/led-base-16384 with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use allenai/led-base-16384 with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("allenai/led-base-16384")
model = AutoModelForSeq2SeqLM.from_pretrained("allenai/led-base-16384")
```
- Notebooks
- Google Colab
- Kaggle
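Beyond loading the model, LED (the Longformer Encoder-Decoder) is typically used for long-document summarization. A minimal sketch of a summarization call is below, assuming the standard LED pattern of passing `max_length=16384` to the tokenizer (the tokenizer otherwise truncates at its default limit) and marking the first token as "global" via `global_attention_mask`, as recommended in the LED paper; the input text and generation length here are placeholders.

```python
# Minimal sketch: summarizing a long document with LED.
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("allenai/led-base-16384")
model = AutoModelForSeq2SeqLM.from_pretrained("allenai/led-base-16384")

text = "..."  # placeholder: a long document, up to 16384 tokens
inputs = tokenizer(text, return_tensors="pt", max_length=16384, truncation=True)

# LED uses sparse local attention over long inputs; giving the first token
# global attention is the usual pattern for seq2seq tasks.
global_attention_mask = torch.zeros_like(inputs["input_ids"])
global_attention_mask[:, 0] = 1

summary_ids = model.generate(
    inputs["input_ids"],
    global_attention_mask=global_attention_mask,
    max_length=256,
)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```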
Community discussions:
- Update README.md (#6, opened 4 months ago by cherry0328)
- Anyone got this to work over 1K tokens? (#5, opened 4 months ago by calebdollar)
- Adding `safetensors` variant of this model (#4, opened almost 3 years ago by SFconvertbot)
- Unable to set max number of tokens in input higher than 1024 (#3, opened about 3 years ago by traopia, 6 replies)
- Adding `safetensors` variant of this model (#2, opened over 3 years ago by Narsil)