Tags: Fill-Mask · Transformers · PyTorch · English · bart · text2text-generation · summarization · long context · custom_code
Instructions for using ccdv/lsg-bart-base-4096 with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use ccdv/lsg-bart-base-4096 with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="ccdv/lsg-bart-base-4096", trust_remote_code=True)
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("ccdv/lsg-bart-base-4096", trust_remote_code=True)
model = AutoModelForSeq2SeqLM.from_pretrained("ccdv/lsg-bart-base-4096", trust_remote_code=True)
```

- Notebooks
- Google Colab
- Kaggle
Added `import random`: the dropout-probability code path references `random`, but it was not imported anywhere in the module.
#2
by Agniva - opened
- modeling_lsg_bart.py +1 -0

modeling_lsg_bart.py CHANGED

```diff
@@ -1,5 +1,6 @@
 from logging import warn
 import torch
+import random
 from transformers.models.bart.modeling_bart import *
 from transformers.models.bart.modeling_bart import _expand_mask
 import torch.nn as nn
```
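A minimal sketch of why this one-line patch matters. The function name `keep_token` and its body are hypothetical, standing in for wherever the module compares a value from `random` against a dropout probability; the point is only that, before the patch, the first call into such a code path would raise `NameError: name 'random' is not defined`, since Python resolves module-level names at call time, not at definition time.

```python
import random  # the import the PR adds at the top of modeling_lsg_bart.py


def keep_token(dropout_prob: float) -> bool:
    # Hypothetical stand-in for the dropout-probability check:
    # draw from [0.0, 1.0) and keep the token if the draw clears
    # the dropout threshold. Without `import random` in scope,
    # this line is where the NameError would surface.
    return random.random() >= dropout_prob


# Edge cases: dropout of 0.0 keeps everything, 1.0 drops everything,
# because random.random() is always in [0.0, 1.0).
print(keep_token(0.0))  # True
print(keep_token(1.0))  # False
```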