KameronB/sitcc-roberta (1 like)
Tags: Transformers, PyTorch, KameronB/SITCC-dataset, English, roberta, IT, classification, call center, grammar
License: mit
Files and versions (sitcc-roberta, 500 MB, at commit d1e86e7)
1 contributor, 5 commits
Latest commit: KameronB, "Rename sitcc-robert.bin to pytorch_model.bin" (d1e86e7, verified, about 2 years ago)
.gitattributes  (Safe, 1.52 kB)  "initial commit"  about 2 years ago
README.md  (3.16 kB)  "Updated README.md"  about 2 years ago
config.json  (749 Bytes)  "Upload config.json"  about 2 years ago
merges.txt  (Safe, 456 kB)  "Upload 5 files"  about 2 years ago
pytorch_model.bin  (pickle, 499 MB)  "Rename sitcc-robert.bin to pytorch_model.bin"  about 2 years ago
  Detected Pickle imports (26): collections.OrderedDict,
  torch.nn.modules.activation.Tanh, torch.nn.modules.linear.Linear,
  torch._utils._rebuild_tensor_v2,
  transformers.models.roberta.modeling_roberta.RobertaSelfOutput,
  transformers.models.roberta.modeling_roberta.RobertaSelfAttention,
  transformers.activations.GELUActivation, __main__.RobertaForRegression,
  transformers.models.roberta.modeling_roberta.RobertaModel,
  torch.nn.modules.dropout.Dropout, torch._utils._rebuild_parameter,
  transformers.models.roberta.modeling_roberta.RobertaIntermediate,
  transformers.models.roberta.modeling_roberta.RobertaOutput,
  transformers.models.roberta.modeling_roberta.RobertaLayer,
  transformers.models.roberta.modeling_roberta.RobertaAttention,
  transformers.models.roberta.modeling_roberta.RobertaPooler,
  transformers.models.roberta.modeling_roberta.RobertaEmbeddings,
  torch.nn.modules.sparse.Embedding,
  transformers.models.roberta.modeling_roberta.RobertaEncoder,
  torch._C._nn.gelu, torch.FloatStorage,
  torch.nn.modules.normalization.LayerNorm, torch.LongStorage,
  transformers.models.roberta.configuration_roberta.RobertaConfig,
  __builtin__.set, torch.nn.modules.container.ModuleList
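The "Detected Pickle imports" list above comes from a static scan of the .bin file: the scanner walks the pickle opcode stream and records which classes the file would import, without ever unpickling (and therefore without executing) anything. A minimal sketch of that idea, covering only the GLOBAL opcode used by pickle protocols 3 and below (Hugging Face's actual scanner handles more cases, e.g. STACK_GLOBAL in newer protocols):

```python
import collections
import pickle
import pickletools

def detect_pickle_imports(data: bytes) -> set:
    """List the module.name pairs a pickle stream would import,
    by walking its opcodes statically -- nothing is executed."""
    found = set()
    for opcode, arg, _pos in pickletools.genops(data):
        # GLOBAL (protocols <= 3) carries "module name" as one
        # space-separated string. Newer protocols use STACK_GLOBAL,
        # which takes module/name from the stack and needs extra
        # bookkeeping not shown here.
        if opcode.name == "GLOBAL":
            module, name = arg.split(" ", 1)
            found.add(module + "." + name)
    return found

blob = pickle.dumps(collections.OrderedDict(a=1), protocol=2)
print(detect_pickle_imports(blob))  # contains 'collections.OrderedDict'
```

Note that the detected imports include `__main__.RobertaForRegression`, a custom class defined in the author's training script, so actually loading this checkpoint with `torch.load` requires a class of that name to be defined in the loading script as well.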
special_tokens_map.json  (Safe, 957 Bytes)  "Upload 5 files"  about 2 years ago
tokenizer_config.json  (Safe, 1.31 kB)  "Upload 5 files"  about 2 years ago
vocab.json  (Safe, 999 kB)  "Upload 5 files"  about 2 years ago