utahnlp / imdb_facebook_opt-13b_seed-1
Maintained by NLP at University of Utah
Tags: Text Classification · Transformers · Safetensors · opt · arxiv:1910.09700
Repository: imdb_facebook_opt-13b_seed-1 — 51.4 GB, 1 contributor, 3 commits
Latest commit: b5a2980 (verified) by ashim95, "Upload tokenizer", almost 2 years ago
File                              Size       Last commit message
.gitattributes                    1.52 kB    initial commit
README.md                         5.17 kB    Upload OPTForSequenceClassification
config.json                       954 Bytes  Upload OPTForSequenceClassification
merges.txt                        456 kB     Upload tokenizer
model-00001-of-00011.safetensors  4.95 GB    Upload OPTForSequenceClassification
model-00002-of-00011.safetensors  4.93 GB    Upload OPTForSequenceClassification
model-00003-of-00011.safetensors  4.61 GB    Upload OPTForSequenceClassification
model-00004-of-00011.safetensors  4.61 GB    Upload OPTForSequenceClassification
model-00005-of-00011.safetensors  4.93 GB    Upload OPTForSequenceClassification
model-00006-of-00011.safetensors  4.93 GB    Upload OPTForSequenceClassification
model-00007-of-00011.safetensors  4.93 GB    Upload OPTForSequenceClassification
model-00008-of-00011.safetensors  4.93 GB    Upload OPTForSequenceClassification
model-00009-of-00011.safetensors  4.61 GB    Upload OPTForSequenceClassification
model-00010-of-00011.safetensors  4.61 GB    Upload OPTForSequenceClassification
model-00011-of-00011.safetensors  3.36 GB    Upload OPTForSequenceClassification
model.safetensors.index.json      56.3 kB    Upload OPTForSequenceClassification
special_tokens_map.json           548 Bytes  Upload tokenizer
tokenizer.json                    2.11 MB    Upload tokenizer
tokenizer_config.json             669 Bytes  Upload tokenizer
vocab.json                        798 kB     Upload tokenizer

All files were last updated almost 2 years ago.
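Since the repository ships sharded safetensors weights plus a tokenizer for a sequence-classification head (the commits upload an OPTForSequenceClassification checkpoint), it can presumably be loaded with the standard `transformers` auto classes. Below is a minimal, hedged sketch; the `classify` helper and the example sentence are illustrative, not part of the repo, and downloading the ~51 GB of shards requires substantial disk and GPU memory (`device_map="auto"` also needs the `accelerate` package installed).

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification

REPO_ID = "utahnlp/imdb_facebook_opt-13b_seed-1"


def classify(text: str) -> int:
    """Download the checkpoint (~51 GB of sharded safetensors) and
    return the predicted class index for `text`.

    Hypothetical helper for illustration; label mapping (e.g. which
    index means "positive") should be checked in config.json.
    """
    tokenizer = AutoTokenizer.from_pretrained(REPO_ID)
    model = AutoModelForSequenceClassification.from_pretrained(
        REPO_ID, device_map="auto"
    )
    inputs = tokenizer(text, return_tensors="pt").to(model.device)
    return model(**inputs).logits.argmax(dim=-1).item()


if __name__ == "__main__":
    # IMDB-style sentiment input; prediction depends on the downloaded weights.
    print(classify("A thoroughly enjoyable film."))
```

The actual download happens only when `classify` is called, so importing the module stays cheap; for CPU-only or memory-constrained environments, a quantized load would be needed instead.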