Hugging Face
GracchusYAO/my_finetuned_roberta_weibo
Tags: Safetensors · Dataset: dirtycomputer/weibo_senti_100k · bert
License: mit
Files and versions (branch: main, 410 MB)
1 contributor · History: 3 commits
Latest commit 2ef4fa2 (verified) by GracchusYAO: "Create README.md", about 1 year ago
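The Safetensors tag reflects how the model weights are stored: an 8-byte little-endian header length, a JSON header describing each tensor, then raw tensor bytes. Parsing that metadata never executes code, which is why such files are marked "Safe" below, in contrast to pickle. A minimal stdlib-only sketch (the one-tensor blob is a constructed example for illustration, not this repo's actual 409 MB file):

```python
# Sketch: parse a safetensors header without touching (or trusting) tensor data.
# Format per the safetensors spec: u64 header length (little-endian), JSON
# header, then raw bytes. No code execution is involved at any point.
import json
import struct

def read_safetensors_header(blob: bytes) -> dict:
    """Return the JSON header of a safetensors blob as a dict."""
    (header_len,) = struct.unpack("<Q", blob[:8])
    return json.loads(blob[8 : 8 + header_len])

# Build a minimal one-tensor file in memory: tensor "w", two float32 values.
header = {"w": {"dtype": "F32", "shape": [2], "data_offsets": [0, 8]}}
hjson = json.dumps(header).encode()
blob = struct.pack("<Q", len(hjson)) + hjson + struct.pack("<2f", 1.0, 2.0)

print(read_safetensors_header(blob)["w"]["shape"])  # [2]
```

The same header read works on a real checkpoint file, since only the first `8 + header_len` bytes are ever interpreted.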
.gitattributes           Safe     1.52 kB    initial commit                                              about 1 year ago
README.md                Safe     63 Bytes   Create README.md                                            about 1 year ago
config.json              Safe     942 Bytes  Upload fine-tuned RoBERTa model for Weibo classification    about 1 year ago
model.safetensors                 409 MB     Upload fine-tuned RoBERTa model for Weibo classification    about 1 year ago
special_tokens_map.json  Safe     732 Bytes  Upload fine-tuned RoBERTa model for Weibo classification    about 1 year ago
tokenizer.json                    439 kB     Upload fine-tuned RoBERTa model for Weibo classification    about 1 year ago
tokenizer_config.json    Safe     1.31 kB    Upload fine-tuned RoBERTa model for Weibo classification    about 1 year ago
training_args.bin        pickle   5.3 kB     Upload fine-tuned RoBERTa model for Weibo classification    about 1 year ago
vocab.txt                Safe     110 kB     Upload fine-tuned RoBERTa model for Weibo classification    about 1 year ago

Note: training_args.bin is a pickle file. Detected Pickle imports (10): "accelerate.utils.dataclasses.DistributedType", "transformers.trainer_utils.IntervalStrategy", "transformers.trainer_utils.SchedulerType", "transformers.trainer_utils.SaveStrategy", "transformers.training_args.OptimizerNames", "transformers.trainer_pt_utils.AcceleratorConfig", "torch.device", "transformers.training_args.TrainingArguments", "transformers.trainer_utils.HubStrategy", "accelerate.state.PartialState".
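The pickle badge on training_args.bin is a warning rather than decoration: unpickling a file can execute arbitrary code, which is why the Hub lists the classes the file would import before you decide to load it. A hedged stdlib-only sketch of how one might enumerate such imports without unpickling — the helper name is ours, not a Hub API, and the demo runs on an in-memory pickle rather than the real file:

```python
# Sketch: list the classes a pickle would import, WITHOUT unpickling it.
# pickletools.genops only decodes opcodes and never executes them, so this
# is safe to run on an untrusted file such as a downloaded training_args.bin.
import pickle
import pickletools

def pickle_imports(data: bytes) -> list[str]:
    """Return "module.name" strings referenced by GLOBAL/STACK_GLOBAL opcodes."""
    imports = []
    strings = []  # recent string constants, consumed by STACK_GLOBAL
    for opcode, arg, _pos in pickletools.genops(data):
        if opcode.name == "GLOBAL":  # protocols <= 3: "module name" in one arg
            module, name = arg.split(" ", 1)
            imports.append(f"{module}.{name}")
        elif opcode.name in ("SHORT_BINUNICODE", "BINUNICODE", "UNICODE"):
            strings.append(arg)
        elif opcode.name == "STACK_GLOBAL" and len(strings) >= 2:
            imports.append(f"{strings[-2]}.{strings[-1]}")  # protocols >= 4
    return imports

# Demo on an in-memory pickle; collections.OrderedDict stands in for the
# transformers/accelerate classes the Hub detected in training_args.bin.
from collections import OrderedDict
data = pickle.dumps(OrderedDict(a=1))
print(pickle_imports(data))  # ['collections.OrderedDict']
```

If every import in the report is a class you recognize and trust, loading with `torch.load` inside the matching library versions is a judgment call; the weights themselves are better taken from model.safetensors, which needs no unpickling at all.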