Dataset schema (four fields per record):

| column | type | range |
|:--|:--|:--|
| `modelId` | string | length 6 to 107 |
| `label` | list | |
| `readme` | string | length 0 to 56.2k |
| `readme_len` | int64 | 0 to 56.2k |
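A dump in this shape (four cells per record, one cell per line) can be regrouped into structured records with a short stdlib sketch. The field names follow the schema above; the sample lines are illustrative excerpts of this dump, and the `parse_records` helper is hypothetical, not part of any dataset tooling:

```python
from typing import Dict, List


def parse_records(lines: List[str]) -> List[Dict[str, object]]:
    """Group a flat 4-line-per-record dump into dicts.

    Field order per record: modelId, label, readme, readme_len.
    Trailing lines that do not form a full record are ignored.
    """
    records = []
    for i in range(0, len(lines) - len(lines) % 4, 4):
        model_id, label, readme, readme_len = lines[i:i + 4]
        records.append({
            "modelId": model_id,
            "label": None if label == "null" else label,
            "readme": readme,
            # lengths like "1,031" carry thousands separators
            "readme_len": int(readme_len.replace(",", "")),
        })
    return records


sample = [
    "Jeevesh8/bert_ft_cola-14", "null", "Entry not found", "15",
    "princeton-nlp/CoFi-MRPC-s95", '[ "0", "1" ]', "This is a model checkpoint...", "434",
]
records = parse_records(sample)
# Records whose card is missing show up as the literal string "Entry not found".
missing = [r["modelId"] for r in records if r["readme"] == "Entry not found"]
```

The `"Entry not found"` sentinel (readme_len 15) marks models without a card, which is the bulk of the rows below.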
| modelId | label | readme | readme_len |
|:--|:--|:--|--:|
| Jeevesh8/bert_ft_cola-14 | null | Entry not found | 15 |
| Jeevesh8/bert_ft_cola-15 | null | Entry not found | 15 |
| Jeevesh8/bert_ft_cola-16 | null | Entry not found | 15 |
| Jeevesh8/bert_ft_cola-17 | null | Entry not found | 15 |
| Jeevesh8/bert_ft_cola-18 | null | Entry not found | 15 |
| Jeevesh8/bert_ft_cola-19 | null | Entry not found | 15 |
| Jeevesh8/bert_ft_cola-22 | null | Entry not found | 15 |
| Jeevesh8/bert_ft_cola-23 | null | Entry not found | 15 |
| Jeevesh8/bert_ft_cola-24 | null | Entry not found | 15 |
| Jeevesh8/bert_ft_cola-25 | null | Entry not found | 15 |
| Jeevesh8/bert_ft_cola-26 | null | Entry not found | 15 |
| Jeevesh8/bert_ft_cola-28 | null | Entry not found | 15 |
| Jeevesh8/bert_ft_cola-29 | null | Entry not found | 15 |
| Jeevesh8/bert_ft_cola-30 | null | Entry not found | 15 |
| Jeevesh8/bert_ft_cola-32 | null | Entry not found | 15 |
| Jeevesh8/bert_ft_cola-33 | null | Entry not found | 15 |
| Jeevesh8/bert_ft_cola-34 | null | Entry not found | 15 |
| Jeevesh8/bert_ft_cola-36 | null | Entry not found | 15 |
| Jeevesh8/bert_ft_cola-37 | null | Entry not found | 15 |
| Jeevesh8/bert_ft_cola-41 | null | Entry not found | 15 |
| Jeevesh8/bert_ft_cola-43 | null | Entry not found | 15 |
| Jeevesh8/bert_ft_cola-44 | null | Entry not found | 15 |
| Jeevesh8/bert_ft_cola-47 | null | Entry not found | 15 |
| Jeevesh8/bert_ft_cola-48 | null | Entry not found | 15 |
| Jeevesh8/bert_ft_cola-50 | null | Entry not found | 15 |
| Jeevesh8/bert_ft_cola-51 | null | Entry not found | 15 |
| Jeevesh8/bert_ft_cola-52 | null | Entry not found | 15 |
| Jeevesh8/bert_ft_cola-53 | null | Entry not found | 15 |
| Jeevesh8/bert_ft_cola-54 | null | Entry not found | 15 |
| Jeevesh8/bert_ft_cola-57 | null | Entry not found | 15 |
| Jeevesh8/bert_ft_cola-58 | null | Entry not found | 15 |
| Jeevesh8/bert_ft_cola-59 | null | Entry not found | 15 |
| Jeevesh8/bert_ft_cola-60 | null | Entry not found | 15 |
| Jeevesh8/bert_ft_cola-61 | null | Entry not found | 15 |
| Jeevesh8/bert_ft_cola-66 | null | Entry not found | 15 |
| Jeevesh8/bert_ft_cola-69 | null | Entry not found | 15 |
| Jeevesh8/bert_ft_cola-72 | null | Entry not found | 15 |
| Jeevesh8/bert_ft_cola-74 | null | Entry not found | 15 |
| Jeevesh8/bert_ft_cola-76 | null | Entry not found | 15 |
| Jeevesh8/bert_ft_cola-78 | null | Entry not found | 15 |
| Jeevesh8/bert_ft_cola-79 | null | Entry not found | 15 |
| Jeevesh8/bert_ft_cola-82 | null | Entry not found | 15 |
| Jeevesh8/bert_ft_cola-84 | null | Entry not found | 15 |
| Jeevesh8/bert_ft_cola-86 | null | Entry not found | 15 |
| Jeevesh8/bert_ft_cola-88 | null | Entry not found | 15 |
| Jeevesh8/bert_ft_cola-89 | null | Entry not found | 15 |
| Jeevesh8/bert_ft_cola-90 | null | Entry not found | 15 |
| Jeevesh8/bert_ft_cola-93 | null | Entry not found | 15 |
| Jeevesh8/bert_ft_cola-95 | null | Entry not found | 15 |
| Jeevesh8/bert_ft_cola-98 | null | Entry not found | 15 |
### princeton-nlp/CoFi-MRPC-s95

- label: `[ "0", "1" ]`
- readme_len: 434
- readme:

> This is a model checkpoint for "[Structured Pruning Learns Compact and Accurate Models](https://arxiv.org/pdf/2204.00408.pdf)". The model is pruned from `bert-base-uncased` to 95% sparsity on the MRPC dataset. Please go to [our repository](https://github.com/princeton-nlp/CoFiPruning) for more details on how to use the model for inference. Note that you have to use the model class specified in our repository to load the model.
| modelId | label | readme | readme_len |
|:--|:--|:--|--:|
| Nakul24/RoBERTa-Goemotions-6 | [ "LABEL_0", "LABEL_1", "LABEL_2", "LABEL_3", "LABEL_4", "LABEL_5" ] | Entry not found | 15 |
| SreyanG-NVIDIA/bert-base-cased-finetuned-cola | null | Entry not found | 15 |
| choondrise/emolve_basic | [ "LABEL_0", "LABEL_1", "LABEL_2", "LABEL_3", "LABEL_4", "LABEL_5", "LABEL_6", "LABEL_7" ] | Entry not found | 15 |
### cmcmorrow/distilbert-rater

- label: `[ "LABEL_0", "LABEL_1", "LABEL_2", "LABEL_3", "LABEL_4" ]`
- readme_len: 1,031
- readme:

> ---
> license: apache-2.0
> tags:
> - generated_from_trainer
> model-index:
> - name: distilbert-rater
>   results: []
> ---
>
> <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->
>
> # distilbert-rater
>
> This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the None dataset.
>
> ## Model description
>
> More information needed
>
> ## Intended uses & limitations
>
> More information needed
>
> ## Training and evaluation data
>
> More information needed
>
> ## Training procedure
>
> ### Training hyperparameters
>
> The following hyperparameters were used during training:
> - learning_rate: 5e-05
> - train_batch_size: 8
> - eval_batch_size: 8
> - seed: 42
> - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
> - lr_scheduler_type: linear
> - num_epochs: 3
>
> ### Training results
>
> ### Framework versions
>
> - Transformers 4.16.2
> - Pytorch 1.9.1
> - Datasets 1.18.4
> - Tokenizers 0.11.6
| modelId | label | readme | readme_len |
|:--|:--|:--|--:|
| TinySuitStarfish/distilbert-base-uncased-finetuned-emotion | [ "LABEL_0", "LABEL_1", "LABEL_2", "LABEL_3", "LABEL_4", "LABEL_5" ] | Entry not found | 15 |
### Yarn/autotrain-Traimn-853827191

- label: `[ "business", "entertainment", "politics", "sport", "tech" ]`
- readme_len: 1,359
- readme:

> ---
> tags: autotrain
> language: unk
> widget:
> - text: "I love AutoTrain 🤗"
> datasets:
> - Yarn/autotrain-data-Traimn
> co2_eq_emissions: 1.712176860015081
> ---
>
> # Model Trained Using AutoTrain
>
> - Problem type: Multi-class Classification
> - Model ID: 853827191
> - CO2 Emissions (in grams): 1.712176860015081
>
> ## Validation Metrics
>
> - Loss: 0.10257730633020401
> - Accuracy: 0.973421926910299
> - Macro F1: 0.9735224586288418
> - Micro F1: 0.973421926910299
> - Weighted F1: 0.9735187934099364
> - Macro Precision: 0.9738505933839127
> - Micro Precision: 0.973421926910299
> - Weighted Precision: 0.9738995774527256
> - Macro Recall: 0.9734994306470444
> - Micro Recall: 0.973421926910299
> - Weighted Recall: 0.973421926910299
>
> ## Usage
>
> You can use cURL to access this model:
>
> ```
> $ curl -X POST -H "Authorization: Bearer YOUR_API_KEY" -H "Content-Type: application/json" -d '{"inputs": "I love AutoTrain"}' https://api-inference.huggingface.co/models/Yarn/autotrain-Traimn-853827191
> ```
>
> Or the Python API:
>
> ```
> from transformers import AutoModelForSequenceClassification, AutoTokenizer
>
> model = AutoModelForSequenceClassification.from_pretrained("Yarn/autotrain-Traimn-853827191", use_auth_token=True)
> tokenizer = AutoTokenizer.from_pretrained("Yarn/autotrain-Traimn-853827191", use_auth_token=True)
>
> inputs = tokenizer("I love AutoTrain", return_tensors="pt")
> outputs = model(**inputs)
> ```
| modelId | label | readme | readme_len |
|:--|:--|:--|--:|
| EhsanAghazadeh/bert-base-uncased-random-weights-S42 | [ "LABEL_0", "LABEL_1", "LABEL_2" ] | Entry not found | 15 |
### Pablo94/roberta-base-bne-finetuned-detests

- label: null
- readme_len: 1,935
- readme:

> ---
> license: apache-2.0
> tags:
> - generated_from_trainer
> metrics:
> - accuracy
> model-index:
> - name: roberta-base-bne-finetuned-detests
>   results: []
> ---
>
> <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->
>
> # roberta-base-bne-finetuned-detests
>
> This model is a fine-tuned version of [BSC-TeMU/roberta-base-bne](https://huggingface.co/BSC-TeMU/roberta-base-bne) on the None dataset. It achieves the following results on the evaluation set:
> - Loss: 1.0052
> - Accuracy: 0.8674
>
> ## Model description
>
> More information needed
>
> ## Intended uses & limitations
>
> More information needed
>
> ## Training and evaluation data
>
> More information needed
>
> ## Training procedure
>
> ### Training hyperparameters
>
> The following hyperparameters were used during training:
> - learning_rate: 2e-05
> - train_batch_size: 16
> - eval_batch_size: 16
> - seed: 42
> - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
> - lr_scheduler_type: linear
> - num_epochs: 10
>
> ### Training results
>
> | Training Loss | Epoch | Step | Validation Loss | Accuracy |
> |:-------------:|:-----:|:----:|:---------------:|:--------:|
> | 0.2876 | 1.0 | 153 | 0.3553 | 0.8445 |
> | 0.3309 | 2.0 | 306 | 0.4247 | 0.8216 |
> | 0.0679 | 3.0 | 459 | 0.6958 | 0.8494 |
> | 0.0007 | 4.0 | 612 | 0.8027 | 0.8445 |
> | 0.0003 | 5.0 | 765 | 0.9791 | 0.8511 |
> | 0.0002 | 6.0 | 918 | 0.9495 | 0.8642 |
> | 0.0002 | 7.0 | 1071 | 0.9742 | 0.8642 |
> | 0.0001 | 8.0 | 1224 | 0.9913 | 0.8658 |
> | 0.0001 | 9.0 | 1377 | 1.0017 | 0.8674 |
> | 0.0001 | 10.0 | 1530 | 1.0052 | 0.8674 |
>
> ### Framework versions
>
> - Transformers 4.19.1
> - Pytorch 1.11.0+cu113
> - Datasets 2.2.1
> - Tokenizers 0.12.1
### anwesham/autotrain-imdb-sentiment-analysis-864927559

- label: `[ "0", "1" ]`
- readme_len: 1,119
- readme:

> ---
> language: unk
> datasets:
> - anwesham/autotrain-data-imdb-sentiment-analysis
> co2_eq_emissions: 0.2033402242358345
> ---
>
> - Problem type: Binary Classification
> - Model ID: 864927559
> - CO2 Emissions (in grams): 0.2033402242358345
>
> ## Validation Metrics
>
> - Loss: 0.18383920192718506
> - Accuracy: 0.9318
> - Precision: 0.9560625264047318
> - Recall: 0.9052
> - AUC: 0.98281574
> - F1: 0.9299363057324841
>
> ## Usage
>
> You can use cURL to access this model:
>
> ```
> $ curl -X POST -H "Authorization: Bearer YOUR_API_KEY" -H "Content-Type: application/json" -d '{"inputs": "I love AutoTrain"}' https://api-inference.huggingface.co/models/anwesham/autotrain-imdb-sentiment-analysis-864927559
> ```
>
> Or the Python API:
>
> ```
> from transformers import AutoModelForSequenceClassification, AutoTokenizer
>
> model = AutoModelForSequenceClassification.from_pretrained("anwesham/autotrain-imdb-sentiment-analysis-864927559", use_auth_token=True)
> tokenizer = AutoTokenizer.from_pretrained("anwesham/autotrain-imdb-sentiment-analysis-864927559", use_auth_token=True)
>
> inputs = tokenizer("I love to eat food", return_tensors="pt")
> outputs = model(**inputs)
> ```
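As a sanity check, the F1 score in cards like the one above can be recomputed from the reported precision and recall, since F1 is their harmonic mean. The numbers below are copied from the anwesham/autotrain-imdb-sentiment-analysis-864927559 card:

```python
# Recompute F1 from the precision and recall reported in the card above.
precision = 0.9560625264047318
recall = 0.9052

# F1 is the harmonic mean of precision and recall.
f1 = 2 * precision * recall / (precision + recall)

reported_f1 = 0.9299363057324841
assert abs(f1 - reported_f1) < 1e-6  # agrees with the card
```

The same identity can be applied to the macro/micro metrics in the AutoTrain cards, where each averaged precision/recall pair should reproduce the corresponding averaged F1 only approximately (macro-F1 is an average of per-class F1s, not the harmonic mean of macro-precision and macro-recall).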
| modelId | label | readme | readme_len |
|:--|:--|:--|--:|
| reallycarlaost/emobert-valence-5 | [ "LABEL_0", "LABEL_1", "LABEL_2", "LABEL_3", "LABEL_4" ] | Entry not found | 15 |
| modelId | label | readme | readme_len |
|:--|:--|:--|--:|
| Jeevesh8/6ep_bert_ft_cola-0 | null | Entry not found | 15 |
| Jeevesh8/6ep_bert_ft_cola-3 | null | Entry not found | 15 |
| Jeevesh8/6ep_bert_ft_cola-4 | null | Entry not found | 15 |
| Jeevesh8/6ep_bert_ft_cola-5 | null | Entry not found | 15 |
| Jeevesh8/6ep_bert_ft_cola-6 | null | Entry not found | 15 |
| Jeevesh8/6ep_bert_ft_cola-7 | null | Entry not found | 15 |
| Jeevesh8/6ep_bert_ft_cola-8 | null | Entry not found | 15 |
| Jeevesh8/6ep_bert_ft_cola-11 | null | Entry not found | 15 |
| Jeevesh8/6ep_bert_ft_cola-12 | null | Entry not found | 15 |
| Jeevesh8/6ep_bert_ft_cola-13 | null | Entry not found | 15 |
| Jeevesh8/6ep_bert_ft_cola-15 | null | Entry not found | 15 |
| Jeevesh8/6ep_bert_ft_cola-16 | null | Entry not found | 15 |
| Jeevesh8/6ep_bert_ft_cola-17 | null | Entry not found | 15 |
| Jeevesh8/6ep_bert_ft_cola-18 | null | Entry not found | 15 |
| Jeevesh8/6ep_bert_ft_cola-19 | null | Entry not found | 15 |
| Jeevesh8/6ep_bert_ft_cola-20 | null | Entry not found | 15 |
| Jeevesh8/6ep_bert_ft_cola-21 | null | Entry not found | 15 |
| Jeevesh8/6ep_bert_ft_cola-22 | null | Entry not found | 15 |
| Jeevesh8/6ep_bert_ft_cola-24 | null | Entry not found | 15 |
| Jeevesh8/6ep_bert_ft_cola-25 | null | Entry not found | 15 |
| Jeevesh8/6ep_bert_ft_cola-26 | null | Entry not found | 15 |
| Jeevesh8/6ep_bert_ft_cola-27 | null | Entry not found | 15 |
| Jeevesh8/6ep_bert_ft_cola-29 | null | Entry not found | 15 |
| Jeevesh8/6ep_bert_ft_cola-30 | null | Entry not found | 15 |
| Jeevesh8/6ep_bert_ft_cola-32 | null | Entry not found | 15 |
| Jeevesh8/6ep_bert_ft_cola-36 | null | Entry not found | 15 |
| Jeevesh8/6ep_bert_ft_cola-38 | null | Entry not found | 15 |
| Jeevesh8/6ep_bert_ft_cola-39 | null | Entry not found | 15 |
| Jeevesh8/6ep_bert_ft_cola-40 | null | Entry not found | 15 |
| Jeevesh8/6ep_bert_ft_cola-41 | null | Entry not found | 15 |
| Jeevesh8/6ep_bert_ft_cola-43 | null | Entry not found | 15 |
| Jeevesh8/6ep_bert_ft_cola-44 | null | Entry not found | 15 |
| Jeevesh8/6ep_bert_ft_cola-46 | null | Entry not found | 15 |
| Jeevesh8/6ep_bert_ft_cola-47 | null | Entry not found | 15 |
| Jeevesh8/6ep_bert_ft_cola-56 | null | Entry not found | 15 |
| Jeevesh8/6ep_bert_ft_cola-57 | null | Entry not found | 15 |
| Jeevesh8/6ep_bert_ft_cola-58 | null | Entry not found | 15 |
| Jeevesh8/6ep_bert_ft_cola-64 | null | Entry not found | 15 |
| Jeevesh8/6ep_bert_ft_cola-74 | null | Entry not found | 15 |