| pipeline_tag | library_name | text | metadata | id | last_modified | tags | sha | created_at |
|---|---|---|---|---|---|---|---|---|
fill-mask | transformers |
# Transformer language model for Croatian and Serbian
Trained for one epoch (3 million steps) on 28 GB of data containing Croatian and Serbian text.
Datasets: Leipzig Corpus, OSCAR, srWac, hrWac, cc100-hr, and cc100-sr.
| Model | #params | Arch. | Training data ... | {"language": ["hr", "sr", "multilingual"], "license": "apache-2.0", "tags": ["masked-lm"], "datasets": ["oscar", "srwac", "leipzig", "cc100", "hrwac"], "widget": [{"text": "Ovo je po\u010detak <mask>."}]} | Andrija/SRoBERTa-XL | null | [
"transformers",
"pytorch",
"roberta",
"fill-mask",
"masked-lm",
"hr",
"sr",
"multilingual",
"dataset:oscar",
"dataset:srwac",
"dataset:leipzig",
"dataset:cc100",
"dataset:hrwac",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 |
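The fill-mask widget in the card above fills the `<mask>` token by scoring every vocabulary entry: a softmax over the model's logits at the mask position, followed by a top-k selection. A minimal sketch of that step with a toy vocabulary (the words and logit values here are made up for illustration, not real model output):

```python
import numpy as np

def top_k_fill_mask(logits, vocab, k=3):
    """Given vocabulary logits at the <mask> position, return the
    k most likely tokens with their softmax probabilities."""
    # numerically stable softmax over the vocabulary
    shifted = logits - logits.max()
    probs = np.exp(shifted) / np.exp(shifted).sum()
    order = np.argsort(probs)[::-1][:k]
    return [(vocab[i], float(probs[i])) for i in order]

# toy vocabulary and logits standing in for the model's real output
vocab = ["kraja", "priče", "rada", "filma"]
logits = np.array([0.2, 2.5, 1.0, 0.3])
print(top_k_fill_mask(logits, vocab, k=2))  # → [('priče', 0.697...), ('rada', 0.155...)]
```

In practice this is what the hosted inference widget does for the `"Ovo je početak <mask>."` example, just over the model's full subword vocabulary.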
token-classification | transformers | Named Entity Recognition (token classification head) for the Serbian and Croatian languages.
| Abbreviation | Description |
|---|---|
| O | Outside of a named entity |
| B-MIS | Beginning of a miscellaneous entity right after another miscellaneous entity |
| I-MIS | Miscellaneous entity |
| B-PER | Beginning of a person’s name right after another person’... | {"language": ["hr", "sr", "multilingual"], "license": "apache-2.0", "datasets": ["hr500k"], "widget": [{"text": "Moje ime je Aleksandar i zivim u Beogradu pored Vlade Republike Srbije"}]} | Andrija/SRoBERTa-base-NER | null | [
"transformers",
"pytorch",
"roberta",
"token-classification",
"hr",
"sr",
"multilingual",
"dataset:hr500k",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 |
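The tag scheme in the card above (O / B-X / I-X) can be decoded into entity spans with a small loop. This is a generic IOB2 decoding sketch, not the model's own post-processing; the tokens come from the widget sentence, and the labels are illustrative:

```python
def decode_iob2(tokens, labels):
    """Collapse IOB2 labels (O, B-X, I-X) into (entity_text, type) spans.
    An I-X without a preceding span of the same type starts a new span."""
    spans, current, current_type = [], [], None
    for tok, lab in zip(tokens, labels):
        if lab.startswith("B-") or (lab.startswith("I-") and lab[2:] != current_type):
            if current:  # close any span already open
                spans.append((" ".join(current), current_type))
            current, current_type = [tok], lab[2:]
        elif lab.startswith("I-"):
            current.append(tok)
        else:  # "O" ends any open span
            if current:
                spans.append((" ".join(current), current_type))
            current, current_type = [], None
    if current:
        spans.append((" ".join(current), current_type))
    return spans

tokens = ["Moje", "ime", "je", "Aleksandar", "i", "zivim", "u", "Beogradu"]
labels = ["O", "O", "O", "B-PER", "O", "O", "O", "B-LOC"]
print(decode_iob2(tokens, labels))  # → [('Aleksandar', 'PER'), ('Beogradu', 'LOC')]
```

Multi-token entities work the same way: `["New", "York"]` with `["B-LOC", "I-LOC"]` collapses to a single `("New York", "LOC")` span.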
fill-mask | transformers | # Transformer language model for Croatian and Serbian
Trained for two epochs on 3 GB of data containing Croatian and Serbian text.
Datasets: Leipzig and OSCAR.
# Dataset information
| Model | #params | Arch. | Training data |
|----------------... | {"language": ["hr", "sr", "multilingual"], "license": "apache-2.0", "tags": ["masked-lm"], "datasets": ["oscar", "leipzig"], "widget": [{"text": "Ovo je po\u010detak <mask>."}]} | Andrija/SRoBERTa-base | null | [
"transformers",
"pytorch",
"roberta",
"fill-mask",
"masked-lm",
"hr",
"sr",
"multilingual",
"dataset:oscar",
"dataset:leipzig",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 |
fill-mask | transformers | # Transformer language model for Croatian and Serbian
Trained for one epoch on a 0.7 GB Croatian and Serbian dataset.
Dataset from the Leipzig Corpora.
# Dataset information
| Model | #params | Arch. | Training data |
|---------------------------... | {"language": ["hr", "sr", "multilingual"], "license": "apache-2.0", "tags": ["masked-lm"], "datasets": ["leipzig"], "widget": [{"text": "Gde je <mask>."}]} | Andrija/SRoBERTa | null | [
"transformers",
"pytorch",
"roberta",
"fill-mask",
"masked-lm",
"hr",
"sr",
"multilingual",
"dataset:leipzig",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 |
fill-mask | transformers | {} | Andrija/SRoBERTaFastBPE-2 | null | [
"transformers",
"roberta",
"fill-mask",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
null | null | {} | Andrija/SRoBERTaFastBPE | null | [
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
null | null | {} | Andry/111 | null | [
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
null | null | C:\Users\andry\Desktop\Выжигание 24-12-2021.jpg | {} | Andry/1111 | null | [
"region:us"
] | null | 2022-03-02T23:29:04+00:00 |
null | null |
For now, we upload only two models, for building image and video classification demos.
More models and code can be found in our GitHub repo: [UniFormer](https://github.com/Sense-X/UniFormer). | {"license": "mit"} | Andy1621/uniformer | null | [
"license:mit",
"has_space",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 |
null | transformers | {} | AndyJ/clinicalBERT | null | [
"transformers",
"pytorch",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
fill-mask | transformers | {} | AndyJ/prompt_finetune | null | [
"transformers",
"pytorch",
"bert",
"fill-mask",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
multiple-choice | transformers | {} | AndyyyCai/bert-base-uncased-finetuned-copa | null | [
"transformers",
"pytorch",
"bert",
"multiple-choice",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
null | null | {} | Ani123/Ani | null | [
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
text-classification | transformers | {} | Anirbanbhk/Hate-speech-Pretrained-movies | null | [
"transformers",
"tf",
"bert",
"text-classification",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
text-classification | transformers | {} | tensor-trek/distilbert-base-uncased-finetuned-emotion | null | [
"transformers",
"pytorch",
"safetensors",
"distilbert",
"text-classification",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
null | null | {} | Anji/roberta-base-squad2-finetuned-squad | null | [
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
null | null | {} | Ankit-11/distilbert-base-uncased-finetuned-toxic | null | [
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
null | null | {} | Ankitha/DialoGPT-small-harrypotter | null | [
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
null | null | {} | Ankitha/DialoGPT-small-harrypottery | null | [
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
token-classification | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-ner
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/dis... | {"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["conll2003"], "metrics": ["precision", "recall", "f1", "accuracy"], "model_index": [{"name": "distilbert-base-uncased-finetuned-ner", "results": [{"task": {"name": "Token Classification", "type": "token-classification"}, "dataset": {"name": "con... | Ann2020/distilbert-base-uncased-finetuned-ner | null | [
"transformers",
"pytorch",
"tensorboard",
"distilbert",
"token-classification",
"generated_from_trainer",
"dataset:conll2003",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 |
null | null | {} | Ann2020/model-finetuned-ner | null | [
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
null | null | {} | Ann2020/rubert-base-cased-finetuned-ner | null | [
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
null | null | {} | Ann2020/rubert-base-cased-sentence-finetuned-ner | null | [
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
null | null | {} | Ann2020/rubert-base-cased-sentence-finetuned-ner_tags | null | [
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
null | null | {} | AnnettJaeger/AnneJae | null | [
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
null | null | {} | Anomic/DialoGPT-medium-loki | null | [
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
text-classification | transformers | {} | AnonARR/qqp-bert | null | [
"transformers",
"pytorch",
"bert",
"text-classification",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | Pre-trained for stronger reasoning ability; try this model if you are working on tasks like QA. For more details, please see https://openreview.net/forum?id=cGB7CMFtrSx
It is based on the bert-base-uncased model and pre-trained on text input. | {} | Anonymous/ReasonBERT-BERT | null | [
"transformers",
"pytorch",
"bert",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 |
feature-extraction | transformers | Pre-trained for stronger reasoning ability; try this model if you are working on tasks like QA. For more details, please see https://openreview.net/forum?id=cGB7CMFtrSx
It is based on the roberta-base model and pre-trained on text input. | {} | Anonymous/ReasonBERT-RoBERTa | null | [
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 |
feature-extraction | transformers | Pre-trained for stronger reasoning ability; try this model if you are working on tasks like QA. For more details, please see https://openreview.net/forum?id=cGB7CMFtrSx
It is based on the tapas-base (no_reset) model and pre-trained on table input. | {} | Anonymous/ReasonBERT-TAPAS | null | [
"transformers",
"pytorch",
"tapas",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 |
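Feature-extraction checkpoints like the ReasonBERT variants above return one hidden vector per input token; a common way to collapse those into a single fixed-size embedding is mean pooling over the non-padding positions. A minimal NumPy sketch (the shapes and values are illustrative, not taken from any of these models):

```python
import numpy as np

def masked_mean_pool(hidden_states, attention_mask):
    """hidden_states: (seq_len, dim) per-token vectors from the encoder.
    attention_mask: (seq_len,) with 1 for real tokens, 0 for padding.
    Returns a single (dim,) embedding averaged over real tokens only."""
    mask = attention_mask[:, None].astype(hidden_states.dtype)  # (seq_len, 1)
    summed = (hidden_states * mask).sum(axis=0)
    count = np.maximum(mask.sum(), 1e-9)  # avoid division by zero
    return summed / count

# toy example: 3 real tokens + 1 padding position, dim=2;
# the padding row is ignored by the mask
h = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0], [99.0, 99.0]])
m = np.array([1, 1, 1, 0])
print(masked_mean_pool(h, m))  # → [3. 4.]
```

Masking before averaging matters: a plain `h.mean(axis=0)` would let padding vectors contaminate the embedding.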
null | null | {} | Anonymous0230/model_name | null | [
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
null | transformers | {} | AnonymousNLP/pretrained-model-1 | null | [
"transformers",
"pytorch",
"gpt2",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
null | transformers | {} | AnonymousNLP/pretrained-model-2 | null | [
"transformers",
"pytorch",
"gpt2",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/AR_EManuals-BERT | null | [
"transformers",
"pytorch",
"bert",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/AR_EManuals-RoBERTa | null | [
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/AR_SDR_HF_model_base | null | [
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/AR_bert-base-uncased | null | [
"transformers",
"pytorch",
"bert",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/AR_cline | null | [
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/AR_consert | null | [
"transformers",
"pytorch",
"bert",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/AR_declutr | null | [
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/AR_rule_based_bert_quadruplet_epochs_1_shard_1 | null | [
"transformers",
"pytorch",
"bert",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/AR_rule_based_bert_triplet_epochs_1_shard_1 | null | [
"transformers",
"pytorch",
"bert",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/AR_rule_based_hier_quadruplet_epochs_1_shard_1 | null | [
"transformers",
"pytorch",
"bert",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/AR_rule_based_hier_triplet_epochs_1_shard_1 | null | [
"transformers",
"pytorch",
"bert",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/AR_rule_based_only_classfn_epochs_1_shard_1 | null | [
"transformers",
"pytorch",
"bert",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/AR_rule_based_only_classfn_twostage_epochs_1_shard_1 | null | [
"transformers",
"pytorch",
"bert",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/AR_rule_based_roberta_bert_quadruplet_epochs_1_shard_1 | null | [
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/AR_rule_based_roberta_bert_quadruplet_epochs_1_shard_10 | null | [
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/AR_rule_based_roberta_bert_triplet_epochs_1_shard_1 | null | [
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/AR_rule_based_roberta_bert_triplet_epochs_1_shard_10 | null | [
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/AR_rule_based_roberta_hier_quadruplet_epochs_1_shard_1 | null | [
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/AR_rule_based_roberta_hier_quadruplet_epochs_1_shard_10 | null | [
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/AR_rule_based_roberta_hier_triplet_epochs_1_shard_1 | null | [
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/AR_rule_based_roberta_hier_triplet_epochs_1_shard_10 | null | [
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/AR_rule_based_roberta_only_classfn_epochs_1_shard_1 | null | [
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/AR_rule_based_roberta_only_classfn_epochs_1_shard_10 | null | [
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/AR_rule_based_roberta_only_classfn_twostage_epochs_1_shard_1 | null | [
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/AR_rule_based_roberta_only_classfn_twostage_epochs_1_shard_10 | null | [
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/AR_rule_based_roberta_twostage_quadruplet_epochs_1_shard_1 | null | [
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/AR_rule_based_roberta_twostage_quadruplet_epochs_1_shard_10 | null | [
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/AR_rule_based_roberta_twostagequadruplet_hier_epochs_1_shard_1 | null | [
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/AR_rule_based_roberta_twostagequadruplet_hier_epochs_1_shard_10 | null | [
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/AR_rule_based_roberta_twostagetriplet_epochs_1_shard_1 | null | [
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/AR_rule_based_roberta_twostagetriplet_epochs_1_shard_10 | null | [
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/AR_rule_based_roberta_twostagetriplet_hier_epochs_1_shard_1 | null | [
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/AR_rule_based_roberta_twostagetriplet_hier_epochs_1_shard_10 | null | [
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/AR_rule_based_twostage_quadruplet_epochs_1_shard_1 | null | [
"transformers",
"pytorch",
"bert",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/AR_rule_based_twostagequadruplet_hier_epochs_1_shard_1 | null | [
"transformers",
"pytorch",
"bert",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/AR_rule_based_twostagetriplet_epochs_1_shard_1 | null | [
"transformers",
"pytorch",
"bert",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/AR_rule_based_twostagetriplet_hier_epochs_1_shard_1 | null | [
"transformers",
"pytorch",
"bert",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/AR_specter | null | [
"transformers",
"pytorch",
"bert",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/EManuals_BERT_copy | null | [
"transformers",
"pytorch",
"bert",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
text-classification | transformers | {} | AnonymousSub/EManuals_BERT_copy_wikiqa | null | [
"transformers",
"pytorch",
"bert",
"text-classification",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
question-answering | transformers | {} | AnonymousSub/EManuals_BERT_squad2.0 | null | [
"transformers",
"pytorch",
"bert",
"question-answering",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
question-answering | transformers | {} | AnonymousSub/EManuals_RoBERTa_squad2.0 | null | [
"transformers",
"pytorch",
"roberta",
"question-answering",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
text-classification | transformers | {} | AnonymousSub/EManuals_RoBERTa_wikiqa | null | [
"transformers",
"pytorch",
"roberta",
"text-classification",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/SDR_HF_model_base | null | [
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/SR_EManuals-BERT | null | [
"transformers",
"pytorch",
"bert",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/SR_EManuals-RoBERTa | null | [
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/SR_SDR_HF_model_base | null | [
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/SR_bert-base-uncased | null | [
"transformers",
"pytorch",
"bert",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/SR_cline | null | [
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/SR_consert | null | [
"transformers",
"pytorch",
"bert",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/SR_declutr | null | [
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/SR_rule_based_bert_quadruplet_epochs_1_shard_1 | null | [
"transformers",
"pytorch",
"bert",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/SR_rule_based_bert_triplet_epochs_1_shard_1 | null | [
"transformers",
"pytorch",
"bert",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/SR_rule_based_hier_quadruplet_epochs_1_shard_1 | null | [
"transformers",
"pytorch",
"bert",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/SR_rule_based_hier_triplet_epochs_1_shard_1 | null | [
"transformers",
"pytorch",
"bert",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/SR_rule_based_only_classfn_epochs_1_shard_1 | null | [
"transformers",
"pytorch",
"bert",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/SR_rule_based_only_classfn_twostage_epochs_1_shard_1 | null | [
"transformers",
"pytorch",
"bert",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/SR_rule_based_roberta_bert_quadruplet_epochs_1_shard_1 | null | [
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/SR_rule_based_roberta_bert_quadruplet_epochs_1_shard_10 | null | [
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/SR_rule_based_roberta_bert_triplet_epochs_1_shard_1 | null | [
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/SR_rule_based_roberta_bert_triplet_epochs_1_shard_10 | null | [
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/SR_rule_based_roberta_hier_quadruplet_epochs_1_shard_1 | null | [
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/SR_rule_based_roberta_hier_quadruplet_epochs_1_shard_10 | null | [
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/SR_rule_based_roberta_hier_triplet_epochs_1_shard_1 | null | [
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/SR_rule_based_roberta_hier_triplet_epochs_1_shard_10 | null | [
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/SR_rule_based_roberta_hier_triplet_epochs_1_shard_1_wikiqa_copy | null | [
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | |
feature-extraction | transformers | {} | AnonymousSub/SR_rule_based_roberta_only_classfn_epochs_1_shard_1 | null | [
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 |