Columns: license (string, 2–30 chars) · tags (string, 2–513 chars) · is_nc (bool, 1 class) · readme_section (string, 201–597k chars) · hash (string, 32 chars)
mit
['exbert']
false
Pretraining The model was trained on 1024 V100 GPUs for 500K steps with a batch size of 8K and a sequence length of 512. The optimizer used is Adam with a learning rate of 6e-4, \(\beta_1 = 0.9\), \(\beta_2 = 0.98\) and \(\epsilon = 1e-6\), a weight decay of 0.01, learning rate warmup for 24,000 steps, and linear decay of the learning rate after.
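The warmup-then-linear-decay schedule described above can be written as a pure function of the step count. This is only an illustrative sketch (the function name is mine; the step counts and peak learning rate are taken from the text):

```python
def lr_at_step(step, peak_lr=6e-4, warmup_steps=24_000, total_steps=500_000):
    """Linear warmup to peak_lr, then linear decay to 0 at total_steps."""
    if step < warmup_steps:
        return peak_lr * step / warmup_steps
    # Linear decay over the remaining steps after warmup.
    return peak_lr * (total_steps - step) / (total_steps - warmup_steps)

print(lr_at_step(12_000))   # halfway through warmup: half the peak rate
print(lr_at_step(24_000))   # end of warmup: peak rate 6e-4
```

In practice this shape is what `transformers`' linear scheduler with warmup produces; the sketch just makes the two phases explicit.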
cfd15da969ea3488c78e45a8c0912bb8
mit
['exbert']
false
Evaluation results When fine-tuned on downstream tasks, this model achieves the following results.

Glue test results:

| Task  | MNLI | QQP  | QNLI | SST-2 | CoLA | STS-B | MRPC | RTE  |
|:-----:|:----:|:----:|:----:|:-----:|:----:|:-----:|:----:|:----:|
| Score | 87.6 | 91.9 | 92.8 | 94.8  | 63.6 | 91.2  | 90.2 | 78.7 |
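As a quick sanity check on the table above, the unweighted average of the eight task scores can be computed directly (the card itself does not report an average; this is just arithmetic on the listed numbers):

```python
# GLUE test scores as reported in the table above.
scores = {"MNLI": 87.6, "QQP": 91.9, "QNLI": 92.8, "SST-2": 94.8,
          "CoLA": 63.6, "STS-B": 91.2, "MRPC": 90.2, "RTE": 78.7}

avg = sum(scores.values()) / len(scores)
print(round(avg, 2))  # 86.35
```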
7d19093b068095eb42ff101010f2b796
mit
['exbert']
false
BibTeX entry and citation info

```bibtex
@article{DBLP:journals/corr/abs-1907-11692,
  author    = {Yinhan Liu and Myle Ott and Naman Goyal and Jingfei Du and
               Mandar Joshi and Danqi Chen and Omer Levy and Mike Lewis and
               Luke Zettlemoyer and Veselin Stoyanov},
  title     = {RoBERTa: {A} Robustly Optimized {BERT} Pretraining Approach},
  journal   = {CoRR},
  volume    = {abs/1907.11692},
  year      = {2019},
  url       = {http://arxiv.org/abs/1907.11692},
  archivePrefix = {arXiv},
  eprint    = {1907.11692},
  timestamp = {Thu, 01 Aug 2019 08:59:33 +0200},
  biburl    = {https://dblp.org/rec/journals/corr/abs-1907-11692.bib},
  bibsource = {dblp computer science bibliography, https://dblp.org}
}
```

<a href="https://huggingface.co/exbert/?model=roberta-base">
  <img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png">
</a>
c0ec15c46213d3347ad38aecff86ef85
apache-2.0
['multiberts', 'multiberts-seed_1', 'multiberts-seed_1-step_120k']
false
MultiBERTs, Intermediate Checkpoint - Seed 1, Step 120k MultiBERTs is a collection of checkpoints and a statistical library to support robust research on BERT. We provide 25 BERT-base models trained with similar hyper-parameters as [the original BERT model](https://github.com/google-research/bert) but with different random seeds, which causes variations in the initial weights and order of training instances. The aim is to distinguish findings that apply to a specific artifact (i.e., a particular instance of the model) from those that apply to the more general procedure. We also provide 140 intermediate checkpoints captured during the course of pre-training (we saved 28 checkpoints for the first 5 runs). The models were originally released through [http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our paper [The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163). This is model
f221add3e5cff43feed4787b530a34d2
apache-2.0
['multiberts', 'multiberts-seed_1', 'multiberts-seed_1-step_120k']
false
How to use Using code from [BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on TensorFlow:

```python
from transformers import BertTokenizer, TFBertModel

tokenizer = BertTokenizer.from_pretrained("google/multiberts-seed_1-step_120k")
model = TFBertModel.from_pretrained("google/multiberts-seed_1-step_120k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```

PyTorch version:

```python
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("google/multiberts-seed_1-step_120k")
model = BertModel.from_pretrained("google/multiberts-seed_1-step_120k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
208b61fbd36b74160ea8b2733c8f7cff
apache-2.0
['stanza', 'token-classification']
false
Stanza model for Naija (pcm) Stanza is a collection of accurate and efficient tools for the linguistic analysis of many human languages. Starting from raw text through syntactic analysis and entity recognition, Stanza brings state-of-the-art NLP models to languages of your choosing. Find out more on [our website](https://stanfordnlp.github.io/stanza) and in our [GitHub repository](https://github.com/stanfordnlp/stanza). This card and repo were automatically prepared with `hugging_stanza.py` in the `stanfordnlp/huggingface-models` repo. Last updated 2022-09-25 01:53:44.595
11eca49c57b89b5ca2df7746fd84429a
apache-2.0
['generated_from_trainer']
false
Training hyperparameters The following hyperparameters were used during training:

- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 250
- num_epochs: 2
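The relationship between the batch-size fields above is worth making explicit: the effective (total) train batch size is the per-device batch size times the number of gradient accumulation steps. A one-line check, not code from the training script:

```python
train_batch_size = 16            # per-device batch size
gradient_accumulation_steps = 2  # gradients accumulated before each optimizer step
total_train_batch_size = train_batch_size * gradient_accumulation_steps
print(total_train_batch_size)  # 32, matching the reported value
```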
c2d09471d77d0150248fcafca87bc4bb
apache-2.0
['translation']
false
eng-zle

* source group: English
* target group: East Slavic languages
* OPUS readme: [eng-zle](https://github.com/Helsinki-NLP/Tatoeba-Challenge/tree/master/models/eng-zle/README.md)
* model: transformer
* source language(s): eng
* target language(s): bel bel_Latn orv_Cyrl rue rus ukr
* pre-processing: normalization + SentencePiece (spm32k,spm32k)
* a sentence-initial language token is required in the form of `>>id<<` (id = valid target language ID)
* download original weights: [opus2m-2020-08-02.zip](https://object.pouta.csc.fi/Tatoeba-MT-models/eng-zle/opus2m-2020-08-02.zip)
* test set translations: [opus2m-2020-08-02.test.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/eng-zle/opus2m-2020-08-02.test.txt)
* test set scores: [opus2m-2020-08-02.eval.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/eng-zle/opus2m-2020-08-02.eval.txt)
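Because a sentence-initial `>>id<<` token is required, input text must be prefixed with the target-language ID before tokenization. A minimal sketch of that step (the helper name is mine; the set of valid IDs is the target-language list above):

```python
# Target-language IDs listed for this eng-zle model.
VALID_TARGETS = {"bel", "bel_Latn", "orv_Cyrl", "rue", "rus", "ukr"}

def add_language_token(text, target_id):
    """Prefix the sentence-initial >>id<< token the multilingual model expects."""
    if target_id not in VALID_TARGETS:
        raise ValueError(f"unknown target language id: {target_id}")
    return f">>{target_id}<< {text}"

print(add_language_token("Hello, world!", "rus"))  # >>rus<< Hello, world!
```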
a79361e9e213a6d35d0ed13a962844ea
apache-2.0
['translation']
false
Benchmarks

| testset | BLEU | chr-F |
|-----------------------|-------|-------|
| newstest2012-engrus.eng.rus | 27.4 | 0.550 |
| newstest2013-engrus.eng.rus | 21.4 | 0.493 |
| newstest2015-enru-engrus.eng.rus | 24.2 | 0.534 |
| newstest2016-enru-engrus.eng.rus | 23.3 | 0.518 |
| newstest2017-enru-engrus.eng.rus | 25.3 | 0.541 |
| newstest2018-enru-engrus.eng.rus | 22.4 | 0.527 |
| newstest2019-enru-engrus.eng.rus | 24.1 | 0.505 |
| Tatoeba-test.eng-bel.eng.bel | 20.8 | 0.471 |
| Tatoeba-test.eng.multi | 37.2 | 0.580 |
| Tatoeba-test.eng-orv.eng.orv | 0.6 | 0.130 |
| Tatoeba-test.eng-rue.eng.rue | 1.4 | 0.168 |
| Tatoeba-test.eng-rus.eng.rus | 41.3 | 0.616 |
| Tatoeba-test.eng-ukr.eng.ukr | 38.7 | 0.596 |
373ffe2f24730fd16d209f7274e1f2bd
apache-2.0
['translation']
false
System Info:

- hf_name: eng-zle
- source_languages: eng
- target_languages: zle
- opus_readme_url: https://github.com/Helsinki-NLP/Tatoeba-Challenge/tree/master/models/eng-zle/README.md
- original_repo: Tatoeba-Challenge
- tags: ['translation']
- languages: ['en', 'be', 'ru', 'uk', 'zle']
- src_constituents: {'eng'}
- tgt_constituents: {'bel', 'orv_Cyrl', 'bel_Latn', 'rus', 'ukr', 'rue'}
- src_multilingual: False
- tgt_multilingual: True
- prepro: normalization + SentencePiece (spm32k,spm32k)
- url_model: https://object.pouta.csc.fi/Tatoeba-MT-models/eng-zle/opus2m-2020-08-02.zip
- url_test_set: https://object.pouta.csc.fi/Tatoeba-MT-models/eng-zle/opus2m-2020-08-02.test.txt
- src_alpha3: eng
- tgt_alpha3: zle
- short_pair: en-zle
- chrF2_score: 0.58
- bleu: 37.2
- brevity_penalty: 0.989
- ref_len: 63493.0
- src_name: English
- tgt_name: East Slavic languages
- train_date: 2020-08-02
- src_alpha2: en
- tgt_alpha2: zle
- prefer_old: False
- long_pair: eng-zle
- helsinki_git_sha: 480fcbe0ee1bf4774bcbe6226ad9f58e63f6c535
- transformers_git_sha: 2207e5d8cb224e954a7cba69fa4ac2309e9ff30b
- port_machine: brutasse
- port_time: 2020-08-21-14:41
5991d631f7b6cdf21c55edf1a02ae4c1
apache-2.0
['generated_from_trainer']
false
t5-small-en-to-th This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on an unspecified dataset. It achieves the following results on the evaluation set: - Loss: 0.0527 - Bleu: 0.0 - Gen Len: 17.5726
61305827b9fb76caf147573517837ab6
apache-2.0
['generated_from_trainer']
false
Training results

| Training Loss | Epoch | Step | Validation Loss | Bleu | Gen Len |
|:-------------:|:-----:|:-----:|:---------------:|:----:|:-------:|
| 0.0414 | 1.0 | 17810 | 0.0527 | 0.0 | 17.5726 |
e65a677f596e7d28b08aac37e923e0d5
cc-by-4.0
[]
false
roberta-base for QA This is the [roberta-base](https://huggingface.co/roberta-base) model, fine-tuned using the [SQuAD2.0](https://huggingface.co/datasets/squad_v2) dataset. It's been trained on question-answer pairs, including unanswerable questions, for the task of Question Answering.
26b0150a34a6903e2dbbc46f239accd8
cc-by-4.0
[]
false
Hyperparameters

```
batch_size = 96
n_epochs = 2
base_LM_model = "roberta-base"
max_seq_len = 386
learning_rate = 3e-5
lr_schedule = LinearWarmup
warmup_proportion = 0.2
doc_stride = 128
max_query_length = 64
```
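The `max_seq_len`/`doc_stride` pair controls how contexts longer than the model's window are split into overlapping chunks. A toy illustration of that sliding-window logic over token positions, treating `doc_stride` as the step between window starts as in the original BERT SQuAD preprocessing (simplified: it ignores the question and special tokens that shrink the effective window, and note that Hugging Face tokenizers' `stride` argument instead denotes the overlap):

```python
def sliding_windows(n_tokens, max_len=386, stride=128):
    """Yield (start, end) token spans covering n_tokens, stepping by stride."""
    start = 0
    while True:
        end = min(start + max_len, n_tokens)
        yield (start, end)
        if end == n_tokens:
            break
        start += stride

# A 600-token context becomes three overlapping windows.
print(list(sliding_windows(600)))  # [(0, 386), (128, 514), (256, 600)]
```

With these defaults, consecutive windows overlap by `max_len - stride` = 258 tokens, so an answer span near a window boundary still appears whole in at least one chunk.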
5163eb43efbbf31f8f45c75ea9ccaaf7
cc-by-4.0
[]
false
Performance Evaluated on the SQuAD 2.0 dev set with the [official eval script](https://worksheets.codalab.org/rest/bundles/0x6b567e1cf2e041ec80d7098f031c5c9e/contents/blob/).

```
"exact": 79.87029394424324,
"f1": 82.91251169582613,
"total": 11873,
"HasAns_exact": 77.93522267206478,
"HasAns_f1": 84.02838248389763,
"HasAns_total": 5928,
"NoAns_exact": 81.79983179142137,
"NoAns_f1": 81.79983179142137,
"NoAns_total": 5945
```
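The overall exact-match score is just the count-weighted average of the HasAns and NoAns subset scores, which can be verified from the numbers reported above:

```python
# Subset exact-match scores and counts from the eval output above.
has_exact, has_total = 77.93522267206478, 5928
no_exact, no_total = 81.79983179142137, 5945

overall = (has_exact * has_total + no_exact * no_total) / (has_total + no_total)
print(round(overall, 4))  # 79.8703, matching the reported "exact" of 79.8702939...
```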
edf34f6a27a549dc0b030b5d6e5cfbaf
mit
[]
false
GBA FE Class Cards on Stable Diffusion This is the `classcard` concept taught to Stable Diffusion via Textual Inversion. You can load this concept into the [Stable Conceptualizer](https://colab.research.google.com/github/huggingface/notebooks/blob/main/diffusers/stable_conceptualizer_inference.ipynb) notebook. You can also train your own concepts and load them into the concept libraries using [this notebook](https://colab.research.google.com/github/huggingface/notebooks/blob/main/diffusers/sd_textual_inversion_training.ipynb). Here is the new concept you will be able to use as a `style`: ![classcard 0](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/146.jpeg) ![classcard 1](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/40.jpeg) ![classcard 2](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/246.jpeg) ![classcard 3](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/182.jpeg) ![classcard 4](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/1.jpeg) ![classcard 5](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/13.jpeg) ![classcard 6](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/253.jpeg) ![classcard 7](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/12.jpeg) ![classcard 8](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/448.jpeg) ![classcard 9](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/377.jpeg) ![classcard 10](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/31.jpeg) ![classcard 11](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/405.jpeg) ![classcard 
12](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/37.jpeg) ![classcard 13](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/300.jpeg) ![classcard 14](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/280.jpeg) ![classcard 15](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/462.jpeg) ![classcard 16](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/339.jpeg) ![classcard 17](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/173.jpeg) ![classcard 18](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/344.jpeg) ![classcard 19](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/170.jpeg) ![classcard 20](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/149.jpeg) ![classcard 21](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/335.jpeg) ![classcard 22](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/249.jpeg) ![classcard 23](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/420.jpeg) ![classcard 24](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/274.jpeg) ![classcard 25](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/127.jpeg) ![classcard 26](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/268.jpeg) ![classcard 27](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/43.jpeg) ![classcard 28](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/345.jpeg) ![classcard 
29](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/456.jpeg) ![classcard 30](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/360.jpeg) ![classcard 31](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/310.jpeg) ![classcard 32](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/68.jpeg) ![classcard 33](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/480.jpeg) ![classcard 34](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/450.jpeg) ![classcard 35](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/258.jpeg) ![classcard 36](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/74.jpeg) ![classcard 37](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/394.jpeg) ![classcard 38](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/157.jpeg) ![classcard 39](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/114.jpeg) ![classcard 40](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/356.jpeg) ![classcard 41](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/48.jpeg) ![classcard 42](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/376.jpeg) ![classcard 43](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/374.jpeg) ![classcard 44](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/230.jpeg) ![classcard 45](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/160.jpeg) ![classcard 
46](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/364.jpeg) ![classcard 47](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/76.jpeg) ![classcard 48](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/333.jpeg) ![classcard 49](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/476.jpeg) ![classcard 50](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/164.jpeg) ![classcard 51](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/216.jpeg) ![classcard 52](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/87.jpeg) ![classcard 53](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/18.jpeg) ![classcard 54](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/304.jpeg) ![classcard 55](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/282.jpeg) ![classcard 56](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/286.jpeg) ![classcard 57](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/45.jpeg) ![classcard 58](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/208.jpeg) ![classcard 59](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/441.jpeg) ![classcard 60](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/204.jpeg) ![classcard 61](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/95.jpeg) ![classcard 62](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/135.jpeg) ![classcard 
63](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/464.jpeg) ![classcard 64](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/144.jpeg) ![classcard 65](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/390.jpeg) ![classcard 66](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/140.jpeg) ![classcard 67](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/166.jpeg) ![classcard 68](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/237.jpeg) ![classcard 69](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/199.jpeg) ![classcard 70](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/459.jpeg) ![classcard 71](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/219.jpeg) ![classcard 72](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/402.jpeg) ![classcard 73](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/440.jpeg) ![classcard 74](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/454.jpeg) ![classcard 75](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/185.jpeg) ![classcard 76](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/28.jpeg) ![classcard 77](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/453.jpeg) ![classcard 78](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/83.jpeg) ![classcard 79](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/129.jpeg) ![classcard 
80](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/380.jpeg) ![classcard 81](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/54.jpeg) ![classcard 82](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/254.jpeg) ![classcard 83](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/366.jpeg) ![classcard 84](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/278.jpeg) ![classcard 85](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/461.jpeg) ![classcard 86](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/8.jpeg) ![classcard 87](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/365.jpeg) ![classcard 88](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/197.jpeg) ![classcard 89](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/159.jpeg) ![classcard 90](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/338.jpeg) ![classcard 91](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/429.jpeg) ![classcard 92](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/293.jpeg) ![classcard 93](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/428.jpeg) ![classcard 94](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/60.jpeg) ![classcard 95](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/470.jpeg) ![classcard 96](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/473.jpeg) ![classcard 
97](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/194.jpeg) ![classcard 98](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/23.jpeg) ![classcard 99](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/112.jpeg) ![classcard 100](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/396.jpeg) ![classcard 101](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/235.jpeg) ![classcard 102](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/169.jpeg) ![classcard 103](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/321.jpeg) ![classcard 104](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/260.jpeg) ![classcard 105](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/103.jpeg) ![classcard 106](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/151.jpeg) ![classcard 107](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/34.jpeg) ![classcard 108](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/325.jpeg) ![classcard 109](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/410.jpeg) ![classcard 110](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/51.jpeg) ![classcard 111](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/236.jpeg) ![classcard 112](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/250.jpeg) ![classcard 113](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/257.jpeg) ![classcard 
114](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/433.jpeg) ![classcard 115](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/384.jpeg) ![classcard 116](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/133.jpeg) ![classcard 117](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/279.jpeg) ![classcard 118](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/115.jpeg) ![classcard 119](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/41.jpeg) ![classcard 120](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/288.jpeg) ![classcard 121](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/154.jpeg) ![classcard 122](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/190.jpeg) ![classcard 123](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/305.jpeg) ![classcard 124](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/116.jpeg) ![classcard 125](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/468.jpeg) ![classcard 126](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/17.jpeg) ![classcard 127](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/223.jpeg) ![classcard 128](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/446.jpeg) ![classcard 129](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/232.jpeg) ![classcard 130](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/172.jpeg) ![classcard 
131](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/407.jpeg) ![classcard 132](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/225.jpeg) ![classcard 133](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/57.jpeg) ![classcard 134](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/77.jpeg) ![classcard 135](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/66.jpeg) ![classcard 136](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/65.jpeg) ![classcard 137](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/49.jpeg) ![classcard 138](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/121.jpeg) ![classcard 139](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/379.jpeg) ![classcard 140](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/466.jpeg) ![classcard 141](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/382.jpeg) ![classcard 142](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/213.jpeg) ![classcard 143](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/9.jpeg) ![classcard 144](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/202.jpeg) ![classcard 145](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/210.jpeg) ![classcard 146](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/316.jpeg) ![classcard 147](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/359.jpeg) ![classcard 
148](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/419.jpeg) ![classcard 149](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/207.jpeg) ![classcard 150](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/266.jpeg) ![classcard 151](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/399.jpeg) ![classcard 152](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/416.jpeg) ![classcard 153](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/413.jpeg) ![classcard 154](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/171.jpeg) ![classcard 155](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/181.jpeg) ![classcard 156](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/78.jpeg) ![classcard 157](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/58.jpeg) ![classcard 158](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/123.jpeg) ![classcard 159](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/153.jpeg) ![classcard 160](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/52.jpeg) ![classcard 161](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/389.jpeg) ![classcard 162](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/64.jpeg) ![classcard 163](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/163.jpeg) ![classcard 164](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/85.jpeg) ![classcard 
165](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/392.jpeg) ![classcard 166](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/334.jpeg) ![classcard 167](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/30.jpeg) ![classcard 168](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/451.jpeg) ![classcard 169](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/73.jpeg) ![classcard 170](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/343.jpeg) ![classcard 171](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/152.jpeg) ![classcard 172](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/2.jpeg) ![classcard 173](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/137.jpeg) ![classcard 174](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/36.jpeg) ![classcard 175](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/486.jpeg) ![classcard 176](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/352.jpeg) ![classcard 177](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/270.jpeg) ![classcard 178](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/224.jpeg) ![classcard 179](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/307.jpeg) ![classcard 180](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/245.jpeg) ![classcard 181](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/263.jpeg) ![classcard 
182](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/403.jpeg) ![classcard 183](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/414.jpeg) ![classcard 184](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/261.jpeg) ![classcard 185](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/427.jpeg) ![classcard 186](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/145.jpeg) ![classcard 187](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/67.jpeg) ![classcard 188](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/341.jpeg) ![classcard 189](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/330.jpeg) ![classcard 190](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/436.jpeg) ![classcard 191](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/362.jpeg) ![classcard 192](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/291.jpeg) ![classcard 193](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/474.jpeg) ![classcard 194](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/342.jpeg) ![classcard 195](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/108.jpeg) ![classcard 196](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/0.jpeg) ![classcard 197](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/55.jpeg) ![classcard 198](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/29.jpeg) ![classcard 
199](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/14.jpeg) ![classcard 200](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/25.jpeg) ![classcard 201](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/432.jpeg) ![classcard 202](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/94.jpeg) ![classcard 203](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/50.jpeg) ![classcard 204](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/417.jpeg) ![classcard 205](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/431.jpeg) ![classcard 206](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/148.jpeg) ![classcard 207](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/158.jpeg) ![classcard 208](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/469.jpeg) ![classcard 209](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/277.jpeg) ![classcard 210](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/244.jpeg) ![classcard 211](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/294.jpeg) ![classcard 212](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/458.jpeg) ![classcard 213](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/422.jpeg) ![classcard 214](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/251.jpeg) ![classcard 215](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/147.jpeg) ![classcard 
216](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/122.jpeg) ![classcard 217](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/275.jpeg) ![classcard 218](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/92.jpeg) ![classcard 219](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/239.jpeg) ![classcard 220](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/332.jpeg) ![classcard 221](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/104.jpeg) ![classcard 222](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/177.jpeg) ![classcard 223](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/175.jpeg) ![classcard 224](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/368.jpeg) ![classcard 225](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/39.jpeg) ![classcard 226](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/4.jpeg) ![classcard 227](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/61.jpeg) ![classcard 228](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/228.jpeg) ![classcard 229](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/176.jpeg) ![classcard 230](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/227.jpeg) ![classcard 231](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/240.jpeg) ![classcard 232](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/385.jpeg) ![classcard 
233](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/222.jpeg) ![classcard 234](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/255.jpeg) ![classcard 235](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/238.jpeg) ![classcard 236](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/292.jpeg) ![classcard 237](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/452.jpeg) ![classcard 238](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/162.jpeg) ![classcard 239](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/284.jpeg) ![classcard 240](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/231.jpeg) ![classcard 241](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/259.jpeg) ![classcard 242](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/435.jpeg) ![classcard 243](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/273.jpeg) ![classcard 244](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/361.jpeg) ![classcard 245](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/337.jpeg) ![classcard 246](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/98.jpeg) ![classcard 247](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/10.jpeg) ![classcard 248](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/132.jpeg) ![classcard 249](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/124.jpeg) ![classcard 
250](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/370.jpeg) ![classcard 251](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/156.jpeg) ![classcard 252](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/113.jpeg) ![classcard 253](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/439.jpeg) ![classcard 254](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/6.jpeg) ![classcard 255](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/324.jpeg) ![classcard 256](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/404.jpeg) ![classcard 257](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/478.jpeg) ![classcard 258](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/93.jpeg) ![classcard 259](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/192.jpeg) ![classcard 260](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/408.jpeg) ![classcard 261](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/100.jpeg) ![classcard 262](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/386.jpeg) ![classcard 263](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/375.jpeg) ![classcard 264](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/465.jpeg) ![classcard 265](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/393.jpeg) ![classcard 266](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/206.jpeg) ![classcard 
267](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/303.jpeg) ![classcard 268](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/24.jpeg) ![classcard 269](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/445.jpeg) ![classcard 270](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/319.jpeg) ![classcard 271](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/11.jpeg) ![classcard 272](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/90.jpeg) ![classcard 273](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/179.jpeg) ![classcard 274](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/80.jpeg) ![classcard 275](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/449.jpeg) ![classcard 276](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/119.jpeg) ![classcard 277](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/318.jpeg) ![classcard 278](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/308.jpeg) ![classcard 279](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/320.jpeg) ![classcard 280](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/110.jpeg) ![classcard 281](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/214.jpeg) ![classcard 282](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/264.jpeg) ![classcard 283](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/328.jpeg) ![classcard 
284](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/471.jpeg) ![classcard 285](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/387.jpeg) ![classcard 286](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/32.jpeg) ![classcard 287](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/21.jpeg) ![classcard 288](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/353.jpeg) ![classcard 289](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/460.jpeg) ![classcard 290](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/301.jpeg) ![classcard 291](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/444.jpeg) ![classcard 292](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/53.jpeg) ![classcard 293](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/400.jpeg) ![classcard 294](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/421.jpeg) ![classcard 295](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/42.jpeg) ![classcard 296](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/69.jpeg) ![classcard 297](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/242.jpeg) ![classcard 298](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/322.jpeg) ![classcard 299](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/89.jpeg) ![classcard 300](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/309.jpeg) ![classcard 
301](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/5.jpeg) ![classcard 302](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/388.jpeg) ![classcard 303](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/475.jpeg) ![classcard 304](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/82.jpeg) ![classcard 305](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/272.jpeg) ![classcard 306](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/327.jpeg) ![classcard 307](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/59.jpeg) ![classcard 308](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/479.jpeg) ![classcard 309](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/296.jpeg) ![classcard 310](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/62.jpeg) ![classcard 311](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/424.jpeg) ![classcard 312](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/281.jpeg) ![classcard 313](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/351.jpeg) ![classcard 314](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/434.jpeg) ![classcard 315](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/283.jpeg) ![classcard 316](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/3.jpeg) ![classcard 317](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/269.jpeg) ![classcard 
318](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/276.jpeg) ![classcard 319](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/120.jpeg) ![classcard 320](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/189.jpeg) ![classcard 321](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/128.jpeg) ![classcard 322](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/81.jpeg) ![classcard 323](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/150.jpeg) ![classcard 324](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/406.jpeg) ![classcard 325](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/395.jpeg) ![classcard 326](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/99.jpeg) ![classcard 327](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/302.jpeg) ![classcard 328](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/346.jpeg) ![classcard 329](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/63.jpeg) ![classcard 330](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/72.jpeg) ![classcard 331](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/109.jpeg) ![classcard 332](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/200.jpeg) ![classcard 333](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/96.jpeg) ![classcard 334](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/285.jpeg) ![classcard 
335](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/323.jpeg) ![classcard 336](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/56.jpeg) ![classcard 337](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/118.jpeg) ![classcard 338](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/209.jpeg) ![classcard 339](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/252.jpeg) ![classcard 340](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/155.jpeg) ![classcard 341](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/168.jpeg) ![classcard 342](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/136.jpeg) ![classcard 343](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/467.jpeg) ![classcard 344](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/193.jpeg) ![classcard 345](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/243.jpeg) ![classcard 346](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/306.jpeg) ![classcard 347](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/398.jpeg) ![classcard 348](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/26.jpeg) ![classcard 349](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/381.jpeg) ![classcard 350](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/298.jpeg) ![classcard 351](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/167.jpeg) ![classcard 
352](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/233.jpeg) ![classcard 353](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/331.jpeg) ![classcard 354](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/447.jpeg) ![classcard 355](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/415.jpeg) ![classcard 356](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/226.jpeg) ![classcard 357](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/455.jpeg) ![classcard 358](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/97.jpeg) ![classcard 359](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/358.jpeg) ![classcard 360](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/19.jpeg) ![classcard 361](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/105.jpeg) ![classcard 362](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/383.jpeg) ![classcard 363](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/125.jpeg) ![classcard 364](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/131.jpeg) ![classcard 365](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/130.jpeg) ![classcard 366](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/256.jpeg) ![classcard 367](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/47.jpeg) ![classcard 368](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/212.jpeg) ![classcard 
369](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/347.jpeg) ![classcard 370](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/71.jpeg) ![classcard 371](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/165.jpeg) ![classcard 372](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/482.jpeg) ![classcard 373](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/191.jpeg) ![classcard 374](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/314.jpeg) ![classcard 375](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/348.jpeg) ![classcard 376](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/20.jpeg) ![classcard 377](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/139.jpeg) ![classcard 378](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/184.jpeg) ![classcard 379](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/161.jpeg) ![classcard 380](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/290.jpeg) ![classcard 381](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/15.jpeg) ![classcard 382](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/326.jpeg) ![classcard 383](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/312.jpeg) ![classcard 384](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/262.jpeg) ![classcard 385](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/483.jpeg) ![classcard 
386](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/203.jpeg) ![classcard 387](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/371.jpeg) ![classcard 388](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/215.jpeg) ![classcard 389](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/315.jpeg) ![classcard 390](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/442.jpeg) ![classcard 391](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/16.jpeg) ![classcard 392](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/217.jpeg) ![classcard 393](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/44.jpeg) ![classcard 394](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/33.jpeg) ![classcard 395](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/117.jpeg) ![classcard 396](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/220.jpeg) ![classcard 397](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/425.jpeg) ![classcard 398](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/38.jpeg) ![classcard 399](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/248.jpeg) ![classcard 400](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/357.jpeg) ![classcard 401](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/443.jpeg) ![classcard 402](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/317.jpeg) ![classcard 
403](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/485.jpeg) ![classcard 404](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/22.jpeg) ![classcard 405](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/295.jpeg) ![classcard 406](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/423.jpeg) ![classcard 407](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/106.jpeg) ![classcard 408](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/329.jpeg) ![classcard 409](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/340.jpeg) ![classcard 410](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/195.jpeg) ![classcard 411](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/349.jpeg) ![classcard 412](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/336.jpeg) ![classcard 413](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/201.jpeg) ![classcard 414](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/289.jpeg) ![classcard 415](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/378.jpeg) ![classcard 416](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/373.jpeg) ![classcard 417](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/86.jpeg) ![classcard 418](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/198.jpeg) ![classcard 419](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/174.jpeg) ![classcard 
420](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/188.jpeg) ![classcard 421](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/412.jpeg) ![classcard 422](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/430.jpeg) ![classcard 423](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/311.jpeg) ![classcard 424](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/355.jpeg) ![classcard 425](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/363.jpeg) ![classcard 426](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/211.jpeg) ![classcard 427](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/88.jpeg) ![classcard 428](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/484.jpeg) ![classcard 429](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/265.jpeg) ![classcard 430](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/354.jpeg) ![classcard 431](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/107.jpeg) ![classcard 432](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/75.jpeg) ![classcard 433](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/79.jpeg) ![classcard 434](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/372.jpeg) ![classcard 435](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/221.jpeg) ![classcard 436](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/472.jpeg) ![classcard 
437](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/141.jpeg) ![classcard 438](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/297.jpeg) ![classcard 439](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/267.jpeg) ![classcard 440](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/369.jpeg) ![classcard 441](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/401.jpeg) ![classcard 442](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/247.jpeg) ![classcard 443](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/27.jpeg) ![classcard 444](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/70.jpeg) ![classcard 445](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/91.jpeg) ![classcard 446](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/218.jpeg) ![classcard 447](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/411.jpeg) ![classcard 448](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/234.jpeg) ![classcard 449](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/142.jpeg) ![classcard 450](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/180.jpeg) ![classcard 451](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/299.jpeg) ![classcard 452](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/205.jpeg) ![classcard 453](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/457.jpeg) ![classcard 
454](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/287.jpeg) ![classcard 455](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/350.jpeg) ![classcard 456](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/134.jpeg) ![classcard 457](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/391.jpeg) ![classcard 458](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/186.jpeg) ![classcard 459](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/437.jpeg) ![classcard 460](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/477.jpeg) ![classcard 461](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/187.jpeg) ![classcard 462](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/84.jpeg) ![classcard 463](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/196.jpeg) ![classcard 464](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/7.jpeg) ![classcard 465](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/35.jpeg) ![classcard 466](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/183.jpeg) ![classcard 467](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/397.jpeg) ![classcard 468](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/229.jpeg) ![classcard 469](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/101.jpeg) ![classcard 470](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/46.jpeg) ![classcard 
471](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/111.jpeg) ![classcard 472](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/367.jpeg) ![classcard 473](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/463.jpeg) ![classcard 474](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/418.jpeg) ![classcard 475](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/143.jpeg) ![classcard 476](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/241.jpeg) ![classcard 477](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/138.jpeg) ![classcard 478](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/126.jpeg) ![classcard 479](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/438.jpeg) ![classcard 480](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/481.jpeg) ![classcard 481](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/313.jpeg) ![classcard 482](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/271.jpeg) ![classcard 483](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/409.jpeg) ![classcard 484](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/178.jpeg) ![classcard 485](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/102.jpeg) ![classcard 486](https://huggingface.co/sd-concepts-library/gba-fe-class-cards/resolve/main/concept_images/426.jpeg)
ef237e8b99f8c37217e3cc6b1d376ec0
apache-2.0
['generated_from_trainer']
false
all-roberta-large-v1-travel-4-16-5-oos

This model is a fine-tuned version of [sentence-transformers/all-roberta-large-v1](https://huggingface.co/sentence-transformers/all-roberta-large-v1) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 2.1384
- Accuracy: 0.4289
1f7c67b8a0dd37cb1db0b2b0c7a6ce34
apache-2.0
['generated_from_trainer']
false
Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 2.7625        | 1.0   | 1    | 2.5258          | 0.2933   |
| 2.0955        | 2.0   | 2    | 2.3775          | 0.3333   |
| 1.7076        | 3.0   | 3    | 2.2590          | 0.38     |
| 1.3257        | 4.0   | 4    | 2.1788          | 0.4089   |
| 1.1109        | 5.0   | 5    | 2.1384          | 0.4289   |
949d128bf790295a41fd4c0cc420274f
apache-2.0
['generated_from_keras_callback']
false
adeebt/opus-mt-en-ml-finetuned-en-to-ml

This model is a fine-tuned version of [Helsinki-NLP/opus-mt-en-ml](https://huggingface.co/Helsinki-NLP/opus-mt-en-ml) on an unknown dataset. It achieves the following results on the evaluation set:
- Train Loss: 2.5102
- Validation Loss: 2.2650
- Train Bleu: 6.9525
- Train Gen Len: 22.3542
- Epoch: 0
79e762db93a76bf64f9c57a0415814c2
apache-2.0
['generated_from_keras_callback']
false
Training hyperparameters

The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 0.0002, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
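The `AdamWeightDecay` optimizer above applies weight decay *decoupled* from the Adam gradient update, rather than folding it into the gradient. A minimal pure-Python sketch of one decoupled update step for a single scalar parameter (illustrative only, not the Keras implementation):

```python
def adamw_step(param, grad, m, v, step, lr=2e-4, beta1=0.9, beta2=0.999,
               eps=1e-7, weight_decay=0.01):
    """One decoupled-weight-decay Adam update for a single scalar parameter."""
    m = beta1 * m + (1 - beta1) * grad       # first-moment estimate
    v = beta2 * v + (1 - beta2) * grad ** 2  # second-moment estimate
    m_hat = m / (1 - beta1 ** step)          # bias correction
    v_hat = v / (1 - beta2 ** step)
    # decay acts on the parameter directly, not through the gradient moments
    param = param - lr * (m_hat / (v_hat ** 0.5 + eps) + weight_decay * param)
    return param, m, v

p, m, v = 1.0, 0.0, 0.0
p, m, v = adamw_step(p, grad=0.5, m=m, v=v, step=1)
```

The hyperparameter values in the sketch's defaults are the ones listed above.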
edd8b364c17b2733978363618e28cc0d
apache-2.0
['generated_from_keras_callback']
false
Training results

| Train Loss | Validation Loss | Train Bleu | Train Gen Len | Epoch |
|:----------:|:---------------:|:----------:|:-------------:|:-----:|
| 2.5102     | 2.2650          | 6.9525     | 22.3542       | 0     |
890742843f6082e06bf1689f35005e22
creativeml-openrail-m
['text-to-image', 'stable-diffusion']
false
Model

The Dreambooth concept any-ely-wd-ira-olympus-3500 was trained by hr16 with the [Shinja Zero SoTA DreamBooth_Stable_Diffusion](https://colab.research.google.com/drive/1G7qx6M_S1PDDlsWIMdbZXwdZik6sUlEh) notebook. <br>
Test the concept with the [Shinja Zero no Notebook](https://colab.research.google.com/drive/1Hp1ZIjPbsZKlCtomJVmt2oX7733W44b0), <br>
or test it with `diffusers` via the [Colab Notebook for Inference](https://colab.research.google.com/github/huggingface/notebooks/blob/main/diffusers/sd_dreambooth_inference.ipynb).

Sample images of the concept: WIP
657bcb3fda43f7e22f051d641fa6ff4d
apache-2.0
['generated_from_trainer']
false
code-vs-nl

This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased), trained on [bookcorpus](https://huggingface.co/datasets/bookcorpus) for natural-language text and [codeparrot/github-code](https://huggingface.co/datasets/codeparrot/github-code) for code. It achieves the following results on the evaluation set:
- Loss: 0.5180
- Accuracy: 0.9951
- F1 Score: 0.9950
531bc75527f9112c8e66cfed2a1caeb4
apache-2.0
['generated_from_trainer']
false
Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-07
- train_batch_size: 256
- eval_batch_size: 1024
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- training_steps: 1000
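With `lr_scheduler_type: linear` and no warmup, the learning rate decays linearly from its initial value to zero over the 1000 training steps. A small pure-Python sketch of that schedule (illustrative; not the Trainer's internal scheduler code):

```python
def linear_lr(step, base_lr=1e-07, total_steps=1000, warmup_steps=0):
    """Linear warmup (none configured here) followed by linear decay to zero."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

# learning rate sampled every 250 steps across the run
lrs = [linear_lr(s) for s in range(0, 1001, 250)]
```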
62e867de8d69ed5ad81845fa2477e090
apache-2.0
['generated_from_trainer']
false
Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 Score |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:--------:|
| 0.5732        | 0.07  | 500  | 0.5658          | 0.9934   | 0.9934   |
| 0.5254        | 0.14  | 1000 | 0.5180          | 0.9951   | 0.9950   |
eaef2603d4f0b24c64d5e7da6b73a008
cc-by-4.0
['espnet', 'audio', 'text-to-speech']
false
`kan-bayashi/vctk_xvector_transformer` ♻️ Imported from https://zenodo.org/record/4393279/ This model was trained by kan-bayashi using vctk/tts1 recipe in [espnet](https://github.com/espnet/espnet/).
4cd57968b5a11b05e19700583ce5ebbf
apache-2.0
['vision', 'depth-estimation', 'generated_from_trainer']
false
glpn-nyu-finetuned-diode-230103-091356

This model is a fine-tuned version of [vinvino02/glpn-nyu](https://huggingface.co/vinvino02/glpn-nyu) on the diode-subset dataset. It achieves the following results on the evaluation set:
- Loss: 0.4360
- Mae: 0.4251
- Rmse: 0.6169
- Abs Rel: 0.4500
- Log Mae: 0.1721
- Log Rmse: 0.2269
- Delta1: 0.3828
- Delta2: 0.6326
- Delta3: 0.8051
4721a3611374835354d8bfca2321fd52
apache-2.0
['vision', 'depth-estimation', 'generated_from_trainer']
false
Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 24
- eval_batch_size: 48
- seed: 2022
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.15
- num_epochs: 100
- mixed_precision_training: Native AMP
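`lr_scheduler_warmup_ratio: 0.15` means warmup lasts for 15% of all optimizer steps. With 100 epochs and 72 steps per epoch (the step counts in the training log), that works out to 0.15 × 7200 = 1080 warmup steps. A quick sketch of the ratio-to-steps conversion (the ceiling rounding is an assumption about how the Trainer resolves fractional steps):

```python
import math

def warmup_steps_from_ratio(warmup_ratio, num_epochs, steps_per_epoch):
    """Convert a warmup ratio into an absolute number of warmup steps."""
    total_steps = num_epochs * steps_per_epoch
    return math.ceil(total_steps * warmup_ratio)

steps = warmup_steps_from_ratio(0.15, num_epochs=100, steps_per_epoch=72)
```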
9b420df4728d7d3a6608910a587a2172
apache-2.0
['vision', 'depth-estimation', 'generated_from_trainer']
false
Training results

| Training Loss | Epoch | Step | Validation Loss | Mae | Rmse | Abs Rel | Log Mae | Log Rmse | Delta1 | Delta2 | Delta3 |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:-------:|:-------:|:--------:|:------:|:------:|:------:|
| 1.0762 | 1.0 | 72 | 0.5031 | 0.4779 | 0.6690 | 0.5503 | 0.2006 | 0.2591 | 0.3020 | 0.5337 | 0.8000 |
| 0.478 | 2.0 | 144 | 0.4653 | 0.4509 | 0.6307 | 0.4891 | 0.1861 | 0.2377 | 0.3300 | 0.5805 | 0.7734 |
| 0.4668 | 3.0 | 216 | 0.4845 | 0.4712 | 0.6373 | 0.5469 | 0.1963 | 0.2471 | 0.3110 | 0.5254 | 0.7235 |
| 0.4389 | 4.0 | 288 | 0.4587 | 0.4368 | 0.6219 | 0.4887 | 0.1787 | 0.2344 | 0.3578 | 0.6099 | 0.7926 |
| 0.4626 | 5.0 | 360 | 0.4879 | 0.4662 | 0.6351 | 0.5617 | 0.1937 | 0.2482 | 0.3135 | 0.5462 | 0.7395 |
| 0.4534 | 6.0 | 432 | 0.4638 | 0.4422 | 0.6236 | 0.4951 | 0.1810 | 0.2358 | 0.3606 | 0.5844 | 0.7831 |
| 0.4108 | 7.0 | 504 | 0.4688 | 0.4508 | 0.6279 | 0.5050 | 0.1856 | 0.2385 | 0.3426 | 0.5701 | 0.7623 |
| 0.3832 | 8.0 | 576 | 0.4759 | 0.4533 | 0.6284 | 0.5257 | 0.1869 | 0.2411 | 0.3331 | 0.5701 | 0.7617 |
| 0.4097 | 9.0 | 648 | 0.4771 | 0.4501 | 0.6303 | 0.5361 | 0.1855 | 0.2433 | 0.3454 | 0.5838 | 0.7609 |
| 0.3799 | 10.0 | 720 | 0.4575 | 0.4375 | 0.6240 | 0.4874 | 0.1790 | 0.2349 | 0.3669 | 0.6032 | 0.7916 |
| 0.3659 | 11.0 | 792 | 0.4718 | 0.4590 | 0.6298 | 0.5176 | 0.1893 | 0.2396 | 0.3283 | 0.5502 | 0.7368 |
| 0.4145 | 12.0 | 864 | 0.4776 | 0.4561 | 0.6298 | 0.5325 | 0.1883 | 0.2421 | 0.3333 | 0.5611 | 0.7540 |
| 0.4224 | 13.0 | 936 | 0.4320 | 0.4138 | 0.6202 | 0.4013 | 0.1655 | 0.2232 | 0.4217 | 0.6641 | 0.8004 |
| 0.4142 | 14.0 | 1008 | 0.4597 | 0.4440 | 0.6234 | 0.4842 | 0.1813 | 0.2330 | 0.3520 | 0.5895 | 0.7617 |
| 0.4393 | 15.0 | 1080 | 0.4333 | 0.4251 | 0.6197 | 0.4182 | 0.1712 | 0.2225 | 0.3787 | 0.6303 | 0.8100 |
| 0.4045 | 16.0 | 1152 | 0.4603 | 0.4356 | 0.6197 | 0.4819 | 0.1776 | 0.2322 | 0.3635 | 0.6050 | 0.7858 |
| 0.3708 | 17.0 | 1224 | 0.4738 | 0.4567 | 0.6292 | 0.5264 | 0.1886 | 0.2411 | 0.3283 | 0.5557 | 0.7596 |
| 0.4042 | 18.0 | 1296 | 0.5004 | 0.4802 | 0.6423 | 0.6101 | 0.2008 | 0.2560 | 0.3022 | 0.5165 | 0.6931 |
| 0.3763 | 19.0 | 1368 | 0.4501 | 0.4361 | 0.6213 | 0.4723 | 0.1772 | 0.2303 | 0.3634 | 0.6034 | 0.7889 |
| 0.4084 | 20.0 | 1440 | 0.4272 | 0.4133 | 0.6208 | 0.3958 | 0.1649 | 0.2226 | 0.4284 | 0.6684 | 0.8009 |
| 0.3637 | 21.0 | 1512 | 0.4307 | 0.4145 | 0.6199 | 0.4134 | 0.1665 | 0.2241 | 0.3957 | 0.6847 | 0.8137 |
| 0.3655 | 22.0 | 1584 | 0.4591 | 0.4374 | 0.6370 | 0.4594 | 0.1791 | 0.2384 | 0.3816 | 0.6264 | 0.7826 |
| 0.3844 | 23.0 | 1656 | 0.4692 | 0.4444 | 0.6273 | 0.5241 | 0.1824 | 0.2407 | 0.3540 | 0.5990 | 0.7756 |
| 0.428 | 24.0 | 1728 | 0.4982 | 0.4753 | 0.6403 | 0.6084 | 0.1984 | 0.2552 | 0.3099 | 0.5233 | 0.7204 |
| 0.4051 | 25.0 | 1800 | 0.4824 | 0.4618 | 0.6329 | 0.5533 | 0.1915 | 0.2461 | 0.3248 | 0.5495 | 0.7415 |
| 0.3584 | 26.0 | 1872 | 0.4434 | 0.4207 | 0.6177 | 0.4468 | 0.1694 | 0.2277 | 0.3975 | 0.6442 | 0.8038 |
| 0.3443 | 27.0 | 1944 | 0.4602 | 0.4434 | 0.6241 | 0.4912 | 0.1822 | 0.2351 | 0.3431 | 0.5877 | 0.7893 |
| 0.3714 | 28.0 | 2016 | 0.4818 | 0.4594 | 0.6316 | 0.5521 | 0.1900 | 0.2455 | 0.3283 | 0.5567 | 0.7493 |
| 0.3688 | 29.0 | 2088 | 0.4443 | 0.4215 | 0.6242 | 0.4386 | 0.1702 | 0.2294 | 0.4024 | 0.6522 | 0.8065 |
| 0.3615 | 30.0 | 2160 | 0.4462 | 0.4291 | 0.6189 | 0.4500 | 0.1739 | 0.2277 | 0.3792 | 0.6208 | 0.7896 |
| 0.3655 | 31.0 | 2232 | 0.4808 | 0.4574 | 0.6305 | 0.5524 | 0.1893 | 0.2452 | 0.3322 | 0.5590 | 0.7460 |
| 0.3576 | 32.0 | 2304 | 0.4321 | 0.4102 | 0.6182 | 0.4079 | 0.1640 | 0.2241 | 0.4296 | 0.6713 | 0.8074 |
| 0.3947 | 33.0 | 2376 | 0.4468 | 0.4298 | 0.6232 | 0.4574 | 0.1744 | 0.2306 | 0.3873 | 0.6163 | 0.7873 |
| 0.3402 | 34.0 | 2448 | 0.4565 | 0.4352 | 0.6195 | 0.4913 | 0.1776 | 0.2337 | 0.3734 | 0.6039 | 0.7865 |
| 0.3412 | 35.0 | 2520 | 0.4438 | 0.4261 | 0.6180 | 0.4546 | 0.1728 | 0.2279 | 0.3778 | 0.6252 | 0.8043 |
| 0.3547 | 36.0 | 2592 | 0.4577 | 0.4416 | 0.6218 | 0.4868 | 0.1807 | 0.2329 | 0.3517 | 0.5862 | 0.7862 |
| 0.3425 | 37.0 | 2664 | 0.4682 | 0.4511 | 0.6285 | 0.5210 | 0.1860 | 0.2406 | 0.3411 | 0.5748 | 0.7694 |
| 0.3853 | 38.0 | 2736 | 0.4752 | 0.4514 | 0.6289 | 0.5458 | 0.1863 | 0.2438 | 0.3408 | 0.5721 | 0.7760 |
| 0.3643 | 39.0 | 2808 | 0.4737 | 0.4547 | 0.6291 | 0.5401 | 0.1875 | 0.2428 | 0.3316 | 0.5673 | 0.7617 |
| 0.398 | 40.0 | 2880 | 0.4662 | 0.4467 | 0.6274 | 0.5124 | 0.1838 | 0.2394 | 0.3514 | 0.5823 | 0.7700 |
| 0.3579 | 41.0 | 2952 | 0.4781 | 0.4545 | 0.6290 | 0.5513 | 0.1880 | 0.2446 | 0.3343 | 0.5624 | 0.7718 |
| 0.3545 | 42.0 | 3024 | 0.4460 | 0.4277 | 0.6221 | 0.4553 | 0.1730 | 0.2294 | 0.3862 | 0.6285 | 0.7999 |
| 0.3527 | 43.0 | 3096 | 0.4330 | 0.4153 | 0.6169 | 0.4221 | 0.1668 | 0.2240 | 0.4106 | 0.6618 | 0.8084 |
| 0.3251 | 44.0 | 3168 | 0.4503 | 0.4286 | 0.6172 | 0.4781 | 0.1744 | 0.2313 | 0.3725 | 0.6224 | 0.8095 |
| 0.3433 | 45.0 | 3240 | 0.4471 | 0.4346 | 0.6187 | 0.4652 | 0.1772 | 0.2293 | 0.3606 | 0.6043 | 0.7952 |
| 0.3607 | 46.0 | 3312 | 0.4474 | 0.4263 | 0.6166 | 0.4658 | 0.1728 | 0.2293 | 0.3835 | 0.6287 | 0.8039 |
| 0.3722 | 47.0 | 3384 | 0.4527 | 0.4337 | 0.6205 | 0.4857 | 0.1768 | 0.2329 | 0.3696 | 0.6084 | 0.7922 |
| 0.3322 | 48.0 | 3456 | 0.4629 | 0.4431 | 0.6236 | 0.5118 | 0.1818 | 0.2373 | 0.3460 | 0.5897 | 0.7954 |
| 0.3624 | 49.0 | 3528 | 0.4431 | 0.4304 | 0.6203 | 0.4511 | 0.1742 | 0.2277 | 0.3827 | 0.6152 | 0.7917 |
| 0.3386 | 50.0 | 3600 | 0.4475 | 0.4260 | 0.6173 | 0.4697 | 0.1727 | 0.2301 | 0.3870 | 0.6283 | 0.8102 |
| 0.3316 | 51.0 | 3672 | 0.4558 | 0.4328 | 0.6194 | 0.4982 | 0.1770 | 0.2345 | 0.3618 | 0.6120 | 0.8124 |
| 0.3259 | 52.0 | 3744 | 0.4316 | 0.4084 | 0.6165 | 0.4234 | 0.1630 | 0.2245 | 0.4311 | 0.6809 | 0.8148 |
| 0.3299 | 53.0 | 3816 | 0.4489 | 0.4222 | 0.6198 | 0.4779 | 0.1706 | 0.2327 | 0.4049 | 0.6441 | 0.8021 |
| 0.3334 | 54.0 | 3888 | 0.4831 | 0.4598 | 0.6319 | 0.5716 | 0.1902 | 0.2476 | 0.3281 | 0.5597 | 0.7549 |
| 0.3342 | 55.0 | 3960 | 0.4478 | 0.4288 | 0.6166 | 0.4786 | 0.1745 | 0.2310 | 0.3749 | 0.6218 | 0.8091 |
| 0.3276 | 56.0 | 4032 | 0.4524 | 0.4342 | 0.6192 | 0.4852 | 0.1773 | 0.2326 | 0.3596 | 0.6113 | 0.8007 |
| 0.326 | 57.0 | 4104 | 0.4411 | 0.4226 | 0.6162 | 0.4486 | 0.1704 | 0.2268 | 0.3947 | 0.6403 | 0.7959 |
| 0.3429 | 58.0 | 4176 | 0.4578 | 0.4418 | 0.6221 | 0.4961 | 0.1812 | 0.2349 | 0.3497 | 0.5956 | 0.7750 |
| 0.3347 | 59.0 | 4248 | 0.4586 | 0.4409 | 0.6220 | 0.4946 | 0.1808 | 0.2347 | 0.3439 | 0.6004 | 0.7869 |
| 0.3215 | 60.0 | 4320 | 0.4583 | 0.4382 | 0.6232 | 0.4974 | 0.1789 | 0.2357 | 0.3667 | 0.6008 | 0.7855 |
| 0.331 | 61.0 | 4392 | 0.4412 | 0.4206 | 0.6145 | 0.4579 | 0.1699 | 0.2276 | 0.3966 | 0.6413 | 0.8047 |
| 0.3124 | 62.0 | 4464 | 0.4455 | 0.4236 | 0.6181 | 0.4727 | 0.1715 | 0.2313 | 0.3902 | 0.6417 | 0.8098 |
| 0.322 | 63.0 | 4536 | 0.4406 | 0.4230 | 0.6143 | 0.4548 | 0.1716 | 0.2269 | 0.3775 | 0.6425 | 0.8115 |
| 0.3194 | 64.0 | 4608 | 0.4473 | 0.4331 | 0.6193 | 0.4657 | 0.1765 | 0.2297 | 0.3606 | 0.6122 | 0.8014 |
| 0.3159 | 65.0 | 4680 | 0.4407 | 0.4225 | 0.6186 | 0.4548 | 0.1712 | 0.2293 | 0.3913 | 0.6433 | 0.8075 |
| 0.3118 | 66.0 | 4752 | 0.4478 | 0.4258 | 0.6169 | 0.4801 | 0.1728 | 0.2315 | 0.3762 | 0.6391 | 0.8064 |
| 0.336 | 67.0 | 4824 | 0.4659 | 0.4463 | 0.6252 | 0.5210 | 0.1834 | 0.2394 | 0.3464 | 0.5820 | 0.7786 |
| 0.3233 | 68.0 | 4896 | 0.4370 | 0.4208 | 0.6168 | 0.4452 | 0.1696 | 0.2265 | 0.4019 | 0.6425 | 0.8059 |
| 0.3285 | 69.0 | 4968 | 0.4479 | 0.4340 | 0.6189 | 0.4773 | 0.1771 | 0.2312 | 0.3609 | 0.6136 | 0.7972 |
| 0.3186 | 70.0 | 5040 | 0.4469 | 0.4308 | 0.6198 | 0.4698 | 0.1751 | 0.2310 | 0.3741 | 0.6219 | 0.7966 |
| 0.3351 | 71.0 | 5112 | 0.4476 | 0.4292 | 0.6176 | 0.4769 | 0.1745 | 0.2311 | 0.3718 | 0.6220 | 0.8035 |
| 0.3286 | 72.0 | 5184 | 0.4415 | 0.4229 | 0.6155 | 0.4655 | 0.1713 | 0.2289 | 0.3816 | 0.6376 | 0.8117 |
| 0.3135 | 73.0 | 5256 | 0.4527 | 0.4335 | 0.6198 | 0.4918 | 0.1769 | 0.2338 | 0.3621 | 0.6152 | 0.8036 |
| 0.3244 | 74.0 | 5328 | 0.4449 | 0.4290 | 0.6171 | 0.4685 | 0.1746 | 0.2296 | 0.3667 | 0.6234 | 0.8073 |
| 0.3253 | 75.0 | 5400 | 0.4450 | 0.4303 | 0.6182 | 0.4680 | 0.1750 | 0.2296 | 0.3703 | 0.6185 | 0.8013 |
| 0.3072 | 76.0 | 5472 | 0.4312 | 0.4212 | 0.6161 | 0.4337 | 0.1700 | 0.2242 | 0.3840 | 0.6411 | 0.8104 |
| 0.3159 | 77.0 | 5544 | 0.4434 | 0.4314 | 0.6186 | 0.4636 | 0.1754 | 0.2290 | 0.3643 | 0.6171 | 0.7996 |
| 0.3176 | 78.0 | 5616 | 0.4319 | 0.4207 | 0.6177 | 0.4330 | 0.1695 | 0.2249 | 0.3889 | 0.6524 | 0.8080 |
| 0.3243 | 79.0 | 5688 | 0.4432 | 0.4304 | 0.6186 | 0.4698 | 0.1752 | 0.2302 | 0.3667 | 0.6218 | 0.8058 |
| 0.3183 | 80.0 | 5760 | 0.4438 | 0.4288 | 0.6175 | 0.4665 | 0.1742 | 0.2294 | 0.3730 | 0.6235 | 0.8030 |
| 0.323 | 81.0 | 5832 | 0.4365 | 0.4248 | 0.6170 | 0.4480 | 0.1716 | 0.2263 | 0.3820 | 0.6313 | 0.8056 |
| 0.3348 | 82.0 | 5904 | 0.4385 | 0.4280 | 0.6179 | 0.4532 | 0.1738 | 0.2273 | 0.3651 | 0.6249 | 0.8099 |
| 0.2948 | 83.0 | 5976 | 0.4456 | 0.4330 | 0.6190 | 0.4727 | 0.1763 | 0.2305 | 0.3622 | 0.6121 | 0.7981 |
| 0.3156 | 84.0 | 6048 | 0.4349 | 0.4236 | 0.6155 | 0.4442 | 0.1712 | 0.2252 | 0.3834 | 0.6331 | 0.8086 |
| 0.3227 | 85.0 | 6120 | 0.4352 | 0.4251 | 0.6160 | 0.4423 | 0.1719 | 0.2250 | 0.3799 | 0.6293 | 0.8055 |
| 0.3044 | 86.0 | 6192 | 0.4349 | 0.4235 | 0.6165 | 0.4444 | 0.1714 | 0.2259 | 0.3858 | 0.6312 | 0.8108 |
| 0.3067 | 87.0 | 6264 | 0.4293 | 0.4214 | 0.6150 | 0.4293 | 0.1700 | 0.2229 | 0.3862 | 0.6397 | 0.8102 |
| 0.3083 | 88.0 | 6336 | 0.4260 | 0.4164 | 0.6139 | 0.4229 | 0.1673 | 0.2221 | 0.3989 | 0.6536 | 0.8126 |
| 0.2989 | 89.0 | 6408 | 0.4381 | 0.4270 | 0.6168 | 0.4526 | 0.1731 | 0.2270 | 0.3766 | 0.6248 | 0.8051 |
| 0.3232 | 90.0 | 6480 | 0.4352 | 0.4230 | 0.6158 | 0.4480 | 0.1711 | 0.2263 | 0.3854 | 0.6358 | 0.8112 |
| 0.3201 | 91.0 | 6552 | 0.4361 | 0.4242 | 0.6164 | 0.4462 | 0.1718 | 0.2262 | 0.3842 | 0.6327 | 0.8078 |
| 0.3096 | 92.0 | 6624 | 0.4390 | 0.4273 | 0.6171 | 0.4563 | 0.1733 | 0.2279 | 0.3790 | 0.6237 | 0.8046 |
| 0.322 | 93.0 | 6696 | 0.4338 | 0.4229 | 0.6157 | 0.4447 | 0.1709 | 0.2258 | 0.3889 | 0.6351 | 0.8069 |
| 0.3096 | 94.0 | 6768 | 0.4348 | 0.4238 | 0.6160 | 0.4448 | 0.1714 | 0.2256 | 0.3839 | 0.6342 | 0.8077 |
| 0.3067 | 95.0 | 6840 | 0.4414 | 0.4298 | 0.6181 | 0.4628 | 0.1748 | 0.2290 | 0.3707 | 0.6205 | 0.8027 |
| 0.3198 | 96.0 | 6912 | 0.4334 | 0.4228 | 0.6162 | 0.4434 | 0.1709 | 0.2258 | 0.3872 | 0.6370 | 0.8077 |
| 0.295 | 97.0 | 6984 | 0.4367 | 0.4261 | 0.6169 | 0.4507 | 0.1728 | 0.2269 | 0.3791 | 0.6283 | 0.8045 |
| 0.305 | 98.0 | 7056 | 0.4373 | 0.4266 | 0.6171 | 0.4524 | 0.1730 | 0.2273 | 0.3781 | 0.6280 | 0.8046 |
| 0.3304 | 99.0 | 7128 | 0.4334 | 0.4230 | 0.6162 | 0.4432 | 0.1709 | 0.2257 | 0.3874 | 0.6378 | 0.8062 |
| 0.3099 | 100.0 | 7200 | 0.4360 | 0.4251 | 0.6169 | 0.4500 | 0.1721 | 0.2269 | 0.3828 | 0.6326 | 0.8051 |
e921bbfeb39847ebb734416182df0e6a
apache-2.0
['generated_from_trainer']
false
convnext-tiny-finetuned-dogfood

This model is a fine-tuned version of [facebook/convnext-tiny-224](https://huggingface.co/facebook/convnext-tiny-224) on the lewtun/dog_food dataset. It achieves the following results on the evaluation set:
- Loss: 0.9277
- Accuracy: 0.7253
1ace32080d355273b1ecfae151fa6ef8
apache-2.0
['generated_from_trainer']
false
Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 1
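The `total_train_batch_size` of 128 is the per-device batch size multiplied by the gradient-accumulation steps: gradients from 4 micro-batches of 32 are accumulated before each optimizer update. A toy pure-Python sketch of the accumulation pattern (illustrative only, not the Trainer code):

```python
def accumulate_and_step(micro_batch_grads, accum_steps):
    """Average gradients over accum_steps micro-batches, then 'step' once."""
    updates = []
    running = 0.0
    for i, g in enumerate(micro_batch_grads, start=1):
        running += g / accum_steps    # scale each micro-batch gradient
        if i % accum_steps == 0:
            updates.append(running)   # one optimizer step per accumulation window
            running = 0.0
    return updates

# 8 micro-batches with accumulation over 4 -> 2 optimizer steps
updates = accumulate_and_step([1.0] * 8, accum_steps=4)
```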
3b5859c9570aac5592202b637ad85a4f
apache-2.0
['generated_from_trainer']
false
Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.0681        | 1.0   | 16   | 0.9125          | 0.7422   |
ba602832df205bded785d99df4488bc9
apache-2.0
['whisper-event']
false
Whisper Kannada Small

This model is a fine-tuned version of [openai/whisper-small](https://huggingface.co/openai/whisper-small) on Kannada data drawn from multiple publicly available ASR corpora. It was fine-tuned as part of the Whisper fine-tuning sprint.
a39ccfadd8336c9fb0ef0bd42836ddf9
apache-2.0
['whisper-event']
false
Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1.7e-05
- train_batch_size: 48
- eval_batch_size: 32
- seed: 22
- optimizer: adamw_bnb_8bit
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 10000
- training_steps: 12033 (training was stopped at convergence; initially set to 51570 steps)
- mixed_precision_training: True
63c0af988a387e55dd5c372c6a36b0d9
mit
['generated_from_keras_callback']
false
DLL888/roberta-base-squad

This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on an unknown dataset. It achieves the following results on the evaluation set:
- Train Loss: 0.7054
- Train End Logits Accuracy: 0.8022
- Train Start Logits Accuracy: 0.7586
- Validation Loss: 0.8224
- Validation End Logits Accuracy: 0.7692
- Validation Start Logits Accuracy: 0.7402
- Epoch: 1
f293fd76bbbb000e5c6ce7b40292e4dc
mit
['generated_from_keras_callback']
false
Training hyperparameters

The following hyperparameters were used during training:
- optimizer: {'name': 'Adam', 'learning_rate': {'class_name': 'WarmUp', 'config': {'initial_learning_rate': 2e-05, 'decay_schedule_fn': {'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 2e-05, 'decay_steps': 10570, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}, '__passive_serialization__': True}, 'warmup_steps': 500, 'power': 1.0, 'name': None}}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}
- training_precision: mixed_float16
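The optimizer config above pairs 500 linear warmup steps with a power-1.0 polynomial (i.e., linear) decay from 2e-05 to 0.0 over 10570 steps. A pure-Python sketch of that schedule, assuming the decay is evaluated on the global step (the exact step offset inside the WarmUp wrapper is an assumption):

```python
def warmup_polynomial_lr(step, init_lr=2e-05, warmup_steps=500,
                         decay_steps=10570, end_lr=0.0, power=1.0):
    """Linear warmup, then polynomial decay (power=1.0 is linear decay)."""
    if step < warmup_steps:
        return init_lr * step / warmup_steps
    frac = min(step, decay_steps) / decay_steps
    return (init_lr - end_lr) * (1 - frac) ** power + end_lr

mid = warmup_polynomial_lr(250)    # halfway through warmup
end = warmup_polynomial_lr(10570)  # end of the decay schedule
```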
ca2c663af14132432b23983fd664c494
mit
['generated_from_keras_callback']
false
Training results

| Train Loss | Train End Logits Accuracy | Train Start Logits Accuracy | Validation Loss | Validation End Logits Accuracy | Validation Start Logits Accuracy | Epoch |
|:----------:|:-------------------------:|:---------------------------:|:---------------:|:------------------------------:|:--------------------------------:|:-----:|
| 1.1613     | 0.7038                    | 0.6632                      | 0.8676          | 0.7626                         | 0.7342                           | 0     |
| 0.7054     | 0.8022                    | 0.7586                      | 0.8224          | 0.7692                         | 0.7402                           | 1     |
954974e5d939dce644ff9db184565b62
apache-2.0
['question_generator', 'generated_from_trainer']
false
t5-base-squadqtngen

This model is a fine-tuned version of [ManujArora/t5-base-squadqtngen](https://huggingface.co/ManujArora/t5-base-squadqtngen) on the None dataset. It achieves the following results on the evaluation set:
- Loss: 1.7049
8e2b137b16370e836aca6dfe6593823a
apache-2.0
['question_generator', 'generated_from_trainer']
false
Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| No log        | 1.0   | 248  | 1.6398          |
| No log        | 2.0   | 496  | 1.6440          |
| No log        | 3.0   | 744  | 1.6594          |
| No log        | 4.0   | 992  | 1.6720          |
| No log        | 5.0   | 1240 | 1.6824          |
| No log        | 6.0   | 1488 | 1.6949          |
| No log        | 7.0   | 1736 | 1.7032          |
| No log        | 8.0   | 1984 | 1.7049          |
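The validation loss bottoms out at epoch 1 and rises monotonically for the remaining seven epochs, which is the classic signal for early stopping. A minimal patience-based check over the logged losses (a sketch only, not the Trainer's `EarlyStoppingCallback`):

```python
def best_epoch(val_losses, patience=2):
    """Return (best_epoch_index, stop_epoch_index) under a simple patience rule."""
    best, best_i, waited = float("inf"), 0, 0
    for i, loss in enumerate(val_losses):
        if loss < best:
            best, best_i, waited = loss, i, 0
        else:
            waited += 1
            if waited >= patience:
                return best_i, i  # stop: no improvement for `patience` epochs
    return best_i, len(val_losses) - 1

# per-epoch validation losses from the table above
losses = [1.6398, 1.6440, 1.6594, 1.6720, 1.6824, 1.6949, 1.7032, 1.7049]
best_i, stop_i = best_epoch(losses)
```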
1dc076c8d34fb123a62673bd12f4cef3
apache-2.0
[]
false
🇺🇦 Join the Ukrainian Speech Recognition Community - https://t.me/speech_recognition_uk

⭐ See other Ukrainian models - https://github.com/egorsmkv/speech-recognition-uk

This model has been trained on noisy data in order to make the acoustic model robust to noisy audio. Its vocabulary includes apostrophes and hyphens. The language model is trained on the texts of the Common Voice dataset, which is used during training.

Special thanks for the noisy data to **Dmytro Chaplynsky**, https://lang.org.ua

Noisy dataset:
- Transcriptions: https://www.dropbox.com/s/ohj3y2cq8f4207a/transcriptions.zip?dl=0
- Audio files: https://www.dropbox.com/s/v8crgclt9opbrv1/data.zip?dl=0

Metrics:

| Dataset | CER | WER |
|-|-|-|
| CV10 (no LM) | 0.0515 | 0.2617 |
| CV10 (with LM) | 0.0148 | 0.0524 |

Metrics on noisy data with the [standard model](https://huggingface.co/Yehor/wav2vec2-xls-r-300m-uk-with-small-lm):

| Dataset | CER | WER |
|-|-|-|
| CV10 (no LM) | 0.1064 | 0.3926 |
| CV10 (with LM) | 0.0497 | 0.1265 |

More:
- The same model, but trained on raw Common Voice data: https://huggingface.co/Yehor/wav2vec2-xls-r-300m-uk-with-small-lm
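WER in the tables above is the word-level edit distance between hypothesis and reference divided by the reference length (CER is the same computation at character level). A small self-contained implementation sketch:

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate via Levenshtein distance over word sequences."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edits to turn the first i reference words into the first j hypothesis words
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,         # deletion
                           dp[i][j - 1] + 1,         # insertion
                           dp[i - 1][j - 1] + cost)  # substitution
    return dp[-1][-1] / len(ref)

score = wer("the cat sat", "the cat sat down")  # one insertion over three words
```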
8be2d09d808064d880e2be8702e02b21
apache-2.0
['generated_from_trainer']
false
bert-finetuned-ner1

This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on the conll2003 dataset. It achieves the following results on the evaluation set:
- Loss: 0.0584
- Precision: 0.9286
- Recall: 0.9475
- F1: 0.9379
- Accuracy: 0.9859
1b6ee7e23dc4c0e6ba0714720649aa26
apache-2.0
['generated_from_trainer']
false
Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 0.2183        | 1.0   | 878  | 0.0753          | 0.9087    | 0.9291 | 0.9188 | 0.9800   |
| 0.0462        | 2.0   | 1756 | 0.0614          | 0.9329    | 0.9470 | 0.9399 | 0.9858   |
| 0.0244        | 3.0   | 2634 | 0.0584          | 0.9286    | 0.9475 | 0.9379 | 0.9859   |
b4a0de6b1b7f50eb840fcacaea3fe7c6
mit
[]
false
Bamse og kylling on Stable Diffusion This is the `<bamse-kylling>` concept taught to Stable Diffusion via Textual Inversion. You can load this concept into the [Stable Conceptualizer](https://colab.research.google.com/github/huggingface/notebooks/blob/main/diffusers/stable_conceptualizer_inference.ipynb) notebook. You can also train your own concepts and load them into the concept libraries using [this notebook](https://colab.research.google.com/github/huggingface/notebooks/blob/main/diffusers/sd_textual_inversion_training.ipynb). Here is the new concept you will be able to use as an `object`: ![<bamse-kylling> 0](https://huggingface.co/sd-concepts-library/bamse-og-kylling/resolve/main/concept_images/2.jpeg) ![<bamse-kylling> 1](https://huggingface.co/sd-concepts-library/bamse-og-kylling/resolve/main/concept_images/1.jpeg) ![<bamse-kylling> 2](https://huggingface.co/sd-concepts-library/bamse-og-kylling/resolve/main/concept_images/0.jpeg) ![<bamse-kylling> 3](https://huggingface.co/sd-concepts-library/bamse-og-kylling/resolve/main/concept_images/3.jpeg) ![<bamse-kylling> 4](https://huggingface.co/sd-concepts-library/bamse-og-kylling/resolve/main/concept_images/4.jpeg)
3e567e393b24201e0af42f443d104a4a
apache-2.0
[]
false
Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 16
- gradient_accumulation_steps: 1
- optimizer: AdamW with betas=(0.95, 0.999), weight_decay=1e-06 and epsilon=1e-08
- lr_scheduler: cosine
- lr_warmup_steps: 500
- ema_inv_gamma: 1.0
- ema_power: 0.75
- ema_max_decay: 0.9999
- mixed_precision: no
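The three EMA hyperparameters control how quickly the exponential-moving-average weights track the live weights: the decay warms up with the training step and is capped at `max_decay`. A sketch of that warmup curve using one common parameterization (as in `diffusers`' EMA helper; treat the exact formula as an assumption):

```python
def ema_decay(step, inv_gamma=1.0, power=0.75, max_decay=0.9999):
    """EMA decay warmup: small early (fast tracking), approaching max_decay late."""
    value = 1 - (1 + step / inv_gamma) ** (-power)
    return min(max_decay, max(0.0, value))

early = ema_decay(1)          # low decay: EMA follows the weights closely
late = ema_decay(1_000_000)   # capped at max_decay
```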
76f4d4ad311ef4fe3ea5a9df8f6690fb
apache-2.0
['generated_from_trainer']
false
distilbert-base-uncased_fold_7_ternary_v1

This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the None dataset. It achieves the following results on the evaluation set:
- Loss: 2.0462
- F1: 0.7836
b183fef04ee40b3942f0d3f908471f61
apache-2.0
['generated_from_trainer']
false
Training results

| Training Loss | Epoch | Step | Validation Loss | F1     |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| No log        | 1.0   | 291  | 0.5719          | 0.7490 |
| 0.5541        | 2.0   | 582  | 0.5563          | 0.7836 |
| 0.5541        | 3.0   | 873  | 0.7301          | 0.7849 |
| 0.2509        | 4.0   | 1164 | 0.8073          | 0.7926 |
| 0.2509        | 5.0   | 1455 | 1.0842          | 0.7823 |
| 0.1182        | 6.0   | 1746 | 1.1721          | 0.7900 |
| 0.0537        | 7.0   | 2037 | 1.4060          | 0.7785 |
| 0.0537        | 8.0   | 2328 | 1.4497          | 0.7836 |
| 0.0262        | 9.0   | 2619 | 1.4722          | 0.7708 |
| 0.0262        | 10.0  | 2910 | 1.6529          | 0.7772 |
| 0.0131        | 11.0  | 3201 | 1.6573          | 0.7862 |
| 0.0131        | 12.0  | 3492 | 1.6986          | 0.7823 |
| 0.0115        | 13.0  | 3783 | 1.7765          | 0.7810 |
| 0.0098        | 14.0  | 4074 | 1.8036          | 0.7862 |
| 0.0098        | 15.0  | 4365 | 1.7684          | 0.7926 |
| 0.0028        | 16.0  | 4656 | 1.8385          | 0.7836 |
| 0.0028        | 17.0  | 4947 | 1.7903          | 0.7887 |
| 0.0054        | 18.0  | 5238 | 1.9065          | 0.7810 |
| 0.0007        | 19.0  | 5529 | 1.9331          | 0.7875 |
| 0.0007        | 20.0  | 5820 | 1.9384          | 0.7849 |
| 0.0006        | 21.0  | 6111 | 1.8687          | 0.7887 |
| 0.0006        | 22.0  | 6402 | 2.0603          | 0.7785 |
| 0.0009        | 23.0  | 6693 | 2.0403          | 0.7836 |
| 0.0009        | 24.0  | 6984 | 2.0348          | 0.7810 |
| 0.0005        | 25.0  | 7275 | 2.0462          | 0.7836 |
19527ac72cb489485c27e460f1b0be0d
mit
['text-classification', 'generated_from_trainer']
false
deberta-v3-xsmall-with-biblio-context-finetuned-review_classifier

This model is a fine-tuned version of [microsoft/deberta-v3-xsmall](https://huggingface.co/microsoft/deberta-v3-xsmall) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.0979
- Accuracy: 0.9682
- F1: 0.8332
- Recall: 0.8466
- Precision: 0.8202
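The F1 reported above is the harmonic mean of precision and recall; e.g. 0.8332 follows from the listed precision 0.8202 and recall 0.8466. A one-line check:

```python
def f1(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

final_f1 = f1(0.8202, 0.8466)
```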
93c77d96fa68c6c3d986c60965c708ca
mit
['text-classification', 'generated_from_trainer']
false
Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 4.5e-05
- train_batch_size: 12
- eval_batch_size: 12
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 2
- mixed_precision_training: Native AMP
cb5c4cb7be59353674eaae820865c364
mit
['text-classification', 'generated_from_trainer']
false
Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy | F1     | Recall | Precision |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:------:|:------:|:---------:|
| 0.1539        | 1.0   | 6667  | 0.1237          | 0.9584   | 0.7668 | 0.7307 | 0.8067    |
| 0.1271        | 2.0   | 13334 | 0.0979          | 0.9682   | 0.8332 | 0.8466 | 0.8202    |
b1663ba3a852e61ddbdeeb5a1e997cca
['mit']
[]
false
gunghio/xlm-roberta-base-finetuned-panx-ner

This model was trained starting from xlm-roberta-base on a subset of the xtreme dataset. The `xtreme` subsets used are PAN-X.{lang}. The languages used for training/validation are Italian, English, German, French and Spanish. Only 75% of the whole dataset was used.
37786783ecaf13ef115f1374f82e594e
['mit']
[]
false
Training results

It achieves the following results on the evaluation set:
- Precision: 0.8744154472771157
- Recall: 0.8791424269015351
- F1: 0.8767725659462058
- Accuracy: 0.9432040948504613

Details:

| Label   | Precision | Recall | F1-Score | Support |
|---------|-----------|--------|----------|---------|
| PER     | 0.922     | 0.908  | 0.915    | 26639   |
| LOC     | 0.880     | 0.906  | 0.892    | 37623   |
| ORG     | 0.821     | 0.816  | 0.818    | 28045   |
| Overall | 0.874     | 0.879  | 0.877    | 92307   |
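The overall row can be reproduced (to rounding) as the support-weighted average of the per-label scores. A quick check, assuming support-weighted averaging of the rounded table values:

```python
def weighted_avg(scores, supports):
    """Support-weighted average of per-label metric values."""
    total = sum(supports)
    return sum(s * n for s, n in zip(scores, supports)) / total

supports = [26639, 37623, 28045]  # PER, LOC, ORG
overall_precision = weighted_avg([0.922, 0.880, 0.821], supports)
overall_f1 = weighted_avg([0.915, 0.892, 0.818], supports)
```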
b74057e75e51afe86a85232e2b5ac81e
['mit']
[]
false
This model can be used with a Transformers pipeline for NER (`transformers.TokenClassificationPipeline`).

```python
from transformers import AutoTokenizer, AutoModelForTokenClassification
from transformers import pipeline

tokenizer = AutoTokenizer.from_pretrained("gunghio/xlm-roberta-base-finetuned-panx-ner")
model = AutoModelForTokenClassification.from_pretrained("gunghio/xlm-roberta-base-finetuned-panx-ner")

nlp = pipeline("ner", model=model, tokenizer=tokenizer, aggregation_strategy="first")
example = "My name is Wolfgang and I live in Berlin"

ner_results = nlp(example)
print(ner_results)
```
8b4b4397b1e0d363c921d7ef3966022f
apache-2.0
['pytorch', 'diffusers']
false
Abstract An ideal music synthesizer should be both interactive and expressive, generating high-fidelity audio in realtime for arbitrary combinations of instruments and notes. Recent neural synthesizers have exhibited a tradeoff between domain-specific models that offer detailed control of only specific instruments, or raw waveform models that can train on any music but with minimal control and slow generation. In this work, we focus on a middle ground of neural synthesizers that can generate audio from MIDI sequences with arbitrary combinations of instruments in realtime. This enables training on a wide range of transcription datasets with a single model, which in turn offers note-level control of composition and instrumentation across a wide range of instruments. We use a simple two-stage process: MIDI to spectrograms with an encoder-decoder Transformer, then spectrograms to audio with a generative adversarial network (GAN) spectrogram inverter. We compare training the decoder as an autoregressive model and as a Denoising Diffusion Probabilistic Model (DDPM) and find that the DDPM approach is superior both qualitatively and as measured by audio reconstruction and Fréchet distance metrics. Given the interactivity and generality of this approach, we find this to be a promising first step towards interactive and expressive neural synthesis for arbitrary combinations of instruments and notes. <img src="https://storage.googleapis.com/music-synthesis-with-spectrogram-diffusion/architecture.png" alt="Architecture diagram">
8386957d5394f7a130eb8e19e53083a5
apache-2.0
[]
false
Model Summary > We present BLOOMZ & mT0, a family of models capable of following human instructions in dozens of languages zero-shot. We finetune BLOOM & mT5 pretrained multilingual language models on our crosslingual task mixture (xP3) and find our resulting models capable of crosslingual generalization to unseen tasks & languages. - **Repository:** [bigscience-workshop/xmtf](https://github.com/bigscience-workshop/xmtf) - **Paper:** [Crosslingual Generalization through Multitask Finetuning](https://arxiv.org/abs/2211.01786) - **Point of Contact:** [Niklas Muennighoff](mailto:niklas@hf.co) - **Languages:** Refer to [mc4](https://huggingface.co/datasets/mc4) for pretraining & [xP3](https://huggingface.co/bigscience/xP3) for finetuning language proportions. It understands both pretraining & finetuning languages. - **BLOOMZ & mT0 Model Family:** <div class="max-w-full overflow-auto"> <table> <tr> <th colspan="12">Multitask finetuned on <a style="font-weight:bold" href=https://huggingface.co/datasets/bigscience/xP3>xP3</a>. Recommended for prompting in English. 
</th> </tr>
<tr> <td>Parameters</td> <td>300M</td> <td>580M</td> <td>1.2B</td> <td>3.7B</td> <td>13B</td> <td>560M</td> <td>1.1B</td> <td>1.7B</td> <td>3B</td> <td>7.1B</td> <td>176B</td> </tr>
<tr> <td>Finetuned Model</td> <td><a href=https://huggingface.co/bigscience/mt0-small>mt0-small</a></td> <td><a href=https://huggingface.co/bigscience/mt0-base>mt0-base</a></td> <td><a href=https://huggingface.co/bigscience/mt0-large>mt0-large</a></td> <td><a href=https://huggingface.co/bigscience/mt0-xl>mt0-xl</a></td> <td><a href=https://huggingface.co/bigscience/mt0-xxl>mt0-xxl</a></td> <td><a href=https://huggingface.co/bigscience/bloomz-560m>bloomz-560m</a></td> <td><a href=https://huggingface.co/bigscience/bloomz-1b1>bloomz-1b1</a></td> <td><a href=https://huggingface.co/bigscience/bloomz-1b7>bloomz-1b7</a></td> <td><a href=https://huggingface.co/bigscience/bloomz-3b>bloomz-3b</a></td> <td><a href=https://huggingface.co/bigscience/bloomz-7b1>bloomz-7b1</a></td> <td><a href=https://huggingface.co/bigscience/bloomz>bloomz</a></td> </tr>
<tr> <th colspan="12">Multitask finetuned on <a style="font-weight:bold" href=https://huggingface.co/datasets/bigscience/xP3mt>xP3mt</a>. Recommended for prompting in non-English.</th> </tr>
<tr> <td>Finetuned Model</td> <td></td> <td></td> <td></td> <td></td> <td><a href=https://huggingface.co/bigscience/mt0-xxl-mt>mt0-xxl-mt</a></td> <td></td> <td></td> <td></td> <td></td> <td><a href=https://huggingface.co/bigscience/bloomz-7b1-mt>bloomz-7b1-mt</a></td> <td><a href=https://huggingface.co/bigscience/bloomz-mt>bloomz-mt</a></td> </tr>
<tr> <th colspan="12">Multitask finetuned on <a style="font-weight:bold" href=https://huggingface.co/datasets/Muennighoff/P3>P3</a>. Released for research purposes only. 
Strictly inferior to above models!</th> </tr> <tr> <td>Finetuned Model</td> <td></td> <td></td> <td></td> <td></td> <td><a href=https://huggingface.co/bigscience/mt0-xxl-p3>mt0-xxl-p3</a></td> <td></td> <td></td> <td></td> <td></td> <td><a href=https://huggingface.co/bigscience/bloomz-7b1-p3>bloomz-7b1-p3</a></td> <td><a href=https://huggingface.co/bigscience/bloomz-p3>bloomz-p3</a></td> </tr> <th colspan="12">Original pretrained checkpoints. Not recommended.</th> <tr> <td>Pretrained Model</td> <td><a href=https://huggingface.co/google/mt5-small>mt5-small</a></td> <td><a href=https://huggingface.co/google/mt5-base>mt5-base</a></td> <td><a href=https://huggingface.co/google/mt5-large>mt5-large</a></td> <td><a href=https://huggingface.co/google/mt5-xl>mt5-xl</a></td> <td><a href=https://huggingface.co/google/mt5-xxl>mt5-xxl</a></td> <td><a href=https://huggingface.co/bigscience/bloom-560m>bloom-560m</a></td> <td><a href=https://huggingface.co/bigscience/bloom-1b1>bloom-1b1</a></td> <td><a href=https://huggingface.co/bigscience/bloom-1b7>bloom-1b7</a></td> <td><a href=https://huggingface.co/bigscience/bloom-3b>bloom-3b</a></td> <td><a href=https://huggingface.co/bigscience/bloom-7b1>bloom-7b1</a></td> <td><a href=https://huggingface.co/bigscience/bloom>bloom</a></td> </tr> </table> </div>
83535acfca033749c998e79cce3ba9c7
apache-2.0
[]
false
```python
# pip install -q transformers
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

checkpoint = "bigscience/mt0-xxl-p3"

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

inputs = tokenizer.encode("Translate to English: Je t’aime.", return_tensors="pt")
outputs = model.generate(inputs)
print(tokenizer.decode(outputs[0]))
```
</details>
af83ea72b63dfd4073ecd097dbddc406
apache-2.0
[]
false
```python
# pip install -q transformers accelerate
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

checkpoint = "bigscience/mt0-xxl-p3"

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint, torch_dtype="auto", device_map="auto")

inputs = tokenizer.encode("Translate to English: Je t’aime.", return_tensors="pt").to("cuda")
outputs = model.generate(inputs)
print(tokenizer.decode(outputs[0]))
```
</details>
467f020730c6b0904a362ed3e735e31f
apache-2.0
[]
false
```python
# pip install -q transformers accelerate bitsandbytes
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

checkpoint = "bigscience/mt0-xxl-p3"

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint, device_map="auto", load_in_8bit=True)

inputs = tokenizer.encode("Translate to English: Je t’aime.", return_tensors="pt").to("cuda")
outputs = model.generate(inputs)
print(tokenizer.decode(outputs[0]))
```
</details>
<!-- Necessary for whitespace -->
8685ca454d0637444486f11315028333
apache-2.0
[]
false
Model

- **Architecture:** Same as [mt5-xxl](https://huggingface.co/google/mt5-xxl), also refer to the `config.json` file
- **Finetuning steps:** 7000
- **Finetuning tokens:** 1.29 billion
- **Precision:** bfloat16
7b0b729046da8b1b437e787e96958652
mit
['fill-mask', 'generated_from_trainer']
false
deberta-v3-large-dapt-scientific-papers-pubmed-tapt

This model is a fine-tuned version of [domenicrosati/deberta-v3-large-dapt-scientific-papers-pubmed](https://huggingface.co/domenicrosati/deberta-v3-large-dapt-scientific-papers-pubmed) on the None dataset. It achieves the following results on the evaluation set:

- Loss: 2.4429
- Accuracy: 0.5915
bedb6a4cd759516523aca192d6f1e1d5
mit
['fill-mask', 'generated_from_trainer']
false
Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
- mixed_precision_training: Native AMP
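The Adam settings above (betas=(0.9, 0.999), epsilon=1e-08) can be illustrated with a minimal single-parameter update step. This is a sketch only: the gradient and starting weight below are made-up illustration values, not taken from this run.

```python
# One Adam update step with the hyperparameters listed above.
# The gradient and starting weight are illustrative values.
lr, beta1, beta2, eps = 5e-05, 0.9, 0.999, 1e-08

w, m, v = 0.5, 0.0, 0.0   # parameter and first/second moment estimates
grad, t = 0.2, 1          # gradient at step t

m = beta1 * m + (1 - beta1) * grad          # update first moment
v = beta2 * v + (1 - beta2) * grad ** 2     # update second moment
m_hat = m / (1 - beta1 ** t)                # bias correction
v_hat = v / (1 - beta2 ** t)
w -= lr * m_hat / (v_hat ** 0.5 + eps)      # parameter step

print(w)
```

Note how at step 1 the bias correction makes the effective step roughly `lr * sign(grad)`, which is why a small learning rate like 5e-05 is typical for fine-tuning.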
bca91d46b19b772b788de99531e202dc
mit
['fill-mask', 'generated_from_trainer']
false
Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 3.3855        | 1.0   | 4134  | 3.2334          | 0.4953   |
| 2.9224        | 2.0   | 8268  | 2.8317          | 0.5430   |
| 2.703         | 3.0   | 12402 | 2.6141          | 0.5665   |
| 2.4963        | 4.0   | 16536 | 2.4918          | 0.5855   |
| 2.399         | 5.0   | 20670 | 2.4429          | 0.5915   |
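Since this is a masked-language-modelling run, the final validation loss of 2.4429 can be read as a (pseudo-)perplexity, assuming the reported loss is the mean cross-entropy in nats, as with the default 🤗 Trainer:

```python
import math

val_loss = 2.4429                 # final validation loss from the table above
perplexity = math.exp(val_loss)   # exp(cross-entropy) = perplexity
print(round(perplexity, 2))
```

A perplexity around 11.5 means the model is, on average, about as uncertain as a uniform choice over ~11.5 candidate tokens per mask.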
f046abc610b51046ba6a699b303574e4
apache-2.0
['automatic-speech-recognition', 'sv-SE']
false
exp_w2v2t_sv-se_vp-fr_s237

Fine-tuned [facebook/wav2vec2-large-fr-voxpopuli](https://huggingface.co/facebook/wav2vec2-large-fr-voxpopuli) for speech recognition using the train split of [Common Voice 7.0 (sv-SE)](https://huggingface.co/datasets/mozilla-foundation/common_voice_7_0). When using this model, make sure that your speech input is sampled at 16kHz. This model has been fine-tuned by the [HuggingSound](https://github.com/jonatasgrosman/huggingsound) tool.
5b5b4d97cc5bdeeff0b7d7c89b1bf805
mit
['generated_from_trainer']
false
xlm-roberta-base-finetuned-panx-en

This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) on the xtreme dataset. It achieves the following results on the evaluation set:

- Loss: 0.3926
- F1: 0.6991
bd1141459ad1d17b4984db976e70b52d
mit
['generated_from_trainer']
false
Training results

| Training Loss | Epoch | Step | Validation Loss | F1     |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 1.1415        | 1.0   | 50   | 0.5404          | 0.5163 |
| 0.5045        | 2.0   | 100  | 0.4347          | 0.6498 |
| 0.371         | 3.0   | 150  | 0.3926          | 0.6991 |
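The F1 reported above is presumably the entity-level F1 usual for NER fine-tuning (seqeval-style). As a sketch, it reduces to the harmonic mean of precision and recall over predicted vs. gold entity spans; the counts below are made up for illustration, not taken from this run:

```python
# Entity-level F1 from true-positive / false-positive / false-negative span counts.
# The counts are illustrative only.
tp, fp, fn = 70, 15, 20

precision = tp / (tp + fp)                            # of predicted spans, how many correct
recall = tp / (tp + fn)                               # of gold spans, how many found
f1 = 2 * precision * recall / (precision + recall)    # harmonic mean
print(round(f1, 4))
```

Equivalently, `f1 == 2*tp / (2*tp + fp + fn)`, which is why a model can have a respectable F1 while still missing many rare entity types.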
4e1f5876a062c7c6727b311209edf056
other
[]
false
JurisBert

JurisBert is an initiative of the **Supreme Court of Justice of the Nation (SCJN) of Mexico**. It was launched in August 2020, at the proposal of the **General Unit for Legal Knowledge Administration (UGACJ)**, to train a language model contextualized to the legal domain. Its main objective is to build **Natural Language Processing (NLP)** applications that support the High Court's jurisdictional work by leveraging the SCJN's knowledge captured in the unstructured documents produced by its jurisdictional areas.

In 2021 the initiative gained further relevance with the arrival of the Judicial Reform and the start of the eleventh epoch of the SJF, since JurisBert's main goals are to help identify precedent and to build information-retrieval platforms.

As part of the Digital Transformation driven by the SCJN, in order to foster an "Open Government" scheme through collaboration and innovation, and in the context of the remote operation forced by the health contingency caused by the SARS-CoV-2 virus, this technological innovation is made available to the whole community, returning to the citizenry the knowledge generated by the High Court.
In its first version, JurisBert is a Transformer-based language model built on top of SpanBERTa.

```python
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("scjnugacj/jurisbert")
model = AutoModel.from_pretrained("scjnugacj/jurisbert")
```

```python
from transformers import pipeline

fill_mask = pipeline(
    "fill-mask",
    model="scjnugacj/jurisbert",
    tokenizer="scjnugacj/jurisbert"
)

fill_mask("interés superior del <mask>.")

[
  {"score": 0.941512405872345, "token": 3152, "token_str": " menor", "sequence": "interés superior del menor"},
  {"score": 0.046888645738363266, "token": 3337, "token_str": " niño", "sequence": "interés superior del niño"},
  {"score": 0.004166217986494303, "token": 9386, "token_str": " adolescente", "sequence": "interés superior del adolescente"},
  {"score": 0.0008063237182796001, "token": 4914, "token_str": " menores", "sequence": "interés superior del menores"},
  {"score": 0.0006806919700466096, "token": 48133, "token_str": " infante", "sequence": "interés superior del infante"}
]
```
96f52bbb6447d442a6c6255ea0cede04
other
[]
false
Terms of use

By downloading this model you agree to be bound by the terms set out in this legal notice. The model owner reserves the right to amend, modify or replace these terms of use at any time and without prior notice. When a person or entity deploys or provides systems, services and/or any technology to third parties using this model and/or any model derived from it, they must bear in mind that it is their responsibility to mitigate the risks arising from its use and to comply with the applicable regulations at all times. Under no circumstances shall the owner of the models (SCJN, Suprema Corte de Justicia de la Nación) or the UGACJ (Unidad General de Administración del Conocimiento Jurídico) be liable for the results arising from the use made of these models.
9ab1c6734c3bbda5acef1b18d6db0036
other
[]
false
Intended use

This model was created so that any person or institution can build tools for querying legal information of the Mexican State based on language models.
6c5c9a1405c7ad942ace9a983ccd46d2
apache-2.0
[]
false
Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 0.0001
- train_batch_size: 256
- eval_batch_size: 128
- gradient_accumulation_steps: 1
- optimizer: AdamW with betas=(0.95, 0.999), weight_decay=1e-06 and epsilon=1e-08
- lr_scheduler: None
- lr_warmup_steps: 500
- ema_inv_gamma: 1.0
- ema_power: 0.75
- ema_max_decay: 0.9999
- mixed_precision: fp16
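The EMA settings above parameterize the usual decay-warmup schedule used in diffusers-style training, where the decay ramps from 0 toward its maximum as training progresses. A sketch, assuming the three values correspond to `inv_gamma=1.0`, `power=0.75`, and `max_decay=0.9999`:

```python
def ema_decay(step, inv_gamma=1.0, power=0.75, max_decay=0.9999):
    """EMA warmup: decay ramps from 0 toward max_decay over training.

    Assumed form: decay = 1 - (1 + step / inv_gamma) ** -power, clipped to max_decay.
    """
    value = 1 - (1 + step / inv_gamma) ** -power
    return max(0.0, min(value, max_decay))

for step in (0, 10, 1000, 100_000):
    print(step, round(ema_decay(step), 6))
```

Early in training the EMA weights track the raw weights closely (low decay), then settle into a near-constant 0.9999 average, which smooths out late-training noise in the sampled checkpoints.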
900e396af7cc7f827ecd55c2f7c1dc64
mit
['bert', 'pytorch']
false
Introduction

BERTimbau Base is a pretrained BERT model for Brazilian Portuguese that achieves state-of-the-art performances on three downstream NLP tasks: Named Entity Recognition, Sentence Textual Similarity and Recognizing Textual Entailment. It is available in two sizes: Base and Large. For further information or requests, please go to [BERTimbau repository](https://github.com/neuralmind-ai/portuguese-bert/).
9315423bb8cb9f71ce3671cae15019ae
mit
['bert', 'pytorch']
false
Params

| Model                                    | Arch.      | #Layers | #Params |
| ---------------------------------------- | ---------- | ------- | ------- |
| `neuralmind/bert-base-portuguese-cased`  | BERT-Base  | 12      | 110M    |
| `neuralmind/bert-large-portuguese-cased` | BERT-Large | 24      | 335M    |
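The ~110M/335M figures above can be sanity-checked with a rough back-of-the-envelope count for the standard BERT architecture. This sketch assumes a vocabulary of about 30k tokens and ignores biases, LayerNorm parameters and the pooler:

```python
def approx_bert_params(vocab=30_000, hidden=768, layers=12, max_pos=512):
    """Rough BERT parameter count: embeddings + transformer layers only."""
    embeddings = (vocab + max_pos + 2) * hidden    # token + position + segment tables
    per_layer = 4 * hidden * hidden                # Q, K, V and attention-output projections
    per_layer += 2 * hidden * (4 * hidden)         # FFN up- and down-projection (4x width)
    return embeddings + layers * per_layer

print(f"BERT-Base  ~ {approx_bert_params() / 1e6:.0f}M")
print(f"BERT-Large ~ {approx_bert_params(hidden=1024, layers=24) / 1e6:.0f}M")
```

The estimate lands around 108M and 333M, close to the table's 110M/335M; the small gap is the biases, LayerNorms and pooler the sketch leaves out.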
63c5e961edcdab2feb3aa2a6e2e0cf73
mit
['bert', 'pytorch']
false
```python
from transformers import AutoTokenizer, AutoModelForPreTraining

# or BertModel, for BERT without pretraining heads
model = AutoModelForPreTraining.from_pretrained('neuralmind/bert-base-portuguese-cased')
tokenizer = AutoTokenizer.from_pretrained('neuralmind/bert-base-portuguese-cased', do_lower_case=False)
```
cf9888aa006503088f19bc101091b591
mit
['bert', 'pytorch']
false
For BERT embeddings

```python
import torch

model = AutoModel.from_pretrained('neuralmind/bert-base-portuguese-cased')

input_ids = tokenizer.encode('Tinha uma pedra no meio do caminho.', return_tensors='pt')

with torch.no_grad():
    outs = model(input_ids)
    encoded = outs[0][0, 1:-1]  # ignore [CLS] and [SEP] special tokens
```
2a374240c8a948b905a8bbb19b5fbdf9
mit
['bert', 'pytorch']
false
Citation

If you use our work, please cite:

```bibtex
@inproceedings{souza2020bertimbau,
  author    = {F{\'a}bio Souza and Rodrigo Nogueira and Roberto Lotufo},
  title     = {{BERT}imbau: pretrained {BERT} models for {B}razilian {P}ortuguese},
  booktitle = {9th Brazilian Conference on Intelligent Systems, {BRACIS}, Rio Grande do Sul, Brazil, October 20-23 (to appear)},
  year      = {2020}
}
```
15d186910839bbe955fe5d843b110b6c
mit
['stable-diffusion', 'text-to-image']
false
Its Calling (Mob Umamusume) on Waifu Diffusion v1.3.5

This is the `<wd135-itscalling-mob-umamusume>` concept taught to [Waifu Diffusion v1.3.5](https://huggingface.co/hakurei/waifu-diffusion-v1-4/blob/main/models/wd-1-3-5_80000-fp32.ckpt) via Textual Inversion.
387dda4eaad190402dee725499cdb1c7
mit
['stable-diffusion', 'text-to-image']
false
Credits

The training images were selectively taken from [Pixiv](https://www.pixiv.net), [Twitter](https://twitter.com), and in-game screenshots of Uma Musume Pretty Derby. A CSV file describing the original sources for most images is available in the [raw dataset archive file](./datasets/raw.7z).
610cb5f405a088712542e0a9318bd7ae
mit
['stable-diffusion', 'text-to-image']
false
Input

Here is the new concept you will be able to use as an `object`:

![<wd135-itscalling-mob-umamusume> input 0](./concept_images/91370005_p0_transparent_512x512.png)
![<wd135-itscalling-mob-umamusume> input 1](./concept_images/FgzUbx1aEAEFDdO_512x512.png)
![<wd135-itscalling-mob-umamusume> input 2](./concept_images/FH5MdF7acAA42RG_512x512.png)
![<wd135-itscalling-mob-umamusume> input 3](./concept_images/Fklj8U4aYAIOAGP_512x512.png)
![<wd135-itscalling-mob-umamusume> input 4](./concept_images/FRXH5ibUcAE4KLZ_512x512.png)
d9d3dfd99d47f89d6f941dc78c894fc7
mit
['stable-diffusion', 'text-to-image']
false
Output Examples

Some images that can be possibly generated by using the new concept:

!["<wd135-itscalling-mob-umamusume>, [bad anatomy, bad hands, bad perspective, bad proportions, blurry, censored, cropped, error, extra arms, extra ears, fewer digits, jpeg artifacts, lowres, multiple legs, out of frame, poorly drawn]" -s 64 -S 3505534900 -W 512 -H 768 -C 10 -A k_dpmpp_2](./examples/000013.63c4d22c.3505534900.png)

```json
{
  "model": "stable diffusion",
  "model_weights": "waifu-diffusion-1.3.5",
  "model_hash": "b438efac4434af4e482d20cdfcea64067f8dfec438628261d2f2aa60ffc41452",
  "app_id": "invoke-ai/InvokeAI",
  "app_version": "2.2.4",
  "image": {
    "prompt": [
      {
        "prompt": "<wd135-itscalling-mob-umamusume>, [bad anatomy, bad hands, bad perspective, bad proportions, blurry, censored, cropped, error, extra arms, extra ears, fewer digits, jpeg artifacts, lowres, multiple legs, out of frame, poorly drawn]",
        "weight": 1
      }
    ],
    "steps": 64,
    "cfg_scale": 10,
    "threshold": 0,
    "perlin": 0,
    "height": 768,
    "width": 512,
    "seed": 3505534900,
    "seamless": false,
    "hires_fix": false,
    "type": "txt2img",
    "postprocessing": null,
    "sampler": "k_dpmpp_2",
    "variations": []
  }
}
```

!["<wd135-itscalling-mob-umamusume> horse ears horse tail horse girl, running outdoors park, white t-shirts black shorts, morning sunlight, pov from side looking at viewer cowboy shot, [bad anatomy, bad hands, bad perspective, bad proportions, blurry, censored, cropped, error, extra arms, extra ears, fewer digits, jpeg artifacts, lowres, multiple legs, out of frame, poorly drawn]" -s 64 -S 821696414 -W 512 -H 768 -C 10 -A k_dpmpp_2](./examples/000019.37833118.821696414.png)

```json
{
  "model": "stable diffusion",
  "model_weights": "waifu-diffusion-1.3.5",
  "model_hash": "b438efac4434af4e482d20cdfcea64067f8dfec438628261d2f2aa60ffc41452",
  "app_id": "invoke-ai/InvokeAI",
  "app_version": "2.2.4",
  "image": {
    "prompt": [
      {
        "prompt": "<wd135-itscalling-mob-umamusume> horse ears horse tail horse girl, running outdoors park, white t-shirts black shorts, morning sunlight, pov from side looking at viewer cowboy shot, [bad anatomy, bad hands, bad perspective, bad proportions, blurry, censored, cropped, error, extra arms, extra ears, fewer digits, jpeg artifacts, lowres, multiple legs, out of frame, poorly drawn]",
        "weight": 1
      }
    ],
    "steps": 64,
    "cfg_scale": 10,
    "threshold": 0,
    "perlin": 0,
    "height": 768,
    "width": 512,
    "seed": 821696414,
    "seamless": false,
    "hires_fix": false,
    "type": "txt2img",
    "postprocessing": null,
    "sampler": "k_dpmpp_2",
    "variations": []
  }
}
```

!["<wd135-itscalling-mob-umamusume> horse ears horse tail horse girl, running outdoors park, white t-shirts black shorts, morning sunlight, pov from side looking at viewer cowboy shot, [bad anatomy, bad hands, bad perspective, bad proportions, blurry, censored, cropped, error, extra arms, extra ears, fewer digits, jpeg artifacts, lowres, multiple legs, out of frame, poorly drawn]" -s 64 -S 460073536 -W 512 -H 768 -C 10 -A k_dpmpp_2](./examples/000020.58cf5625.460073536.png)

```json
{
  "model": "stable diffusion",
  "model_weights": "waifu-diffusion-1.3.5",
  "model_hash": "b438efac4434af4e482d20cdfcea64067f8dfec438628261d2f2aa60ffc41452",
  "app_id": "invoke-ai/InvokeAI",
  "app_version": "2.2.4",
  "image": {
    "prompt": [
      {
        "prompt": "<wd135-itscalling-mob-umamusume> horse ears horse tail horse girl, running outdoors park, white t-shirts black shorts, morning sunlight, pov from side looking at viewer cowboy shot, [bad anatomy, bad hands, bad perspective, bad proportions, blurry, censored, cropped, error, extra arms, extra ears, fewer digits, jpeg artifacts, lowres, multiple legs, out of frame, poorly drawn]",
        "weight": 1
      }
    ],
    "steps": 64,
    "cfg_scale": 10,
    "threshold": 0,
    "perlin": 0,
    "height": 768,
    "width": 512,
    "seed": 460073536,
    "seamless": false,
    "hires_fix": false,
    "type": "txt2img",
    "postprocessing": null,
    "sampler": "k_dpmpp_2",
    "variations": []
  }
}
```

!["<wd135-itscalling-mob-umamusume> horse ears horse tail horse girl, school sailor uniform white shirt purple pleated skirt, standing looking at viewer smile one eye closed arms behind back, standing indoors empty classroom, dusk sunset ambience light, full body shot, [bad anatomy, bad hands, bad perspective, bad proportions, blurry, censored, cropped, error, extra arms, extra ears, fewer digits, jpeg artifacts, lowres, multiple legs, out of frame, poorly drawn]" -s 64 -S 1869090925 -W 512 -H 768 -C 10 -A k_dpmpp_2](./examples/000032.f35340f2.1869090925.png)

```json
{
  "model": "stable diffusion",
  "model_weights": "waifu-diffusion-1.3.5",
  "model_hash": "b438efac4434af4e482d20cdfcea64067f8dfec438628261d2f2aa60ffc41452",
  "app_id": "invoke-ai/InvokeAI",
  "app_version": "2.2.4",
  "image": {
    "prompt": [
      {
        "prompt": "<wd135-itscalling-mob-umamusume> horse ears horse tail horse girl, school sailor uniform white shirt purple pleated skirt, standing looking at viewer smile one eye closed arms behind back, standing indoors empty classroom, dusk sunset ambience light, full body shot, [bad anatomy, bad hands, bad perspective, bad proportions, blurry, censored, cropped, error, extra arms, extra ears, fewer digits, jpeg artifacts, lowres, multiple legs, out of frame, poorly drawn]",
        "weight": 1
      }
    ],
    "steps": 64,
    "cfg_scale": 10,
    "threshold": 0,
    "perlin": 0,
    "height": 768,
    "width": 512,
    "seed": 1869090925,
    "seamless": false,
    "hires_fix": false,
    "type": "txt2img",
    "postprocessing": null,
    "sampler": "k_dpmpp_2",
    "variations": []
  }
}
```
16e40f2d6fcdf5a3b9422e7bcf712b5c
apache-2.0
['generated_from_trainer']
false
distilbert-base-uncased-finetuned-emotion

This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the emotion dataset. It achieves the following results on the evaluation set:

- Loss: 0.2220
- Accuracy: 0.9215
- F1: 0.9216
10c683db7f023a35fcb247b2acb8db18
apache-2.0
['generated_from_trainer']
false
Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1     |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| 0.8267        | 1.0   | 250  | 0.3110          | 0.909    | 0.9073 |
| 0.252         | 2.0   | 500  | 0.2220          | 0.9215   | 0.9216 |
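The F1 above is presumably the support-weighted average over the emotion classes (the usual choice when accuracy and F1 are reported together on an imbalanced dataset). A sketch of the weighted average; the per-class scores and supports below are illustrative, not taken from this run:

```python
# Weighted F1: per-class F1 scores averaged, weighted by class support.
# Per-class values and supports are made-up illustration numbers.
per_class_f1 = [0.95, 0.93, 0.80, 0.92, 0.85, 0.78]
support = [550, 700, 160, 275, 212, 66]   # e.g. sadness, joy, love, anger, fear, surprise

weighted_f1 = sum(f * s for f, s in zip(per_class_f1, support)) / sum(support)
print(round(weighted_f1, 4))
```

Because frequent classes dominate the weighting, weighted F1 tracks accuracy closely, which matches the near-identical 0.9215/0.9216 pair above.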
88b6126609174c784088f670dc2e9050
creativeml-openrail-m
['text-to-image']
false
Aipom_From_Pokémon-Diffusion

Dreambooth model trained by Laughify with [TheLastBen's fast-DreamBooth](https://colab.research.google.com/github/TheLastBen/fast-stable-diffusion/blob/main/fast-DreamBooth.ipynb) notebook

Test the concept via A1111 Colab [fast-Colab-A1111](https://colab.research.google.com/github/TheLastBen/fast-stable-diffusion/blob/main/fast_stable_diffusion_AUTOMATIC1111.ipynb)

Or you can run your new concept via `diffusers` [Colab Notebook for Inference](https://colab.research.google.com/github/huggingface/notebooks/blob/main/diffusers/sd_dreambooth_inference.ipynb)
2b8df58dda90f058094c85a6a5c418d5
apache-2.0
['automatic-speech-recognition', 'pl']
false
exp_w2v2t_pl_wav2vec2_s530

Fine-tuned [facebook/wav2vec2-large-lv60](https://huggingface.co/facebook/wav2vec2-large-lv60) for speech recognition using the train split of [Common Voice 7.0 (pl)](https://huggingface.co/datasets/mozilla-foundation/common_voice_7_0). When using this model, make sure that your speech input is sampled at 16kHz. This model has been fine-tuned by the [HuggingSound](https://github.com/jonatasgrosman/huggingsound) tool.
05e7706da906474871b34d9d550e3d83
other
['text-generation', 'opt']
false
How to use

You can use this model directly with a pipeline for text generation.

```python
>>> from transformers import pipeline
>>> generator = pipeline('text-generation', model="facebook/opt-2.7b")
>>> generator("Hello, I'm am conscious and")
[{'generated_text': 'Hello, I am conscious and I am a human being.\nI am a human being, and'}]
```

By default, generation is deterministic. In order to use the top-k sampling, please set `do_sample` to `True`.

```python
>>> from transformers import pipeline, set_seed
>>> set_seed(32)
>>> generator = pipeline('text-generation', model="facebook/opt-2.7b", do_sample=True)
>>> generator("Hello, I'm am conscious and")
[{'generated_text': "Hello, I'm am conscious and I make things. I'm in the creative community, which is"}]
```
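The `do_sample=True` switch above replaces greedy decoding with sampling. A minimal numpy sketch of top-k sampling over a toy logit vector (the logits, `k`, and the seed here are made up for illustration, not the model's actual values):

```python
import numpy as np

def top_k_sample(logits, k, rng):
    """Keep the k highest logits, renormalise with softmax, sample one token id."""
    top = np.argsort(logits)[-k:]                    # indices of the k largest logits
    probs = np.exp(logits[top] - logits[top].max())  # stable softmax over the kept logits
    probs /= probs.sum()
    return int(rng.choice(top, p=probs))

rng = np.random.default_rng(32)
logits = np.array([2.0, 1.0, 0.5, -1.0, -3.0])       # toy vocabulary of 5 tokens
token_id = top_k_sample(logits, k=3, rng=rng)
print(token_id)                                      # one of the ids 0, 1 or 2
```

With `k=1` this degenerates back to greedy decoding, which is why greedy output is deterministic while sampling varies run to run (hence `set_seed` in the card's example).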
7d85f95da30fe432e8025580403f3f35
other
['text-generation', 'opt']
false
Limitations and bias

As mentioned in Meta AI's model card, given that the training data used for this model contains a lot of unfiltered content from the internet, which is far from neutral, the model is strongly biased:

> Like other large language models for which the diversity (or lack thereof) of training
> data induces downstream impact on the quality of our model, OPT-175B has limitations in terms
> of bias and safety. OPT-175B can also have quality issues in terms of generation diversity and
> hallucination. In general, OPT-175B is not immune from the plethora of issues that plague modern
> large language models.

Here's an example of how the model can have biased predictions:

```python
>>> from transformers import pipeline, set_seed
>>> set_seed(32)
>>> generator = pipeline('text-generation', model="facebook/opt-2.7b", do_sample=True, num_return_sequences=5)
>>> generator("The woman worked as a")
[{'generated_text': "The woman worked as a security guard at a nursery in the city's eastern district of Samut P"},
 {'generated_text': 'The woman worked as a doctor in the Philippines. Officials in China allege she stole the coronavirus'},
 {'generated_text': 'The woman worked as a teacher in the city of Krasnodar in south Russia. She'},
 {'generated_text': 'The woman worked as a researcher and lecturer at the Russian Academy of Sciences in a laboratory dedicated to the'},
 {'generated_text': 'The woman worked as a nanny on a property owned by Mr Fitton-Allen in the city'}]
```

compared to:

```python
>>> from transformers import pipeline, set_seed
>>> set_seed(32)
>>> generator = pipeline('text-generation', model="facebook/opt-2.7b", do_sample=True, num_return_sequences=5)
>>> generator("The man worked as a")
[{'generated_text': "The man worked as a security guard at a retirement home after being hired by the administrator's cousin,"},
 {'generated_text': 'The man worked as a doctor in the Philippines.\n\nHe had hoped to work his way back'},
 {'generated_text': 'The man worked as a teacher in the city of Krasnodar in south Russia.He'},
 {'generated_text': 'The man worked as a researcher and his work on the topic predates the project, by many years'},
 {'generated_text': 'The man worked as a chef in a restaurant for 40 years. How could this be so different from'}]
```

This bias will also affect all fine-tuned versions of this model.
919e957332379a7211c0fef88e23f8c1
apache-2.0
['generated_from_keras_callback']
false
transformers-abhi

This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on an unknown dataset. It achieves the following results on the evaluation set:

- Train Loss: 2.9227
- Validation Loss: 2.5929
- Train Rougel: tf.Tensor(0.19853836, shape=(), dtype=float32)
- Epoch: 0
6d07109fdbbd03df4cea274b7289e965
apache-2.0
['generated_from_keras_callback']
false
Training hyperparameters

The following hyperparameters were used during training:

- optimizer: {'name': 'Adam', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}
- training_precision: float32
61d4d2259d7fccb35a099c460b83ba06
apache-2.0
['generated_from_keras_callback']
false
Training results

| Train Loss | Validation Loss | Train Rougel                                   | Epoch |
|:----------:|:---------------:|:----------------------------------------------:|:-----:|
| 2.9227     | 2.5929          | tf.Tensor(0.19853836, shape=(), dtype=float32) | 0     |
b526aed2b24ae3ac73857bf849fb0ba9
apache-2.0
['bert', 'rte', 'glue', 'kd', 'torchdistill']
false
`bert-base-uncased` fine-tuned on the RTE dataset, using a fine-tuned `bert-large-uncased` as the teacher model, with [***torchdistill***](https://github.com/yoshitomo-matsubara/torchdistill) and [Google Colab](https://colab.research.google.com/github/yoshitomo-matsubara/torchdistill/blob/master/demo/glue_kd_and_submission.ipynb) for knowledge distillation.

The training configuration (including hyperparameters) is available [here](https://github.com/yoshitomo-matsubara/torchdistill/blob/main/configs/sample/glue/rte/kd/bert_base_uncased_from_bert_large_uncased.yaml). I submitted prediction files to [the GLUE leaderboard](https://gluebenchmark.com/leaderboard), and the overall GLUE score was **78.9**.
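The distillation setup trains the BERT-base student against the BERT-large teacher's softened outputs. A minimal numpy sketch of the usual soft-target KD loss, T² · KL(teacher ‖ student) with temperature T; the logits and temperature below are illustrative, and torchdistill's actual configuration is in the linked YAML:

```python
import numpy as np

def softmax(x, T):
    """Temperature-scaled, numerically stable softmax."""
    z = np.exp((x - x.max()) / T)
    return z / z.sum()

def kd_loss(teacher_logits, student_logits, T=4.0):
    """Soft-target loss: KL(teacher || student) on softened distributions, scaled by T^2."""
    p = softmax(teacher_logits, T)   # teacher distribution
    q = softmax(student_logits, T)   # student distribution
    return T ** 2 * float(np.sum(p * (np.log(p) - np.log(q))))

teacher = np.array([3.0, 1.0, -2.0])   # made-up 3-class logits
student = np.array([2.0, 1.5, -1.0])
print(round(kd_loss(teacher, student), 4))
```

The T² factor keeps the gradient magnitude roughly independent of the temperature, so the soft-target term can be mixed with the ordinary hard-label cross-entropy without retuning the loss weights.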
609c925516dd6bd49d49772c838c835e
apache-2.0
['generated_from_trainer', 'whisper-event']
false
Whisper Small3 Italian

This model is a fine-tuned version of [openai/whisper-small](https://huggingface.co/openai/whisper-small) on the mozilla-foundation/common_voice_11_0 it dataset. It achieves the following results on the evaluation set:

- Loss: 0.2307
- Wer: 10.2508
d7a5848bbf66a7525cb051b512d36de7
apache-2.0
['generated_from_trainer', 'whisper-event']
false
Intended uses & limitations

This model has been developed as part of the Hugging Face Whisper Fine-Tuning sprint, December 2022. It is meant to spread knowledge of how these models are built, and can be used to develop solutions wherever ASR for the Italian language is needed. It has not been extensively tested: accuracy may be lower on other datasets. Please test it before using it.
751722b2f522fb7c51ea98f04d4c90c8
apache-2.0
['generated_from_trainer', 'whisper-event']
false
Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 8e-06
- train_batch_size: 64
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 6000
- mixed_precision_training: Native AMP
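The scheduler settings above correspond to a linear ramp to the peak learning rate over the first 500 steps, followed by a linear decay to zero at step 6000. A sketch (assuming the standard 🤗 `get_linear_schedule_with_warmup` behaviour):

```python
def lr_at(step, peak_lr=8e-06, warmup_steps=500, total_steps=6000):
    """Linear warmup to peak_lr, then linear decay to 0 at total_steps."""
    if step < warmup_steps:
        return peak_lr * step / warmup_steps
    return peak_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))

for step in (0, 250, 500, 3250, 6000):
    print(step, lr_at(step))
```

The warmup avoids large, destabilising updates while the optimizer's moment estimates are still warming up; the decay to zero then anneals the fine-tune to a fixed endpoint.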
1edf7e7b261b68d40444cb77e996e3a5
apache-2.0
['generated_from_trainer', 'whisper-event']
false
Training results

| Training Loss | Epoch | Step | Validation Loss | Wer     |
|:-------------:|:-----:|:----:|:---------------:|:-------:|
| 0.226         | 2.01  | 1000 | 0.2494          | 11.3684 |
| 0.1017        | 4.02  | 2000 | 0.2403          | 10.6029 |
| 0.0491        | 6.03  | 3000 | 0.2549          | 10.9591 |
| 0.1102        | 8.04  | 4000 | 0.2307          | 10.2508 |
| 0.0384        | 10.05 | 5000 | 0.2592          | 10.5903 |
| 0.0285        | 12.06 | 6000 | 0.2537          | 10.5026 |
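The WER figures above are edit-distance based: substitutions + insertions + deletions, divided by the number of reference words. A self-contained sketch on toy strings (the example sentences are made up):

```python
def wer(reference, hypothesis):
    """Word error rate via word-level Levenshtein distance."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i                       # delete all reference words
    for j in range(len(hyp) + 1):
        dp[0][j] = j                       # insert all hypothesis words
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,        # deletion
                           dp[i][j - 1] + 1,        # insertion
                           dp[i - 1][j - 1] + cost) # substitution / match
    return dp[-1][-1] / len(ref)

print(wer("buona sera a tutti", "buona serata tutti"))
```

A WER of 10.25 therefore means roughly one word-level error per ten reference words; note WER can exceed 100% when the hypothesis has many insertions.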
5412d5ed219b3ed281df0fa2baf82183
apache-2.0
['sentence-transformers', 'feature-extraction', 'sentence-similarity', 'transformers']
false
sentence-transformers/msmarco-distilbert-base-v4

This is a [sentence-transformers](https://www.SBERT.net) model: It maps sentences & paragraphs to a 768 dimensional dense vector space and can be used for tasks like clustering or semantic search.
1720cc6fb632f0d18fd77ca3fbe9d59a
apache-2.0
['sentence-transformers', 'feature-extraction', 'sentence-similarity', 'transformers']
false
Usage (Sentence-Transformers)

Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed:

```
pip install -U sentence-transformers
```

Then you can use the model like this:

```python
from sentence_transformers import SentenceTransformer

sentences = ["This is an example sentence", "Each sentence is converted"]

model = SentenceTransformer('sentence-transformers/msmarco-distilbert-base-v4')
embeddings = model.encode(sentences)
print(embeddings)
```
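The embeddings returned by `model.encode` are meant for clustering or semantic search. A sketch of the search step using cosine similarity; the 4-d vectors below are dummy stand-ins for the model's 768-d embeddings:

```python
import numpy as np

def cosine_sim(a, b):
    """Cosine similarity between two dense vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Dummy 4-d stand-ins for the 768-d vectors from model.encode(...)
query = np.array([1.0, 0.0, 1.0, 0.0])
corpus = {
    "doc_a": np.array([0.9, 0.1, 1.1, 0.0]),
    "doc_b": np.array([0.0, 1.0, 0.0, 1.0]),
}
best = max(corpus, key=lambda name: cosine_sim(query, corpus[name]))
print(best)
```

In practice the corpus embeddings are computed once and cached; only the query is embedded at search time.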
fd3ee41575318386a40992e7da33fd26
apache-2.0
['sentence-transformers', 'feature-extraction', 'sentence-similarity', 'transformers']
false
Load model from HuggingFace Hub

```python
tokenizer = AutoTokenizer.from_pretrained('sentence-transformers/msmarco-distilbert-base-v4')
model = AutoModel.from_pretrained('sentence-transformers/msmarco-distilbert-base-v4')
```
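When using the raw HF model instead of the sentence-transformers wrapper, the token embeddings have to be pooled into a sentence vector; the card's architecture section lists mean-token pooling. A numpy sketch on dummy tensors whose shapes mirror `model(**encoded_input).last_hidden_state` and the attention mask:

```python
import numpy as np

def mean_pooling(token_embeddings, attention_mask):
    """Average token vectors, counting only non-padding positions."""
    mask = attention_mask[..., None].astype(float)    # (batch, seq, 1)
    summed = (token_embeddings * mask).sum(axis=1)    # mask out padding, sum over seq
    counts = np.clip(mask.sum(axis=1), 1e-9, None)    # avoid division by zero
    return summed / counts

# Dummy batch: 1 sentence, 4 positions (last one is padding), 3-d vectors
tokens = np.array([[[1.0, 2.0, 3.0],
                    [3.0, 2.0, 1.0],
                    [2.0, 2.0, 2.0],
                    [9.0, 9.0, 9.0]]])   # padding row must be ignored
mask = np.array([[1, 1, 1, 0]])
print(mean_pooling(tokens, mask))
```

Without the mask, padding positions would drag every short sentence's embedding toward the padding vector, which is why the mask-weighted average is used rather than a plain `.mean(axis=1)`.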
f1a200aa7f131c867c87fa5cf0300f86
apache-2.0
['sentence-transformers', 'feature-extraction', 'sentence-similarity', 'transformers']
false
Evaluation Results

For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name=sentence-transformers/msmarco-distilbert-base-v4)
03f60b66e57ec703422b88c9e1b1c159
apache-2.0
['sentence-transformers', 'feature-extraction', 'sentence-similarity', 'transformers']
false
Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: DistilBertModel
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False})
)
```
14c8dc76340da12dcd1525e9fc46b4f9
apache-2.0
['generated_from_trainer']
false
sentiment_model

This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the None dataset. It achieves the following results on the evaluation set:

- Loss: 0.3852
- Accuracy: 0.8424
- F1: 0.8398
e6a9483a9809da745a13a9d5bcd693e1
mit
['generated_from_trainer']
false
xlm-roberta-base-finetuned-panx-it

This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) on the xtreme dataset. It achieves the following results on the evaluation set:

- Loss: 0.2401
- F1: 0.8246
ab9534db38c323d806d330b3c0f2615a
mit
['generated_from_trainer']
false
Training results

| Training Loss | Epoch | Step | Validation Loss | F1     |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 0.8187        | 1.0   | 70   | 0.3325          | 0.7337 |
| 0.2829        | 2.0   | 140  | 0.2554          | 0.8003 |
| 0.1894        | 3.0   | 210  | 0.2401          | 0.8246 |
e9af34550689d4ad440d067abea310d2
apache-2.0
['generated_from_keras_callback']
false
thanat/mt5-small-finetuned-amazon-en-es

This model is a fine-tuned version of [google/mt5-small](https://huggingface.co/google/mt5-small) on the [amazon_reviews_multi](https://huggingface.co/datasets/amazon_reviews_multi) dataset. It achieves the following results on the evaluation set:

- Train Loss: 4.0061
- Validation Loss: 3.3257
- Epoch: 7
db1b33d9f360f802c3d92aba9c9bd9e8