license
stringlengths
2
30
tags
stringlengths
2
513
is_nc
bool
1 class
readme_section
stringlengths
201
597k
hash
stringlengths
32
32
apache-2.0
['translation']
false
System Info: - hf_name: vie-fra - source_languages: vie - target_languages: fra - opus_readme_url: https://github.com/Helsinki-NLP/Tatoeba-Challenge/tree/master/models/vie-fra/README.md - original_repo: Tatoeba-Challenge - tags: ['translation'] - languages: ['vi', 'fr'] - src_constituents: {'vie', 'vie_Hani'} - tgt_constituents: {'fra'} - src_multilingual: False - tgt_multilingual: False - prepro: normalization + SentencePiece (spm32k,spm32k) - url_model: https://object.pouta.csc.fi/Tatoeba-MT-models/vie-fra/opus-2020-06-17.zip - url_test_set: https://object.pouta.csc.fi/Tatoeba-MT-models/vie-fra/opus-2020-06-17.test.txt - src_alpha3: vie - tgt_alpha3: fra - short_pair: vi-fr - chrF2_score: 0.544 - bleu: 34.2 - brevity_penalty: 0.955 - ref_len: 11519.0 - src_name: Vietnamese - tgt_name: French - train_date: 2020-06-17 - src_alpha2: vi - tgt_alpha2: fr - prefer_old: False - long_pair: vie-fra - helsinki_git_sha: 480fcbe0ee1bf4774bcbe6226ad9f58e63f6c535 - transformers_git_sha: 2207e5d8cb224e954a7cba69fa4ac2309e9ff30b - port_machine: brutasse - port_time: 2020-08-21-14:41
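The card reports `bleu: 34.2` alongside `brevity_penalty: 0.955` and `ref_len: 11519.0`. As a sketch of how that penalty term is defined (the standard BLEU brevity penalty; the candidate length of 11,012 tokens below is back-derived from the reported numbers purely for illustration, it is not stated in the card):

```python
import math

def brevity_penalty(cand_len: int, ref_len: float) -> float:
    # Standard BLEU brevity penalty: no penalty when the candidate is at
    # least as long as the reference, otherwise exp(1 - ref_len/cand_len).
    if cand_len >= ref_len:
        return 1.0
    return math.exp(1.0 - ref_len / cand_len)
```

A candidate corpus of roughly 11,012 tokens against the 11,519-token reference would reproduce the reported 0.955.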
85c15f3926c7738ff842388ea3e90652
cc-by-4.0
['bert']
false
bert-sr-small A small-size BERT Language Model with a **shuffle + random** pre-training objective. For more details about the pre-training objective and the pre-training hyperparameters, please refer to [How does the pre-training objective affect what large language models learn about linguistic properties?](https://aclanthology.org/2022.acl-short.16/)
a52d2349c899501634d6029ee9936e36
['apache-2.0', 'bsd-3-clause']
['summarization', 'summary', 'booksum', 'long-document', 'long-form']
false
long-t5-tglobal-large-pubmed-3k-booksum-16384-WIP17 This model is a fine-tuned version of [pszemraj/long-t5-tglobal-large-pubmed-3k-booksum-16384-WIP16](https://huggingface.co/pszemraj/long-t5-tglobal-large-pubmed-3k-booksum-16384-WIP16) on the kmfoda/booksum dataset.
008a9c6eaf58e1024d0243e2c828ee6c
['apache-2.0', 'bsd-3-clause']
['summarization', 'summary', 'booksum', 'long-document', 'long-form']
false
Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 1 - eval_batch_size: 1 - seed: 42 - distributed_type: multi-GPU - gradient_accumulation_steps: 64 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - lr_scheduler_warmup_ratio: 0.01 - num_epochs: 3
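The batch-size fields above are related: the reported `total_train_batch_size: 64` is the per-device batch size times the gradient-accumulation steps times the number of devices. A minimal sketch (the helper name is illustrative, not part of the Trainer API):

```python
def total_train_batch_size(per_device: int, grad_accum: int, num_devices: int = 1) -> int:
    # Effective number of samples contributing to each optimizer step.
    return per_device * grad_accum * num_devices
```

With `train_batch_size: 1` and `gradient_accumulation_steps: 64` on a single device, this matches the reported total of 64.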
741c9b0f63f370135490d0a35e150d1e
apache-2.0
['automatic-speech-recognition', 'en']
false
exp_w2v2r_en_vp-100k_age_teens-10_sixties-0_s232 Fine-tuned [facebook/wav2vec2-large-100k-voxpopuli](https://huggingface.co/facebook/wav2vec2-large-100k-voxpopuli) for speech recognition using the train split of [Common Voice 7.0 (en)](https://huggingface.co/datasets/mozilla-foundation/common_voice_7_0). When using this model, make sure that your speech input is sampled at 16kHz. This model has been fine-tuned by the [HuggingSound](https://github.com/jonatasgrosman/huggingsound) tool.
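As the card notes, input audio must be sampled at 16 kHz before being fed to the model. A minimal numpy sketch of bringing audio to 16 kHz via linear interpolation (`resample_to_16khz` is a hypothetical helper; real pipelines would typically use torchaudio or librosa resampling instead):

```python
import numpy as np

TARGET_SR = 16_000

def resample_to_16khz(audio: np.ndarray, orig_sr: int) -> np.ndarray:
    # Naive linear-interpolation resampling of a mono waveform.
    if orig_sr == TARGET_SR:
        return audio
    duration = audio.shape[0] / orig_sr
    n_out = int(round(duration * TARGET_SR))
    old_t = np.linspace(0.0, duration, num=audio.shape[0], endpoint=False)
    new_t = np.linspace(0.0, duration, num=n_out, endpoint=False)
    return np.interp(new_t, old_t, audio)
```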
89d1f5640185c25130d469f62f7bac18
mit
['bert', 'pytorch', 'tsdae']
false
Introduction Legal_BERTimbau Large is a fine-tuned BERT model based on [BERTimbau](https://huggingface.co/neuralmind/bert-base-portuguese-cased) Large. "BERTimbau Base is a pretrained BERT model for Brazilian Portuguese that achieves state-of-the-art performances on three downstream NLP tasks: Named Entity Recognition, Sentence Textual Similarity and Recognizing Textual Entailment. It is available in two sizes: Base and Large. For further information or requests, please go to [BERTimbau repository](https://github.com/neuralmind-ai/portuguese-bert/)." The performance of language models can change drastically when there is a domain shift between training and test data. To create a Portuguese language model adapted to the legal domain, the original BERTimbau model was put through a fine-tuning stage in which one "pre-training" epoch was performed over 10,000 cleaned documents (lr: 2e-5, using the TSDAE technique).
27b7da55d295884576802fb60de815a6
mit
['bert', 'pytorch', 'tsdae']
false
Params | Model | Architecture | Layers | Parameters | | ---------------------------------------- | ---------- | ------- | ------- | | `rufimelo/Legal-BERTimbau-base` | BERT-Base | 12 | 110M | | `rufimelo/Legal-BERTimbau-large` | BERT-Large | 24 | 335M |
e4c4e06247deef3be00e50265ce8fee6
mit
['bert', 'pytorch', 'tsdae']
false
Usage ```python from transformers import AutoTokenizer, AutoModelForMaskedLM tokenizer = AutoTokenizer.from_pretrained("rufimelo/Legal-BERTimbau-large-TSDAE") model = AutoModelForMaskedLM.from_pretrained("rufimelo/Legal-BERTimbau-large-TSDAE") ```
aa4f6ebb2043ec609e0c858c0ee62a64
mit
['bert', 'pytorch', 'tsdae']
false
Masked language modeling prediction example ```python from transformers import pipeline, AutoTokenizer, AutoModelForMaskedLM tokenizer = AutoTokenizer.from_pretrained("rufimelo/Legal-BERTimbau-large-TSDAE") model = AutoModelForMaskedLM.from_pretrained("rufimelo/Legal-BERTimbau-large-TSDAE") pipe = pipeline('fill-mask', model=model, tokenizer=tokenizer) pipe('O advogado apresentou [MASK] para o juíz') ```
74ba74db369dac16eea4f0778c515346
mit
['bert', 'pytorch', 'tsdae']
false
For BERT embeddings ```python import torch from transformers import AutoModel, AutoTokenizer tokenizer = AutoTokenizer.from_pretrained('rufimelo/Legal-BERTimbau-large-TSDAE') model = AutoModel.from_pretrained('rufimelo/Legal-BERTimbau-large-TSDAE') input_ids = tokenizer.encode('O advogado apresentou recurso para o juíz', return_tensors='pt') with torch.no_grad(): outs = model(input_ids) encoded = outs[0][0, 1:-1] ```
caef72ff527f3dcfecd2d6177a471908
mit
['bert', 'pytorch', 'tsdae']
false
Citation If you use this work, please cite BERTimbau's work: ```bibtex @inproceedings{souza2020bertimbau, author = {F{\'a}bio Souza and Rodrigo Nogueira and Roberto Lotufo}, title = {{BERT}imbau: pretrained {BERT} models for {B}razilian {P}ortuguese}, booktitle = {9th Brazilian Conference on Intelligent Systems, {BRACIS}, Rio Grande do Sul, Brazil, October 20-23 (to appear)}, year = {2020} } ```
e93e23adcbeb738c04c49f70f7e51099
mit
[]
false
dapSciBERT DapSciBERT is a BERT-like model trained with the domain-adaptive pretraining method ([Gururangan et al.](https://aclanthology.org/2020.acl-main.740/)) for the patent domain. Allenai/scibert_scivocab_uncased is used as the base model for training. The training dataset consists of a corpus of 10,000,000 patent abstracts filed between 1998 and 2020 at the US and European patent offices, as well as the World Intellectual Property Organization.
f1ec6a09f217a6298400b2b4dffeee90
mit
['generated_from_trainer']
false
poem-gen-spanish-t5-small-v5 This model is a fine-tuned version of [hackathon-pln-es/poem-gen-spanish-t5-small](https://huggingface.co/hackathon-pln-es/poem-gen-spanish-t5-small) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 2.8881
67a67b1e34f2ffe7f38362116224407c
mit
['generated_from_trainer']
false
Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.000125 - train_batch_size: 6 - eval_batch_size: 6 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 5
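With `lr_scheduler_type: linear` and no warmup listed, the learning rate decays linearly from `0.000125` to zero over the course of training. A sketch of that schedule (an illustrative helper, not the transformers scheduler API):

```python
def linear_lr(step: int, total_steps: int, base_lr: float = 1.25e-4) -> float:
    # Linear decay from base_lr at step 0 to 0 at total_steps, no warmup.
    return base_lr * max(0.0, 1.0 - step / total_steps)
```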
7c4e6d374f4e54840931e6ceb68b5f9d
mit
['generated_from_trainer']
false
Training results | Training Loss | Epoch | Step | Validation Loss | |:-------------:|:-----:|:------:|:---------------:| | 2.9366 | 0.73 | 30000 | 2.9656 | | 2.7518 | 1.46 | 60000 | 2.9120 | | 2.6018 | 2.19 | 90000 | 2.8870 | | 2.5262 | 2.93 | 120000 | 2.8646 | | 2.3886 | 3.66 | 150000 | 2.8816 | | 2.2758 | 4.39 | 180000 | 2.8900 |
e348b308852229b1d288b78d307c25fc
mit
['roberta-base', 'roberta-base-epoch_46']
false
RoBERTa, Intermediate Checkpoint - Epoch 46 This model is part of our reimplementation of the [RoBERTa model](https://arxiv.org/abs/1907.11692), trained on Wikipedia and the Book Corpus only. We train this model for almost 100K steps, corresponding to 83 epochs. We provide the 84 checkpoints (including the randomly initialized weights before training) to make it possible to study the training dynamics of such models, among other possible use cases. These models were trained as part of a work that studies how simple statistics of the data, such as co-occurrences, affect model predictions; this is described in the paper [Measuring Causal Effects of Data Statistics on Language Model's `Factual' Predictions](https://arxiv.org/abs/2207.14251). This is RoBERTa-base epoch_46.
fa67ebd8b90b1aa5085d7eead9772151
apache-2.0
['generated_from_trainer']
false
Tagged_One_250v4_NER_Model_3Epochs_AUGMENTED This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on the tagged_one250v4_wikigold_split dataset. It achieves the following results on the evaluation set: - Loss: 0.3389 - Precision: 0.5685 - Recall: 0.4847 - F1: 0.5233 - Accuracy: 0.8928
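The reported F1 is the harmonic mean of the precision and recall above; a quick check:

```python
def f1_score(precision: float, recall: float) -> float:
    # Harmonic mean of precision and recall.
    return 2 * precision * recall / (precision + recall)
```

With the listed precision 0.5685 and recall 0.4847 this gives about 0.5233, matching the card.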
037299f78ebf11d5a67c2905d5da3168
apache-2.0
['generated_from_trainer']
false
Training results | Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:| | No log | 1.0 | 87 | 0.4018 | 0.2797 | 0.1842 | 0.2221 | 0.8514 | | No log | 2.0 | 174 | 0.3266 | 0.5245 | 0.4398 | 0.4784 | 0.8888 | | No log | 3.0 | 261 | 0.3389 | 0.5685 | 0.4847 | 0.5233 | 0.8928 |
1a09ee7635a65f7b056883c32158eefb
apache-2.0
['generated_from_trainer']
false
bert-base-cased-sst2 This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on the GLUE SST2 dataset. It achieves the following results on the evaluation set: - Loss: 0.2345 - Accuracy: 0.9140
7f977488b9c65882fcf70fdd00b089ce
apache-2.0
['generated_from_trainer']
false
Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:-----:|:---------------:|:--------:| | 0.6253 | 0.12 | 500 | 0.3641 | 0.8567 | | 0.3189 | 0.24 | 1000 | 0.2656 | 0.8899 | | 0.2701 | 0.36 | 1500 | 0.3463 | 0.8807 | | 0.2533 | 0.48 | 2000 | 0.2409 | 0.9071 | | 0.2436 | 0.59 | 2500 | 0.2345 | 0.9140 | | 0.2155 | 0.71 | 3000 | 0.2926 | 0.9002 | | 0.22 | 0.83 | 3500 | 0.2998 | 0.9094 | | 0.2146 | 0.95 | 4000 | 0.2481 | 0.9140 | | 0.1737 | 1.07 | 4500 | 0.2802 | 0.9128 | | 0.1578 | 1.19 | 5000 | 0.3536 | 0.9083 | | 0.1534 | 1.31 | 5500 | 0.4714 | 0.8830 | | 0.1641 | 1.43 | 6000 | 0.3235 | 0.9128 | | 0.1601 | 1.54 | 6500 | 0.3133 | 0.9094 | | 0.1644 | 1.66 | 7000 | 0.3021 | 0.9071 | | 0.1578 | 1.78 | 7500 | 0.3552 | 0.9094 | | 0.1582 | 1.9 | 8000 | 0.2896 | 0.9106 | | 0.1448 | 2.02 | 8500 | 0.3343 | 0.9232 | | 0.0989 | 2.14 | 9000 | 0.3882 | 0.9048 | | 0.1098 | 2.26 | 9500 | 0.3218 | 0.9037 | | 0.1056 | 2.38 | 10000 | 0.3426 | 0.9140 | | 0.112 | 2.49 | 10500 | 0.3631 | 0.9025 | | 0.1066 | 2.61 | 11000 | 0.4084 | 0.9106 | | 0.126 | 2.73 | 11500 | 0.3191 | 0.9117 | | 0.12 | 2.85 | 12000 | 0.4091 | 0.9048 | | 0.1092 | 2.97 | 12500 | 0.3602 | 0.9060 | | 0.0826 | 3.09 | 13000 | 0.3571 | 0.9163 | | 0.0603 | 3.21 | 13500 | 0.4021 | 0.9243 | | 0.0636 | 3.33 | 14000 | 0.3893 | 0.9186 | | 0.0775 | 3.44 | 14500 | 0.4373 | 0.9151 | | 0.0842 | 3.56 | 15000 | 0.4100 | 0.9174 | | 0.0902 | 3.68 | 15500 | 0.3878 | 0.9037 | | 0.092 | 3.8 | 16000 | 0.3723 | 0.9140 | | 0.0978 | 3.92 | 16500 | 0.3492 | 0.9163 | | 0.0682 | 4.04 | 17000 | 0.4597 | 0.9209 | | 0.0481 | 4.16 | 17500 | 0.4668 | 0.9186 | | 0.0561 | 4.28 | 18000 | 0.4083 | 0.9209 | | 0.0571 | 4.39 | 18500 | 0.4040 | 0.9174 | | 0.0511 | 4.51 | 19000 | 0.4032 | 0.9197 | | 0.062 | 4.63 | 19500 | 0.4090 | 0.9140 | | 0.0618 | 4.75 | 20000 | 0.4150 | 0.9106 | | 0.0599 | 4.87 | 20500 | 0.3623 | 0.9209 | | 0.0614 | 4.99 | 21000 | 0.4421 | 0.9083 | | 0.0385 | 5.11 | 21500 | 0.4328 | 
0.9197 | | 0.0331 | 5.23 | 22000 | 0.4569 | 0.9209 | | 0.0343 | 5.34 | 22500 | 0.5130 | 0.9094 | | 0.0389 | 5.46 | 23000 | 0.4741 | 0.9232 | | 0.0413 | 5.58 | 23500 | 0.4654 | 0.9060 | | 0.0444 | 5.7 | 24000 | 0.4888 | 0.9014 | | 0.0406 | 5.82 | 24500 | 0.4085 | 0.9220 | | 0.031 | 5.94 | 25000 | 0.4760 | 0.9197 | | 0.037 | 6.06 | 25500 | 0.5403 | 0.9094 | | 0.0239 | 6.18 | 26000 | 0.5945 | 0.9060 | | 0.0267 | 6.29 | 26500 | 0.4595 | 0.9140 | | 0.0338 | 6.41 | 27000 | 0.4923 | 0.9106 | | 0.0293 | 6.53 | 27500 | 0.6128 | 0.8979 | | 0.0253 | 6.65 | 28000 | 0.5428 | 0.9083 | | 0.0296 | 6.77 | 28500 | 0.5244 | 0.9002 | | 0.0279 | 6.89 | 29000 | 0.5732 | 0.9048 | | 0.0321 | 7.01 | 29500 | 0.5824 | 0.9094 | | 0.0179 | 7.13 | 30000 | 0.6336 | 0.9094 | | 0.0177 | 7.24 | 30500 | 0.7145 | 0.9140 | | 0.0262 | 7.36 | 31000 | 0.5504 | 0.9083 | | 0.0182 | 7.48 | 31500 | 0.5924 | 0.9071 | | 0.0187 | 7.6 | 32000 | 0.5613 | 0.9151 | | 0.012 | 7.72 | 32500 | 0.6129 | 0.9083 | | 0.021 | 7.84 | 33000 | 0.5698 | 0.9106 | | 0.024 | 7.96 | 33500 | 0.6231 | 0.9083 | | 0.0136 | 8.08 | 34000 | 0.7155 | 0.9117 | | 0.0088 | 8.19 | 34500 | 0.7918 | 0.9060 | | 0.0129 | 8.31 | 35000 | 0.6727 | 0.9094 | | 0.0113 | 8.43 | 35500 | 0.6531 | 0.9117 | | 0.0141 | 8.55 | 36000 | 0.7040 | 0.9037 | | 0.0111 | 8.67 | 36500 | 0.6551 | 0.9094 | | 0.0111 | 8.79 | 37000 | 0.6928 | 0.9071 | | 0.0116 | 8.91 | 37500 | 0.6313 | 0.9094 | | 0.0107 | 9.03 | 38000 | 0.7104 | 0.9094 | | 0.006 | 9.14 | 38500 | 0.7446 | 0.9117 | | 0.0048 | 9.26 | 39000 | 0.7537 | 0.9140 | | 0.0099 | 9.38 | 39500 | 0.7715 | 0.9140 | | 0.0067 | 9.5 | 40000 | 0.7633 | 0.9117 | | 0.0037 | 9.62 | 40500 | 0.7669 | 0.9128 | | 0.006 | 9.74 | 41000 | 0.7714 | 0.9128 | | 0.0063 | 9.86 | 41500 | 0.8020 | 0.9106 | | 0.0107 | 9.98 | 42000 | 0.7985 | 0.9117 |
f736772342ff6ad379f65e43125a984f
mit
[]
false
babushork on Stable Diffusion This is the `<babushork>` concept taught to Stable Diffusion via Textual Inversion. You can load this concept into the [Stable Conceptualizer](https://colab.research.google.com/github/huggingface/notebooks/blob/main/diffusers/stable_conceptualizer_inference.ipynb) notebook. You can also train your own concepts and load them into the concept libraries using [this notebook](https://colab.research.google.com/github/huggingface/notebooks/blob/main/diffusers/sd_textual_inversion_training.ipynb). Here is the new concept you will be able to use as an `object`: ![<babushork> 0](https://huggingface.co/sd-concepts-library/babushork/resolve/main/concept_images/0.jpeg) ![<babushork> 1](https://huggingface.co/sd-concepts-library/babushork/resolve/main/concept_images/3.jpeg) ![<babushork> 2](https://huggingface.co/sd-concepts-library/babushork/resolve/main/concept_images/5.jpeg) ![<babushork> 3](https://huggingface.co/sd-concepts-library/babushork/resolve/main/concept_images/1.jpeg) ![<babushork> 4](https://huggingface.co/sd-concepts-library/babushork/resolve/main/concept_images/2.jpeg) ![<babushork> 5](https://huggingface.co/sd-concepts-library/babushork/resolve/main/concept_images/4.jpeg)
de182ed81b1bbe4e72df734dac5d98a6
apache-2.0
['vision', 'image-segmentation', 'generated_from_trainer']
false
segformer-b0-finetuned-segments-sidewalk This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the segments/sidewalk-semantic dataset. It achieves the following results on the evaluation set: - Loss: 0.5679 - Miou: 0.2769 - Macc: 0.3331 - Overall Accuracy: 0.8424 - Per Category Iou: [nan, 0.7174911859423314, 0.8790751054409742, 0.6065232798410057, 0.6975274018055722, 0.3486407385349508, nan, 0.40093167116703843, 0.28779837903852556, 0.0, 0.7870339041746186, 0.0, 0.0, 0.0, 0.0, 0.1464360606454247, 0.0, 0.0, 0.6770283275082656, 0.0, 0.338555175257431, 0.14697310016578427, 0.0, nan, 0.0, 0.27163002251763635, 0.0, 0.0, 0.8257437911843676, 0.7169333376341568, 0.9108105550493353, 0.0, 0.0, 0.1016801552778885, 0.0] - Per Category Accuracy: [nan, 0.9199960254104915, 0.9327745517652714, 0.7304629327758765, 0.7378309547498484, 0.45295941407150275, nan, 0.5188608021128075, 0.5327441812670195, 0.0, 0.9353764765979435, 0.0, 0.0, 0.0, 0.0, 0.1588525415198792, 0.0, 0.0, 0.9238854794385364, 0.0, 0.4400394213522207, 0.15130051149615126, 0.0, nan, 0.0, 0.3570096986572905, 0.0, 0.0, 0.9359897980968498, 0.8570458108260572, 0.9549583230619891, 0.0, 0.0, 0.11786971668879294, 0.0]
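The reported Miou is the average of the per-category IoU values, skipping the `nan` entries (categories absent from the evaluation split). A minimal numpy sketch, shown here with synthetic values:

```python
import numpy as np

def mean_iou(per_category_iou) -> float:
    # Mean IoU over categories, ignoring NaN entries.
    return float(np.nanmean(np.asarray(per_category_iou, dtype=float)))
```

Applied to the 35-entry per-category list above, this should reproduce the reported Miou of 0.2769.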
7bcabfa0b91f1465f19eeb772e36469e
apache-2.0
['vision', 'image-segmentation', 'generated_from_trainer']
false
Training hyperparameters The following hyperparameters were used during training: - learning_rate: 6e-05 - train_batch_size: 2 - eval_batch_size: 2 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 5
68207b660badde905464a20c124f3055
apache-2.0
['vision', 'image-segmentation', 'generated_from_trainer']
false
Training results | Training Loss | Epoch | Step | Validation Loss | Miou | Macc | Overall Accuracy | Per Category Iou | Per Category Accuracy | |:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:----------------:|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------:|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------:| | 1.357 | 1.0 | 400 | 1.0006 | 0.1632 | 0.2069 | 0.7524 | [nan, 0.5642795884663824, 0.7491853309192827, 0.0, 0.40589649630192104, 0.02723606910696284, nan, 0.0002207740938439576, 0.0, 0.0, 0.6632462867093903, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.5671699281129761, 0.0, 0.0009207911027492868, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.7507253434892517, 0.6157793573905029, 0.8774768871968204, 0.0, 0.0, 0.0, 0.0] | [nan, 0.6839993330882016, 0.9786792586618772, 0.0, 0.4818162160949784, 0.02785198456498826, nan, 0.00022133459131411787, 0.0, 0.0, 0.9043689536433023, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.8606078323791991, 0.0, 0.0009210330367246509, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.895198618615298, 0.8549807032886052, 0.9328734839751688, 0.0, 0.0, 0.0, 0.0] | | 1.6346 | 2.0 | 800 | 0.7856 | 0.1903 | 0.2334 | 0.7917 | [nan, 0.6276046255936906, 0.8379492348238635, 0.0, 0.5220035981992285, 
0.19441920935217594, nan, 0.16135703555333, 0.0, 0.0, 0.7357165628674137, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.567598980063164, 0.0, 0.07867871139133086, 0.0, 0.0, nan, 0.0, 0.02123705398363847, 0.0, 0.0, 0.7917172051343153, 0.6589515948064048, 0.8916684207946344, 0.0, 0.0, 0.00013685918191589503, 0.0] | [nan, 0.8610263337355926, 0.9499345560017969, 0.0, 0.5908796687797819, 0.2144081438468206, nan, 0.1813236746419022, 0.0, 0.0, 0.8825551027577866, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.9239907140298015, 0.0, 0.08495225520298297, 0.0, 0.0, nan, 0.0, 0.021302829364985724, 0.0, 0.0, 0.9258397010509258, 0.8834861376443207, 0.9489131468773239, 0.0, 0.0, 0.0001372777815910495, 0.0] | | 0.659 | 3.0 | 1200 | 0.6798 | 0.2215 | 0.2687 | 0.8107 | [nan, 0.6728474586764454, 0.8404607924530816, 0.21147709475332813, 0.5407350347311378, 0.23535489130104167, nan, 0.3087159264982809, 0.0060319580742948155, 0.0, 0.7331305064022374, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.6378031991744924, 0.0, 0.35289337122777764, 6.24997656258789e-05, 0.0, nan, 0.0, 0.14698390926256938, 0.0, 0.0, 0.8019042204623998, 0.669283249725758, 0.8928145424856038, 0.0, 0.0, 0.03847722460691187, 0.0] | [nan, 0.866012011452706, 0.9627112260298595, 0.21236715482371135, 0.5645869262075475, 0.2750610095322395, nan, 0.3857655597748765, 0.0060319580742948155, 0.0, 0.939196440844118, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.8380282443529743, 0.0, 0.5749902063170915, 6.256068386334744e-05, 0.0, nan, 0.0, 0.1605725590139305, 0.0, 0.0, 0.9212803460870584, 0.8870298583701837, 0.959700359744241, 0.0, 0.0, 0.04453994364914478, 0.0] | | 0.5481 | 4.0 | 1600 | 0.5999 | 0.2522 | 0.2998 | 0.8312 | [nan, 0.7078353465279917, 0.8661728761172196, 0.3857324719136883, 0.6338278880825696, 0.3440050078187208, nan, 0.35980405625532347, 0.23875867241702606, 0.0, 0.773703347865372, 0.0, 0.0, 0.0, 0.0, 0.0004931363471679884, 0.0, 0.0, 0.6554146448850521, 0.0, 0.367673493717809, 0.03089804641909161, 0.0, nan, 0.0, 0.21529017459808872, 0.0, 0.0, 
0.818951849158376, 0.7007504838794707, 0.9053929635423027, 0.0, 0.0, 0.06626212301200333, 0.0] | [nan, 0.8955207784307155, 0.9536263694097721, 0.39712577675621036, 0.6989299616008556, 0.4248959179453637, nan, 0.42984959564233455, 0.26168627652468784, 0.0, 0.9055166364779607, 0.0, 0.0, 0.0, 0.0, 0.0004932058379466533, 0.0, 0.0, 0.8632164276000204, 0.0, 0.6365580872107307, 0.031401709658368616, 0.0, nan, 0.0, 0.2497286263775161, 0.0, 0.0, 0.9296676429517725, 0.8858954297713482, 0.9555756265860916, 0.0, 0.0, 0.0750792276952902, 0.0] | | 0.7855 | 5.0 | 2000 | 0.5679 | 0.2769 | 0.3331 | 0.8424 | [nan, 0.7174911859423314, 0.8790751054409742, 0.6065232798410057, 0.6975274018055722, 0.3486407385349508, nan, 0.40093167116703843, 0.28779837903852556, 0.0, 0.7870339041746186, 0.0, 0.0, 0.0, 0.0, 0.1464360606454247, 0.0, 0.0, 0.6770283275082656, 0.0, 0.338555175257431, 0.14697310016578427, 0.0, nan, 0.0, 0.27163002251763635, 0.0, 0.0, 0.8257437911843676, 0.7169333376341568, 0.9108105550493353, 0.0, 0.0, 0.1016801552778885, 0.0] | [nan, 0.9199960254104915, 0.9327745517652714, 0.7304629327758765, 0.7378309547498484, 0.45295941407150275, nan, 0.5188608021128075, 0.5327441812670195, 0.0, 0.9353764765979435, 0.0, 0.0, 0.0, 0.0, 0.1588525415198792, 0.0, 0.0, 0.9238854794385364, 0.0, 0.4400394213522207, 0.15130051149615126, 0.0, nan, 0.0, 0.3570096986572905, 0.0, 0.0, 0.9359897980968498, 0.8570458108260572, 0.9549583230619891, 0.0, 0.0, 0.11786971668879294, 0.0] |
2d8355abca867f8da648661135424511
apache-2.0
['classification']
false
IDEA-CCNL/Erlangshen-TCBert-1.3B-Sentence-Embedding-Chinese - Github: [Fengshenbang-LM](https://github.com/IDEA-CCNL/Fengshenbang-LM) - Docs: [Fengshenbang-Docs](https://fengshenbang-doc.readthedocs.io/)
bf5e24af0e3b0e424acaa8ab02248fc9
apache-2.0
['classification']
false
模型分类 Model Taxonomy | 需求 Demand | 任务 Task | 系列 Series | 模型 Model | 参数 Parameter | 额外 Extra | | :----: | :----: | :----: | :----: | :----: | :----: | | 通用 General | 句子表征 Sentence Representation | 二郎神 Erlangshen | TCBert (sentence representation) | 1.3B | Chinese |
18ce55ef5a83712e432eb43c15ea4b90
apache-2.0
['classification']
false
Loading models ```python from transformers import BertTokenizer, BertForMaskedLM tokenizer = BertTokenizer.from_pretrained("IDEA-CCNL/Erlangshen-TCBert-1.3B-Sentence-Embedding-Chinese") model = BertForMaskedLM.from_pretrained("IDEA-CCNL/Erlangshen-TCBert-1.3B-Sentence-Embedding-Chinese") ```
cc6e2ed37628f17c727cc72f310a4946
apache-2.0
['generated_from_trainer']
false
vit-base-avengers-v1 This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.5324 - Accuracy: 0.8683 Refer to this [medium article](https://medium.com/@dingusagar/marvel-character-classification-by-fine-tuning-vision-transformer-45c14a7d8719) for more info on how it was trained.
e854b6ed733d76815f4fef8a7eb93750
apache-2.0
['generated_from_trainer']
false
Limitations Training was done on Google Images results for the following search terms, each representing a class: Iron Man, Captain America, Thor, Spider Man, Doctor Strange, Black Panther, Ant Man, Captain Marvel, Hulk, Black Widow, Hawkeye Avengers, Scarlet Witch, Vision Avengers, Bucky Barnes, Falcon Avengers, Loki. The model has therefore mostly seen images where these superheroes are in their suits or superhero outfits. For example, an image of Hulk is detected correctly, but an image of Bruce Banner is not, simply because the model hasn't seen such images. A little data augmentation would help.
7a14f77ea67de55a2ede2c4448342aa2
apache-2.0
['generated_from_trainer']
false
Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.8183 | 1.27 | 100 | 1.0134 | 0.8464 | | 0.2234 | 2.53 | 200 | 0.6146 | 0.8495 | | 0.1206 | 3.8 | 300 | 0.5324 | 0.8683 |
caabd96fb9ae78521961926b826aa379
apache-2.0
['generated_from_trainer']
false
my_awesome_billsum_model This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on the billsum dataset. It achieves the following results on the evaluation set: - Loss: 2.3829 - Rouge1: 0.1966 - Rouge2: 0.0969 - Rougel: 0.1655 - Rougelsum: 0.1657 - Gen Len: 19.0
f75d6142e8ee8336d96f28f279858f5a
apache-2.0
['generated_from_trainer']
false
Training results | Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len | |:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:-------:| | No log | 1.0 | 248 | 2.5380 | 0.1441 | 0.0519 | 0.1188 | 0.1189 | 19.0 | | No log | 2.0 | 496 | 2.4335 | 0.1939 | 0.0933 | 0.162 | 0.1622 | 19.0 | | 2.8683 | 3.0 | 744 | 2.3940 | 0.1974 | 0.0974 | 0.1665 | 0.1666 | 19.0 | | 2.8683 | 4.0 | 992 | 2.3829 | 0.1966 | 0.0969 | 0.1655 | 0.1657 | 19.0 |
8b3c3af80a33443fbf7ef57b46bdbbae
mit
['generated_from_trainer']
false
BiomedNLP-PubMedBERT-base-uncased-abstract-fulltext-ContaminationQAmodel_PubmedBERT This model is a fine-tuned version of [Sotireas/BiomedNLP-PubMedBERT-base-uncased-abstract-fulltext-ContaminationQAmodel_PubmedBERT](https://huggingface.co/Sotireas/BiomedNLP-PubMedBERT-base-uncased-abstract-fulltext-ContaminationQAmodel_PubmedBERT) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 3.0853
ece03d073b9a15a707039732d4f9725a
mit
['generated_from_trainer']
false
Training results | Training Loss | Epoch | Step | Validation Loss | |:-------------:|:-----:|:----:|:---------------:| | No log | 1.0 | 21 | 3.8118 | | No log | 2.0 | 42 | 3.5006 | | No log | 3.0 | 63 | 3.1242 | | No log | 4.0 | 84 | 2.9528 | | No log | 5.0 | 105 | 2.9190 | | No log | 6.0 | 126 | 2.9876 | | No log | 7.0 | 147 | 3.0574 | | No log | 8.0 | 168 | 3.0718 | | No log | 9.0 | 189 | 3.0426 | | No log | 10.0 | 210 | 3.0853 |
c49af33ea3e5845e52ed624eab1408a5
apache-2.0
['generated_from_trainer']
false
finetuning-sentiment-model-3000-samples This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the imdb dataset. It achieves the following results on the evaluation set: - Loss: 0.3182 - Accuracy: 0.8767 - F1: 0.8754
33795a66f79d811274aebc35b86e4bce
creativeml-openrail-m
['stable-diffusion', 'stable-diffusion-diffusers', 'diffusers', 'waifu-diffusion']
false
Hitokomoru Diffusion V2 ![Anime Girl](https://huggingface.co/Linaqruf/hitokomoru-diffusion-v2/resolve/main/example_image/thumbnail.png) A latent diffusion model trained on artwork by the Japanese artist [ヒトこもる/Hitokomoru](https://www.pixiv.net/en/users/30837811). The current model is fine-tuned from [waifu-diffusion-1-4](https://huggingface.co/hakurei/waifu-diffusion-v1-4) (`wd-1-4-anime_e2.ckpt`) with a learning rate of `2.0e-6`, 15,000 training steps, and a batch size of 4, on the `257 artworks` collected from Danbooru. This model is intended as a continuation of [hitokomoru-diffusion](https://huggingface.co/Linaqruf/hitokomoru-diffusion/), which was fine-tuned from Anything V3.0. The dataset was preprocessed using the [Aspect Ratio Bucketing Tool](https://github.com/NovelAI/novelai-aspect-ratio-bucketing) so that it can be converted to latents and trained at non-square resolutions. Like other anime-style Stable Diffusion models, it also supports Danbooru tags for generating images, e.g. **_1girl, white hair, golden eyes, beautiful eyes, detail, flower meadow, cumulonimbus clouds, lighting, detailed sky, garden_** - Use it with the [`Automatic1111's Stable Diffusion Webui`](https://github.com/AUTOMATIC1111/stable-diffusion-webui) see: [how-to-use](
647a24db847a102beb15e72bbc1078b0
creativeml-openrail-m
['stable-diffusion', 'stable-diffusion-diffusers', 'diffusers', 'waifu-diffusion']
false
Model Details - **Developed by:** Linaqruf - **Model type:** Diffusion-based text-to-image generation model - **Model Description:** This is a model that can be used to generate and modify images based on text prompts. - **License:** [CreativeML Open RAIL++-M License](https://huggingface.co/stabilityai/stable-diffusion-2/blob/main/LICENSE-MODEL) - **Finetuned from model:** [waifu-diffusion-v1-4-epoch-2](https://huggingface.co/hakurei/waifu-diffusion-v1-4/blob/main/wd-1-4-anime_e2.ckpt)
14aa6aac612fc5432862f969b88a899b
creativeml-openrail-m
['stable-diffusion', 'stable-diffusion-diffusers', 'diffusers', 'waifu-diffusion']
false
How to Use - Download the `hitokomoru-v2.ckpt` [here](https://huggingface.co/Linaqruf/hitokomoru-diffusion-v2/resolve/main/hitokomoru-v2.ckpt), or download the safetensors version [here](https://huggingface.co/Linaqruf/hitokomoru-diffusion-v2/resolve/main/hitokomoru-v2.safetensors). - This model is fine-tuned from [waifu-diffusion-v1-4-epoch-2](https://huggingface.co/hakurei/waifu-diffusion-v1-4/blob/main/wd-1-4-anime_e2.ckpt), which is itself fine-tuned from [stable-diffusion-2-1-base](https://huggingface.co/stabilityai/stable-diffusion-2-1-base). So in order to run this model in [`Automatic1111's Stable Diffusion Webui`](https://github.com/AUTOMATIC1111/stable-diffusion-webui), you need to put the inference config `.yaml` file next to the model; you can find it [here](https://huggingface.co/Linaqruf/hitokomoru-diffusion-v2/resolve/main/hitokomoru-v2.yaml) - You need to adjust your prompt using aesthetic tags, based on the [Official Waifu Diffusion 1.4 release notes](https://gist.github.com/harubaru/8581e780a1cf61352a739f2ec2eef09b
3803d78efb58c1bdb5824b6f7eb52acc
creativeml-openrail-m
['stable-diffusion', 'stable-diffusion-diffusers', 'diffusers', 'waifu-diffusion']
false
prompting), an ideal negative prompt to guide the model towards high aesthetic generations would look like: ``` worst quality, low quality, medium quality, deleted, lowres, comic, bad anatomy, bad hands, text, error, missing fingers, extra digit, fewer digits, cropped, jpeg artifacts, signature, watermark, username, blurry ``` - And, the following should also be prepended to prompts to get high aesthetic results: ``` masterpiece, best quality, high quality, absurdres ```
9e1f54f45fb3ae657a9edad3f2a7116c
creativeml-openrail-m
['stable-diffusion', 'stable-diffusion-diffusers', 'diffusers', 'waifu-diffusion']
false
🧨 Diffusers This model can be used just like any other Stable Diffusion model. For more information, please have a look at the [Stable Diffusion documentation](https://huggingface.co/docs/diffusers/api/pipelines/stable_diffusion). You can also export the model to [ONNX](https://huggingface.co/docs/diffusers/optimization/onnx), [MPS](https://huggingface.co/docs/diffusers/optimization/mps) and/or [FLAX/JAX](). Install the dependencies below in order to run the pipeline: ```bash pip install diffusers transformers accelerate scipy safetensors ``` Running the pipeline (if you don't swap the scheduler it will run with the default DDIM; in this example we swap it to DPMSolverMultistepScheduler): ```python import torch from torch import autocast from diffusers import StableDiffusionPipeline, DPMSolverMultistepScheduler model_id = "Linaqruf/hitokomoru-diffusion-v2" ```
1ec975b0ace9010f23fd4d8aca67fc8a
creativeml-openrail-m
['stable-diffusion', 'stable-diffusion-diffusers', 'diffusers', 'waifu-diffusion']
false
```python # Use the DPMSolverMultistepScheduler (DPM-Solver++) scheduler here instead pipe = StableDiffusionPipeline.from_pretrained(model_id, torch_dtype=torch.float16) pipe.scheduler = DPMSolverMultistepScheduler.from_config(pipe.scheduler.config) pipe = pipe.to("cuda") prompt = "masterpiece, best quality, high quality, 1girl, solo, sitting, confident expression, long blonde hair, blue eyes, formal dress" negative_prompt = "worst quality, low quality, medium quality, deleted, lowres, comic, bad anatomy, bad hands, text, error, missing fingers, extra digit, fewer digits, cropped, jpeg artifacts, signature, watermark, username, blurry" with autocast("cuda"): image = pipe(prompt, negative_prompt=negative_prompt, width=512, height=728, guidance_scale=12, num_inference_steps=50).images[0] image.save("anime_girl.png") ```
2b0f01ce8812ec4c395d6d356e9e5560
creativeml-openrail-m
['stable-diffusion', 'stable-diffusion-diffusers', 'diffusers', 'waifu-diffusion']
false
Prompt and settings for Example Images ``` masterpiece, best quality, high quality, 1girl, solo, sitting, confident expression, long blonde hair, blue eyes, formal dress, jewelry, make-up, luxury, close-up, face, upper body. Negative prompt: worst quality, low quality, medium quality, deleted, lowres, comic, bad anatomy, bad hands, text, error, missing fingers, extra digit, fewer digits, cropped, jpeg artifacts, signature, watermark, username, blurry Steps: 20, Sampler: DPM++ 2M Karras, CFG scale: 7, Seed: 994051800, Size: 512x768, Model hash: ea61e913a0, Model: hitokomoru-v2, Batch size: 2, Batch pos: 0, Denoising strength: 0.6, Clip skip: 2, ENSD: 31337, Hires upscale: 1.5, Hires steps: 20, Hires upscaler: Latent (nearest-exact) ```
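As a side note on the settings above, a minimal sketch of the hires-fix arithmetic (assuming, as in the common AUTOMATIC1111 convention, that "Hires upscale" multiplies both dimensions of the base render before the hires steps run):

```python
# Illustrative: compute the final resolution produced by the
# "Hires upscale: 1.5" setting applied to the 512x768 base render.

def hires_resolution(width: int, height: int, scale: float) -> tuple[int, int]:
    """Scale both dimensions by the hires-upscale factor."""
    return int(width * scale), int(height * scale)

print(hires_resolution(512, 768, 1.5))  # (768, 1152)
```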
7b66cd673ce8668295461dc6fdbe222c
apache-2.0
['generated_from_trainer']
false
distilbert-base-uncased-finetuned-emotion This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the emotion dataset. It achieves the following results on the evaluation set: - Loss: 0.2146 - Accuracy: 0.9225 - F1: 0.9228
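For reference, the reported Accuracy and (weighted) F1 can be computed from label/prediction pairs with a small helper; this is an illustrative sketch of the metric definitions, not the exact evaluation script used for this model:

```python
from collections import Counter

def accuracy(y_true, y_pred):
    """Fraction of predictions matching the gold labels."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def weighted_f1(y_true, y_pred):
    """Per-class F1 averaged with weights proportional to class support."""
    support = Counter(y_true)
    total = 0.0
    for c in sorted(set(y_true)):
        tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
        fp = sum(t != c and p == c for t, p in zip(y_true, y_pred))
        fn = sum(t == c and p != c for t, p in zip(y_true, y_pred))
        f1 = 2 * tp / (2 * tp + fp + fn) if tp else 0.0
        total += f1 * support[c] / len(y_true)
    return total

# Toy example with three classes (labels are hypothetical, not emotion ids):
y_true = [0, 0, 1, 1, 2, 2]
y_pred = [0, 0, 1, 2, 2, 2]
print(accuracy(y_true, y_pred))     # ~0.833
print(weighted_f1(y_true, y_pred))  # ~0.822
```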
29fdb97f0b7e5b6401b99ee8b3d72348
apache-2.0
['generated_from_trainer']
false
Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | |:-------------:|:-----:|:----:|:---------------:|:--------:|:------:| | 0.8233 | 1.0 | 250 | 0.3068 | 0.9025 | 0.8995 | | 0.2394 | 2.0 | 500 | 0.2146 | 0.9225 | 0.9228 |
e16d9671a6adb5ad7e5dd7dc2d6301de
apache-2.0
['generated_from_trainer', 'automatic-speech-recognition', 'NbAiLab/NPSC', 'robust-speech-event', False, 'nb-NO', 'hf-asr-leaderboard']
false
XLSR-300M-bokmaal This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the NBAILAB/NPSC - 16K_MP3_BOKMAAL dataset. It achieves the following results on the evaluation set: - Loss: 0.1635 - Wer: 0.1005
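The reported Wer follows the standard word-error-rate definition (word-level Levenshtein distance divided by reference length); the sketch below is illustrative, not the exact evaluation code used:

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance / number of reference words."""
    ref, hyp = reference.split(), hypothesis.split()
    # DP table: d[i][j] = edits to turn ref[:i] into hyp[:j].
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = d[i - 1][j - 1] + (ref[i - 1] != hyp[j - 1])
            d[i][j] = min(sub, d[i - 1][j] + 1, d[i][j - 1] + 1)
    return d[len(ref)][len(hyp)] / len(ref)

# Hypothetical Bokmål transcript pair: one deleted word out of four.
print(wer("god dag alle sammen", "god dag alle"))  # 0.25
```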
7178d2144d7b502dd6e94f6fe4e86b89
apache-2.0
['generated_from_trainer', 'automatic-speech-recognition', 'NbAiLab/NPSC', 'robust-speech-event', False, 'nb-NO', 'hf-asr-leaderboard']
false
Training results | Training Loss | Epoch | Step | Validation Loss | Wer | |:-------------:|:-----:|:-----:|:---------------:|:------:| | 3.0307 | 0.32 | 500 | 3.0026 | 1.0 | | 2.7865 | 0.64 | 1000 | 2.4849 | 0.9926 | | 0.7522 | 0.95 | 1500 | 0.4567 | 0.3594 | | 0.5703 | 1.27 | 2000 | 0.3440 | 0.2586 | | 0.4762 | 1.59 | 2500 | 0.2925 | 0.2178 | | 0.4585 | 1.91 | 3000 | 0.2442 | 0.1981 | | 0.4013 | 2.23 | 3500 | 0.2495 | 0.1818 | | 0.449 | 2.54 | 4000 | 0.2152 | 0.1808 | | 0.355 | 2.86 | 4500 | 0.2179 | 0.1670 | | 0.3142 | 3.18 | 5000 | 0.1953 | 0.1542 | | 0.3242 | 3.5 | 5500 | 0.2103 | 0.1526 | | 0.3016 | 3.82 | 6000 | 0.1911 | 0.1477 | | 0.2713 | 4.13 | 6500 | 0.1836 | 0.1422 | | 0.2807 | 4.45 | 7000 | 0.1924 | 0.1447 | | 0.2929 | 4.77 | 7500 | 0.1848 | 0.1402 | | 0.2595 | 5.09 | 8000 | 0.1783 | 0.1330 | | 0.2289 | 5.41 | 8500 | 0.1901 | 0.1313 | | 0.2567 | 5.72 | 9000 | 0.1784 | 0.1298 | | 0.2401 | 6.04 | 9500 | 0.1956 | 0.1298 | | 0.2098 | 6.36 | 10000 | 0.1748 | 0.1277 | | 0.2246 | 6.68 | 10500 | 0.1777 | 0.1254 | | 0.2197 | 7.0 | 11000 | 0.1703 | 0.1222 | | 0.2122 | 7.32 | 11500 | 0.1917 | 0.1221 | | 0.2746 | 7.63 | 12000 | 0.1769 | 0.1215 | | 0.2148 | 7.95 | 12500 | 0.1736 | 0.1193 | | 0.1915 | 8.27 | 13000 | 0.1814 | 0.1161 | | 0.2462 | 8.59 | 13500 | 0.1748 | 0.1166 | | 0.1872 | 8.91 | 14000 | 0.1769 | 0.1133 | | 0.1886 | 9.22 | 14500 | 0.1852 | 0.1143 | | 0.1789 | 9.54 | 15000 | 0.1696 | 0.1126 | | 0.1692 | 9.86 | 15500 | 0.1817 | 0.1122 | | 0.1765 | 10.18 | 16000 | 0.1769 | 0.1093 | | 0.1699 | 10.5 | 16500 | 0.1604 | 0.1084 | | 0.1591 | 10.81 | 17000 | 0.1777 | 0.1080 | | 0.1499 | 11.13 | 17500 | 0.1645 | 0.1074 | | 0.163 | 11.45 | 18000 | 0.1704 | 0.1065 | | 0.1597 | 11.77 | 18500 | 0.1576 | 0.1064 | | 0.1484 | 12.09 | 19000 | 0.1637 | 0.1041 | | 0.1464 | 12.4 | 19500 | 0.1631 | 0.1047 | | 0.156 | 12.72 | 20000 | 0.1686 | 0.1029 | | 0.1625 | 13.04 | 20500 | 0.1648 | 0.1023 | | 0.1395 | 13.36 | 21000 | 0.1688 | 0.1027 | | 0.1387 | 13.68 | 21500 | 0.1670 | 
0.1013 | | 0.1434 | 13.99 | 22000 | 0.1677 | 0.1017 | | 0.1442 | 14.31 | 22500 | 0.1688 | 0.1008 | | 0.1439 | 14.63 | 23000 | 0.1647 | 0.1004 | | 0.137 | 14.95 | 23500 | 0.1636 | 0.1006 |
a60c18e3acd6c299360ba10daee6ade8
other
['generated_from_trainer']
false
segformer-b0-scene-parse-150 This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the scene_parse_150 dataset. It achieves the following results on the evaluation set: - Loss: 2.3118 - Mean Iou: 0.0859 - Mean Accuracy: 0.1493 - Overall Accuracy: 0.5430 - Per Category Iou: [0.4898642376841085, 0.502026813829342, 0.9487341030299479, 0.44331193050176815, 0.28502594514455154, 0.5132976114794296, 0.8390207156308851, 0.0, 0.30530825819472024, 0.0, 0.06594624784212842, 0.0, 0.03397963180571876, 0.0, 0.0007459827819109256, 0.0, nan, 0.04554975143210437, 0.0, 0.07792795056021705, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] - Per Category Accuracy: [0.8215553632658342, 0.819071257846768, 0.9731147245348802, 0.8672811704363634, 0.9004683840749415, 0.594073476114797, 0.9732440887086908, 0.0, 0.40956851614311834, 0.0, 0.5229850345614389, 0.0, 0.034648027958062905, nan, 0.0007464041475862904, 0.0, nan, 0.0476077438413251, 0.0, 0.5009150608246313, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 
nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan]
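The Mean Iou and Mean Accuracy values above are averages over only the non-nan entries of the per-category arrays (categories absent from the evaluated images appear as nan). A minimal sketch of that aggregation, assuming this nanmean convention:

```python
import math

def nanmean(values):
    """Average over present categories only, skipping nan entries."""
    present = [v for v in values if not math.isnan(v)]
    return sum(present) / len(present)

# Hypothetical per-category IoU fragment: two absent categories as nan.
per_category_iou = [0.49, 0.50, float("nan"), 0.95, float("nan"), 0.0]
print(nanmean(per_category_iou))  # 0.485 (mean of the 4 non-nan entries)
```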
ec3daec83d7e95cf88c5587c19bf77dc
other
['generated_from_trainer']
false
Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Per Category Iou | Per Category Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------:| | 4.2849 | 1.0 | 20 | 4.2070 | 0.0194 | 0.0679 | 0.3746 | [0.3949829725229674, 0.4135772915291814, 0.0, 0.26980840849544657, 0.1282559786684443, 0.15076540186066723, 0.00908901592761032, 0.0, 0.013565775517419566, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0003970617431010522, 0.008885447023579041, 0.0, 0.0, 0.0, 0.005040122024006897, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.002655312914892643, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9524952286989713, 0.755535418800725, 0.0, 0.48244304323326054, 0.9011709601873537, 0.16045676614279827, 0.011822582618269517, 0.0, 0.013613165579542043, 0.0, 0.0, 0.0, 0.0, nan, 0.0004034617013979948, 0.07362999240057418, nan, 0.0, 0.0, 0.04090860157175153, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, 0.0034295175023651846, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, 
nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | | 3.7699 | 2.0 | 40 | 3.9727 | 0.0380 | 0.1002 | 0.4224 | [0.43442101571739283, 0.35923049538654755, 0.6190543160517142, 0.3355717837774341, 0.10588647687723779, 0.31387526278906797, 0.038367652125468235, 0.0, 0.04485789722234148, 0.0, 0.07154189015637985, 0.0, 0.0, 0.0, 0.004233122680308957, 0.0003664849512116909, 0.0, 0.0, 0.0, 0.0284352014981458, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0007874615794955165, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9416870188264217, 0.6175399889685604, 0.926106819752938, 0.704992945957396, 0.9903981264637002, 0.35779874372493126, 0.04573495744569302, 0.0, 0.045626483993340794, 0.0, 0.12710556338500772, 0.0, 0.0, nan, 0.004256520949748845, 0.0013510090348729208, nan, 0.0, 0.0, 0.24846592744105933, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, 0.0009165089877010407, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 
nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | | 3.7161 | 3.0 | 60 | 3.6079 | 0.0535 | 0.1182 | 0.4695 | [0.46147636301683304, 0.42190388170055454, 0.5905298672673311, 0.34255470866251286, 0.10127362853686882, 0.2992112324204293, 0.616176968407381, 0.0, 0.031915928772988225, 0.0, 0.05093061049274829, 0.0, 0.0, 0.0, 0.006854943777434808, 0.021351935880581825, nan, 0.0002887139107611549, 0.0, 0.052485352485352486, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9419053665006146, 0.7625719013474116, 0.9179916443749813, 0.6481040461001222, 1.0, 0.32631618778537375, 0.779355481379214, 0.0, 0.03222619469992631, 0.0, 0.11146902892423327, 0.0, 0.0, nan, 0.006899195093905711, 0.08502913113231444, nan, 0.00029013029487788154, 0.0, 0.2989557541177737, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 
nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | | 3.8074 | 4.0 | 80 | 3.4130 | 0.0568 | 0.1189 | 0.4832 | [0.475136642179407, 0.45819879963242616, 0.6245854607342572, 0.31596148184808626, 0.12156252316359054, 0.31925064356714705, 0.5921768947963801, 0.0, 0.052846247434062785, 0.0, 0.03856766297226889, 0.0, 0.0, 0.0, 0.005636392708666453, 0.0001625685836212152, nan, 0.0, 0.0, 0.06471839249046642, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9399179983179142, 0.8159062852940404, 0.9651498301824412, 0.6092060397412009, 0.9601873536299765, 0.35716808354986, 0.8782527393788994, 0.0, 0.05381949182609645, 0.0, 0.16790819408093416, 0.0, 0.0, nan, 0.005688809989711727, 0.0003377522587182302, nan, 0.0, 0.0, 0.19000968887931963, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | | 
3.3666 | 5.0 | 100 | 3.4479 | 0.0696 | 0.1338 | 0.5077 | [0.46280186689480507, 0.47811968526761967, 0.6189516129032258, 0.39204509433188267, 0.12150226270680849, 0.48382140822535485, 0.7870309951060359, 0.0, 0.09098262661206216, 0.0, 0.02553216306059369, 0.0, 0.0, 0.0, 0.007265179034769071, 0.008072378426861131, nan, 0.0, 0.0, 0.140759523528718, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7732317881865821, 0.8144485593465185, 0.9596345165459409, 0.8648009443779486, 0.977751756440281, 0.6712873035493555, 0.8431345135527166, 0.0, 0.0976174231052646, 0.0, 0.1469028924233273, 0.0, 0.0, nan, 0.007343002965443505, 0.01747867938866841, nan, 0.0, 0.0, 0.5139412207987942, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | | 2.8715 | 6.0 | 120 | 3.2604 | 0.0728 | 0.1299 | 0.5149 | 
[0.50879365259482, 0.4533002292169846, 0.900409470239023, 0.35700422307361396, 0.15217811822965027, 0.47367637662639406, 0.6628619419365922, 0.0, 0.12773258835090318, 0.0, 0.05044635946127251, 0.0, 0.0, 0.0, 0.0007838250663236595, 0.025494428563094335, nan, 0.0005788864330070519, 0.0, 0.07236342520212623, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9246377045998577, 0.7931605074462217, 0.9550359171650987, 0.7270640786762256, 0.9211943793911007, 0.548577651085156, 0.9103650757588997, 0.0, 0.13551486040228158, 0.0, 0.3280316757264613, 0.0, 0.0, nan, 0.0007867503177260898, 0.0737988685299333, nan, 0.0005802605897557631, 0.0, 0.17439982775325655, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | | 3.4302 | 7.0 | 140 | 3.1432 | 0.0594 | 0.1213 | 0.4632 | [0.45221752264588605, 
0.37219674969901695, 0.605783851771199, 0.3125698077576481, 0.12471020243040969, 0.42464017248412095, 0.6040006848533273, 0.0, 0.06926821236351662, 0.0, 0.0440532531292848, 0.0, 0.0, 0.0, 0.0, 0.0015676977784575383, nan, 0.0, 0.0, 0.07605052848672338, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.8683060263958077, 0.618740314658682, 0.9698987105888011, 0.6848796699612953, 0.9385245901639344, 0.4902163584840611, 0.9247741213890005, 0.0, 0.07348052727818565, 0.0, 0.3342057580028186, 0.0, 0.0, nan, 0.0, 0.0034619606518618592, nan, 0.0, 0.0, 0.15878996662719347, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | | 2.4766 | 8.0 | 160 | 3.0017 | 0.0731 | 0.1267 | 0.5143 | [0.4688651010981108, 0.44508907466943504, 0.8341699394002445, 0.3667585998450989, 0.15656454924159374, 
0.5731453244361243, 0.7303214047877747, 0.0, 0.10742113264918282, 0.0, 0.027347890608437567, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.008315844700944387, 0.0, 0.08233957978421351, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.8567558387785469, 0.7443437606702913, 0.9639926662859547, 0.798571093643958, 0.9197892271662763, 0.7182714865921647, 0.9103650757588997, 0.0, 0.1110722960617887, 0.0, 0.20777129051741494, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.008361027588753494, 0.0, 0.09365916675637852, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | | 2.7613 | 9.0 | 180 | 2.9787 | 0.0675 | 0.1268 | 0.4974 | [0.4946951488018788, 0.43475820007042393, 0.8041630056802009, 0.27142083984459003, 0.13245957905008665, 0.563133587635599, 0.5246769633385517, 0.0, 0.17340080548145823, 0.0, 0.02205332499422154, 
0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.014390357940672243, 0.0, 0.14129719051799824, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.872157436760044, 0.6875055813831324, 0.976556160019236, 0.6229562813884331, 0.9218969555035129, 0.6597924707583899, 0.981047167997763, 0.0, 0.21715018694904614, 0.0, 0.10885175491577746, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.014612016669304215, 0.0, 0.2772096027559479, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | | 2.423 | 10.0 | 200 | 2.8971 | 0.0705 | 0.1221 | 0.4786 | [0.45285400219846106, 0.292995708500935, 0.8659067385015526, 0.41603214969939956, 0.1778944797264289, 0.46283203885294827, 0.9158788235968208, 0.0, 0.11377199379602905, 0.0, 0.02756293067079173, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.006541312879973321, 0.0, 0.006434223111033823, 0.0, nan, 0.0, 
0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.916192906126674, 0.4576313923252699, 0.9806287758107661, 0.9056938257589779, 0.8528103044496487, 0.524885850508312, 0.9777179706751018, 0.0, 0.12091918888676619, 0.0, 0.35514395007046506, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.006725747744896344, 0.0, 0.0079664118850253, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | | 2.9947 | 11.0 | 220 | 2.7385 | 0.0643 | 0.1201 | 0.4760 | [0.4625717328755147, 0.37213716519109774, 0.8277628134602982, 0.2659240653368483, 0.15398480224484362, 0.527751424867652, 0.4902700616280028, 0.0, 0.17549662897009846, 0.0, 0.03796166432912576, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.02023508790019075, 0.0, 0.007948046912862267, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 
nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.8824581904638675, 0.5921939432143514, 0.9836945087313276, 0.5800315066859162, 0.8803278688524591, 0.6127788569074106, 0.987792943150242, 0.0, 0.21227040746704512, 0.0, 0.24521844171532112, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.020704752861739728, 0.0, 0.00882764560232533, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | | 2.5021 | 12.0 | 240 | 2.7417 | 0.0718 | 0.1267 | 0.4906 | [0.45723768505572465, 0.38457056157271563, 0.9111909002931318, 0.36509902170864206, 0.1882208839440645, 0.5653003453339632, 0.7384534282231228, 0.0, 0.13432339471333826, 0.0, 0.03152366835052954, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.025990476413877226, 0.0, 0.002576370997423629, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 
*Per-category IoU and per-category accuracy arrays are omitted here for readability: almost every entry is either `nan` (the category never occurs in the validation labels) or 0.0.*

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|
| 2.1942        | 13.0  | 260  | 2.7584          | 0.0676   | 0.1218        | 0.4845           |
| 2.9015        | 14.0  | 280  | 2.7956          | 0.0691   | 0.1297        | 0.4652           |
| 2.1112        | 15.0  | 300  | 2.5442          | 0.0756   | 0.1348        | 0.5229           |
| 2.3631        | 16.0  | 320  | 2.6116          | 0.0761   | 0.1291        | 0.4780           |
| 2.0519        | 17.0  | 340  | 2.5895          | 0.0709   | 0.1269        | 0.4954           |
| 2.1177        | 18.0  | 360  | 2.5314          | 0.0793   | 0.1292        | 0.5039           |
| 1.8557        | 19.0  | 380  | 2.5372          | 0.0830   | 0.1361        | 0.5226           |
| 1.7307        | 20.0  | 400  | 2.4994          | 0.0777   | 0.1317        | 0.5046           |
| 1.7527        | 21.0  | 420  | 2.4304          | 0.0807   | 0.1338        | 0.5327           |
| 1.9816        | 22.0  | 440  | 2.4334          | 0.0824   | 0.1374        | 0.5165           |
| 2.5813        | 23.0  | 460  | 2.4229          | 0.0811   | 0.1348        | 0.5258           |
| 1.7373        | 24.0  | 480  | 2.4194          | 0.0833   | 0.1335        | 0.5209           |
| 1.855         | 25.0  | 500  | 2.3374          | 0.0807   | 0.1361        | 0.5333           |
| 1.7691        | 26.0  | 520  | 2.3883          | 0.0849   | 0.1364        | 0.5267           |
| 1.1586        | 27.0  | 540  | 2.3779          | 0.0854   | 0.1406        | 0.5312           |
| 1.4882        | 28.0  | 560  | 2.3883          | 0.0807   | 0.1330        | 0.5090           |
| 1.5824        | 29.0  | 580  | 2.3286          | 0.0870   | 0.1422        | 0.5448           |
| 1.5912        | 30.0  | 600  | 2.2626          | 0.0859   | 0.1400        | 0.5506           |
| 1.2659        | 31.0  | 620  | 2.3314          | 0.0821   | 0.1372        | 0.5285           |
| 2.1962        | 32.0  | 640  | 2.3454          | 0.0823   | 0.1363        | 0.5250           |
| 1.3758        | 33.0  | 660  | 2.2960          | 0.0858   | 0.1381        | 0.5521           |
nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.8136200426991007, 0.8610248732697712, 0.9806137477082144, 0.7981967975057892, 0.7927400468384075, 0.7251288648957729, 0.9571485992904702, 0.0, 0.429164051199476, 0.0, 0.48728273270250316, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.06187687925304637, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | | 1.1937 | 34.0 | 680 | 2.3717 | 0.0809 | 0.1382 | 0.5162 | [0.46788815264908856, 0.39713524992973664, 0.9420111667197038, 0.4486386060669676, 0.2132772688034307, 0.5013255182231519, 0.8557263450623428, 0.0, 0.27594406482049544, 0.0, 0.057764068612699365, 0.0, 0.06538386020027488, 0.0, 0.0, 0.0, nan, 0.03816454915505119, 0.0, 0.02353654837731271, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 
nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.8601260755644692, 0.6272187639534579, 0.9787051786841393, 0.87901597133961, 0.8968384074941452, 0.5804049679204191, 0.9841054857482393, 0.0, 0.35598373406839334, 0.0, 0.5152674317159922, 0.0, 0.06650024962556166, nan, 0.0, 0.0, nan, 0.040802869652371156, 0.0, 0.1253095058671547, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | | 1.5121 | 35.0 | 700 | 2.3246 | 0.0880 | 0.1432 | 0.5554 | [0.5131815824963825, 0.5392289711975226, 0.9466915273149031, 0.4283665694793061, 0.32593553969284017, 0.5269490633089107, 0.8233152953838795, 0.0, 0.30573839098707556, 0.0, 0.07293464323245567, 0.0, 0.021460506706408346, 0.0, 0.000403437285673942, 0.0, nan, 0.04180831826401447, 0.0, 0.0274542272735826, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 
nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.8223984278967458, 0.9082735797021512, 0.9690421087433501, 0.8514702434570156, 0.8822014051522248, 0.6059761358189754, 0.9625225004805927, 0.0, 0.4246827324581753, 0.0, 0.5076840480504664, 0.0, 0.021567648527209188, nan, 0.0004034617013979948, 0.0, nan, 0.04573508466529514, 0.0, 0.15706749919259338, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | | 1.8519 | 36.0 | 720 | 2.3376 | 0.0842 | 0.1419 | 0.5225 | [0.46978904879856803, 0.4126084563774106, 0.9491212653778559, 0.4236789043334405, 0.2236540258857515, 0.5790747855376033, 0.8381056863279266, 0.0, 0.26879812193451924, 0.0, 0.04799207015708248, 0.0, 0.07114199041002055, 0.0, 0.0, 0.0, nan, 0.043645116606489995, 0.0, 0.05208707360861759, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 
0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.8405111761661384, 0.6478764478764478, 0.9739112139701241, 0.8616667283638323, 0.9247072599531616, 0.6939153906309125, 0.9946435749113088, 0.0, 0.3149914030730602, 0.0, 0.46788806120394605, 0.0, 0.072591113330005, nan, 0.0, 0.0, nan, 0.05069367515957166, 0.0, 0.24986543223167187, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | | 1.4294 | 37.0 | 740 | 2.2841 | 0.0839 | 0.1441 | 0.5415 | [0.4954846291829034, 0.4962055595459022, 0.9514013206162876, 0.42174959529692424, 0.2305590278787174, 0.5754330741060775, 0.7389902340151097, 0.0, 0.28239909124140666, 0.0, 0.0626513819303892, 0.0, 0.024224519940915804, 0.0, 0.0, 0.0, nan, 0.042740019564913585, 0.0, 0.04342717990344769, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, 
nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.8097565019085204, 0.8020171775273817, 0.9743921132517809, 0.8144149257783098, 0.9286885245901639, 0.7055699906662294, 0.9812306670627916, 0.0, 0.39216178597745693, 0.0, 0.5068787329709415, 0.0, 0.02456315526709935, nan, 0.0, 0.0, nan, 0.04839900828190114, 0.0, 0.2188610184088707, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | | 1.459 | 38.0 | 760 | 2.3317 | 0.0867 | 0.1431 | 0.5364 | [0.48030892247498846, 0.4902823326267501, 0.9503766450743031, 0.4450063131445763, 0.2943483275663206, 0.5136606089555695, 0.8482513022217497, 0.0, 0.27528882635446444, 0.0, 0.06326279885549861, 0.0, 0.047208220240403255, 0.0, 0.0, 0.0, nan, 0.060309323519102696, 0.0, 0.03801530978390878, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, 
nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.8180314744128874, 0.799466813752528, 0.9745423942772985, 0.8712421284699514, 0.8964871194379391, 0.5967979280711049, 0.9761538595970011, 0.0, 0.36950956578696, 0.0, 0.5252667606200926, 0.0, 0.048627059410883675, nan, 0.0, 0.0, nan, 0.07261170016352798, 0.0, 0.2052965873613952, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | | 2.034 | 39.0 | 780 | 2.3242 | 0.0843 | 0.1413 | 0.5302 | [0.4710699494256834, 0.4498673330662481, 0.9501882167556683, 0.4472577527454994, 0.2698644793152639, 0.5192405498658997, 0.887438278016587, 0.0, 0.2855318739559738, 0.0, 0.05453332753345803, 0.0, 0.04646858256210424, 0.0, 0.0, 0.0, nan, 0.04317150187487665, 0.0, 0.04550834280940336, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 
nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.8402523937374652, 0.7196412155595829, 0.9786901505815876, 0.9050768541026558, 0.8860655737704918, 0.5966465696290877, 0.9705352929868405, 0.0, 0.3564094866406485, 0.0, 0.504798335682169, 0.0, 0.04762855716425362, nan, 0.0, 0.0, nan, 0.046157092366935694, 0.0, 0.2111099149531704, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | | 1.5324 | 40.0 | 800 | 2.2868 | 0.0883 | 0.1486 | 0.5472 | [0.49940400402662244, 0.5083495805389489, 0.9490884123181207, 0.4509435293966673, 0.2786019330186559, 0.5677884018378244, 0.8314473654804562, 0.0, 0.31029173032849505, 0.0, 0.05001421201913482, 0.0, 0.028731123749754853, 0.0, 0.0, 0.0, nan, 0.040415210998567375, 0.0, 0.07729100529100529, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 
nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7953698162644757, 0.8252252252252252, 0.9763307384809594, 0.8672153601263558, 0.8708430913348946, 0.672847136382365, 0.9752625784240052, 0.0, 0.4256706967604596, 0.0, 0.5549963089725521, 0.0, 0.029256115826260608, nan, 0.0, 0.0, nan, 0.042411774014875774, 0.0, 0.39315319194746473, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | | 0.9781 | 41.0 | 820 | 2.3017 | 0.0840 | 0.1446 | 0.5409 | [0.48699415491479947, 0.49647550426649484, 0.9501249433736172, 0.4438547243769686, 0.25326284487031225, 0.5011356078509064, 0.8192399537057384, 0.0, 0.2892233406202705, 0.0, 0.06541450159698718, 0.0, 0.03028817878847285, 0.0, 0.0010690872415532022, 0.0, nan, 0.04215851602023609, 0.0, 0.07201478580386807, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 
nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.8351495277220676, 0.8013868095500749, 0.9770971717110998, 0.8630322922964919, 0.8975409836065574, 0.5955744473314666, 0.9772723300886038, 0.0, 0.39429600720504354, 0.0, 0.46043889671834104, 0.0, 0.030853719420868696, nan, 0.001069173508704686, 0.0, nan, 0.046157092366935694, 0.0, 0.35235224459037573, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | | 1.8622 | 42.0 | 840 | 2.2950 | 0.0857 | 0.1470 | 0.5461 | [0.4913800969050347, 0.4812997463766939, 0.9505381368926793, 0.4417351172128826, 0.27227900950187156, 0.5758524840465098, 0.8246946843046753, 0.0, 0.28934631315573683, 0.0, 0.06457821914751172, 0.0, 0.023750862323839557, 0.0, 0.0011094525356033405, 0.0, nan, 0.04595292627064958, 0.0, 0.08056667693255312, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 
nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.8383843080804814, 0.7785569826386153, 0.9781791950948273, 0.8723979220394615, 0.8858313817330211, 0.6802679044423703, 0.9789150836231454, 0.0, 0.375579269124751, 0.0, 0.4660761022750151, 0.0, 0.024063904143784322, nan, 0.0011095196788444856, 0.0, nan, 0.04974415783088041, 0.0, 0.4224351383356658, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | | 1.2804 | 43.0 | 860 | 2.2951 | 0.0849 | 0.1478 | 0.5377 | [0.48451599766768166, 0.48109465504526955, 0.949467190948824, 0.45194919436160397, 0.2604415174221296, 0.49844030171455667, 0.8417267812231743, 0.0, 0.28899516544132386, 0.0, 0.07183873428904149, 0.0, 0.04780876494023904, 0.0, 0.0005042864346949068, 0.0, nan, 0.04418214972155143, 0.0, 0.07652985991644139, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 
0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.8467118457656725, 0.7779371207942637, 0.9761353731477863, 0.866429749550639, 0.9076112412177986, 0.56566433742842, 0.9767655231470963, 0.0, 0.3934990857236429, 0.0, 0.48023622575666064, 0.0, 0.0491263105341987, nan, 0.0005043271267474935, 0.0, nan, 0.047291238065094686, 0.0, 0.5028528366885564, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | | 0.9332 | 44.0 | 880 | 2.2954 | 0.0855 | 0.1481 | 0.5415 | [0.48976540558400794, 0.4756783947143979, 0.9481545039411303, 0.4618954166858103, 0.27559142571544926, 0.5034935597610996, 0.8391252946181824, 0.0, 0.31266797442155014, 0.0, 0.06981689635173575, 0.0, 0.032793323514973, 0.0, 0.00171453929320639, 0.0, nan, 0.045068804344987855, 0.0, 0.07658063398579594, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, 
nan, nan, nan, nan, nan, nan, nan, nan] | [0.8390959597593324, 0.7749323667691015, 0.979787202067867, 0.868362927407115, 0.8716627634660421, 0.5883681037309856, 0.9737246814980515, 0.0, 0.4419693785649955, 0.0, 0.5089591302597141, 0.0, 0.03334997503744384, nan, 0.001714712230941478, 0.0, nan, 0.045524080814474864, 0.0, 0.47593928302293037, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | | 1.324 | 45.0 | 900 | 2.2862 | 0.0861 | 0.1485 | 0.5382 | [0.4806012442350636, 0.46105107975440246, 0.9500672396655557, 0.4599660155102145, 0.2762265512265512, 0.5208319873658519, 0.865308936539415, 0.0, 0.3128679841831633, 0.0, 0.06213835289671949, 0.0, 0.046143250688705235, 0.0, 0.00020171050508310473, 0.0, nan, 0.04695333092149465, 0.0, 0.07882262154117517, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 
nan, nan] | [0.8332430290483276, 0.7474929740235863, 0.9767665534549608, 0.888488542836342, 0.8966042154566745, 0.6100964489627743, 0.9760490029884134, 0.0, 0.4115826533118638, 0.0, 0.5296959935574793, 0.0, 0.046829755366949576, nan, 0.0002017308506989974, 0.0, nan, 0.04792424961755552, 0.0, 0.4618365809021423, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | | 1.4788 | 46.0 | 920 | 2.2889 | 0.0847 | 0.1507 | 0.5451 | [0.4934466469350711, 0.49289445845878427, 0.9496988084245702, 0.44836450261034844, 0.23945292272076518, 0.5718315301391036, 0.852962747914311, 0.0, 0.3047983204022455, 0.0, 0.05879457369952657, 0.0, 0.03952530404080031, 0.0, 8.068908478405583e-05, 0.0, nan, 0.03758519476642953, 0.0, 0.08181665453323626, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | 
[0.8018737465226111, 0.7954692301631077, 0.9737759610471581, 0.8809943937842162, 0.9204918032786885, 0.6844302615978406, 0.9755684101990528, 0.0, 0.4033459785486204, 0.0, 0.5450640896584121, 0.0, 0.04023964053919121, nan, 8.069234027959896e-05, 0.0, nan, 0.039853352323679904, 0.0, 0.4718484228657552, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | | 1.0849 | 47.0 | 940 | 2.3190 | 0.0847 | 0.1499 | 0.5375 | [0.48270265167103205, 0.4883099968500236, 0.9489863037726748, 0.453687342170703, 0.26604904256784684, 0.5300531090789863, 0.8582729222561327, 0.0, 0.3025795526473707, 0.0, 0.057744191168373524, 0.0, 0.061194895591647334, 0.0, 0.0, 0.0, nan, 0.0461902785576524, 0.0, 0.07613617021276596, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.8062831564986738, 0.785829853176792, 
0.9756695019686814, 0.871238015325576, 0.9045667447306791, 0.6185178645005592, 0.9755596721483372, 0.0, 0.40182855271417267, 0.0, 0.5630494597678009, 0.0, 0.06320519221168247, nan, 0.0, 0.0, nan, 0.04990241071899562, 0.0, 0.48153730218538054, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | | 1.6687 | 48.0 | 960 | 2.2997 | 0.0839 | 0.1493 | 0.5419 | [0.4884039519989931, 0.5023855202911851, 0.9487134477260426, 0.4387315758616788, 0.2578810853950519, 0.5168101008425782, 0.8481733943976995, 0.0, 0.30145030552941604, 0.0, 0.06553419599907698, 0.0, 0.04092146189735614, 0.0, 6.051315152493142e-05, 0.0, nan, 0.03923789388905668, 0.0, 0.08363174912213608, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.8159248237044705, 0.8095238095238095, 0.974647590995161, 0.8772267535362759, 
0.90807962529274, 0.602675680902769, 0.9741965362366963, 0.0, 0.4074124614502879, 0.0, 0.49553721226763303, 0.0, 0.04203694458312531, nan, 6.051925520969922e-05, 0.0, nan, 0.04111937542860157, 0.0, 0.5179244267413069, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | | 1.5107 | 49.0 | 980 | 2.3102 | 0.0833 | 0.1492 | 0.5368 | [0.48352497928524046, 0.48689002364782025, 0.9489120566960494, 0.44220181960314686, 0.2353657811850003, 0.5099299471402703, 0.8485805611101012, 0.0, 0.292445292371914, 0.0, 0.0640117658387975, 0.0, 0.062292844609085594, 0.0, 0.0001815211472136504, 0.0, nan, 0.040059637287969886, 0.0, 0.08392424840753396, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.8282109238532703, 0.777104509757571, 0.9738961858675723, 0.8748329035097461, 0.9275175644028103, 
0.5860809094960605, 0.9789500358260079, 0.0, 0.39157228241587294, 0.0, 0.4907053217904839, 0.0, 0.06380429355966051, nan, 0.00018155776562909766, 0.0, nan, 0.043229413936804344, 0.0, 0.5262138012703197, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | | 1.3671 | 50.0 | 1000 | 2.3118 | 0.0859 | 0.1493 | 0.5430 | [0.4898642376841085, 0.502026813829342, 0.9487341030299479, 0.44331193050176815, 0.28502594514455154, 0.5132976114794296, 0.8390207156308851, 0.0, 0.30530825819472024, 0.0, 0.06594624784212842, 0.0, 0.03397963180571876, 0.0, 0.0007459827819109256, 0.0, nan, 0.04554975143210437, 0.0, 0.07792795056021705, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.8215553632658342, 0.819071257846768, 0.9731147245348802, 0.8672811704363634, 0.9004683840749415, 0.594073476114797, 
0.9732440887086908, 0.0, 0.40956851614311834, 0.0, 0.5229850345614389, 0.0, 0.034648027958062905, nan, 0.0007464041475862904, 0.0, nan, 0.0476077438413251, 0.0, 0.5009150608246313, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
6466b54931d0072a3e3273414b442a92
apache-2.0
['part-of-speech', 'token-classification']
false
XLM-RoBERTa base Universal Dependencies v2.8 POS tagging: Spanish This model is part of our paper "Make the Best of Cross-lingual Transfer: Evidence from POS Tagging with over 100 Languages". Check the [Space](https://huggingface.co/spaces/wietsedv/xpos) for more details.
1c8f619f812603e7a44c9f5a88710d0d
apache-2.0
['part-of-speech', 'token-classification']
false
Usage ```python from transformers import AutoTokenizer, AutoModelForTokenClassification tokenizer = AutoTokenizer.from_pretrained("wietsedv/xlm-roberta-base-ft-udpos28-es") model = AutoModelForTokenClassification.from_pretrained("wietsedv/xlm-roberta-base-ft-udpos28-es") ```
e563f6ea5057c489101876ad4451826b
apache-2.0
['generated_from_trainer']
false
tiny-mlm-glue-cola-from-scratch-custom-tokenizer-expand-vocab-target-glue-mnli This model is a fine-tuned version of [muhtasham/tiny-mlm-glue-cola-from-scratch-custom-tokenizer-expand-vocab](https://huggingface.co/muhtasham/tiny-mlm-glue-cola-from-scratch-custom-tokenizer-expand-vocab) on the None dataset. It achieves the following results on the evaluation set: - Loss: 1.0276 - Accuracy: 0.4674
4f135d78adc7f1808c94e5f93c213ac5
apache-2.0
['generated_from_trainer']
false
Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 1.0986 | 0.04 | 500 | 1.0969 | 0.3718 | | 1.0962 | 0.08 | 1000 | 1.0945 | 0.3666 | | 1.0855 | 0.12 | 1500 | 1.0747 | 0.4085 | | 1.0653 | 0.16 | 2000 | 1.0666 | 0.4193 | | 1.0599 | 0.2 | 2500 | 1.0542 | 0.4371 | | 1.0522 | 0.24 | 3000 | 1.0414 | 0.4413 | | 1.0435 | 0.29 | 3500 | 1.0455 | 0.4278 | | 1.0359 | 0.33 | 4000 | 1.0366 | 0.4463 | | 1.0339 | 0.37 | 4500 | 1.0327 | 0.4582 | | 1.0275 | 0.41 | 5000 | 1.0276 | 0.4674 |
59094d0c7571f58686bdb188eb4386a6
mit
['generated_from_trainer', 'gpt2', 'generation']
false
Model Card for mpuig/job-experience This model is a fine-tuned version of [GPT-2](https://huggingface.co/gpt2) to generate fake job experience descriptions. While this may not have practical applications in the real world, it served as a valuable learning experience for understanding the process of fine-tuning a language model. Through this repository, I hope to share my insights and findings on the capabilities and limitations of GPT-2 in generating job experiences. The goal was to obtain a model where, starting with a sentence like &
0816105bd7a1492cf96834ba4d7e8f70
cc-by-4.0
['question generation']
false
Model Card of `lmqg/bart-base-squadshifts-amazon-qg` This model is fine-tuned version of [lmqg/bart-base-squad](https://huggingface.co/lmqg/bart-base-squad) for question generation task on the [lmqg/qg_squadshifts](https://huggingface.co/datasets/lmqg/qg_squadshifts) (dataset_name: amazon) via [`lmqg`](https://github.com/asahi417/lm-question-generation).
35e77bd0b44ebbfe72ca7a58f59dc42d
cc-by-4.0
['question generation']
false
Overview - **Language model:** [lmqg/bart-base-squad](https://huggingface.co/lmqg/bart-base-squad) - **Language:** en - **Training data:** [lmqg/qg_squadshifts](https://huggingface.co/datasets/lmqg/qg_squadshifts) (amazon) - **Online Demo:** [https://autoqg.net/](https://autoqg.net/) - **Repository:** [https://github.com/asahi417/lm-question-generation](https://github.com/asahi417/lm-question-generation) - **Paper:** [https://arxiv.org/abs/2210.03992](https://arxiv.org/abs/2210.03992)
f9004eaf75009342133e94abfe91e7e0
cc-by-4.0
['question generation']
false
Usage - With [`lmqg`](https://github.com/asahi417/lm-question-generation) ```python from lmqg import TransformersQG # initialize model model = TransformersQG(language="en", model="lmqg/bart-base-squadshifts-amazon-qg") # model prediction questions = model.generate_q(list_context="William Turner was an English painter who specialised in watercolour landscapes", list_answer="William Turner") ``` - With `transformers` ```python from transformers import pipeline pipe = pipeline("text2text-generation", "lmqg/bart-base-squadshifts-amazon-qg") output = pipe("<hl> Beyonce <hl> further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic, Cadillac Records.") ```
2750c9efb4fa31b6872dcd88d2f9ff32
cc-by-4.0
['question generation']
false
Evaluation - ***Metric (Question Generation)***: [raw metric file](https://huggingface.co/lmqg/bart-base-squadshifts-amazon-qg/raw/main/eval/metric.first.sentence.paragraph_answer.question.lmqg_qg_squadshifts.amazon.json) | | Score | Type | Dataset | |:-----------|--------:|:-------|:---------------------------------------------------------------------------| | BERTScore | 92.77 | amazon | [lmqg/qg_squadshifts](https://huggingface.co/datasets/lmqg/qg_squadshifts) | | Bleu_1 | 29.48 | amazon | [lmqg/qg_squadshifts](https://huggingface.co/datasets/lmqg/qg_squadshifts) | | Bleu_2 | 19.95 | amazon | [lmqg/qg_squadshifts](https://huggingface.co/datasets/lmqg/qg_squadshifts) | | Bleu_3 | 13.91 | amazon | [lmqg/qg_squadshifts](https://huggingface.co/datasets/lmqg/qg_squadshifts) | | Bleu_4 | 9.92 | amazon | [lmqg/qg_squadshifts](https://huggingface.co/datasets/lmqg/qg_squadshifts) | | METEOR | 22.78 | amazon | [lmqg/qg_squadshifts](https://huggingface.co/datasets/lmqg/qg_squadshifts) | | MoverScore | 63.25 | amazon | [lmqg/qg_squadshifts](https://huggingface.co/datasets/lmqg/qg_squadshifts) | | ROUGE_L | 27.94 | amazon | [lmqg/qg_squadshifts](https://huggingface.co/datasets/lmqg/qg_squadshifts) |
3f0f7fe5a2324aa6b714fb256a75529f
cc-by-4.0
['question generation']
false
Training hyperparameters The following hyperparameters were used during fine-tuning: - dataset_path: lmqg/qg_squadshifts - dataset_name: amazon - input_types: ['paragraph_answer'] - output_types: ['question'] - prefix_types: None - model: lmqg/bart-base-squad - max_length: 512 - max_length_output: 32 - epoch: 4 - batch: 8 - lr: 5e-05 - fp16: False - random_seed: 1 - gradient_accumulation_steps: 8 - label_smoothing: 0.15 The full configuration can be found at [fine-tuning config file](https://huggingface.co/lmqg/bart-base-squadshifts-amazon-qg/raw/main/trainer_config.json).
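The `label_smoothing: 0.15` setting above replaces the one-hot target with a smoothed distribution over the vocabulary. A minimal sketch of one common formulation (the true class keeps `1 - smoothing` and the rest is spread uniformly; the 5-class vocabulary and target index below are illustrative, not values from the training run):

```python
def smooth_targets(target_index: int, vocab_size: int, smoothing: float) -> list[float]:
    """Return a label-smoothed target distribution over the vocabulary.

    The true class gets 1 - smoothing; the remaining probability mass
    is spread uniformly over the other vocab_size - 1 classes.
    """
    off_value = smoothing / (vocab_size - 1)
    dist = [off_value] * vocab_size
    dist[target_index] = 1.0 - smoothing
    return dist

# Illustrative example: 5-class "vocabulary", smoothing 0.15 as in the config.
dist = smooth_targets(target_index=2, vocab_size=5, smoothing=0.15)
print(dist)  # true class holds 0.85, the other four classes share 0.15
```

Note that some implementations spread the smoothing mass over all classes including the true one; the exact variant depends on the training framework.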
dcb6a0fd00dfd91016416bce83b9a8f3
apache-2.0
['automatic-speech-recognition', 'generated_from_trainer', 'hf-asr-leaderboard', 'mozilla-foundation/common_voice_8_0', 'robust-speech-event']
false
XLS-R-1B - Hindi This model is a fine-tuned version of [facebook/wav2vec2-xls-r-1b](https://huggingface.co/facebook/wav2vec2-xls-r-1b) on the MOZILLA-FOUNDATION/COMMON_VOICE_8_0 - HI dataset. It achieves the following results on the evaluation set: - Loss: 0.6921 - Wer: 0.3547
865582e600b7cd582f655b66ae3d9051
apache-2.0
['automatic-speech-recognition', 'generated_from_trainer', 'hf-asr-leaderboard', 'mozilla-foundation/common_voice_8_0', 'robust-speech-event']
false
Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 8 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 32 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_steps: 1500 - num_epochs: 50.0 - mixed_precision_training: Native AMP
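Two of the settings above interact: with `gradient_accumulation_steps: 4` the effective batch size is 8 × 4 = 32, which matches `total_train_batch_size`. The linear scheduler with 1,500 warmup steps ramps the learning rate up and then decays it linearly to zero; a sketch of that curve (the total step count below is illustrative):

```python
def linear_schedule_lr(step: int, base_lr: float, warmup_steps: int, total_steps: int) -> float:
    """Linear warmup to base_lr over warmup_steps, then linear decay to zero."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))

base_lr, warmup, total = 5e-05, 1500, 9600
print(linear_schedule_lr(750, base_lr, warmup, total))   # halfway through warmup
print(linear_schedule_lr(9600, base_lr, warmup, total))  # fully decayed to zero
```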
f93bdbe00dfc34962fc4b80b2adbc2c7
apache-2.0
['automatic-speech-recognition', 'generated_from_trainer', 'hf-asr-leaderboard', 'mozilla-foundation/common_voice_8_0', 'robust-speech-event']
false
Training results | Training Loss | Epoch | Step | Validation Loss | Wer | |:-------------:|:-----:|:----:|:---------------:|:------:| | 2.0674 | 2.07 | 400 | 1.3411 | 0.8835 | | 1.324 | 4.15 | 800 | 0.9311 | 0.7142 | | 1.2023 | 6.22 | 1200 | 0.8060 | 0.6170 | | 1.1573 | 8.29 | 1600 | 0.7415 | 0.4972 | | 1.1117 | 10.36 | 2000 | 0.7248 | 0.4588 | | 1.0672 | 12.44 | 2400 | 0.6729 | 0.4350 | | 1.0336 | 14.51 | 2800 | 0.7117 | 0.4346 | | 1.0025 | 16.58 | 3200 | 0.7019 | 0.4272 | | 0.9578 | 18.65 | 3600 | 0.6792 | 0.4118 | | 0.9272 | 20.73 | 4000 | 0.6863 | 0.4156 | | 0.9321 | 22.8 | 4400 | 0.6535 | 0.3972 | | 0.8802 | 24.87 | 4800 | 0.6766 | 0.3906 | | 0.844 | 26.94 | 5200 | 0.6782 | 0.3949 | | 0.8387 | 29.02 | 5600 | 0.6916 | 0.3921 | | 0.8042 | 31.09 | 6000 | 0.6806 | 0.3797 | | 0.793 | 33.16 | 6400 | 0.7120 | 0.3831 | | 0.7567 | 35.23 | 6800 | 0.6862 | 0.3808 | | 0.7463 | 37.31 | 7200 | 0.6893 | 0.3709 | | 0.7053 | 39.38 | 7600 | 0.7096 | 0.3701 | | 0.6906 | 41.45 | 8000 | 0.6921 | 0.3676 | | 0.6891 | 43.52 | 8400 | 0.7167 | 0.3663 | | 0.658 | 45.6 | 8800 | 0.6833 | 0.3580 | | 0.6576 | 47.67 | 9200 | 0.6914 | 0.3569 | | 0.6358 | 49.74 | 9600 | 0.6922 | 0.3551 |
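The Wer column above is word error rate: the word-level Levenshtein distance between hypothesis and reference, divided by the reference length. A minimal self-contained implementation (the sentences below are illustrative):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance / reference word count."""
    ref, hyp = reference.split(), hypothesis.split()
    # Dynamic-programming edit distance over words, one row at a time.
    d = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        prev, d[0] = d[0], i
        for j, h in enumerate(hyp, 1):
            cur = min(d[j] + 1,          # delete a reference word
                      d[j - 1] + 1,      # insert a hypothesis word
                      prev + (r != h))   # substitution (free if words match)
            prev, d[j] = d[j], cur
    return d[len(hyp)] / len(ref)

# Two deletions against six reference words.
print(wer("the cat sat on the mat", "the cat sat mat"))
```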
31fa146a495567b2165556c820fd4a0a
apache-2.0
['automatic-speech-recognition', 'generated_from_trainer', 'hf-asr-leaderboard', 'mozilla-foundation/common_voice_8_0', 'robust-speech-event']
false
Evaluation Commands 1. To evaluate on `mozilla-foundation/common_voice_8_0` with split `test` ```bash python eval.py --model_id anuragshas/wav2vec2-xls-r-1b-hi-with-lm --dataset mozilla-foundation/common_voice_8_0 --config hi --split test ```
e55f07f07044078b262907b555664dbb
apache-2.0
['automatic-speech-recognition', 'generated_from_trainer', 'hf-asr-leaderboard', 'mozilla-foundation/common_voice_8_0', 'robust-speech-event']
false
Inference With LM ```python import torch from datasets import load_dataset from transformers import AutoModelForCTC, AutoProcessor import torchaudio.functional as F model_id = "anuragshas/wav2vec2-xls-r-1b-hi-with-lm" sample_iter = iter(load_dataset("mozilla-foundation/common_voice_8_0", "hi", split="test", streaming=True, use_auth_token=True)) sample = next(sample_iter) resampled_audio = F.resample(torch.tensor(sample["audio"]["array"]), 48_000, 16_000).numpy() model = AutoModelForCTC.from_pretrained(model_id) processor = AutoProcessor.from_pretrained(model_id) input_values = processor(resampled_audio, return_tensors="pt").input_values with torch.no_grad(): logits = model(input_values).logits transcription = processor.batch_decode(logits.numpy()).text ```
223c55581294120e4c2e843523d746a1
apache-2.0
['generated_from_trainer']
false
opus-mt-iir-en-finetuned-fa-to-en This model is a fine-tuned version of [Helsinki-NLP/opus-mt-iir-en](https://huggingface.co/Helsinki-NLP/opus-mt-iir-en) on the opus_infopankki dataset. It achieves the following results on the evaluation set: - Loss: 1.0968 - Bleu: 36.687 - Gen Len: 16.039
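The BLEU score reported above includes, under the standard definition, a brevity penalty that discounts hypotheses shorter than the reference. A sketch of that penalty term (the lengths below are illustrative, not from this evaluation):

```python
import math

def brevity_penalty(hyp_len: int, ref_len: int) -> float:
    """BLEU brevity penalty: 1 if the hypothesis corpus is at least as long
    as the reference, otherwise exp(1 - ref_len / hyp_len)."""
    if hyp_len >= ref_len:
        return 1.0
    return math.exp(1.0 - ref_len / hyp_len)

print(brevity_penalty(90, 100))   # short hypothesis is penalized (< 1)
print(brevity_penalty(110, 100))  # no penalty when longer than the reference
```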
31ae83961eff5e10caf158bc88385490
apache-2.0
['generated_from_trainer']
false
Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-06 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 30 - mixed_precision_training: Native AMP
0120492462b49db36b728f343e8be455
apache-2.0
['generated_from_trainer']
false
Training results | Training Loss | Epoch | Step | Validation Loss | Bleu | Gen Len | |:-------------:|:-----:|:-----:|:---------------:|:-------:|:-------:| | 3.1614 | 1.0 | 1509 | 2.8058 | 12.326 | 16.5467 | | 2.7235 | 2.0 | 3018 | 2.4178 | 15.6912 | 16.6396 | | 2.4839 | 3.0 | 4527 | 2.1905 | 18.1971 | 16.4884 | | 2.3044 | 4.0 | 6036 | 2.0272 | 20.197 | 16.4735 | | 2.1943 | 5.0 | 7545 | 1.9012 | 22.2265 | 16.4266 | | 2.0669 | 6.0 | 9054 | 1.7984 | 23.7711 | 16.353 | | 1.985 | 7.0 | 10563 | 1.7100 | 24.986 | 16.284 | | 1.9024 | 8.0 | 12072 | 1.6346 | 26.1758 | 16.217 | | 1.8484 | 9.0 | 13581 | 1.5692 | 27.2782 | 16.1924 | | 1.7761 | 10.0 | 15090 | 1.5111 | 28.2761 | 16.144 | | 1.733 | 11.0 | 16599 | 1.4599 | 29.2184 | 16.2438 | | 1.6772 | 12.0 | 18108 | 1.4150 | 30.0026 | 16.1949 | | 1.6297 | 13.0 | 19617 | 1.3743 | 30.7839 | 16.1565 | | 1.5918 | 14.0 | 21126 | 1.3370 | 31.4921 | 16.1323 | | 1.5548 | 15.0 | 22635 | 1.3038 | 32.0621 | 16.076 | | 1.5333 | 16.0 | 24144 | 1.2743 | 32.6881 | 16.0078 | | 1.5145 | 17.0 | 25653 | 1.2478 | 33.3794 | 16.1228 | | 1.4826 | 18.0 | 27162 | 1.2240 | 33.8335 | 16.0809 | | 1.4488 | 19.0 | 28671 | 1.2021 | 34.2819 | 16.0479 | | 1.4386 | 20.0 | 30180 | 1.1829 | 34.7206 | 16.0578 | | 1.4127 | 21.0 | 31689 | 1.1660 | 35.031 | 16.0717 | | 1.4089 | 22.0 | 33198 | 1.1510 | 35.4142 | 16.0391 | | 1.3922 | 23.0 | 34707 | 1.1380 | 35.6777 | 16.0461 | | 1.377 | 24.0 | 36216 | 1.1273 | 35.95 | 16.0569 | | 1.3598 | 25.0 | 37725 | 1.1175 | 36.2435 | 16.0426 | | 1.3515 | 26.0 | 39234 | 1.1097 | 36.4009 | 16.0247 | | 1.3441 | 27.0 | 40743 | 1.1042 | 36.4815 | 16.0447 | | 1.3412 | 28.0 | 42252 | 1.1001 | 36.6092 | 16.0489 | | 1.3527 | 29.0 | 43761 | 1.0976 | 36.6703 | 16.0383 | | 1.3397 | 30.0 | 45270 | 1.0968 | 36.687 | 16.039 |
e6ebdf146df05a1d62ae75079142e90b
creativeml-openrail-m
['stable-diffusion', 'stable-diffusion-diffusers', 'text-to-image', 'art', 'artistic', 'diffusers', 'midjourney', 'artificial', 'artificial-journey', 'journey', 'portrait', 'art', 'diffusion', 'photorealistic']
false
ARTificialJourney-1.0 768X768 This is an AI model trained on ~100 hand-picked 768X768 images targeted to get the best close-up portrait pictures, as well as amazing looking landscapes. This model does not handle full-body portraits well due to being trained on more close-up images, but it will be improved in the next update of this model. Please use the keywords **"artificial-journey style"** before or after the prompt. Also be sure to add more detailed text to the prompt to get the best results. Resolutions 1024x768px, 768x1024px, and 768x768px work best for this model. *.safetensors file was generated using the new sd-webui-model-converter.*
bead86daf4b1bf28170e6c96790a40a7
creativeml-openrail-m
['stable-diffusion', 'stable-diffusion-diffusers', 'text-to-image', 'art', 'artistic', 'diffusers', 'midjourney', 'artificial', 'artificial-journey', 'journey', 'portrait', 'art', 'diffusion', 'photorealistic']
false
CKPT & Safetensors Download [Download ARTificialJourneyV1-768px.ckpt (2.9GB)](https://huggingface.co/Kaludi/ARTificialJourney-v1.0-768/blob/main/ARTificialJourneyV1-768px.ckpt) [Download ARTificialJourneyV1-768px.safetensors (2.9GB)](https://huggingface.co/Kaludi/ARTificialJourney-v1.0-768/blob/main/ARTificialJourneyV1-768px.safetensors)
4c4590e7ab65d314692336c468d57aff
creativeml-openrail-m
['stable-diffusion', 'stable-diffusion-diffusers', 'text-to-image', 'art', 'artistic', 'diffusers', 'midjourney', 'artificial', 'artificial-journey', 'journey', 'portrait', 'art', 'diffusion', 'photorealistic']
false
🧨 Diffusers This model can be used just like any other Stable Diffusion model. For more information, please have a look at the [Stable Diffusion Pipeline](https://huggingface.co/docs/diffusers/api/pipelines/stable_diffusion). ```python from diffusers import StableDiffusionPipeline, DPMSolverMultistepScheduler import torch prompt = ( "artificial-journey style portrait of male dark magician, d & d, dark eyeliner, intricate, elegant, highly detailed, digital painting, artstation, concept art, matte, sharp focus, illustration") model_id = "Kaludi/ARTificialJourney-v1.0-768" pipe = StableDiffusionPipeline.from_pretrained(model_id, torch_dtype=torch.float16) pipe.scheduler = DPMSolverMultistepScheduler.from_config(pipe.scheduler.config) pipe = pipe.to("cuda") image = pipe(prompt, num_inference_steps=30).images[0] image.save("./result.jpg") ``` ![img](https://huggingface.co/Kaludi/ARTificialJourney-v1.0-768/resolve/main/Images2.jpg)
212002a6c3e36a85bffbe627a33d50f7
mit
['generated_from_keras_callback']
false
Amitesh007/text_generation-finetuned-gpt2 This model is a fine-tuned version of [gpt2](https://huggingface.co/gpt2) on an unknown dataset. It achieves the following results on the evaluation set: - Train Loss: 3.9088 - Validation Loss: 3.6320 - Epoch: 0
958401c4f312618efe98e9da1fe50467
mit
['generated_from_keras_callback']
false
Training hyperparameters The following hyperparameters were used during training: - optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 5e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01} - training_precision: float32
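AdamWeightDecay, as configured above, decouples weight decay from the gradient-based Adam update. A minimal single-scalar sketch of one such step under the listed hyperparameters (this is an illustration of the update rule, not the library's implementation):

```python
def adamw_step(param, grad, m, v, t, lr=5e-05, beta1=0.9, beta2=0.999,
               eps=1e-07, weight_decay=0.01):
    """One decoupled-weight-decay Adam step for a scalar parameter.

    Returns the updated (param, m, v) triple; t is the 1-based step count
    used for bias correction.
    """
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad * grad
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    # Decoupled weight decay: applied to the parameter directly, not the gradient.
    param = param - lr * (m_hat / (v_hat ** 0.5 + eps) + weight_decay * param)
    return param, m, v

p, m, v = 1.0, 0.0, 0.0
p, m, v = adamw_step(p, grad=0.5, m=m, v=v, t=1)
print(p)  # parameter moves against the gradient and is slightly decayed
```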
c658dd36091aa10ce750817032eba06e
mit
['generated_from_trainer']
false
xlm-roberta-base-finetuned-panx-de This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) on the xtreme dataset. It achieves the following results on the evaluation set: - Loss: 0.1374 - F1: 0.8627
19d2adba679500b5550ef04326d43ad6
mit
['generated_from_trainer']
false
Training results | Training Loss | Epoch | Step | Validation Loss | F1 | |:-------------:|:-----:|:----:|:---------------:|:------:| | 0.2596 | 1.0 | 525 | 0.1571 | 0.8302 | | 0.1292 | 2.0 | 1050 | 0.1416 | 0.8455 | | 0.0809 | 3.0 | 1575 | 0.1374 | 0.8627 |
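The F1 reported for this NER fine-tune is typically computed over exact-match entity spans rather than individual tokens. A sketch of span-level F1 (the entity spans below are illustrative):

```python
def span_f1(gold: set, pred: set) -> float:
    """F1 over exact-match entity spans, e.g. (type, start, end) tuples."""
    tp = len(gold & pred)
    if tp == 0:
        return 0.0
    precision = tp / len(pred)
    recall = tp / len(gold)
    return 2 * precision * recall / (precision + recall)

gold = {("PER", 0, 2), ("LOC", 5, 6), ("ORG", 9, 11)}
pred = {("PER", 0, 2), ("LOC", 5, 7)}  # one exact match, one boundary error
print(span_f1(gold, pred))  # precision 1/2, recall 1/3 -> F1 = 0.4
```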
5b696081de7655b8124ce5e071055500
apache-2.0
['generated_from_trainer']
false
small-mlm-glue-stsb-from-scratch-custom-tokenizer-expand-vocab This model is a fine-tuned version of [google/bert_uncased_L-4_H-512_A-8](https://huggingface.co/google/bert_uncased_L-4_H-512_A-8) on the None dataset. It achieves the following results on the evaluation set: - Loss: 5.8182
10483806e898c6f52b0a136cb3d694cc
apache-2.0
['generated_from_trainer']
false
Training results | Training Loss | Epoch | Step | Validation Loss | |:-------------:|:-----:|:----:|:---------------:| | 7.929 | 0.7 | 500 | 6.7432 | | 6.8164 | 1.39 | 1000 | 6.4671 | | 6.5278 | 2.09 | 1500 | 6.3719 | | 6.3088 | 2.78 | 2000 | 6.2202 | | 6.3032 | 3.48 | 2500 | 5.9957 | | 6.1976 | 4.17 | 3000 | 6.0049 | | 6.0579 | 4.87 | 3500 | 5.9357 | | 6.0549 | 5.56 | 4000 | 5.9458 | | 5.9356 | 6.26 | 4500 | 5.8563 | | 5.9506 | 6.95 | 5000 | 5.8182 |
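For a masked-LM objective like this one, the validation cross-entropy loss maps directly to perplexity via exp(loss). A quick sketch using the final validation loss above:

```python
import math

def perplexity(cross_entropy_loss: float) -> float:
    """Perplexity is the exponential of the mean cross-entropy (in nats)."""
    return math.exp(cross_entropy_loss)

print(perplexity(5.8182))  # final validation loss above -> roughly 336
```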
9d0d985e379cee67ceb6ff59ef0a1a35
apache-2.0
['argumentation']
false
Generate a chain of reasoning from one claim to another This model has the same model parameters as [`gpt-neo-2.7B`](https://huggingface.co/EleutherAI/gpt-neo-2.7B), but with an additional soft prompt which has been optimized on the task of generating a sequence of claims (a 'chain of reasoning') that joins one claim to another. It was trained as part of a University of Melbourne [research project](https://github.com/Hunt-Laboratory/language-model-optimization) evaluating how large language models can best be optimized to perform argumentative reasoning tasks. Code used for optimization and evaluation can be found in the project [GitHub repository](https://github.com/Hunt-Laboratory/language-model-optimization). A paper reporting on model evaluation is currently under review.
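Soft prompting, as described above, keeps the base model frozen and optimizes only a small matrix of virtual-token embeddings that is prepended to the input embeddings. A shape-level sketch of that concatenation (the dimensions are illustrative, not the gpt-neo-2.7B values):

```python
import numpy as np

n_virtual_tokens, d_model, seq_len = 20, 64, 10

# The only trainable parameters: one embedding vector per virtual token.
soft_prompt = np.random.randn(n_virtual_tokens, d_model)

# Embeddings the frozen model produced for the real input tokens.
token_embeddings = np.random.randn(seq_len, d_model)

# The model consumes the soft prompt followed by the real token embeddings.
model_input = np.concatenate([soft_prompt, token_embeddings], axis=0)
print(model_input.shape)  # (30, 64)
```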
9d28c8d16792526cdefefc5175a0b5b5
mit
['summarization']
false
Model description BART is a transformer encoder-decoder (seq2seq) model with a bidirectional (BERT-like) encoder and an autoregressive (GPT-like) decoder. BART is pre-trained by (1) corrupting text with an arbitrary noising function, and (2) learning a model to reconstruct the original text. BART is particularly effective when fine-tuned for text generation (e.g. summarization, translation) but also works well for comprehension tasks (e.g. text classification, question answering). This particular checkpoint has been fine-tuned on CNN Daily Mail, a large collection of text-summary pairs.
c0d481e2f80d7904ce8956f8a1859a01
mit
['summarization']
false
How to use Here is how to use this model with the [pipeline API](https://huggingface.co/transformers/main_classes/pipelines.html): ```python from transformers import pipeline summarizer = pipeline("summarization", model="ML-unipi/bart-large-tos") ARTICLE = """ New York (CNN)When Liana Barrientos was 23 years old, she got married in Westchester County, New York. A year later, she got married again in Westchester County, but to a different man and without divorcing her first husband. Only 18 days after that marriage, she got hitched yet again. Then, Barrientos declared "I do" five more times, sometimes only within two weeks of each other. In 2010, she married once more, this time in the Bronx. In an application for a marriage license, she stated it was her "first and only" marriage. Barrientos, now 39, is facing two criminal counts of "offering a false instrument for filing in the first degree," referring to her false statements on the 2010 marriage license application, according to court documents. Prosecutors said the marriages were part of an immigration scam. On Friday, she pleaded not guilty at State Supreme Court in the Bronx, according to her attorney, Christopher Wright, who declined to comment further. After leaving court, Barrientos was arrested and charged with theft of service and criminal trespass for allegedly sneaking into the New York subway through an emergency exit, said Detective Annette Markowski, a police spokeswoman. In total, Barrientos has been married 10 times, with nine of her marriages occurring between 1999 and 2002. All occurred either in Westchester County, Long Island, New Jersey or the Bronx. She is believed to still be married to four men, and at one time, she was married to eight men at once, prosecutors say. Prosecutors said the immigration scam involved some of her husbands, who filed for permanent residence status shortly after the marriages. Any divorces happened only after such filings were approved. 
It was unclear whether any of the men will be prosecuted. The case was referred to the Bronx District Attorney's Office by Immigration and Customs Enforcement and the Department of Homeland Security's Investigation Division. Seven of the men are from so-called "red-flagged" countries, including Egypt, Turkey, Georgia, Pakistan and Mali. Her eighth husband, Rashid Rajput, was deported in 2006 to his native Pakistan after an investigation by the Joint Terrorism Task Force. If convicted, Barrientos faces up to four years in prison. Her next court appearance is scheduled for May 18. """ print(summarizer(ARTICLE, max_length=130, min_length=30, do_sample=False)) >>> [{'summary_text': 'Liana Barrientos, 39, is charged with two counts of "offering a false instrument for filing in the first degree" In total, she has been married 10 times, with nine of her marriages occurring between 1999 and 2002. She is believed to still be married to four men.'}] ```
28b44997e9a179b49444a078812ca7c2
apache-2.0
['generated_from_trainer']
false
tiny-mlm-glue-rte-target-glue-mrpc This model is a fine-tuned version of [muhtasham/tiny-mlm-glue-rte](https://huggingface.co/muhtasham/tiny-mlm-glue-rte) on the None dataset. It achieves the following results on the evaluation set: - Loss: 1.0186 - Accuracy: 0.7328 - F1: 0.8143
afd589995420be9a5f715cc5d7544187
apache-2.0
['generated_from_trainer']
false
Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | |:-------------:|:-----:|:----:|:---------------:|:--------:|:------:| | 0.5841 | 4.35 | 500 | 0.5459 | 0.7279 | 0.8230 | | 0.4421 | 8.7 | 1000 | 0.5767 | 0.7426 | 0.8309 | | 0.2968 | 13.04 | 1500 | 0.6520 | 0.7402 | 0.8239 | | 0.185 | 17.39 | 2000 | 0.7858 | 0.7377 | 0.8231 | | 0.115 | 21.74 | 2500 | 1.0186 | 0.7328 | 0.8143 |
97c415127e29099dca79a8dcb759a7d0
apache-2.0
[]
false
distilbert-base-en-fr-zh-ja-vi-cased We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages. Our versions give exactly the same representations produced by the original model which preserves the original accuracy. For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
b5d84abe346794f096870aa82f9b6124
apache-2.0
[]
false
How to use ```python from transformers import AutoTokenizer, AutoModel tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-en-fr-zh-ja-vi-cased") model = AutoModel.from_pretrained("Geotrend/distilbert-base-en-fr-zh-ja-vi-cased") ``` To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
e639ee10b44bbdc1d206353ac0efa83f
cc-by-4.0
['translation', 'opus-mt-tc']
false
Model Details Neural machine translation model for translating from Italic languages (itc) to Baltic languages (bat). This model is part of the [OPUS-MT project](https://github.com/Helsinki-NLP/Opus-MT), an effort to make neural machine translation models widely available and accessible for many languages in the world. All models are originally trained using the amazing framework of [Marian NMT](https://marian-nmt.github.io/), an efficient NMT implementation written in pure C++. The models have been converted to PyTorch using the transformers library by Hugging Face. Training data is taken from [OPUS](https://opus.nlpl.eu/) and training pipelines use the procedures of [OPUS-MT-train](https://github.com/Helsinki-NLP/Opus-MT-train). **Model Description:** - **Developed by:** Language Technology Research Group at the University of Helsinki - **Model Type:** Translation (transformer-big) - **Release**: 2022-07-27 - **License:** CC-BY-4.0 - **Language(s):** - Source Language(s): cat fra glg ita por spa - Target Language(s): lav lit prg - Language Pair(s): cat-lav cat-lit fra-lav fra-lit glg-lav glg-lit ita-lav ita-lit por-lav por-lit spa-lit - Valid Target Language Labels: >>lav<< >>lit<< >>ltg<< >>ndf<< >>olt<< >>prg<< >>prg_Latn<< >>sgs<< >>svx<< >>sxl<< >>xcu<< >>xgl<< >>xsv<< >>xzm<< - **Original Model**: [opusTCv20210807_transformer-big_2022-07-27.zip](https://object.pouta.csc.fi/Tatoeba-MT-models/itc-bat/opusTCv20210807_transformer-big_2022-07-27.zip) - **Resources for more information:** - [OPUS-MT-train GitHub Repo](https://github.com/Helsinki-NLP/OPUS-MT-train) - More information about released models for this language pair: [OPUS-MT itc-bat README](https://github.com/Helsinki-NLP/Tatoeba-Challenge/tree/master/models/itc-bat/README.md) - [More information about MarianNMT models in the transformers library](https://huggingface.co/docs/transformers/model_doc/marian) - [Tatoeba Translation Challenge](https://github.com/Helsinki-NLP/Tatoeba-Challenge) This is a multilingual translation model with multiple target languages. A sentence-initial language token is required in the form of `>>id<<` (id = valid target language ID), e.g. `>>lav<<`.
1403ec9bd387c9cdc7503d21918e3a03
cc-by-4.0
['translation', 'opus-mt-tc']
false
How to Get Started With the Model A short example code: ```python from transformers import MarianMTModel, MarianTokenizer src_text = [ ">>lit<< Els gats són complexos individus.", ">>sgs<< No." ] model_name = "pytorch-models/opus-mt-tc-big-itc-bat" tokenizer = MarianTokenizer.from_pretrained(model_name) model = MarianMTModel.from_pretrained(model_name) translated = model.generate(**tokenizer(src_text, return_tensors="pt", padding=True)) for t in translated: print( tokenizer.decode(t, skip_special_tokens=True) )
a4235bb4d4bdc68e3cc20974d0e1422a
cc-by-4.0
['translation', 'opus-mt-tc']
false
no no no no no no no no no no no no no no no no no no no no no ``` You can also use OPUS-MT models with the transformers pipelines, for example: ```python from transformers import pipeline pipe = pipeline("translation", model="Helsinki-NLP/opus-mt-tc-big-itc-bat") print(pipe(">>lit<< Els gats són complexos individus."))
ccfeb58b4839fd580e1ca38dc42d984f
cc-by-4.0
['translation', 'opus-mt-tc']
false
Training - **Data**: opusTCv20210807 ([source](https://github.com/Helsinki-NLP/Tatoeba-Challenge)) - **Pre-processing**: SentencePiece (spm32k,spm32k) - **Model Type:** transformer-big - **Original MarianNMT Model**: [opusTCv20210807_transformer-big_2022-07-27.zip](https://object.pouta.csc.fi/Tatoeba-MT-models/itc-bat/opusTCv20210807_transformer-big_2022-07-27.zip) - **Training Scripts**: [GitHub Repo](https://github.com/Helsinki-NLP/OPUS-MT-train)
c13178cf381d84a3462291df9fdbe4f1
cc-by-4.0
['translation', 'opus-mt-tc']
false
Evaluation
* test set translations: [opusTCv20210807_transformer-big_2022-07-27.test.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/itc-bat/opusTCv20210807_transformer-big_2022-07-27.test.txt)
* test set scores: [opusTCv20210807_transformer-big_2022-07-27.eval.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/itc-bat/opusTCv20210807_transformer-big_2022-07-27.eval.txt)
* benchmark results: [benchmark_results.txt](benchmark_results.txt)
* benchmark output: [benchmark_translations.zip](benchmark_translations.zip)

| langpair | testset | chr-F | BLEU | #sent | #words |
2f93839372607bb93709951f9ca31423
cc-by-4.0
['translation', 'opus-mt-tc']
false
|----------|---------|-------|-------|-------|--------|
| ita-lit | tatoeba-test-v2021-08-07 | 0.67640 | 40.9 | 224 | 1321 |
| spa-lit | tatoeba-test-v2021-08-07 | 0.68805 | 45.9 | 454 | 2352 |
| cat-lav | flores101-devtest | 0.52215 | 21.9 | 1012 | 22092 |
| cat-lit | flores101-devtest | 0.52380 | 20.2 | 1012 | 20695 |
| fra-lav | flores101-devtest | 0.53390 | 23.0 | 1012 | 22092 |
| fra-lit | flores101-devtest | 0.53595 | 21.1 | 1012 | 20695 |
| glg-lav | flores101-devtest | 0.51043 | 20.7 | 1012 | 22092 |
| glg-lit | flores101-devtest | 0.51854 | 19.9 | 1012 | 20695 |
| ita-lav | flores101-devtest | 0.51065 | 19.6 | 1012 | 22092 |
| ita-lit | flores101-devtest | 0.51309 | 17.4 | 1012 | 20695 |
| por-lav | flores101-devtest | 0.53493 | 22.9 | 1012 | 22092 |
| por-lit | flores101-devtest | 0.53821 | 21.8 | 1012 | 20695 |
| spa-lav | flores101-devtest | 0.49290 | 17.4 | 1012 | 22092 |
| spa-lit | flores101-devtest | 0.49836 | 16.2 | 1012 | 20695 |
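The chr-F column above is a character n-gram F-score. As a simplified, pure-Python approximation of chrF2 (character n-grams up to order 6, recall weighted by β = 2, whitespace stripped), the sketch below shows the shape of the computation; the reported scores come from the standard evaluation tooling, not from this code.

```python
from collections import Counter

def ngram_counts(text, n):
    """Count character n-grams of order n in a string."""
    return Counter(text[i:i + n] for i in range(len(text) - n + 1))

def chrf(hyp, ref, max_n=6, beta=2.0):
    """Average F-score over character n-grams of order 1..max_n (chrF2 when beta=2)."""
    hyp, ref = hyp.replace(" ", ""), ref.replace(" ", "")
    f_scores = []
    for n in range(1, max_n + 1):
        h, r = ngram_counts(hyp, n), ngram_counts(ref, n)
        if not h or not r:
            continue
        match = sum((h & r).values())          # clipped n-gram matches
        prec, rec = match / sum(h.values()), match / sum(r.values())
        if prec + rec == 0:
            f_scores.append(0.0)
        else:
            f_scores.append((1 + beta**2) * prec * rec / (beta**2 * prec + rec))
    return sum(f_scores) / len(f_scores) if f_scores else 0.0

print(chrf("Katės yra gyvūnai", "Katės yra gyvūnai"))  # identical strings -> 1.0
```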
ec37d38f3da832f424b14028d50867cc
apache-2.0
['generated_from_trainer']
false
albert-base-v2-finetuned-wnli

This model is a fine-tuned version of [albert-base-v2](https://huggingface.co/albert-base-v2) on the glue dataset. It achieves the following results on the evaluation set:
- Loss: 0.6981
- Accuracy: 0.5634
f0fdae77a5dc6bd88efc255296145a51
apache-2.0
['generated_from_trainer']
false
Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 10 | 0.6954 | 0.4930 |
| No log | 2.0 | 20 | 0.6981 | 0.5634 |
| No log | 3.0 | 30 | 0.7036 | 0.4225 |
| No log | 4.0 | 40 | 0.7062 | 0.3944 |
| No log | 5.0 | 50 | 0.7035 | 0.4225 |
3aef8ac7bffecaff8b9ad452511e54fb
cc-by-4.0
[]
false
Model description

This is the T5-3B model for System 2 as described in our paper Just-DREAM-about-it: Figurative Language Understanding with DREAM-FLUTE, FigLang workshop @ EMNLP 2022 (arXiv link: https://arxiv.org/abs/2210.16407)

System 2: Jointly predicting the type of figurative language

Using the type of figurative language provided as part of the training set (Chakrabarty et al., 2022), one of our models jointly predicts the type of figurative language, together with the target label and explanation:
```
Input  <Premise> <Hypothesis>
Output <Figurative-Language-Type> <Label> <Explanation>
```
a015db6c1e404ff52544928476478338
cc-by-4.0
[]
false
How to use this model?

We provide a quick example of how you can try out System 2 in our paper with just a few lines of code:
```
>>> from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
>>> model = AutoModelForSeq2SeqLM.from_pretrained("allenai/System2_FigLang2022")
>>> tokenizer = AutoTokenizer.from_pretrained("t5-3b")
>>> input_string = "Premise: Yesterday two gangs were fighting just in front of my home. Hypothesis: Yesterday I saw two gangs fighting right in front of my house and it totally didn't make me scared at all. What is the type of figurative language involved? Is there a contradiction or entailment between the premise and hypothesis?"
>>> input_ids = tokenizer.encode(input_string, return_tensors="pt")
>>> output = model.generate(input_ids, max_length=200)
>>> tokenizer.batch_decode(output, skip_special_tokens=True)
['Answer : [Type] Sarcasm [Label] Contradiction. Explanation : Seeing two gangs of people fighting in public can be really dangerous and scary, so someone who claims that they were not scared at all is being sarcastic.']
```
7d96b1ec1c4ab2d3b61a626a3f4ba05c
cc-by-4.0
[]
false
Model details

This model is a fine-tuned version of [t5-3b](https://huggingface.co/t5-3b). It achieves the following results on the evaluation set:
- Loss: 0.6078
- Rouge1: 62.8674
- Rouge2: 45.0585
- Rougel: 57.5618
- Rougelsum: 57.5172
- Gen Len: 50.7558
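The Rouge1 figure above is a unigram-overlap F-score between the generated text and the reference. A minimal pure-Python sketch of that computation, as a stand-in for the actual scoring package used during evaluation (which also stems and normalizes tokens):

```python
from collections import Counter

def rouge1_f(candidate, reference):
    """Unigram-overlap ROUGE-1 F1 on whitespace tokens (simplified)."""
    c, r = Counter(candidate.split()), Counter(reference.split())
    overlap = sum((c & r).values())  # clipped unigram matches
    if overlap == 0:
        return 0.0
    prec = overlap / sum(c.values())
    rec = overlap / sum(r.values())
    return 2 * prec * rec / (prec + rec)

print(rouge1_f("the cat sat on the mat", "the cat is on the mat"))  # ≈ 0.833
```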
0b20df1f888ec170c18dbb7170a10b81
cc-by-4.0
[]
false
Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:---------:|:-------:|
| 0.8068 | 0.33 | 1000 | 0.7251 | 30.6353 | 25.0792 | 30.619 | 30.6274 | 19.0 |
| 0.7276 | 0.66 | 2000 | 0.6715 | 30.8651 | 26.1492 | 30.8543 | 30.8519 | 19.0 |
| 0.7063 | 1.0 | 3000 | 0.6338 | 31.0263 | 26.6749 | 31.0094 | 31.0098 | 19.0 |
| 0.4516 | 1.33 | 4000 | 0.6447 | 30.9942 | 26.5984 | 30.9834 | 30.9778 | 19.0 |
| 0.4538 | 1.66 | 5000 | 0.6183 | 31.0179 | 26.7012 | 31.005 | 31.0018 | 19.0 |
| 0.4373 | 1.99 | 6000 | 0.6078 | 31.0085 | 26.7116 | 30.9952 | 30.9894 | 19.0 |
| 0.2743 | 2.32 | 7000 | 0.6910 | 31.0051 | 26.7349 | 30.9975 | 30.9851 | 19.0 |
| 0.2819 | 2.65 | 8000 | 0.6831 | 31.0876 | 26.848 | 31.0766 | 31.0753 | 19.0 |
| 0.2849 | 2.99 | 9000 | 0.6673 | 30.9223 | 26.5899 | 30.9165 | 30.9073 | 19.0 |
ebb50cd39cbfc1c7093c63f19eaa383f
apache-2.0
['vision', 'image-classification']
false
Vision Transformer (base-sized model) Vision Transformer (ViT) model pre-trained on ImageNet-21k (14 million images, 21,843 classes) at resolution 224x224, and fine-tuned on ImageNet 2012 (1 million images, 1,000 classes) at resolution 384x384. It was introduced in the paper [An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale](https://arxiv.org/abs/2010.11929) by Dosovitskiy et al. and first released in [this repository](https://github.com/google-research/vision_transformer). However, the weights were converted from the [timm repository](https://github.com/rwightman/pytorch-image-models) by Ross Wightman, who already converted the weights from JAX to PyTorch. Credits go to him. Disclaimer: The team releasing ViT did not write a model card for this model so this model card has been written by the Hugging Face team.
9c218df9a2cdcd52be9e62f973405d2c
apache-2.0
['vision', 'image-classification']
false
Model description The Vision Transformer (ViT) is a transformer encoder model (BERT-like) pretrained on a large collection of images in a supervised fashion, namely ImageNet-21k, at a resolution of 224x224 pixels. Next, the model was fine-tuned on ImageNet (also referred to as ILSVRC2012), a dataset comprising 1 million images and 1,000 classes, at a higher resolution of 384x384. Images are presented to the model as a sequence of fixed-size patches (resolution 16x16), which are linearly embedded. One also adds a [CLS] token to the beginning of a sequence to use it for classification tasks. One also adds absolute position embeddings before feeding the sequence to the layers of the Transformer encoder. By pre-training the model, it learns an inner representation of images that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled images for instance, you can train a standard classifier by placing a linear layer on top of the pre-trained encoder. One typically places a linear layer on top of the [CLS] token, as the last hidden state of this token can be seen as a representation of an entire image.
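The patch arithmetic above is easy to check: an image of side S split into fixed-size patches of side P yields (S/P)² patches, plus one [CLS] token, as the transformer's input sequence. A quick sketch:

```python
def vit_seq_len(image_size, patch_size):
    """Number of tokens the transformer encoder sees: one per patch plus [CLS]."""
    patches_per_side = image_size // patch_size
    return patches_per_side ** 2 + 1

print(vit_seq_len(224, 16))  # 196 patches + [CLS] = 197 during pre-training
print(vit_seq_len(384, 16))  # 576 patches + [CLS] = 577 during fine-tuning
```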
48ebaf29914f848646f9fd20e4c13c9b
apache-2.0
['vision', 'image-classification']
false
How to use

Here is how to use this model to classify an image of the COCO 2017 dataset into one of the 1,000 ImageNet classes:
```python
from transformers import ViTFeatureExtractor, ViTForImageClassification
from PIL import Image
import requests

url = 'http://images.cocodataset.org/val2017/000000039769.jpg'
image = Image.open(requests.get(url, stream=True).raw)

feature_extractor = ViTFeatureExtractor.from_pretrained('google/vit-base-patch16-384')
model = ViTForImageClassification.from_pretrained('google/vit-base-patch16-384')

inputs = feature_extractor(images=image, return_tensors="pt")
outputs = model(**inputs)
logits = outputs.logits
b3ef7ebc40b15c118c648e5a01fea72d
apache-2.0
['vision', 'image-classification']
false
# model predicts one of the 1000 ImageNet classes
predicted_class_idx = logits.argmax(-1).item()
print("Predicted class:", model.config.id2label[predicted_class_idx])
```

Currently, both the feature extractor and model support PyTorch. Tensorflow and JAX/FLAX are coming soon, and the API of ViTFeatureExtractor might change.
f3fd4821a1d6a5fdb553cb341cd15b4f
apache-2.0
['vision', 'image-classification']
false
Preprocessing The exact details of preprocessing of images during training/validation can be found [here](https://github.com/google-research/vision_transformer/blob/master/vit_jax/input_pipeline.py). Images are resized/rescaled to the same resolution (224x224 during pre-training, 384x384 during fine-tuning) and normalized across the RGB channels with mean (0.5, 0.5, 0.5) and standard deviation (0.5, 0.5, 0.5).
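With mean and standard deviation both 0.5 per channel, the normalization step simply maps pixel values from [0, 1] to [-1, 1]. A minimal sketch of the per-channel transform:

```python
def normalize(pixel, mean=0.5, std=0.5):
    """Normalize a pixel value in [0, 1] as (x - mean) / std, giving [-1, 1]."""
    return (pixel - mean) / std

print(normalize(0.0), normalize(0.5), normalize(1.0))  # -1.0 0.0 1.0
```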
a09739b803340c9a2c48aa0a99c70595
mit
['recsys', 'pytorch', 'sentence_transformers']
false
Model Details

The goal of `paper-rec` is to recommend scientific papers for users to read next based on their preferences. This is a test model used to explore Hugging Face Hub capabilities and to identify the requirements for supporting the recommendation task in the ecosystem.
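A toy sketch of the underlying idea: embed the user's preferences and the candidate papers in a shared vector space and rank papers by cosine similarity. The 3-d vectors and titles below are made up for illustration; the actual model would derive embeddings with sentence_transformers.

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def recommend(user_vec, papers, k=2):
    """Return the k paper titles whose embeddings best match the user vector."""
    ranked = sorted(papers, key=lambda p: cosine(user_vec, p[1]), reverse=True)
    return [title for title, _ in ranked[:k]]

papers = [
    ("Attention Is All You Need", [0.9, 0.1, 0.0]),
    ("Deep Residual Learning",    [0.1, 0.9, 0.0]),
    ("BERT",                      [0.8, 0.2, 0.1]),
]
print(recommend([1.0, 0.0, 0.0], papers, k=2))  # ['Attention Is All You Need', 'BERT']
```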
be72267743e22a1fca08d327c6d66030