license: string (2–30 chars)
tags: string (2–513 chars)
is_nc: bool (1 class)
readme_section: string (201–597k chars)
hash: string (32 chars)
apache-2.0
['text generation', 'pytorch', 'causal-lm']
false
ReGPT-125M-200G This model was trained on GPT-Neo-125M with [Mengzi Retrieval LM](https://github.com/Langboat/mengzi-retrieval-lm). For more details, please refer to this [document](https://github.com/Langboat/mengzi-retrieval-lm/blob/main/README.md).
2a081a389e134b7945e0b38f7da4c884
apache-2.0
['text generation', 'pytorch', 'causal-lm']
false
How to use

You must use a forked version of transformers: https://github.com/Langboat/transformers

```python
from transformers import Re_gptForCausalLM

model = Re_gptForCausalLM.from_pretrained('Langboat/ReGPT-125M-200G')
```
8c8dc3c9e285e8432df5595010423ba1
creativeml-openrail-m
['text-to-image', 'stable-diffusion']
false
Galverse-Diffusion-wf-8888

Dreambooth model trained by jarvissan with [TheLastBen's fast-DreamBooth](https://colab.research.google.com/github/TheLastBen/fast-stable-diffusion/blob/main/fast-DreamBooth.ipynb) notebook.

Test the concept via the A1111 Colab: [fast-Colab-A1111](https://colab.research.google.com/github/TheLastBen/fast-stable-diffusion/blob/main/fast_stable_diffusion_AUTOMATIC1111.ipynb)

Sample pictures of this concept:

- Base model: Waifu-Diffusion
- Training data: 8888 Galverse PFPs (512x512) tagged with gal_{n}.png

Created by Jarvis (@jarvissan22) in collaboration with the Galverse team @galverseNFT
90dfecbb200e0b550157ce06f4ad8e7f
creativeml-openrail-m
['text-to-image', 'stable-diffusion']
false
Example images

1. Dragon gal breaving fire fullbody, dragon scales, wings, tail, short red hair, purple eyes, pose from above ![Test gal 1](https://huggingface.co/sd-dreambooth-library/galverse-diffusion-wf-8888/resolve/main/01483-1751575086-dragon20gal20breaving20fire2020full20body20dragon20scales20wings20tail20short20red20hair20purple20eyes20pose20from20above.png)
2. Vampire gal laughing, pink hair, black clothes, pail white skin, green eyes, heart lips, fullbody, high detail ![Test gal 2](https://huggingface.co/sd-dreambooth-library/galverse-diffusion-wf-8888/resolve/main/01939-1230204258-vampire20gal2020laughing20pink20hair20black20clothespail20white20skin20green20eyes20pink20heart20lips20fullbodyhigh20detail.png)
3. Gal working as a delivery girl, working, running, while holding a package, fullbody, wearing brown cap and work clothes, wide ![Test gal 3](https://huggingface.co/sd-dreambooth-library/galverse-diffusion-wf-8888/resolve/main/02141-1320348834-Gal20working20as20a20delivery20girl20workingrunning20while20holding20a20package2020fullbody20wearing20a20brown20cap20and20work20clothes20wide20a.png)
4. Gal fishing, holding a fishing rod, fishing, green hair, yellow eyes, in the style of galverse ![Test gal 4](https://huggingface.co/sd-dreambooth-library/galverse-diffusion-wf-8888/resolve/main/02233-2263146988-Gal20fishing20holding20a20fish20fishing20rod20fishing20green20hair20yellow20eyes20in20the20style20of20galverse.png)
4f44016e46d823794cc9e78c6799c18b
apache-2.0
['t5', 'contrastive learning', 'ranking', 'decoding', 'metric learning', 'pytorch', 'text generation', 'retrieval']
false
Method-2: Loading the model with HuggingFace APIs

```python
from transformers import T5Tokenizer, AutoModel

tokenizer = T5Tokenizer.from_pretrained("google/t5-v1_1-xl")
model = AutoModel.from_pretrained("kalpeshk2011/rankgen-t5-xl-pg19", trust_remote_code=True)
```
0861d7ee1adc06da1b98d78b1e9f1c3c
creativeml-openrail-m
['text-to-image', 'stable-diffusion']
false
Ayaka_DB

Dreambooth model trained by Falon with [TheLastBen's fast-DreamBooth](https://colab.research.google.com/github/TheLastBen/fast-stable-diffusion/blob/main/fast-DreamBooth.ipynb) notebook.

Test the concept via the A1111 Colab: [fast-Colab-A1111](https://colab.research.google.com/github/TheLastBen/fast-stable-diffusion/blob/main/fast_stable_diffusion_AUTOMATIC1111.ipynb)

Sample pictures of this concept:
e8a25f8c941ea30837a682bde3505f19
apache-2.0
['automatic-speech-recognition', 'fr']
false
exp_w2v2r_fr_vp-100k_gender_male-0_female-10_s469

Fine-tuned [facebook/wav2vec2-large-100k-voxpopuli](https://huggingface.co/facebook/wav2vec2-large-100k-voxpopuli) for speech recognition using the train split of [Common Voice 7.0 (fr)](https://huggingface.co/datasets/mozilla-foundation/common_voice_7_0). When using this model, make sure that your speech input is sampled at 16 kHz. This model was fine-tuned with the [HuggingSound](https://github.com/jonatasgrosman/huggingsound) tool.
54b20833297f0a427255890a70322eef
mit
['generated_from_trainer']
false
pegasus-base-qag-bg-finetuned-punctuation-bg

This model is a fine-tuned version of [rmihaylov/pegasus-base-qag-bg](https://huggingface.co/rmihaylov/pegasus-base-qag-bg) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.0318
1344b4ee15d0365524df5c4b8ec09772
mit
['generated_from_trainer']
false
Training results

| Training Loss | Epoch | Step  | Validation Loss |
|:-------------:|:-----:|:-----:|:---------------:|
| 0.0563        | 1.0   | 4063  | 0.0279          |
| 0.0301        | 2.0   | 8126  | 0.0260          |
| 0.0227        | 3.0   | 12189 | 0.0259          |
| 0.0178        | 4.0   | 16252 | 0.0281          |
| 0.0145        | 5.0   | 20315 | 0.0290          |
| 0.0122        | 6.0   | 24378 | 0.0300          |
| 0.0105        | 7.0   | 28441 | 0.0305          |
| 0.0095        | 8.0   | 32504 | 0.0318          |
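The validation loss bottoms out at epoch 3 (0.0259) and climbs afterwards, so the final epoch-8 checkpoint is not the best one by this metric. A minimal sketch of picking the best epoch, with the table values hard-coded (variable names are ours):

```python
# Validation loss per epoch, copied from the training-results table above.
val_loss = {1: 0.0279, 2: 0.0260, 3: 0.0259, 4: 0.0281,
            5: 0.0290, 6: 0.0300, 7: 0.0305, 8: 0.0318}

# The epoch whose checkpoint minimizes validation loss.
best_epoch = min(val_loss, key=val_loss.get)
print(best_epoch, val_loss[best_epoch])  # epoch 3, loss 0.0259
```

This is why trainers are often configured with `load_best_model_at_end` rather than keeping the last checkpoint.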
e0d26be9de4f1dc72a4b93ac00622bb7
apache-2.0
['automatic-speech-recognition', 'CTC', 'Attention', 'Transformer', 'pytorch', 'speechbrain']
false
CRDNN with CTC/Attention and RNNLM trained on LibriSpeech

This repository provides all the necessary tools to perform automatic speech recognition from an end-to-end system pretrained on LibriSpeech (EN) within SpeechBrain. For a better experience, we encourage you to learn more about [SpeechBrain](https://speechbrain.github.io). The performance of the model is the following:

| Release  | Test clean WER | Test other WER | GPUs        |
|:--------:|:--------------:|:--------------:|:-----------:|
| 05-03-21 | 2.90           | 8.51           | 1xV100 16GB |
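The WER figures above are the standard word-level edit distance divided by reference length. A self-contained sketch of the metric (a simplified stand-in, not SpeechBrain's implementation):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / reference word count."""
    ref, hyp = reference.split(), hypothesis.split()
    # DP table of edit distances between prefixes of ref and hyp.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,          # deletion
                          d[i][j - 1] + 1,          # insertion
                          d[i - 1][j - 1] + cost)   # substitution
    return d[-1][-1] / len(ref)

print(wer("the cat sat on the mat", "the cat sat on mat"))  # 1/6 ≈ 0.167 (one deletion)
```

A "Test clean WER" of 2.90 therefore means roughly 2.9 word edits per 100 reference words.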
9d19d6391183dae880278c3926d2966a
apache-2.0
['automatic-speech-recognition', 'CTC', 'Attention', 'Transformer', 'pytorch', 'speechbrain']
false
Pipeline description

This ASR system is composed of 3 different but linked blocks:
1. Tokenizer (unigram) that transforms words into subword units, trained on the train transcriptions of LibriSpeech.
2. Neural language model (Transformer LM) trained on the full 10M-word dataset.
3. Acoustic model (CRDNN + CTC/Attention). The CRDNN architecture is made of N blocks of convolutional neural networks with normalization and pooling on the frequency domain. A bidirectional LSTM with projection layers is then connected to a final DNN to obtain the acoustic representation that is given to the CTC and attention decoders.

The system is trained with recordings sampled at 16 kHz (single channel). The code will automatically normalize your audio (i.e., resampling + mono channel selection) when calling *transcribe_file*, if needed.
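For illustration only, the two normalization steps (mono mixdown and resampling) can be sketched in plain Python; in practice SpeechBrain delegates this to torchaudio, so treat these helpers as a toy model under our own names:

```python
def to_mono(channels):
    """Average multi-channel samples down to a single channel."""
    return [sum(frame) / len(frame) for frame in zip(*channels)]

def resample_linear(samples, src_rate, dst_rate):
    """Naive linear-interpolation resampler (illustration only, no filtering)."""
    n_out = int(len(samples) * dst_rate / src_rate)
    out = []
    for i in range(n_out):
        pos = i * src_rate / dst_rate       # position in the source signal
        lo = int(pos)
        hi = min(lo + 1, len(samples) - 1)
        frac = pos - lo
        out.append(samples[lo] * (1 - frac) + samples[hi] * frac)
    return out

stereo = [[0.0, 0.2, 0.4, 0.6], [0.0, 0.0, 0.0, 0.0]]   # two channels
mono = to_mono(stereo)                                   # -> [0.0, 0.1, 0.2, 0.3]
down = resample_linear(mono, src_rate=32000, dst_rate=16000)  # half the samples
```

Real resamplers apply an anti-aliasing filter before decimation; this sketch only shows why a 32 kHz input to a 16 kHz model must be converted at all.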
22d180044084f81f64273b5d7680e833
apache-2.0
['automatic-speech-recognition', 'CTC', 'Attention', 'Transformer', 'pytorch', 'speechbrain']
false
Transcribing your own audio files (in English)

```python
from speechbrain.pretrained import EncoderDecoderASR

asr_model = EncoderDecoderASR.from_hparams(source="speechbrain/asr-crdnn-transformerlm-librispeech", savedir="pretrained_models/asr-crdnn-transformerlm-librispeech")
asr_model.transcribe_file("speechbrain/asr-crdnn-transformerlm-librispeech/example.wav")
```
5c626750f95c75d51ee98941bf801b9c
apache-2.0
['automatic-speech-recognition', 'CTC', 'Attention', 'Transformer', 'pytorch', 'speechbrain']
false
Training

The model was trained with SpeechBrain (commit hash: 'eca313cc'). To train it from scratch, follow these steps:

1. Clone SpeechBrain:
```bash
git clone https://github.com/speechbrain/speechbrain/
```
2. Install it:
```bash
cd speechbrain
pip install -r requirements.txt
pip install -e .
```
3. Run training:
```bash
cd recipes/LibriSpeech/ASR/seq2seq
python train.py hparams/train_BPE_5000.yaml --data_folder=your_data_folder
```

You can find our training results (models, logs, etc.) [here](https://drive.google.com/drive/folders/1kSwdBT8kDhnmTLzrOPDL77LX_Eq-3Tzl?usp=sharing).
011d72c1eebef0420de65b4d5642fa89
apache-2.0
['image-classification', 'generated_from_trainer']
false
convnext_manuscript_iiif

This model is a fine-tuned version of [facebook/convnext-base-224-22k](https://huggingface.co/facebook/convnext-base-224-22k) on the davanstrien/iiif_manuscripts_label_ge_50 dataset. It achieves the following results on the evaluation set:
- Loss: 5.5856
- F1: 0.0037
2e1351883ae1fd2027fe1d8b2b3aa507
apache-2.0
['image-classification', 'generated_from_trainer']
false
Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 64
- eval_batch_size: 64
- seed: 1337
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30.0
- mixed_precision_training: Native AMP
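A linear scheduler with no warmup, as configured above, decays the learning rate from its base value to zero over the course of training. A minimal sketch (function name and step counts are ours, not from the training code):

```python
def linear_lr(step: int, total_steps: int, base_lr: float = 2e-4) -> float:
    """Linear decay from base_lr at step 0 to 0 at total_steps (no warmup)."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

print(linear_lr(0, 1000))    # 0.0002  (the configured learning_rate)
print(linear_lr(500, 1000))  # 0.0001  (halfway through, half the rate)
```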
a40a0216ebe801f29c7ec43f45d7aa0a
apache-2.0
['image-classification', 'generated_from_trainer']
false
Training results

| Training Loss | Epoch | Step  | Validation Loss | F1     |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 6.5753        | 1.0   | 2038  | 6.4121          | 0.0016 |
| 5.9865        | 2.0   | 4076  | 5.9466          | 0.0021 |
| 5.6521        | 3.0   | 6114  | 5.7645          | 0.0029 |
| 5.3123        | 4.0   | 8152  | 5.6890          | 0.0033 |
| 5.0337        | 5.0   | 10190 | 5.6692          | 0.0034 |
| 4.743         | 6.0   | 12228 | 5.5856          | 0.0037 |
| 4.4387        | 7.0   | 14266 | 5.5969          | 0.0042 |
| 4.1422        | 8.0   | 16304 | 5.6711          | 0.0043 |
| 3.8372        | 9.0   | 18342 | 5.6761          | 0.0044 |
| 3.5244        | 10.0  | 20380 | 5.8469          | 0.0042 |
| 3.2321        | 11.0  | 22418 | 5.8774          | 0.0045 |
| 2.9004        | 12.0  | 24456 | 6.1186          | 0.0047 |
| 2.5937        | 13.0  | 26494 | 6.2398          | 0.0046 |
| 2.2983        | 14.0  | 28532 | 6.3732          | 0.0049 |
| 2.0611        | 15.0  | 30570 | 6.5024          | 0.0045 |
| 1.8153        | 16.0  | 32608 | 6.6585          | 0.0047 |
| 1.6075        | 17.0  | 34646 | 6.8333          | 0.0043 |
| 1.4342        | 18.0  | 36684 | 6.9529          | 0.0044 |
| 1.2614        | 19.0  | 38722 | 7.1129          | 0.0046 |
| 1.1463        | 20.0  | 40760 | 7.1977          | 0.0039 |
| 1.0387        | 21.0  | 42798 | 7.2700          | 0.0044 |
| 0.9635        | 22.0  | 44836 | 7.3375          | 0.0040 |
| 0.8872        | 23.0  | 46874 | 7.4003          | 0.0039 |
| 0.8156        | 24.0  | 48912 | 7.4884          | 0.0039 |
| 0.7544        | 25.0  | 50950 | 7.4764          | 0.0039 |
| 0.6893        | 26.0  | 52988 | 7.5153          | 0.0042 |
| 0.6767        | 27.0  | 55026 | 7.5427          | 0.0043 |
| 0.6098        | 28.0  | 57064 | 7.5547          | 0.0042 |
| 0.5871        | 29.0  | 59102 | 7.5533          | 0.0041 |
| 0.5696        | 30.0  | 61140 | 7.5595          | 0.0041 |
0ec53aa0281b0a54853a2d0a5b67d9a9
mit
['generated_from_trainer']
false
deberta-v3-small-finetuned-Disaster-Tweets-Part1

This model is a fine-tuned version of [microsoft/deberta-v3-small](https://huggingface.co/microsoft/deberta-v3-small) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.4014
- Accuracy: 0.8564
- F1: 0.8557
688ca622b714186d03cb4db595a5e7f3
mit
['generated_from_trainer']
false
Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 2
- mixed_precision_training: Native AMP
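A cosine scheduler with a 0.1 warmup ratio, as configured above, ramps the learning rate up linearly over the first 10% of steps and then decays it along a half cosine to zero. A minimal sketch (function name and step counts are ours, not from the training code):

```python
import math

def cosine_lr_with_warmup(step: int, total_steps: int,
                          base_lr: float = 2e-5, warmup_ratio: float = 0.1) -> float:
    """Linear warmup to base_lr, then cosine decay to 0."""
    warmup_steps = int(total_steps * warmup_ratio)
    if step < warmup_steps:
        return base_lr * step / warmup_steps          # linear ramp-up
    progress = (step - warmup_steps) / (total_steps - warmup_steps)
    return base_lr * 0.5 * (1 + math.cos(math.pi * progress))  # cosine decay

print(cosine_lr_with_warmup(100, 1000))   # peak: base_lr at end of warmup
print(cosine_lr_with_warmup(1000, 1000))  # ~0 at the final step
```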
b8558af4cab2dc1558286583268ae9d6
mit
['generated_from_trainer']
false
Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1     |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| No log        | 1.0   | 203  | 0.3828          | 0.8415   | 0.8414 |
| No log        | 2.0   | 406  | 0.4014          | 0.8564   | 0.8557 |
5989cdf3a810d9d305b1ca54e85cfff4
apache-2.0
['generated_from_keras_callback']
false
evangeloc/t5-small-finetuned-xsum

This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on an unknown dataset. It achieves the following results on the evaluation set:
- Train Loss: 2.7203
- Validation Loss: 2.4006
- Train Rouge1: 28.1689
- Train Rouge2: 7.9798
- Train Rougel: 22.6998
- Train Rougelsum: 22.7228
- Train Gen Len: 18.865
- Epoch: 0
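The Rouge1 score above is a unigram-overlap F-measure between generated and reference summaries. A simplified sketch of the idea (real ROUGE implementations such as the `rouge_score` package add tokenization, stemming, and the Rouge2/RougeL variants):

```python
from collections import Counter

def rouge1_f(reference: str, candidate: str) -> float:
    """Simplified ROUGE-1 F1: unigram overlap between candidate and reference."""
    ref, cand = Counter(reference.split()), Counter(candidate.split())
    overlap = sum((ref & cand).values())   # clipped unigram matches
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

score = rouge1_f("the cat sat", "the cat ran")  # 2 of 3 unigrams match
```

Reported ROUGE values like 28.1689 are this fraction scaled to 0–100.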
488140d48f802ee988ec26293fc2faf4
apache-2.0
['generated_from_keras_callback']
false
Training results

| Train Loss | Validation Loss | Train Rouge1 | Train Rouge2 | Train Rougel | Train Rougelsum | Train Gen Len | Epoch |
|:----------:|:---------------:|:------------:|:------------:|:------------:|:---------------:|:-------------:|:-----:|
| 2.7203     | 2.4006          | 28.1689      | 7.9798       | 22.6998      | 22.7228         | 18.865        | 0     |
36776201233d2abd364697dc1ab2c9f9
cc-by-4.0
[]
false
Model description

This is the T5-3B model for the "classify" component of System 4's "Classify then explain" pipeline, as described in our paper Just-DREAM-about-it: Figurative Language Understanding with DREAM-FLUTE, FigLang workshop @ EMNLP 2022 (Arxiv link: https://arxiv.org/abs/2210.16407).

System 4: Two-step System - Classify then explain

In contrast to Systems 1 to 3, where the entailment/contradiction label and associated explanation are predicted jointly, System 4 uses a two-step "classify then explain" pipeline. This current model is for the "classify" component of the pipeline. The input-output format is:

```
Input: <Premise> <Hypothesis>
Output: <Label>
```
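Given that format, assembling the model input is plain string formatting. A sketch using the exact prompt template from the usage example in this card (the helper name is ours):

```python
def build_classify_input(premise: str, hypothesis: str) -> str:
    """Assemble the <Premise> <Hypothesis> input for the 'classify' component,
    using the prompt template shown in this card's usage example."""
    return (f"Premise: {premise} Hypothesis: {hypothesis} "
            "Is there a contradiction or entailment between the premise and hypothesis? "
            "Answer : ")

prompt = build_classify_input(
    "After releasing his rage he was like a ferocious wolf.",
    "After letting off his rage he sat down like a lamb.")
```

The model then generates the `<Label>` ('Contradiction' or 'Entailment') as free text.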
908c9d605218ef36a81317d1008a636a
cc-by-4.0
[]
false
How to use this model?

We provide a quick example of how you can try out the "classify" component of System 4 in our paper with just a few lines of code:

```python
>>> from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
>>> model = AutoModelForSeq2SeqLM.from_pretrained("allenai/System4_classify_FigLang2022")
>>> tokenizer = AutoTokenizer.from_pretrained("t5-3b")
>>> input_string = "Premise: After releasing his rage he was like a ferocious wolf. Hypothesis: After letting off his rage he sat down like a lamb. Is there a contradiction or entailment between the premise and hypothesis? Answer : "
>>> input_ids = tokenizer.encode(input_string, return_tensors="pt")
>>> output = model.generate(input_ids, max_length=200)
>>> tokenizer.batch_decode(output, skip_special_tokens=True)
['Contradiction']
```
039522f469a786c04b93873b4a5e983d
cc-by-4.0
[]
false
Model details

This model is a fine-tuned version of [t5-3b](https://huggingface.co/t5-3b). It achieves the following results on the evaluation set:
- Loss: 0.0604
- Rouge1: 95.0232
- Rouge2: 0.0
- Rougel: 95.0232
- Rougelsum: 95.0232
- Gen Len: 3.4074
51d29be14168ef13ed1e1c9d012fd6f5
cc-by-4.0
[]
false
Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1  | Rouge2 | Rougel  | Rougelsum | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:------:|:-------:|:---------:|:-------:|
| 0.1221        | 0.33  | 1000 | 0.1460          | 91.7717 | 0.0    | 91.9044 | 91.8381   | 3.4751  |
| 0.0957        | 0.66  | 2000 | 0.0904          | 93.6297 | 0.0    | 93.6961 | 93.6961   | 3.3955  |
| 0.0721        | 1.0   | 3000 | 0.0720          | 94.8905 | 0.0    | 94.9569 | 94.8905   | 3.4061  |
| 0.0413        | 1.33  | 4000 | 0.0786          | 94.5587 | 0.0    | 94.5587 | 94.5587   | 3.4346  |
| 0.042         | 1.66  | 5000 | 0.0604          | 95.0232 | 0.0    | 95.0232 | 95.0232   | 3.4074  |
| 0.0413        | 1.99  | 6000 | 0.0737          | 95.2223 | 0.0    | 95.2223 | 95.2223   | 3.4413  |
| 0.0198        | 2.32  | 7000 | 0.1045          | 95.0896 | 0.0    | 95.1559 | 95.1559   | 3.4101  |
| 0.0253        | 2.65  | 8000 | 0.0836          | 95.2887 | 0.0    | 95.2887 | 95.2887   | 3.4393  |
| 0.0198        | 2.99  | 9000 | 0.0922          | 94.7578 | 0.0    | 94.7578 | 94.7578   | 3.4180  |
a800432b40bb42a230d58ad0ebe81a50
apache-2.0
['generated_from_trainer']
false
long-t5-local-base-finetuned

This model is a fine-tuned version of [google/long-t5-local-base](https://huggingface.co/google/long-t5-local-base) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 9.2722
- Rouge1: 3.8848
- Rouge2: 0.5914
- Rougel: 3.5038
- Rougelsum: 3.7022
- Gen Len: 19.0
57e3d32d9d17682808b2f202a6558094
apache-2.0
['generated_from_trainer']
false
Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-06
- train_batch_size: 3
- eval_batch_size: 3
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50000
5423def48af084a29b26d704fbbc53d1
apache-2.0
['generated_from_trainer']
false
Training results | Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len | |:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|:------:|:---------:|:-------:| | No log | 0.16 | 100 | 342.4395 | 0.0 | 0.0 | 0.0 | 0.0 | 19.0 | | No log | 0.31 | 200 | 323.6985 | 0.0 | 0.0 | 0.0 | 0.0 | 19.0 | | No log | 0.47 | 300 | 303.8767 | 0.0 | 0.0 | 0.0 | 0.0 | 19.0 | | No log | 0.62 | 400 | 284.7559 | 0.0 | 0.0 | 0.0 | 0.0 | 19.0 | | 295.8376 | 0.78 | 500 | 263.0420 | 0.0 | 0.0 | 0.0 | 0.0 | 19.0 | | 295.8376 | 0.93 | 600 | 243.2220 | 0.0242 | 0.0 | 0.0223 | 0.0242 | 19.0 | | 295.8376 | 1.09 | 700 | 224.4514 | 0.0493 | 0.0 | 0.0507 | 0.0513 | 19.0 | | 295.8376 | 1.24 | 800 | 203.9065 | 0.0656 | 0.0 | 0.0634 | 0.0658 | 19.0 | | 295.8376 | 1.4 | 900 | 184.8686 | 0.0609 | 0.0 | 0.058 | 0.0616 | 19.0 | | 199.938 | 1.55 | 1000 | 167.5315 | 0.0638 | 0.0 | 0.0626 | 0.063 | 19.0 | | 199.938 | 1.71 | 1100 | 151.2369 | 0.0421 | 0.0 | 0.0411 | 0.0413 | 19.0 | | 199.938 | 1.86 | 1200 | 137.2366 | 0.0358 | 0.0 | 0.0346 | 0.0342 | 19.0 | | 199.938 | 2.02 | 1300 | 125.3076 | 0.0173 | 0.0 | 0.0157 | 0.0157 | 19.0 | | 199.938 | 2.17 | 1400 | 114.5600 | 0.0173 | 0.0 | 0.0157 | 0.0157 | 19.0 | | 136.1309 | 2.33 | 1500 | 105.9237 | 0.0361 | 0.0 | 0.0344 | 0.0363 | 19.0 | | 136.1309 | 2.48 | 1600 | 97.4123 | 0.0526 | 0.0 | 0.051 | 0.054 | 19.0 | | 136.1309 | 2.64 | 1700 | 89.0873 | 0.0427 | 0.0 | 0.0407 | 0.0418 | 19.0 | | 136.1309 | 2.79 | 1800 | 82.0562 | 0.0496 | 0.0 | 0.0462 | 0.0462 | 19.0 | | 136.1309 | 2.95 | 1900 | 76.2360 | 0.0361 | 0.0 | 0.0345 | 0.0363 | 19.0 | | 99.2229 | 3.1 | 2000 | 70.0604 | 0.0438 | 0.0 | 0.0425 | 0.0439 | 19.0 | | 99.2229 | 3.26 | 2100 | 65.1038 | 0.0454 | 0.0 | 0.0441 | 0.0447 | 19.0 | | 99.2229 | 3.41 | 2200 | 59.1831 | 0.0344 | 0.0 | 0.0318 | 0.0318 | 19.0 | | 99.2229 | 3.57 | 2300 | 53.0313 | 0.0471 | 0.0 | 0.0448 | 0.0454 | 19.0 | | 99.2229 | 3.72 | 2400 | 48.2110 | 0.0369 | 0.0 | 0.0369 | 0.0369 | 
19.0 | | 73.4208 | 3.88 | 2500 | 44.2004 | 0.0425 | 0.0 | 0.0427 | 0.044 | 19.0 | | 73.4208 | 4.03 | 2600 | 40.1925 | 0.0632 | 0.0 | 0.0619 | 0.0612 | 19.0 | | 73.4208 | 4.19 | 2700 | 36.3698 | 0.0887 | 0.0 | 0.0873 | 0.086 | 19.0 | | 73.4208 | 4.34 | 2800 | 33.2154 | 0.164 | 0.0 | 0.1652 | 0.1705 | 19.0 | | 73.4208 | 4.5 | 2900 | 30.9366 | 0.1106 | 0.0 | 0.1138 | 0.1144 | 19.0 | | 55.6661 | 4.65 | 3000 | 28.5672 | 0.1289 | 0.0 | 0.1295 | 0.131 | 19.0 | | 55.6661 | 4.81 | 3100 | 27.0910 | 0.2501 | 0.0 | 0.2514 | 0.2527 | 19.0 | | 55.6661 | 4.96 | 3200 | 25.6666 | 0.318 | 0.0 | 0.3322 | 0.3203 | 19.0 | | 55.6661 | 5.12 | 3300 | 24.6176 | 0.6319 | 0.0 | 0.6419 | 0.6299 | 19.0 | | 55.6661 | 5.27 | 3400 | 23.6474 | 1.6632 | 0.0033 | 1.665 | 1.6244 | 19.0 | | 45.1105 | 5.43 | 3500 | 22.7063 | 3.1374 | 0.0 | 3.1331 | 3.1333 | 19.0 | | 45.1105 | 5.58 | 3600 | 21.9191 | 5.0757 | 0.0 | 5.0694 | 5.0456 | 19.0 | | 45.1105 | 5.74 | 3700 | 21.3359 | 5.6576 | 0.0 | 5.689 | 5.6772 | 19.0 | | 45.1105 | 5.89 | 3800 | 20.6990 | 5.828 | 0.0 | 5.8801 | 5.8688 | 19.0 | | 45.1105 | 6.05 | 3900 | 20.1800 | 6.3727 | 0.0 | 6.3801 | 6.3716 | 19.0 | | 39.6923 | 6.2 | 4000 | 19.7415 | 6.2209 | 0.0 | 6.2347 | 6.2368 | 19.0 | | 39.6923 | 6.36 | 4100 | 19.2800 | 5.7215 | 0.0 | 5.7452 | 5.7295 | 19.0 | | 39.6923 | 6.51 | 4200 | 18.9683 | 6.1018 | 0.0062 | 6.1 | 6.0935 | 19.0 | | 39.6923 | 6.67 | 4300 | 18.5776 | 6.0354 | 0.0062 | 6.0227 | 6.0103 | 19.0 | | 39.6923 | 6.82 | 4400 | 18.2629 | 5.4438 | 0.0062 | 5.441 | 5.4629 | 19.0 | | 36.1688 | 6.98 | 4500 | 18.0268 | 5.3214 | 0.0091 | 5.3093 | 5.2992 | 19.0 | | 36.1688 | 7.13 | 4600 | 17.7740 | 5.2223 | 0.0123 | 5.2132 | 5.2084 | 19.0 | | 36.1688 | 7.29 | 4700 | 17.5345 | 5.178 | 0.0231 | 5.1615 | 5.1243 | 19.0 | | 36.1688 | 7.44 | 4800 | 17.3846 | 5.3899 | 0.0277 | 5.3414 | 5.3534 | 19.0 | | 36.1688 | 7.6 | 4900 | 17.1999 | 5.315 | 0.0272 | 5.2572 | 5.2477 | 19.0 | | 33.5745 | 7.75 | 5000 | 17.0078 | 5.9014 | 0.028 | 5.8181 | 5.8058 | 19.0 | | 
33.5745 | 7.91 | 5100 | 16.6418 | 5.7546 | 0.0242 | 5.6903 | 5.6746 | 19.0 | | 33.5745 | 8.06 | 5200 | 16.6330 | 6.6893 | 0.0182 | 6.6354 | 6.6178 | 19.0 | | 33.5745 | 8.22 | 5300 | 16.3423 | 6.1679 | 0.0072 | 6.1518 | 6.128 | 19.0 | | 33.5745 | 8.37 | 5400 | 16.2373 | 6.7659 | 0.0139 | 6.7271 | 6.7076 | 19.0 | | 31.9486 | 8.53 | 5500 | 16.1523 | 7.1991 | 0.0139 | 7.1674 | 7.1283 | 19.0 | | 31.9486 | 8.68 | 5600 | 16.0607 | 7.7042 | 0.0169 | 7.6741 | 7.6537 | 19.0 | | 31.9486 | 8.84 | 5700 | 15.7647 | 7.1238 | 0.02 | 7.1113 | 7.0586 | 19.0 | | 31.9486 | 8.99 | 5800 | 15.6194 | 7.3055 | 0.0116 | 7.3311 | 7.2683 | 19.0 | | 31.9486 | 9.15 | 5900 | 15.4994 | 7.3365 | 0.0139 | 7.3026 | 7.2708 | 19.0 | | 30.5224 | 9.3 | 6000 | 15.4207 | 8.1959 | 0.0116 | 8.1917 | 8.1651 | 19.0 | | 30.5224 | 9.46 | 6100 | 15.2981 | 7.7936 | 0.0144 | 7.7826 | 7.7488 | 19.0 | | 30.5224 | 9.61 | 6200 | 15.2391 | 7.95 | 0.0144 | 7.9371 | 7.895 | 19.0 | | 30.5224 | 9.77 | 6300 | 15.0941 | 7.1669 | 0.0144 | 7.146 | 7.1251 | 19.0 | | 30.5224 | 9.92 | 6400 | 14.9979 | 6.2157 | 0.0076 | 6.2086 | 6.1774 | 19.0 | | 29.1236 | 10.08 | 6500 | 14.9523 | 7.4422 | 0.0137 | 7.3929 | 7.393 | 19.0 | | 29.1236 | 10.23 | 6600 | 14.9515 | 7.2375 | 0.0137 | 7.1728 | 7.1779 | 19.0 | | 29.1236 | 10.39 | 6700 | 14.8874 | 7.5071 | 0.0068 | 7.4544 | 7.4739 | 19.0 | | 29.1236 | 10.54 | 6800 | 14.8057 | 5.9608 | 0.0169 | 5.8754 | 5.8691 | 19.0 | | 29.1236 | 10.7 | 6900 | 14.6818 | 5.6345 | 0.021 | 5.5422 | 5.5331 | 19.0 | | 28.314 | 10.85 | 7000 | 14.5409 | 5.5799 | 0.0169 | 5.4915 | 5.4833 | 19.0 | | 28.314 | 11.01 | 7100 | 14.4512 | 4.3498 | 0.0368 | 4.2243 | 4.2193 | 19.0 | | 28.314 | 11.16 | 7200 | 14.4560 | 4.0453 | 0.0372 | 3.9481 | 3.9228 | 19.0 | | 28.314 | 11.32 | 7300 | 14.3851 | 5.1332 | 0.0426 | 5.0186 | 4.9882 | 19.0 | | 28.314 | 11.47 | 7400 | 14.2265 | 4.8944 | 0.0371 | 4.7869 | 4.7765 | 19.0 | | 27.5349 | 11.63 | 7500 | 14.1214 | 3.8846 | 0.0335 | 3.7882 | 3.7677 | 19.0 | | 27.5349 | 11.78 | 7600 | 
14.1505 | 3.9992 | 0.0514 | 3.883 | 3.8385 | 19.0 | | 27.5349 | 11.94 | 7700 | 13.9923 | 3.4526 | 0.0664 | 3.325 | 3.3258 | 19.0 | | 27.5349 | 12.09 | 7800 | 14.0299 | 2.3086 | 0.0346 | 2.25 | 2.219 | 19.0 | | 27.5349 | 12.25 | 7900 | 13.9814 | 2.4402 | 0.0628 | 2.3282 | 2.3004 | 19.0 | | 26.4286 | 12.4 | 8000 | 13.8561 | 2.9869 | 0.0654 | 2.8769 | 2.8485 | 19.0 | | 26.4286 | 12.56 | 8100 | 13.8259 | 1.9609 | 0.0386 | 1.8863 | 1.8846 | 19.0 | | 26.4286 | 12.71 | 8200 | 13.8127 | 2.0628 | 0.0355 | 1.9915 | 1.9738 | 19.0 | | 26.4286 | 12.87 | 8300 | 13.7174 | 1.9904 | 0.081 | 1.888 | 1.9069 | 19.0 | | 26.4286 | 13.02 | 8400 | 13.6308 | 2.1398 | 0.1055 | 2.0204 | 2.0468 | 19.0 | | 26.108 | 13.18 | 8500 | 13.6490 | 1.8934 | 0.0788 | 1.7942 | 1.8188 | 19.0 | | 26.108 | 13.33 | 8600 | 13.5996 | 1.8746 | 0.0901 | 1.7441 | 1.8006 | 19.0 | | 26.108 | 13.49 | 8700 | 13.5394 | 1.7846 | 0.0895 | 1.6648 | 1.7331 | 19.0 | | 26.108 | 13.64 | 8800 | 13.5368 | 2.1345 | 0.1287 | 1.9808 | 2.0814 | 19.0 | | 26.108 | 13.8 | 8900 | 13.4793 | 2.5234 | 0.1611 | 2.3289 | 2.4292 | 19.0 | | 25.4931 | 13.95 | 9000 | 13.3633 | 2.8056 | 0.1953 | 2.5619 | 2.7088 | 19.0 | | 25.4931 | 14.11 | 9100 | 13.5182 | 3.087 | 0.2192 | 2.8182 | 2.9928 | 19.0 | | 25.4931 | 14.26 | 9200 | 13.3372 | 2.6353 | 0.175 | 2.4145 | 2.589 | 19.0 | | 25.4931 | 14.42 | 9300 | 13.2822 | 2.7577 | 0.1905 | 2.5277 | 2.7215 | 19.0 | | 25.4931 | 14.57 | 9400 | 13.2011 | 3.1891 | 0.2381 | 2.9276 | 3.142 | 19.0 | | 24.9241 | 14.73 | 9500 | 13.2201 | 2.609 | 0.1683 | 2.4162 | 2.5905 | 19.0 | | 24.9241 | 14.88 | 9600 | 13.2206 | 3.1083 | 0.2241 | 2.8627 | 3.0606 | 19.0 | | 24.9241 | 15.04 | 9700 | 13.2157 | 3.6233 | 0.2731 | 3.338 | 3.5642 | 19.0 | | 24.9241 | 15.19 | 9800 | 13.1195 | 3.1785 | 0.2318 | 2.9449 | 3.1306 | 19.0 | | 24.9241 | 15.35 | 9900 | 13.0481 | 3.0249 | 0.2192 | 2.7991 | 2.9925 | 19.0 | | 24.4511 | 15.5 | 10000 | 13.0693 | 3.1189 | 0.2287 | 2.8726 | 3.0669 | 19.0 | | 24.4511 | 15.66 | 10100 | 12.9204 | 2.6405 | 
0.1899 | 2.4337 | 2.61 | 19.0 | | 24.4511 | 15.81 | 10200 | 12.9200 | 2.9037 | 0.2148 | 2.6775 | 2.8683 | 19.0 | | 24.4511 | 15.97 | 10300 | 12.9203 | 2.8847 | 0.2034 | 2.6586 | 2.8438 | 19.0 | | 24.4511 | 16.12 | 10400 | 12.8723 | 2.8195 | 0.1976 | 2.5922 | 2.7803 | 19.0 | | 23.8949 | 16.28 | 10500 | 12.9749 | 3.2658 | 0.2217 | 2.9905 | 3.2262 | 19.0 | | 23.8949 | 16.43 | 10600 | 12.7975 | 2.9762 | 0.1844 | 2.7295 | 2.9474 | 19.0 | | 23.8949 | 16.59 | 10700 | 12.7497 | 2.5496 | 0.1406 | 2.3536 | 2.5269 | 19.0 | | 23.8949 | 16.74 | 10800 | 12.6485 | 2.5509 | 0.1454 | 2.343 | 2.5182 | 19.0 | | 23.8949 | 16.9 | 10900 | 12.6574 | 2.1914 | 0.1281 | 2.0113 | 2.1574 | 19.0 | | 23.4963 | 17.05 | 11000 | 12.6919 | 2.1748 | 0.1299 | 1.9909 | 2.1229 | 19.0 | | 23.4963 | 17.21 | 11100 | 12.5660 | 2.3751 | 0.1177 | 2.1417 | 2.326 | 19.0 | | 23.4963 | 17.36 | 11200 | 12.5866 | 2.6893 | 0.1344 | 2.4378 | 2.6318 | 19.0 | | 23.4963 | 17.52 | 11300 | 12.5427 | 2.5546 | 0.1411 | 2.3175 | 2.5073 | 19.0 | | 23.4963 | 17.67 | 11400 | 12.5011 | 2.347 | 0.1223 | 2.1322 | 2.3077 | 19.0 | | 23.1492 | 17.83 | 11500 | 12.5168 | 2.2304 | 0.1141 | 2.0657 | 2.1951 | 19.0 | | 23.1492 | 17.98 | 11600 | 12.4043 | 2.4485 | 0.1209 | 2.2548 | 2.4114 | 19.0 | | 23.1492 | 18.14 | 11700 | 12.4192 | 2.0551 | 0.0887 | 1.8996 | 2.0199 | 19.0 | | 23.1492 | 18.29 | 11800 | 12.3799 | 2.1076 | 0.0932 | 1.9464 | 2.0589 | 19.0 | | 23.1492 | 18.45 | 11900 | 12.4263 | 2.4136 | 0.1152 | 2.2172 | 2.357 | 19.0 | | 22.7005 | 18.6 | 12000 | 12.3218 | 2.1197 | 0.1105 | 1.9997 | 2.0873 | 19.0 | | 22.7005 | 18.76 | 12100 | 12.3297 | 2.1883 | 0.1102 | 2.0414 | 2.1267 | 19.0 | | 22.7005 | 18.91 | 12200 | 12.3026 | 1.966 | 0.0954 | 1.8387 | 1.9469 | 19.0 | | 22.7005 | 19.07 | 12300 | 12.3030 | 2.0179 | 0.0955 | 1.8834 | 1.9858 | 19.0 | | 22.7005 | 19.22 | 12400 | 12.2478 | 1.9549 | 0.0948 | 1.8437 | 1.9092 | 19.0 | | 22.3178 | 19.38 | 12500 | 12.1803 | 1.6396 | 0.0648 | 1.5296 | 1.6208 | 19.0 | | 22.3178 | 19.53 | 12600 | 
12.1732 | 1.5568 | 0.0769 | 1.4894 | 1.5387 | 19.0 | | 22.3178 | 19.69 | 12700 | 12.1342 | 1.6861 | 0.0782 | 1.6105 | 1.666 | 19.0 | | 22.3178 | 19.84 | 12800 | 12.1313 | 2.023 | 0.0965 | 1.9295 | 2.0072 | 19.0 | | 22.3178 | 20.0 | 12900 | 12.1315 | 1.5878 | 0.0701 | 1.5153 | 1.5467 | 19.0 | | 21.8344 | 20.16 | 13000 | 12.0611 | 1.6406 | 0.0637 | 1.5665 | 1.6033 | 19.0 | | 21.8344 | 20.31 | 13100 | 12.0327 | 1.5913 | 0.0544 | 1.5209 | 1.552 | 19.0 | | 21.8344 | 20.47 | 13200 | 12.0466 | 1.3618 | 0.0494 | 1.3186 | 1.33 | 19.0 | | 21.8344 | 20.62 | 13300 | 12.0787 | 1.4445 | 0.0451 | 1.4073 | 1.41 | 19.0 | | 21.8344 | 20.78 | 13400 | 11.9829 | 1.3465 | 0.0494 | 1.3247 | 1.3167 | 19.0 | | 21.6309 | 20.93 | 13500 | 11.9072 | 1.4165 | 0.0519 | 1.3761 | 1.3839 | 19.0 | | 21.6309 | 21.09 | 13600 | 11.9261 | 1.3969 | 0.0502 | 1.3606 | 1.3618 | 19.0 | | 21.6309 | 21.24 | 13700 | 11.8313 | 1.3337 | 0.0337 | 1.2974 | 1.316 | 19.0 | | 21.6309 | 21.4 | 13800 | 11.7709 | 1.3045 | 0.0371 | 1.2746 | 1.2889 | 19.0 | | 21.6309 | 21.55 | 13900 | 11.8402 | 1.6106 | 0.0391 | 1.5678 | 1.5697 | 19.0 | | 21.2262 | 21.71 | 14000 | 11.7132 | 1.3261 | 0.0222 | 1.296 | 1.3051 | 19.0 | | 21.2262 | 21.86 | 14100 | 11.7206 | 1.41 | 0.0252 | 1.374 | 1.3985 | 19.0 | | 21.2262 | 22.02 | 14200 | 11.7033 | 1.6231 | 0.0478 | 1.5632 | 1.5851 | 19.0 | | 21.2262 | 22.17 | 14300 | 11.7385 | 1.8974 | 0.0618 | 1.8339 | 1.8583 | 19.0 | | 21.2262 | 22.33 | 14400 | 11.6519 | 1.8998 | 0.0541 | 1.8285 | 1.8552 | 19.0 | | 20.8055 | 22.48 | 14500 | 11.6039 | 1.9561 | 0.0582 | 1.859 | 1.9073 | 19.0 | | 20.8055 | 22.64 | 14600 | 11.6322 | 1.7731 | 0.0442 | 1.7061 | 1.7303 | 19.0 | | 20.8055 | 22.79 | 14700 | 11.6046 | 1.8874 | 0.0618 | 1.8083 | 1.8539 | 19.0 | | 20.8055 | 22.95 | 14800 | 11.5051 | 1.4271 | 0.016 | 1.3996 | 1.4086 | 19.0 | | 20.8055 | 23.1 | 14900 | 11.5564 | 1.743 | 0.0451 | 1.6787 | 1.727 | 19.0 | | 20.6263 | 23.26 | 15000 | 11.5024 | 1.9313 | 0.0575 | 1.8357 | 1.887 | 19.0 | | 20.6263 | 23.41 | 
15100 | 11.5281 | 2.082 | 0.0435 | 1.9865 | 2.0327 | 19.0 | | 20.6263 | 23.57 | 15200 | 11.4223 | 1.9773 | 0.0332 | 1.9038 | 1.9432 | 19.0 | | 20.6263 | 23.72 | 15300 | 11.4675 | 1.7845 | 0.0831 | 1.6835 | 1.7414 | 19.0 | | 20.6263 | 23.88 | 15400 | 11.3882 | 2.1183 | 0.0715 | 1.9965 | 2.0725 | 19.0 | | 20.3154 | 24.03 | 15500 | 11.4197 | 2.4045 | 0.1336 | 2.2302 | 2.3024 | 19.0 | | 20.3154 | 24.19 | 15600 | 11.3558 | 1.9596 | 0.1196 | 1.8152 | 1.8748 | 19.0 | | 20.3154 | 24.34 | 15700 | 11.3438 | 2.0931 | 0.111 | 1.9469 | 1.999 | 19.0 | | 20.3154 | 24.5 | 15800 | 11.3021 | 2.2159 | 0.1257 | 2.0511 | 2.1345 | 19.0 | | 20.3154 | 24.65 | 15900 | 11.3178 | 2.093 | 0.132 | 1.9083 | 1.9969 | 19.0 | | 20.0858 | 24.81 | 16000 | 11.2377 | 1.6589 | 0.1129 | 1.5625 | 1.6245 | 19.0 | | 20.0858 | 24.96 | 16100 | 11.2058 | 1.6667 | 0.0854 | 1.5597 | 1.6223 | 19.0 | | 20.0858 | 25.12 | 16200 | 11.1602 | 2.0907 | 0.1219 | 1.9297 | 1.9988 | 19.0 | | 20.0858 | 25.27 | 16300 | 11.1666 | 1.86 | 0.1092 | 1.7398 | 1.7993 | 19.0 | | 20.0858 | 25.43 | 16400 | 11.1807 | 1.8879 | 0.1818 | 1.7579 | 1.8335 | 19.0 | | 19.7588 | 25.58 | 16500 | 11.1310 | 2.0377 | 0.1612 | 1.8653 | 1.9538 | 19.0 | | 19.7588 | 25.74 | 16600 | 11.1577 | 2.1441 | 0.1767 | 1.9546 | 2.0518 | 19.0 | | 19.7588 | 25.89 | 16700 | 11.0748 | 1.8679 | 0.1892 | 1.7249 | 1.7822 | 19.0 | | 19.7588 | 26.05 | 16800 | 11.1048 | 2.2775 | 0.2072 | 2.0566 | 2.1521 | 19.0 | | 19.7588 | 26.2 | 16900 | 11.0498 | 1.8117 | 0.161 | 1.6879 | 1.7357 | 19.0 | | 19.4627 | 26.36 | 17000 | 11.0435 | 1.7875 | 0.1627 | 1.6626 | 1.7306 | 19.0 | | 19.4627 | 26.51 | 17100 | 10.9406 | 1.7333 | 0.1645 | 1.6051 | 1.6671 | 19.0 | | 19.4627 | 26.67 | 17200 | 10.9242 | 1.596 | 0.1426 | 1.4747 | 1.5341 | 19.0 | | 19.4627 | 26.82 | 17300 | 10.9571 | 1.9874 | 0.2109 | 1.8109 | 1.9061 | 19.0 | | 19.4627 | 26.98 | 17400 | 10.9265 | 1.6999 | 0.1353 | 1.5574 | 1.6402 | 19.0 | | 19.2619 | 27.13 | 17500 | 10.8919 | 1.7543 | 0.1709 | 1.587 | 1.6605 | 19.0 | | 
| Training Loss | Epoch | Step  | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|:------:|:---------:|:-------:|
| 19.2619 | 27.29 | 17600 | 10.8382 | 2.126 | 0.2056 | 1.8609 | 2.0021 | 19.0 |
| 19.2619 | 27.44 | 17700 | 10.8936 | 1.9626 | 0.1726 | 1.7402 | 1.8665 | 19.0 |
| 19.2619 | 27.6 | 17800 | 10.8565 | 1.7668 | 0.1673 | 1.5914 | 1.7099 | 19.0 |
| 19.2619 | 27.75 | 17900 | 10.9047 | 2.0972 | 0.1867 | 1.8519 | 2.0224 | 19.0 |
| 19.0457 | 27.91 | 18000 | 10.7900 | 2.7761 | 0.2904 | 2.4403 | 2.6936 | 19.0 |
| 19.0457 | 28.06 | 18100 | 10.7191 | 2.3652 | 0.2431 | 2.0989 | 2.2767 | 19.0 |
| 19.0457 | 28.22 | 18200 | 10.7462 | 3.3125 | 0.361 | 2.847 | 3.1506 | 19.0 |
| 19.0457 | 28.37 | 18300 | 10.7721 | 2.9247 | 0.3 | 2.5443 | 2.806 | 19.0 |
| 19.0457 | 28.53 | 18400 | 10.7208 | 2.5398 | 0.2812 | 2.2211 | 2.4312 | 19.0 |
| 18.8301 | 28.68 | 18500 | 10.6708 | 2.5902 | 0.281 | 2.2765 | 2.4881 | 19.0 |
| 18.8301 | 28.84 | 18600 | 10.7220 | 2.276 | 0.2061 | 1.9904 | 2.1922 | 19.0 |
| 18.8301 | 28.99 | 18700 | 10.6855 | 2.8678 | 0.3496 | 2.52 | 2.751 | 19.0 |
| 18.8301 | 29.15 | 18800 | 10.6550 | 2.5232 | 0.2724 | 2.2108 | 2.4314 | 19.0 |
| 18.8301 | 29.3 | 18900 | 10.6488 | 2.5629 | 0.2203 | 2.2361 | 2.4261 | 19.0 |
| 18.5872 | 29.46 | 19000 | 10.6123 | 2.5052 | 0.1923 | 2.1381 | 2.3821 | 19.0 |
| 18.5872 | 29.61 | 19100 | 10.6105 | 3.7779 | 0.3653 | 3.2404 | 3.5759 | 19.0 |
| 18.5872 | 29.77 | 19200 | 10.5823 | 3.8282 | 0.3743 | 3.2645 | 3.6077 | 19.0 |
| 18.5872 | 29.92 | 19300 | 10.5606 | 3.0976 | 0.277 | 2.6041 | 2.8838 | 19.0 |
| 18.5872 | 30.08 | 19400 | 10.5846 | 3.638 | 0.3482 | 3.0804 | 3.4294 | 19.0 |
| 18.2839 | 30.23 | 19500 | 10.4722 | 2.6173 | 0.2326 | 2.2268 | 2.4656 | 19.0 |
| 18.2839 | 30.39 | 19600 | 10.5211 | 3.5085 | 0.3377 | 2.9751 | 3.2889 | 19.0 |
| 18.2839 | 30.54 | 19700 | 10.4735 | 2.4781 | 0.2097 | 2.1099 | 2.3338 | 19.0 |
| 18.2839 | 30.7 | 19800 | 10.4545 | 3.1459 | 0.3022 | 2.6844 | 2.9559 | 19.0 |
| 18.2839 | 30.85 | 19900 | 10.4525 | 3.6095 | 0.3637 | 3.0873 | 3.3886 | 19.0 |
| 18.1352 | 31.01 | 20000 | 10.4409 | 4.0556 | 0.4621 | 3.3857 | 3.7778 | 19.0 |
| 18.1352 | 31.16 | 20100 | 10.4132 | 3.8346 | 0.3863 | 3.2323 | 3.6266 | 19.0 |
| 18.1352 | 31.32 | 20200 | 10.4468 | 2.3736 | 0.1977 | 2.0195 | 2.236 | 19.0 |
| 18.1352 | 31.47 | 20300 | 10.3896 | 3.6954 | 0.3512 | 3.1402 | 3.4667 | 19.0 |
| 18.1352 | 31.63 | 20400 | 10.3546 | 3.5158 | 0.3558 | 3.0575 | 3.3116 | 19.0 |
| 17.9834 | 31.78 | 20500 | 10.3632 | 3.179 | 0.3374 | 2.7634 | 2.9846 | 19.0 |
| 17.9834 | 31.94 | 20600 | 10.3168 | 3.9121 | 0.4012 | 3.3812 | 3.687 | 19.0 |
| 17.9834 | 32.09 | 20700 | 10.2772 | 3.6148 | 0.3667 | 3.1059 | 3.3541 | 19.0 |
| 17.9834 | 32.25 | 20800 | 10.3173 | 3.1448 | 0.2924 | 2.6948 | 2.9338 | 19.0 |
| 17.9834 | 32.4 | 20900 | 10.2154 | 2.4611 | 0.1922 | 2.1597 | 2.3288 | 19.0 |
| 17.6192 | 32.56 | 21000 | 10.2957 | 3.3177 | 0.3762 | 2.8085 | 3.0595 | 19.0 |
| 17.6192 | 32.71 | 21100 | 10.2064 | 3.4663 | 0.3819 | 3.0229 | 3.2201 | 19.0 |
| 17.6192 | 32.87 | 21200 | 10.2235 | 3.245 | 0.3179 | 2.7618 | 3.0066 | 19.0 |
| 17.6192 | 33.02 | 21300 | 10.2193 | 2.5572 | 0.2775 | 2.216 | 2.3892 | 19.0 |
| 17.6192 | 33.18 | 21400 | 10.2467 | 3.4873 | 0.3934 | 3.02 | 3.2701 | 19.0 |
| 17.5532 | 33.33 | 21500 | 10.2378 | 2.8087 | 0.3049 | 2.4001 | 2.6218 | 19.0 |
| 17.5532 | 33.49 | 21600 | 10.2086 | 3.8967 | 0.4801 | 3.3678 | 3.603 | 19.0 |
| 17.5532 | 33.64 | 21700 | 10.2384 | 2.6534 | 0.3239 | 2.3276 | 2.4692 | 19.0 |
| 17.5532 | 33.8 | 21800 | 10.1929 | 2.6025 | 0.2845 | 2.2653 | 2.4507 | 19.0 |
| 17.5532 | 33.95 | 21900 | 10.1016 | 3.3244 | 0.377 | 2.8311 | 3.0784 | 19.0 |
| 17.3872 | 34.11 | 22000 | 10.1407 | 3.4245 | 0.4024 | 3.044 | 3.1865 | 19.0 |
| 17.3872 | 34.26 | 22100 | 10.0760 | 3.9251 | 0.4272 | 3.4064 | 3.6497 | 19.0 |
| 17.3872 | 34.42 | 22200 | 10.0998 | 3.3034 | 0.3438 | 2.8977 | 3.1141 | 19.0 |
| 17.3872 | 34.57 | 22300 | 10.0834 | 2.4967 | 0.266 | 2.2301 | 2.3647 | 19.0 |
| 17.3872 | 34.73 | 22400 | 9.9902 | 4.0828 | 0.4867 | 3.5482 | 3.7861 | 19.0 |
| 17.1744 | 34.88 | 22500 | 10.0366 | 3.5772 | 0.4377 | 3.1153 | 3.3199 | 19.0 |
| 17.1744 | 35.04 | 22600 | 10.0299 | 3.5342 | 0.433 | 3.0501 | 3.2176 | 19.0 |
| 17.1744 | 35.19 | 22700 | 9.9912 | 3.7754 | 0.4445 | 3.3191 | 3.502 | 19.0 |
| 17.1744 | 35.35 | 22800 | 9.9580 | 4.5086 | 0.5514 | 3.8986 | 4.1987 | 19.0 |
| 17.1744 | 35.5 | 22900 | 9.9676 | 3.526 | 0.3942 | 3.0859 | 3.3082 | 19.0 |
| 17.0687 | 35.66 | 23000 | 9.9874 | 3.7058 | 0.5139 | 3.2353 | 3.4611 | 19.0 |
| 17.0687 | 35.81 | 23100 | 9.9536 | 3.6588 | 0.4552 | 3.1591 | 3.3554 | 19.0 |
| 17.0687 | 35.97 | 23200 | 9.8948 | 3.6279 | 0.3933 | 3.1403 | 3.3426 | 19.0 |
| 17.0687 | 36.12 | 23300 | 9.8397 | 3.8101 | 0.4971 | 3.3152 | 3.5133 | 19.0 |
| 17.0687 | 36.28 | 23400 | 9.8995 | 3.3201 | 0.4209 | 2.9101 | 3.0903 | 19.0 |
| 16.7686 | 36.43 | 23500 | 9.9085 | 4.0108 | 0.6389 | 3.5055 | 3.7286 | 19.0 |
| 16.7686 | 36.59 | 23600 | 9.8688 | 3.6051 | 0.5164 | 3.1651 | 3.3781 | 19.0 |
| 16.7686 | 36.74 | 23700 | 9.8673 | 4.4987 | 0.6051 | 3.8789 | 4.1868 | 19.0 |
| 16.7686 | 36.9 | 23800 | 9.8848 | 3.6926 | 0.5635 | 3.1681 | 3.3902 | 19.0 |
| 16.7686 | 37.05 | 23900 | 9.8497 | 3.518 | 0.4283 | 3.1159 | 3.3112 | 19.0 |
| 16.7432 | 37.21 | 24000 | 9.8044 | 3.3369 | 0.3772 | 2.9784 | 3.147 | 19.0 |
| 16.7432 | 37.36 | 24100 | 9.7768 | 3.5862 | 0.3819 | 3.1273 | 3.3535 | 19.0 |
| 16.7432 | 37.52 | 24200 | 9.7536 | 4.1823 | 0.5884 | 3.645 | 3.8843 | 19.0 |
| 16.7432 | 37.67 | 24300 | 9.7953 | 4.3981 | 0.6441 | 3.7941 | 4.0623 | 19.0 |
| 16.7432 | 37.83 | 24400 | 9.6742 | 3.7833 | 0.4755 | 3.3516 | 3.5543 | 19.0 |
| 16.5714 | 37.98 | 24500 | 9.7946 | 3.3839 | 0.495 | 3.0021 | 3.156 | 19.0 |
| 16.5714 | 38.14 | 24600 | 9.7544 | 4.3873 | 0.6486 | 3.8188 | 4.0653 | 19.0 |
| 16.5714 | 38.29 | 24700 | 9.7586 | 3.4403 | 0.4756 | 3.0402 | 3.2405 | 19.0 |
| 16.5714 | 38.45 | 24800 | 9.7895 | 3.6822 | 0.6247 | 3.2612 | 3.4746 | 19.0 |
| 16.5714 | 38.6 | 24900 | 9.6964 | 3.8743 | 0.6209 | 3.4159 | 3.6051 | 19.0 |
| 16.3393 | 38.76 | 25000 | 9.7190 | 4.1508 | 0.635 | 3.5925 | 3.8753 | 19.0 |
| 16.3393 | 38.91 | 25100 | 9.6435 | 3.6755 | 0.4777 | 3.268 | 3.4572 | 19.0 |
| 16.3393 | 39.07 | 25200 | 9.6390 | 2.9478 | 0.4049 | 2.6531 | 2.7782 | 19.0 |
| 16.3393 | 39.22 | 25300 | 9.6300 | 2.9973 | 0.3897 | 2.6662 | 2.7943 | 19.0 |
| 16.3393 | 39.38 | 25400 | 9.6229 | 3.6726 | 0.4182 | 3.2207 | 3.4595 | 19.0 |
| 16.3076 | 39.53 | 25500 | 9.6392 | 2.9691 | 0.3692 | 2.6709 | 2.8182 | 19.0 |
| 16.3076 | 39.69 | 25600 | 9.5978 | 2.8167 | 0.3437 | 2.593 | 2.7155 | 19.0 |
| 16.3076 | 39.84 | 25700 | 9.6111 | 3.5135 | 0.5453 | 3.1415 | 3.3042 | 19.0 |
| 16.3076 | 40.0 | 25800 | 9.6118 | 3.459 | 0.4963 | 3.1351 | 3.2809 | 19.0 |
| 16.3076 | 40.16 | 25900 | 9.5994 | 3.5735 | 0.539 | 3.2556 | 3.3904 | 19.0 |
| 16.0684 | 40.31 | 26000 | 9.5526 | 3.3388 | 0.4689 | 2.9753 | 3.1562 | 19.0 |
| 16.0684 | 40.47 | 26100 | 9.5365 | 3.0882 | 0.392 | 2.8072 | 2.9556 | 19.0 |
| 16.0684 | 40.62 | 26200 | 9.5571 | 3.0022 | 0.4109 | 2.7108 | 2.8575 | 19.0 |
| 16.0684 | 40.78 | 26300 | 9.5240 | 3.506 | 0.5734 | 3.1577 | 3.3378 | 19.0 |
| 16.0684 | 40.93 | 26400 | 9.4913 | 3.5936 | 0.5165 | 3.2452 | 3.4134 | 19.0 |
| 15.9425 | 41.09 | 26500 | 9.5297 | 3.7802 | 0.6862 | 3.4061 | 3.5436 | 19.0 |
| 15.9425 | 41.24 | 26600 | 9.4657 | 3.8433 | 0.6105 | 3.4621 | 3.638 | 19.0 |
| 15.9425 | 41.4 | 26700 | 9.5049 | 3.5822 | 0.6462 | 3.231 | 3.3745 | 19.0 |
| 15.9425 | 41.55 | 26800 | 9.4739 | 2.9668 | 0.4426 | 2.7345 | 2.8134 | 19.0 |
| 15.9425 | 41.71 | 26900 | 9.4868 | 3.7458 | 0.6934 | 3.3708 | 3.5492 | 19.0 |
| 15.7779 | 41.86 | 27000 | 9.4683 | 3.5254 | 0.6006 | 3.1629 | 3.3011 | 19.0 |
| 15.7779 | 42.02 | 27100 | 9.4108 | 4.2731 | 0.7412 | 3.8236 | 4.0171 | 19.0 |
| 15.7779 | 42.17 | 27200 | 9.3994 | 3.5014 | 0.5738 | 3.1525 | 3.3306 | 19.0 |
| 15.7779 | 42.33 | 27300 | 9.3760 | 3.4929 | 0.4954 | 3.1402 | 3.3028 | 19.0 |
| 15.7779 | 42.48 | 27400 | 9.4201 | 4.2777 | 0.7152 | 3.7943 | 4.0349 | 19.0 |
| 15.7238 | 42.64 | 27500 | 9.3913 | 3.6489 | 0.6371 | 3.2903 | 3.4528 | 19.0 |
| 15.7238 | 42.79 | 27600 | 9.4269 | 3.5269 | 0.6042 | 3.2049 | 3.3528 | 19.0 |
| 15.7238 | 42.95 | 27700 | 9.3847 | 3.4735 | 0.5963 | 3.1522 | 3.2796 | 19.0 |
| 15.7238 | 43.1 | 27800 | 9.3474 | 3.8327 | 0.6428 | 3.406 | 3.5698 | 19.0 |
| 15.7238 | 43.26 | 27900 | 9.3293 | 3.5475 | 0.6313 | 3.1725 | 3.3367 | 19.0 |
| 15.5108 | 43.41 | 28000 | 9.3802 | 4.249 | 0.7997 | 3.7924 | 3.9849 | 19.0 |
| 15.5108 | 43.57 | 28100 | 9.2588 | 3.4476 | 0.4676 | 3.1758 | 3.2993 | 19.0 |
| 15.5108 | 43.72 | 28200 | 9.3447 | 4.0267 | 0.7081 | 3.6208 | 3.7957 | 19.0 |
| 15.5108 | 43.88 | 28300 | 9.2853 | 4.0105 | 0.7799 | 3.5848 | 3.7619 | 19.0 |
| 15.5108 | 44.03 | 28400 | 9.2753 | 3.1833 | 0.4678 | 2.9068 | 3.0168 | 19.0 |
| 15.4004 | 44.19 | 28500 | 9.2345 | 3.6778 | 0.5955 | 3.3212 | 3.4724 | 19.0 |
| 15.4004 | 44.34 | 28600 | 9.3130 | 3.9958 | 0.6892 | 3.5871 | 3.772 | 19.0 |
| 15.4004 | 44.5 | 28700 | 9.2984 | 4.1868 | 0.696 | 3.7194 | 3.9197 | 19.0 |
| 15.4004 | 44.65 | 28800 | 9.2722 | 3.8848 | 0.5914 | 3.5038 | 3.7022 | 19.0 |
df91981972f2d9ecbaee6c5d4893e968
mit
['conversational']
false
DialoGPT Trained on the Speech of a TV Series Character

This is an instance of [microsoft/DialoGPT-medium](https://huggingface.co/microsoft/DialoGPT-medium) trained on a TV series character, Sheldon from [The Big Bang Theory](https://en.wikipedia.org/wiki/The_Big_Bang_Theory). The data comes from [a Kaggle TV series script dataset](https://www.kaggle.com/mitramir5/the-big-bang-theory-series-transcript).

Chat with the model:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("spirax/DialoGPT-medium-sheldon")
model = AutoModelWithLMHead.from_pretrained("spirax/DialoGPT-medium-sheldon")
```
70ef5fb0e5a6e7ca744c5c08df3cfa7d
mit
[]
false
Caitlin Fairchild, character, gen13 comics, by J. Scott Campbell on Stable Diffusion

This is the `<Caitlin-Fairchild>` concept taught to Stable Diffusion via Textual Inversion. You can load this concept into the [Stable Conceptualizer](https://colab.research.google.com/github/huggingface/notebooks/blob/main/diffusers/stable_conceptualizer_inference.ipynb) notebook. You can also train your own concepts and load them into the concept libraries using [this notebook](https://colab.research.google.com/github/huggingface/notebooks/blob/main/diffusers/sd_textual_inversion_training.ipynb).

Here is the new concept you will be able to use as an `object`:

![<Caitlin-Fairchild> 0](https://huggingface.co/sd-concepts-library/caitlin-fairchild-character-gen13-comics-by-j-scott-campbell/resolve/main/concept_images/172.jpeg)
![<Caitlin-Fairchild> 1](https://huggingface.co/sd-concepts-library/caitlin-fairchild-character-gen13-comics-by-j-scott-campbell/resolve/main/concept_images/180.jpeg)
![<Caitlin-Fairchild> 2](https://huggingface.co/sd-concepts-library/caitlin-fairchild-character-gen13-comics-by-j-scott-campbell/resolve/main/concept_images/128.jpeg)
![<Caitlin-Fairchild> 3](https://huggingface.co/sd-concepts-library/caitlin-fairchild-character-gen13-comics-by-j-scott-campbell/resolve/main/concept_images/215.jpeg)
![<Caitlin-Fairchild> 4](https://huggingface.co/sd-concepts-library/caitlin-fairchild-character-gen13-comics-by-j-scott-campbell/resolve/main/concept_images/273.jpeg)
![<Caitlin-Fairchild> 5](https://huggingface.co/sd-concepts-library/caitlin-fairchild-character-gen13-comics-by-j-scott-campbell/resolve/main/concept_images/97.jpeg)
![<Caitlin-Fairchild> 7](https://huggingface.co/sd-concepts-library/caitlin-fairchild-character-gen13-comics-by-j-scott-campbell/resolve/main/concept_images/45.jpeg)
![<Caitlin-Fairchild> 8](https://huggingface.co/sd-concepts-library/caitlin-fairchild-character-gen13-comics-by-j-scott-campbell/resolve/main/concept_images/87.jpeg)
![<Caitlin-Fairchild> 9](https://huggingface.co/sd-concepts-library/caitlin-fairchild-character-gen13-comics-by-j-scott-campbell/resolve/main/concept_images/222.jpeg)
![<Caitlin-Fairchild> 10](https://huggingface.co/sd-concepts-library/caitlin-fairchild-character-gen13-comics-by-j-scott-campbell/resolve/main/concept_images/127.jpeg)
![<Caitlin-Fairchild> 280](https://huggingface.co/sd-concepts-library/caitlin-fairchild-character-gen13-comics-by-j-scott-campbell/resolve/main/concept_images/108.jpeg)
47dd259dabc999be8e6f28079bc4e27f
apache-2.0
['generated_from_trainer']
false
distilbert-base-uncased-finetuned-emotion

This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the emotion dataset. It achieves the following results on the evaluation set:

- Loss: 0.3399
- Accuracy: 0.901
- F1: 0.8976
f0f117b251cb46617978ab7b2e017004
apache-2.0
['generated_from_trainer']
false
Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| No log | 1.0 | 125 | 0.5129 | 0.8465 | 0.8300 |
| 0.7331 | 2.0 | 250 | 0.3399 | 0.901 | 0.8976 |
d2ca07523bf7d3686abbaa8d57a893c4
apache-2.0
['generated_from_trainer']
false
wav2vec2-xlsr-53-espeak-cv-ft-mhr2-ntsema-colab

This model is a fine-tuned version of [facebook/wav2vec2-xlsr-53-espeak-cv-ft](https://huggingface.co/facebook/wav2vec2-xlsr-53-espeak-cv-ft) on the audiofolder dataset. It achieves the following results on the evaluation set:

- Loss: 0.7562
- Wer: 0.7993
f7a4245f9f416bd28c759a7902dbb819
apache-2.0
['generated_from_trainer']
false
Training results

| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 5.5636 | 5.79 | 400 | 1.8357 | 1.0 |
| 1.6348 | 11.59 | 800 | 0.6797 | 0.8528 |
| 0.8624 | 17.39 | 1200 | 0.6651 | 0.8194 |
| 0.5248 | 23.19 | 1600 | 0.6892 | 0.7826 |
| 0.3328 | 28.98 | 2000 | 0.7562 | 0.7993 |
30783920c9fab77908373c2b776d617d
mit
[]
false
model by no3

This is the Stable Diffusion model fine-tuned on the azura-sd-1.4-beta3 concept, taught to Stable Diffusion with Dreambooth. It can be used by modifying the `instance_prompt`: **sks_azura**

You can also train your own concepts and upload them to the library by using [this notebook](https://colab.research.google.com/github/huggingface/notebooks/blob/main/diffusers/sd_dreambooth_training.ipynb). And you can run your new concept via `diffusers`: [Colab Notebook for Inference](https://colab.research.google.com/github/huggingface/notebooks/blob/main/diffusers/sd_dreambooth_inference.ipynb), [Spaces with the Public Concepts loaded](https://huggingface.co/spaces/sd-dreambooth-library/stable-diffusion-dreambooth-concepts)

If you have issues or questions, feel free to visit the Community Tab and start a discussion about it.

Here are the images used for training this concept:

![image 1](https://huggingface.co/no3/azura-sd-1.4-beta3/resolve/main/concept_images/4.jpg)
![image 2](https://huggingface.co/no3/azura-sd-1.4-beta3/resolve/main/concept_images/1.jpg)
![image 3](https://huggingface.co/no3/azura-sd-1.4-beta3/resolve/main/concept_images/2.jpg)
![image 4](https://huggingface.co/no3/azura-sd-1.4-beta3/resolve/main/concept_images/5.jpg)
![image 5](https://huggingface.co/no3/azura-sd-1.4-beta3/resolve/main/concept_images/6.jpg)
![image 6](https://huggingface.co/no3/azura-sd-1.4-beta3/resolve/main/concept_images/3.jpg)
9d8e23b7a629dae79aa0a4fb159d7b2e
apache-2.0
['generated_from_trainer']
false
distilgpt2-finetuned-restaurant-reviews

This model is a fine-tuned version of [distilgpt2](https://huggingface.co/distilgpt2) on a subset of the Yelp restaurant reviews dataset. It achieves the following results on the evaluation set:

- Loss: 3.4668
fa2a9aa8611bc968ebb38ce0b9d25f5e
apache-2.0
['generated_from_trainer']
false
Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 3.6331 | 1.0 | 2536 | 3.5280 |
| 3.5676 | 2.0 | 5072 | 3.4793 |
| 3.5438 | 3.0 | 7608 | 3.4668 |
3c93c87492465d74d7f7af3380785370
apache-2.0
['audio', 'automatic-speech-recognition', 'speech', 'xlsr-fine-tuning-week']
false
Wav2Vec2-Large-XLSR-53-Marathi

Fine-tuned [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on Marathi using the [OpenSLR SLR64](http://openslr.org/64/) dataset. Note that this data contains only female voices. Please keep this in mind before using the model for your task, although it works very well for male voices too. When using this model, make sure that your speech input is sampled at 16kHz.
4a1e301796b749c46285bbc53a2b8e10
apache-2.0
['audio', 'automatic-speech-recognition', 'speech', 'xlsr-fine-tuning-week']
false
Usage

The model can be used directly (without a language model) as follows, assuming you have a dataset with Marathi `sentence` and `path` fields:

```python
import torch
import torchaudio
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
```
1f38b0b9d8329ae2dddaae0a7bf83315
apache-2.0
['audio', 'automatic-speech-recognition', 'speech', 'xlsr-fine-tuning-week']
false
```python
# TODO: WRITE YOUR CODE TO LOAD THE TEST DATASET. For sample see the Colab link in Training Section.

processor = Wav2Vec2Processor.from_pretrained("gchhablani/wav2vec2-large-xlsr-mr")
model = Wav2Vec2ForCTC.from_pretrained("gchhablani/wav2vec2-large-xlsr-mr")
resampler = torchaudio.transforms.Resample(48_000, 16_000)
```
6e2c0c9071ccbae74add6cdd97980a09
apache-2.0
['audio', 'automatic-speech-recognition', 'speech', 'xlsr-fine-tuning-week']
false
Evaluation

The model can be evaluated as follows on 10% of the Marathi data on OpenSLR.

```python
import torch
import torchaudio
from datasets import load_dataset, load_metric
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
import re
```
47e6f37565c87584bf57aa4b86a4cefb
apache-2.0
['audio', 'automatic-speech-recognition', 'speech', 'xlsr-fine-tuning-week']
false
```python
# TODO: WRITE YOUR CODE TO LOAD THE TEST DATASET. For sample see the Colab link in Training Section.

wer = load_metric("wer")
processor = Wav2Vec2Processor.from_pretrained("gchhablani/wav2vec2-large-xlsr-mr")
model = Wav2Vec2ForCTC.from_pretrained("gchhablani/wav2vec2-large-xlsr-mr")
model.to("cuda")
chars_to_ignore_regex = '[\,\?\.\!\-\;\:\"\“\%\‘\”\�\–\…]'
resampler = torchaudio.transforms.Resample(48_000, 16_000)
```
f1b0d13876cee9f0d02e12e089960374
apache-2.0
['audio', 'automatic-speech-recognition', 'speech', 'xlsr-fine-tuning-week']
false
```python
# We need to read the audio files as arrays
def evaluate(batch):
    inputs = processor(batch["speech"], sampling_rate=16_000, return_tensors="pt", padding=True)
    with torch.no_grad():
        logits = model(inputs.input_values.to("cuda"), attention_mask=inputs.attention_mask.to("cuda")).logits
    pred_ids = torch.argmax(logits, dim=-1)
    batch["pred_strings"] = processor.batch_decode(pred_ids)
    return batch

result = test_dataset.map(evaluate, batched=True, batch_size=8)
print("WER: {:.2f}".format(100 * wer.compute(predictions=result["pred_strings"], references=result["sentence"])))
```

**Test Result**: 14.53 %
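For readers unfamiliar with the metric, WER is the word-level edit distance between a prediction and its reference, divided by the number of reference words. A standalone sketch of what `wer.compute` measures (the `word_error_rate` helper below is illustrative, not part of the evaluation script above):

```python
def word_error_rate(reference: str, prediction: str) -> float:
    """Word-level Levenshtein distance, normalized by the reference length."""
    ref, hyp = reference.split(), prediction.split()
    # dp[i][j] = edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,          # deletion
                           dp[i][j - 1] + 1,          # insertion
                           dp[i - 1][j - 1] + cost)   # substitution
    return dp[len(ref)][len(hyp)] / len(ref)

print(word_error_rate("the cat sat", "the cat sat"))  # 0.0
print(word_error_rate("the cat sat", "the bat sat"))  # one substitution out of three words
```

A 14.53% test result thus means roughly one word in seven needed an insertion, deletion, or substitution to match the reference.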
e66e725898ed7871d62dee10b26bca85
apache-2.0
['audio', 'automatic-speech-recognition', 'speech', 'xlsr-fine-tuning-week']
false
Training

90% of the OpenSLR Marathi dataset was used for training. The Colab notebook used for training can be found [here](https://colab.research.google.com/drive/1_BbLyLqDUsXG3RpSULfLRjC6UY3RjwME?usp=sharing).
1fac2811dd2fd2ce23b968d6f714aa81
mit
[]
false
TEST on Stable Diffusion

This is the `<AIO>` concept taught to Stable Diffusion via Textual Inversion. You can load this concept into the [Stable Conceptualizer](https://colab.research.google.com/github/huggingface/notebooks/blob/main/diffusers/stable_conceptualizer_inference.ipynb) notebook. You can also train your own concepts and load them into the concept libraries using [this notebook](https://colab.research.google.com/github/huggingface/notebooks/blob/main/diffusers/sd_textual_inversion_training.ipynb).

Here is the new concept you will be able to use as a `style`:

![<AIO> 0](https://huggingface.co/sd-concepts-library/test/resolve/main/concept_images/1.jpeg)
![<AIO> 1](https://huggingface.co/sd-concepts-library/test/resolve/main/concept_images/2.jpeg)
![<AIO> 2](https://huggingface.co/sd-concepts-library/test/resolve/main/concept_images/0.jpeg)
![<AIO> 3](https://huggingface.co/sd-concepts-library/test/resolve/main/concept_images/3.jpeg)
64e872836a46cddb71ab2c3c0761eca0
apache-2.0
['generated_from_trainer']
false
finetuned-ner-finegrained

This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the None dataset. It achieves the following results on the evaluation set:

- Loss: 0.3198
- Precision: 0.6498
- Recall: 0.6861
- F1: 0.6674
- Accuracy: 0.9083
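The precision, recall, and F1 reported above follow the usual definitions from true-positive, false-positive, and false-negative counts. A minimal sketch of how the three relate (the function name and the counts below are illustrative, not taken from this model's evaluation):

```python
def precision_recall_f1(tp: int, fp: int, fn: int):
    """Compute precision, recall, and F1 from raw prediction counts."""
    precision = tp / (tp + fp)          # fraction of predicted entities that are correct
    recall = tp / (tp + fn)             # fraction of true entities that were found
    f1 = 2 * precision * recall / (precision + recall)  # harmonic mean of the two
    return precision, recall, f1

# Example: 80 correctly predicted entities, 20 spurious, 30 missed.
p, r, f1 = precision_recall_f1(tp=80, fp=20, fn=30)
print(f"precision={p:.4f} recall={r:.4f} f1={f1:.4f}")
```

F1 being the harmonic mean explains why it always sits between the precision and recall values, closer to the lower of the two.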
1affb930c954955181a13240514833ae
apache-2.0
['generated_from_trainer']
false
Training results

| Training Loss | Epoch | Step  | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 0.3214 | 1.0 | 16472 | 0.3173 | 0.6260 | 0.6728 | 0.6486 | 0.9040 |
| 0.266 | 2.0 | 32944 | 0.3115 | 0.6430 | 0.6857 | 0.6636 | 0.9070 |
| 0.2163 | 3.0 | 49416 | 0.3198 | 0.6498 | 0.6861 | 0.6674 | 0.9083 |
899ca10d3f6183239272ac8957e1d9d4
mit
['bart', 'pytorch']
false
BART-IT: Italian pretraining for BART sequence to sequence model

BART-IT is a sequence-to-sequence model based on the BART architecture, specifically tailored to the Italian language. The model is pre-trained on a [large corpus of Italian text](https://huggingface.co/datasets/gsarti/clean_mc4_it) and can be fine-tuned on a variety of tasks.
9c28f1744f7bb0750c455dfe02b7a63f
mit
['bart', 'pytorch']
false
Fine-tuning

The model in this repository is a pre-trained model without any fine-tuning. In order to use the model for a specific task, you can fine-tune it on a specific dataset. The model has been fine-tuned for the abstractive summarization task on 3 different Italian datasets:

- [FanPage](https://huggingface.co/datasets/ARTeLab/fanpage) - finetuned model [here](https://huggingface.co/morenolq/bart-it-fanpage)
- [IlPost](https://huggingface.co/datasets/ARTeLab/ilpost) - finetuned model [here](https://huggingface.co/morenolq/bart-it-ilpost)
- [WITS](https://huggingface.co/datasets/Silvia/WITS) - finetuned model [here](https://huggingface.co/morenolq/bart-it-WITS)
3fb1b858f2286d2467d0dac021db4799
mit
['bart', 'pytorch']
false
Usage

In order to use the model, you can use the following code:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("morenolq/bart-it")
model = AutoModelForSeq2SeqLM.from_pretrained("morenolq/bart-it")

input_ids = tokenizer.encode("Il modello BART-IT è stato pre-addestrato su un corpus di testo italiano", return_tensors="pt")
outputs = model.generate(input_ids, max_length=40, num_beams=4, early_stopping=True)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
3a841a2f63cef234d486c7fb2202a79d
apache-2.0
['generated_from_trainer']
false
jlg-model

This model is a fine-tuned version of [datificate/gpt2-small-spanish](https://huggingface.co/datificate/gpt2-small-spanish) on the None dataset. It achieves the following results on the evaluation set:

- Loss: 3.4882
f838e704f9af6f6c9d7e91b5e0de2f87
apache-2.0
['generated_from_trainer']
false
Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| No log | 1.0 | 42 | 3.5391 |
| No log | 2.0 | 84 | 3.5001 |
| No log | 3.0 | 126 | 3.4882 |
6a78915bfb1deff083ac2d7c5b4dbef3
mit
['generated_from_trainer']
false
bart-cnn-pubmed-arxiv-pubmed-v3-e100

This model is a fine-tuned version of [theojolliffe/bart-cnn-pubmed-arxiv-pubmed](https://huggingface.co/theojolliffe/bart-cnn-pubmed-arxiv-pubmed) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 1.1806
- Rouge1: 59.4159
- Rouge2: 48.867
- Rougel: 51.9013
- Rougelsum: 58.3382
- Gen Len: 142.0
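ROUGE-1 (the `Rouge1` score above) is the F-measure of unigram overlap between a generated summary and its reference; ROUGE-2 uses bigrams and ROUGE-L the longest common subsequence. A simplified sketch of the ROUGE-1 computation (ignoring the stemming and bootstrap aggregation that the real metric applies; the function name is illustrative):

```python
from collections import Counter

def rouge1_f(reference: str, candidate: str) -> float:
    """ROUGE-1 F-measure: unigram overlap between candidate and reference."""
    ref_counts = Counter(reference.split())
    cand_counts = Counter(candidate.split())
    overlap = sum((ref_counts & cand_counts).values())  # clipped unigram matches
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand_counts.values())
    recall = overlap / sum(ref_counts.values())
    return 2 * precision * recall / (precision + recall)

print(rouge1_f("the model improves summaries", "the model improves summaries"))  # 1.0
```

The scores in the table are this F-measure scaled to 0-100, so a Rouge1 of 59.4 means roughly 60% unigram overlap with the reference summaries.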
0ea4ec8edcdb4ffcd8e90a1048780f29
mit
['generated_from_trainer']
false
Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 2e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP
fbc8dfc252eb268999b561ecb66d4436
mit
['generated_from_trainer']
false
Training results

| Training Loss | Epoch | Step  | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|:------:|:---------:|:-------:|
| 1.2541 | 1.0 | 795 | 0.9350 | 52.5594 | 32.6314 | 35.2302 | 50.1767 | 142.0 |
| 0.7018 | 2.0 | 1590 | 0.8022 | 53.4804 | 35.4649 | 37.1673 | 51.2428 | 142.0 |
| 0.5266 | 3.0 | 2385 | 0.7752 | 52.9462 | 34.3697 | 36.611 | 50.6922 | 142.0 |
| 0.3475 | 4.0 | 3180 | 0.7771 | 53.4605 | 35.4738 | 38.5714 | 51.3798 | 142.0 |
| 0.2691 | 5.0 | 3975 | 0.7424 | 54.1132 | 35.7289 | 39.2653 | 51.6822 | 141.4259 |
| 0.182 | 6.0 | 4770 | 0.8037 | 53.7969 | 35.7324 | 38.4764 | 51.4929 | 141.7778 |
| 0.1446 | 7.0 | 5565 | 0.7686 | 55.0274 | 38.7813 | 42.6251 | 52.9847 | 142.0 |
| 0.1191 | 8.0 | 6360 | 0.7807 | 55.4651 | 38.6537 | 41.2746 | 53.578 | 141.8704 |
| 0.0976 | 9.0 | 7155 | 0.8045 | 55.2843 | 40.2358 | 42.8464 | 54.0957 | 142.0 |
| 0.0882 | 10.0 | 7950 | 0.8533 | 56.8288 | 41.6714 | 44.3961 | 54.9406 | 142.0 |
| 0.0721 | 11.0 | 8745 | 0.8962 | 55.3187 | 40.1599 | 43.2103 | 54.1964 | 142.0 |
| 0.0597 | 12.0 | 9540 | 0.8653 | 55.5706 | 40.2321 | 44.0075 | 53.9883 | 142.0 |
| 0.054 | 13.0 | 10335 | 0.8566 | 55.6622 | 40.0252 | 42.6907 | 54.0548 | 142.0 |
| 0.0476 | 14.0 | 11130 | 0.8900 | 57.5046 | 43.6309 | 46.449 | 55.9909 | 142.0 |
| 0.0432 | 15.0 | 11925 | 0.9149 | 55.604 | 39.9591 | 43.1729 | 54.3703 | 142.0 |
| 0.0403 | 16.0 | 12720 | 0.9258 | 55.1275 | 39.6566 | 42.3852 | 53.7656 | 142.0 |
| 0.0351 | 17.0 | 13515 | 0.9184 | 58.2352 | 44.6109 | 47.3863 | 56.9529 | 142.0 |
| 0.032 | 18.0 | 14310 | 0.9275 | 55.9687 | 41.2482 | 44.0076 | 54.0707 | 142.0 |
| 0.0313 | 19.0 | 15105 | 0.9635 | 56.3574 | 41.2113 | 44.8358 | 54.6279 | 142.0 |
| 0.0258 | 20.0 | 15900 | 0.9478 | 57.8445 | 44.297 | 46.8836 | 56.2003 | 142.0 |
| 0.0277 | 21.0 | 16695 | 0.9363 | 58.4823 | 46.0943 | 48.7817 | 57.5883 | 141.6667 |
| 0.0219 | 22.0 | 17490 | 0.9705 | 57.6022 | 43.9147 | 47.3054 | 56.3866 | 142.0 |
| 0.0231 | 23.0 | 18285 | 0.9857 | 56.5809 | 42.9124 | 46.789 | 55.3897 | 142.0 |
| 0.021 | 24.0 | 19080 | 1.0155 | 56.9745 | 43.8859 | 46.6109 | 55.708 | 142.0 |
| 0.02 | 25.0 | 19875 | 1.0095 | 57.9702 | 45.1809 | 48.2856 | 56.6941 | 142.0 |
| 0.0175 | 26.0 | 20670 | 0.9634 | 57.7023 | 45.1577 | 48.2398 | 56.5282 | 142.0 |
| 0.0161 | 27.0 | 21465 | 1.0197 | 58.739 | 46.3307 | 49.2328 | 57.5778 | 142.0 |
| 0.0186 | 28.0 | 22260 | 0.9790 | 56.1661 | 42.9731 | 45.8654 | 54.4365 | 142.0 |
| 0.0145 | 29.0 | 23055 | 0.9883 | 55.8554 | 41.7405 | 45.177 | 54.478 | 142.0 |
| 0.013 | 30.0 | 23850 | 0.9977 | 55.5831 | 41.2429 | 44.8063 | 53.886 | 142.0 |
| 0.0131 | 31.0 | 24645 | 0.9765 | 57.4478 | 44.8905 | 48.1376 | 56.102 | 141.463 |
| 0.0118 | 32.0 | 25440 | 1.0000 | 58.4282 | 46.6557 | 49.4122 | 57.1979 | 142.0 |
| 0.0117 | 33.0 | 26235 | 0.9924 | 57.1995 | 44.4177 | 47.6248 | 56.0251 | 141.2407 |
| 0.011 | 34.0 | 27030 | 1.0698 | 57.8918 | 45.925 | 49.0505 | 56.9352 | 142.0 |
| 0.0093 | 35.0 | 27825 | 1.0297 | 57.7003 | 45.4556 | 47.9919 | 56.5134 | 141.8148 |
| 0.0112 | 36.0 | 28620 | 1.0429 | 58.4039 | 46.6401 | 49.3897 | 57.4753 | 142.0 |
| 0.0101 | 37.0 | 29415 | 1.0761 | 59.2768 | 47.5384 | 50.2152 | 57.9493 | 142.0 |
| 0.0095 | 38.0 | 30210 | 1.0254 | 58.6205 | 47.246 | 50.87 | 57.7829 | 142.0 |
| 0.0087 | 39.0 | 31005 | 1.0216 | 57.7667 | 44.7762 | 48.067 | 56.6006 | 142.0 |
| 0.0082 | 40.0 | 31800 | 1.0587 | 58.4703 | 45.8371 | 48.5321 | 57.2036 | 142.0 |
| 0.0075 | 41.0 | 32595 | 1.0621 | 58.5629 | 46.8885 | 49.5943 | 57.4579 | 142.0 |
| 0.0079 | 42.0 | 33390 | 1.0845 | 57.664 | 45.5954 | 48.408 | 56.661 | 141.9815 |
| 0.0076 | 43.0 | 34185 | 1.0705 | 58.1776 | 46.0435 | 49.3126 | 57.138 | 142.0 |
| 0.0074 | 44.0 | 34980 | 1.0636 | 58.1022 | 46.4877 | 48.7985 | 56.9073 | 142.0 |
| 0.007 | 45.0 | 35775 | 1.0810 | 57.8251 | 44.8767 | 47.8991 | 56.5977 | 142.0 |
| 0.0057 | 46.0 | 36570 | 1.0560 | 58.5086 | 46.3448 | 49.2576 | 57.4386 | 142.0 |
| 0.0062 | 47.0 | 37365 | 1.0903 | 58.8772 | 47.2886 | 49.9502 | 57.611 | 142.0 |
| 0.0058 | 48.0 | 38160 | 1.0847 | 59.4672 | 48.3847 | 51.602 | 58.4588 | 142.0 |
| 0.0061 | 49.0 | 38955 | 1.0798 | 59.5308 | 48.0396 | 50.8641 | 58.5016 | 142.0 |
| 0.0062 | 50.0 | 39750 | 1.0795 | 59.5026 | 48.5319 | 51.7426 | 58.7111 | 142.0 |
| 0.0051 | 51.0 | 40545 | 1.0842 | 57.7941 | 46.1198 | 48.7341 | 56.7164 | 142.0 |
| 0.0057 | 52.0 | 41340 | 1.0777 | 58.6131 | 46.3924 | 49.0787 | 57.1278 | 142.0 |
| 0.0039 | 53.0 | 42135 | 1.1133 | 57.6447 | 45.6699 | 48.5207 | 56.6447 | 142.0 |
| 0.0038 | 54.0 | 42930 | 1.0714 | 58.1462 | 46.4616 | 49.273 | 57.2771 | 142.0 |
| 0.004 | 55.0 | 43725 | 1.0852 | 58.6577 | 47.2095 | 50.4702 | 57.7724 | 142.0 |
| 0.0044 | 56.0 | 44520 | 1.1152 | 59.0564 | 47.1621 | 50.2807 | 58.3122 | 142.0 |
| 0.0042 | 57.0 | 45315 | 1.0831 | 58.1767 | 46.8127 | 49.9166 | 57.1833 | 142.0 |
| 0.0038 | 58.0 | 46110 | 1.1156 | 57.8515 | 46.3229 | 48.6843 | 56.7218 | 142.0 |
| 0.0038 | 59.0 | 46905 | 1.1105 | 57.9332 | 45.8354 | 49.27 | 57.1209 | 142.0 |
| 0.0034 | 60.0 | 47700 | 1.1104 | 60.0207 | 49.2067 | 51.8751 | 58.9484 | 142.0 |
| 0.0028 | 61.0 | 48495 | 1.1533 | 58.3432 | 46.8835 | 50.2868 | 57.5427 | 141.6111 |
| 0.0026 | 62.0 | 49290 | 1.1441 | 58.6838 | 46.9472 | 49.9524 | 57.5287 | 142.0 |
| 0.0028 | 63.0 | 50085 | 1.1232 | 58.0202 | 45.5855 | 48.6554 | 56.8368 | 141.9444 |
| 0.0037 | 64.0 | 50880 | 1.1520 | 58.3905 | 47.0348 | 49.8478 | 57.3665 | 142.0 |
| 0.0029 | 65.0 | 51675 | 1.1358 | 59.231 | 48.7251 | 51.6138 | 58.5718 | 142.0 |
| 0.0026 | 66.0 | 52470 | 1.1559 | 58.9482 | 47.2137 | 49.4299 | 57.7235 | 142.0 |
| 0.0025 | 67.0 | 53265 | 1.1272 | 59.3333 | 47.7419 | 50.7018 | 58.326 | 142.0 |
| 0.0026 | 68.0 | 54060 | 1.1613 | 58.6404 | 47.3218 | 50.255 | 57.4646 | 142.0 |
| 0.0015 | 69.0 | 54855 | 1.1575 | 58.7927 | 47.7018 | 50.695 | 57.796 | 142.0 |
| 0.0018 | 70.0 | 55650 | 1.1463 | 58.9455 | 47.2691 | 50.176 | 57.9997 | 142.0 |
| 0.0023 | 71.0 | 56445 | 1.1622 | 58.5943 | 46.9325 | 49.4159 | 57.2131 | 142.0 |
| 0.0024 | 72.0 | 57240 | 1.1258 | 58.2779 | 47.4119 | 49.9836 | 57.4867 | 142.0 |
| 0.0019 | 73.0 | 58035 | 1.1333 | 58.9185 | 47.5755 | 50.0765 | 57.8661 | 142.0 |
| 0.0017 | 74.0 | 58830 | 1.1469 | 60.5037 | 49.4508 | 52.2863 | 59.6675 | 141.963 |
| 0.0017 | 75.0 | 59625 | 1.1349 | 59.4264 | 47.4554 | 50.0383 | 58.3103 | 142.0 |
| 0.0025 | 76.0 | 60420 | 1.1215 | 58.2795 | 46.9852 | 49.5787 | 57.4501 | 142.0 |
| 0.0012 | 77.0 | 61215 | 1.1272 | 58.2248 | 47.0914 | 50.2569 | 57.1888 | 142.0 |
| 0.001 | 78.0 | 62010 | 1.1648 | 59.3808 | 48.4901 | 51.118 | 58.6251 | 142.0 |
| 0.0011 | 79.0 | 62805 | 1.1433 | 58.8697 | 47.6232 | 50.0226 | 57.6299 | 142.0 |
| 0.001 | 80.0 | 63600 | 1.1486 | 59.0608 | 47.1931 | 50.1354 | 57.8687 | 142.0 |
| 0.0011 | 81.0 | 64395 | 1.1695 | 58.341 | 47.0306 | 49.9269 | 57.339 | 142.0 |
| 0.001 | 82.0 | 65190 | 1.1589 | 58.9283 | 48.4586 | 51.2319 | 57.9485 | 142.0 |
| 0.0009 | 83.0 | 65985 | 1.1868 | 59.1377 | 48.2469 | 50.8486 | 58.1111 | 142.0 |
| 0.001 | 84.0 | 66780 | 1.1664 | 58.7706 | 47.5868 | 50.5937 | 57.7824 | 142.0 |
| 0.0009 | 85.0 | 67575 | 1.1719 | 57.8121 | 45.5997 | 48.2442 | 56.5272 | 142.0 |
| 0.0006 | 86.0 | 68370 | 1.1662 | 58.5204 | 47.5947 | 50.1839 | 57.6431 | 142.0 |
| 0.0007 | 87.0 | 69165 | 1.1668 | 59.2416 | 48.2985 | 51.0347 | 58.2794 | 142.0 |
| 0.0007 | 88.0 | 69960 | 1.1619 | 58.6933 | 47.5716 | 50.6785 | 57.5726 | 142.0 |
| 0.0003 | 89.0 | 70755 | 1.1765 | 59.2853 | 48.6451 | 51.3017 | 58.2603 | 142.0 |
| 0.0005 | 90.0 | 71550 | 1.1766 | 59.248 | 48.5642 | 50.9843 | 58.1706 | 142.0 |
| 0.0005 | 91.0 | 72345 | 1.1983 | 59.0009 | 48.311 | 51.0192 | 57.9822 | 142.0 |
| 0.0006 | 92.0 | 73140 | 1.1721 | 59.1248 | 49.0902 | 51.9937 | 58.2288 | 142.0 |
| 0.0003 | 93.0 | 73935 | 1.1799 | 58.2448 | 47.4011 | 49.987 | 57.515 | 142.0 |
| 0.0005 | 94.0 | 74730 | 1.1900 | 59.931 | 49.6663 | 52.3233 | 58.962 | 142.0 |
| 0.0004 | 95.0 | 75525 | 1.1868 | 59.5898 | 49.0004 | 51.4835 | 58.6463 | 142.0 |
| 0.0093 | 96.0 | 76320 | 1.1831 | 59.9405 | 49.83 | 52.4355 | 59.0702 | 142.0 |
| 0.0004 | 97.0 | 77115 | 1.1841 | 59.7379 | 49.5435 | 52.5255 | 58.8526 | 142.0 |
| 0.0004 | 98.0 | 77910 | 1.1790 | 59.5515 | 49.0724 | 51.9888 | 58.5488 | 142.0 |
| 0.0003 | 99.0 | 78705 | 1.1786 | 59.7712 | 49.0557 | 51.8137 | 58.7144 | 142.0 |
| 0.0002 | 100.0 | 79500 | 1.1806 | 59.4159 | 48.867 | 51.9013 | 58.3382 | 142.0 |
e19102faaf5a4a701eb303ad5c32568a
apache-2.0
['whisper-event', 'generated_from_trainer']
false
Whisper Small dysarthric Dutch

This model is a fine-tuned version of [qmeeus/whisper-small-nl](https://huggingface.co/qmeeus/whisper-small-nl) on the data/copas copas-full dataset. It achieves the following results on the evaluation set:

- Loss: 0.4702
- Wer: 22.1638
ed5070e1160f3ae8f88b64141c6b2637
apache-2.0
['whisper-event', 'generated_from_trainer']
false
Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 10000
- mixed_precision_training: Native AMP
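With a linear scheduler and 500 warmup steps over 10000 training steps, the learning rate ramps linearly from 0 to the peak and then decays linearly back to 0. A small sketch of that schedule (the function name is illustrative; the trainer's own scheduler implements the same shape):

```python
def linear_schedule_with_warmup(step, peak_lr=1e-4, warmup_steps=500, total_steps=10_000):
    """Linear warmup to peak_lr, then linear decay to zero at total_steps."""
    if step < warmup_steps:
        return peak_lr * step / warmup_steps
    return peak_lr * (total_steps - step) / (total_steps - warmup_steps)

print(linear_schedule_with_warmup(0))       # 0.0
print(linear_schedule_with_warmup(500))     # peak: 0.0001
print(linear_schedule_with_warmup(10_000))  # 0.0
```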
8d31cfbde8b1cef01844d2bdff18bdd0
apache-2.0
['whisper-event', 'generated_from_trainer']
false
Training results

| Training Loss | Epoch | Step  | Validation Loss | Wer |
|:-------------:|:-----:|:-----:|:---------------:|:-------:|
| 0.1618 | 0.05 | 500 | 0.3787 | 28.9235 |
| 0.0583 | 1.05 | 1000 | 0.3732 | 25.7702 |
| 0.0382 | 2.05 | 1500 | 0.4001 | 25.4621 |
| 0.0316 | 3.05 | 2000 | 0.4081 | 24.7010 |
| 0.0169 | 4.05 | 2500 | 0.4325 | 24.1935 |
| 0.0153 | 5.05 | 3000 | 0.4325 | 33.4179 |
| 0.0074 | 6.05 | 3500 | 0.4367 | 23.9398 |
| 0.0096 | 7.05 | 4000 | 0.4390 | 23.3055 |
| 0.0054 | 8.05 | 4500 | 0.4441 | 23.7042 |
| 0.0032 | 9.04 | 5000 | 0.4493 | 23.2693 |
| 0.004 | 10.04 | 5500 | 0.4524 | 23.3418 |
| 0.0048 | 11.04 | 6000 | 0.4498 | 23.7224 |
| 0.001 | 12.04 | 6500 | 0.4577 | 22.8887 |
| 0.0002 | 13.04 | 7000 | 0.4577 | 22.0913 |
| 0.0001 | 14.04 | 7500 | 0.4616 | 22.1276 |
| 0.0001 | 15.04 | 8000 | 0.4639 | 22.2726 |
| 0.0001 | 16.04 | 8500 | 0.4662 | 22.1095 |
| 0.0001 | 17.04 | 9000 | 0.4684 | 22.1457 |
| 0.0001 | 18.04 | 9500 | 0.4697 | 22.1457 |
| 0.0001 | 19.04 | 10000 | 0.4702 | 22.1638 |
9dd9b2192235d78ccd22b634352ef172
apache-2.0
['automatic-speech-recognition', 'mozilla-foundation/common_voice_8_0', 'robust-speech-event', 'generated_from_trainer', 'hf-asr-leaderboard']
false
Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 32
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1200
- num_epochs: 30.0
- mixed_precision_training: Native AMP
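With lr_scheduler_type: linear and warmup steps, the learning rate ramps from 0 up to the peak over the warmup, then decays linearly back to 0 at the last step. A plain-Python sketch of that shape (the total_steps value is purely illustrative, since this run is configured by epochs rather than a fixed step count):

```python
def linear_lr(step: int, peak_lr: float, warmup_steps: int, total_steps: int) -> float:
    """Linear warmup to peak_lr, then linear decay to zero at total_steps."""
    if step < warmup_steps:
        return peak_lr * (step / warmup_steps)
    return peak_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))

# warmup_steps=1200 and peak_lr=3e-4 as above; total_steps=10000 is illustrative.
for step in (600, 1200, 10000):
    print(step, linear_lr(step, 3e-4, 1200, 10000))
```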
f9a31d3de420018308a65b3a3692074e
apache-2.0
['automatic-speech-recognition', 'en']
false
exp_w2v2r_en_xls-r_age_teens-5_sixties-5_s870

Fine-tuned [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) for speech recognition using the train split of [Common Voice 7.0 (en)](https://huggingface.co/datasets/mozilla-foundation/common_voice_7_0). When using this model, make sure that your speech input is sampled at 16kHz. This model has been fine-tuned by the [HuggingSound](https://github.com/jonatasgrosman/huggingsound) tool.
45a4c5a5033b29771eab466a52291731
apache-2.0
['generated_from_trainer']
false
bert-base-uncased-issues-128

This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the None dataset. It achieves the following results on the evaluation set:
- Loss: 1.2551
42fa619ecdacfaaff783144e35433e2a
apache-2.0
['generated_from_trainer']
false
Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 2.0984 | 1.0 | 291 | 1.7081 |
| 1.6512 | 2.0 | 582 | 1.4289 |
| 1.4854 | 3.0 | 873 | 1.3845 |
| 1.3924 | 4.0 | 1164 | 1.3844 |
| 1.3375 | 5.0 | 1455 | 1.1944 |
| 1.2969 | 6.0 | 1746 | 1.2848 |
| 1.2443 | 7.0 | 2037 | 1.2678 |
| 1.1998 | 8.0 | 2328 | 1.2151 |
| 1.1805 | 9.0 | 2619 | 1.1638 |
| 1.1396 | 10.0 | 2910 | 1.2131 |
| 1.1333 | 11.0 | 3201 | 1.1966 |
| 1.0974 | 12.0 | 3492 | 1.1687 |
| 1.0822 | 13.0 | 3783 | 1.2283 |
| 1.0736 | 14.0 | 4074 | 1.1640 |
| 1.0595 | 15.0 | 4365 | 1.1207 |
| 1.0515 | 16.0 | 4656 | 1.2551 |
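Since this is a masked-language-modelling run, the cross-entropy validation loss converts to a (pseudo-)perplexity via exp(loss). Checking the final epoch above in plain Python:

```python
import math

def perplexity(cross_entropy_loss: float) -> float:
    # Perplexity is the exponential of the mean cross-entropy (nats per token).
    return math.exp(cross_entropy_loss)

# Final validation loss from the table above.
print(round(perplexity(1.2551), 2))
```

With the final loss of 1.2551 this gives a pseudo-perplexity of roughly 3.5.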
ae182e2d082314b4dc1b99b016871343
apache-2.0
['generated_from_trainer']
false
swin-finetuned-food101-e3

This model is a fine-tuned version of [microsoft/swin-base-patch4-window7-224](https://huggingface.co/microsoft/swin-base-patch4-window7-224) on the food101 dataset. It achieves the following results on the evaluation set:
- Loss: 0.2714
- Accuracy: 0.9227
c5cca3809876df483a4822ee35c543c0
apache-2.0
['generated_from_trainer']
false
Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.5565 | 1.0 | 1183 | 0.3939 | 0.8856 |
| 0.3466 | 2.0 | 2366 | 0.2936 | 0.9156 |
| 0.1172 | 3.0 | 3549 | 0.2714 | 0.9227 |
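For a sense of scale, the final accuracy can be turned into an approximate misclassification count, assuming the standard food101 validation split of 25,250 images (the split size is not stated in this card, so treat it as an assumption):

```python
def misclassified(accuracy: float, num_examples: int) -> int:
    # Expected number of wrongly classified validation images.
    return round((1.0 - accuracy) * num_examples)

# Assumed food101 validation split size: 101 classes * 250 images = 25,250.
print(misclassified(0.9227, 25_250))
```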
33bf5cc088824f49a652d7da754b3c34
gpl-2.0
['corenlp']
false
Core NLP model for english-kbp

CoreNLP is your one stop shop for natural language processing in Java! CoreNLP enables users to derive linguistic annotations for text, including token and sentence boundaries, parts of speech, named entities, numeric and time values, dependency and constituency parses, coreference, sentiment, quote attributions, and relations.

Find out more on [our website](https://stanfordnlp.github.io/CoreNLP) and in our [GitHub repository](https://github.com/stanfordnlp/CoreNLP).

This card and repo were automatically prepared with `hugging_corenlp.py` in the `stanfordnlp/huggingface-models` repo

Last updated 2023-01-21 01:36:45.937
404a1c993b8591c331754ee51b4c408f
apache-2.0
['multiberts', 'multiberts-seed_2', 'multiberts-seed_2-step_500k']
false
MultiBERTs, Intermediate Checkpoint - Seed 2, Step 500k

MultiBERTs is a collection of checkpoints and a statistical library to support robust research on BERT. We provide 25 BERT-base models trained with similar hyper-parameters as [the original BERT model](https://github.com/google-research/bert) but with different random seeds, which causes variations in the initial weights and order of training instances. The aim is to distinguish findings that apply to a specific artifact (i.e., a particular instance of the model) from those that apply to the more general procedure.

We also provide 140 intermediate checkpoints captured during the course of pre-training (we saved 28 checkpoints for the first 5 runs). The models were originally released through [http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our paper [The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).

This is model
44b4930bebff558d10416af2e48a252a
apache-2.0
['multiberts', 'multiberts-seed_2', 'multiberts-seed_2-step_500k']
false
How to use

Using code from [BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on Tensorflow:

```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_500k')
model = TFBertModel.from_pretrained("google/multiberts-seed_2-step_500k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```

PyTorch version:

```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_500k')
model = BertModel.from_pretrained("google/multiberts-seed_2-step_500k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
af1aeb36352b556b91cdf4eeee1b5ad9
creativeml-openrail-m
['text-to-image']
false
training params

```json
{
  "pretrained_model_name_or_path": "CompVis/stable-diffusion-v1-4",
  "instance_data_dir": "./a9054d36-59d1-4374-ab1f-2ca457b539e2/instance_data",
  "class_data_dir": "./class_data/a-portrait-of-a-person",
  "output_dir": "./a9054d36-59d1-4374-ab1f-2ca457b539e2/",
  "with_prior_preservation": true,
  "prior_loss_weight": 1.0,
  "instance_prompt": "a portrait of [V]",
  "class_prompt": "a portrait of a person",
  "resolution": 512,
  "train_batch_size": 1,
  "gradient_accumulation_steps": 1,
  "gradient_checkpointing": true,
  "use_8bit_adam": true,
  "learning_rate": 5e-06,
  "lr_scheduler": "constant",
  "lr_warmup_steps": 0,
  "num_class_images": 200,
  "max_train_steps": 1050,
  "mixed_precision": "fp16"
}
```
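With with_prior_preservation enabled, DreamBooth optimizes the instance reconstruction loss plus prior_loss_weight times a class-prior loss computed on the generated class images, so the model keeps producing generic "a portrait of a person" outputs while learning the new subject. A plain-float sketch of the combination (the numeric values are hypothetical stand-ins, not output of the diffusers training script):

```python
def dreambooth_loss(instance_loss: float, prior_loss: float,
                    prior_loss_weight: float = 1.0) -> float:
    # Prior preservation: the class-prior term anchors the generic class
    # behaviour while the instance term teaches the new subject "[V]".
    return instance_loss + prior_loss_weight * prior_loss

# Hypothetical per-step MSE values for the two terms.
print(dreambooth_loss(0.12, 0.05, prior_loss_weight=1.0))
```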
5afc05f88b14de14e01778ea17e62762
cc-by-4.0
['espnet', 'audio', 'automatic-speech-recognition']
false
Environments

- date: `Wed Apr 27 09:30:57 EDT 2022`
- python version: `3.8.5 (default, Sep 4 2020, 07:30:14) [GCC 7.3.0]`
- espnet version: `espnet 0.10.7a1`
- pytorch version: `pytorch 1.8.1+cu102`
- Git hash: `21d19be00089678ca27f7fce474ef8d787689512`
- Commit date: `Wed Mar 16 08:06:52 2022 -0400`
0b3e0db38fb019ee3c8c5f7f32068b2a
cc-by-4.0
['espnet', 'audio', 'automatic-speech-recognition']
false
WER

|dataset|Snt|Wrd|Corr|Sub|Del|Ins|Err|S.Err|
|---|---|---|---|---|---|---|---|---|
|decode_lm_weight0.0_asr_model_valid.loss.ave_10best/dev_clean|2703|54402|97.7|2.1|0.2|0.3|2.6|31.5|
|decode_lm_weight0.0_asr_model_valid.loss.ave_10best/dev_other|2864|50948|93.8|5.6|0.6|0.6|6.8|50.8|
|decode_lm_weight0.0_asr_model_valid.loss.ave_10best/test_clean|2620|52576|97.5|2.3|0.2|0.3|2.8|32.7|
|decode_lm_weight0.0_asr_model_valid.loss.ave_10best/test_other|2939|52343|94.1|5.3|0.6|0.7|6.6|51.8|
|decode_lm_weight0.4_lm_lm_train_lm_transformer2_en_bpe5000_17epoch_asr_model_valid.loss.ave_10best/dev_clean|2703|54402|98.0|1.8|0.2|0.2|2.2|28.2|
|decode_lm_weight0.4_lm_lm_train_lm_transformer2_en_bpe5000_17epoch_asr_model_valid.loss.ave_10best/dev_other|2864|50948|94.8|4.5|0.7|0.5|5.7|45.1|
|decode_lm_weight0.4_lm_lm_train_lm_transformer2_en_bpe5000_17epoch_asr_model_valid.loss.ave_10best/test_clean|2620|52576|97.9|1.9|0.2|0.3|2.4|29.3|
|decode_lm_weight0.4_lm_lm_train_lm_transformer2_en_bpe5000_17epoch_asr_model_valid.loss.ave_10best/test_other|2939|52343|94.9|4.3|0.7|0.5|5.6|47.0|
dd2574f95e783de79b6ef60830ff7ee6
cc-by-4.0
['espnet', 'audio', 'automatic-speech-recognition']
false
CER

|dataset|Snt|Wrd|Corr|Sub|Del|Ins|Err|S.Err|
|---|---|---|---|---|---|---|---|---|
|decode_lm_weight0.0_asr_model_valid.loss.ave_10best/dev_clean|2703|288456|99.4|0.4|0.3|0.2|0.9|31.5|
|decode_lm_weight0.0_asr_model_valid.loss.ave_10best/dev_other|2864|265951|97.7|1.4|0.9|0.8|3.0|50.8|
|decode_lm_weight0.0_asr_model_valid.loss.ave_10best/test_clean|2620|281530|99.4|0.4|0.3|0.3|0.9|32.7|
|decode_lm_weight0.0_asr_model_valid.loss.ave_10best/test_other|2939|272758|97.9|1.2|0.9|0.8|2.8|51.8|
|decode_lm_weight0.4_lm_lm_train_lm_transformer2_en_bpe5000_17epoch_asr_model_valid.loss.ave_10best/dev_clean|2703|288456|99.4|0.3|0.3|0.2|0.8|28.2|
|decode_lm_weight0.4_lm_lm_train_lm_transformer2_en_bpe5000_17epoch_asr_model_valid.loss.ave_10best/dev_other|2864|265951|97.9|1.1|1.0|0.6|2.7|45.1|
|decode_lm_weight0.4_lm_lm_train_lm_transformer2_en_bpe5000_17epoch_asr_model_valid.loss.ave_10best/test_clean|2620|281530|99.4|0.3|0.3|0.2|0.9|29.3|
|decode_lm_weight0.4_lm_lm_train_lm_transformer2_en_bpe5000_17epoch_asr_model_valid.loss.ave_10best/test_other|2939|272758|98.1|0.9|1.0|0.6|2.5|47.0|
f33eb38f6213854b0a000b58ea2f5587
cc-by-4.0
['espnet', 'audio', 'automatic-speech-recognition']
false
TER

|dataset|Snt|Wrd|Corr|Sub|Del|Ins|Err|S.Err|
|---|---|---|---|---|---|---|---|---|
|decode_lm_weight0.0_asr_model_valid.loss.ave_10best/dev_clean|2703|68010|97.2|2.1|0.7|0.4|3.3|31.5|
|decode_lm_weight0.0_asr_model_valid.loss.ave_10best/dev_other|2864|63110|92.7|5.6|1.7|1.2|8.6|50.8|
|decode_lm_weight0.0_asr_model_valid.loss.ave_10best/test_clean|2620|65818|97.0|2.2|0.9|0.4|3.4|32.7|
|decode_lm_weight0.0_asr_model_valid.loss.ave_10best/test_other|2939|65101|93.0|5.1|1.9|1.0|8.0|51.8|
|decode_lm_weight0.4_lm_lm_train_lm_transformer2_en_bpe5000_17epoch_asr_model_valid.loss.ave_10best/dev_clean|2703|68010|97.5|1.8|0.8|0.4|2.9|28.2|
|decode_lm_weight0.4_lm_lm_train_lm_transformer2_en_bpe5000_17epoch_asr_model_valid.loss.ave_10best/dev_other|2864|63110|93.5|4.5|1.9|0.9|7.4|45.1|
|decode_lm_weight0.4_lm_lm_train_lm_transformer2_en_bpe5000_17epoch_asr_model_valid.loss.ave_10best/test_clean|2620|65818|97.3|1.9|0.8|0.4|3.0|29.3|
|decode_lm_weight0.4_lm_lm_train_lm_transformer2_en_bpe5000_17epoch_asr_model_valid.loss.ave_10best/test_other|2939|65101|93.9|4.1|1.9|0.8|6.9|47.0|
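In these ESPnet scoring tables, Err is Sub + Del + Ins, and Corr + Sub + Del sums to 100 (up to the one-decimal rounding of each column). A quick plain-Python consistency check on two of the WER rows above:

```python
def check_row(corr: float, sub: float, dele: float, ins: float,
              err: float, tol: float = 0.1) -> bool:
    # Column identities for Kaldi/ESPnet-style scoring tables; the tolerance
    # absorbs the one-decimal rounding applied to each column independently.
    return (abs((sub + dele + ins) - err) <= tol
            and abs((corr + sub + dele) - 100.0) <= tol)

# dev_clean (lm_weight 0.0) and dev_other (lm_weight 0.4) WER rows.
print(check_row(97.7, 2.1, 0.2, 0.3, 2.6))
print(check_row(94.8, 4.5, 0.7, 0.5, 5.7))
```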
5955694316a7ba60e38ebec6caabf2f8
cc-by-4.0
['espnet', 'audio', 'automatic-speech-recognition']
false
ASR config

<details><summary>expand</summary>

```
config: conf/tuning/transducer/train_conformer-rnn_transducer.yaml
print_config: false
log_level: INFO
dry_run: false
iterator_type: sequence
output_dir: exp/asr_train_conformer-rnn_transducer_raw_en_bpe5000_sp
ngpu: 1
seed: 0
num_workers: 1
num_att_plot: 0
dist_backend: nccl
dist_init_method: env://
dist_world_size: 4
dist_rank: 0
local_rank: 0
dist_master_addr: localhost
dist_master_port: 35239
dist_launcher: null
multiprocessing_distributed: true
unused_parameters: false
sharded_ddp: false
cudnn_enabled: true
cudnn_benchmark: false
cudnn_deterministic: true
collect_stats: false
write_collected_feats: false
max_epoch: 25
patience: null
val_scheduler_criterion:
- valid
- loss
early_stopping_criterion:
- valid
- loss
- min
best_model_criterion:
- - valid
  - loss
  - min
keep_nbest_models: 10
nbest_averaging_interval: 0
grad_clip: 5.0
grad_clip_type: 2.0
grad_noise: false
accum_grad: 4
no_forward_run: false
resume: true
train_dtype: float32
use_amp: false
log_interval: null
use_matplotlib: true
use_tensorboard: true
use_wandb: false
wandb_project: null
wandb_id: null
wandb_entity: null
wandb_name: null
wandb_model_log_interval: -1
detect_anomaly: false
pretrain_path: null
init_param: []
ignore_init_mismatch: false
freeze_param: []
num_iters_per_epoch: null
batch_size: 20
valid_batch_size: null
batch_bins: 10000000
valid_batch_bins: null
train_shape_file:
- exp/asr_stats_raw_en_bpe5000_sp/train/speech_shape
- exp/asr_stats_raw_en_bpe5000_sp/train/text_shape.bpe
valid_shape_file:
- exp/asr_stats_raw_en_bpe5000_sp/valid/speech_shape
- exp/asr_stats_raw_en_bpe5000_sp/valid/text_shape.bpe
batch_type: numel
valid_batch_type: null
fold_length:
- 80000
- 150
sort_in_batch: descending
sort_batch: descending
multiple_iterator: false
chunk_length: 500
chunk_shift_ratio: 0.5
num_cache_chunks: 1024
train_data_path_and_name_and_type:
- - dump/raw/train_960_sp/wav.scp
  - speech
  - kaldi_ark
- - dump/raw/train_960_sp/text
  - text
  - text
valid_data_path_and_name_and_type: - - dump/raw/dev/wav.scp - speech - kaldi_ark - - dump/raw/dev/text - text - text allow_variable_data_keys: false max_cache_size: 0.0 max_cache_fd: 32 valid_max_cache_size: null optim: adam optim_conf: lr: 0.0015 weight_decay: 1.0e-06 scheduler: warmuplr scheduler_conf: warmup_steps: 25000 token_list: - <blank> - <unk> - ▁THE - S - ▁AND - ▁OF - ▁TO - ▁A - ▁IN - ▁I - ▁HE - ▁THAT - ▁WAS - ED - ▁IT - '''' - ▁HIS - ING - ▁YOU - ▁WITH - ▁FOR - ▁HAD - T - ▁AS - ▁HER - ▁IS - ▁BE - ▁BUT - ▁NOT - ▁SHE - D - ▁AT - ▁ON - LY - ▁HIM - ▁THEY - ▁ALL - ▁HAVE - ▁BY - ▁SO - ▁THIS - ▁MY - ▁WHICH - ▁ME - ▁SAID - ▁FROM - ▁ONE - Y - E - ▁WERE - ▁WE - ▁NO - N - ▁THERE - ▁OR - ER - ▁AN - ▁WHEN - ▁ARE - ▁THEIR - ▁WOULD - ▁IF - ▁WHAT - ▁THEM - ▁WHO - ▁OUT - M - ▁DO - ▁WILL - ▁UP - ▁BEEN - P - R - ▁MAN - ▁THEN - ▁COULD - ▁MORE - C - ▁INTO - ▁NOW - ▁VERY - ▁YOUR - ▁SOME - ▁LITTLE - ES - ▁TIME - RE - ▁CAN - ▁LIKE - LL - ▁ABOUT - ▁HAS - ▁THAN - ▁DID - ▁UPON - ▁OVER - IN - ▁ANY - ▁WELL - ▁ONLY - B - ▁SEE - ▁GOOD - ▁OTHER - ▁TWO - L - ▁KNOW - ▁GO - ▁DOWN - ▁BEFORE - A - AL - ▁OUR - ▁OLD - ▁SHOULD - ▁MADE - ▁AFTER - ▁GREAT - ▁DAY - ▁MUST - ▁COME - ▁HOW - ▁SUCH - ▁CAME - LE - ▁WHERE - ▁US - ▁NEVER - ▁THESE - ▁MUCH - ▁DE - ▁MISTER - ▁WAY - G - ▁S - ▁MAY - ATION - ▁LONG - OR - ▁AM - ▁FIRST - ▁BACK - ▁OWN - ▁RE - ▁AGAIN - ▁SAY - ▁MEN - ▁WENT - ▁HIMSELF - ▁HERE - NESS - ▁THINK - V - IC - ▁EVEN - ▁THOUGHT - ▁HAND - ▁JUST - ▁O - ▁UN - VE - ION - ▁ITS - 'ON' - ▁MAKE - ▁MIGHT - ▁TOO - K - ▁AWAY - ▁LIFE - TH - ▁WITHOUT - ST - ▁THROUGH - ▁MOST - ▁TAKE - ▁DON - ▁EVERY - F - O - ▁SHALL - ▁THOSE - ▁EYES - AR - ▁STILL - ▁LAST - ▁HOUSE - ▁HEAD - ABLE - ▁NOTHING - ▁NIGHT - ITY - ▁LET - ▁MANY - ▁OFF - ▁BEING - ▁FOUND - ▁WHILE - EN - ▁SAW - ▁GET - ▁PEOPLE - ▁FACE - ▁YOUNG - CH - ▁UNDER - ▁ONCE - ▁TELL - AN - ▁THREE - ▁PLACE - ▁ROOM - ▁YET - ▁SAME - IL - US - U - ▁FATHER - ▁RIGHT - EL - ▁THOUGH - ▁ANOTHER - LI - RI - ▁HEART - IT - ▁PUT - ▁TOOK - ▁GIVE - ▁EVER - ▁E - ▁PART - ▁WORK - 
ERS - ▁LOOK - ▁NEW - ▁KING - ▁MISSUS - ▁SIR - ▁LOVE - ▁MIND - ▁LOOKED - W - RY - ▁ASKED - ▁LEFT - ET - ▁LIGHT - CK - ▁DOOR - ▁MOMENT - RO - ▁WORLD - ▁THINGS - ▁HOME - UL - ▁THING - LA - ▁WHY - ▁MOTHER - ▁ALWAYS - ▁FAR - FUL - ▁WATER - CE - IVE - UR - ▁HEARD - ▁SOMETHING - ▁SEEMED - I - LO - ▁BECAUSE - OL - ▁END - ▁TOLD - ▁CON - ▁YES - ▁GOING - ▁GOT - RA - IR - ▁WOMAN - ▁GOD - EST - TED - ▁FIND - ▁KNEW - ▁SOON - ▁EACH - ▁SIDE - H - TON - MENT - ▁OH - NE - Z - LING - ▁AGAINST - TER - ▁NAME - ▁MISS - ▁QUITE - ▁WANT - ▁YEARS - ▁FEW - ▁BETTER - ENT - ▁HALF - ▁DONE - ▁ALSO - ▁BEGAN - ▁HAVING - ▁ENOUGH - IS - ▁LADY - ▁WHOLE - LESS - ▁BOTH - ▁SEEN - ▁SET - ▁WHITE - ▁COURSE - IES - ▁VOICE - ▁CALLED - ▁D - ▁EX - ATE - ▁TURNED - ▁GAVE - ▁C - ▁POOR - MAN - UT - NA - ▁DEAR - ISH - ▁GIRL - ▁MORNING - ▁BETWEEN - LED - ▁NOR - IA - ▁AMONG - MA - ▁ - ▁SMALL - ▁REST - ▁WHOM - ▁FELT - ▁HANDS - ▁MYSELF - ▁HIGH - ▁M - ▁HOWEVER - ▁HERSELF - ▁P - CO - ▁STOOD - ID - ▁KIND - ▁HUNDRED - AS - ▁ROUND - ▁ALMOST - TY - ▁SINCE - ▁G - AM - ▁LA - SE - ▁BOY - ▁MA - ▁PERHAPS - ▁WORDS - ATED - ▁HO - X - ▁MO - ▁SAT - ▁REPLIED - ▁FOUR - ▁ANYTHING - ▁TILL - ▁UNTIL - ▁BLACK - TION - ▁CRIED - RU - TE - ▁FACT - ▁HELP - ▁NEXT - ▁LOOKING - ▁DOES - ▁FRIEND - ▁LAY - ANCE - ▁POWER - ▁BROUGHT - VER - ▁FIRE - ▁KEEP - PO - FF - ▁COUNTRY - ▁SEA - ▁WORD - ▁CAR - ▁DAYS - ▁TOGETHER - ▁IMP - ▁REASON - KE - ▁INDEED - TING - ▁MATTER - ▁FULL - ▁TEN - TIC - ▁LAND - ▁RATHER - ▁AIR - ▁HOPE - ▁DA - ▁OPEN - ▁FEET - ▁EN - ▁FIVE - ▁POINT - ▁CO - OM - ▁LARGE - ▁B - ▁CL - ME - ▁GONE - ▁CHILD - INE - GG - ▁BEST - ▁DIS - UM - ▁HARD - ▁LORD - OUS - ▁WIFE - ▁SURE - ▁FORM - DE - ▁DEATH - ANT - ▁NATURE - ▁BA - ▁CARE - ▁BELIEVE - PP - ▁NEAR - ▁RO - ▁RED - ▁WAR - IE - ▁SPEAK - ▁FEAR - ▁CASE - ▁TAKEN - ▁ALONG - ▁CANNOT - ▁HEAR - ▁THEMSELVES - CI - ▁PRESENT - AD - ▁MASTER - ▁SON - ▁THUS - ▁LI - ▁LESS - ▁SUN - ▁TRUE - IM - IOUS - ▁THOUSAND - ▁MONEY - ▁W - ▁BEHIND - ▁CHILDREN - ▁DOCTOR - AC - ▁TWENTY - ▁WISH - ▁SOUND - ▁WHOSE - ▁LEAVE - 
▁ANSWERED - ▁THOU - ▁DUR - ▁HA - ▁CERTAIN - ▁PO - ▁PASSED - GE - TO - ▁ARM - ▁LO - ▁STATE - ▁ALONE - TA - ▁SHOW - ▁NEED - ▁LIVE - ND - ▁DEAD - ENCE - ▁STRONG - ▁PRE - ▁TI - ▁GROUND - SH - TI - ▁SHORT - IAN - UN - ▁PRO - ▁HORSE - MI - ▁PRINCE - ARD - ▁FELL - ▁ORDER - ▁CALL - AT - ▁GIVEN - ▁DARK - ▁THEREFORE - ▁CLOSE - ▁BODY - ▁OTHERS - ▁SENT - ▁SECOND - ▁OFTEN - ▁CA - ▁MANNER - MO - NI - ▁BRING - ▁QUESTION - ▁HOUR - ▁BO - AGE - ▁ST - ▁TURN - ▁TABLE - ▁GENERAL - ▁EARTH - ▁BED - ▁REALLY - ▁SIX - 'NO' - IST - ▁BECOME - ▁USE - ▁READ - ▁SE - ▁VI - ▁COMING - ▁EVERYTHING - ▁EM - ▁ABOVE - ▁EVENING - ▁BEAUTIFUL - ▁FEEL - ▁RAN - ▁LEAST - ▁LAW - ▁ALREADY - ▁MEAN - ▁ROSE - WARD - ▁ITSELF - ▁SOUL - ▁SUDDENLY - ▁AROUND - RED - ▁ANSWER - ICAL - ▁RA - ▁WIND - ▁FINE - ▁WON - ▁WHETHER - ▁KNOWN - BER - NG - ▁TA - ▁CAPTAIN - ▁EYE - ▁PERSON - ▁WOMEN - ▁SORT - ▁ASK - ▁BROTHER - ▁USED - ▁HELD - ▁BIG - ▁RETURNED - ▁STRANGE - ▁BU - ▁PER - ▁FREE - ▁EITHER - ▁WITHIN - ▁DOUBT - ▁YEAR - ▁CLEAR - ▁SIGHT - ▁GRA - ▁LOST - ▁KEPT - ▁F - PE - ▁BAR - ▁TOWN - ▁SLEEP - ARY - ▁HAIR - ▁FRIENDS - ▁DREAM - ▁FELLOW - PER - ▁DEEP - QUE - ▁BECAME - ▁REAL - ▁PAST - ▁MAKING - RING - ▁COMP - ▁ACT - ▁BAD - HO - STER - ▁YE - ▁MEANS - ▁RUN - MEN - ▁DAUGHTER - ▁SENSE - ▁CITY - ▁SOMETIMES - ▁TOWARDS - ▁ROAD - ▁SP - ▁LU - ▁READY - ▁FOOT - ▁COLD - ▁SA - ▁LETTER - ▁ELSE - ▁MAR - ▁STA - BE - ▁TRUTH - ▁LE - BO - ▁BUSINESS - CHE - ▁JOHN - ▁SUBJECT - ▁COURT - ▁IDEA - ILY - ▁RIVER - ATING - ▁FAMILY - HE - ▁DIDN - ▁GLAD - ▁SEVERAL - IAL - ▁UNDERSTAND - ▁SC - ▁POSSIBLE - ▁DIFFERENT - ▁RETURN - ▁ARMS - ▁LOW - ▁HOLD - ▁TALK - ▁RU - ▁WINDOW - ▁INTEREST - ▁SISTER - SON - ▁SH - ▁BLOOD - ▁SAYS - ▁CAP - ▁DI - ▁HUMAN - ▁CAUSE - NCE - ▁THANK - ▁LATE - GO - ▁CUT - ▁ACROSS - ▁STORY - NT - ▁COUNT - ▁ABLE - DY - LEY - ▁NUMBER - ▁STAND - ▁CHURCH - ▁THY - ▁SUPPOSE - LES - BLE - OP - ▁EFFECT - BY - ▁K - ▁NA - ▁SPOKE - ▁MET - ▁GREEN - ▁HUSBAND - ▁RESPECT - ▁PA - ▁FOLLOWED - ▁REMEMBER - ▁LONGER - ▁AGE - ▁TAKING - ▁LINE - ▁SEEM - ▁HAPPY - LAND - 
EM - ▁STAY - ▁PLAY - ▁COMMON - ▁GA - ▁BOOK - ▁TIMES - ▁OBJECT - ▁SEVEN - QUI - DO - UND - ▁FL - ▁PRETTY - ▁FAIR - WAY - ▁WOOD - ▁REACHED - ▁APPEARED - ▁SWEET - ▁FALL - BA - ▁PASS - ▁SIGN - ▁TREE - IONS - ▁GARDEN - ▁ILL - ▁ART - ▁REMAIN - ▁OPENED - ▁BRIGHT - ▁STREET - ▁TROUBLE - ▁PAIN - ▁CONTINUED - ▁SCHOOL - OUR - ▁CARRIED - ▁SAYING - HA - ▁CHANGE - ▁FOLLOW - ▁GOLD - ▁SW - ▁FEELING - ▁COMMAND - ▁BEAR - ▁CERTAINLY - ▁BLUE - ▁NE - CA - ▁WILD - ▁ACCOUNT - ▁OUGHT - UD - ▁T - ▁BREATH - ▁WANTED - ▁RI - ▁HEAVEN - ▁PURPOSE - ▁CHARACTER - ▁RICH - ▁PE - ▁DRESS - OS - FA - ▁TH - ▁ENGLISH - ▁CHANCE - ▁SHIP - ▁VIEW - ▁TOWARD - AK - ▁JOY - ▁JA - ▁HAR - ▁NEITHER - ▁FORCE - ▁UNCLE - DER - ▁PLAN - ▁PRINCESS - DI - ▁CHIEF - ▁HAT - ▁LIVED - ▁AB - ▁VISIT - ▁MOR - TEN - ▁WALL - UC - ▁MINE - ▁PLEASURE - ▁SMILE - ▁FRONT - ▁HU - ▁DEAL - OW - ▁FURTHER - GED - ▁TRIED - DA - VA - ▁NONE - ▁ENTERED - ▁QUEEN - ▁PAY - ▁EL - ▁EXCEPT - ▁SHA - ▁FORWARD - ▁EIGHT - ▁ADDED - ▁PUBLIC - ▁EIGHTEEN - ▁STAR - ▁HAPPENED - ▁LED - ▁WALKED - ▁ALTHOUGH - ▁LATER - ▁SPIRIT - ▁WALK - ▁BIT - ▁MEET - LIN - ▁FI - LT - ▁MOUTH - ▁WAIT - ▁HOURS - ▁LIVING - ▁YOURSELF - ▁FAST - ▁CHA - ▁HALL - ▁BEYOND - ▁BOAT - ▁SECRET - ENS - ▁CHAIR - RN - ▁RECEIVED - ▁CAT - RESS - ▁DESIRE - ▁GENTLEMAN - UGH - ▁LAID - EVER - ▁OCCASION - ▁WONDER - ▁GU - ▁PARTY - DEN - ▁FISH - ▁SEND - ▁NEARLY - ▁TRY - CON - ▁SEEMS - RS - ▁BELL - ▁BRA - ▁SILENCE - IG - ▁GUARD - ▁DIE - ▁DOING - ▁TU - ▁COR - ▁EARLY - ▁BANK - ▁FIGURE - IF - ▁ENGLAND - ▁MARY - ▁AFRAID - LER - ▁FO - ▁WATCH - ▁FA - ▁VA - ▁GRE - ▁AUNT - PED - ▁SERVICE - ▁JE - ▁PEN - ▁MINUTES - ▁PAN - ▁TREES - NED - ▁GLASS - ▁TONE - ▁PLEASE - ▁FORTH - ▁CROSS - ▁EXCLAIMED - ▁DREW - ▁EAT - ▁AH - ▁GRAVE - ▁CUR - PA - URE - CENT - ▁MILES - ▁SOFT - ▁AGO - ▁POSITION - ▁WARM - ▁LENGTH - ▁NECESSARY - ▁THINKING - ▁PICTURE - ▁PI - SHIP - IBLE - ▁HEAVY - ▁ATTENTION - ▁DOG - ABLY - ▁STANDING - ▁NATURAL - ▁APPEAR - OV - ▁CAUGHT - VO - ISM - ▁SPRING - ▁EXPERIENCE - ▁PAT - OT - ▁STOPPED - ▁REGARD - ▁HARDLY - ▁SELF 
- ▁STRENGTH - ▁GREW - ▁KNIGHT - ▁OPINION - ▁WIDE - ▁INSTEAD - ▁SOUTH - ▁TRANS - ▁CORNER - ▁LEARN - ▁ISLAND - ▁MI - ▁THIRD - ▁STE - ▁STRAIGHT - ▁TEA - ▁BOUND - ▁SEEING - ▁JU - ▁DINNER - ▁BEAUTY - ▁PEACE - AH - ▁REP - ▁SILENT - ▁CRE - ALLY - RIC - ▁STEP - ▁VER - ▁JO - GER - ▁SITTING - ▁THIRTY - ▁SAVE - ENED - ▁GLANCE - ▁REACH - ▁ACTION - ▁SAL - ▁SAD - ▁STONE - ITIES - ▁FRENCH - ▁STRUCK - ▁PAPER - ▁WHATEVER - ▁SUB - ▁DISTANCE - ▁WRONG - ▁KNOWLEDGE - ▁SAFE - ▁SNOW - ▁MUSIC - ▁FIFTY - RON - ▁ATTEMPT - ▁GOVERNMENT - TU - ▁CROWD - ▁BESIDES - ▁LOVED - ▁BOX - ▁DIRECTION - ▁TRAIN - ▁NORTH - ▁THICK - ▁GETTING - AV - ▁FLOOR - ▁COMPANY - ▁BLOW - ▁PLAIN - TRO - ▁BESIDE - ▁ROCK - ▁IMMEDIATELY - FI - ▁SHADOW - ▁SIT - ORS - ILE - ▁DRINK - ▁SPOT - ▁DANGER - ▁AL - ▁SAINT - ▁SLOWLY - ▁PALACE - IER - ▁RESULT - ▁PETER - ▁FOREST - ▁BELONG - ▁SU - ▁PAR - RIS - ▁TEARS - ▁APPEARANCE - ▁GATE - BU - ITION - ▁QUICKLY - ▁QUIET - ▁LONDON - ▁START - ▁BROWN - TRA - KIN - ▁CONSIDER - ▁BATTLE - ▁ANNE - ▁PIECE - ▁DIED - ▁SUCCESS - ▁LIPS - ▁FILLED - ▁FORGET - ▁POST - IFIED - ▁MARGARET - ▁FOOD - HAM - ▁PLEASANT - ▁FE - ▁EXPRESSION - ▁POCKET - ▁FRESH - ▁WEAR - TRI - ▁BROKEN - ▁LAUGHED - GING - ▁FOLLOWING - WN - IP - ▁TOUCH - ▁YOUTH - ATIVE - ▁LEG - ▁WEEK - ▁REMAINED - ▁EASY - NER - RK - ▁ENTER - ▁FIGHT - ▁PLACED - ▁TRAVEL - ▁SIMPLE - ▁GIRLS - ▁WAITING - ▁STOP - ▁WAVE - AU - ▁WISE - ▁CAMP - TURE - UB - ▁VE - ▁OFFICE - ▁GRAND - ▁FIT - ▁JUDGE - UP - MENTS - ▁QUICK - HI - ▁FLO - RIES - VAL - ▁COMFORT - ▁PARTICULAR - ▁STARTED - ▁SUIT - ▁NI - ▁PALE - ▁IMPOSSIBLE - ▁HOT - ▁CONVERSATION - ▁SCENE - ▁BOYS - ▁WIN - ▁BRE - ▁SOCIETY - ▁OUTSIDE - ▁WRITE - ▁EFFORT - ▁TALKING - ▁FORTUNE - ▁NINE - ▁WA - ▁SINGLE - ▁RULE - ▁PORT - ▁WINTER - ▁CAST - ▁CRA - ▁HAPPEN - ▁CRO - ▁SHUT - NING - ▁GUN - ▁NOBLE - ▁BEGIN - ▁PATH - ▁SKY - ▁WONDERFUL - ▁SUDDEN - ▁ARMY - ▁CHE - ▁WORTH - ▁MOUNTAIN - ▁MIN - AG - ▁FLU - ▁GRACE - ▁CHAPTER - ▁BELOW - ▁RING - ▁TURNING - ▁IRON - ▁TOP - ▁AFTERNOON - ORY - ▁EVIL - ▁TRUST - ▁BOW - ▁TRI - ▁SAIL - 
▁CONTENT - ▁HORSES - ITE - ▁SILVER - AP - ▁LAD - ▁RUNNING - ▁HILL - ▁BEGINNING - ▁MAD - ▁HABIT - GRA - ▁CLOTHES - ▁MORROW - ▁CRY - ▁FASHION - ▁PRESENCE - ▁Z - FE - ▁ARRIVED - ▁QUARTER - ▁PERFECT - ▁WO - ▁TRA - ▁USUAL - ▁NECK - ▁MARRIED - ▁SEAT - ▁WI - ▁GAR - ▁SAND - ▁SHORE - ▁GIVING - NY - ▁PROBABLY - ▁MINUTE - ▁EXPECT - ▁DU - ▁SHOT - ▁INSTANT - ▁DEGREE - ▁COLOR - ▁WEST - RT - ▁MARCH - ▁BIRD - ▁SHOWED - ▁GREATER - ▁SERIOUS - ▁CARRY - ▁COVERED - ▁FORMER - ▁LOUD - ▁MOVED - ▁MASS - ▁SEEK - ▁CHO - GEN - ▁ROMAN - IB - ▁MOON - ▁BOARD - ▁STREAM - ▁EASILY - ▁WISHED - ▁SEARCH - ▁COULDN - ▁MONTHS - ▁SICK - LIE - ▁DUTY - ▁TWELVE - ▁FAINT - ▁STRANGER - ▁SURPRISE - ▁KILL - ▁LEAVING - ▁JOURNEY - ▁SCARCELY - ▁RAISED - ▁SPEAKING - ▁TERRIBLE - ▁TOM - ▁FIELD - ▁GAME - ▁QUA - ▁PROMISE - ▁LIE - ▁CONDITION - ▁TRO - ▁PERSONAL - ▁TALL - ▁STICK - ▁THREW - ▁MARRY - ▁VAN - ▁BURN - ▁ACCORDING - ▁RISE - ▁ATTACK - ▁SWORD - ▁GUESS - ▁THOUGHTS - ▁THIN - ▁THROW - ▁CALM - SIDE - ▁VILLAGE - ▁DEN - ▁ANXIOUS - ▁MER - GI - ▁EXPECTED - ▁BALL - ▁ESPECIALLY - ▁CHARGE - ▁MEASURE - ISE - ▁NICE - ▁TRYING - ▁ALLOW - ▁SHARP - ▁BREAD - ▁HONOUR - ▁HONOR - ▁ENTIRELY - ▁BILL - ▁BRI - ▁WRITTEN - ▁AR - ▁BROKE - ▁KILLED - ▁MARK - ▁VEN - ▁LADIES - ▁LEARNED - ▁FLOWERS - PLE - ▁FORTY - ▁OFFER - ▁HAPPINESS - ▁PRAY - ▁CLASS - ▁FER - ▁PRINCIPLE - GU - ▁BOOKS - ▁SHAPE - ▁SUMMER - ▁JACK - ▁DRAW - ▁GOLDEN - ▁DECIDED - ▁LEAD - ▁UNLESS - ▁HARM - ▁LISTEN - HER - ▁SHOOK - ▁INFLUENCE - ▁PERFECTLY - ▁MARRIAGE - ▁BROAD - ▁ESCAPE - ▁STATES - ▁MIDDLE - ▁PLANT - ▁MIL - ▁MOVEMENT - ▁NOISE - ▁ENEMY - ▁HISTORY - ▁BREAK - ROUS - ▁UNDERSTOOD - ▁LATTER - FER - ▁COMES - ▁MERELY - ▁SIMPLY - WI - ▁IMAGINE - ▁LOWER - ▁CONDUCT - ▁BORN - WA - ▁YARD - ▁KA - ▁CLOSED - ▁NOTE - GA - ▁STRA - RAN - ▁EXIST - EV - ▁SPEECH - ▁BITTER - JO - ▁MAKES - ▁GRASS - ▁REPLY - ▁CHANGED - ▁MON - ▁LYING - ▁DANCE - ▁FINALLY - ▁AMERICAN - ▁ENJOY - ▁CONTAIN - ▁MEANT - USE - ▁OBSERVED - THER - ▁LAUGH - ▁AFTERWARDS - ▁BEAT - ▁RACE - ▁EQUAL - ▁RAIN - PS - ▁STEPS - ▁BENEATH 
- ▁TAIL - ▁TASTE - IO - EY - ▁CHAR - ▁GE - GN - TIN - ▁GROW - ▁TE - IANS - ▁MOVE - ▁REPEATED - ▁DRIVE - TUR - ▁SI - CLOCK - ▁BRAVE - ▁MADAME - ▁LOT - ▁CASTLE - ▁HI - AND - ▁FUTURE - ▁RELATION - ▁SORRY - ▁HEALTH - ▁DICK - ▁R - ▁BUILDING - ▁EDGE - ▁BLESS - ▁SPITE - WE - ▁MIS - ▁PRISONER - ▁ALLOWED - ▁PH - ▁CATCH - MER - ETH - ▁COAT - ▁COMPLETE - ▁WOULDN - ▁CREATURE - ▁YELLOW - ▁IMPORTANT - ▁ADD - ▁PASSING - ▁DARKNESS - ▁CARRIAGE - ▁MILL - ▁FIFTEEN - NCY - ▁HUNG - ▁OB - ▁PLEASED - ▁SPREAD - ▁CURIOUS - ▁WORSE - ▁CIRCUMSTANCES - ▁GI - LAR - ▁CAL - ▁HY - ▁MERE - ▁JANE - ▁EAST - BI - ▁CUP - ▁BLIND - ▁PASSION - ▁DISCOVERED - ▁NOTICE - ▁REPORT - ▁SPACE - ▁PRESENTLY - ▁SORROW - ▁PACK - ▁DIN - CY - ▁DRY - ▁ANCIENT - ▁DRESSED - ▁COVER - ▁VO - ▁EXISTENCE - ▁EXACTLY - ▁BEAST - ▁PROPER - ▁DROPPED - ▁CLEAN - ▁COLOUR - ▁HOST - ▁CHAMBER - ▁FAITH - LET - ▁DETERMINED - ▁PRIEST - ▁STORM - ▁SKIN - ▁DARE - ▁PERSONS - ▁PICK - ▁NARROW - ▁SUPPORT - ▁PRIVATE - ▁SMILED - ▁COUSIN - ▁DRAWING - ▁ATTEND - ▁COOK - ▁PREVENT - ▁VARIOUS - ▁BLA - ▁FIXED - ▁WEAK - THE - ▁HOLE - ▁BOTTOM - ▁NOBODY - ADE - ▁LEGS - ITCH - ▁INDIVIDUAL - ▁EARS - LIKE - ▁ADVANTAGE - ▁FRANCE - ▁BON - ▁WINE - ▁LIVES - OD - ▁WALLS - ▁TIRED - ▁SHOP - ▁ANIMAL - ▁CRU - ▁WROTE - ▁ROYAL - ▁CONSIDERED - ▁MORAL - ▁COMPANION - ▁LOSE - ▁ISN - ▁BAG - ▁LAKE - ▁INTER - ▁COM - ▁LETTERS - ▁LUCK - ▁EAR - ▁GERMAN - ▁PET - ▁SAKE - ▁DROP - ▁PAID - ▁BREAKFAST - ▁LABOR - ▁DESERT - ▁DECLARED - ▁HUM - ▁STUDY - ▁INSTANCE - ONE - ▁SOMEWHAT - ▁CLOTH - ▁SPECIAL - ▁COLONEL - ▁SONG - ▁MAIN - ▁VALUE - ▁PROUD - ▁EXPRESS - ▁NATION - ▁HANDSOME - ▁CONFESS - ▁PU - ▁PASSAGE - ▁PERIOD - ▁CUSTOM - ▁HURT - ▁SHOULDER - ▁CHRIST - ZA - ▁RECEIVE - ▁DIFFICULT - ▁DEPEND - ▁MEETING - ▁CHI - ▁GEN - LIGHT - ▁BELIEVED - ▁SOCIAL - ▁DIFFICULTY - ▁GREATEST - ▁DRAWN - ▁GRANT - ▁BIRDS - ▁ANGRY - ▁HEAT - UFF - ▁DUE - ▁PLACES - ▁SIN - ▁COURAGE - ▁EVIDENTLY - ▁GENTLE - ▁CRUEL - ▁GEORGE - ▁GRI - ▁SERVANT - ▁U - ▁PURE - OOK - ▁KNOWS - ▁KNOWING - LF - ▁WRITING - ▁REMEMBERED - ▁CU - 
▁HOLDING - ▁TENDER - ▁QUI - ▁BURST - ▁SURELY - IGN - ▁VALLEY - ▁FU - ▁BUTTER - ▁SPOKEN - ▁STORE - ▁DISC - ▁CHRISTIAN - ▁PARIS - ▁HENRY - ▁FINISHED - ▁PROVE - ▁FOOL - ▁SOLDIERS - ▁LANGUAGE - ▁INSIDE - ▁BAN - ▁FALLEN - ROW - ▁MAL - ▁BABY - ▁SITUATION - ▁WATCHED - ANS - ▁RUIN - ▁GENTLEMEN - ▁FRO - ▁FANCY - ▁ACCEPT - ▁SEASON - ▁OURSELVES - ▁SAN - ▁SPEED - IZED - ▁COOL - ▁SERVE - ▁VESSEL - ▁WILLIAM - ▁OBLIGED - ▁GROUP - FORM - ▁GOES - UOUS - ▁LEAVES - ▁PECULIAR - ▁NEWS - ▁VAIN - ▁EVERYBODY - ▁PIN - UG - ▁FORGOTTEN - ▁FRA - GAN - ▁CAREFULLY - ▁FLASH - UCH - ▁FUR - ▁MURDER - ▁DELIGHT - ▁WAITED - ▁RENDER - ▁PROPERTY - ▁NOTICED - ▁ROLL - ▁KNOCK - ▁EARNEST - KI - ▁HONEST - ▁PROMISED - ▁BAL - AW - ▁WALKING - ANG - ▁SQUARE - ▁QUIETLY - ▁CLOUD - WOOD - ▁FORMED - ▁HIGHER - ▁BUILT - ▁FATE - ▁TEACH - MY - ▁FALSE - ▁YORK - ▁DUST - ▁CLIMB - ▁FOND - ▁GROWN - ▁DESCEND - ▁RAG - ▁FRUIT - ▁GENERALLY - ▁OFFERED - ▁ER - ▁NURSE - POSE - ▁SPENT - ▁JOIN - ▁STATION - ▁MEANING - ▁SMOKE - HOOD - ▁ROUGH - JU - ▁LIKELY - ▁SURFACE - ▁KE - ▁MONTH - ▁POSSESSION - ▁TONGUE - ▁DUKE - ▁NOSE - ▁LAUGHING - ▁WEATHER - ▁WHISPERED - ▁SYSTEM - ▁LAWS - DDLE - ▁TOUCHED - ▁TRADE - LD - ▁SURPRISED - RIN - ▁ARCH - ▁WEALTH - FOR - ▁TEMPER - ▁FRANK - ▁GAL - ▁BARE - ▁OPPORTUNITY - ▁CLAIM - ▁ANIMALS - ▁REV - ▁COST - ▁WASH - ZE - ▁CORN - ▁OPPOSITE - ▁POLICE - ▁IDEAS - LON - ▁KEY - ▁READING - ▁COLLECT - CHED - ▁H - ▁CROWN - ▁TAR - ▁SWIFT - ▁SHOULDERS - ▁ICE - ▁GRAY - ▁SHARE - ▁PREPARED - ▁GRO - ▁UND - ▁TER - ▁EMPTY - CING - ▁SMILING - ▁AVOID - ▁DIFFERENCE - ▁EXPLAIN - ▁POUR - ▁ATTRACT - ▁OPENING - ▁WHEEL - ▁MATERIAL - ▁BREAST - ▁SUFFERING - ▁DISTINCT - ▁BOOT - ▁ROW - ▁FINGERS - HAN - ▁ALTOGETHER - ▁FAT - ▁PAPA - ▁BRAIN - ▁ASLEEP - ▁GREY - ▁SUM - ▁GAS - ▁WINDOWS - ▁ALIVE - ▁PROCEED - ▁FLOWER - ▁LEAP - ▁PUR - ▁PIECES - ▁ALTER - ▁MEMORY - IENT - ▁FILL - ▁CLO - ▁THROWN - ▁KINGDOM - ▁RODE - IUS - ▁MAID - ▁DIM - ▁BAND - ▁VIRTUE - ▁DISH - ▁GUEST - ▁LOSS - ▁CAUSED - ▁MOTION - ▁POT - ▁MILLION - ▁FAULT - ▁LOVELY - ▁HERO - PPING - 
▁UNITED - ▁SPI - SOME - BRA - ▁MOUNTAINS - ▁NU - ▁SATISFIED - ▁DOLLARS - ▁LOVER - ▁CONCEAL - ▁VAST - ▁PULL - ▁HATH - ▁RUSH - ▁J - ▁DESPAIR - EX - ▁HEIGHT - ▁CE - ▁BENT - ▁PITY - ▁RISING - ATH - ▁PRIDE - ▁HURRY - KA - ▁SETTLED - ▁JUSTICE - ▁LIFTED - PEN - ▁SOLDIER - ▁FINDING - ▁REMARK - ▁REGULAR - ▁STRUGGLE - ▁MACHINE - ▁SING - ▁HURRIED - ▁SUFFICIENT - ▁REPRESENT - ▁DOUBLE - ▁ALARM - ▁SUPPER - ▁DREADFUL - ▁FORE - ATOR - ▁STOCK - ▁TIN - ▁EXAMPLE - ▁ROOF - ▁FLOW - ▁SUPPOSED - ▁PRESERV - ▁L - ▁LISTENED - OC - ▁STO - ▁SECURE - ▁FRIGHTENED - ▁DISTURB - ▁EMOTION - ▁SERVANTS - ▁YO - ▁BUY - ▁FORCED - ▁KITCHEN - ▁TERROR - ▁STAIRS - ▁SIXTY - KER - ▁ORDINARY - ▁DIRECTLY - ▁HEADS - ▁METHOD - ▁FORGIVE - ▁AWFUL - ▁REFLECT - ▁GREATLY - ▁TALKED - ▁RIDE - STONE - ▁FAVOUR - ▁WELCOME - ▁SEIZED - OU - ▁CONTROL - ▁ORDERED - ▁ANGEL - ▁USUALLY - ▁POET - ▁BOLD - LINE - ▁ADVENTURE - ▁WATCHING - ▁FOLK - ▁MISTRESS - IZE - ▁GROWING - ▁CAVE - ▁EVIDENCE - ▁FINGER - ▁SEVENTEEN - ▁MOVING - EOUS - ▁DOESN - ▁COW - ▁TYPE - ▁BOIL - ▁TALE - ▁DELIVER - ▁FARM - ▁MONSIEUR - ▁GATHERED - ▁FEELINGS - ▁RATE - ▁REMARKED - ▁PUTTING - ▁MAT - ▁CONTRARY - ▁CRIME - ▁PLA - ▁COL - ▁NEARER - TES - ▁CIVIL - ▁SHAME - ▁LOOSE - ▁DISCOVER - ▁FLAT - ▁TWICE - ▁FAIL - VIS - ▁UNC - EA - ▁EUROPE - ▁PATIENT - ▁UNTO - ▁SUFFER - ▁PAIR - ▁TREASURE - OSE - ▁EAGER - ▁FLY - ▁N - ▁VAL - ▁DAN - ▁SALT - ▁BORE - BBE - ▁ARTHUR - ▁AFFAIRS - ▁SLOW - ▁CONSIST - ▁DEVIL - LAN - ▁AFFECTION - ▁ENGAGED - ▁KISS - ▁YA - ▁OFFICER - IFICATION - ▁LAMP - ▁PARTS - HEN - ▁MILK - ▁PROCESS - ▁GIFT - ▁PULLED - ▁HID - ▁RAY - ▁EXCELLENT - ▁IMPRESSION - ▁AUTHORITY - ▁PROVED - ▁TELLING - TTE - ▁TOWER - ▁CONSEQUENCE - ▁FAVOR - ▁FLEW - ▁CHARLES - ISTS - ▁ADDRESS - ▁FAMILIAR - ▁LIMIT - ▁CONFIDENCE - ▁RARE - ▁WEEKS - ▁WOODS - ▁INTENTION - ▁DIRECT - ▁PERFORM - ▁SOLEMN - ▁DISTANT - ▁IMAGE - ▁PRESIDENT - ▁FIRM - ▁INDIAN - ▁RANK - ▁LIKED - ▁AGREE - ▁HOUSES - ▁WIL - ▁MATTERS - ▁PRISON - ▁MODE - ▁MAJOR - ▁WORKING - ▁SLIP - ▁WEIGHT - ▁AWARE - ▁BUSY - ▁LOOKS - ▁WOUND - 
▁THOR - ▁BATH - ▁EXERCISE - ▁SIMILAR - ▁WORE - ▁AMOUNT - ▁QUESTIONS - ▁VIOLENT - ▁EXCUSE - ▁ASIDE - ▁TUR - ▁DULL - OF - ▁EMPEROR - ▁NEVERTHELESS - ▁SHOUT - ▁EXPLAINED - ▁SIZE - ▁ACCOMPLISH - FORD - CAN - ▁MISTAKE - ▁INSTANTLY - ▁SMOOTH - ▁STRIKE - ▁BOB - ISED - ▁HORROR - ▁SCIENCE - ▁PROTEST - ▁MANAGE - ▁OBEY - ▁NECESSITY - ▁SPLENDID - ▁PRESS - ▁INTERESTING - ▁RELIGION - ▁UNKNOWN - ▁FIERCE - ▁DISAPPEARED - ▁HOLY - ▁HATE - ▁PLAYED - ▁LIN - ▁NATURALLY - ▁DROVE - ▁LOUIS - TIES - ▁BRAND - INESS - RIE - ▁SHOOT - ▁CONSENT - ▁SEATED - ▁LINES - GUE - ▁AGREED - ▁CIRCLE - ▁STIR - ▁STREETS - ▁TASK - ▁RID - ▁PRODUCED - ▁ACCIDENT - ▁WITNESS - ▁LIBERTY - ▁DETAIL - ▁MINISTER - ▁POWERFUL - ▁SAVAGE - ▁SIXTEEN - ▁PRETEND - ▁COAST - ▁SQU - ▁UTTER - ▁NAMED - ▁CLEVER - ▁ADMIT - ▁COUPLE - ▁WICKED - ▁MESSAGE - ▁TEMPLE - ▁STONES - ▁YESTERDAY - ▁HILLS - DAY - ▁SLIGHT - ▁DIAMOND - ▁POSSIBLY - ▁AFFAIR - ▁ORIGINAL - ▁HEARING - ▁WORTHY - ▁SELL - NEY - ICK - ▁COTTAGE - ▁SACRIFICE - ▁PROGRESS - ▁SHOCK - ▁DESIGN - ▁SOUGHT - ▁PIT - ▁SUNDAY - ▁OTHERWISE - ▁CABIN - ▁PRAYER - ▁DWELL - ▁GAIN - ▁BRIDGE - ▁PARTICULARLY - ▁YIELD - ▁TREAT - RIGHT - ▁OAK - ▁ROPE - WIN - ▁ORDERS - ▁SUSPECT - ▁EDWARD - AB - ▁ELEVEN - ▁TEETH - ▁OCCURRED - DDING - ▁AMERICA - ▁FALLING - ▁LION - ▁DEPART - ▁KEEPING - ▁DEMAND - ▁PAUSED - ▁CEASED - INA - ▁FUN - ▁CHEER - ▁PARDON - ▁NATIVE - LUS - LOW - ▁DOGS - ▁REQUIRED - ILITY - ▁ELECT - ▁ENTERTAIN - ITUDE - ▁HUGE - ▁CARRYING - ▁BLU - ▁INSIST - ▁SATISFACTION - ▁HUNT - ▁COUNTENANCE - ▁UPPER - ▁MAIDEN - ▁FAILED - ▁JAMES - ▁FOREIGN - ▁GATHER - ▁TEST - BOARD - ▁TERMS - ▁SILK - ▁BEG - ▁BROTHERS - ▁PAGE - ▁KNEES - ▁SHOWN - ▁PROFESSOR - ▁MIGHTY - ▁DEFI - ▁CHARM - ▁REQUIRE - ▁LOG - MORE - ▁PROOF - ▁POSSESSED - ▁SOFTLY - ▁UNFORTUNATE - ▁PRICE - ▁SEVERE - ▁SINGING - ▁STAGE - ▁FREEDOM - ▁SHOUTED - ▁FARTHER - ▁MAJESTY - ▁PREVIOUS - ▁GUIDE - ▁MATCH - ▁CHEST - ▁INTENDED - ▁BI - ▁EXCITEMENT - ▁OFFICERS - ▁SUR - ▁SHAKE - ▁SENTIMENT - ▁GENTLY - ▁SUCCEEDED - ▁MENTION - ▁LOCK - ▁ACQUAINTANCE - 
▁IMAGINATION - ▁PHYSICAL - ▁LEADING - ▁SLAVE - ▁CART - ▁POINTED - ▁STEAM - ▁SHADE - ▁PIPE - ▁BASE - ▁INVENT - ▁ALAS - ▁WORKED - ▁REGRET - ▁BUR - ▁FAITHFUL - ▁MENTIONED - ▁RECORD - ▁COMPLAIN - ▁SUPERIOR - ▁BAY - ▁PAL - EMENT - UE - ▁SEVENTY - ▁HOTEL - ▁SHEEP - ▁MEAL - ▁ADVICE - ▁HIDDEN - ▁DEMANDED - ▁CONSCIOUS - ▁BROW - ▁POSSESS - ▁FOURTH - ▁EVENTS - ▁FRI - ▁PRAISE - ▁ADVANCED - ▁RESOLVED - ▁STUFF - ▁CHEERFUL - ▁BIRTH - ▁GRIEF - ▁AFFORD - ▁FAIRY - ▁WAKE - ▁SIDES - ▁SUBSTANCE - ▁ARTICLE - ▁LEVEL - ▁MIST - ▁JOINED - ▁PRACTICAL - ▁CLEARLY - ▁TRACE - ▁AWAKE - ▁OBSERVE - ▁BASKET - ▁LACK - VILLE - ▁SPIRITS - ▁EXCITED - ▁ABANDON - ▁SHINING - ▁FULLY - ▁CALLING - ▁CONSIDERABLE - ▁SPRANG - ▁MILE - ▁DOZEN - ▁PEA - ▁DANGEROUS - ▁WIT - ▁JEW - ▁POUNDS - ▁FOX - ▁INFORMATION - ▁LIES - ▁DECK - NNY - ▁PAUL - ▁STARS - ▁ANGER - ▁SETTLE - ▁WILLING - ▁ADAM - ▁FACES - ▁SMITH - ▁IMPORTANCE - ▁STRAIN - WAR - ▁SAM - ▁FEATHER - ▁SERVED - ▁AUTHOR - ▁PERCEIVED - ▁FLAME - ▁DIVINE - ▁TRAIL - ▁ANYBODY - ▁SIGH - ▁DELICATE - KY - ▁FOLD - ▁HAVEN - ▁DESIRED - ▁CURIOSITY - ▁PRACTICE - ▁CONSIDERATION - ▁ABSOLUTELY - ▁CITIZEN - ▁BOTTLE - ▁INTERESTED - ▁MEAT - ▁OCCUPIED - ▁CHOOSE - ▁THROAT - ETTE - ▁CANDLE - ▁DAWN - ▁PROTECT - ▁SENTENCE - IED - ▁ROCKS - ▁PORTION - ▁APPARENTLY - ▁PRESENTED - ▁TIGHT - ▁ACTUALLY - ▁DYING - ▁HAM - ▁DAILY - ▁SUFFERED - ▁POLITICAL - ▁BODIES - ▁MODERN - ▁COMPLETELY - ▁SOONER - TAN - ▁PROP - ▁ADVANCE - ▁REFUSED - ▁FARMER - ▁POLITE - ▁THUNDER - ▁BRIEF - ▁ELSIE - ▁SAILOR - ▁SUGGESTED - ▁PLATE - ▁AID - ▁FLESH - ▁WEEP - ▁BUCK - ▁ANTI - ▁OCEAN - ▁SPEND - WELL - ▁ODD - ▁GOVERNOR - ▁ENTRANCE - ▁SUSPICION - ▁STEPPED - ▁RAPIDLY - ▁CHECK - ▁HIDE - ▁FLIGHT - ▁CLUB - ▁ENTIRE - ▁INDIANS - ASH - ▁CAPITAL - ▁MAMMA - HAR - ▁CORRECT - ▁CRACK - ▁SENSATION - ▁WORST - ▁PACE - ▁MIDST - ▁AUGUST - ▁PROPORTION - ▁INNOCENT - LINESS - ▁REGARDED - ▁DRIVEN - ORD - ▁HASTE - ▁EDUCATION - ▁EMPLOY - ▁TRULY - ▁INSTRUMENT - ▁MAG - ▁FRAME - ▁FOOLISH - ▁TAUGHT - ▁HANG - ▁ARGUMENT - ▁NINETEEN - ▁ELDER - ▁NAY - 
▁NEEDED - ▁NEIGHBOR - ▁INSTRUCT - ▁PAPERS - ▁REWARD - ▁EQUALLY - ▁FIELDS - ▁DIG - HIN - ▁CONDITIONS - JA - ▁SPAR - ▁REQUEST - ▁WORN - ▁REMARKABLE - ▁LOAD - ▁WORSHIP - ▁PARK - ▁KI - ▁INTERRUPTED - ▁SKILL - ▁TERM - LAC - ▁CRITIC - ▁DISTRESS - ▁BELIEF - ▁STERN - IGHT - ▁TRACK - ▁HUNTING - ▁JEWEL - ▁GRADUALLY - ▁GLOW - ▁RUSHED - ▁MENTAL - ▁VISITOR - ▁PICKED - ▁BEHOLD - ▁EXPRESSED - ▁RUB - ▁SKI - ARTAGNAN - ▁MOREOVER - ▁OPERATION - ▁CAREFUL - ▁KEEN - ▁ASSERT - ▁WANDER - ▁ENEMIES - ▁MYSTERIOUS - ▁DEPTH - ▁PREFER - ▁CROSSED - ▁CHARMING - ▁DREAD - ▁FLOUR - ▁ROBIN - ▁TRE - ▁RELIEF - ▁INQUIRED - ▁APPLE - ▁HENCE - ▁WINGS - ▁CHOICE - ▁JUD - OO - ▁SPECIES - ▁DELIGHTED - IUM - ▁RAPID - ▁APPEAL - ▁FAMOUS - ▁USEFUL - ▁HELEN - ▁NEWSPAPER - ▁PLENTY - ▁BEARING - ▁NERVOUS - ▁PARA - ▁URGE - ▁ROAR - ▁WOUNDED - ▁CHAIN - ▁PRODUCE - ▁REFLECTION - ▁MERCHANT - ▁QUARREL - ▁GLORY - ▁BEGUN - ▁BARON - CUS - ▁QUEER - ▁MIX - ▁GAZE - ▁WHISPER - ▁BURIED - ▁DIV - ▁CARD - ▁FREQUENTLY - ▁TIP - ▁KNEE - ▁REGION - ▁ROOT - ▁LEST - ▁JEALOUS - CTOR - ▁SAVED - ▁ASKING - ▁TRIP - QUA - ▁UNION - HY - ▁COMPANIONS - ▁SHIPS - ▁HALE - ▁APPROACHED - ▁HARRY - ▁DRUNK - ▁ARRIVAL - ▁SLEPT - ▁FURNISH - HEAD - ▁PIG - ▁ABSENCE - ▁PHIL - ▁HEAP - ▁SHOES - ▁CONSCIOUSNESS - ▁KINDLY - ▁EVIDENT - ▁SCAR - ▁DETERMIN - ▁GRASP - ▁STEAL - ▁OWE - ▁KNIFE - ▁PRECIOUS - ▁ELEMENT - ▁PROCEEDED - ▁FEVER - ▁LEADER - ▁RISK - ▁EASE - ▁GRIM - ▁MOUNT - ▁MEANWHILE - ▁CENTURY - OON - ▁JUDGMENT - ▁AROSE - ▁VISION - ▁SPARE - ▁EXTREME - ▁CONSTANT - ▁OBSERVATION - ▁THRUST - ▁DELAY - ▁CENT - ▁INCLUD - ▁LIFT - ▁ADMIRE - ▁ISSUE - ▁FRIENDSHIP - ▁LESSON - ▁PRINCIPAL - ▁MOURN - ▁ACCEPTED - ▁BURNING - ▁CAPABLE - ▁EXTRAORDINARY - ▁SANG - ▁REMOVED - ▁HOPED - ▁HORN - ▁ALICE - ▁MUD - ▁APARTMENT - ▁FIGHTING - ▁BLAME - ▁TREMBLING - ▁SOMEBODY - ▁ANYONE - ▁BRIDE - ▁READER - ▁ROB - ▁EVERYWHERE - ▁LABOUR - ▁RECALL - ▁BULL - ▁HIT - ▁COUNCIL - ▁POPULAR - ▁CHAP - ▁TRIAL - ▁DUN - ▁WISHES - ▁BRILLIANT - ▁ASSURED - ▁FORGOT - ▁CONTINUE - ▁ACKNOWLEDG - ▁RETREAT - ▁INCREASED - 
▁CONTEMPT - ▁GRANDFATHER - ▁SYMPATHY - ▁GHOST - ▁STRETCHED - ▁CREATURES - ▁CAB - ▁HIND - ▁PLAYING - ▁MISERABLE - ▁MEMBERS - ▁KINDNESS - ▁HIGHEST - ▁PRIM - ▁KISSED - ▁DESERVE - ▁HUT - ▁BEGGED - ▁EIGHTY - ▁CLOSELY - ▁WONDERED - ▁MILITARY - ▁REMIND - ▁ACCORDINGLY - ▁LARGER - ▁MAINTAIN - ▁ENGINE - ▁MOTIVE - ▁DESTROY - ▁STRIP - ▁HANS - ▁AHEAD - ▁INFINITE - ▁PROMPT - ▁INFORMED - TTLE - ▁PEER - ▁PRESSED - ▁TRAP - ▁SOMEWHERE - ▁BOUGHT - ▁VISIBLE - ▁ASHAMED - ▁TEAR - ▁NEIGHBOUR - ▁CONSTITUTION - ▁INTELLIGENCE - ▁PROFESSION - ▁HUNGRY - RIDGE - ▁SMELL - ▁STORIES - ▁LISTENING - ▁APPROACH - ▁STRING - ▁EXPLANATION - ▁IMMENSE - ▁RELIGIOUS - ▁THROUGHOUT - ▁HOLLOW - ▁AWAIT - ▁FLYING - ▁SCREAM - ▁ACTIVE - ▁RUM - ▁PRODUCT - ▁UNHAPPY - ▁VAGUE - ARIES - ▁ELIZABETH - ▁STUPID - ▁DIGNITY - ▁ISABEL - GAR - ▁BRO - ▁PITCH - ▁COMRADE - ▁STIFF - ▁RECKON - ▁SOLD - ▁SPARK - ▁STRO - ▁CRYING - ▁MAGIC - ▁REPEAT - PORT - ▁MARKED - ▁COMFORTABLE - ▁PROJECT - ▁BECOMING - ▁PARENTS - ▁SHELTER - ▁STOLE - ▁HINT - ▁NEST - ▁TRICK - ▁THOROUGHLY - ▁HOSPITAL - ▁WEAPON - ▁ROME - ▁STYLE - ▁ADMITTED - ▁SAFETY - FIELD - ▁UNDERSTANDING - ▁TREMBLE - ▁PRINT - ▁SLAVES - ▁WEARY - ▁ARTIST - ▁CREDIT - BURG - ▁CONCLUSION - ▁SELDOM - ▁UNUSUAL - ▁CLOUDS - ▁UNABLE - ▁GAY - ▁HANGING - ▁SCR - ▁BOWED - ▁DAVID - ▁VOL - ▁PUSHED - ▁ESCAPED - MOND - ▁WARN - ▁BETRAY - ▁EGGS - ▁PLAINLY - ▁EXHIBIT - ▁DISPLAY - ▁MEMBER - ▁GRIN - ▁PROSPECT - ▁BRUSH - ▁BID - ▁SUCCESSFUL - ▁EXTENT - ▁PERSUADE - ▁MID - ▁MOOD - ▁ARRANGED - ▁UNIVERSAL - ▁JIM - ▁SIGNAL - ▁WHILST - ▁PHILIP - ▁WOLF - RATE - ▁EAGERLY - ▁BILLY - ▁RETURNING - ▁CONSCIENCE - ▁FORTUNATE - ▁FEMALE - ▁GLEAM - ▁HASTILY - ▁PROVIDED - ▁OBTAIN - ▁INSTINCT - ▁CONCERNED - ▁CONCERNING - ▁SOMEHOW - ▁PINK - ▁RAGE - ▁ACCUSTOMED - ▁UNCONSCIOUS - ▁ADVISE - ▁BRANCHES - ▁TINY - ▁REFUSE - ▁BISHOP - ▁SUPPLY - ▁PEASANT - ▁LAWYER - ▁WASTE - ▁CONNECTION - ▁DEVELOP - ▁CORRESPOND - ▁PLUM - ▁NODDED - ▁SLIPPED - ▁EU - ▁CONSTANTLY - CUM - MMED - ▁FAIRLY - HOUSE - ▁KIT - ▁RANG - ▁FEATURES - ▁PAUSE - ▁PAINFUL - 
▁JOE - ▁WHENCE - ▁LAUGHTER - ▁COACH - ▁CHRISTMAS - ▁EATING - ▁WHOLLY - ▁APART - ▁SUPER - ▁REVOLUTION - ▁LONELY - ▁CHEEKS - ▁THRONE - ▁CREW - ▁ATTAIN - ▁ESTABLISHED - TIME - ▁DASH - ▁FRIENDLY - ▁OPERA - ▁EARL - ▁EXHAUST - ▁CLIFF - ▁REVEAL - ▁ADOPT - ▁CENTRE - ▁MERRY - ▁SYLVIA - ▁IDEAL - ▁MISFORTUNE - ▁FEAST - ▁ARAB - ▁NUT - ▁FETCH - ▁FOUGHT - ▁PILE - ▁SETTING - ▁SOURCE - ▁PERSIST - ▁MERCY - ▁BARK - ▁LUC - ▁DEEPLY - ▁COMPARE - ▁ATTITUDE - ▁ENDURE - ▁DELIGHTFUL - ▁BEARD - ▁PATIENCE - ▁LOCAL - ▁UTTERED - ▁VICTORY - ▁TREATED - ▁SEPARATE - ▁WAG - ▁DRAGG - ▁TITLE - ▁TROOPS - ▁TRIUMPH - ▁REAR - ▁GAINED - ▁SINK - ▁DEFEND - ▁TIED - ▁FLED - ▁DARED - ▁INCREASE - ▁POND - ▁CONQUER - ▁FOREHEAD - ▁FAN - ▁ANXIETY - ▁ENCOUNTER - ▁SEX - ▁HALT - ▁SANK - ▁CHEEK - ▁HUMBLE - ▁WRITER - ▁EMPLOYED - ▁DISTINGUISHED - ▁RAISE - ▁WHIP - ▁GIANT - ▁RANGE - ▁OBTAINED - ▁FLAG - ▁MAC - ▁JUMPED - ▁DISCOVERY - ▁NATIONAL - ▁COMMISSION - ▁POSITIVE - ▁LOVING - ▁EXACT - ▁MURMURED - ▁GAZED - ▁REFER - ▁COLLEGE - ▁ENCOURAGE - ▁NOVEL - ▁CLOCK - ▁MORTAL - ▁ROLLED - ▁RAT - IZING - ▁GUILTY - ▁VICTOR - WORTH - ▁PRA - ▁APPROACHING - ▁RELATIVE - ▁ESTATE - ▁UGLY - ▁METAL - ▁ROBERT - ▁TENT - ▁ADMIRATION - ▁FOURTEEN - ▁BARBAR - ▁WITCH - ELLA - ▁CAKE - ▁SHONE - ▁MANAGED - ▁VOLUME - ▁GREEK - ▁DANCING - ▁WRETCHED - ▁CONDEMN - ▁MAGNIFICENT - ▁CONSULT - J - ▁ORGAN - ▁FLEET - ▁ARRANGEMENT - ▁INCIDENT - ▁MISERY - ▁ARROW - ▁STROKE - ▁ASSIST - ▁BUILD - ▁SUCCEED - ▁DESPERATE - ▁WIDOW - UDE - ▁MARKET - ▁WISDOM - ▁PRECISE - ▁CURRENT - ▁SPOIL - ▁BADE - ▁WOODEN - ▁RESIST - ▁OBVIOUS - ▁SENSIBLE - FALL - ▁ADDRESSED - ▁GIL - ▁COUNSEL - ▁PURCHASE - ▁SELECT - ▁USELESS - ▁STARED - ▁ARREST - ▁POISON - ▁FIN - ▁SWALLOW - ▁BLOCK - ▁SLID - ▁NINETY - ▁SPORT - ▁PROVIDE - ▁ANNA - ▁LAMB - ▁INTERVAL - ▁JUMP - ▁DESCRIBED - ▁STRIKING - ▁PROVISION - ▁PROPOSED - ▁MELANCHOLY - ▁WARRIOR - ▁SUGGEST - ▁DEPARTURE - ▁BURDEN - ▁LIMB - ▁TROUBLED - ▁MEADOW - ▁SACRED - ▁SOLID - ▁TRU - ▁LUCY - ▁RECOVER - ▁ENERGY - ▁POWDER - ▁RESUMED - ▁INTENSE - ▁BRITISH - 
▁STRAW - ▁AGREEABLE - ▁EVERYONE - ▁CONCERN - ▁VOYAGE - ▁SOUTHERN - ▁BOSOM - ▁UTTERLY - ▁FEED - ▁ESSENTIAL - ▁CONFINE - ▁HOUSEHOLD - ▁EXTREMELY - ▁WONDERING - ▁LIST - ▁PINE - PHA - ▁EXPERIMENT - ▁JOSEPH - ▁MYSTERY - ▁RESTORE - ▁BLUSH - FOLD - ▁CHOSEN - ▁INTELLECT - ▁CURTAIN - OLOGY - ▁MOUNTED - ▁LAP - ▁EPI - ▁PUNISH - ▁WEDDING - ▁RECOGNIZED - ▁DRIFT - ▁PREPARATION - ▁RESOLUTION - ▁OPPRESS - ▁FIX - ▁VICTIM - OGRAPH - ▁SUMMON - ▁JULIA - ▁FLOOD - ▁WAL - ULATION - ▁SLIGHTLY - ▁LODGE - ▁WIRE - ▁CONFUSION - ▁UNEXPECTED - ▁CONCEIVE - ▁PRIZE - ▁JESUS - ▁ADDITION - ▁RUDE - ▁FATAL - ▁CARELESS - ▁PATCH - ▁KO - ▁CATHERINE - ▁PARLIAMENT - ▁PROFOUND - ▁ALOUD - ▁RELIEVE - ▁PUSH - ABILITY - ▁ACCOMPANIED - ▁SOVEREIGN - ▁SINGULAR - ▁ECHO - ▁COMPOSED - ▁SHAKING - ATORY - ▁ASSISTANCE - ▁TEACHER - ▁HORRIBLE - ▁STRICT - ▁VERSE - ▁PUNISHMENT - ▁GOWN - ▁MISTAKEN - ▁VARI - ▁SWEPT - ▁GESTURE - ▁BUSH - ▁STEEL - ▁AFFECTED - ▁DIRECTED - ▁SURROUNDED - ▁ABSURD - ▁SUGAR - ▁SCRAP - ▁IMMEDIATE - ▁SADDLE - ▁TY - ▁ARISE - ▁SIGHED - ▁EXCHANGE - ▁IMPATIENT - ▁SNAP - ▁EMBRACE - ▁DISEASE - ▁PROFIT - ▁RIDING - ▁RECOVERED - ▁GOVERN - ▁STRETCH - ▁CONVINCED - ▁LEANING - ▁DOMESTIC - ▁COMPLEX - ▁MANIFEST - ▁INDULGE - ▁GENIUS - ▁AGENT - ▁VEIL - ▁DESCRIPTION - ▁INCLINED - ▁DECEIVE - ▁DARLING - ▁REIGN - HU - ▁ENORMOUS - ▁RESTRAIN - ▁DUTIES - BURY - TTERED - ▁POLE - ▁ENABLE - ▁EXCEPTION - ▁INTIMATE - ▁COUNTESS - ▁TRIBE - ▁HANDKERCHIEF - ▁MIDNIGHT - ▁PROBLEM - ▁TRAMP - ▁OIL - CAST - ▁CRUSH - ▁DISCUSS - ▁RAM - ▁TROT - ▁UNRE - ▁WHIRL - ▁LOCKED - ▁HORIZON - ▁OFFICIAL - ▁SCHEME - ▁DROWN - ▁PIERRE - ▁PERMITTED - ▁CONNECTED - ▁ASSURE - ▁COCK - ▁UTMOST - ▁DEVOTED - ▁RELI - ▁SUFFICIENTLY - ▁INTELLECTUAL - ▁CARPET - ▁OBJECTION - ▁AFTERWARD - ▁REALITY - ▁NEGRO - ▁RETAIN - ▁ASCEND - ▁CEASE - ▁KATE - ▁MARVEL - KO - ▁BOND - MOST - ▁COAL - GATE - ▁IGNORANT - ▁BREAKING - ▁TWIN - ▁ASTONISHMENT - ▁COFFEE - ▁JAR - ▁CITIES - ▁ORIGIN - ▁EXECUT - ▁FINAL - ▁INHABITANTS - ▁STABLE - ▁CHIN - ▁PARTIES - ▁PLUNGE - ▁GENEROUS - ▁DESCRIBE - 
▁ANNOUNCED - ▁MERIT - ▁REVERE - ▁ERE - ACIOUS - ZI - ▁DISAPPOINT - ▁SUGGESTION - ▁DOUBTLESS - ▁TRUNK - ▁STAMP - ▁JOB - ▁APPOINTED - ▁DIVIDED - ▁ACQUAINTED - CHI - ▁ABSOLUTE - ▁FEARFUL - ▁PRIVILEGE - ▁CRAFT - ▁STEEP - ▁HUNTER - ▁FORBID - ▁MODEST - ▁ENDEAVOUR - ▁SWEEP - ▁BEHELD - ▁ABSORB - ▁CONSTRUCT - ▁EMPIRE - ▁EXPEDITION - ▁ERECT - ▁OFFEND - ▁INTEND - ▁PERMIT - ▁DESTROYED - ▁CONTRACT - ▁THIRST - ▁WAGON - ▁EVA - ▁GLOOM - ▁ATMOSPHERE - ▁RESERVE - ▁VOTE - ▁GER - ▁NONSENSE - ▁PREVAIL - ▁QUALITY - ▁CLASP - ▁CONCLUDED - ▁RAP - ▁KATY - ▁ETERNAL - ▁MUTTERED - ▁NEGLECT - ▁SQUIRE - ▁CREEP - LOCK - ▁ELECTRIC - ▁HAY - ▁EXPENSE - ▁SCORN - ▁RETIRED - ▁STOUT - ▁MURMUR - ▁SHARPLY - ▁DISTRICT - ▁LEAF - ▁FAILURE - WICK - ▁JEAN - ▁NUMEROUS - ▁INFANT - ▁REALIZED - ▁TRAVELLER - ▁HUNGER - ▁JUNE - ▁MUN - ▁RECOMMEND - ▁CREP - ZZLE - ▁RICHARD - WORK - ▁MONTE - ▁PREACH - ▁PALM - AVI - ▁ANYWHERE - ▁DISPOSITION - ▁MIRROR - ▁VENTURE - ▁POUND - ▁CIGAR - ▁INVITED - ▁BENCH - ▁PROTECTION - ▁BENEFIT - ▁THOMAS - ▁CLERK - ▁REPROACH - ▁UNIFORM - ▁GENERATION - ▁SEAL - ▁COMPASS - ▁WARNING - ▁EXTENDED - ▁DIFFICULTIES - ▁MAYBE - ▁GROAN - ▁AFFECT - ▁COMB - ▁EARN - ▁WESTERN - ▁IDLE - ▁SCORE - ▁TAP - ▁ASTONISHED - ▁INTRODUCED - ▁LEISURE - ▁LIEUTENANT - ▁VIOLENCE - ▁FIRMLY - ▁MONSTER - ▁UR - ▁PROPERLY - ▁TWIST - ▁PIRATE - ▁ROBBER - ▁BATTER - ▁WEPT - ▁LEANED - ▁FOG - ▁ORNAMENT - ▁ANDREW - ▁BUSHES - ▁REPUBLIC - ▁CONFIDENT - ▁LEAN - ▁DART - ▁STOOP - ▁CURL - ▁COUNTER - ▁NORTHERN - ▁PEARL - ▁NEAREST - ▁FRANCIS - ▁WANDERING - ▁FREQUENT - ▁STARTLED - ▁STATEMENT - ▁OCCUR - ▁BLOOM - ▁NERVE - ▁INSPECT - ▁INDUCE - ▁FLATTER - ▁DATE - ▁AMBITION - ▁SLOPE - ▁MALE - ▁MADAM - ▁MONK - ▁RENT - ▁CONFIRM - ▁INVESTIGAT - ▁RABBIT - ▁REGIMENT - ▁SUBMIT - ▁SPELL - ▁FURIOUS - ▁RAIL - ▁BESTOW - ▁RALPH - ▁SCATTERED - ▁COMPELLED - ▁THREAD - ▁CHILL - ▁DENY - ▁PRONOUNC - ▁MANKIND - ▁CATTLE - ▁EXECUTION - ▁REBEL - ▁SUPREME - ▁VALUABLE - ▁LIKEWISE - ▁CONVEY - ▁TIDE - ▁GLOOMY - ▁COIN - ▁ACTUAL - ▁TAX - ▁PROVINCE - ▁GRATEFUL - ▁SPIRITUAL - 
▁VANISHED - ▁DIANA - ▁HAUNT - ▁DRAGON - ▁CRAWL - ▁CHINA - ▁GRATITUDE - ▁NEAT - ▁FINISH - ▁INTENT - ▁FRIGHT - ▁EMBARRASS - ▁THIRTEEN - ▁RUTH - ▁SLIGHTEST - ▁DEVELOPMENT - ▁INTERVIEW - ▁SPECTACLE - ▁BROOK - VIE - ▁WEAKNESS - ▁AUDIENCE - ▁CONSEQUENTLY - ▁ABROAD - ▁ASPECT - ▁PAINTED - ▁RELEASE - ▁INSULT - ▁SOOTH - ▁DISAPPOINTMENT - ▁EMERG - ▁BRIG - ▁ESTEEM - ▁INVITATION - ▁PASSENGER - ▁PUBLISH - ▁PIANO - ▁IRISH - ▁DESK - ▁BEATEN - ▁FIFTH - ▁IMPULSE - ▁SWEAR - ▁EATEN - ▁PURPLE - ▁COMMITTED - ▁COUNTRIES - ▁PERCEIVE - ISON - ▁CELEBRAT - ▁GRANDMOTHER - ▁SHUDDER - ▁SUNSHINE - ▁SPANISH - ▁HITHERTO - ▁MARILLA - ▁SNAKE - ▁MOCK - ▁INTERFERE - ▁WALTER - ▁AMID - ▁MARBLE - ▁MISSION - TERIOR - ▁DRIVING - ▁FURNITURE - ▁STEADY - ▁CIRCUMSTANCE - ▁INTERPRET - ▁ENCHANT - ▁ERROR - ▁CONVICTION - ▁HELPLESS - ▁MEDICINE - ▁QUALITIES - ▁ITALIAN - ▁HASTENED - ▁OCCASIONALLY - ▁PURSUED - ▁HESITATED - ▁INDEPENDENT - ▁OLIVER - ▁LINGER - UX - ▁EXAMINED - ▁REPENT - ▁PHYSICIAN - ▁CHASE - ▁BELOVED - ▁ATTACHED - ▁FLORENCE - ▁HONEY - ▁MOUSE - ▁CRIES - ▁BAKE - ▁POEM - ▁DESTRUCTION - ▁FULFIL - ▁MESSENGER - ▁TRISTRAM - ▁FANCIED - ▁EXCESS - ▁CURSE - ▁CHU - ▁QUANTITY - ▁THORNTON - ▁CREATED - ▁CONTINUALLY - ▁LIGHTNING - ▁BORNE - ▁TOTAL - ▁DISPOSED - ▁RIFLE - ▁POLLY - ▁GOAT - ▁BACKWARD - ▁VIRGINIA - ▁KICK - ▁PERIL - ▁QUO - ▁GLORIOUS - ▁MULTITUDE - ▁LEATHER - ▁ABSENT - ▁DEMON - ▁DEBT - ▁TORTURE - ▁ACCORD - ▁MATE - ▁CATHOLIC - ▁PILL - ▁LIBRARY - ▁PURSUIT - ▁SHIRT - ▁DEAREST - ▁COLLAR - ▁BEACH - ▁ROBE - ▁DECLARE - ▁BRANCH - ▁TEMPT - ▁STEADILY - ▁DISGUST - ▁SILLY - ▁ARRIVE - ▁DRANK - ▁LEVI - ▁COMMUNICAT - ▁RACHEL - ▁WASHINGTON - ▁RESIGN - ▁MEANTIME - ▁LACE - ▁ENGAGEMENT - ▁QUIVER - ▁SEPARATED - ▁DISCUSSION - ▁VENTURED - ▁SURROUNDING - ▁POLISH - ▁NAIL - ▁SWELL - ▁JOKE - ▁LINCOLN - ▁STUDENT - ▁GLITTER - ▁RUSSIAN - ▁READILY - ▁CHRIS - ▁POVERTY - ▁DISGRACE - ▁CHEESE - ▁HEAVILY - ▁SCALE - ▁STAFF - ▁ENTREAT - ▁FAREWELL - ▁LUNCH - ▁PEEP - ▁MULE - ▁SOMEONE - ▁DISAPPEAR - ▁DECISION - ▁PISTOL - ▁PUN - ▁SPUR - ▁ASSUMED - 
▁EXTEND - ▁ENTHUSIASM - ▁DEFINITE - ▁UNDERTAKE - ▁COMMITTEE - ▁SIMON - ▁FENCE - ▁APPLIED - ▁RELATED - ▁VICE - ▁UNPLEASANT - ▁PROBABLE - ▁PROCURE - ▁FROWN - ▁CLOAK - ▁HUMANITY - ▁FAMILIES - ▁PHILOSOPHER - ▁DWARF - ▁OVERCOME - ▁DEFEAT - ▁FASTENED - ▁MARSH - ▁CLASSES - ▁TOMB - ▁GRACIOUS - ▁REMOTE - ▁CELL - ▁SHRIEK - ▁RESCUE - ▁POOL - ▁ORGANIZ - ▁CHOSE - ▁CUTTING - ▁COWARD - ▁BORDER - ▁DIRTY - ▁MONKEY - ▁HOOK - ▁CHUCK - ▁EMILY - ▁JEST - ▁PLAC - ▁WEIGH - ▁ASSOCIATE - ▁GLIMPSE - ▁STUCK - ▁BOLT - ▁MURDERER - ▁PONY - ▁DISTINGUISH - ▁INSTITUTION - ▁CUNNING - ▁COMPLIMENT - ▁APPETITE - ▁REPUTATION - ▁FEEBLE - ▁KIN - ▁SERIES - ▁GRACEFUL - ▁PLATFORM - ▁BREEZE - ▁PHRASE - ▁CLAY - MONT - ▁RATTL - ▁OPPOSITION - ▁LANE - ▁BOAST - ▁GROWTH - ▁INCLINATION - ▁BEHAVE - ▁SUSAN - ▁DISTINCTION - ▁DISLIKE - ▁NICHOLAS - ▁SATISFY - ▁DRAMA - ▁ELBOW - ▁GAZING - ▁CONSUM - ▁SPIN - ▁OATH - ▁CHANNEL - ▁CHARACTERISTIC - ▁SPEAR - ▁SLAIN - ▁SAUCE - ▁FROG - ▁CONCEPTION - ▁TIMID - ▁ZEAL - ▁APPARENT - SHIRE - ▁CENTER - ▁VARIETY - ▁DUSK - ▁APT - ▁COLUMN - ▁REVENGE - ▁RIVAL - ▁IMITAT - ▁PASSIONATE - ▁SELFISH - ▁NORMAN - ▁REPAIR - ▁THRILL - ▁TREATMENT - ▁ROSA - ▁MARTIN - ▁INDIFFERENT - ▁THITHER - ▁GALLANT - ▁PEPPER - ▁RECOLLECT - ▁VINE - ▁SCARCE - ▁SHIELD - ▁MINGLED - CLOSE - ▁HARSH - ▁BRICK - ▁HUMOR - ▁MISCHIEF - ▁TREMENDOUS - ▁FUNCTION - ▁SMART - ▁SULTAN - ▁DISMISS - ▁THREATENED - ▁CHEAP - ▁FLOCK - ▁ENDEAVOR - ▁WHISK - ▁ITALY - ▁WAIST - ▁FLUTTER - ▁SMOKING - ▁MONARCH - ▁AFRICA - ▁ACCUSE - ▁HERBERT - ▁REFRESH - ▁REJOICE - ▁PILLOW - ▁EXPECTATION - ▁POETRY - ▁HOPELESS - ▁PERISH - ▁PHILOSOPHY - ▁WHISTLE - ▁BERNARD - ▁LAMENT - ▁IMPROVE - ▁SUP - ▁PERPLEX - ▁FOUNTAIN - ▁LEAGUE - ▁DESPISE - ▁IGNORANCE - ▁REFERENCE - ▁DUCK - ▁GROVE - ▁PURSE - ▁PARTNER - ▁PROPHET - ▁SHIVER - ▁NEIGHBOURHOOD - ▁REPRESENTATIVE - SAIL - ▁WIP - ▁ACQUIRED - ▁CHIMNEY - ▁DOCTRINE - ▁MAXIM - ▁ANGLE - ▁MAJORITY - ▁AUTUMN - ▁CONFUSED - ▁CRISTO - ▁ACHIEVE - ▁DISGUISE - ▁REDUCED - ▁EARLIER - ▁THEATRE - ▁DECIDE - MINATED - OLOGICAL - ▁OCCUPATION 
- ▁VIGOROUS - ▁CONTINENT - ▁DECLINE - ▁COMMUNITY - ▁MOTIONLESS - ▁HATRED - ▁COMMUNICATION - ▁BOWL - ▁COMMENT - ▁APPROVE - ▁CEREMONY - ▁CRIMINAL - ▁SCIENTIFIC - ▁DUCHESS - ▁VIVID - ▁SHIFT - ▁AVAIL - ▁DAMP - ▁JOHNSON - ▁SLENDER - ▁CONTRAST - ▁AMUSEMENT - ▁PLOT - ▁LYN - ▁ASSOCIATION - ▁SNATCH - ▁UNCERTAIN - ▁PRESSURE - ▁PERCH - ▁APPLY - ▁PLANET - ▁NOTWITHSTANDING - ▁SWUNG - ▁STIRRED - ▁ATTENDANT - ▁ENJOYMENT - ▁WORRY - ▁ALBERT - ▁NAKED - ▁TALENT - ▁MARIAN - ▁REFORM - ▁DELIBERATE - ▁INTELLIGENT - ▁SENSITIVE - ▁YONDER - ▁PUPIL - ▁FRIGHTFUL - ▁DOUBTFUL - ▁STANDARD - ▁MAGISTRATE - ▁SHEPHERD - ▁STOMACH - ▁DEPOSIT - ▁RENEW - ▁HEDGE - ▁FRANCS - ▁POSSIBILITY - ▁RESEMBLE - ▁FATIGUE - ▁PORTRAIT - ▁FAVORITE - ▁CREAM - ▁BURG - ▁SECRETARY - ▁DIVERS - ▁ACTIVITY - ▁SPECULAT - ▁HUMOUR - ▁FITTED - ▁EXTERNAL - ▁CETERA - ▁WRAPPED - ▁WHIT - ▁FRED - ▁EXAMINATION - ▁LODGING - ▁OWING - ▁JAW - ▁CROW - ▁BALANCE - ▁PUFF - ▁TENDERNESS - ▁PORTHOS - ▁ANCHOR - ▁INTERRUPT - ▁NECESSARILY - ▁PERPETUAL - ▁AGONY - ▁POPE - ▁SCHOLAR - ▁SCOTLAND - ▁SUPPRESS - ▁WRATH - ▁WRECK - ▁EXCEED - ▁PERFECTION - ▁INDIA - ▁TRADITION - ▁SECTION - ▁EASTERN - ▁DOORWAY - ▁WIVES - ▁CONVENTION - ▁ANNOUNC - ▁EGYPT - ▁CONTRADICT - ▁SCRATCH - ▁CENTRAL - ▁GLOVE - ▁WAX - ▁PREPARE - ▁ACCOMPANY - ▁INCREASING - ▁LIBERAL - ▁RAISING - ▁ORANGE - ▁SHOE - ▁ATTRIBUTE - ▁LITERATURE - ▁PUZZLED - ▁WITHDRAW - ▁WHITHER - ▁HAWK - ▁MOONLIGHT - ▁EXAMINE - ▁HAPPILY - ▁PRECEDE - ▁DETECTIVE - ▁INCHES - ▁SOLITARY - ▁DUTCH - ▁NAPOLEON - ▁UNEASY - ▁CARDINAL - ▁BLEW - ▁FOWL - ▁DECORAT - ▁CHILDHOOD - ▁TORMENT - ▁LOSING - ▁PERMISSION - ▁BLANK - ▁UPSTAIRS - ▁CAPACITY - ▁TRIFLE - ▁FOLLY - ▁RECOGNIZE - ▁REMOVE - ▁VENGEANCE - ▁ENTERPRISE - ▁BEDROOM - ▁ANYHOW - ▁INQUIRY - ▁ASHES - ▁DRAG - ▁HUSH - ▁AWKWARD - ▁SATURDAY - ▁GENUINE - ▁SURVIV - ▁SKIRT - ▁AFFECTIONATE - ▁TANG - ▁MUTUAL - ▁DISPUTE - ▁EAGLE - ▁INCOME - ▁BIND - ▁FAME - ▁IMPROVEMENT - ROVING - ▁DIFFER - ▁AWOKE - ▁SLEEVE - ▁SOLITUDE - ▁FAVOURITE - JI - ▁DETECT - ▁COMPREHEND - ▁PREPARING - ▁SERPENT - 
▁SUMMIT - ▁KNOT - ▁KNIT - ▁COPY - ▁STOPPING - ▁FADED - ▁HIDEOUS - ▁JULIE - STEAD - ▁SHINE - ▁CONFLICT - ▁PROPOSITION - ▁REFUGE - ▁GALLERY - ▁BUNDLE - ▁AXE - ▁SLAVERY - ▁MASK - ▁ALYOSHA - ▁LADDER - ▁DEPARTMENT - ▁DISCHARGE - ▁DEPRESS - ▁GALLOP - ▁SCARLET - ▁KITTY - ▁RECEIVING - ▁SURRENDER - ▁SUSTAIN - ▁TWILIGHT - ▁CONGRESS - ▁IRELAND - ▁FUNNY - ▁LEND - ▁CONSTITUTE - ▁FUNERAL - ▁CRYSTAL - ▁SPAIN - ▁EXCEEDINGLY - ▁DAMN - ▁COMMUN - ▁CIVILIZATION - ▁PREJUDICE - ▁PORCH - ▁ASSISTANT - ▁INDUSTRY - ▁TUMBLE - ▁DEFENCE - ▁HITHER - ▁SMOT - ▁COLONI - ▁AMAZEMENT - ▁MARGUERITE - ▁MIRACLE - ▁INHERIT - ▁BEGGAR - ▁ENVELOPE - ▁INDIGNATION - ▁NATASHA - ▁PROPOSAL - ▁FRAGMENT - ▁ROUSED - ▁ROAST - ENCIES - ▁COMMENCED - ▁RESOURCE - ▁POPULATION - ▁QUOTH - ▁PURSUE - ▁EDUCAT - ▁AFFLICT - ▁CONTACT - ▁CRIMSON - ▁DIVISION - ▁DISORDER - ▁COPPER - ▁SOLICIT - ▁MODERATE - ▁DRUM - ▁SWIM - ▁SALUTE - ▁ASSUME - ▁MUSCLE - ▁OVERWHELM - ▁SHAKESPEARE - ▁STRUGGLING - ▁TRANQUIL - ▁CHICKEN - ▁TREAD - ▁CLAW - ▁BIBLE - ▁RIDGE - ▁THREAT - ▁VELVET - ▁EXPOSED - ▁IDIOT - ▁BARREL - ▁PENNY - ▁TEMPTATION - ▁DANGLARS - ▁CENTURIES - ▁DISTRIBUT - ▁REJECT - ▁RETORTED - ▁CONCENTRAT - ▁CORDIAL - ▁MOTOR - ▁CANNON - KEEP - ▁WRETCH - ▁ASSURANCE - ▁THIEF - ▁SURVEY - ▁VITAL - ▁RAILWAY - ▁JACKSON - ▁CRASH - ▁GROWL - ▁COMBAT - ▁RECOLLECTION - ▁SECURITY - ▁JACOB - ▁CLUTCH - ▁BLANKET - ▁NANCY - ▁CELLAR - ▁CONVENIENT - ▁INDIGNANT - ▁COARSE - ▁WORM - ▁SCREEN - ▁TRANSPORT - ▁BULLET - ▁APPRECIATE - ▁DEVOTION - ▁INVISIBLE - ▁DRIED - ▁MIXTURE - ▁CANDID - ▁PERFORMANCE - ▁RIPE - ▁EXQUISITE - ▁BARGAIN - ▁TOBACCO - ▁LOYAL - ▁MOULD - ▁ATTENTIVE - ▁DOROTHY - ▁BRUTE - ▁ESTABLISHMENT - ▁ABILITY - ▁INHABIT - ▁OBSCURE - ▁BORROW - ▁ESSENCE - ▁DISMAY - ▁FLEE - ▁BLADE - ▁PLUCK - ▁COFFIN - ▁SUNSET - ▁STEPHEN - ▁ECONOMIC - ▁HOLIDAY - ▁MECHANICAL - ▁COTTON - ▁AWAKENED - ▁SEIZE - ▁RIDICULOUS - ▁SANCHO - ▁HESITATION - ▁CORPSE - ▁SAVING - HOLD - FOOT - ▁ELDEST - ▁DESPITE - ▁EDITH - ▁CHERISH - ▁RESISTANCE - ▁WILSON - ▁ARGUE - ▁INQUIRE - ▁APPREHENSION - 
▁AVENUE - ▁DRAKE - ▁PROPOSE - HURST - ▁INFERIOR - ▁STAIRCASE - ▁WHEREFORE - ▁CARLYLE - ▁COUCH - ▁ROUTE - ▁POLITICS - ▁TOMORROW - ▁THRONG - ▁NAUGHT - ▁SUNLIGHT - ▁INDIFFERENCE - ▁OBEDIENCE - ▁RECEPTION - ▁VEGETABLE - ▁IMPERFECT - ▁RESIDENCE - ▁TURKEY - ▁VIOLET - ▁SARAH - ▁ALTAR - ▁GRIEVE - ▁JERK - ▁ENSU - ▁MAGICIAN - ▁BLOSSOM - ▁LANTERN - ▁RESOLUTE - ▁THOUGHTFULLY - ▁FORTNIGHT - ▁TRUMPET - ▁VALJEAN - ▁UNWILLING - ▁LECTURE - ▁WHEREUPON - ▁HOLLAND - ▁CHANGING - ▁CREEK - ▁SLICE - ▁NORMAL - ▁ANNIE - ▁ACCENT - ▁FREDERICK - ▁DISAGREEABLE - ▁RUBBED - ▁DUMB - ▁ESTABLISH - ▁IMPORT - ▁AFFIRM - ▁MATTHEW - ▁BRISK - ▁CONVERT - ▁BENDING - ▁IVAN - ▁MADEMOISELLE - ▁MICHAEL - ▁EASIER - ▁JONES - ▁FACING - ▁EXCELLENCY - ▁LITERARY - ▁GOSSIP - ▁DEVOUR - ▁STAGGER - ▁PENCIL - ▁AVERAGE - ▁HAMMER - ▁TRIUMPHANT - ▁PREFERRED - ▁APPLICATION - ▁OCCUPY - ▁AUTHORITIES - BURN - ▁ASCERTAIN - ▁CORRIDOR - ▁DELICIOUS - ▁PRACTISE - ▁UNIVERSE - ▁SHILLING - ▁CONTEST - ▁ASHORE - ▁COMMIT - ▁ADMINISTRATION - ▁STUDIED - ▁RIGID - ▁ADORN - ▁ELSEWHERE - ▁INNOCENCE - ▁JOURNAL - ▁LANDSCAPE - ▁TELEGRAPH - ▁ANGRILY - ▁CAMPAIGN - ▁UNJUST - ▁CHALLENGE - ▁TORRENT - ▁RELATE - ▁ASSEMBLED - ▁IMPRESSED - ▁CANOE - ▁CONCLUD - ▁QUIXOTE - ▁SATISFACTORY - ▁NIECE - ▁DEAF - ▁RAFT - ▁JIMMY - ▁GLID - ▁REGULAT - ▁CHATTER - ▁GLACIER - ▁ENVY - ▁STATUE - ▁BOSTON - ▁RICHMOND - ▁DENIED - ▁FANNY - ▁SOLOMON - ▁VULGAR - ▁STALK - ▁REPLACE - ▁SPOON - ▁BASIN - ▁FEATURE - ▁CONVICT - ▁ARCHITECT - ▁ADMIRAL - ▁RIBBON - ▁PERMANENT - ▁APRIL - ▁JOLLY - ▁NEIGHBORHOOD - ▁IMPART - BOROUGH - CAMP - ▁HORRID - ▁IMMORTAL - ▁PRUDENCE - ▁SPANIARD - ▁SUPPOSING - ▁TELEPHONE - ▁TEMPERATURE - ▁PENETRATE - ▁OYSTER - ▁APPOINTMENT - ▁EGYPTIAN - ▁DWELT - ▁NEPHEW - ▁RAILROAD - ▁SEPTEMBER - ▁DEVICE - ▁WHEAT - ▁GILBERT - ▁ELEGANT - ▁ADVERTISE - ▁RATIONAL - ▁TURTLE - ▁BROOD - ▁ASSEMBLY - ▁CULTIVATE - ▁EDITOR - ▁SPECIMEN - ▁UNDOUBTEDLY - ▁WHALE - ▁DROPPING - ▁BALLOON - ▁MEDICAL - COMB - ▁COMPOSITION - ▁FOOTSTEPS - ▁LAUNCELOT - ▁DISCOURSE - ▁ERRAND - ▁CONVERSE - 
▁ADVANCING - ▁DOWNSTAIRS - ▁TUMULT - ▁CORRUPT - ▁SUFFICE - ▁ANGUISH - ▁SHAGGY - ▁RETIRE - ▁TIMBER - ▁BLAZE - ▁ABSTRACT - ▁EMBROIDER - ▁PHOTOGRAPH - ▁PROSPERITY - ▁TERRIBLY - ▁TERRITORY - ▁THRESHOLD - ▁PAVEMENT - ▁INJURED - ▁LIMP - ▁AGITATION - ▁RASCAL - ▁PRESUME - ▁OBSERVING - ▁OBSTACLE - ▁SIMPLICITY - ▁SLUMBER - ▁SUPPLIED - ▁COMBINATION - ▁DRAIN - ▁WILDERNESS - ▁BELIEVING - ▁VILLAIN - ▁RECKLESS - ▁INJURY - ▁CLAPP - ▁FRIDAY - ▁HERCULES - ▁KENNEDY - ▁SYMPTOM - ▁SLEDGE - ▁CEILING - ▁LEMON - ▁PLAGUE - ▁MONDAY - ▁CANVAS - ▁IMPATIENCE - ▁UNCOMFORTABLE - ▁ACCESS - ▁FROZEN - ▁SENATOR - ▁FRANZ - ▁SWIMMING - ▁BARRIER - ▁ADJUST - ▁COMPARISON - ▁PROCLAIM - ▁WRINKL - ▁OVERLOOK - ▁MITYA - ▁GUILT - ▁PERCEPTION - ▁PRECAUTION - ▁SPECTATOR - ▁SURPRISING - ▁DISTRACT - ▁DISDAIN - ▁BONNET - ▁MAGNET - ▁PROFESS - ▁CONFOUND - ▁NARRATIVE - ▁STRUCTURE - ▁SKETCH - ▁ULTIMATE - ▁GLOBE - ▁INSECT - FICIENCY - ▁ORCHARD - ▁AMIABLE - ▁DESCENT - ▁INDEPENDENCE - ▁MANUFACTURE - ▁SPRINKLE - ▁NIGHTINGALE - ▁CUSHION - ▁EMINENT - ▁SCOTT - ▁ARRAY - ▁COSETTE - ▁WAVING - ▁EXTRACT - ▁IRREGULAR - ▁PERSECUT - ▁DERIVED - ▁WITHDREW - ▁CAUTION - ▁SUSPICIOUS - ▁MEMORIES - ▁NOWHERE - ▁SUBTLE - ▁THOROUGH - Q - ▁APPROPRIATE - ▁SLAUGHTER - ▁YOURSELVES - ▁THUMB - ▁TWAS - ▁ABODE - ▁BIDDING - ▁CONSPICUOUS - ▁REBECCA - ▁SERGEANT - ▁APRON - ▁ANTICIPATE - ▁DISCIPLINE - ▁GLANCING - ▁PILGRIM - ▁SULLEN - ▁CONTRIBUTE - ▁PRAIRIE - ▁CARVED - ▁COMMERCE - ▁EXCLAMATION - ▁MUSCULAR - ▁NOVEMBER - ▁PHENOMENA - ▁SYMBOL - ▁UMBRELLA - ▁DIMINISH - ▁PARLOUR - ▁THREATENING - ▁STUMP - ▁EXTENSIVE - ▁PLEASING - ▁REMEMBRANCE - ▁COMBINED - ▁SHERIFF - ▁SHAFT - ▁LAURA - ▁INTERCOURSE - ▁STRICKEN - ▁SUPPLIES - ▁LANDLORD - ▁SHRINK - ▁PRICK - ▁CAESAR - ▁DRUG - ▁BEWILDERED - ▁NAUTILUS - ▁BRUTAL - ▁COMMERCIAL - ▁MAGGIE - ▁SPHERE - ▁VIRGIN - ▁BRETHREN - ▁DESTINY - ▁POLICY - ▁TERRIFIED - ▁HOUSEKEEPER - ▁CRAZY - ▁ARDENT - ▁DISCERN - ▁WRAP - ▁MARQUIS - ▁RUSSIA - MOUTH - ▁BRITAIN - ▁HARBOUR - ▁CONCERT - ▁DONKEY - ▁DAMAGE - ▁SLIM - ABOUT - ▁LUXURY - ▁MONSTROUS 
- ▁TENDENCY - ▁PARADISE - ▁CULTURE - ▁JULIUS - ▁RAOUL - ▁REMEDY - ▁DECAY - ▁SCOLD - ▁SPLIT - ▁ASSAULT - ▁DECEMBER - ▁MOSCOW - ▁EXPLORE - ▁TROUSERS - ▁WRIST - PIECE - ▁MUSKET - ▁VALENTINE - ▁TYRANT - ▁ABRAHAM - ▁MEDIUM - ▁ARTIFICIAL - ▁FACULTY - ▁OBLIGATION - ▁RESEMBLANCE - ▁INQUIRIES - ▁DETAIN - ▁SWARM - ▁PLEDGE - ▁ADMIRABLE - ▁DEFECT - ▁SUPERINTEND - ▁PATRIOT - ▁CLUNG - ▁DISMAL - ▁RECIT - ▁IGNOR - ▁AMELIA - ▁JUSTIFY - ▁ELEPHANT - ▁ESTIMATE - ▁KNELT - ▁SERVING - ▁WHIM - ▁SHRILL - ▁STUDIO - ▁TEXT - ▁ALEXANDER - ▁WROUGHT - ▁ABUNDANT - ▁SITUATED - ▁REGAIN - ▁FIERY - ▁SNEER - ▁SWEAT - ▁GLARE - ▁NIGH - ▁ESCORT - ▁INEVITABLE - ▁PSMITH - ▁RELUCTANT - ▁PRECEDING - ▁RESORT - ▁OUTRAGE - ▁AMBASSADOR - ▁CONSOLATION - ▁RECOGNITION - ▁REMORSE - ▁BEHALF - ▁FORMIDABLE - ▁GRAVITY - ▁DIVIDE - ▁CONFRONT - ▁GIGANTIC - ▁OCTOBER - ▁FLANK - ▁SLEW - ▁CLARA - ▁FILM - ▁BULK - ▁POMP - ▁ELEANOR - ▁EMPHASIS - ▁JAPANESE - ▁CAVALRY - ▁EXCLUSIVE - ▁PERFUME - ▁BRONZE - ▁FEDERAL - ▁LIQUID - ▁RUBBING - ▁OVEN - DOLPH - ▁CONVULS - ▁DEPRIVED - ▁RESPONSIBILITY - ▁SIGNIFICANT - ▁WAISTCOAT - ▁CLUSTER - ▁MARTHA - ▁REVERSE - ▁ATTORNEY - ▁DROOP - ▁SKILFUL - ▁HABITUAL - ▁PUMP - ▁INTERVEN - ▁OWL - ▁CONJECTURE - ▁FANTASTIC - ▁RESPONSIBLE - ▁DESTINED - ▁DOCUMENT - ▁THEREUPON - ▁GODDESS - ▁PACIFIC - ▁WARRANT - ▁COSTUME - ▁BRIDLE - ▁CALIFORNIA - ▁DEMOCRATIC - ▁EUSTACE - ▁SQUIRREL - ▁UNCOMMON - ▁MARVELLOUS - ▁PLOUGH - ▁TRAGEDY - ▁VAULT - ▁HESITATE - ▁REFRAIN - ▁ADMIRING - ▁CORPORAL - ▁ENTITLED - ▁SHREWD - ▁SQUEEZ - ▁ACCURATE - ▁TEMPEST - ▁MONUMENT - ▁SIEGE - ▁CHINESE - ▁RAVEN - ▁LOUNG - ▁ASSASSIN - ▁INFLICT - ▁AGITATED - ▁DESIRABLE - ▁EARLIEST - ▁LAUNCH - ▁PILOT - ▁PULSE - ▁MUTE - LEIGH - ▁LIQUOR - ▁SCARECROW - ▁SKULL - ▁DESOLATE - ▁SUBLIME - ▁SERENE - ▁RECESS - ▁WAKING - ▁CHARLOTTE - ▁CIRCULAR - ▁INJUSTICE - ▁PINOCCHIO - ▁PRISCILLA - ▁THYSELF - ▁OCCURRENCE - ▁CASUAL - ▁FRANTIC - ▁LEGEND - ▁FERTIL - ▁BACKGROUND - ▁DELICACY - ▁ESTRALLA - ▁MANUSCRIPT - ▁RESPONSE - ▁UNIVERSITY - ▁WOLVES - ▁SCANDAL - ▁STUMBLE - ▁HOARSE 
- ▁BODILY - ▁CONVENT - ▁EXAMINING - ▁INCAPABLE - ▁PERCEIVING - ▁PHILADELPHIA - ▁SUBSEQUENT - ▁THIEVES - ▁ACCUMULAT - ▁DAMSEL - ▁SCOTCH - ▁UNDERNEATH - ▁NOBILITY - ▁SMASH - ▁REVOLT - ▁ENGAGE - ▁CATHEDRAL - ▁CHAMPION - ▁DESPATCH - ▁ETERNITY - ▁JANUARY - ▁PLEADED - ▁PROBABILITY - ▁JIMMIE - ▁PARALLEL - ▁FISHERMAN - ▁JERRY - ▁SWORE - ▁DRAUGHT - ▁OPPONENT - ▁PRIMITIVE - ▁SIGNIFICANCE - ▁SUBSTANTIAL - ▁AMAZED - ▁DUNBAR - ▁COMMEND - ▁CONTEMPLATE - ▁TESTIMONY - ▁IMPERIAL - ▁ADAPT - ▁JUICE - ▁CALAMIT - CULAR - ▁CHATEAU - ▁PHOENIX - ▁PRUDENT - ▁SOLUTION - ▁VILLEFORT - ▁REACTION - ▁RELAX - ▁YU - ▁PROHIBIT - ▁DISTRUST - ▁PLUNDER - ▁WELFARE - ▁NAVIGAT - ▁PARLOR - ▁LAZY - ▁DETACH - OMETER - ▁PRIV - ▁DISCOURAGE - ▁OBSTINATE - ▁REJOICING - ▁SERMON - ▁VEHICLE - ▁FANCIES - ▁ENLIGHTEN - ▁ACUTE - ▁ILLUSION - ▁ANTHEA - ▁MARTIAN - ▁EXCITE - ▁GENEROSITY - OLOGIST - ▁AMAZING - ▁UNWORTHY - ▁INTERNAL - ▁INCENSE - ▁VIBRAT - ▁ADHERE - ROACH - ▁FEBRUARY - ▁MEXICAN - ▁POTATOES - ▁INCESSANT - ▁INTERPOSED - ▁PARCEL - ▁VEXED - ▁PROMOTE - MIDST - ▁ARISTOCRAT - ▁CYRIL - ▁EMBARK - ▁ABUNDANCE - ▁LITERALLY - ▁SURGEON - ▁TERRACE - ▁ATLANTIC - ▁MARTYR - ▁SPECK - ▁SENATE - ▁LOAF - ▁ADMINISTER - ▁APPREHEND - ▁SUBDUED - ▁TEMPORARY - ▁DOMINION - ▁ELABORATE - ▁DIGNIFIED - ▁ELIZA - ▁SPLASH - ▁CONSEIL - ▁DEXTER - ▁UNSEEN - ▁TRAGIC - VOCATION - ▁GRATIFY - ▁BACHELOR - ▁DEFENSE - ▁EXCURSION - ▁FACULTIES - ▁PROPRIETOR - ▁SYMPATHETIC - ▁UNNECESSARY - ▁RADIANT - ▁VACANT - ▁OUNCE - ▁SCREW - ▁PHENOMENON - ▁PROMINENT - ▁WORRIED - ▁STUDIES - ▁CLIMATE - ▁KEITH - ▁ARAMIS - ▁BLISS - ▁CONTINUAL - ▁SURPASS - ▁HEBREW - ▁IDENTITY - ▁PROVOKE - ▁TEMPERAMENT - ▁CHARIOT - ▁HARBOR - ▁NINTH - ▁PRIOR - ▁DESIROUS - ▁JERUSALEM - ▁UNDERTAKING - ▁EDISON - ▁MIRTH - ▁SCOUT - ▁APPARATUS - ▁ILLUSTRATION - ▁INTELLIGIBLE - ▁INVARIABLY - ▁PIERCED - ▁REVIEW - ▁FLICKER - ▁HAZARD - ▁REVELATION - ▁DIXON - ▁EXCITING - ▁GOSPEL - ▁CONSTANCE - ▁OVERTAKE - ▁GUINEA - ▁ALADDIN - ▁CHICAGO - ▁TULLIVER - ▁HAMILTON - ▁GARRISON - ▁DISCIPLE - ▁INTENSITY - 
▁TRAITOR - ▁CHANCELLOR - ▁PROVERB - ▁DAGGER - ▁FORESEE - ▁CONFIDE - ▁GLIMMER - ▁CHAUVELIN - ▁ILLUSTRATE - ▁VOLUNTEER - ▁JUNGLE - ▁STREAK - ▁SUNRISE - ▁DISSOLV - ▁QUEST - ▁AWHILE - ▁FELICITY - ▁LEGISLATURE - ▁LEONORA - ▁MAGAZINE - ▁PITIFUL - ▁COLONY - ▁SHAWL - ▁ARRIVING - ▁FUNDAMENTAL - ▁CARPENTER - ▁OVERFLOW - ▁EXPAND - ▁HARVEST - ▁FEMININE - ▁INNUMERABLE - ▁SCRAMBLE - ▁TWENTIETH - ▁TRIFLING - ▁GHASTL - ▁CONQUEST - ▁DANIEL - ▁FACILIT - ▁FORSAKE - ▁BEHAVIOUR - ▁GORGEOUS - ▁PRODUCING - ▁HAPPIER - ▁PROMISING - ▁RAINBOW - ▁INSTINCTIVELY - ▁DECREE - ▁EYEBROWS - ▁IRRESISTIBLE - ▁PHARAOH - ▁SCROOGE - ▁UNNATURAL - ▁CRUMBS - ▁REFINED - ▁DREARY - ▁TRENCH - ▁CONVINCE - ▁FRINGE - ▁EXTREMITY - ▁INTIMACY - ▁SCOUNDREL - ▁SUFFRAGE - ▁UNEASINESS - ▁BARRICADE - ▁CIRCULAT - ▁SAMUEL - ▁BRUCE - ▁DARCY - <sos/eos> init: null input_size: null ctc_conf: dropout_rate: 0.0 ctc_type: builtin reduce: true ignore_nan_grad: true joint_net_conf: joint_space_size: 640 model_conf: ctc_weight: 0.3 report_cer: true report_wer: true use_preprocessor: true token_type: bpe bpemodel: data/en_token_list/bpe_unigram5000/bpe.model non_linguistic_symbols: null cleaner: null g2p: null speech_volume_normalize: null rir_scp: null rir_apply_prob: 1.0 noise_scp: null noise_apply_prob: 1.0 noise_db_range: '13_15' frontend: default frontend_conf: n_fft: 512 hop_length: 160 fs: 16k specaug: specaug specaug_conf: apply_time_warp: true time_warp_window: 5 time_warp_mode: bicubic apply_freq_mask: true freq_mask_width_range: - 0 - 30 num_freq_mask: 2 apply_time_mask: true time_mask_width_range: - 0 - 40 num_time_mask: 2 normalize: global_mvn normalize_conf: stats_file: exp/asr_stats_raw_en_bpe5000_sp/train/feats_stats.npz preencoder: null preencoder_conf: {} encoder: conformer encoder_conf: output_size: 512 attention_heads: 8 linear_units: 2048 num_blocks: 12 dropout_rate: 0.1 positional_dropout_rate: 0.1 attention_dropout_rate: 0.1 input_layer: conv2d normalize_before: true macaron_style: true rel_pos_type: latest 
pos_enc_layer_type: rel_pos selfattention_layer_type: rel_selfattn activation_type: swish use_cnn_module: true cnn_module_kernel: 31 postencoder: null postencoder_conf: {} decoder: transducer decoder_conf: rnn_type: lstm num_layers: 1 hidden_size: 512 dropout: 0.1 dropout_embed: 0.2 required: - output_dir - token_list version: 0.10.7a1 distributed: true ``` </details>
5fe8ebedbdf0f1696fb0ef7f45db27c5
cc-by-4.0
['espnet', 'audio', 'automatic-speech-recognition']
false
Citing ESPnet ```bibtex @inproceedings{watanabe2018espnet, author={Shinji Watanabe and Takaaki Hori and Shigeki Karita and Tomoki Hayashi and Jiro Nishitoba and Yuya Unno and Nelson Yalta and Jahn Heymann and Matthew Wiesner and Nanxin Chen and Adithya Renduchintala and Tsubasa Ochiai}, title={{ESPnet}: End-to-End Speech Processing Toolkit}, year={2018}, booktitle={Proceedings of Interspeech}, pages={2207--2211}, doi={10.21437/Interspeech.2018-1456}, url={http://dx.doi.org/10.21437/Interspeech.2018-1456} } ``` or arXiv: ```bibtex @misc{watanabe2018espnet, title={ESPnet: End-to-End Speech Processing Toolkit}, author={Shinji Watanabe and Takaaki Hori and Shigeki Karita and Tomoki Hayashi and Jiro Nishitoba and Yuya Unno and Nelson Yalta and Jahn Heymann and Matthew Wiesner and Nanxin Chen and Adithya Renduchintala and Tsubasa Ochiai}, year={2018}, eprint={1804.00015}, archivePrefix={arXiv}, primaryClass={cs.CL} } ```
d0eb12a64082e6cda7667fb18267906b
apache-2.0
[]
false
Cross-Encoder for MS MARCO - EN-DE This is a cross-lingual Cross-Encoder model for EN-DE that can be used for passage re-ranking. It was trained on the [MS Marco Passage Ranking](https://github.com/microsoft/MSMARCO-Passage-Ranking) task. The model can be used for Information Retrieval: See [SBERT.net Retrieve & Re-rank](https://www.sbert.net/examples/applications/retrieve_rerank/README.html). The training code is available in this repository, see `train_script.py`.
460de442989ec7de68263b60cb60e475
apache-2.0
[]
false
Usage with SentenceTransformers When you have [SentenceTransformers](https://www.sbert.net/) installed, you can use the model like this: ```python from sentence_transformers import CrossEncoder model = CrossEncoder('model_name', max_length=512) query = 'How many people live in Berlin?' docs = ['Berlin has a population of 3,520,031 registered inhabitants in an area of 891.82 square kilometers.', 'New York City is famous for the Metropolitan Museum of Art.'] pairs = [(query, doc) for doc in docs] scores = model.predict(pairs) ```
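The `scores` array returned by `model.predict` is ordered like `pairs`, so ranking the candidate passages is simply a matter of sorting by score (higher means more relevant). A minimal, self-contained sketch of that step; the score values below are placeholders standing in for actual model output, since they depend on the downloaded model:

```python
# Candidate passages for the query "How many people live in Berlin?"
docs = [
    "Berlin has a population of 3,520,031 registered inhabitants.",
    "New York City is famous for the Metropolitan Museum of Art.",
]
# Hypothetical cross-encoder scores, one per (query, doc) pair,
# standing in for: scores = model.predict(pairs)
scores = [9.2, -4.3]

# Sort passages by descending score to get the re-ranked list.
ranked = sorted(zip(scores, docs), key=lambda pair: pair[0], reverse=True)
for score, doc in ranked:
    print(f"{score:6.2f}  {doc}")
```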
f2cb87118b22f6a2ddb35d2c0108a75c
apache-2.0
[]
false
Usage with Transformers With the transformers library, you can use the model like this: ```python from transformers import AutoTokenizer, AutoModelForSequenceClassification import torch model = AutoModelForSequenceClassification.from_pretrained('model_name') tokenizer = AutoTokenizer.from_pretrained('model_name') features = tokenizer(['How many people live in Berlin?', 'How many people live in Berlin?'], ['Berlin has a population of 3,520,031 registered inhabitants in an area of 891.82 square kilometers.', 'New York City is famous for the Metropolitan Museum of Art.'], padding=True, truncation=True, return_tensors="pt") model.eval() with torch.no_grad(): scores = model(**features).logits print(scores) ```
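The raw logits returned above can be mapped to relevance scores and used to sort the candidate passages. A minimal sketch in plain Python (the logit values below are made-up examples, not actual model outputs):

```python
import math

def rank_by_logits(docs, logits):
    """Sort documents by cross-encoder relevance logits (highest first).
    A sigmoid maps the raw logits to [0, 1] scores; the inputs here are
    hypothetical, not real model outputs."""
    scores = [1.0 / (1.0 + math.exp(-x)) for x in logits]
    return sorted(zip(scores, docs), reverse=True)

docs = ["passage about Berlin's population", "passage about a museum"]
for score, doc in rank_by_logits(docs, [7.2, -4.1]):
    print(f"{score:.3f}  {doc}")
```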
d87d5132f29f054791fe3d89021f0141
apache-2.0
[]
false
Performance The performance was evaluated on three datasets: - **TREC-DL19 EN-EN**: The original [TREC 2019 Deep Learning Track](https://microsoft.github.io/msmarco/TREC-Deep-Learning-2019.html): Given an English query and 1000 documents (retrieved by BM25 lexical search), rank the documents according to their relevance. We compute NDCG@10. BM25 achieves a score of 45.46; a perfect re-ranker can achieve a score of 95.47. - **TREC-DL19 DE-EN**: The English queries of TREC-DL19 have been translated into German by a native speaker. We rank the German queries against the English passages from the original TREC-DL19 setup. We compute NDCG@10. - **GermanDPR DE-DE**: The [GermanDPR](https://www.deepset.ai/germanquad) dataset provides German queries and German passages from Wikipedia. We indexed the 2.8 million paragraphs from German Wikipedia and retrieved the top 100 most relevant passages for each query using BM25 lexical search with Elasticsearch. We compute MRR@10. BM25 achieves a score of 35.85; a perfect re-ranker can achieve a score of 76.27. We also check the performance of bi-encoders using the same evaluation: the documents retrieved by BM25 lexical search are re-ranked using query & passage embeddings with cosine similarity. Bi-encoders can also be used for end-to-end semantic search.
| Model-Name | TREC-DL19 EN-EN | TREC-DL19 DE-EN | GermanDPR DE-DE | Docs / Sec | | ------------- |:-------------:| :-----: | :---: | :----: | | BM25 | 45.46 | - | 35.85 | -| | **Cross-Encoder Re-Rankers** | | | | | [cross-encoder/msmarco-MiniLM-L6-en-de-v1](https://huggingface.co/cross-encoder/msmarco-MiniLM-L6-en-de-v1) | 72.43 | 65.53 | 46.77 | 1600 | | [cross-encoder/msmarco-MiniLM-L12-en-de-v1](https://huggingface.co/cross-encoder/msmarco-MiniLM-L12-en-de-v1) | 72.94 | 66.07 | 49.91 | 900 | | [svalabs/cross-electra-ms-marco-german-uncased](https://huggingface.co/svalabs/cross-electra-ms-marco-german-uncased) (DE only) | - | - | 53.67 | 260 | | [deepset/gbert-base-germandpr-reranking](https://huggingface.co/deepset/gbert-base-germandpr-reranking) (DE only) | - | - | 53.59 | 260 | | **Bi-Encoders (re-ranking)** | | | | | [sentence-transformers/msmarco-distilbert-multilingual-en-de-v2-tmp-lng-aligned](https://huggingface.co/sentence-transformers/msmarco-distilbert-multilingual-en-de-v2-tmp-lng-aligned) | 63.38 | 58.28 | 37.88 | 940 | | [sentence-transformers/msmarco-distilbert-multilingual-en-de-v2-tmp-trained-scratch](https://huggingface.co/sentence-transformers/msmarco-distilbert-multilingual-en-de-v2-tmp-trained-scratch) | 65.51 | 58.69 | 38.32 | 940 | | [svalabs/bi-electra-ms-marco-german-uncased](https://huggingface.co/svalabs/bi-electra-ms-marco-german-uncased) (DE only) | - | - | 34.31 | 450 | | [deepset/gbert-base-germandpr-question_encoder](https://huggingface.co/deepset/gbert-base-germandpr-question_encoder) (DE only) | - | - | 42.55 | 450 | Note: Docs / Sec gives the number of (query, document) pairs we can re-rank within a second on a V100 GPU.
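The two metrics used above, NDCG@10 and MRR@10, can be sketched in plain Python for a single query (the graded relevance lists below are hypothetical examples):

```python
import math

def ndcg_at_k(relevances, k=10):
    """NDCG@k for a single query; `relevances` are graded relevance
    labels in ranked order (index 0 = top-ranked document)."""
    def dcg(rels):
        return sum(rel / math.log2(i + 2) for i, rel in enumerate(rels[:k]))
    ideal = dcg(sorted(relevances, reverse=True))
    return dcg(relevances) / ideal if ideal > 0 else 0.0

def mrr_at_k(relevances, k=10):
    """MRR@k for a single query: reciprocal rank of the first relevant document."""
    for i, rel in enumerate(relevances[:k]):
        if rel > 0:
            return 1.0 / (i + 1)
    return 0.0

# Hypothetical graded labels for the top-5 documents of one query
print(ndcg_at_k([3, 2, 0, 1, 0]))
print(mrr_at_k([0, 0, 1, 0, 0]))  # first relevant document at rank 3 -> 1/3
```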
f5772d6224d5bd825bdddd74096f52a5
apache-2.0
['generated_from_trainer']
false
all-roberta-large-v1-home-8-16-5 This model is a fine-tuned version of [sentence-transformers/all-roberta-large-v1](https://huggingface.co/sentence-transformers/all-roberta-large-v1) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 2.3789 - Accuracy: 0.3356
0ccf58b2104a69aa54726b5fe8735636
creativeml-openrail-m
[]
false
Basic explanation Token and Class words are what guide the AI to produce images similar to the trained style/object/character. Include any mix of these words in the prompt to produce varying results, or exclude them for a less pronounced effect. There is usually at least a slight stylistic effect even without the words, but it is recommended to include at least one. Adding token word/phrase and class word/phrase at the start of the prompt, in that order, produces results most similar to the trained concept, but they can be included elsewhere as well. Some models produce better results when not all token/class words are included.
e2febb228ae84886585a850489fe96e0
creativeml-openrail-m
[]
false
Usage When using this model by itself, it is not necessary to use any keywords, but they will strengthen the style effect. Rossmix produces the best results, while ross-any also works quite well. Ross, based on wd, has more of an illustration feel but works best when mixed with other models. Rossmix and ross-any may work better with clip skip 2, while ross most likely works better with clip skip 1.
9989dbae12db5db14edf313a8a1f455a
creativeml-openrail-m
[]
false
Example images Example images also include 3 extra mixes that include ross or ross-any. Positive: `m_ross, (illustration), (masterpiece), ((best quality)), (ultra-detailed), (official art), ((portrait of a beautiful girl)), upper body`\ Negative: `lowres, bad anatomy, bad hands, text, error, missing fingers, extra digit, fewer digits, cropped, worst quality, low quality, normal quality, jpeg artifacts, signature, watermark, username, blurry, artist name` Ground truth ![base](base.png) Examples ![grid1](img1.png) ![grid2](img2.png) 768x1024 without highres fix ![grid3](img3.png)
4a4c7d07ecaa46ca0fca58df4ade1951
apache-2.0
['generated_from_trainer']
false
distilbert-base-uncased-finetuned-emotion This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the emotion dataset. It achieves the following results on the evaluation set: - Loss: 0.3554 - Accuracy: 0.902 - F1: 0.9001
b43b456b902568ca52ad589bd2f2a083
apache-2.0
['generated_from_trainer']
false
Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | |:-------------:|:-----:|:----:|:---------------:|:--------:|:------:| | 1.0993 | 1.0 | 125 | 0.5742 | 0.8045 | 0.7747 | | 0.4436 | 2.0 | 250 | 0.3554 | 0.902 | 0.9001 |
66b28d179e0c949398d742da005cea9c
apache-2.0
['generated_from_trainer']
false
vit-for-kaggle-mayo-clinic This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.5538 - Accuracy: 0.7616
0bb8d3be41f9780f5104bc93d2b1a940
apache-2.0
['generated_from_trainer']
false
Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0003 - train_batch_size: 64 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 8
5e4bf0cc8746fcf8d00ace5ae396cda2
apache-2.0
['generated_from_trainer']
false
Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | No log | 1.0 | 10 | 0.5944 | 0.7483 | | No log | 2.0 | 20 | 0.5640 | 0.7483 | | No log | 3.0 | 30 | 0.5582 | 0.7483 | | No log | 4.0 | 40 | 0.5585 | 0.7483 | | No log | 5.0 | 50 | 0.5598 | 0.7483 | | No log | 6.0 | 60 | 0.5484 | 0.7483 | | No log | 7.0 | 70 | 0.5524 | 0.7417 | | No log | 8.0 | 80 | 0.5538 | 0.7616 |
6280c55635aec42fe94033f580ae9c12
apache-2.0
['whisper-event', 'generated_from_trainer']
false
Whisper Large v2 Azerbaijani This model is a fine-tuned version of [openai/whisper-large-v2](https://huggingface.co/openai/whisper-large-v2) on the mozilla-foundation/common_voice_11_0 az dataset. It achieves the following results on the evaluation set: - Loss: 0.9435 - Wer: 38.4615
d64e1704e72253c5827045118f97f281
apache-2.0
['whisper-event', 'generated_from_trainer']
false
Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 32 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_steps: 500 - training_steps: 5000 - mixed_precision_training: Native AMP
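The interplay of these settings can be sketched in plain Python: the effective batch size is the product of `train_batch_size` and `gradient_accumulation_steps`, and the linear schedule warms up for 500 steps and then decays to zero (an approximation of transformers' `get_linear_schedule_with_warmup`):

```python
def linear_lr(step, warmup_steps=500, total_steps=5000, base_lr=1e-5):
    """Linear warmup to `base_lr`, then linear decay to zero at `total_steps`."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

# Effective batch size = train_batch_size x gradient_accumulation_steps
effective_batch = 8 * 4
print(effective_batch)  # 32, matching total_train_batch_size above
for step in (0, 500, 2750, 5000):
    print(f"step {step}: lr = {linear_lr(step):.2e}")
```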
3dc9d2e5a2a8f35373dcd09e66b62fda
apache-2.0
['whisper-event', 'generated_from_trainer']
false
Training results | Training Loss | Epoch | Step | Validation Loss | Wer | |:-------------:|:------:|:----:|:---------------:|:-------:| | 0.0 | 999.0 | 1000 | 0.8373 | 39.6450 | | 0.0 | 1999.0 | 2000 | 0.9435 | 38.4615 | | 0.0 | 2999.0 | 3000 | 1.0010 | 43.1953 | | 0.0 | 3999.0 | 4000 | 1.0380 | 44.3787 | | 0.0 | 4999.0 | 5000 | 1.0529 | 43.7870 |
fe82c6bb8baf5b3f941554d3b797d064
apache-2.0
['generated_from_trainer']
false
small-mlm-glue-qnli-target-glue-qqp This model is a fine-tuned version of [muhtasham/small-mlm-glue-qnli](https://huggingface.co/muhtasham/small-mlm-glue-qnli) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.3296 - Accuracy: 0.8511 - F1: 0.8117
88a7b05faf24623e1a8e8aaf8d73e744
apache-2.0
['generated_from_trainer']
false
Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | |:-------------:|:-----:|:----:|:---------------:|:--------:|:------:| | 0.4762 | 0.04 | 500 | 0.4247 | 0.7897 | 0.7473 | | 0.4188 | 0.09 | 1000 | 0.3880 | 0.8126 | 0.7702 | | 0.4011 | 0.13 | 1500 | 0.3760 | 0.8194 | 0.7750 | | 0.387 | 0.18 | 2000 | 0.3779 | 0.8189 | 0.7866 | | 0.3802 | 0.22 | 2500 | 0.3642 | 0.8320 | 0.7958 | | 0.3606 | 0.26 | 3000 | 0.3526 | 0.8358 | 0.7972 | | 0.3604 | 0.31 | 3500 | 0.3337 | 0.8495 | 0.8010 | | 0.3538 | 0.35 | 4000 | 0.3341 | 0.8483 | 0.8102 | | 0.3582 | 0.4 | 4500 | 0.3293 | 0.8503 | 0.8106 | | 0.345 | 0.44 | 5000 | 0.3296 | 0.8511 | 0.8117 |
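The two reported metrics, accuracy and binary F1, can be sketched in plain Python (the predictions and labels below are made-up examples, not model outputs):

```python
def accuracy_and_f1(preds, labels):
    """Accuracy and binary F1 (positive class = 1), the two metrics
    reported in the table above."""
    assert len(preds) == len(labels)
    correct = sum(p == y for p, y in zip(preds, labels))
    tp = sum(p == 1 and y == 1 for p, y in zip(preds, labels))
    fp = sum(p == 1 and y == 0 for p, y in zip(preds, labels))
    fn = sum(p == 0 and y == 1 for p, y in zip(preds, labels))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return correct / len(preds), f1

acc, f1 = accuracy_and_f1(preds=[1, 0, 1, 1, 0], labels=[1, 0, 0, 1, 1])
print(acc, f1)
```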
a8a2d8aa0971b8e9cde30642afe6c693
apache-2.0
['Tensorflow']
false
Tensorpacks Cascade-RCNN with FPN and Group Normalization on ResNext32xd4-50 trained on Pubtabnet for Semantic Segmentation of tables. The model and its training code have been mainly taken from: [Tensorpack](https://github.com/tensorpack/tensorpack/tree/master/examples/FasterRCNN) . Regarding the dataset, please check: [Xu Zhong et al. - Image-based table recognition: data, model, and evaluation](https://arxiv.org/abs/1911.10683). The model has been trained to detect rows and columns in tables. As row and column bounding boxes are not a priori part of the annotations, they are calculated from the bounding boxes of the cells and the intrinsic structure of the enclosed HTML. The code has been adapted so that it can be used in a **deep**doctection pipeline.
57b952cbb6512e8f73008db454dd2198
apache-2.0
['Tensorflow']
false
This is an inference model only To reduce the size of the checkpoint, we removed all variables that are not necessary for inference. It therefore cannot be used for fine-tuning. To fine-tune this model, please check this [model](https://huggingface.co/deepdoctection/tp_casc_rcnn_X_32xd4_50_FPN_GN_2FC_pubtabnet_rc).
bd0695692260d3a7269350dbe0968835
apache-2.0
['Tensorflow']
false
How this model was trained. To recreate the model run on the **deep**doctection framework, run: ```python >>> import os >>> from deep_doctection.datasets import DatasetRegistry >>> from deep_doctection.eval import MetricRegistry >>> from deep_doctection.utils import get_configs_dir_path >>> from deep_doctection.train import train_faster_rcnn pubtabnet = DatasetRegistry.get_dataset("pubtabnet") pubtabnet.dataflow.categories.set_cat_to_sub_cat({"ITEM":"row_col"}) pubtabnet.dataflow.categories.filter_categories(categories=["ROW","COLUMN"]) path_config_yaml=os.path.join(get_configs_dir_path(),"tp/rows/conf_frcnn_rows.yaml") path_weights = "" dataset_train = pubtabnet config_overwrite=["TRAIN.STEPS_PER_EPOCH=500","TRAIN.STARTING_EPOCH=1", "TRAIN.CHECKPOINT_PERIOD=50"] build_train_config=["max_datapoints=500000","rows_and_cols=True"] dataset_val = pubtabnet build_val_config = ["max_datapoints=2000","rows_and_cols=True"] coco_metric = MetricRegistry.get_metric("coco") coco_metric.set_params(max_detections=[50,200,600], area_range=[[0,1000000],[0,200],[200,800],[800,1000000]]) train_faster_rcnn(path_config_yaml=path_config_yaml, dataset_train=dataset_train, path_weights=path_weights, config_overwrite=config_overwrite, log_dir="/path/to/dir", build_train_config=build_train_config, dataset_val=dataset_val, build_val_config=build_val_config, metric=coco_metric, pipeline_component_name="ImageLayoutService" ) ```
eb144d6d863aaca8bd0434a1e56acc67
cc-by-4.0
['espnet', 'audio', 'automatic-speech-recognition']
false
Demo: How to use in ESPnet2 Follow the [ESPnet installation instructions](https://espnet.github.io/espnet/installation.html) if you haven't done that already. ```bash cd espnet git checkout 28695114f2771ac3d2a9cc0b5fb30a2c3262e49a pip install -e . cd egs2/librimix/asr1 ./run.sh --skip_data_prep false --skip_train true --download_model espnet/simpleoier_librimix_asr_train_asr_transformer_multispkr_raw_en_char_sp ``` <!-- Generated by scripts/utils/show_asr_result.sh -->
6d3e47101561e670154df5458d394ae4
cc-by-4.0
['espnet', 'audio', 'automatic-speech-recognition']
false
Environments - date: `Thu Nov 10 14:58:09 EST 2022` - python version: `3.9.13 (main, Aug 25 2022, 23:26:10) [GCC 11.2.0]` - espnet version: `espnet 202209` - pytorch version: `pytorch 1.12.1` - Git hash: `b3c185d5d707bb385b74f42df2cc59bcf7d7e754` - Commit date: `Wed Nov 9 22:00:30 2022 -0500`
46256ed6bacb9277f264ca745a37bcaf
cc-by-4.0
['espnet', 'audio', 'automatic-speech-recognition']
false
WER |dataset|Snt|Wrd|Corr|Sub|Del|Ins|Err|S.Err| |---|---|---|---|---|---|---|---|---| |decode_multi_asrtrue_lm_lm_train_lm_transformer_en_char_valid.loss.ave_asr_model_valid.acc.ave/test|6000|111243|80.4|17.4|2.2|3.8|23.5|88.0|
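WER is the word-level Levenshtein distance between hypothesis and reference, divided by the number of reference words; CER is the same computation at the character level. A minimal sketch (the sentences below are made-up examples):

```python
def wer(reference, hypothesis):
    """Word error rate in percent: word-level edit distance divided by
    the number of reference words."""
    ref, hyp = reference.split(), hypothesis.split()
    # Dynamic-programming edit distance over words
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = d[i - 1][j - 1] + (ref[i - 1] != hyp[j - 1])
            d[i][j] = min(sub, d[i - 1][j] + 1, d[i][j - 1] + 1)
    return 100.0 * d[len(ref)][len(hyp)] / len(ref)

print(wer("the cat sat on the mat", "the cat sat on mat"))  # one deletion
```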
5925b81e8aa5ab55e09b5324ad039ac3
cc-by-4.0
['espnet', 'audio', 'automatic-speech-recognition']
false
CER |dataset|Snt|Wrd|Corr|Sub|Del|Ins|Err|S.Err| |---|---|---|---|---|---|---|---|---| |decode_multi_asrtrue_lm_lm_train_lm_transformer_en_char_valid.loss.ave_asr_model_valid.acc.ave/test|6000|590408|90.5|6.1|3.5|3.9|13.5|88.0|
04f2c667389e2188fa37e25e1b625d96
cc-by-4.0
['espnet', 'audio', 'automatic-speech-recognition']
false
ASR config <details><summary>expand</summary> ``` config: conf/tuning/train_asr_transformer_multispkr.yaml print_config: false log_level: INFO dry_run: false iterator_type: sequence output_dir: exp/asr_train_asr_transformer_multispkr_raw_en_char_sp ngpu: 1 seed: 0 num_workers: 1 num_att_plot: 3 dist_backend: nccl dist_init_method: env:// dist_world_size: null dist_rank: null local_rank: 0 dist_master_addr: null dist_master_port: null dist_launcher: null multiprocessing_distributed: false unused_parameters: false sharded_ddp: false cudnn_enabled: true cudnn_benchmark: false cudnn_deterministic: true collect_stats: false write_collected_feats: false max_epoch: 45 patience: null val_scheduler_criterion: - valid - loss early_stopping_criterion: - valid - loss - min best_model_criterion: - - valid - acc - max keep_nbest_models: 10 nbest_averaging_interval: 0 grad_clip: 5.0 grad_clip_type: 2.0 grad_noise: false accum_grad: 1 no_forward_run: false resume: true train_dtype: float32 use_amp: false log_interval: null use_matplotlib: true use_tensorboard: true create_graph_in_tensorboard: false use_wandb: false wandb_project: null wandb_id: null wandb_entity: null wandb_name: null wandb_model_log_interval: -1 detect_anomaly: false pretrain_path: null init_param: [] ignore_init_mismatch: false freeze_param: [] num_iters_per_epoch: null batch_size: 20 valid_batch_size: null batch_bins: 5000000 valid_batch_bins: null train_shape_file: - exp/asr_stats_raw_en_char_sp/train/speech_shape - exp/asr_stats_raw_en_char_sp/train/text_shape.char - exp/asr_stats_raw_en_char_sp/train/text_spk2_shape.char valid_shape_file: - exp/asr_stats_raw_en_char_sp/valid/speech_shape - exp/asr_stats_raw_en_char_sp/valid/text_shape.char - exp/asr_stats_raw_en_char_sp/valid/text_spk2_shape.char batch_type: numel valid_batch_type: null fold_length: - 80000 - 150 - 150 sort_in_batch: descending sort_batch: descending multiple_iterator: false chunk_length: 500 chunk_shift_ratio: 0.5 num_cache_chunks: 1024 
train_data_path_and_name_and_type: - - dump/raw/train_sp/wav.scp - speech - sound - - dump/raw/train_sp/text_spk1 - text - text - - dump/raw/train_sp/text_spk2 - text_spk2 - text valid_data_path_and_name_and_type: - - dump/raw/dev/wav.scp - speech - sound - - dump/raw/dev/text_spk1 - text - text - - dump/raw/dev/text_spk2 - text_spk2 - text allow_variable_data_keys: false max_cache_size: 0.0 max_cache_fd: 32 valid_max_cache_size: null optim: adam optim_conf: lr: 0.001 scheduler: warmuplr scheduler_conf: warmup_steps: 25000 token_list: - <blank> - <unk> - <space> - E - T - A - O - N - I - H - S - R - D - L - U - M - C - W - F - G - Y - P - B - V - K - '''' - X - J - Q - Z - <sos/eos> init: xavier_uniform input_size: null ctc_conf: reduce: false joint_net_conf: null use_preprocessor: true token_type: char bpemodel: null non_linguistic_symbols: null cleaner: null g2p: null speech_volume_normalize: null rir_scp: null rir_apply_prob: 1.0 noise_scp: null noise_apply_prob: 1.0 noise_db_range: '13_15' short_noise_thres: 0.5 frontend: default frontend_conf: fs: 16k specaug: null specaug_conf: {} normalize: global_mvn normalize_conf: stats_file: exp/asr_stats_raw_en_char_sp/train/feats_stats.npz model: pit_espnet model_conf: ctc_weight: 0.2 lsm_weight: 0.1 length_normalized_loss: false num_inf: 2 num_ref: 2 preencoder: null preencoder_conf: {} encoder: transformer_multispkr encoder_conf: output_size: 256 attention_heads: 4 linear_units: 2048 num_blocks: 8 num_blocks_sd: 4 dropout_rate: 0.1 positional_dropout_rate: 0.1 attention_dropout_rate: 0.1 input_layer: conv2d normalize_before: true num_inf: 2 postencoder: null postencoder_conf: {} decoder: transformer decoder_conf: attention_heads: 4 linear_units: 2048 num_blocks: 6 dropout_rate: 0.1 positional_dropout_rate: 0.1 self_attention_dropout_rate: 0.1 src_attention_dropout_rate: 0.1 preprocessor: multi preprocessor_conf: text_name: - text - text_spk2 required: - output_dir - token_list version: '202209' distributed: false ``` 
</details>
7b5f0b1e06822200fcb76cb5aa6997c5