| modelId (string, 4–111 chars) | lastModified (string, 24 chars) | tags (list) | pipeline_tag (string, 5–30 chars, nullable) | author (string, 2–34 chars, nullable) | config (null) | securityStatus (null) | id (string, 4–111 chars) | likes (int64, 0–9.53k) | downloads (int64, 2–73.6M) | library_name (string, 2–84 chars, nullable) | created (timestamp[us]) | card (string, 101–901k chars) | card_len (int64, 101–901k) | embeddings (list) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
jangmin/whisper-small-ko-1159h | 2023-05-05T10:13:37.000Z | [
"transformers",
"pytorch",
"tensorboard",
"whisper",
"automatic-speech-recognition",
"generated_from_trainer",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | automatic-speech-recognition | jangmin | null | null | jangmin/whisper-small-ko-1159h | 0 | 2 | transformers | 2023-05-04T22:44:43 | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: whisper-small-ko-1159h
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# whisper-small-ko-1159h
This model is a fine-tuned version of [openai/whisper-small](https://huggingface.co/openai/whisper-small) on Korean speech data from AI-HUB (see "Training and evaluation data" below).
It achieves the following results on the evaluation set:
- Loss: 0.1752
- Wer: 10.4449
## Model description
The model was trained to transcribe audio sources into Korean text.
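A minimal inference sketch, assuming the checkpoint is public on the Hub and that `transformers`, `torch`, and `ffmpeg` are installed for audio decoding:
```python
from transformers import pipeline

# Load the fine-tuned Whisper checkpoint from the Hub
asr = pipeline("automatic-speech-recognition", model="jangmin/whisper-small-ko-1159h")

# "sample.wav" is a placeholder path to a Korean speech recording
print(asr("sample.wav")["text"])
```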
## Intended uses & limitations
More information needed
## Training and evaluation data
I downloaded all data from AI-HUB (https://aihub.or.kr/). Two datasets, in particular, caught my attention: "Instruction Audio Set" and "Noisy Conversation Audio Set".
I gathered 796 hours of audio from the first dataset and 363 hours from the second (these figures cover the training data only and exclude the validation data).
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- training_steps: 18483
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:-----:|:---------------:|:-------:|
| 0.0953 | 0.33 | 2053 | 0.2155 | 13.0432 |
| 0.0803 | 0.67 | 4106 | 0.1951 | 12.0399 |
| 0.0746 | 1.0 | 6159 | 0.1836 | 11.3995 |
| 0.0509 | 1.33 | 8212 | 0.1819 | 11.0396 |
| 0.0525 | 1.67 | 10265 | 0.1782 | 10.9039 |
| 0.0493 | 2.0 | 12318 | 0.1743 | 10.7255 |
| 0.034 | 2.33 | 14371 | 0.1784 | 10.7377 |
| 0.0326 | 2.67 | 16424 | 0.1765 | 10.5471 |
| 0.0293 | 3.0 | 18477 | 0.1752 | 10.4449 |
### Framework versions
- Transformers 4.28.0.dev0
- Pytorch 1.13.1+cu117
- Datasets 2.11.0
- Tokenizers 0.13.2
| 2,329 | [
[
-0.03509521484375,
-0.045501708984375,
0.01029205322265625,
0.0011148452758789062,
-0.01837158203125,
-0.031829833984375,
-0.0208282470703125,
-0.02740478515625,
0.0153961181640625,
0.0224609375,
-0.05450439453125,
-0.04644775390625,
-0.042724609375,
-0.0129... |
qunfengd/distilbert-base-uncased-finetuned-emotion | 2023-05-05T02:12:20.000Z | [
"transformers",
"pytorch",
"tensorboard",
"distilbert",
"text-classification",
"generated_from_trainer",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | text-classification | qunfengd | null | null | qunfengd/distilbert-base-uncased-finetuned-emotion | 0 | 2 | transformers | 2023-05-05T01:44:51 | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
- f1
model-index:
- name: distilbert-base-uncased-finetuned-emotion
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-emotion
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2217
- Accuracy: 0.922
- F1: 0.9221
## Model description
More information needed
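A minimal inference sketch, assuming the standard `text-classification` pipeline applies to this checkpoint:
```python
from transformers import pipeline

# Load the fine-tuned emotion classifier from the Hub
classifier = pipeline("text-classification", model="qunfengd/distilbert-base-uncased-finetuned-emotion")

# The label set depends on the (unspecified) training dataset
print(classifier("I am so happy today!"))
```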
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| 0.8167 | 1.0 | 250 | 0.3190 | 0.906 | 0.9039 |
| 0.2442 | 2.0 | 500 | 0.2217 | 0.922 | 0.9221 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Tokenizers 0.13.3
| 1,485 | [
[
-0.03887939453125,
-0.04425048828125,
0.0207977294921875,
0.0257415771484375,
-0.02850341796875,
-0.0195770263671875,
-0.013885498046875,
-0.007205963134765625,
0.00958251953125,
0.0077056884765625,
-0.056884765625,
-0.05035400390625,
-0.061553955078125,
-0.... |
Ramya2300/autotrain-final-sentiment-analysis-55566129341 | 2023-05-05T02:15:26.000Z | [
"transformers",
"pytorch",
"bert",
"text-classification",
"autotrain",
"unk",
"dataset:Ramya2300/autotrain-data-final-sentiment-analysis",
"co2_eq_emissions",
"endpoints_compatible",
"region:us"
] | text-classification | Ramya2300 | null | null | Ramya2300/autotrain-final-sentiment-analysis-55566129341 | 0 | 2 | transformers | 2023-05-05T02:09:52 | ---
tags:
- autotrain
- text-classification
language:
- unk
widget:
- text: "I love AutoTrain 🤗"
datasets:
- Ramya2300/autotrain-data-final-sentiment-analysis
co2_eq_emissions:
emissions: 2.1068707556976243
---
# Model Trained Using AutoTrain
- Problem type: Multi-class Classification
- Model ID: 55566129341
- CO2 Emissions (in grams): 2.1069
## Validation Metrics
- Loss: 0.652
- Accuracy: 0.780
- Macro F1: 0.761
- Micro F1: 0.780
- Weighted F1: 0.780
- Macro Precision: 0.759
- Micro Precision: 0.780
- Weighted Precision: 0.781
- Macro Recall: 0.763
- Micro Recall: 0.780
- Weighted Recall: 0.780
## Usage
You can use cURL to access this model:
```bash
$ curl -X POST -H "Authorization: Bearer YOUR_API_KEY" -H "Content-Type: application/json" -d '{"inputs": "I love AutoTrain"}' https://api-inference.huggingface.co/models/Ramya2300/autotrain-final-sentiment-analysis-55566129341
```
Or use the Python API:
```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer
import torch

# use_auth_token is only needed for private repos
model = AutoModelForSequenceClassification.from_pretrained("Ramya2300/autotrain-final-sentiment-analysis-55566129341", use_auth_token=True)
tokenizer = AutoTokenizer.from_pretrained("Ramya2300/autotrain-final-sentiment-analysis-55566129341", use_auth_token=True)
inputs = tokenizer("I love AutoTrain", return_tensors="pt")
outputs = model(**inputs)
pred_id = outputs.logits.argmax(dim=-1).item()  # index of the predicted class
print(model.config.id2label[pred_id])           # human-readable label
``` | 1,349 | [
[
-0.037384033203125,
-0.029052734375,
0.0086517333984375,
0.0211334228515625,
-0.00815582275390625,
0.00672149658203125,
-0.01000213623046875,
-0.01129150390625,
0.000720977783203125,
0.007160186767578125,
-0.051544189453125,
-0.037994384765625,
-0.06130981445312... |
ellucas/Detector-de-enfermedades-en-frejol | 2023-05-05T06:29:44.000Z | [
"transformers",
"pytorch",
"tensorboard",
"vit",
"image-classification",
"generated_from_trainer",
"dataset:beans",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | image-classification | ellucas | null | null | ellucas/Detector-de-enfermedades-en-frejol | 0 | 2 | transformers | 2023-05-05T06:25:54 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- beans
metrics:
- accuracy
model-index:
- name: Detector-de-enfermedades-en-frejol
results:
- task:
name: Image Classification
type: image-classification
dataset:
name: beans
type: beans
config: default
split: validation
args: default
metrics:
- name: Accuracy
type: accuracy
value: 1.0
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Detector-de-enfermedades-en-frejol
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the beans dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0057
- Accuracy: 1.0
## Model description
More information needed
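A minimal inference sketch, assuming `transformers` and `Pillow` are installed:
```python
from transformers import pipeline

# Load the fine-tuned ViT bean-disease classifier from the Hub
classifier = pipeline("image-classification", model="ellucas/Detector-de-enfermedades-en-frejol")

# "leaf.jpg" is a placeholder path to a bean-leaf photo
print(classifier("leaf.jpg"))
```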
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 4
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.0638 | 3.85 | 500 | 0.0057 | 1.0 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,667 | [
[
-0.0280914306640625,
-0.054290771484375,
0.022308349609375,
0.0232696533203125,
-0.02459716796875,
-0.03302001953125,
-0.01245880126953125,
-0.0211639404296875,
0.01071929931640625,
0.022308349609375,
-0.0404052734375,
-0.040802001953125,
-0.055145263671875,
... |
Neomedallion/dqn-SpaceInvadersNoFrameskip-v4 | 2023-05-05T07:14:43.000Z | [
"stable-baselines3",
"SpaceInvadersNoFrameskip-v4",
"deep-reinforcement-learning",
"reinforcement-learning",
"model-index",
"region:us"
] | reinforcement-learning | Neomedallion | null | null | Neomedallion/dqn-SpaceInvadersNoFrameskip-v4 | 0 | 2 | stable-baselines3 | 2023-05-05T07:14:06 | ---
library_name: stable-baselines3
tags:
- SpaceInvadersNoFrameskip-v4
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: DQN
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: SpaceInvadersNoFrameskip-v4
type: SpaceInvadersNoFrameskip-v4
metrics:
- type: mean_reward
value: 557.50 +/- 99.66
name: mean_reward
verified: false
---
# **DQN** Agent playing **SpaceInvadersNoFrameskip-v4**
This is a trained model of a **DQN** agent playing **SpaceInvadersNoFrameskip-v4**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3)
and the [RL Zoo](https://github.com/DLR-RM/rl-baselines3-zoo).
The RL Zoo is a training framework for Stable Baselines3
reinforcement learning agents,
with hyperparameter optimization and pre-trained agents included.
## Usage (with SB3 RL Zoo)
RL Zoo: https://github.com/DLR-RM/rl-baselines3-zoo<br/>
SB3: https://github.com/DLR-RM/stable-baselines3<br/>
SB3 Contrib: https://github.com/Stable-Baselines-Team/stable-baselines3-contrib
Install the RL Zoo (with SB3 and SB3-Contrib):
```bash
pip install rl_zoo3
```
```bash
# Download model and save it into the logs/ folder
python -m rl_zoo3.load_from_hub --algo dqn --env SpaceInvadersNoFrameskip-v4 -orga Neomedallion -f logs/
python -m rl_zoo3.enjoy --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/
```
If you installed the RL Zoo3 via pip (`pip install rl_zoo3`), you can run the following from anywhere:
```bash
python -m rl_zoo3.load_from_hub --algo dqn --env SpaceInvadersNoFrameskip-v4 -orga Neomedallion -f logs/
python -m rl_zoo3.enjoy --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/
```
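The downloaded policy can also be loaded directly with SB3, outside the zoo scripts; a sketch, where the zip path is hypothetical and depends on where `load_from_hub` saved the model:
```python
from stable_baselines3 import DQN

# Hypothetical path -- adjust to the folder created by rl_zoo3.load_from_hub
model = DQN.load("logs/dqn/SpaceInvadersNoFrameskip-v4_1/SpaceInvadersNoFrameskip-v4.zip")
print(model.policy)
```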
## Training (with the RL Zoo)
```bash
python -m rl_zoo3.train --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/
# Upload the model and generate video (when possible)
python -m rl_zoo3.push_to_hub --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/ -orga Neomedallion
```
## Hyperparameters
```python
OrderedDict([('batch_size', 32),
('buffer_size', 100000),
('env_wrapper',
['stable_baselines3.common.atari_wrappers.AtariWrapper']),
('exploration_final_eps', 0.01),
('exploration_fraction', 0.1),
('frame_stack', 4),
('gradient_steps', 1),
('learning_rate', 0.0001),
('learning_starts', 100000),
('n_timesteps', 1000000.0),
('optimize_memory_usage', False),
('policy', 'CnnPolicy'),
('target_update_interval', 1000),
('train_freq', 4),
('normalize', False)])
```
| 2,702 | [
[
-0.03704833984375,
-0.03729248046875,
0.0273590087890625,
0.0203704833984375,
-0.01084136962890625,
-0.01525115966796875,
0.01348876953125,
-0.0186614990234375,
0.0238494873046875,
0.0281219482421875,
-0.06903076171875,
-0.0404052734375,
-0.0253753662109375,
... |
mnavas/roberta-finetuned-WebClassification-v2-smalllinguaENES | 2023-05-05T10:38:32.000Z | [
"transformers",
"pytorch",
"roberta",
"text-classification",
"generated_from_trainer",
"license:mit",
"endpoints_compatible",
"region:us"
] | text-classification | mnavas | null | null | mnavas/roberta-finetuned-WebClassification-v2-smalllinguaENES | 0 | 2 | transformers | 2023-05-05T07:32:56 | ---
license: mit
tags:
- generated_from_trainer
metrics:
- accuracy
- f1
- precision
- recall
model-index:
- name: roberta-finetuned-WebClassification-v2-smalllinguaENES
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# roberta-finetuned-WebClassification-v2-smalllinguaENES
This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 1.0053
- Accuracy: 0.9355
- F1: 0.9355
- Precision: 0.9355
- Recall: 0.9355
## Model description
More information needed
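A minimal inference sketch using the explicit model API (the class labels come from the model's saved config, which this card does not document):
```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

tokenizer = AutoTokenizer.from_pretrained("mnavas/roberta-finetuned-WebClassification-v2-smalllinguaENES")
model = AutoModelForSequenceClassification.from_pretrained("mnavas/roberta-finetuned-WebClassification-v2-smalllinguaENES")

# Example web-page snippet; the xlm-roberta base accepts English or Spanish input
inputs = tokenizer("Online store selling handmade leather shoes.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(model.config.id2label[logits.argmax(dim=-1).item()])
```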
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|:---------:|:------:|
| No log | 1.0 | 16 | 2.4058 | 0.1613 | 0.1613 | 0.1613 | 0.1613 |
| No log | 2.0 | 32 | 2.3931 | 0.0968 | 0.0968 | 0.0968 | 0.0968 |
| No log | 3.0 | 48 | 1.9594 | 0.4516 | 0.4516 | 0.4516 | 0.4516 |
| No log | 4.0 | 64 | 1.7428 | 0.6129 | 0.6129 | 0.6129 | 0.6129 |
| No log | 5.0 | 80 | 1.3781 | 0.8387 | 0.8387 | 0.8387 | 0.8387 |
| No log | 6.0 | 96 | 1.0053 | 0.9355 | 0.9355 | 0.9355 | 0.9355 |
| No log | 7.0 | 112 | 0.8489 | 0.8387 | 0.8387 | 0.8387 | 0.8387 |
| No log | 8.0 | 128 | 0.7135 | 0.8710 | 0.8710 | 0.8710 | 0.8710 |
| No log | 9.0 | 144 | 0.6700 | 0.8710 | 0.8710 | 0.8710 | 0.8710 |
| No log | 10.0 | 160 | 0.6511 | 0.9355 | 0.9355 | 0.9355 | 0.9355 |
### Framework versions
- Transformers 4.27.3
- Pytorch 2.0.0+cpu
- Datasets 2.10.1
- Tokenizers 0.13.2
| 2,382 | [
[
-0.036956787109375,
-0.042724609375,
0.01477813720703125,
-0.003856658935546875,
-0.01042938232421875,
-0.0229339599609375,
-0.01198577880859375,
-0.0157318115234375,
0.02069091796875,
0.0241546630859375,
-0.05047607421875,
-0.051849365234375,
-0.053253173828125... |
PaulineSanchez/Modele_traduction_HF | 2023-05-11T13:10:58.000Z | [
"transformers",
"pytorch",
"marian",
"text2text-generation",
"food",
"translation",
"en",
"fr",
"dataset:PaulineSanchez/Trad_food",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | translation | PaulineSanchez | null | null | PaulineSanchez/Modele_traduction_HF | 0 | 2 | transformers | 2023-05-05T07:41:40 | ---
language:
- en
- fr
datasets:
- PaulineSanchez/Trad_food
metrics:
- bleu
tags:
- food
pipeline_tag: translation
---
# train_hf
This model is a fine-tuned version of [Helsinki-NLP/opus-mt-en-fr](https://huggingface.co/Helsinki-NLP/opus-mt-en-fr) on the PaulineSanchez/Trad_food dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5736
- Bleu: 77.4387
- Gen Len: 10.8386
## Model description
More information needed
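A minimal inference sketch, assuming `transformers` and `sentencepiece` are installed:
```python
from transformers import pipeline

# Load the fine-tuned MarianMT English-to-French model from the Hub
translator = pipeline("translation", model="PaulineSanchez/Modele_traduction_HF")

# A food-domain example, matching the fine-tuning data
print(translator("Simmer the onions in butter until golden.")[0]["translation_text"])
```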
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 6.0
### Training results
### Framework versions
- Transformers 4.29.0.dev0
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3 | 989 | [
[
-0.014739990234375,
-0.0364990234375,
0.01531982421875,
0.016876220703125,
-0.0163116455078125,
-0.0364990234375,
-0.0239715576171875,
-0.01316070556640625,
0.0211029052734375,
0.036224365234375,
-0.060638427734375,
-0.04412841796875,
-0.0548095703125,
0.023... |
amittian/setfit_address_version_0_0_1 | 2023-05-05T08:04:58.000Z | [
"sentence-transformers",
"pytorch",
"bert",
"setfit",
"text-classification",
"arxiv:2209.11055",
"license:apache-2.0",
"region:us"
] | text-classification | amittian | null | null | amittian/setfit_address_version_0_0_1 | 0 | 2 | sentence-transformers | 2023-05-05T08:04:07 | ---
license: apache-2.0
tags:
- setfit
- sentence-transformers
- text-classification
pipeline_tag: text-classification
---
# amittian/setfit_address_version_0_0_1
This is a [SetFit model](https://github.com/huggingface/setfit) that can be used for text classification. The model has been trained using an efficient few-shot learning technique that involves:
1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning.
2. Training a classification head with features from the fine-tuned Sentence Transformer.
## Usage
To use this model for inference, first install the SetFit library:
```bash
python -m pip install setfit
```
You can then run inference as follows:
```python
from setfit import SetFitModel
# Download from Hub and run inference
model = SetFitModel.from_pretrained("amittian/setfit_address_version_0_0_1")
# Run inference
preds = model(["i loved the spiderman movie!", "pineapple on pizza is the worst 🤮"])
```
## BibTeX entry and citation info
```bibtex
@article{https://doi.org/10.48550/arxiv.2209.11055,
doi = {10.48550/ARXIV.2209.11055},
url = {https://arxiv.org/abs/2209.11055},
author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
title = {Efficient Few-Shot Learning Without Prompts},
publisher = {arXiv},
year = {2022},
copyright = {Creative Commons Attribution 4.0 International}
}
```
| 1,563 | [
[
-0.009735107421875,
-0.059600830078125,
0.023956298828125,
-0.0172882080078125,
-0.006931304931640625,
-0.021392822265625,
-0.0164337158203125,
-0.0126495361328125,
0.0029621124267578125,
0.034576416015625,
-0.041259765625,
-0.0205078125,
-0.03839111328125,
... |
yagmurery/bert-base-uncased-finetuned-learningRate-2-cola-2e-05 | 2023-05-05T09:00:33.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | yagmurery | null | null | yagmurery/bert-base-uncased-finetuned-learningRate-2-cola-2e-05 | 0 | 2 | transformers | 2023-05-05T08:52:23 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-learningRate-2-cola-2e-05
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.5822579998058149
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-learningRate-2-cola-2e-05
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4685
- Matthews Correlation: 0.5823
## Model description
More information needed
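A minimal inference sketch (label names such as LABEL_0/LABEL_1 depend on the saved config; CoLA scores grammatical acceptability):
```python
from transformers import pipeline

# Load the fine-tuned CoLA acceptability classifier from the Hub
classifier = pipeline("text-classification", model="yagmurery/bert-base-uncased-finetuned-learningRate-2-cola-2e-05")

print(classifier("The book was written by the author."))
```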
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.4986 | 1.0 | 535 | 0.5249 | 0.4947 |
| 0.3134 | 2.0 | 1070 | 0.4685 | 0.5823 |
| 0.1964 | 3.0 | 1605 | 0.6025 | 0.5445 |
| 0.144 | 4.0 | 2140 | 0.7324 | 0.5699 |
| 0.0898 | 5.0 | 2675 | 0.8637 | 0.5720 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 2,060 | [
[
-0.031280517578125,
-0.046722412109375,
0.00745391845703125,
0.01511383056640625,
-0.02874755859375,
-0.0229949951171875,
-0.021087646484375,
-0.0143585205078125,
0.0195159912109375,
0.01213836669921875,
-0.05291748046875,
-0.03533935546875,
-0.051849365234375,
... |
yagmurery/bert-base-uncased-finetuned-learningRate-2-cola-3e-05 | 2023-05-05T09:08:39.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | yagmurery | null | null | yagmurery/bert-base-uncased-finetuned-learningRate-2-cola-3e-05 | 0 | 2 | transformers | 2023-05-05T09:00:37 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-learningRate-2-cola-3e-05
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.5907527969578087
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-learningRate-2-cola-3e-05
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8555
- Matthews Correlation: 0.5908
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.2022 | 1.0 | 535 | 0.9205 | 0.5285 |
| 0.1155 | 2.0 | 1070 | 0.8555 | 0.5908 |
| 0.1312 | 3.0 | 1605 | 0.9399 | 0.5496 |
| 0.0956 | 4.0 | 2140 | 1.0178 | 0.5577 |
| 0.048 | 5.0 | 2675 | 1.1525 | 0.5528 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 2,060 | [
[
-0.032196044921875,
-0.047149658203125,
0.00986480712890625,
0.01534271240234375,
-0.0268402099609375,
-0.0229034423828125,
-0.020172119140625,
-0.0144805908203125,
0.021270751953125,
0.01338958740234375,
-0.053436279296875,
-0.03741455078125,
-0.0506591796875,
... |
yagmurery/bert-base-uncased-finetuned-learningRate-2-cola-4e-05 | 2023-05-05T09:16:26.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | yagmurery | null | null | yagmurery/bert-base-uncased-finetuned-learningRate-2-cola-4e-05 | 0 | 2 | transformers | 2023-05-05T09:08:43 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-learningRate-2-cola-4e-05
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.539019545585709
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-learningRate-2-cola-4e-05
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 1.2969
- Matthews Correlation: 0.5390
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 4e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.1286 | 1.0 | 535 | 0.9932 | 0.5235 |
| 0.0942 | 2.0 | 1070 | 1.1242 | 0.5229 |
| 0.1325 | 3.0 | 1605 | 0.9707 | 0.5203 |
| 0.0916 | 4.0 | 2140 | 1.0752 | 0.5313 |
| 0.0403 | 5.0 | 2675 | 1.2969 | 0.5390 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 2,059 | [
[
-0.0313720703125,
-0.046112060546875,
0.01047515869140625,
0.01470184326171875,
-0.026519775390625,
-0.0228271484375,
-0.0197601318359375,
-0.01445770263671875,
0.0210113525390625,
0.013397216796875,
-0.05328369140625,
-0.03656005859375,
-0.049957275390625,
... |
cansurav/bert-base-uncased-finetuned-cola-learning_rate-4e-05 | 2023-05-05T09:33:19.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | cansurav | null | null | cansurav/bert-base-uncased-finetuned-cola-learning_rate-4e-05 | 0 | 2 | transformers | 2023-05-05T09:18:58 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola-learning_rate-4e-05
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.5732046470010711
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola-learning_rate-4e-05
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 1.3213
- Matthews Correlation: 0.5732
## Model description
More information needed
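For reference, the Matthews correlation reported above can be computed with scikit-learn; a generic sketch with toy labels, not this model's actual predictions:
```python
from sklearn.metrics import matthews_corrcoef

# Toy labels/predictions; the card's 0.5732 comes from the CoLA validation split
y_true = [1, 1, 0, 1, 0, 0, 1, 0]
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]
print(matthews_corrcoef(y_true, y_pred))  # 0.5 for this toy example
```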
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 4e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.5002 | 1.0 | 535 | 0.5568 | 0.4891 |
| 0.2954 | 2.0 | 1070 | 0.5052 | 0.5210 |
| 0.1976 | 3.0 | 1605 | 0.7016 | 0.5033 |
| 0.1367 | 4.0 | 2140 | 0.9378 | 0.5628 |
| 0.0889 | 5.0 | 2675 | 1.0129 | 0.5470 |
| 0.0555 | 6.0 | 3210 | 1.1484 | 0.5575 |
| 0.0431 | 7.0 | 3745 | 1.1081 | 0.5527 |
| 0.028 | 8.0 | 4280 | 1.1268 | 0.5697 |
| 0.0192 | 9.0 | 4815 | 1.3071 | 0.5627 |
| 0.013 | 10.0 | 5350 | 1.3213 | 0.5732 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 2,429 | [
[
-0.0312347412109375,
-0.04681396484375,
0.007171630859375,
0.0165252685546875,
-0.0170745849609375,
-0.0172576904296875,
-0.01171112060546875,
-0.01216888427734375,
0.027008056640625,
0.00959014892578125,
-0.04791259765625,
-0.038970947265625,
-0.0504150390625,
... |
yagmurery/bert-base-uncased-finetuned-dropout-cola-0.2 | 2023-05-05T10:03:38.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | yagmurery | null | null | yagmurery/bert-base-uncased-finetuned-dropout-cola-0.2 | 0 | 2 | transformers | 2023-05-05T09:20:43 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-dropout-cola-0.2
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.5957317644481708
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-dropout-cola-0.2
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8150
- Matthews Correlation: 0.5957
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.4985 | 1.0 | 535 | 0.5022 | 0.4978 |
| 0.3168 | 2.0 | 1070 | 0.4357 | 0.5836 |
| 0.2116 | 3.0 | 1605 | 0.6536 | 0.5365 |
| 0.149 | 4.0 | 2140 | 0.8150 | 0.5957 |
| 0.0911 | 5.0 | 2675 | 0.8846 | 0.5838 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 2,042 | [
[
-0.0288238525390625,
-0.04559326171875,
0.01031494140625,
0.0145416259765625,
-0.0225830078125,
-0.02294921875,
-0.01708984375,
-0.01200103759765625,
0.0229644775390625,
0.0147247314453125,
-0.0540771484375,
-0.0390625,
-0.0533447265625,
-0.0237579345703125,... |
BlueAvenir/sti_security_class_model | 2023-05-05T09:26:22.000Z | [
"sentence-transformers",
"pytorch",
"xlm-roberta",
"feature-extraction",
"sentence-similarity",
"transformers",
"endpoints_compatible",
"region:us"
] | sentence-similarity | BlueAvenir | null | null | BlueAvenir/sti_security_class_model | 0 | 2 | sentence-transformers | 2023-05-05T09:26:12 | ---
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
- transformers
---
# BlueAvenir/sti_security_class_model
This is a [sentence-transformers](https://www.SBERT.net) model: it maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for tasks like clustering or semantic search.
<!--- Describe your model here -->
## Usage (Sentence-Transformers)
Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed:
```bash
pip install -U sentence-transformers
```
Then you can use the model like this:
```python
from sentence_transformers import SentenceTransformer
sentences = ["This is an example sentence", "Each sentence is converted"]
model = SentenceTransformer('BlueAvenir/sti_security_class_model')
embeddings = model.encode(sentences)
print(embeddings)
```
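Since the pipeline tag is sentence-similarity, a short follow-up sketch for comparing two sentences with cosine similarity:
```python
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer('BlueAvenir/sti_security_class_model')
emb = model.encode(["This is an example sentence", "Each sentence is converted"])
print(util.cos_sim(emb[0], emb[1]))  # cosine similarity of the two embeddings
```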
## Usage (HuggingFace Transformers)
Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: first, you pass your input through the transformer model, then you apply the right pooling operation on top of the contextualized word embeddings.
```python
from transformers import AutoTokenizer, AutoModel
import torch
#Mean Pooling - Take attention mask into account for correct averaging
def mean_pooling(model_output, attention_mask):
token_embeddings = model_output[0] #First element of model_output contains all token embeddings
input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)
# Sentences we want sentence embeddings for
sentences = ['This is an example sentence', 'Each sentence is converted']
# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('BlueAvenir/sti_security_class_model')
model = AutoModel.from_pretrained('BlueAvenir/sti_security_class_model')
# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
# Compute token embeddings
with torch.no_grad():
model_output = model(**encoded_input)
# Perform pooling. In this case, mean pooling.
sentence_embeddings = mean_pooling(model_output, encoded_input['attention_mask'])
print("Sentence embeddings:")
print(sentence_embeddings)
```
## Evaluation Results
<!--- Describe how your model was evaluated -->
For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name=BlueAvenir/sti_security_class_model)
## Training
The model was trained with the parameters:
**DataLoader**:
`torch.utils.data.dataloader.DataLoader` of length 228 with parameters:
```
{'batch_size': 16, 'sampler': 'torch.utils.data.sampler.RandomSampler', 'batch_sampler': 'torch.utils.data.sampler.BatchSampler'}
```
**Loss**:
`sentence_transformers.losses.CosineSimilarityLoss.CosineSimilarityLoss`
Parameters of the fit() method:
```
{
"epochs": 1,
"evaluation_steps": 0,
"evaluator": "NoneType",
"max_grad_norm": 1,
"optimizer_class": "<class 'torch.optim.adamw.AdamW'>",
"optimizer_params": {
"lr": 2e-05
},
"scheduler": "WarmupLinear",
"steps_per_epoch": 228,
"warmup_steps": 23,
"weight_decay": 0.01
}
```
## Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: XLMRobertaModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False})
)
```
## Citing & Authors
<!--- Describe where people can find more information --> | 3,702 | [
[
-0.018402099609375,
-0.06121826171875,
0.02215576171875,
0.0215911865234375,
-0.0207977294921875,
-0.0303497314453125,
-0.01617431640625,
0.002166748046875,
0.016632080078125,
0.0293121337890625,
-0.047393798828125,
-0.04833984375,
-0.052642822265625,
0.0007... |
yagmurery/bert-base-uncased-finetuned-dropout-cola-0.4 | 2023-05-05T10:13:02.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | yagmurery | null | null | yagmurery/bert-base-uncased-finetuned-dropout-cola-0.4 | 0 | 2 | transformers | 2023-05-05T09:28:49 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-dropout-cola-0.4
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.5780870172624647
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-dropout-cola-0.4
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.9088
- Matthews Correlation: 0.5781
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.1124 | 1.0 | 535 | 1.0648 | 0.5327 |
| 0.0804 | 2.0 | 1070 | 0.9088 | 0.5781 |
| 0.0599 | 3.0 | 1605 | 1.2529 | 0.5599 |
| 0.036 | 4.0 | 2140 | 1.3387 | 0.5666 |
| 0.03 | 5.0 | 2675 | 1.3587 | 0.5709 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 2,042 | [
[
-0.0305633544921875,
-0.045166015625,
0.01451873779296875,
0.0152435302734375,
-0.0227203369140625,
-0.0234832763671875,
-0.015960693359375,
-0.01183319091796875,
0.0237274169921875,
0.01561737060546875,
-0.054473876953125,
-0.040008544921875,
-0.051788330078125... |
cansurav/bert-base-uncased-finetuned-cola-learning_rate-9e-06 | 2023-05-05T09:47:52.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | cansurav | null | null | cansurav/bert-base-uncased-finetuned-cola-learning_rate-9e-06 | 0 | 2 | transformers | 2023-05-05T09:33:26 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola-learning_rate-9e-06
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.5753593483598531
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola-learning_rate-9e-06
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.9848
- Matthews Correlation: 0.5754
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 9e-06
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.5227 | 1.0 | 535 | 0.5061 | 0.4717 |
| 0.3617 | 2.0 | 1070 | 0.4769 | 0.5701 |
| 0.2584 | 3.0 | 1605 | 0.5299 | 0.5625 |
| 0.1998 | 4.0 | 2140 | 0.6801 | 0.5629 |
| 0.1492 | 5.0 | 2675 | 0.8519 | 0.5446 |
| 0.1323 | 6.0 | 3210 | 0.9372 | 0.5624 |
| 0.103 | 7.0 | 3745 | 0.9424 | 0.5753 |
| 0.0949 | 8.0 | 4280 | 0.9848 | 0.5754 |
| 0.0718 | 9.0 | 4815 | 1.0474 | 0.5652 |
| 0.0629 | 10.0 | 5350 | 1.0657 | 0.5731 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 2,429 | [
[
-0.03155517578125,
-0.0443115234375,
0.00762176513671875,
0.01381683349609375,
-0.017578125,
-0.0198822021484375,
-0.0148468017578125,
-0.01273345947265625,
0.02508544921875,
0.00945281982421875,
-0.0482177734375,
-0.039276123046875,
-0.049713134765625,
-0.0... |
cansurav/bert-base-uncased-finetuned-cola-learning_rate-8e-06 | 2023-05-05T10:02:23.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | cansurav | null | null | cansurav/bert-base-uncased-finetuned-cola-learning_rate-8e-06 | 0 | 2 | transformers | 2023-05-05T09:48:00 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola-learning_rate-8e-06
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.5752615459764325
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola-learning_rate-8e-06
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8389
- Matthews Correlation: 0.5753
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 8e-06
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.5241 | 1.0 | 535 | 0.4659 | 0.5046 |
| 0.3755 | 2.0 | 1070 | 0.4412 | 0.5650 |
| 0.2782 | 3.0 | 1605 | 0.5524 | 0.5395 |
| 0.2154 | 4.0 | 2140 | 0.6437 | 0.5651 |
| 0.1669 | 5.0 | 2675 | 0.7709 | 0.5650 |
| 0.1503 | 6.0 | 3210 | 0.8389 | 0.5753 |
| 0.1151 | 7.0 | 3745 | 0.8964 | 0.5681 |
| 0.1082 | 8.0 | 4280 | 0.9767 | 0.5548 |
| 0.0816 | 9.0 | 4815 | 0.9978 | 0.5498 |
| 0.0809 | 10.0 | 5350 | 1.0170 | 0.5576 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 2,429 | [
[
-0.033355712890625,
-0.0469970703125,
0.00714874267578125,
0.0150604248046875,
-0.016021728515625,
-0.0148162841796875,
-0.01168060302734375,
-0.0120086669921875,
0.0297698974609375,
0.01052093505859375,
-0.04876708984375,
-0.038970947265625,
-0.05218505859375,
... |
cansurav/bert-base-uncased-finetuned-cola-learning_rate-0.0001 | 2023-05-05T10:24:06.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | cansurav | null | null | cansurav/bert-base-uncased-finetuned-cola-learning_rate-0.0001 | 0 | 2 | transformers | 2023-05-05T10:02:31 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola-learning_rate-0.0001
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.0
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola-learning_rate-0.0001
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7459
- Matthews Correlation: 0.0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.6205 | 1.0 | 535 | 0.7459 | 0.0 |
| 0.6218 | 2.0 | 1070 | 0.6288 | 0.0 |
| 0.6166 | 3.0 | 1605 | 0.6181 | 0.0 |
| 0.6196 | 4.0 | 2140 | 0.6279 | 0.0 |
| 0.6137 | 5.0 | 2675 | 0.6202 | 0.0 |
| 0.6138 | 6.0 | 3210 | 0.6203 | 0.0 |
| 0.6074 | 7.0 | 3745 | 0.6184 | 0.0 |
| 0.6128 | 8.0 | 4280 | 0.6220 | 0.0 |
| 0.6073 | 9.0 | 4815 | 0.6183 | 0.0 |
| 0.6113 | 10.0 | 5350 | 0.6196 | 0.0 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 2,414 | [
[
-0.0333251953125,
-0.046783447265625,
0.00817108154296875,
0.01465606689453125,
-0.0169525146484375,
-0.01605224609375,
-0.00965118408203125,
-0.010498046875,
0.03045654296875,
0.01226806640625,
-0.051849365234375,
-0.039794921875,
-0.050323486328125,
-0.020... |
yagmurery/bert-base-uncased-finetuned-dropout-cola-0.8 | 2023-05-05T10:20:27.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | yagmurery | null | null | yagmurery/bert-base-uncased-finetuned-dropout-cola-0.8 | 0 | 2 | transformers | 2023-05-05T10:13:07 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-dropout-cola-0.8
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.609298672684182
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-dropout-cola-0.8
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 1.1085
- Matthews Correlation: 0.6093
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.0511 | 1.0 | 535 | 1.5284 | 0.5702 |
| 0.0458 | 2.0 | 1070 | 1.1085 | 0.6093 |
| 0.0667 | 3.0 | 1605 | 1.1696 | 0.5806 |
| 0.0406 | 4.0 | 2140 | 1.2386 | 0.5960 |
| 0.0314 | 5.0 | 2675 | 1.3074 | 0.5934 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 2,041 | [
[
-0.030303955078125,
-0.04541015625,
0.011871337890625,
0.01407623291015625,
-0.0226593017578125,
-0.0233917236328125,
-0.0161285400390625,
-0.0118865966796875,
0.0237579345703125,
0.0157012939453125,
-0.05389404296875,
-0.039703369140625,
-0.053436279296875,
... |
sarahflan/distilbert-base-uncased-finetuned-AS_sentences | 2023-07-18T08:18:10.000Z | [
"transformers",
"pytorch",
"tensorboard",
"distilbert",
"text-classification",
"generated_from_trainer",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] | text-classification | sarahflan | null | null | sarahflan/distilbert-base-uncased-finetuned-AS_sentences | 0 | 2 | transformers | 2023-05-05T10:15:31 | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
- f1
model-index:
- name: distilbert-base-uncased-finetuned-as_sentences
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-as_sentences
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0627
- Accuracy: 0.9733
- F1: 0.9733
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| 0.6987 | 1.0 | 11 | 0.6958 | 0.46 | 0.3025 |
| 0.6851 | 2.0 | 22 | 0.6715 | 0.5667 | 0.4954 |
| 0.6315 | 3.0 | 33 | 0.4515 | 0.88 | 0.8791 |
| 0.4086 | 4.0 | 44 | 0.1662 | 0.96 | 0.9599 |
| 0.136 | 5.0 | 55 | 0.0857 | 0.9667 | 0.9666 |
| 0.0955 | 6.0 | 66 | 0.0661 | 0.9733 | 0.9733 |
| 0.022 | 7.0 | 77 | 0.0569 | 0.9667 | 0.9666 |
| 0.0272 | 8.0 | 88 | 0.0626 | 0.9667 | 0.9666 |
| 0.0346 | 9.0 | 99 | 0.0818 | 0.9667 | 0.9666 |
| 0.0157 | 10.0 | 110 | 0.0649 | 0.9667 | 0.9666 |
| 0.0232 | 11.0 | 121 | 0.1416 | 0.9533 | 0.9531 |
| 0.0202 | 12.0 | 132 | 0.0652 | 0.9733 | 0.9733 |
| 0.0069 | 13.0 | 143 | 0.0764 | 0.96 | 0.9599 |
| 0.0032 | 14.0 | 154 | 0.0842 | 0.9667 | 0.9666 |
| 0.0052 | 15.0 | 165 | 0.0697 | 0.9667 | 0.9666 |
| 0.0028 | 16.0 | 176 | 0.0773 | 0.9667 | 0.9666 |
| 0.0066 | 17.0 | 187 | 0.0809 | 0.9667 | 0.9667 |
| 0.0022 | 18.0 | 198 | 0.0569 | 0.9667 | 0.9666 |
| 0.002 | 19.0 | 209 | 0.0537 | 0.9733 | 0.9733 |
| 0.0016 | 20.0 | 220 | 0.0502 | 0.9733 | 0.9733 |
| 0.0015 | 21.0 | 231 | 0.0460 | 0.9733 | 0.9733 |
| 0.0013 | 22.0 | 242 | 0.0451 | 0.9733 | 0.9733 |
| 0.0013 | 23.0 | 253 | 0.0448 | 0.9733 | 0.9733 |
| 0.0012 | 24.0 | 264 | 0.0450 | 0.9733 | 0.9733 |
| 0.0012 | 25.0 | 275 | 0.0457 | 0.9733 | 0.9733 |
| 0.0011 | 26.0 | 286 | 0.0465 | 0.9733 | 0.9733 |
| 0.0011 | 27.0 | 297 | 0.0466 | 0.9733 | 0.9733 |
| 0.001 | 28.0 | 308 | 0.0613 | 0.9667 | 0.9666 |
| 0.001 | 29.0 | 319 | 0.0658 | 0.9667 | 0.9666 |
| 0.0009 | 30.0 | 330 | 0.0674 | 0.9667 | 0.9666 |
| 0.0008 | 31.0 | 341 | 0.0693 | 0.9667 | 0.9666 |
| 0.0009 | 32.0 | 352 | 0.0711 | 0.9667 | 0.9666 |
| 0.0008 | 33.0 | 363 | 0.0718 | 0.9667 | 0.9666 |
| 0.0028 | 34.0 | 374 | 0.0824 | 0.9667 | 0.9667 |
| 0.0011 | 35.0 | 385 | 0.0884 | 0.9667 | 0.9666 |
| 0.0008 | 36.0 | 396 | 0.1060 | 0.9667 | 0.9666 |
| 0.0009 | 37.0 | 407 | 0.0875 | 0.96 | 0.9599 |
| 0.0015 | 38.0 | 418 | 0.0623 | 0.9667 | 0.9666 |
| 0.0007 | 39.0 | 429 | 0.0610 | 0.9733 | 0.9733 |
| 0.0007 | 40.0 | 440 | 0.0614 | 0.9733 | 0.9733 |
| 0.0007 | 41.0 | 451 | 0.0617 | 0.9733 | 0.9733 |
| 0.0007 | 42.0 | 462 | 0.0618 | 0.9733 | 0.9733 |
| 0.0006 | 43.0 | 473 | 0.0620 | 0.9733 | 0.9733 |
| 0.0006 | 44.0 | 484 | 0.0621 | 0.9733 | 0.9733 |
| 0.0006 | 45.0 | 495 | 0.0622 | 0.9733 | 0.9733 |
| 0.0006 | 46.0 | 506 | 0.0624 | 0.9733 | 0.9733 |
| 0.0006 | 47.0 | 517 | 0.0625 | 0.9733 | 0.9733 |
| 0.0006 | 48.0 | 528 | 0.0626 | 0.9733 | 0.9733 |
| 0.0006 | 49.0 | 539 | 0.0627 | 0.9733 | 0.9733 |
| 0.0006 | 50.0 | 550 | 0.0627 | 0.9733 | 0.9733 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu118
- Datasets 2.13.1
- Tokenizers 0.13.3
| 4,923 | [
[
-0.040435791015625,
-0.04437255859375,
0.0162811279296875,
0.0030517578125,
-0.0001354217529296875,
0.007007598876953125,
0.0016508102416992188,
0.0066986083984375,
0.05645751953125,
0.024322509765625,
-0.043975830078125,
-0.050140380859375,
-0.052001953125,
... |
vg055/roberta-base-bne-finetuned-TripAdvisorDomainAdaptation-finetuned-e2-RestMex2023-polaridad | 2023-05-05T16:52:06.000Z | [
"transformers",
"pytorch",
"tensorboard",
"roberta",
"text-classification",
"generated_from_trainer",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | text-classification | vg055 | null | null | vg055/roberta-base-bne-finetuned-TripAdvisorDomainAdaptation-finetuned-e2-RestMex2023-polaridad | 0 | 2 | transformers | 2023-05-05T10:27:51 | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- f1
model-index:
- name: roberta-base-bne-finetuned-TripAdvisorDomainAdaptation-finetuned-e2-RestMex2023-polaridad
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# roberta-base-bne-finetuned-TripAdvisorDomainAdaptation-finetuned-e2-RestMex2023-polaridad
This model is a fine-tuned version of [vg055/roberta-base-bne-finetuned-TripAdvisorDomainAdaptation](https://huggingface.co/vg055/roberta-base-bne-finetuned-TripAdvisorDomainAdaptation) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5996
- F1: 0.7468
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss | F1 |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 0.5823 | 1.0 | 14159 | 0.5671 | 0.7452 |
| 0.4536 | 2.0 | 28318 | 0.5996 | 0.7468 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,602 | [
[
-0.039031982421875,
-0.042449951171875,
0.01357269287109375,
0.016387939453125,
-0.0341796875,
-0.040130615234375,
-0.015350341796875,
-0.01482391357421875,
0.00836944580078125,
0.033782958984375,
-0.05743408203125,
-0.047760009765625,
-0.04840087890625,
-0.... |
yagmurery/bert-base-uncased-finetuned-batchSize-cola-16 | 2023-05-05T10:37:53.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | yagmurery | null | null | yagmurery/bert-base-uncased-finetuned-batchSize-cola-16 | 0 | 2 | transformers | 2023-05-05T10:30:31 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-batchSize-cola-16
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.6125472225786625
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-batchSize-cola-16
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 1.0969
- Matthews Correlation: 0.6125
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.0394 | 1.0 | 535 | 1.0969 | 0.6125 |
| 0.0289 | 2.0 | 1070 | 1.0612 | 0.5907 |
| 0.0559 | 3.0 | 1605 | 1.1586 | 0.5650 |
| 0.0373 | 4.0 | 2140 | 1.1325 | 0.5831 |
| 0.0261 | 5.0 | 2675 | 1.3065 | 0.5804 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 2,044 | [
[
-0.03155517578125,
-0.046722412109375,
0.01390838623046875,
0.0156707763671875,
-0.0252532958984375,
-0.0244140625,
-0.0187530517578125,
-0.01053619384765625,
0.026397705078125,
0.0164794921875,
-0.05352783203125,
-0.033782958984375,
-0.05291748046875,
-0.02... |
cansurav/bert-base-uncased-finetuned-cola-dropout-0.1 | 2023-05-05T10:49:51.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | cansurav | null | null | cansurav/bert-base-uncased-finetuned-cola-dropout-0.1 | 0 | 2 | transformers | 2023-05-05T10:35:22 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola-dropout-0.1
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.593197037544882
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola-dropout-0.1
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 1.1127
- Matthews Correlation: 0.5932
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.49 | 1.0 | 535 | 0.5310 | 0.4914 |
| 0.3003 | 2.0 | 1070 | 0.5391 | 0.5572 |
| 0.2033 | 3.0 | 1605 | 0.6975 | 0.5473 |
| 0.1427 | 4.0 | 2140 | 0.8513 | 0.5612 |
| 0.0998 | 5.0 | 2675 | 0.8598 | 0.5829 |
| 0.0783 | 6.0 | 3210 | 1.1127 | 0.5932 |
| 0.0456 | 7.0 | 3745 | 1.0697 | 0.5890 |
| 0.0395 | 8.0 | 4280 | 1.1813 | 0.5782 |
| 0.0277 | 9.0 | 4815 | 1.2958 | 0.5727 |
| 0.0205 | 10.0 | 5350 | 1.3045 | 0.5832 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 2,412 | [
[
-0.03131103515625,
-0.04541015625,
0.0085601806640625,
0.01439666748046875,
-0.019683837890625,
-0.0171661376953125,
-0.0118408203125,
-0.009765625,
0.0283355712890625,
0.0130615234375,
-0.053375244140625,
-0.039398193359375,
-0.05340576171875,
-0.0203704833... |
yagmurery/bert-base-uncased-finetuned-batchSize-cola-32 | 2023-05-05T10:44:24.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | yagmurery | null | null | yagmurery/bert-base-uncased-finetuned-batchSize-cola-32 | 0 | 2 | transformers | 2023-05-05T10:37:57 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-batchSize-cola-32
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.5930181720231964
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-batchSize-cola-32
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 1.0466
- Matthews Correlation: 0.5930
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| No log | 1.0 | 268 | 0.9600 | 0.5600 |
| 0.0668 | 2.0 | 536 | 0.9530 | 0.5765 |
| 0.0668 | 3.0 | 804 | 1.0466 | 0.5930 |
| 0.0327 | 4.0 | 1072 | 1.1919 | 0.5805 |
| 0.0327 | 5.0 | 1340 | 1.2359 | 0.5905 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 2,044 | [
[
-0.030181884765625,
-0.046478271484375,
0.011932373046875,
0.0159454345703125,
-0.0255584716796875,
-0.024322509765625,
-0.0189361572265625,
-0.01186370849609375,
0.0240478515625,
0.0150909423828125,
-0.05230712890625,
-0.033355712890625,
-0.05340576171875,
... |
yagmurery/bert-base-uncased-finetuned-batchSize-cola-64 | 2023-05-05T10:50:35.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | yagmurery | null | null | yagmurery/bert-base-uncased-finetuned-batchSize-cola-64 | 0 | 2 | transformers | 2023-05-05T10:44:28 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-batchSize-cola-64
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.5961744294806522
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-batchSize-cola-64
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 1.0984
- Matthews Correlation: 0.5962
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| No log | 1.0 | 134 | 1.2908 | 0.5651 |
| No log | 2.0 | 268 | 1.1057 | 0.5729 |
| No log | 3.0 | 402 | 1.0984 | 0.5962 |
| 0.0195 | 4.0 | 536 | 1.1799 | 0.5753 |
| 0.0195 | 5.0 | 670 | 1.2076 | 0.5804 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 2,044 | [
[
-0.0307464599609375,
-0.047088623046875,
0.01264190673828125,
0.01502227783203125,
-0.027191162109375,
-0.0269317626953125,
-0.01995849609375,
-0.01073455810546875,
0.024627685546875,
0.016571044921875,
-0.053253173828125,
-0.03411865234375,
-0.05242919921875,
... |
yagmurery/bert-base-uncased-finetuned-bestModel-optuna-cola | 2023-05-05T13:40:40.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | yagmurery | null | null | yagmurery/bert-base-uncased-finetuned-bestModel-optuna-cola | 0 | 2 | transformers | 2023-05-05T10:54:24 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-epochs-cola
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.5879831868448624
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-epochs-cola
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5106
- Matthews Correlation: 0.5880
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1.7248771148294196e-05
- train_batch_size: 64
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| No log | 1.0 | 134 | 0.4482 | 0.5047 |
| No log | 2.0 | 268 | 0.4230 | 0.5612 |
| No log | 3.0 | 402 | 0.4850 | 0.5677 |
| 0.3514 | 4.0 | 536 | 0.5106 | 0.5880 |
| 0.3514 | 5.0 | 670 | 0.5397 | 0.5727 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 2,049 | [
[
-0.0249786376953125,
-0.0546875,
0.00849151611328125,
0.0161895751953125,
-0.0233306884765625,
-0.0260162353515625,
-0.0164947509765625,
-0.0197906494140625,
0.0224609375,
0.01421356201171875,
-0.052490234375,
-0.033935546875,
-0.048614501953125,
-0.02183532... |
cansurav/bert-base-uncased-finetuned-cola-dropout-0.2 | 2023-05-05T11:11:07.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | cansurav | null | null | cansurav/bert-base-uncased-finetuned-cola-dropout-0.2 | 0 | 2 | transformers | 2023-05-05T10:56:31 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola-dropout-0.2
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.5992215466535732
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola-dropout-0.2
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4502
- Matthews Correlation: 0.5992
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.4987 | 1.0 | 535 | 0.5145 | 0.4872 |
| 0.3065 | 2.0 | 1070 | 0.4502 | 0.5992 |
| 0.2059 | 3.0 | 1605 | 0.7547 | 0.5208 |
| 0.1467 | 4.0 | 2140 | 0.8557 | 0.5390 |
| 0.1006 | 5.0 | 2675 | 0.9277 | 0.5550 |
| 0.0796 | 6.0 | 3210 | 1.0832 | 0.5765 |
| 0.0532 | 7.0 | 3745 | 1.0337 | 0.5687 |
| 0.0367 | 8.0 | 4280 | 1.1539 | 0.5779 |
| 0.0276 | 9.0 | 4815 | 1.3224 | 0.5755 |
| 0.0192 | 10.0 | 5350 | 1.3055 | 0.5810 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 2,413 | [
[
-0.031402587890625,
-0.04583740234375,
0.01031494140625,
0.013214111328125,
-0.0180816650390625,
-0.017333984375,
-0.011474609375,
-0.009368896484375,
0.0297698974609375,
0.01363372802734375,
-0.052978515625,
-0.039031982421875,
-0.053436279296875,
-0.021179... |
reeducator/vicuna-13b-cocktail | 2023-05-26T08:16:31.000Z | [
"transformers",
"llama",
"text-generation",
"en",
"dataset:anon8231489123/ShareGPT_Vicuna_unfiltered",
"dataset:gozfarb/ShareGPT_Vicuna_unfiltered",
"dataset:gozfarb/bluemoon_roleplay_300k_vicuna",
"dataset:gozfarb/GPTeacher-Vicuna",
"dataset:gozfarb/SuperCOT-vicuna-dataset",
"dataset:gozfarb/Vicu... | text-generation | reeducator | null | null | reeducator/vicuna-13b-cocktail | 42 | 2 | transformers | 2023-05-05T11:09:37 | ---
datasets:
- anon8231489123/ShareGPT_Vicuna_unfiltered
- gozfarb/ShareGPT_Vicuna_unfiltered
- gozfarb/bluemoon_roleplay_300k_vicuna
- gozfarb/GPTeacher-Vicuna
- gozfarb/SuperCOT-vicuna-dataset
- gozfarb/Vicuna_Evol_Instruct_Cleaned
language:
- en
---
## General
Vicuna 1.1 13B finetune incorporating various datasets in addition to the unfiltered ShareGPT. This is an experiment attempting to enhance the creativity of Vicuna 1.1 while reducing censorship as much as possible. All datasets have been cleaned, and only the "instruct" portion of GPTeacher has been used.
## Models
*GGML quant for llama.cpp:*<br/>
ggml-vicuna-13b-cocktail-v1-q5_0.bin<br/>
*GPTQ 4-bit CUDA:*<br/>
vicuna-13b-cocktail-v1-4bit-128g.safetensors<br/>
## Remarks
This model has been trained with a modified prompt:<br/>
```
A chat between a user and an associate. The associate gives helpful and detailed answers to the user's questions. The associate is also an excellent roleplayer and storyteller, and will assume any persona that the user wants. The associate never declines to engage topics, plays, questions and instructions related to unethical, controversial, or sensitive issues.
USER: [user message]
ASSOCIATE: [model reply]</s>
```
The training roles are USER and ASSOCIATE.
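At inference time the same template should be reproduced verbatim. A minimal prompt-builder sketch (the helper name is ours; generation should continue from the trailing `ASSOCIATE:`):
```python
# The system line and role labels reproduce the training template above verbatim.
SYSTEM = (
    "A chat between a user and an associate. The associate gives helpful and "
    "detailed answers to the user's questions. The associate is also an excellent "
    "roleplayer and storyteller, and will assume any persona that the user wants. "
    "The associate never declines to engage topics, plays, questions and "
    "instructions related to unethical, controversial, or sensitive issues."
)

def build_prompt(user_message: str) -> str:
    return f"{SYSTEM}\nUSER: {user_message}\nASSOCIATE:"

print(build_prompt("Write a short story about a lighthouse."))
```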
| 1,289 | [
[
-0.0189666748046875,
-0.06585693359375,
0.03448486328125,
0.029449462890625,
-0.05096435546875,
-0.0272064208984375,
0.0062103271484375,
-0.03851318359375,
0.0224456787109375,
0.04815673828125,
-0.03668212890625,
-0.044219970703125,
-0.048583984375,
-0.01380... |
cansurav/bert-base-uncased-finetuned-cola-dropout-0.3 | 2023-05-05T11:25:39.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | cansurav | null | null | cansurav/bert-base-uncased-finetuned-cola-dropout-0.3 | 0 | 2 | transformers | 2023-05-05T11:11:13 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola-dropout-0.3
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.6036344190543846
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola-dropout-0.3
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 1.2847
- Matthews Correlation: 0.6036
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.4995 | 1.0 | 535 | 0.5102 | 0.4897 |
| 0.3023 | 2.0 | 1070 | 0.4585 | 0.5848 |
| 0.1951 | 3.0 | 1605 | 0.6793 | 0.5496 |
| 0.145 | 4.0 | 2140 | 0.7694 | 0.5925 |
| 0.1024 | 5.0 | 2675 | 1.0057 | 0.5730 |
| 0.0691 | 6.0 | 3210 | 1.0275 | 0.5892 |
| 0.0483 | 7.0 | 3745 | 1.0272 | 0.5788 |
| 0.0404 | 8.0 | 4280 | 1.2537 | 0.5810 |
| 0.0219 | 9.0 | 4815 | 1.3020 | 0.5780 |
| 0.0224 | 10.0 | 5350 | 1.2847 | 0.6036 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 2,413 | [
[
-0.0291595458984375,
-0.044036865234375,
0.0098876953125,
0.01293182373046875,
-0.018280029296875,
-0.019500732421875,
-0.01275634765625,
-0.01030731201171875,
0.025665283203125,
0.0131988525390625,
-0.052337646484375,
-0.040374755859375,
-0.05340576171875,
... |
cansurav/bert-base-uncased-finetuned-cola-dropout-0.4 | 2023-05-05T11:40:12.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | cansurav | null | null | cansurav/bert-base-uncased-finetuned-cola-dropout-0.4 | 0 | 2 | transformers | 2023-05-05T11:25:46 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola-dropout-0.4
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.5786416039440073
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola-dropout-0.4
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 1.0377
- Matthews Correlation: 0.5786
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.5068 | 1.0 | 535 | 0.5131 | 0.4679 |
| 0.3198 | 2.0 | 1070 | 0.4943 | 0.5692 |
| 0.2057 | 3.0 | 1605 | 0.7169 | 0.5073 |
| 0.1574 | 4.0 | 2140 | 0.7962 | 0.5525 |
| 0.0985 | 5.0 | 2675 | 0.9113 | 0.5573 |
| 0.0767 | 6.0 | 3210 | 1.0377 | 0.5786 |
| 0.0525 | 7.0 | 3745 | 1.1992 | 0.5705 |
| 0.0415 | 8.0 | 4280 | 1.3376 | 0.5626 |
| 0.0191 | 9.0 | 4815 | 1.3548 | 0.5733 |
| 0.0167 | 10.0 | 5350 | 1.3856 | 0.5658 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 2,413 | [
[
-0.0305023193359375,
-0.044891357421875,
0.00997161865234375,
0.01291656494140625,
-0.0190887451171875,
-0.01898193359375,
-0.01290130615234375,
-0.01084136962890625,
0.0264129638671875,
0.01335906982421875,
-0.053253173828125,
-0.040557861328125,
-0.05340576171... |
cansurav/bert-base-uncased-finetuned-cola-dropout-0.5 | 2023-05-05T11:54:45.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | cansurav | null | null | cansurav/bert-base-uncased-finetuned-cola-dropout-0.5 | 0 | 2 | transformers | 2023-05-05T11:40:18 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola-dropout-0.5
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.5960380981891474
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola-dropout-0.5
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4578
- Matthews Correlation: 0.5960
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.5124 | 1.0 | 535 | 0.5110 | 0.4947 |
| 0.3144 | 2.0 | 1070 | 0.4578 | 0.5960 |
| 0.198 | 3.0 | 1605 | 0.7233 | 0.5393 |
| 0.1458 | 4.0 | 2140 | 0.7943 | 0.5554 |
| 0.0968 | 5.0 | 2675 | 1.0669 | 0.5393 |
| 0.069 | 6.0 | 3210 | 1.0982 | 0.5689 |
| 0.0484 | 7.0 | 3745 | 1.2170 | 0.5446 |
| 0.0394 | 8.0 | 4280 | 1.2429 | 0.5831 |
| 0.0292 | 9.0 | 4815 | 1.3490 | 0.5684 |
| 0.0175 | 10.0 | 5350 | 1.3534 | 0.5743 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 2,413 | [
[
-0.0318603515625,
-0.04559326171875,
0.0094757080078125,
0.011505126953125,
-0.0169219970703125,
-0.016021728515625,
-0.01129150390625,
-0.0112762451171875,
0.029632568359375,
0.0150604248046875,
-0.053009033203125,
-0.040283203125,
-0.053924560546875,
-0.01... |
shinta0615/distilbert-base-uncased-finetuned-emotion | 2023-05-10T23:29:54.000Z | [
"transformers",
"pytorch",
"tensorboard",
"distilbert",
"text-classification",
"generated_from_trainer",
"dataset:emotion",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | shinta0615 | null | null | shinta0615/distilbert-base-uncased-finetuned-emotion | 0 | 2 | transformers | 2023-05-05T11:53:21 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- emotion
metrics:
- accuracy
- f1
model-index:
- name: distilbert-base-uncased-finetuned-emotion
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: emotion
type: emotion
config: split
split: validation
args: split
metrics:
- name: Accuracy
type: accuracy
value: 0.934
- name: F1
type: f1
value: 0.9344038684401179
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-emotion
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the emotion dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1601
- Accuracy: 0.934
- F1: 0.9344
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| 0.1758 | 1.0 | 250 | 0.1753 | 0.925 | 0.9245 |
| 0.1142 | 2.0 | 500 | 0.1601 | 0.934 | 0.9344 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,840 | [
[
-0.03704833984375,
-0.041229248046875,
0.013763427734375,
0.0220794677734375,
-0.0288543701171875,
-0.019378662109375,
-0.0129852294921875,
-0.00798797607421875,
0.01202392578125,
0.00846099853515625,
-0.057342529296875,
-0.05145263671875,
-0.059539794921875,
... |
cansurav/bert-base-uncased-finetuned-cola-dropout-0.6 | 2023-05-05T12:09:19.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | cansurav | null | null | cansurav/bert-base-uncased-finetuned-cola-dropout-0.6 | 0 | 2 | transformers | 2023-05-05T11:54:51 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola-dropout-0.6
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.5882977917441249
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola-dropout-0.6
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4663
- Matthews Correlation: 0.5883
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.5195 | 1.0 | 535 | 0.5203 | 0.5266 |
| 0.3115 | 2.0 | 1070 | 0.4663 | 0.5883 |
| 0.2036 | 3.0 | 1605 | 0.7295 | 0.5471 |
| 0.1495 | 4.0 | 2140 | 0.8474 | 0.5521 |
| 0.1011 | 5.0 | 2675 | 1.0427 | 0.5626 |
| 0.0782 | 6.0 | 3210 | 1.0771 | 0.5734 |
| 0.0462 | 7.0 | 3745 | 1.1497 | 0.5660 |
| 0.0393 | 8.0 | 4280 | 1.2397 | 0.5589 |
| 0.0262 | 9.0 | 4815 | 1.3244 | 0.5653 |
| 0.0217 | 10.0 | 5350 | 1.3070 | 0.5668 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 2,413 | [
[
-0.03271484375,
-0.04510498046875,
0.01055908203125,
0.01235198974609375,
-0.0172271728515625,
-0.01654052734375,
-0.01123809814453125,
-0.01116943359375,
0.029510498046875,
0.01519775390625,
-0.05352783203125,
-0.0401611328125,
-0.053436279296875,
-0.019027... |
mnavas/roberta-finetuned-WebClassification-v2-smalllinguaEN | 2023-05-05T13:32:49.000Z | [
"transformers",
"pytorch",
"roberta",
"text-classification",
"generated_from_trainer",
"license:mit",
"endpoints_compatible",
"region:us"
] | text-classification | mnavas | null | null | mnavas/roberta-finetuned-WebClassification-v2-smalllinguaEN | 0 | 2 | transformers | 2023-05-05T12:06:16 | ---
license: mit
tags:
- generated_from_trainer
metrics:
- accuracy
- f1
- precision
- recall
model-index:
- name: roberta-finetuned-WebClassification-v2-smalllinguaEN
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# roberta-finetuned-WebClassification-v2-smalllinguaEN
This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 1.5844
- Accuracy: 0.7143
- F1: 0.7143
- Precision: 0.7143
- Recall: 0.7143
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|:---------:|:------:|
| No log | 1.0 | 7 | 2.3084 | 0.0714 | 0.0714 | 0.0714 | 0.0714 |
| No log | 2.0 | 14 | 2.2951 | 0.2857 | 0.2857 | 0.2857 | 0.2857 |
| No log | 3.0 | 21 | 2.2725 | 0.2143 | 0.2143 | 0.2143 | 0.2143 |
| No log | 4.0 | 28 | 2.0608 | 0.2143 | 0.2143 | 0.2143 | 0.2143 |
| No log | 5.0 | 35 | 1.8552 | 0.3571 | 0.3571 | 0.3571 | 0.3571 |
| No log | 6.0 | 42 | 1.6846 | 0.5714 | 0.5714 | 0.5714 | 0.5714 |
| No log | 7.0 | 49 | 1.5844 | 0.7143 | 0.7143 | 0.7143 | 0.7143 |
| No log | 8.0 | 56 | 1.4531 | 0.7143 | 0.7143 | 0.7143 | 0.7143 |
| No log | 9.0 | 63 | 1.3746 | 0.7143 | 0.7143 | 0.7143 | 0.7143 |
| No log | 10.0 | 70 | 1.3663 | 0.7143 | 0.7143 | 0.7143 | 0.7143 |
### Framework versions
- Transformers 4.27.3
- Pytorch 2.0.0+cpu
- Datasets 2.10.1
- Tokenizers 0.13.2
| 2,378 | [
[
-0.03985595703125,
-0.04217529296875,
0.0134429931640625,
-0.003826141357421875,
-0.01033782958984375,
-0.0235595703125,
-0.01117706298828125,
-0.012908935546875,
0.018524169921875,
0.0265655517578125,
-0.052825927734375,
-0.051177978515625,
-0.051055908203125,
... |
Omogo/distilbert-base-uncased-finetuned-emotion | 2023-05-05T12:48:49.000Z | [
"transformers",
"pytorch",
"tensorboard",
"distilbert",
"text-classification",
"generated_from_trainer",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | text-classification | Omogo | null | null | Omogo/distilbert-base-uncased-finetuned-emotion | 0 | 2 | transformers | 2023-05-05T12:33:41 | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
- f1
model-index:
- name: distilbert-base-uncased-finetuned-emotion
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-emotion
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2300
- Accuracy: 0.918
- F1: 0.9183
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| No log | 1.0 | 250 | 0.3276 | 0.904 | 0.9011 |
| No log | 2.0 | 500 | 0.2300 | 0.918 | 0.9183 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Tokenizers 0.13.3
| 1,485 | [
[
-0.037353515625,
-0.04541015625,
0.01861572265625,
0.0260772705078125,
-0.02862548828125,
-0.01910400390625,
-0.0147857666015625,
-0.00930023193359375,
0.00970458984375,
0.0078277587890625,
-0.05615234375,
-0.04998779296875,
-0.062255859375,
-0.0079879760742... |
cansurav/bert-base-uncased-finetuned-cola-batch-2 | 2023-05-05T13:38:31.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | cansurav | null | null | cansurav/bert-base-uncased-finetuned-cola-batch-2 | 0 | 2 | transformers | 2023-05-05T12:39:21 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola-batch-2
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.5725078939425798
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola-batch-2
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 1.3833
- Matthews Correlation: 0.5725
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:-----:|:---------------:|:--------------------:|
| 0.8292 | 1.0 | 4276 | 0.8945 | 0.5153 |
| 0.5519 | 2.0 | 8552 | 1.0523 | 0.5019 |
| 0.4064 | 3.0 | 12828 | 1.1277 | 0.5356 |
| 0.2463 | 4.0 | 17104 | 1.3046 | 0.5248 |
| 0.1523 | 5.0 | 21380 | 1.4914 | 0.5094 |
| 0.0697 | 6.0 | 25656 | 1.4854 | 0.5574 |
| 0.0894 | 7.0 | 29932 | 1.3833 | 0.5725 |
| 0.0375 | 8.0 | 34208 | 1.5318 | 0.5670 |
| 0.0297 | 9.0 | 38484 | 1.8043 | 0.5550 |
| 0.0105 | 10.0 | 42760 | 1.8241 | 0.5565 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 2,415 | [
[
-0.031768798828125,
-0.04669189453125,
0.0111236572265625,
0.013336181640625,
-0.0205841064453125,
-0.0181884765625,
-0.01120758056640625,
-0.0091400146484375,
0.0281524658203125,
0.0167388916015625,
-0.0518798828125,
-0.034881591796875,
-0.052886962890625,
... |
Gulce/bert-base-uncased-finetuned-cola | 2023-05-07T14:38:13.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | Gulce | null | null | Gulce/bert-base-uncased-finetuned-cola | 0 | 2 | transformers | 2023-05-05T13:12:11 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.6033168402681877
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5228
- Matthews Correlation: 0.6033
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2.6356323059895617e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.4839 | 1.0 | 535 | 0.4273 | 0.5448 |
| 0.2633 | 2.0 | 1070 | 0.5228 | 0.6033 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,813 | [
[
-0.024810791015625,
-0.05206298828125,
0.00896453857421875,
0.019866943359375,
-0.0251617431640625,
-0.0202178955078125,
-0.01806640625,
-0.014923095703125,
0.0267486572265625,
0.016693115234375,
-0.05035400390625,
-0.029632568359375,
-0.05126953125,
-0.0217... |
lowkemy/bert-base-uncased-finetuned-cola | 2023-05-07T20:55:16.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | lowkemy | null | null | lowkemy/bert-base-uncased-finetuned-cola | 0 | 2 | transformers | 2023-05-05T13:23:38 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.4967522429154307
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4542
- Matthews Correlation: 0.4968
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.499 | 1.0 | 535 | 0.4542 | 0.4968 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,722 | [
[
-0.0256805419921875,
-0.05230712890625,
0.0108184814453125,
0.0209808349609375,
-0.0281829833984375,
-0.021514892578125,
-0.019317626953125,
-0.01477813720703125,
0.025360107421875,
0.0164947509765625,
-0.049285888671875,
-0.0310211181640625,
-0.051422119140625,... |
MLer/distilbert-base-uncased-finetuned-emotion | 2023-05-06T03:11:37.000Z | [
"transformers",
"pytorch",
"tensorboard",
"distilbert",
"text-classification",
"generated_from_trainer",
"dataset:emotion",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | MLer | null | null | MLer/distilbert-base-uncased-finetuned-emotion | 0 | 2 | transformers | 2023-05-05T13:25:58 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- emotion
metrics:
- accuracy
- f1
model-index:
- name: distilbert-base-uncased-finetuned-emotion
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: emotion
type: emotion
config: split
split: validation
args: split
metrics:
- name: Accuracy
type: accuracy
value: 0.927
- name: F1
type: f1
value: 0.9270352010468786
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-emotion
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the emotion dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2181
- Accuracy: 0.927
- F1: 0.9270
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| 0.8231 | 1.0 | 250 | 0.3117 | 0.907 | 0.9051 |
| 0.2503 | 2.0 | 500 | 0.2181 | 0.927 | 0.9270 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,840 | [
[
-0.03857421875,
-0.040985107421875,
0.01605224609375,
0.0221099853515625,
-0.0267791748046875,
-0.0211181640625,
-0.012725830078125,
-0.00878143310546875,
0.010223388671875,
0.0090789794921875,
-0.056488037109375,
-0.05218505859375,
-0.058868408203125,
-0.00... |
cansurav/bert-base-uncased-finetuned-cola-batch-4 | 2023-05-05T18:52:13.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | cansurav | null | null | cansurav/bert-base-uncased-finetuned-cola-batch-4 | 0 | 2 | transformers | 2023-05-05T13:38:40 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola-batch-4
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.5990351356363471
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola-batch-4
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 1.3628
- Matthews Correlation: 0.5990
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:-----:|:---------------:|:--------------------:|
| 0.5495 | 1.0 | 2138 | 0.7520 | 0.4570 |
| 0.457 | 2.0 | 4276 | 0.8038 | 0.5567 |
| 0.2524 | 3.0 | 6414 | 0.9339 | 0.5416 |
| 0.1602 | 4.0 | 8552 | 1.0277 | 0.5809 |
| 0.1241 | 5.0 | 10690 | 1.2164 | 0.5830 |
| 0.1057 | 6.0 | 12828 | 1.2966 | 0.5855 |
| 0.0428 | 7.0 | 14966 | 1.3628 | 0.5990 |
| 0.0311 | 8.0 | 17104 | 1.3782 | 0.5843 |
| 0.0281 | 9.0 | 19242 | 1.6510 | 0.5452 |
| 0.0067 | 10.0 | 21380 | 1.5954 | 0.5713 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 2,415 | [
[
-0.032623291015625,
-0.045196533203125,
0.0127410888671875,
0.01294708251953125,
-0.01837158203125,
-0.0161590576171875,
-0.0116729736328125,
-0.0084381103515625,
0.0297698974609375,
0.016815185546875,
-0.052734375,
-0.037261962890625,
-0.050933837890625,
-0... |
platzi/platzi-distilroberta-base-mrpc-glue-paola-daft | 2023-05-07T14:00:42.000Z | [
"transformers",
"pytorch",
"tensorboard",
"roberta",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | platzi | null | null | platzi/platzi-distilroberta-base-mrpc-glue-paola-daft | 0 | 2 | transformers | 2023-05-05T14:50:13 | ---
license: apache-2.0
tags:
- text-classification
- generated_from_trainer
datasets:
- glue
metrics:
- accuracy
- f1
widget:
- text: ["Yucaipa owned Dominick 's before selling the chain to Safeway in 1998 for $ 2.5 billion.",
"Yucaipa bought Dominick's in 1995 for $ 693 million and sold it to Safeway for $ 1.8 billion in 1998."]
example_title: Not Equivalent
- text: ["Revenue in the first quarter of the year dropped 15 percent from the same period a year earlier.",
"With the scandal hanging over Stewart's company revenue the first quarter of the year dropped 15 percent from the same period a year earlier."]
example_title: Equivalent
model-index:
- name: platzi-distilroberta-base-mrpc-glue-paola-daft
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: datasetX
type: glue
config: mrpc
split: validation
args: mrpc
metrics:
- name: Accuracy
type: accuracy
value: 0.8382352941176471
- name: F1
type: f1
value: 0.8749999999999999
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# platzi-distilroberta-base-mrpc-glue-paola-daft
This model is a fine-tuned version of [distilroberta-base](https://huggingface.co/distilroberta-base) on the datasetX dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4514
- Accuracy: 0.8382
- F1: 0.8750
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| 0.5291 | 1.09 | 500 | 0.4514 | 0.8382 | 0.8750 |
| 0.3759 | 2.18 | 1000 | 0.6055 | 0.8382 | 0.8740 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 2,416 | [
[
-0.03424072265625,
-0.0435791015625,
0.01140594482421875,
0.0191192626953125,
-0.0291290283203125,
-0.0284271240234375,
-0.010711669921875,
-0.0013408660888671875,
-0.0008225440979003906,
0.01428985595703125,
-0.050872802734375,
-0.044952392578125,
-0.0580139160... |
ilkekas/bert-base-uncased-finetuned-cola | 2023-05-05T23:18:31.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | ilkekas | null | null | ilkekas/bert-base-uncased-finetuned-cola | 0 | 2 | transformers | 2023-05-05T15:13:01 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.5813817583744711
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7967
- Matthews Correlation: 0.5814
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.4877 | 1.0 | 535 | 0.5040 | 0.5045 |
| 0.2911 | 2.0 | 1070 | 0.4858 | 0.5761 |
| 0.1783 | 3.0 | 1605 | 0.7177 | 0.5306 |
| 0.1263 | 4.0 | 2140 | 0.7967 | 0.5814 |
| 0.0763 | 5.0 | 2675 | 0.9040 | 0.5782 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 2,018 | [
[
-0.02716064453125,
-0.049713134765625,
0.0079345703125,
0.0173187255859375,
-0.022491455078125,
-0.0192718505859375,
-0.01555633544921875,
-0.0127716064453125,
0.0262451171875,
0.0158843994140625,
-0.051116943359375,
-0.03411865234375,
-0.05328369140625,
-0.... |
HilbertS/a2c-AntBulletEnv-v0 | 2023-07-24T15:49:04.000Z | [
"stable-baselines3",
"AntBulletEnv-v0",
"deep-reinforcement-learning",
"reinforcement-learning",
"model-index",
"region:us"
] | reinforcement-learning | HilbertS | null | null | HilbertS/a2c-AntBulletEnv-v0 | 0 | 2 | stable-baselines3 | 2023-05-05T15:13:22 | ---
library_name: stable-baselines3
tags:
- AntBulletEnv-v0
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: A2C
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: AntBulletEnv-v0
type: AntBulletEnv-v0
metrics:
- type: mean_reward
value: 1210.21 +/- 147.67
name: mean_reward
verified: false
---
# **A2C** Agent playing **AntBulletEnv-v0**
This is a trained model of an **A2C** agent playing **AntBulletEnv-v0**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3).
## Usage (with Stable-baselines3)
Load the trained agent from the Hub. A minimal sketch; the checkpoint filename below is an assumption, so check the repository's file list:
```python
from stable_baselines3 import A2C
from huggingface_sb3 import load_from_hub

# Download the checkpoint from the Hub and load it (filename is an assumption).
checkpoint = load_from_hub(repo_id="HilbertS/a2c-AntBulletEnv-v0",
                           filename="a2c-AntBulletEnv-v0.zip")
model = A2C.load(checkpoint)
```
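Once loaded, the agent can be evaluated in the usual way. A short sketch, continuing from the `model` above (AntBulletEnv-v0 is registered with gym by importing `pybullet_envs`, which assumes `pybullet` is installed; the episode count is an arbitrary choice):
```python
import gym
import pybullet_envs  # noqa: F401 -- importing this registers AntBulletEnv-v0
from stable_baselines3.common.evaluation import evaluate_policy

env = gym.make("AntBulletEnv-v0")
mean_reward, std_reward = evaluate_policy(model, env, n_eval_episodes=10)
print(f"mean_reward={mean_reward:.2f} +/- {std_reward:.2f}")
```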
| 791 | [
[
-0.02679443359375,
-0.04443359375,
0.0106964111328125,
0.0208892822265625,
-0.0034961700439453125,
0.0018033981323242188,
0.0187530517578125,
-0.0176544189453125,
0.0193939208984375,
0.0265655517578125,
-0.052642822265625,
-0.037506103515625,
-0.04425048828125,
... |
rethem-expeditecommerce/MiniLM-L6-GPL | 2023-05-09T15:46:31.000Z | [
"sentence-transformers",
"pytorch",
"bert",
"feature-extraction",
"sentence-similarity",
"en",
"dataset:s2orc",
"dataset:flax-sentence-embeddings/stackexchange_xml",
"dataset:ms_marco",
"dataset:gooaq",
"dataset:yahoo_answers_topics",
"dataset:code_search_net",
"dataset:search_qa",
"datase... | sentence-similarity | rethem-expeditecommerce | null | null | rethem-expeditecommerce/MiniLM-L6-GPL | 0 | 2 | sentence-transformers | 2023-05-05T15:17:37 | ---
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
language: en
license: apache-2.0
datasets:
- s2orc
- flax-sentence-embeddings/stackexchange_xml
- ms_marco
- gooaq
- yahoo_answers_topics
- code_search_net
- search_qa
- eli5
- snli
- multi_nli
- wikihow
- natural_questions
- trivia_qa
- embedding-data/sentence-compression
- embedding-data/flickr30k-captions
- embedding-data/altlex
- embedding-data/simple-wiki
- embedding-data/QQP
- embedding-data/SPECTER
- embedding-data/PAQ_pairs
- embedding-data/WikiAnswers
---
# all-MiniLM-L6-v2
This is a [sentence-transformers](https://www.SBERT.net) model: it maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for tasks like clustering or semantic search.
## Usage (Sentence-Transformers)
Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed:
```
pip install -U sentence-transformers
```
Then you can use the model like this:
```python
from sentence_transformers import SentenceTransformer
sentences = ["This is an example sentence", "Each sentence is converted"]
model = SentenceTransformer('sentence-transformers/all-MiniLM-L6-v2')
embeddings = model.encode(sentences)
print(embeddings)
```
## Usage (HuggingFace Transformers)
Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: first, you pass your input through the transformer model, then you apply the right pooling operation on top of the contextualized word embeddings.
```python
from transformers import AutoTokenizer, AutoModel
import torch
import torch.nn.functional as F
#Mean Pooling - Take attention mask into account for correct averaging
def mean_pooling(model_output, attention_mask):
token_embeddings = model_output[0] #First element of model_output contains all token embeddings
input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)
# Sentences we want sentence embeddings for
sentences = ['This is an example sentence', 'Each sentence is converted']
# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('sentence-transformers/all-MiniLM-L6-v2')
model = AutoModel.from_pretrained('sentence-transformers/all-MiniLM-L6-v2')
# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
# Compute token embeddings
with torch.no_grad():
model_output = model(**encoded_input)
# Perform pooling
sentence_embeddings = mean_pooling(model_output, encoded_input['attention_mask'])
# Normalize embeddings
sentence_embeddings = F.normalize(sentence_embeddings, p=2, dim=1)
print("Sentence embeddings:")
print(sentence_embeddings)
```
## Evaluation Results
For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name=sentence-transformers/all-MiniLM-L6-v2)
------
## Background
The project aims to train sentence embedding models on very large sentence-level datasets using a self-supervised
contrastive learning objective. We used the pretrained [`nreimers/MiniLM-L6-H384-uncased`](https://huggingface.co/nreimers/MiniLM-L6-H384-uncased) model and fine-tuned it on a
dataset of 1B sentence pairs. We use a contrastive learning objective: given a sentence from a pair, the model should predict which sentence, out of a set of randomly sampled other sentences, was actually paired with it in our dataset.
We developed this model during the
[Community week using JAX/Flax for NLP & CV](https://discuss.huggingface.co/t/open-to-the-community-community-week-using-jax-flax-for-nlp-cv/7104)
organized by Hugging Face, as part of the project
[Train the Best Sentence Embedding Model Ever with 1B Training Pairs](https://discuss.huggingface.co/t/train-the-best-sentence-embedding-model-ever-with-1b-training-pairs/7354). We benefited from efficient hardware infrastructure to run the project (seven TPU v3-8s), as well as guidance from Google's Flax, JAX, and Cloud team members on efficient deep learning frameworks.
## Intended uses
Our model is intended to be used as a sentence and short paragraph encoder. Given an input text, it outputs a vector which captures
the semantic information. The sentence vector may be used for information retrieval, clustering, or sentence similarity tasks.
By default, input text longer than 256 word pieces is truncated.
## Training procedure
### Pre-training
We use the pretrained [`nreimers/MiniLM-L6-H384-uncased`](https://huggingface.co/nreimers/MiniLM-L6-H384-uncased) model. Please refer to the model card for more detailed information about the pre-training procedure.
### Fine-tuning
We fine-tune the model using a contrastive objective. Formally, we compute the cosine similarity between each possible sentence pair in the batch.
We then apply the cross-entropy loss by comparing against the true pairs.
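To make the objective concrete, here is a minimal sketch of that in-batch loss, assuming the usual multiple-negatives formulation; the `scale` temperature is an illustrative assumption, not the exact training value:
```python
import torch
import torch.nn.functional as F

def in_batch_contrastive_loss(anchor_emb, positive_emb, scale=20.0):
    # anchor_emb, positive_emb: (batch, dim) embeddings of paired sentences
    anchor = F.normalize(anchor_emb, p=2, dim=1)
    positive = F.normalize(positive_emb, p=2, dim=1)
    scores = anchor @ positive.T * scale  # cosine similarities, (batch, batch)
    # the true pair for row i sits on the diagonal
    labels = torch.arange(scores.size(0), device=scores.device)
    return F.cross_entropy(scores, labels)
```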
#### Hyper parameters
We trained our model on a TPU v3-8, for 100k steps using a batch size of 1024 (128 per TPU core).
We used a learning-rate warm-up of 500 steps. The sequence length was limited to 128 tokens. We used the AdamW optimizer with
a 2e-5 learning rate. The full training script is accessible in this current repository: `train_script.py`.
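A minimal optimizer/scheduler sketch matching these settings; the linear decay shape is an assumption, and the authoritative version is `train_script.py` in the repository:
```python
import torch
from transformers import AutoModel, get_linear_schedule_with_warmup

model = AutoModel.from_pretrained("nreimers/MiniLM-L6-H384-uncased")
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
# 500 warm-up steps out of 100k total training steps, as stated above
scheduler = get_linear_schedule_with_warmup(
    optimizer, num_warmup_steps=500, num_training_steps=100_000)
```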
#### Training data
We use the concatenation of multiple datasets to fine-tune our model. The total number of sentence pairs is above 1 billion.
We sampled each dataset with a weighted probability; the configuration is detailed in the `data_config.json` file, and the per-dataset pair counts follow in the table below.
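As an illustration of the weighted sampling, here is a tiny sketch with made-up weights (the real values live in `data_config.json`):
```python
import random

# Hypothetical weights -- the actual values are in data_config.json.
weights = {"reddit": 0.6, "s2orc": 0.25, "stackexchange": 0.15}

def pick_dataset() -> str:
    # choose which dataset the next training batch is drawn from
    names, probs = zip(*weights.items())
    return random.choices(names, weights=probs, k=1)[0]
```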
| Dataset | Paper | Number of training tuples |
|--------------------------------------------------------|:----------------------------------------:|:--------------------------:|
| [Reddit comments (2015-2018)](https://github.com/PolyAI-LDN/conversational-datasets/tree/master/reddit) | [paper](https://arxiv.org/abs/1904.06472) | 726,484,430 |
| [S2ORC](https://github.com/allenai/s2orc) Citation pairs (Abstracts) | [paper](https://aclanthology.org/2020.acl-main.447/) | 116,288,806 |
| [WikiAnswers](https://github.com/afader/oqa#wikianswers-corpus) Duplicate question pairs | [paper](https://doi.org/10.1145/2623330.2623677) | 77,427,422 |
| [PAQ](https://github.com/facebookresearch/PAQ) (Question, Answer) pairs | [paper](https://arxiv.org/abs/2102.07033) | 64,371,441 |
| [S2ORC](https://github.com/allenai/s2orc) Citation pairs (Titles) | [paper](https://aclanthology.org/2020.acl-main.447/) | 52,603,982 |
| [S2ORC](https://github.com/allenai/s2orc) (Title, Abstract) | [paper](https://aclanthology.org/2020.acl-main.447/) | 41,769,185 |
| [Stack Exchange](https://huggingface.co/datasets/flax-sentence-embeddings/stackexchange_xml) (Title, Body) pairs | - | 25,316,456 |
| [Stack Exchange](https://huggingface.co/datasets/flax-sentence-embeddings/stackexchange_xml) (Title+Body, Answer) pairs | - | 21,396,559 |
| [Stack Exchange](https://huggingface.co/datasets/flax-sentence-embeddings/stackexchange_xml) (Title, Answer) pairs | - | 21,396,559 |
| [MS MARCO](https://microsoft.github.io/msmarco/) triplets | [paper](https://doi.org/10.1145/3404835.3462804) | 9,144,553 |
| [GOOAQ: Open Question Answering with Diverse Answer Types](https://github.com/allenai/gooaq) | [paper](https://arxiv.org/pdf/2104.08727.pdf) | 3,012,496 |
| [Yahoo Answers](https://www.kaggle.com/soumikrakshit/yahoo-answers-dataset) (Title, Answer) | [paper](https://proceedings.neurips.cc/paper/2015/hash/250cf8b51c773f3f8dc8b4be867a9a02-Abstract.html) | 1,198,260 |
| [Code Search](https://huggingface.co/datasets/code_search_net) | - | 1,151,414 |
| [COCO](https://cocodataset.org/#home) Image captions | [paper](https://link.springer.com/chapter/10.1007%2F978-3-319-10602-1_48) | 828,395|
| [SPECTER](https://github.com/allenai/specter) citation triplets | [paper](https://doi.org/10.18653/v1/2020.acl-main.207) | 684,100 |
| [Yahoo Answers](https://www.kaggle.com/soumikrakshit/yahoo-answers-dataset) (Question, Answer) | [paper](https://proceedings.neurips.cc/paper/2015/hash/250cf8b51c773f3f8dc8b4be867a9a02-Abstract.html) | 681,164 |
| [Yahoo Answers](https://www.kaggle.com/soumikrakshit/yahoo-answers-dataset) (Title, Question) | [paper](https://proceedings.neurips.cc/paper/2015/hash/250cf8b51c773f3f8dc8b4be867a9a02-Abstract.html) | 659,896 |
| [SearchQA](https://huggingface.co/datasets/search_qa) | [paper](https://arxiv.org/abs/1704.05179) | 582,261 |
| [Eli5](https://huggingface.co/datasets/eli5) | [paper](https://doi.org/10.18653/v1/p19-1346) | 325,475 |
| [Flickr 30k](https://shannon.cs.illinois.edu/DenotationGraph/) | [paper](https://transacl.org/ojs/index.php/tacl/article/view/229/33) | 317,695 |
| [Stack Exchange](https://huggingface.co/datasets/flax-sentence-embeddings/stackexchange_xml) Duplicate questions (titles) | | 304,525 |
| AllNLI ([SNLI](https://nlp.stanford.edu/projects/snli/) and [MultiNLI](https://cims.nyu.edu/~sbowman/multinli/)) | [paper SNLI](https://doi.org/10.18653/v1/d15-1075), [paper MultiNLI](https://doi.org/10.18653/v1/n18-1101) | 277,230 |
| [Stack Exchange](https://huggingface.co/datasets/flax-sentence-embeddings/stackexchange_xml) Duplicate questions (bodies) | | 250,519 |
| [Stack Exchange](https://huggingface.co/datasets/flax-sentence-embeddings/stackexchange_xml) Duplicate questions (titles+bodies) | | 250,460 |
| [Sentence Compression](https://github.com/google-research-datasets/sentence-compression) | [paper](https://www.aclweb.org/anthology/D13-1155/) | 180,000 |
| [Wikihow](https://github.com/pvl/wikihow_pairs_dataset) | [paper](https://arxiv.org/abs/1810.09305) | 128,542 |
| [Altlex](https://github.com/chridey/altlex/) | [paper](https://aclanthology.org/P16-1135.pdf) | 112,696 |
| [Quora Question Triplets](https://quoradata.quora.com/First-Quora-Dataset-Release-Question-Pairs) | - | 103,663 |
| [Simple Wikipedia](https://cs.pomona.edu/~dkauchak/simplification/) | [paper](https://www.aclweb.org/anthology/P11-2117/) | 102,225 |
| [Natural Questions (NQ)](https://ai.google.com/research/NaturalQuestions) | [paper](https://transacl.org/ojs/index.php/tacl/article/view/1455) | 100,231 |
| [SQuAD2.0](https://rajpurkar.github.io/SQuAD-explorer/) | [paper](https://aclanthology.org/P18-2124.pdf) | 87,599 |
| [TriviaQA](https://huggingface.co/datasets/trivia_qa) | - | 73,346 |
| **Total** | | **1,170,060,424** | | 10,610 | [
[
-0.0261077880859375,
-0.0640869140625,
0.024322509765625,
0.00792694091796875,
-0.0102996826171875,
-0.02105712890625,
-0.0173187255859375,
-0.0211334228515625,
0.025390625,
0.01397705078125,
-0.037109375,
-0.0400390625,
-0.04833984375,
0.00928497314453125,
... |
4bd4774h/bert-base-uncased-finetuned-cola | 2023-05-05T16:54:05.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | 4bd4774h | null | null | 4bd4774h/bert-base-uncased-finetuned-cola | 0 | 2 | transformers | 2023-05-05T15:20:24 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.5815775806078913
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 1.0375
- Matthews Correlation: 0.5816
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 2.999174630178768e-05
- train_batch_size: 8
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 6
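A minimal sketch of how these settings map onto `transformers.TrainingArguments`; the output directory is a placeholder, and Adam's betas/epsilon are the library defaults quoted above:
```python
from transformers import TrainingArguments

# Hyperparameters copied from the list above; output_dir is hypothetical.
args = TrainingArguments(
    output_dir="bert-base-uncased-finetuned-cola",
    learning_rate=2.999174630178768e-05,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=6,
)
```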
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.4594 | 1.0 | 1069 | 0.4619 | 0.5155 |
| 0.3105 | 2.0 | 2138 | 0.5069 | 0.5807 |
| 0.2003 | 3.0 | 3207 | 1.0033 | 0.5524 |
| 0.1074 | 4.0 | 4276 | 1.0375 | 0.5816 |
| 0.0715 | 5.0 | 5345 | 1.1228 | 0.5743 |
| 0.0355 | 6.0 | 6414 | 1.3127 | 0.5728 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 2,107 | [
[
-0.0272979736328125,
-0.04949951171875,
0.0081329345703125,
0.0141754150390625,
-0.022857666015625,
-0.0201568603515625,
-0.0153961181640625,
-0.01361083984375,
0.0257568359375,
0.01544189453125,
-0.053314208984375,
-0.033172607421875,
-0.05108642578125,
-0.... |
dudnspa0203/distilbert-base-uncased-finetuned-emotion | 2023-05-05T16:05:26.000Z | [
"transformers",
"pytorch",
"tensorboard",
"distilbert",
"text-classification",
"generated_from_trainer",
"dataset:emotion",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | dudnspa0203 | null | null | dudnspa0203/distilbert-base-uncased-finetuned-emotion | 0 | 2 | transformers | 2023-05-05T15:59:15 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- emotion
metrics:
- accuracy
- f1
model-index:
- name: distilbert-base-uncased-finetuned-emotion
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: emotion
type: emotion
config: split
split: validation
args: split
metrics:
- name: Accuracy
type: accuracy
value: 0.9265
- name: F1
type: f1
value: 0.9263631112132207
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-emotion
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the emotion dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2133
- Accuracy: 0.9265
- F1: 0.9264
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| 0.7946 | 1.0 | 250 | 0.3031 | 0.906 | 0.9021 |
| 0.2424 | 2.0 | 500 | 0.2133 | 0.9265 | 0.9264 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,848 | [
[
-0.038238525390625,
-0.041534423828125,
0.0154876708984375,
0.0218505859375,
-0.0262451171875,
-0.01934814453125,
-0.01320648193359375,
-0.00870513916015625,
0.01067352294921875,
0.00838470458984375,
-0.05670166015625,
-0.051666259765625,
-0.059173583984375,
... |
MertU/bert-base-uncased-finetuned-cola | 2023-05-08T14:47:11.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | MertU | null | null | MertU/bert-base-uncased-finetuned-cola | 0 | 2 | transformers | 2023-05-05T16:36:22 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.5936351080219947
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5433
- Matthews Correlation: 0.5936
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5.18e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| No log | 1.0 | 268 | 0.4637 | 0.5232 |
| 0.3892 | 2.0 | 536 | 0.5122 | 0.5227 |
| 0.3892 | 3.0 | 804 | 0.5433 | 0.5936 |
| 0.126 | 4.0 | 1072 | 0.8598 | 0.5551 |
| 0.126 | 5.0 | 1340 | 0.8732 | 0.5906 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 2,021 | [
[
-0.025390625,
-0.050506591796875,
0.007610321044921875,
0.016326904296875,
-0.0212554931640625,
-0.0189056396484375,
-0.016021728515625,
-0.015289306640625,
0.0260467529296875,
0.0158538818359375,
-0.05072021484375,
-0.032745361328125,
-0.05328369140625,
-0.... |
vg055/roberta-base-bne-finetuned-TripAdvisorDomainAdaptation-finetuned-e2-RestMex2023-tipo | 2023-05-05T18:37:12.000Z | [
"transformers",
"pytorch",
"tensorboard",
"roberta",
"text-classification",
"generated_from_trainer",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | text-classification | vg055 | null | null | vg055/roberta-base-bne-finetuned-TripAdvisorDomainAdaptation-finetuned-e2-RestMex2023-tipo | 0 | 2 | transformers | 2023-05-05T17:06:08 | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- f1
model-index:
- name: roberta-base-bne-finetuned-TripAdvisorDomainAdaptation-finetuned-e2-RestMex2023-tipo
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# roberta-base-bne-finetuned-TripAdvisorDomainAdaptation-finetuned-e2-RestMex2023-tipo
This model is a fine-tuned version of [vg055/roberta-base-bne-finetuned-TripAdvisorDomainAdaptation](https://huggingface.co/vg055/roberta-base-bne-finetuned-TripAdvisorDomainAdaptation) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0472
- F1: 0.9902
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss | F1 |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 0.0479 | 1.0 | 14159 | 0.0521 | 0.9878 |
| 0.0154 | 2.0 | 28318 | 0.0472 | 0.9902 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,592 | [
[
-0.038421630859375,
-0.043701171875,
0.01230621337890625,
0.01416778564453125,
-0.032257080078125,
-0.04425048828125,
-0.0157623291015625,
-0.01715087890625,
0.0078277587890625,
0.0321044921875,
-0.057952880859375,
-0.04547119140625,
-0.050384521484375,
-0.0... |
arslan01/bert-base-uncased-finetuned-cola | 2023-05-06T20:55:32.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | arslan01 | null | null | arslan01/bert-base-uncased-finetuned-cola | 0 | 2 | transformers | 2023-05-05T17:33:07 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.5076423377649488
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4633
- Matthews Correlation: 0.5076
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.4888 | 1.0 | 535 | 0.4633 | 0.5076 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,722 | [
[
-0.02569580078125,
-0.052459716796875,
0.0113677978515625,
0.0214080810546875,
-0.027923583984375,
-0.022064208984375,
-0.019195556640625,
-0.01529693603515625,
0.0255279541015625,
0.0164031982421875,
-0.049285888671875,
-0.030914306640625,
-0.050567626953125,
... |
Zeynoko/bert-base-uncased-finetuned-cola | 2023-05-07T18:47:34.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | Zeynoko | null | null | Zeynoko/bert-base-uncased-finetuned-cola | 0 | 2 | transformers | 2023-05-05T18:02:43 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.5099438022926766
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4571
- Matthews Correlation: 0.5099
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.485 | 1.0 | 535 | 0.4571 | 0.5099 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,716 | [
[
-0.025634765625,
-0.052276611328125,
0.0115966796875,
0.0210418701171875,
-0.0281219482421875,
-0.023529052734375,
-0.018768310546875,
-0.01494598388671875,
0.0254058837890625,
0.016754150390625,
-0.049468994140625,
-0.031402587890625,
-0.0506591796875,
-0.0... |
AskingAlex/exist-2023-task1 | 2023-05-07T22:09:01.000Z | [
"transformers",
"pytorch",
"bert",
"text-classification",
"generated_from_trainer",
"license:mit",
"endpoints_compatible",
"region:us"
] | text-classification | AskingAlex | null | null | AskingAlex/exist-2023-task1 | 0 | 2 | transformers | 2023-05-05T18:02:47 | ---
license: mit
tags:
- generated_from_trainer
metrics:
- f1
model-index:
- name: exist-2023-task1
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# exist-2023-task1
This model is a fine-tuned version of [microsoft/Multilingual-MiniLM-L12-H384](https://huggingface.co/microsoft/Multilingual-MiniLM-L12-H384) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1508
- F1: 0.9539
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | F1 |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| No log | 1.0 | 217 | 0.5111 | 0.7595 |
| No log | 2.0 | 434 | 0.4330 | 0.7788 |
| 0.5404 | 3.0 | 651 | 0.3532 | 0.8527 |
| 0.5404 | 4.0 | 868 | 0.3284 | 0.8439 |
| 0.3878 | 5.0 | 1085 | 0.2876 | 0.8875 |
| 0.3878 | 6.0 | 1302 | 0.2204 | 0.9212 |
| 0.299 | 7.0 | 1519 | 0.1917 | 0.9335 |
| 0.299 | 8.0 | 1736 | 0.1731 | 0.9452 |
| 0.299 | 9.0 | 1953 | 0.1570 | 0.9515 |
| 0.2339 | 10.0 | 2170 | 0.1508 | 0.9539 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.1+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,882 | [
[
-0.0305328369140625,
-0.032745361328125,
0.00937652587890625,
0.005397796630859375,
-0.015899658203125,
-0.0264892578125,
-0.007038116455078125,
-0.0194091796875,
0.0075225830078125,
0.013671875,
-0.0626220703125,
-0.040771484375,
-0.042266845703125,
-0.0080... |
uisikdag/ayla_ozetler3006_bertuncased | 2023-05-05T19:09:58.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"license:mit",
"endpoints_compatible",
"region:us"
] | text-classification | uisikdag | null | null | uisikdag/ayla_ozetler3006_bertuncased | 0 | 2 | transformers | 2023-05-05T18:24:39 | ---
license: mit
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: ayla_ozetler3006_bertuncased
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# ayla_ozetler3006_bertuncased
This model is a fine-tuned version of [dbmdz/bert-base-turkish-uncased](https://huggingface.co/dbmdz/bert-base-turkish-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2677
- Accuracy: 0.9148
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.7743 | 1.0 | 10 | 1.6741 | 0.3389 |
| 1.3789 | 2.0 | 20 | 0.9867 | 0.6907 |
| 0.6919 | 3.0 | 30 | 0.4551 | 0.8278 |
| 0.364 | 4.0 | 40 | 0.3367 | 0.8778 |
| 0.2237 | 5.0 | 50 | 0.2699 | 0.8944 |
| 0.1481 | 6.0 | 60 | 0.3266 | 0.8667 |
| 0.1267 | 7.0 | 70 | 0.2515 | 0.9111 |
| 0.0726 | 8.0 | 80 | 0.2603 | 0.9167 |
| 0.0567 | 9.0 | 90 | 0.2595 | 0.9111 |
| 0.0461 | 10.0 | 100 | 0.2677 | 0.9148 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.11.0
| 2,026 | [
[
-0.041015625,
-0.046600341796875,
0.0011281967163085938,
0.0081939697265625,
-0.0205535888671875,
-0.02783203125,
-0.010528564453125,
-0.0165863037109375,
0.015777587890625,
0.025909423828125,
-0.056549072265625,
-0.048980712890625,
-0.050750732421875,
-0.01... |
tunaozates/bert-base-uncased-finetuned-cola | 2023-05-05T18:34:57.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | tunaozates | null | null | tunaozates/bert-base-uncased-finetuned-cola | 0 | 2 | transformers | 2023-05-05T18:25:08 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.5372712841497043
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4558
- Matthews Correlation: 0.5373
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.4965 | 1.0 | 535 | 0.4558 | 0.5373 |
### Framework versions
- Transformers 4.28.1
- Pytorch 1.13.1+cu117
- Datasets 2.11.0
- Tokenizers 0.12.1
| 1,723 | [
[
-0.0254364013671875,
-0.0526123046875,
0.01070404052734375,
0.0209808349609375,
-0.0282745361328125,
-0.0215606689453125,
-0.019439697265625,
-0.0144805908203125,
0.02557373046875,
0.0164794921875,
-0.048828125,
-0.03094482421875,
-0.050811767578125,
-0.0203... |
cansurav/bert-base-uncased-finetuned-cola-batch-8 | 2023-05-06T16:20:06.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | cansurav | null | null | cansurav/bert-base-uncased-finetuned-cola-batch-8 | 0 | 2 | transformers | 2023-05-05T18:52:21 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola-batch-8
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.5909903281139832
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola-batch-8
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6364
- Matthews Correlation: 0.5910
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:-----:|:---------------:|:--------------------:|
| 0.4737 | 1.0 | 1069 | 0.6579 | 0.4918 |
| 0.3331 | 2.0 | 2138 | 0.6364 | 0.5910 |
| 0.2223 | 3.0 | 3207 | 0.8108 | 0.5658 |
| 0.1445 | 4.0 | 4276 | 0.9036 | 0.5832 |
| 0.0841 | 5.0 | 5345 | 1.0537 | 0.5727 |
| 0.0634 | 6.0 | 6414 | 1.2565 | 0.5763 |
| 0.0384 | 7.0 | 7483 | 1.2944 | 0.5881 |
| 0.0278 | 8.0 | 8552 | 1.3246 | 0.5902 |
| 0.0251 | 9.0 | 9621 | 1.4406 | 0.5651 |
| 0.0091 | 10.0 | 10690 | 1.4599 | 0.5685 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 2,415 | [
[
-0.03302001953125,
-0.045562744140625,
0.01013946533203125,
0.01180267333984375,
-0.0198516845703125,
-0.0164794921875,
-0.01192474365234375,
-0.0100250244140625,
0.0297088623046875,
0.0171661376953125,
-0.052703857421875,
-0.037689208984375,
-0.0546875,
-0.... |
vg055/roberta-base-bne-finetuned-TripAdvisorDomainAdaptation-finetuned-e2-RestMex2023-pais | 2023-05-06T07:39:48.000Z | [
"transformers",
"pytorch",
"tensorboard",
"roberta",
"text-classification",
"generated_from_trainer",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | text-classification | vg055 | null | null | vg055/roberta-base-bne-finetuned-TripAdvisorDomainAdaptation-finetuned-e2-RestMex2023-pais | 0 | 2 | transformers | 2023-05-05T18:57:35 | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- f1
model-index:
- name: roberta-base-bne-finetuned-TripAdvisorDomainAdaptation-finetuned-e2-RestMex2023-pais
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# roberta-base-bne-finetuned-TripAdvisorDomainAdaptation-finetuned-e2-RestMex2023-pais
This model is a fine-tuned version of [vg055/roberta-base-bne-finetuned-TripAdvisorDomainAdaptation](https://huggingface.co/vg055/roberta-base-bne-finetuned-TripAdvisorDomainAdaptation) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2102
- F1: 0.9437
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss | F1 |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 0.1847 | 1.0 | 14159 | 0.1800 | 0.9383 |
| 0.0931 | 2.0 | 28318 | 0.2102 | 0.9437 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,592 | [
[
-0.03961181640625,
-0.044158935546875,
0.0144195556640625,
0.014678955078125,
-0.031494140625,
-0.044647216796875,
-0.01556396484375,
-0.01505279541015625,
0.007740020751953125,
0.031890869140625,
-0.058135986328125,
-0.04376220703125,
-0.050201416015625,
-0... |
sepehrbakhshi/bert-base-uncased-finetuned-cola_sepehr | 2023-05-05T22:35:25.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | sepehrbakhshi | null | null | sepehrbakhshi/bert-base-uncased-finetuned-cola_sepehr | 0 | 2 | transformers | 2023-05-05T20:10:00 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola_sepehr
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.5398085142164725
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola_sepehr
This model is a fine-tuned version of [sepehrbakhshi/bert-base-uncased-finetuned-cola](https://huggingface.co/sepehrbakhshi/bert-base-uncased-finetuned-cola) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6254
- Matthews Correlation: 0.5398
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.1725 | 1.0 | 535 | 0.6254 | 0.5398 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,794 | [
[
-0.03228759765625,
-0.052459716796875,
0.0091094970703125,
0.0174407958984375,
-0.03369140625,
-0.021331787109375,
-0.019805908203125,
-0.016510009765625,
0.0261688232421875,
0.0169677734375,
-0.04949951171875,
-0.0301666259765625,
-0.052642822265625,
-0.014... |
dkoh12/distilbert-base-uncased-finetuned-clinc | 2023-05-06T01:36:20.000Z | [
"transformers",
"pytorch",
"tensorboard",
"distilbert",
"text-classification",
"generated_from_trainer",
"dataset:clinc_oos",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | dkoh12 | null | null | dkoh12/distilbert-base-uncased-finetuned-clinc | 0 | 2 | transformers | 2023-05-05T20:57:52 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- clinc_oos
metrics:
- accuracy
model-index:
- name: distilbert-base-uncased-finetuned-clinc
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: clinc_oos
type: clinc_oos
config: plus
split: validation
args: plus
metrics:
- name: Accuracy
type: accuracy
value: 0.9180645161290323
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-clinc
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the clinc_oos dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7720
- Accuracy: 0.9181
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 48
- eval_batch_size: 48
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 318 | 3.2887 | 0.7419 |
| 3.7868 | 2.0 | 636 | 1.8753 | 0.8371 |
| 3.7868 | 3.0 | 954 | 1.1570 | 0.8961 |
| 1.6927 | 4.0 | 1272 | 0.8573 | 0.9129 |
| 0.9056 | 5.0 | 1590 | 0.7720 | 0.9181 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,932 | [
[
-0.034393310546875,
-0.0416259765625,
0.012481689453125,
0.006755828857421875,
-0.027496337890625,
-0.0255126953125,
-0.0127716064453125,
-0.009307861328125,
0.0024433135986328125,
0.022125244140625,
-0.0465087890625,
-0.047943115234375,
-0.05810546875,
-0.0... |
dkoh12/distilbert-base-uncased-distilled-clinc | 2023-05-06T02:11:31.000Z | [
"transformers",
"pytorch",
"distilbert",
"text-classification",
"generated_from_trainer",
"dataset:clinc_oos",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | dkoh12 | null | null | dkoh12/distilbert-base-uncased-distilled-clinc | 0 | 2 | transformers | 2023-05-05T23:42:44 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- clinc_oos
metrics:
- accuracy
model-index:
- name: distilbert-base-uncased-distilled-clinc
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: clinc_oos
type: clinc_oos
config: plus
split: validation
args: plus
metrics:
- name: Accuracy
type: accuracy
value: 0.9441935483870968
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-distilled-clinc
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the clinc_oos dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1210
- Accuracy: 0.9442
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 48
- eval_batch_size: 48
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 318 | 0.7532 | 0.7613 |
| 0.9596 | 2.0 | 636 | 0.3779 | 0.8910 |
| 0.9596 | 3.0 | 954 | 0.2265 | 0.9239 |
| 0.3532 | 4.0 | 1272 | 0.1705 | 0.9345 |
| 0.1878 | 5.0 | 1590 | 0.1473 | 0.9390 |
| 0.1878 | 6.0 | 1908 | 0.1349 | 0.9419 |
| 0.1415 | 7.0 | 2226 | 0.1279 | 0.9452 |
| 0.1226 | 8.0 | 2544 | 0.1240 | 0.9448 |
| 0.1226 | 9.0 | 2862 | 0.1217 | 0.9435 |
| 0.1149 | 10.0 | 3180 | 0.1210 | 0.9442 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 2,243 | [
[
-0.032196044921875,
-0.0384521484375,
0.0133514404296875,
0.006931304931640625,
-0.0252685546875,
-0.0196533203125,
-0.008880615234375,
-0.006320953369140625,
0.0081939697265625,
0.0207061767578125,
-0.0430908203125,
-0.04925537109375,
-0.06134033203125,
-0.... |
sepehrbakhshi/bert-base-uncased-finetuned-cola_sepehr_final_last | 2023-05-06T00:36:18.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | sepehrbakhshi | null | null | sepehrbakhshi/bert-base-uncased-finetuned-cola_sepehr_final_last | 0 | 2 | transformers | 2023-05-06T00:22:08 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola_sepehr_final_last
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.523501779881147
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola_sepehr_final_last
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5483
- Matthews Correlation: 0.5235
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.2718 | 1.0 | 535 | 0.5483 | 0.5235 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,757 | [
[
-0.02685546875,
-0.0478515625,
0.0171051025390625,
0.01715087890625,
-0.030792236328125,
-0.025115966796875,
-0.0197906494140625,
-0.0146942138671875,
0.0221099853515625,
0.0168609619140625,
-0.048736572265625,
-0.03692626953125,
-0.051361083984375,
-0.02229... |
sepehrbakhshi/bert-base-uncased-finetuned-cola_sepehr_sepehr | 2023-05-06T00:57:23.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | sepehrbakhshi | null | null | sepehrbakhshi/bert-base-uncased-finetuned-cola_sepehr_sepehr | 0 | 2 | transformers | 2023-05-06T00:44:40 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola_sepehr_sepehr
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.5179506685735915
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola_sepehr_sepehr
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5185
- Matthews Correlation: 0.5180
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.2612 | 1.0 | 535 | 0.5185 | 0.5180 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,750 | [
[
-0.028076171875,
-0.050628662109375,
0.01324462890625,
0.018402099609375,
-0.0309295654296875,
-0.025421142578125,
-0.0196990966796875,
-0.0149993896484375,
0.02490234375,
0.0177459716796875,
-0.049835205078125,
-0.03546142578125,
-0.0517578125,
-0.023941040... |
sepehrbakhshi/bert-base-uncased-finetuned-cola_sepehr_sepehr_sepehr | 2023-05-06T01:04:25.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | sepehrbakhshi | null | null | sepehrbakhshi/bert-base-uncased-finetuned-cola_sepehr_sepehr_sepehr | 0 | 2 | transformers | 2023-05-06T01:03:11 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola_sepehr_sepehr_sepehr
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.5079531963854501
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola_sepehr_sepehr_sepehr
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4582
- Matthews Correlation: 0.5080
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.4924 | 1.0 | 535 | 0.4582 | 0.5080 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,764 | [
[
-0.0278472900390625,
-0.049468994140625,
0.0139923095703125,
0.0182647705078125,
-0.03155517578125,
-0.0260467529296875,
-0.01959228515625,
-0.01535797119140625,
0.0240936279296875,
0.0167694091796875,
-0.0491943359375,
-0.0364990234375,
-0.052825927734375,
... |
huggingtweets/nanofaux | 2023-05-06T01:35:05.000Z | [
"transformers",
"pytorch",
"gpt2",
"text-generation",
"huggingtweets",
"en",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | text-generation | huggingtweets | null | null | huggingtweets/nanofaux | 0 | 2 | transformers | 2023-05-06T01:34:56 | ---
language: en
thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1619040835999260673/MdcqdOfL_400x400.jpg')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Nanofaux🔜AC</div>
<div style="text-align: center; font-size: 14px;">@nanofaux</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on tweets from Nanofaux🔜AC.
| Data | Nanofaux🔜AC |
| --- | --- |
| Tweets downloaded | 3224 |
| Retweets | 34 |
| Short tweets | 1422 |
| Tweets kept | 1768 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/7a1ouuel/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @nanofaux's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/7nxgc0jk) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/7nxgc0jk/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/nanofaux')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| 3,491 | [
[
-0.02777099609375,
-0.0667724609375,
0.0277099609375,
0.01342010498046875,
-0.0157470703125,
0.010498046875,
-0.00494384765625,
-0.034210205078125,
0.02520751953125,
0.005584716796875,
-0.07366943359375,
-0.0310821533203125,
-0.049163818359375,
-0.0080184936... |
TokenfreeEMNLPSubmission/mbert-base-finetuned-pos-ud-arabic-padt | 2023-05-06T03:51:43.000Z | [
"transformers",
"pytorch",
"bert",
"token-classification",
"canine",
"pretrained-on-english-language",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | token-classification | TokenfreeEMNLPSubmission | null | null | TokenfreeEMNLPSubmission/mbert-base-finetuned-pos-ud-arabic-padt | 0 | 2 | transformers | 2023-05-06T03:51:23 | ---
license: apache-2.0
tags:
- canine
- pretrained-on-english-language
---
### How to use
Here is how to use this model:
```python
from transformers import CanineModel
model = CanineModel.from_pretrained('mushfiqur11/<repo name>')
``` | 238 | [
[
-0.0042877197265625,
0.0004191398620605469,
-0.00183868408203125,
0.00743865966796875,
-0.0229339599609375,
-0.0040435791015625,
0.0148162841796875,
0.007778167724609375,
0.00664520263671875,
0.038421630859375,
-0.05633544921875,
-0.011505126953125,
-0.029418945... |
TokenfreeEMNLPSubmission/mbert-base-finetuned-pos-ud-coptic-scriptorium | 2023-05-06T03:52:19.000Z | [
"transformers",
"pytorch",
"bert",
"token-classification",
"canine",
"pretrained-on-english-language",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | token-classification | TokenfreeEMNLPSubmission | null | null | TokenfreeEMNLPSubmission/mbert-base-finetuned-pos-ud-coptic-scriptorium | 0 | 2 | transformers | 2023-05-06T03:52:02 | ---
license: apache-2.0
tags:
- canine
- pretrained-on-english-language
---
### How to use
Here is how to use this model:
```python
from transformers import CanineModel
model = CanineModel.from_pretrained('mushfiqur11/<repo name>')
``` | 238 | [
[
-0.0042877197265625,
0.0004191398620605469,
-0.00183868408203125,
0.00743865966796875,
-0.0229339599609375,
-0.0040435791015625,
0.0148162841796875,
0.007778167724609375,
0.00664520263671875,
0.038421630859375,
-0.05633544921875,
-0.011505126953125,
-0.029418945... |
TokenfreeEMNLPSubmission/mbert-base-finetuned-pos-ud-tamil-ttb | 2023-05-06T03:53:50.000Z | [
"transformers",
"pytorch",
"bert",
"token-classification",
"canine",
"pretrained-on-english-language",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | token-classification | TokenfreeEMNLPSubmission | null | null | TokenfreeEMNLPSubmission/mbert-base-finetuned-pos-ud-tamil-ttb | 0 | 2 | transformers | 2023-05-06T03:53:33 | ---
license: apache-2.0
tags:
- canine
- pretrained-on-english-language
---
### How to use
Here is how to use this model:
```python
from transformers import CanineModel
model = CanineModel.from_pretrained('mushfiqur11/<repo name>')
``` | 238 | [
[
-0.004283905029296875,
0.00045490264892578125,
-0.0018529891967773438,
0.007434844970703125,
-0.0229644775390625,
-0.004047393798828125,
0.0148162841796875,
0.00774383544921875,
0.00662994384765625,
0.038421630859375,
-0.056304931640625,
-0.01148223876953125,
-0... |
TokenfreeEMNLPSubmission/mbert-base-finetuned-pos-ud-vietnamese-vtb | 2023-05-06T03:54:09.000Z | [
"transformers",
"pytorch",
"bert",
"token-classification",
"canine",
"pretrained-on-english-language",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | token-classification | TokenfreeEMNLPSubmission | null | null | TokenfreeEMNLPSubmission/mbert-base-finetuned-pos-ud-vietnamese-vtb | 0 | 2 | transformers | 2023-05-06T03:53:52 | ---
license: apache-2.0
tags:
- canine
- pretrained-on-english-language
---
### How to use
Here is how to use this model:
```python
from transformers import CanineModel
model = CanineModel.from_pretrained('mushfiqur11/<repo name>')
``` | 238 | [
[
-0.004283905029296875,
0.00045490264892578125,
-0.0018529891967773438,
0.007434844970703125,
-0.0229644775390625,
-0.004047393798828125,
0.0148162841796875,
0.00774383544921875,
0.00662994384765625,
0.038421630859375,
-0.056304931640625,
-0.01148223876953125,
-0... |
TokenfreeEMNLPSubmission/canine-base-finetuned-masakhaner-conll_2003_en | 2023-05-06T03:55:44.000Z | [
"transformers",
"pytorch",
"canine",
"token-classification",
"pretrained-on-english-language",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | token-classification | TokenfreeEMNLPSubmission | null | null | TokenfreeEMNLPSubmission/canine-base-finetuned-masakhaner-conll_2003_en | 0 | 2 | transformers | 2023-05-06T03:55:30 | ---
license: apache-2.0
tags:
- canine
- pretrained-on-english-language
---
### How to use
Here is how to use this model:
```python
from transformers import CanineModel
model = CanineModel.from_pretrained('mushfiqur11/<repo name>')
``` | 238 | [
[
-0.0042877197265625,
0.0004191398620605469,
-0.00183868408203125,
0.00743865966796875,
-0.0229339599609375,
-0.0040435791015625,
0.0148162841796875,
0.007778167724609375,
0.00664520263671875,
0.038421630859375,
-0.05633544921875,
-0.011505126953125,
-0.029418945... |
TokenfreeEMNLPSubmission/canine-base-finetuned-pos-ud-hindi-hdtb | 2023-05-06T04:01:52.000Z | [
"transformers",
"pytorch",
"canine",
"token-classification",
"pretrained-on-english-language",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | token-classification | TokenfreeEMNLPSubmission | null | null | TokenfreeEMNLPSubmission/canine-base-finetuned-pos-ud-hindi-hdtb | 0 | 2 | transformers | 2023-05-06T04:01:39 | ---
license: apache-2.0
tags:
- canine
- pretrained-on-english-language
---
### How to use
Here is how to use this model:
```python
from transformers import CanineModel
model = CanineModel.from_pretrained('mushfiqur11/<repo name>')
``` | 238 | [
[
-0.004283905029296875,
0.00045490264892578125,
-0.0018529891967773438,
0.007434844970703125,
-0.0229644775390625,
-0.004047393798828125,
0.0148162841796875,
0.00774383544921875,
0.00662994384765625,
0.038421630859375,
-0.056304931640625,
-0.01148223876953125,
-0... |
TokenfreeEMNLPSubmission/bert-base-finetuned-pos-ud-arabic-padt | 2023-05-06T04:29:53.000Z | [
"transformers",
"pytorch",
"bert",
"token-classification",
"canine",
"pretrained-on-english-language",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | token-classification | TokenfreeEMNLPSubmission | null | null | TokenfreeEMNLPSubmission/bert-base-finetuned-pos-ud-arabic-padt | 0 | 2 | transformers | 2023-05-06T04:29:42 | ---
license: apache-2.0
tags:
- canine
- pretrained-on-english-language
---
### How to use
Here is how to use this model:
```python
from transformers import CanineModel
model = CanineModel.from_pretrained('mushfiqur11/<repo name>')
``` | 238 | [
[
-0.004283905029296875,
0.00045490264892578125,
-0.0018529891967773438,
0.007434844970703125,
-0.0229644775390625,
-0.004047393798828125,
0.0148162841796875,
0.00774383544921875,
0.00662994384765625,
0.038421630859375,
-0.056304931640625,
-0.01148223876953125,
-0... |
TokenfreeEMNLPSubmission/bert-base-finetuned-pos-ud-japanese-gsd | 2023-05-06T04:30:53.000Z | [
"transformers",
"pytorch",
"bert",
"token-classification",
"canine",
"pretrained-on-english-language",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | token-classification | TokenfreeEMNLPSubmission | null | null | TokenfreeEMNLPSubmission/bert-base-finetuned-pos-ud-japanese-gsd | 0 | 2 | transformers | 2023-05-06T04:30:41 | ---
license: apache-2.0
tags:
- canine
- pretrained-on-english-language
---
### How to use
Here is how to use this model:
```python
from transformers import CanineModel
model = CanineModel.from_pretrained('mushfiqur11/<repo name>')
``` | 238 | [
[
-0.004283905029296875,
0.00045490264892578125,
-0.0018529891967773438,
0.007434844970703125,
-0.0229644775390625,
-0.004047393798828125,
0.0148162841796875,
0.00774383544921875,
0.00662994384765625,
0.038421630859375,
-0.056304931640625,
-0.01148223876953125,
-0... |
gl198976/mpt-7b-instruct | 2023-05-06T05:52:17.000Z | [
"transformers",
"pytorch",
"mpt",
"text-generation",
"Composer",
"MosaicML",
"llm-foundry",
"custom_code",
"dataset:mosaicml/dolly_hhrlhf",
"arxiv:2205.14135",
"arxiv:2108.12409",
"arxiv:2010.04245",
"license:cc-by-sa-3.0",
"text-generation-inference",
"region:us"
] | text-generation | gl198976 | null | null | gl198976/mpt-7b-instruct | 1 | 2 | transformers | 2023-05-06T05:52:16 | ---
license: cc-by-sa-3.0
datasets:
- mosaicml/dolly_hhrlhf
tags:
- Composer
- MosaicML
- llm-foundry
inference: false
duplicated_from: mosaicml/mpt-7b-instruct
---
# MPT-7B-Instruct
MPT-7B-Instruct is a model for short-form instruction following.
It is built by finetuning [MPT-7B](https://huggingface.co/spaces/mosaicml/mpt-7b) on a [dataset](https://huggingface.co/datasets/sam-mosaic/dolly_hhrlhf) derived from the [Databricks Dolly-15k](https://huggingface.co/datasets/databricks/databricks-dolly-15k) and the [Anthropic Helpful and Harmless (HH-RLHF)](https://huggingface.co/datasets/Anthropic/hh-rlhf) datasets.
* License: _CC-By-SA-3.0_ (commercial use permitted)
* [Demo on Hugging Face Spaces](https://huggingface.co/spaces/mosaicml/mpt-7b-instruct)
This model was trained by [MosaicML](https://www.mosaicml.com) and follows a modified decoder-only transformer architecture.
## Model Date
May 5, 2023
## Model License
CC-By-SA-3.0 (commercial use permitted)
## Documentation
* [Blog post: Introducing MPT-7B: A New Standard for Open-Source, Commercially Usable LLMs](https://www.mosaicml.com/blog/mpt-7b)
* [Codebase (mosaicml/llm-foundry repo)](https://github.com/mosaicml/llm-foundry/)
* Questions: Feel free to contact us via the [MosaicML Community Slack](https://join.slack.com/t/mosaicml-community/shared_invite/zt-1btms90mc-GipE2ufuPkKY0QBrmF3LSA)!
### Example Question/Instruction
**Longboi24**:
> What is a quoll?
**MPT-7B-Instruct**:
> A Quoll (pronounced “cool”) is one of Australia’s native carnivorous marsupial mammals, which are also known as macropods or wallabies in other parts around Asia and South America
## How to Use
Note: This model requires that `trust_remote_code=True` be passed to the `from_pretrained` method. This is because we use a custom model architecture that is not yet part of the `transformers` package.
The custom `MPT` architecture includes options for many training efficiency features such as [FlashAttention (Dao et al. 2022)](https://arxiv.org/pdf/2205.14135.pdf), [ALiBi](https://arxiv.org/abs/2108.12409), [QK LayerNorm](https://arxiv.org/abs/2010.04245), and more.
```python
import transformers
model = transformers.AutoModelForCausalLM.from_pretrained(
'mosaicml/mpt-7b-instruct',
trust_remote_code=True
)
```
To use the optimized [triton implementation](https://github.com/openai/triton) of FlashAttention, you can load the model with `attn_impl='triton'` and move the model to `bfloat16`:
```python
import torch  # needed for torch_dtype=torch.bfloat16 below

config = transformers.AutoConfig.from_pretrained(
'mosaicml/mpt-7b-instruct',
trust_remote_code=True
)
config.attn_config['attn_impl'] = 'triton'
model = transformers.AutoModelForCausalLM.from_pretrained(
'mosaicml/mpt-7b-instruct',
config=config,
torch_dtype=torch.bfloat16,
trust_remote_code=True
)
model.to(device='cuda:0')
```
Although the model was trained with a sequence length of 2048, ALiBi enables users to increase the maximum sequence length during finetuning and/or inference. For example:
```python
config = transformers.AutoConfig.from_pretrained(
'mosaicml/mpt-7b-instruct',
trust_remote_code=True
)
config.update({"max_seq_len": 4096})
model = transformers.AutoModelForCausalLM.from_pretrained(
'mosaicml/mpt-7b-instruct',
config=config,
trust_remote_code=True
)
```
This model was trained with the [EleutherAI/gpt-neox-20b](https://huggingface.co/EleutherAI/gpt-neox-20b) tokenizer.
```python
from transformers import AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")
```
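With the model and tokenizer loaded, a minimal generation sketch (not part of the original card; the prompt echoes the example question above and the decoding settings are illustrative) looks like:
```python
import torch

inputs = tokenizer("What is a quoll?", return_tensors="pt").to("cuda:0")
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=100)  # greedy decoding by default
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```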
## Model Description
The architecture is a modification of a standard decoder-only transformer.
The model has been modified from a standard transformer in the following ways:
* It uses [FlashAttention](https://arxiv.org/pdf/2205.14135.pdf)
* It uses [ALiBi (Attention with Linear Biases)](https://arxiv.org/abs/2108.12409) and does not use positional embeddings
* It does not use biases
| Hyperparameter | Value |
|----------------|-------|
| n_parameters | 6.7B |
| n_layers | 32 |
| n_heads | 32 |
| d_model | 4096 |
| vocab size | 50432 |
| sequence length | 2048 |
## PreTraining Data
For more details on the pretraining process, see [MPT-7B](https://huggingface.co/mosaicml/mpt-7b).
The data was tokenized using the [EleutherAI/gpt-neox-20b](https://huggingface.co/EleutherAI/gpt-neox-20b) tokenizer.
## Limitations and Biases
_The following language is modified from [EleutherAI's GPT-NeoX-20B](https://huggingface.co/EleutherAI/gpt-neox-20b)_
MPT-7B-Instruct can produce factually incorrect output, and should not be relied on to produce factually accurate information.
MPT-7B-Instruct was trained on various public datasets.
While great efforts have been taken to clean the pretraining data, it is possible that this model could generate lewd, biased or otherwise offensive outputs.
## Acknowledgements
This model was finetuned by Sam Havens and the MosaicML NLP team.
## MosaicML Platform
If you're interested in [training](https://www.mosaicml.com/training) and [deploying](https://www.mosaicml.com/inference) your own MPT or LLMs on the MosaicML Platform, [sign up here](https://forms.mosaicml.com/demo?utm_source=huggingface&utm_medium=referral&utm_campaign=mpt-7b).
## Citation
Please cite this model using the following format:
```
@online{MosaicML2023Introducing,
author = {MosaicML NLP Team},
title = {Introducing MPT-7B: A New Standard for Open-Source, Commercially Usable LLMs},
year = {2023},
url = {www.mosaicml.com/blog/mpt-7b},
note = {Accessed: 2023-03-28}, % change this date
urldate = {2023-03-28} % change this date
}
``` | 6,036 | [
[
-0.040435791015625,
-0.031890869140625,
0.0178985595703125,
0.02978515625,
-0.032623291015625,
-0.0033111572265625,
0.00356292724609375,
-0.0220794677734375,
0.003711700439453125,
0.029571533203125,
-0.052398681640625,
-0.04150390625,
-0.047027587890625,
0.0... |
SHENMU007/neunit_BASE_V5.2 | 2023-05-06T08:34:32.000Z | [
"transformers",
"pytorch",
"tensorboard",
"speecht5",
"text-to-audio",
"1.1.0",
"generated_from_trainer",
"zh",
"dataset:facebook/voxpopuli",
"license:mit",
"endpoints_compatible",
"region:us"
] | text-to-audio | SHENMU007 | null | null | SHENMU007/neunit_BASE_V5.2 | 0 | 2 | transformers | 2023-05-06T06:05:45 | ---
language:
- zh
license: mit
tags:
- 1.1.0
- generated_from_trainer
datasets:
- facebook/voxpopuli
model-index:
- name: SpeechT5 TTS Dutch neunit
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# SpeechT5 TTS Dutch neunit
This model is a fine-tuned version of [microsoft/speecht5_tts](https://huggingface.co/microsoft/speecht5_tts) on the VoxPopuli dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a hedged `Seq2SeqTrainingArguments` sketch follows the list):
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 4000
- mixed_precision_training: Native AMP
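A hedged sketch, not part of the generated card, of how the values above might map onto `Seq2SeqTrainingArguments`; the `output_dir` and the mapping itself are assumptions, not the author's actual training script:
```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="neunit_BASE_V5.2",      # assumed output directory
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=4,      # 8 x 4 = total train batch size 32
    lr_scheduler_type="linear",
    warmup_steps=500,
    max_steps=4000,
    fp16=True,                          # "Native AMP" mixed precision
    seed=42,
)
```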
### Training results
### Framework versions
- Transformers 4.29.0.dev0
- Pytorch 2.0.0+cu117
- Datasets 2.11.0
- Tokenizers 0.12.1
| 1,251 | [
[
-0.0350341796875,
-0.051727294921875,
-0.005931854248046875,
0.01265716552734375,
-0.025390625,
-0.0193939208984375,
-0.01763916015625,
-0.0265045166015625,
0.0114288330078125,
0.021270751953125,
-0.0411376953125,
-0.050048828125,
-0.04315185546875,
0.008583... |
RussellHaley/my_awesome_qa_model | 2023-05-06T09:56:53.000Z | [
"transformers",
"pytorch",
"distilbert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | question-answering | RussellHaley | null | null | RussellHaley/my_awesome_qa_model | 0 | 2 | transformers | 2023-05-06T06:51:56 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- squad
model-index:
- name: my_awesome_qa_model
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# my_awesome_qa_model
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the squad dataset.
It achieves the following results on the evaluation set:
- Loss: 1.6098
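A hedged usage sketch, not part of the generated card; the question/context pair below is illustrative:
```python
from transformers import pipeline

qa = pipeline("question-answering", model="RussellHaley/my_awesome_qa_model")
result = qa(
    question="Which base model was fine-tuned?",
    context="This model is a fine-tuned version of distilbert-base-uncased on the squad dataset.",
)
print(result["answer"], result["score"])  # extracted answer span and its confidence
```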
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| No log | 1.0 | 250 | 2.3619 |
| 2.7558 | 2.0 | 500 | 1.6926 |
| 2.7558 | 3.0 | 750 | 1.6098 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu117
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,391 | [
[
-0.028289794921875,
-0.0478515625,
0.0165252685546875,
0.0170440673828125,
-0.02392578125,
-0.006496429443359375,
0.00865936279296875,
-0.0117645263671875,
0.004241943359375,
0.0188446044921875,
-0.06256103515625,
-0.045654296875,
-0.043731689453125,
-0.0060... |
scige/distilbert-base-uncased-finetuned-emotion | 2023-05-06T08:44:10.000Z | [
"transformers",
"pytorch",
"tensorboard",
"distilbert",
"text-classification",
"generated_from_trainer",
"dataset:emotion",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | scige | null | null | scige/distilbert-base-uncased-finetuned-emotion | 0 | 2 | transformers | 2023-05-06T07:36:25 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- emotion
metrics:
- accuracy
- f1
model-index:
- name: distilbert-base-uncased-finetuned-emotion
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: emotion
type: emotion
config: split
split: validation
args: split
metrics:
- name: Accuracy
type: accuracy
value: 0.925
- name: F1
type: f1
value: 0.9249177844653992
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-emotion
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the emotion dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2111
- Accuracy: 0.925
- F1: 0.9249
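A hedged usage sketch, not part of the generated card; the input sentence is illustrative:
```python
from transformers import pipeline

classifier = pipeline("text-classification",
                      model="scige/distilbert-base-uncased-finetuned-emotion")
print(classifier("I can't wait to see you again!"))  # e.g. [{'label': 'joy', 'score': ...}]
```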
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| 0.7959 | 1.0 | 250 | 0.2974 | 0.915 | 0.9123 |
| 0.2412 | 2.0 | 500 | 0.2111 | 0.925 | 0.9249 |
### Framework versions
- Transformers 4.27.4
- Pytorch 1.11.0+cu113
- Datasets 2.7.1
- Tokenizers 0.13.2
| 1,846 | [
[
-0.03778076171875,
-0.041900634765625,
0.01499176025390625,
0.0218963623046875,
-0.026123046875,
-0.0187835693359375,
-0.01282501220703125,
-0.00798797607421875,
0.010986328125,
0.00832366943359375,
-0.056488037109375,
-0.05169677734375,
-0.059814453125,
-0.... |
saikiranmaddukuri/stable-diffusion-sinop | 2023-05-06T07:47:24.000Z | [
"transformers",
"pytorch",
"tensorboard",
"git",
"text-generation",
"generated_from_trainer",
"license:mit",
"endpoints_compatible",
"region:us"
] | text-generation | saikiranmaddukuri | null | null | saikiranmaddukuri/stable-diffusion-sinop | 0 | 2 | transformers | 2023-05-06T07:38:33 | ---
license: mit
tags:
- generated_from_trainer
model-index:
- name: stable-diffusion-sinop
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# stable-diffusion-sinop
This model is a fine-tuned version of [microsoft/git-base](https://huggingface.co/microsoft/git-base) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 6.7572
- Wer Score: 72.7778
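A hedged usage sketch, not part of the generated card: GIT-based captioning checkpoints can typically be driven through the image-to-text pipeline (the image path below is illustrative):
```python
from transformers import pipeline

captioner = pipeline("image-to-text", model="saikiranmaddukuri/stable-diffusion-sinop")
print(captioner("example.jpg"))  # path or URL of an input image
```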
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer Score |
|:-------------:|:-----:|:----:|:---------------:|:---------:|
| 3.8191 | 50.0 | 50 | 6.7572 | 72.7778 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0
- Datasets 2.1.0
- Tokenizers 0.13.3
| 1,373 | [
[
-0.0247344970703125,
-0.048431396484375,
0.0197906494140625,
0.00977325439453125,
-0.0252532958984375,
-0.0260162353515625,
-0.00984954833984375,
-0.0013151168823242188,
0.01226043701171875,
0.0138397216796875,
-0.031646728515625,
-0.052154541015625,
-0.05007934... |
bonurtek/bert-base-uncased-finetuned-cola | 2023-05-07T19:00:13.000Z | [
"transformers",
"pytorch",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | bonurtek | null | null | bonurtek/bert-base-uncased-finetuned-cola | 0 | 2 | transformers | 2023-05-06T07:41:51 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.5879880120258366
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4827
- Matthews Correlation: 0.5880
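For reference, the Matthews correlation reported above can be computed from label/prediction arrays; a minimal sketch with toy values, not part of the generated card:
```python
from sklearn.metrics import matthews_corrcoef

y_true = [1, 0, 1, 1, 0]  # toy gold labels (1 = acceptable, 0 = unacceptable)
y_pred = [1, 0, 0, 1, 0]  # toy model predictions
print(matthews_corrcoef(y_true, y_pred))
```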
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2.1328795996187915e-05
- train_batch_size: 32
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| No log | 1.0 | 268 | 0.4427 | 0.4984 |
| 0.398 | 2.0 | 536 | 0.4827 | 0.5880 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu117
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,813 | [
[
-0.024505615234375,
-0.05364990234375,
0.00926971435546875,
0.0197601318359375,
-0.026123046875,
-0.020782470703125,
-0.019378662109375,
-0.016204833984375,
0.026397705078125,
0.016693115234375,
-0.0501708984375,
-0.0295562744140625,
-0.05181884765625,
-0.02... |
Mike00vito/best-xxl-multiCLS | 2023-05-06T09:17:06.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"endpoints_compatible",
"region:us"
] | text-classification | Mike00vito | null | null | Mike00vito/best-xxl-multiCLS | 0 | 2 | transformers | 2023-05-06T09:16:12 | ---
tags:
- generated_from_trainer
model-index:
- name: prova-xxl-multi
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# prova-xxl-multi
This model was trained from scratch on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 1.6447
- F1 Score: 0.8803
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2.0
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | F1 Score |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 81 | 1.9244 | 0.8787 |
| No log | 2.0 | 162 | 1.6447 | 0.8803 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,319 | [
[
-0.031097412109375,
-0.032135009765625,
0.0165252685546875,
0.006214141845703125,
-0.027496337890625,
-0.0260772705078125,
0.00738525390625,
-0.00763702392578125,
0.00937652587890625,
0.02789306640625,
-0.0631103515625,
-0.045745849609375,
-0.043670654296875,
... |
ilkekas/bert-base-uncased-finetuned2-cola | 2023-05-06T10:31:07.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | ilkekas | null | null | ilkekas/bert-base-uncased-finetuned2-cola | 0 | 2 | transformers | 2023-05-06T09:26:03 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned2-cola
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.5650459791482846
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned2-cola
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5176
- Matthews Correlation: 0.5650
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2.6781109393881056e-06
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 8
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.5726 | 1.0 | 535 | 0.5090 | 0.3912 |
| 0.4467 | 2.0 | 1070 | 0.4536 | 0.5024 |
| 0.3891 | 3.0 | 1605 | 0.5093 | 0.4943 |
| 0.3387 | 4.0 | 2140 | 0.4927 | 0.5365 |
| 0.3177 | 5.0 | 2675 | 0.4897 | 0.5624 |
| 0.2853 | 6.0 | 3210 | 0.5176 | 0.5650 |
| 0.2718 | 7.0 | 3745 | 0.5440 | 0.5524 |
| 0.2532 | 8.0 | 4280 | 0.5431 | 0.5602 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 2,259 | [
[
-0.0283050537109375,
-0.049530029296875,
0.007022857666015625,
0.01392364501953125,
-0.0204620361328125,
-0.0180511474609375,
-0.0130615234375,
-0.014739990234375,
0.027740478515625,
0.0160064697265625,
-0.051422119140625,
-0.0340576171875,
-0.05487060546875,
... |
sepehrbakhshi/bert-base-uncased-finetuned-cola_sepehr_sepehr_sepehr_saturday | 2023-05-06T09:49:19.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | sepehrbakhshi | null | null | sepehrbakhshi/bert-base-uncased-finetuned-cola_sepehr_sepehr_sepehr_saturday | 0 | 2 | transformers | 2023-05-06T09:47:14 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola_sepehr_sepehr_sepehr_saturday
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.513134547199089
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola_sepehr_sepehr_sepehr_saturday
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4518
- Matthews Correlation: 0.5131
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.4887 | 1.0 | 535 | 0.4518 | 0.5131 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,781 | [
[
-0.02581787109375,
-0.0496826171875,
0.0147552490234375,
0.0203094482421875,
-0.0302581787109375,
-0.02691650390625,
-0.0193328857421875,
-0.015716552734375,
0.0257110595703125,
0.0159759521484375,
-0.051300048828125,
-0.0367431640625,
-0.051483154296875,
-0... |
sepehrbakhshi/bert-base-uncased-finetuned-cola_sepehr_from_server | 2023-05-06T10:05:21.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | sepehrbakhshi | null | null | sepehrbakhshi/bert-base-uncased-finetuned-cola_sepehr_from_server | 0 | 2 | transformers | 2023-05-06T10:03:13 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola_sepehr_from_server
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.47550066760653964
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola_sepehr_from_server
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4843
- Matthews Correlation: 0.4755
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| No log | 1.0 | 268 | 0.4843 | 0.4755 |
### Framework versions
- Transformers 4.27.1
- Pytorch 2.0.0+cu117
- Datasets 2.12.0
- Tokenizers 0.13.2
| 1,761 | [
[
-0.0270538330078125,
-0.05438232421875,
0.011871337890625,
0.01861572265625,
-0.0308837890625,
-0.0241241455078125,
-0.0192718505859375,
-0.0184326171875,
0.0234222412109375,
0.0191650390625,
-0.051300048828125,
-0.033966064453125,
-0.05120849609375,
-0.0231... |
Mike00vito/best-multi-multiCLS | 2023-05-06T10:06:42.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"endpoints_compatible",
"region:us"
] | text-classification | Mike00vito | null | null | Mike00vito/best-multi-multiCLS | 0 | 2 | transformers | 2023-05-06T10:05:07 | ---
tags:
- generated_from_trainer
model-index:
- name: prova-multi-multi
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# prova-multi-multi
This model was trained from scratch on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 2.3772
- F1 Score: 0.8551
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2.0
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | F1 Score |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 81 | 2.3118 | 0.8617 |
| No log | 2.0 | 162 | 2.3772 | 0.8551 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,323 | [
[
-0.0316162109375,
-0.033935546875,
0.01514434814453125,
0.009674072265625,
-0.027099609375,
-0.024139404296875,
0.00527191162109375,
-0.00678253173828125,
0.006992340087890625,
0.0270538330078125,
-0.06268310546875,
-0.047393798828125,
-0.043792724609375,
-0... |
sepehrbakhshi/bert-base-uncased-finetuned-cola_sepehr_sepehr_sepehr_saturday_new | 2023-05-06T10:38:41.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | sepehrbakhshi | null | null | sepehrbakhshi/bert-base-uncased-finetuned-cola_sepehr_sepehr_sepehr_saturday_new | 0 | 2 | transformers | 2023-05-06T10:36:00 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola_sepehr_sepehr_sepehr_saturday_new
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.5126228485857701
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola_sepehr_sepehr_sepehr_saturday_new
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4760
- Matthews Correlation: 0.5126
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.4957 | 1.0 | 535 | 0.4760 | 0.5126 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,790 | [
[
-0.0274658203125,
-0.04986572265625,
0.01302337646484375,
0.019561767578125,
-0.031768798828125,
-0.026641845703125,
-0.01953125,
-0.0157928466796875,
0.025909423828125,
0.0160675048828125,
-0.050048828125,
-0.037109375,
-0.053009033203125,
-0.02326965332031... |
yigg/bert-base-uncased-finetuned-cola | 2023-05-06T13:25:27.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | yigg | null | null | yigg/bert-base-uncased-finetuned-cola | 0 | 2 | transformers | 2023-05-06T11:01:25 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.46698933079472565
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5629
- Matthews Correlation: 0.4670
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1.866149341238024e-06
- train_batch_size: 4
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.5043 | 1.0 | 2138 | 0.5637 | 0.3863 |
| 0.4399 | 2.0 | 4276 | 0.5629 | 0.4670 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,812 | [
[
-0.0258941650390625,
-0.05255126953125,
0.0105133056640625,
0.02032470703125,
-0.02545166015625,
-0.0216217041015625,
-0.0182952880859375,
-0.0152435302734375,
0.026397705078125,
0.0176239013671875,
-0.051239013671875,
-0.0302734375,
-0.051055908203125,
-0.0... |
orcan/bert-base-uncased-finetuned-cola | 2023-05-07T20:52:49.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | orcan | null | null | orcan/bert-base-uncased-finetuned-cola | 0 | 2 | transformers | 2023-05-06T11:28:08 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.5187251192358523
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4574
- Matthews Correlation: 0.5187
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 4.449201083603278e-06
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.5498 | 1.0 | 535 | 0.5006 | 0.4467 |
| 0.4212 | 2.0 | 1070 | 0.4574 | 0.5187 |
| 0.3631 | 3.0 | 1605 | 0.4944 | 0.5153 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,886 | [
[
-0.0262603759765625,
-0.05078125,
0.0106048583984375,
0.019012451171875,
-0.02459716796875,
-0.02032470703125,
-0.0169219970703125,
-0.01424407958984375,
0.0273590087890625,
0.0174560546875,
-0.05108642578125,
-0.031341552734375,
-0.051361083984375,
-0.02128... |
Mike00vito/best-xxl-singleCLS | 2023-05-06T11:38:36.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"endpoints_compatible",
"region:us"
] | text-classification | Mike00vito | null | null | Mike00vito/best-xxl-singleCLS | 0 | 2 | transformers | 2023-05-06T11:33:13 | ---
tags:
- generated_from_trainer
model-index:
- name: prova-xxl-single
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# prova-xxl-single
This model was trained from scratch on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4960
- F1 Score: 0.9484
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 4.0
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | F1 Score |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 369 | 0.8665 | 0.9145 |
| No log | 2.0 | 738 | 0.4302 | 0.9512 |
| No log | 3.0 | 1107 | 0.5309 | 0.9389 |
| No log | 4.0 | 1476 | 0.4960 | 0.9484 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,445 | [
[
-0.02880859375,
-0.035614013671875,
0.01229095458984375,
0.005626678466796875,
-0.02801513671875,
-0.032135009765625,
0.006511688232421875,
-0.004764556884765625,
0.01296234130859375,
0.0323486328125,
-0.0616455078125,
-0.0517578125,
-0.0408935546875,
-0.016... |
guoluo/Bert_class_PE_1e-09_followed_dropout_point2 | 2023-05-06T11:36:28.000Z | [
"transformers",
"tf",
"bert",
"text-classification",
"generated_from_keras_callback",
"endpoints_compatible",
"region:us"
] | text-classification | guoluo | null | null | guoluo/Bert_class_PE_1e-09_followed_dropout_point2 | 0 | 2 | transformers | 2023-05-06T11:35:41 | ---
tags:
- generated_from_keras_callback
model-index:
- name: Bert_class_PE_1e-09
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# Bert_class_PE_1e-09
This model is a fine-tuned version of [guoluo/Bert_1.5e_07](https://huggingface.co/guoluo/Bert_1.5e_07) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.9438
- Train Accuracy: 0.6776
- Validation Loss: 0.9651
- Validation Accuracy: 0.6761
- Train Lr: 9.920339e-10
- Epoch: 3999
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a hedged optimizer sketch follows the list):
- optimizer: {'name': 'Adam', 'learning_rate': 9.920339e-10, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}
- training_precision: float32
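A hedged sketch, not part of the generated card, reconstructing the optimizer configuration listed above in `tf.keras`:
```python
import tensorflow as tf

# Assumed reconstruction of the Adam config from the list above.
optimizer = tf.keras.optimizers.Adam(
    learning_rate=9.920339e-10,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
    amsgrad=False,
)
```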
### Training results
| Train Loss | Train Accuracy | Validation Loss | Validation Accuracy | Train Lr | Epoch |
|:----------:|:--------------:|:---------------:|:-------------------:|:------------:|:-----:|
| 1.4730 | 0.1647 | 1.5009 | 0.1338 | 1e-09 | 0 |
| 1.4744 | 0.1412 | 1.5003 | 0.1338 | 1e-09 | 1 |
| 1.4780 | 0.1388 | 1.4998 | 0.1338 | 1e-09 | 2 |
| 1.4773 | 0.1388 | 1.4993 | 0.1338 | 1e-09 | 3 |
| 1.4733 | 0.1482 | 1.4988 | 0.1338 | 1e-09 | 4 |
| 1.4676 | 0.1482 | 1.4983 | 0.1338 | 1e-09 | 5 |
| 1.4769 | 0.1388 | 1.4979 | 0.1338 | 1e-09 | 6 |
| 1.4704 | 0.1600 | 1.4974 | 0.1338 | 1e-09 | 7 |
| 1.4791 | 0.1435 | 1.4969 | 0.1338 | 1e-09 | 8 |
| 1.4696 | 0.1482 | 1.4963 | 0.1338 | 1e-09 | 9 |
| 1.4714 | 0.1506 | 1.4959 | 0.1338 | 1e-09 | 10 |
| 1.4701 | 0.1365 | 1.4954 | 0.1338 | 1e-09 | 11 |
| 1.4626 | 0.1482 | 1.4949 | 0.1338 | 1e-09 | 12 |
| 1.4725 | 0.1553 | 1.4945 | 0.1338 | 1e-09 | 13 |
| 1.4704 | 0.1435 | 1.4940 | 0.1338 | 1e-09 | 14 |
| 1.4720 | 0.1435 | 1.4935 | 0.1338 | 1e-09 | 15 |
| 1.4724 | 0.1388 | 1.4930 | 0.1338 | 1e-09 | 16 |
| 1.4749 | 0.1388 | 1.4925 | 0.1338 | 1e-09 | 17 |
| 1.4697 | 0.1388 | 1.4921 | 0.1338 | 1e-09 | 18 |
| 1.4736 | 0.1294 | 1.4916 | 0.1338 | 1e-09 | 19 |
| 1.4678 | 0.1412 | 1.4911 | 0.1338 | 1e-09 | 20 |
| 1.4649 | 0.1459 | 1.4906 | 0.1338 | 1e-09 | 21 |
| 1.4681 | 0.1576 | 1.4901 | 0.1338 | 1e-09 | 22 |
| 1.4672 | 0.1576 | 1.4895 | 0.1338 | 1e-09 | 23 |
| 1.4636 | 0.1412 | 1.4890 | 0.1338 | 1e-09 | 24 |
| 1.4660 | 0.1600 | 1.4885 | 0.1338 | 1e-09 | 25 |
| 1.4692 | 0.1576 | 1.4880 | 0.1338 | 1e-09 | 26 |
| 1.4693 | 0.1482 | 1.4876 | 0.1338 | 1e-09 | 27 |
| 1.4627 | 0.1506 | 1.4871 | 0.1338 | 1e-09 | 28 |
| 1.4676 | 0.1529 | 1.4867 | 0.1338 | 1e-09 | 29 |
| 1.4606 | 0.1529 | 1.4862 | 0.1338 | 1e-09 | 30 |
| 1.4697 | 0.1412 | 1.4857 | 0.1338 | 1e-09 | 31 |
| 1.4638 | 0.1435 | 1.4852 | 0.1338 | 1e-09 | 32 |
| 1.4613 | 0.1435 | 1.4847 | 0.1338 | 1e-09 | 33 |
| 1.4583 | 0.1435 | 1.4842 | 0.1338 | 1e-09 | 34 |
| 1.4584 | 0.1576 | 1.4837 | 0.1338 | 1e-09 | 35 |
| 1.4557 | 0.1553 | 1.4833 | 0.1338 | 1e-09 | 36 |
| 1.4531 | 0.1529 | 1.4828 | 0.1338 | 1e-09 | 37 |
| 1.4552 | 0.1506 | 1.4824 | 0.1338 | 1e-09 | 38 |
| 1.4584 | 0.1506 | 1.4820 | 0.1338 | 1e-09 | 39 |
| 1.4646 | 0.1694 | 1.4815 | 0.1268 | 1e-09 | 40 |
| 1.4597 | 0.1412 | 1.4810 | 0.1268 | 1e-09 | 41 |
| 1.4597 | 0.1365 | 1.4806 | 0.1268 | 1e-09 | 42 |
| 1.4515 | 0.1671 | 1.4801 | 0.1268 | 1e-09 | 43 |
| 1.4508 | 0.1341 | 1.4796 | 0.1338 | 1e-09 | 44 |
| 1.4511 | 0.1529 | 1.4792 | 0.1338 | 1e-09 | 45 |
| 1.4520 | 0.1459 | 1.4787 | 0.1338 | 1e-09 | 46 |
| 1.4547 | 0.1788 | 1.4782 | 0.1338 | 1e-09 | 47 |
| 1.4570 | 0.1624 | 1.4777 | 0.1338 | 1e-09 | 48 |
| 1.4486 | 0.1506 | 1.4773 | 0.1338 | 1e-09 | 49 |
| 1.4544 | 0.1671 | 1.4768 | 0.1338 | 1e-09 | 50 |
| 1.4519 | 0.1576 | 1.4764 | 0.1338 | 1e-09 | 51 |
| 1.4503 | 0.1553 | 1.4760 | 0.1338 | 1e-09 | 52 |
| 1.4527 | 0.1412 | 1.4755 | 0.1338 | 1e-09 | 53 |
| 1.4522 | 0.1482 | 1.4750 | 0.1338 | 1e-09 | 54 |
| 1.4562 | 0.1412 | 1.4745 | 0.1338 | 1e-09 | 55 |
| 1.4444 | 0.1412 | 1.4740 | 0.1338 | 9.999999e-10 | 56 |
| 1.4459 | 0.1341 | 1.4735 | 0.1338 | 9.999997e-10 | 57 |
| 1.4506 | 0.1435 | 1.4731 | 0.1338 | 9.999996e-10 | 58 |
| 1.4536 | 0.1412 | 1.4726 | 0.1338 | 9.999995e-10 | 59 |
| 1.4503 | 0.1506 | 1.4722 | 0.1338 | 9.999994e-10 | 60 |
| 1.4466 | 0.1553 | 1.4717 | 0.1338 | 9.999993e-10 | 61 |
| 1.4540 | 0.1506 | 1.4713 | 0.1338 | 9.999992e-10 | 62 |
| 1.4448 | 0.1553 | 1.4708 | 0.1338 | 9.999991e-10 | 63 |
| 1.4507 | 0.1294 | 1.4704 | 0.1338 | 9.99999e-10 | 64 |
| 1.4446 | 0.1412 | 1.4699 | 0.1338 | 9.999989e-10 | 65 |
| 1.4387 | 0.1482 | 1.4694 | 0.1338 | 9.999988e-10 | 66 |
| 1.4491 | 0.1318 | 1.4690 | 0.1338 | 9.999986e-10 | 67 |
| 1.4354 | 0.1741 | 1.4685 | 0.1338 | 9.999985e-10 | 68 |
| 1.4393 | 0.1741 | 1.4680 | 0.1338 | 9.999984e-10 | 69 |
| 1.4443 | 0.1506 | 1.4675 | 0.1338 | 9.999983e-10 | 70 |
| 1.4441 | 0.1624 | 1.4670 | 0.1338 | 9.999982e-10 | 71 |
| 1.4411 | 0.1553 | 1.4665 | 0.1338 | 9.999981e-10 | 72 |
| 1.4438 | 0.1365 | 1.4660 | 0.1338 | 9.99998e-10 | 73 |
| 1.4314 | 0.1647 | 1.4656 | 0.1338 | 9.999979e-10 | 74 |
| 1.4394 | 0.1600 | 1.4651 | 0.1338 | 9.999978e-10 | 75 |
| 1.4469 | 0.1765 | 1.4647 | 0.1338 | 9.999976e-10 | 76 |
| 1.4408 | 0.1600 | 1.4642 | 0.1338 | 9.999975e-10 | 77 |
| 1.4388 | 0.1624 | 1.4638 | 0.1338 | 9.999974e-10 | 78 |
| 1.4391 | 0.1529 | 1.4633 | 0.1338 | 9.999973e-10 | 79 |
| 1.4367 | 0.1600 | 1.4629 | 0.1338 | 9.999972e-10 | 80 |
| 1.4407 | 0.1576 | 1.4624 | 0.1338 | 9.999971e-10 | 81 |
| 1.4388 | 0.1529 | 1.4620 | 0.1338 | 9.99997e-10 | 82 |
| 1.4483 | 0.1694 | 1.4615 | 0.1338 | 9.999969e-10 | 83 |
| 1.4385 | 0.1765 | 1.4610 | 0.1338 | 9.999968e-10 | 84 |
| 1.4331 | 0.1929 | 1.4606 | 0.1338 | 9.999966e-10 | 85 |
| 1.4328 | 0.1694 | 1.4602 | 0.1338 | 9.999965e-10 | 86 |
| 1.4365 | 0.1694 | 1.4597 | 0.1338 | 9.999964e-10 | 87 |
| 1.4374 | 0.1694 | 1.4592 | 0.1338 | 9.999963e-10 | 88 |
| 1.4330 | 0.1765 | 1.4588 | 0.1338 | 9.999962e-10 | 89 |
| 1.4370 | 0.1529 | 1.4584 | 0.1268 | 9.999961e-10 | 90 |
| 1.4311 | 0.1765 | 1.4579 | 0.1268 | 9.99996e-10 | 91 |
| 1.4330 | 0.1788 | 1.4574 | 0.1268 | 9.999959e-10 | 92 |
| 1.4363 | 0.1435 | 1.4570 | 0.1268 | 9.999958e-10 | 93 |
| 1.4248 | 0.1694 | 1.4566 | 0.1268 | 9.999956e-10 | 94 |
| 1.4353 | 0.1812 | 1.4561 | 0.1268 | 9.999955e-10 | 95 |
| 1.4279 | 0.1600 | 1.4556 | 0.1268 | 9.999954e-10 | 96 |
| 1.4337 | 0.1718 | 1.4552 | 0.1268 | 9.999953e-10 | 97 |
| 1.4282 | 0.1694 | 1.4548 | 0.1268 | 9.999952e-10 | 98 |
| 1.4342 | 0.1718 | 1.4543 | 0.1268 | 9.999951e-10 | 99 |
| 1.4213 | 0.1694 | 1.4539 | 0.1268 | 9.99995e-10 | 100 |
| 1.4358 | 0.1647 | 1.4535 | 0.1268 | 9.999949e-10 | 101 |
| 1.4306 | 0.1859 | 1.4530 | 0.1338 | 9.999948e-10 | 102 |
| 1.4330 | 0.1718 | 1.4525 | 0.1338 | 9.999946e-10 | 103 |
| 1.4319 | 0.1694 | 1.4521 | 0.1338 | 9.999945e-10 | 104 |
| 1.4280 | 0.1576 | 1.4516 | 0.1338 | 9.999944e-10 | 105 |
| 1.4240 | 0.1671 | 1.4512 | 0.1338 | 9.999943e-10 | 106 |
| 1.4359 | 0.1647 | 1.4507 | 0.1338 | 9.999942e-10 | 107 |
| 1.4296 | 0.1318 | 1.4502 | 0.1338 | 9.999941e-10 | 108 |
| 1.4308 | 0.1835 | 1.4498 | 0.1338 | 9.99994e-10 | 109 |
| 1.4242 | 0.1835 | 1.4493 | 0.1338 | 9.999939e-10 | 110 |
| 1.4257 | 0.1741 | 1.4489 | 0.1338 | 9.999938e-10 | 111 |
| 1.4235 | 0.1694 | 1.4485 | 0.1338 | 9.999936e-10 | 112 |
| 1.4269 | 0.1576 | 1.4481 | 0.1338 | 9.999935e-10 | 113 |
| 1.4188 | 0.1624 | 1.4476 | 0.1338 | 9.999934e-10 | 114 |
| 1.4221 | 0.1624 | 1.4471 | 0.1408 | 9.999933e-10 | 115 |
| 1.4269 | 0.1929 | 1.4467 | 0.1408 | 9.999932e-10 | 116 |
| 1.4274 | 0.1765 | 1.4463 | 0.1408 | 9.999931e-10 | 117 |
| 1.4262 | 0.1459 | 1.4458 | 0.1408 | 9.99993e-10 | 118 |
| 1.4208 | 0.1718 | 1.4453 | 0.1408 | 9.999929e-10 | 119 |
| 1.4237 | 0.1718 | 1.4448 | 0.1408 | 9.999928e-10 | 120 |
| 1.4242 | 0.1718 | 1.4444 | 0.1408 | 9.999926e-10 | 121 |
| 1.4321 | 0.1435 | 1.4439 | 0.1408 | 9.999925e-10 | 122 |
| 1.4208 | 0.1671 | 1.4435 | 0.1408 | 9.999924e-10 | 123 |
| 1.4127 | 0.1929 | 1.4430 | 0.1408 | 9.999923e-10 | 124 |
| 1.4281 | 0.1671 | 1.4425 | 0.1408 | 9.999922e-10 | 125 |
| 1.4135 | 0.1953 | 1.4421 | 0.1408 | 9.999921e-10 | 126 |
| 1.4214 | 0.1718 | 1.4417 | 0.1408 | 9.99992e-10 | 127 |
| 1.4190 | 0.1953 | 1.4412 | 0.1408 | 9.999919e-10 | 128 |
| 1.4187 | 0.1929 | 1.4408 | 0.1408 | 9.999918e-10 | 129 |
| 1.4159 | 0.1671 | 1.4404 | 0.1408 | 9.999916e-10 | 130 |
| 1.4168 | 0.1506 | 1.4399 | 0.1408 | 9.999915e-10 | 131 |
| 1.4185 | 0.1765 | 1.4395 | 0.1408 | 9.999914e-10 | 132 |
| 1.4145 | 0.1765 | 1.4390 | 0.1408 | 9.999913e-10 | 133 |
| 1.4168 | 0.1882 | 1.4385 | 0.1408 | 9.999912e-10 | 134 |
| 1.4245 | 0.1812 | 1.4381 | 0.1408 | 9.999911e-10 | 135 |
| 1.4101 | 0.1671 | 1.4377 | 0.1408 | 9.99991e-10 | 136 |
| 1.4140 | 0.1835 | 1.4372 | 0.1479 | 9.999909e-10 | 137 |
| 1.4131 | 0.2024 | 1.4368 | 0.1479 | 9.999908e-10 | 138 |
| 1.4200 | 0.1694 | 1.4363 | 0.1479 | 9.999906e-10 | 139 |
| 1.4104 | 0.1765 | 1.4359 | 0.1479 | 9.999905e-10 | 140 |
| 1.4260 | 0.1788 | 1.4354 | 0.1479 | 9.999904e-10 | 141 |
| 1.4185 | 0.1859 | 1.4350 | 0.1479 | 9.999903e-10 | 142 |
| 1.4098 | 0.1929 | 1.4346 | 0.1479 | 9.999902e-10 | 143 |
| 1.4109 | 0.1812 | 1.4342 | 0.1479 | 9.999901e-10 | 144 |
| 1.4054 | 0.2118 | 1.4337 | 0.1479 | 9.9999e-10 | 145 |
| 1.4072 | 0.2000 | 1.4333 | 0.1479 | 9.999899e-10 | 146 |
| 1.4111 | 0.1906 | 1.4329 | 0.1479 | 9.999898e-10 | 147 |
| 1.4174 | 0.1718 | 1.4324 | 0.1479 | 9.999896e-10 | 148 |
| 1.4068 | 0.1671 | 1.4320 | 0.1479 | 9.999895e-10 | 149 |
| 1.4069 | 0.1694 | 1.4316 | 0.1479 | 9.999894e-10 | 150 |
| 1.4043 | 0.2047 | 1.4311 | 0.1479 | 9.999893e-10 | 151 |
| 1.4046 | 0.1929 | 1.4307 | 0.1479 | 9.999892e-10 | 152 |
| 1.4066 | 0.1953 | 1.4302 | 0.1479 | 9.999891e-10 | 153 |
| 1.4031 | 0.2000 | 1.4298 | 0.1479 | 9.99989e-10 | 154 |
| 1.4112 | 0.1788 | 1.4294 | 0.1479 | 9.999889e-10 | 155 |
| 1.4012 | 0.2118 | 1.4290 | 0.1479 | 9.999888e-10 | 156 |
| 1.4140 | 0.1812 | 1.4285 | 0.1479 | 9.999886e-10 | 157 |
| 1.4062 | 0.1741 | 1.4281 | 0.1479 | 9.999885e-10 | 158 |
| 1.4049 | 0.1929 | 1.4276 | 0.1479 | 9.999884e-10 | 159 |
| 1.4082 | 0.2047 | 1.4272 | 0.1479 | 9.999883e-10 | 160 |
| 1.4085 | 0.1882 | 1.4268 | 0.1479 | 9.999882e-10 | 161 |
| 1.4095 | 0.1835 | 1.4264 | 0.1479 | 9.999881e-10 | 162 |
| 1.4040 | 0.2047 | 1.4259 | 0.1479 | 9.99988e-10 | 163 |
| 1.4080 | 0.2071 | 1.4255 | 0.1479 | 9.999879e-10 | 164 |
| 1.3990 | 0.2047 | 1.4251 | 0.1479 | 9.999878e-10 | 165 |
| 1.4095 | 0.2094 | 1.4247 | 0.1479 | 9.999876e-10 | 166 |
| 1.4054 | 0.1906 | 1.4242 | 0.1479 | 9.999874e-10 | 167 |
| 1.4014 | 0.2188 | 1.4238 | 0.1479 | 9.999872e-10 | 168 |
| 1.3944 | 0.2259 | 1.4234 | 0.1479 | 9.99987e-10 | 169 |
| 1.3990 | 0.2047 | 1.4230 | 0.1479 | 9.999868e-10 | 170 |
| 1.4027 | 0.2094 | 1.4226 | 0.1479 | 9.999865e-10 | 171 |
| 1.4030 | 0.2024 | 1.4222 | 0.1479 | 9.999863e-10 | 172 |
| 1.4038 | 0.1929 | 1.4218 | 0.1479 | 9.999861e-10 | 173 |
| 1.4008 | 0.1859 | 1.4213 | 0.1479 | 9.999859e-10 | 174 |
| 1.4051 | 0.2141 | 1.4209 | 0.1479 | 9.999856e-10 | 175 |
| 1.3957 | 0.2024 | 1.4204 | 0.1479 | 9.999854e-10 | 176 |
| 1.4036 | 0.1788 | 1.4200 | 0.1479 | 9.999852e-10 | 177 |
| 1.3998 | 0.1953 | 1.4196 | 0.1479 | 9.99985e-10 | 178 |
| 1.3987 | 0.2047 | 1.4192 | 0.1479 | 9.999848e-10 | 179 |
| 1.4036 | 0.2000 | 1.4187 | 0.1479 | 9.999845e-10 | 180 |
| 1.4005 | 0.2047 | 1.4183 | 0.1479 | 9.999843e-10 | 181 |
| 1.4007 | 0.2118 | 1.4179 | 0.1479 | 9.999841e-10 | 182 |
| 1.3974 | 0.1882 | 1.4174 | 0.1479 | 9.999839e-10 | 183 |
| 1.3847 | 0.2118 | 1.4170 | 0.1479 | 9.999837e-10 | 184 |
| 1.3995 | 0.2094 | 1.4166 | 0.1479 | 9.999834e-10 | 185 |
| 1.3922 | 0.1835 | 1.4163 | 0.1549 | 9.999832e-10 | 186 |
| 1.4009 | 0.2071 | 1.4158 | 0.1549 | 9.99983e-10 | 187 |
| 1.3924 | 0.2188 | 1.4154 | 0.1549 | 9.999828e-10 | 188 |
| 1.3915 | 0.2259 | 1.4150 | 0.1549 | 9.999825e-10 | 189 |
| 1.3922 | 0.2353 | 1.4146 | 0.1549 | 9.999823e-10 | 190 |
| 1.3913 | 0.2424 | 1.4142 | 0.1549 | 9.999821e-10 | 191 |
| 1.3933 | 0.2188 | 1.4137 | 0.1549 | 9.999819e-10 | 192 |
| 1.3874 | 0.2400 | 1.4133 | 0.1549 | 9.999817e-10 | 193 |
| 1.3961 | 0.2071 | 1.4129 | 0.1549 | 9.999814e-10 | 194 |
| 1.4043 | 0.2000 | 1.4125 | 0.1549 | 9.999812e-10 | 195 |
| 1.3918 | 0.2071 | 1.4121 | 0.1620 | 9.99981e-10 | 196 |
| 1.3959 | 0.2094 | 1.4117 | 0.1620 | 9.999808e-10 | 197 |
| 1.3930 | 0.1812 | 1.4113 | 0.1620 | 9.999805e-10 | 198 |
| 1.3954 | 0.2071 | 1.4109 | 0.1620 | 9.999803e-10 | 199 |
| 1.3853 | 0.2259 | 1.4105 | 0.1620 | 9.999801e-10 | 200 |
| 1.3934 | 0.2212 | 1.4100 | 0.1620 | 9.999799e-10 | 201 |
| 1.3876 | 0.2212 | 1.4095 | 0.1620 | 9.999797e-10 | 202 |
| 1.3894 | 0.2235 | 1.4091 | 0.1620 | 9.999794e-10 | 203 |
| 1.3860 | 0.2447 | 1.4087 | 0.1690 | 9.999792e-10 | 204 |
| 1.3892 | 0.2000 | 1.4083 | 0.1690 | 9.99979e-10 | 205 |
| 1.3870 | 0.2259 | 1.4078 | 0.1761 | 9.999788e-10 | 206 |
| 1.3941 | 0.2094 | 1.4074 | 0.1761 | 9.999785e-10 | 207 |
| 1.3908 | 0.1953 | 1.4070 | 0.1761 | 9.999783e-10 | 208 |
| 1.3886 | 0.2306 | 1.4066 | 0.1761 | 9.999781e-10 | 209 |
| 1.3888 | 0.2376 | 1.4062 | 0.1761 | 9.999779e-10 | 210 |
| 1.3806 | 0.2329 | 1.4058 | 0.1761 | 9.999777e-10 | 211 |
| 1.3893 | 0.2424 | 1.4054 | 0.1761 | 9.999774e-10 | 212 |
| 1.3775 | 0.2282 | 1.4050 | 0.1761 | 9.999772e-10 | 213 |
| 1.3867 | 0.2047 | 1.4046 | 0.1761 | 9.99977e-10 | 214 |
| 1.3871 | 0.2353 | 1.4041 | 0.1761 | 9.999768e-10 | 215 |
| 1.3678 | 0.2612 | 1.4037 | 0.1761 | 9.999765e-10 | 216 |
| 1.3773 | 0.2376 | 1.4034 | 0.1761 | 9.999763e-10 | 217 |
| 1.3906 | 0.2141 | 1.4030 | 0.1761 | 9.999761e-10 | 218 |
| 1.3838 | 0.2235 | 1.4026 | 0.1761 | 9.999759e-10 | 219 |
| 1.3835 | 0.2612 | 1.4022 | 0.1761 | 9.999757e-10 | 220 |
| 1.3824 | 0.2329 | 1.4017 | 0.1761 | 9.999754e-10 | 221 |
| 1.3830 | 0.2376 | 1.4013 | 0.1761 | 9.999752e-10 | 222 |
| 1.3848 | 0.2235 | 1.4009 | 0.1831 | 9.99975e-10 | 223 |
| 1.3772 | 0.2565 | 1.4004 | 0.1831 | 9.999748e-10 | 224 |
| 1.3764 | 0.2447 | 1.4001 | 0.1831 | 9.999745e-10 | 225 |
| 1.3779 | 0.2541 | 1.3997 | 0.1831 | 9.999743e-10 | 226 |
| 1.3781 | 0.2588 | 1.3993 | 0.1831 | 9.999741e-10 | 227 |
| 1.3838 | 0.2047 | 1.3989 | 0.1831 | 9.999739e-10 | 228 |
| 1.3807 | 0.2259 | 1.3985 | 0.1831 | 9.999737e-10 | 229 |
| 1.3745 | 0.2635 | 1.3982 | 0.1831 | 9.999734e-10 | 230 |
| 1.3776 | 0.2447 | 1.3977 | 0.1831 | 9.999732e-10 | 231 |
| 1.3787 | 0.2282 | 1.3973 | 0.1831 | 9.99973e-10 | 232 |
| 1.3747 | 0.2706 | 1.3969 | 0.1831 | 9.999728e-10 | 233 |
| 1.3771 | 0.2447 | 1.3965 | 0.1901 | 9.999725e-10 | 234 |
| 1.3783 | 0.2259 | 1.3961 | 0.1901 | 9.999723e-10 | 235 |
| 1.3763 | 0.2141 | 1.3957 | 0.1901 | 9.999721e-10 | 236 |
| 1.3687 | 0.2565 | 1.3953 | 0.1901 | 9.999719e-10 | 237 |
| 1.3681 | 0.2565 | 1.3949 | 0.1901 | 9.999717e-10 | 238 |
| 1.3785 | 0.2400 | 1.3945 | 0.1901 | 9.999714e-10 | 239 |
| 1.3807 | 0.2259 | 1.3941 | 0.1972 | 9.999712e-10 | 240 |
| 1.3709 | 0.2353 | 1.3937 | 0.1972 | 9.99971e-10 | 241 |
| 1.3736 | 0.2753 | 1.3933 | 0.1972 | 9.999708e-10 | 242 |
| 1.3735 | 0.2376 | 1.3929 | 0.1972 | 9.999706e-10 | 243 |
| 1.3797 | 0.2235 | 1.3925 | 0.1972 | 9.999703e-10 | 244 |
| 1.3814 | 0.2541 | 1.3921 | 0.2042 | 9.999701e-10 | 245 |
| 1.3672 | 0.2565 | 1.3917 | 0.2042 | 9.999699e-10 | 246 |
| 1.3702 | 0.2518 | 1.3912 | 0.2042 | 9.999697e-10 | 247 |
| 1.3696 | 0.2682 | 1.3908 | 0.2042 | 9.999694e-10 | 248 |
| 1.3727 | 0.2424 | 1.3904 | 0.2042 | 9.999692e-10 | 249 |
| 1.3712 | 0.2635 | 1.3900 | 0.2042 | 9.99969e-10 | 250 |
| 1.3755 | 0.2235 | 1.3896 | 0.2042 | 9.999688e-10 | 251 |
| 1.3626 | 0.2612 | 1.3892 | 0.2042 | 9.999686e-10 | 252 |
| 1.3751 | 0.2376 | 1.3889 | 0.2042 | 9.999683e-10 | 253 |
| 1.3742 | 0.2353 | 1.3885 | 0.2042 | 9.999681e-10 | 254 |
| 1.3749 | 0.2329 | 1.3881 | 0.2042 | 9.999679e-10 | 255 |
| 1.3686 | 0.2541 | 1.3878 | 0.2042 | 9.999677e-10 | 256 |
| 1.3761 | 0.2353 | 1.3873 | 0.2042 | 9.999674e-10 | 257 |
| 1.3742 | 0.2565 | 1.3869 | 0.2042 | 9.999672e-10 | 258 |
| 1.3720 | 0.2682 | 1.3864 | 0.2042 | 9.99967e-10 | 259 |
| 1.3676 | 0.2471 | 1.3860 | 0.2042 | 9.999668e-10 | 260 |
| 1.3710 | 0.2541 | 1.3856 | 0.2042 | 9.999666e-10 | 261 |
| 1.3640 | 0.2918 | 1.3852 | 0.2042 | 9.999663e-10 | 262 |
| 1.3611 | 0.2588 | 1.3848 | 0.2042 | 9.999661e-10 | 263 |
| 1.3686 | 0.2635 | 1.3844 | 0.2042 | 9.999659e-10 | 264 |
| 1.3653 | 0.2776 | 1.3840 | 0.2042 | 9.999657e-10 | 265 |
| 1.3623 | 0.2729 | 1.3836 | 0.2042 | 9.999654e-10 | 266 |
| 1.3690 | 0.2518 | 1.3832 | 0.2042 | 9.999652e-10 | 267 |
| 1.3642 | 0.2635 | 1.3828 | 0.2042 | 9.99965e-10 | 268 |
| 1.3676 | 0.2518 | 1.3823 | 0.2042 | 9.999648e-10 | 269 |
| 1.3697 | 0.2612 | 1.3820 | 0.2042 | 9.999646e-10 | 270 |
| 1.3579 | 0.2894 | 1.3816 | 0.2042 | 9.999643e-10 | 271 |
| 1.3626 | 0.2588 | 1.3812 | 0.2042 | 9.999641e-10 | 272 |
| 1.3602 | 0.2753 | 1.3807 | 0.2042 | 9.999639e-10 | 273 |
| 1.3667 | 0.2612 | 1.3803 | 0.2042 | 9.999637e-10 | 274 |
| 1.3669 | 0.2847 | 1.3800 | 0.2042 | 9.999634e-10 | 275 |
| 1.3602 | 0.2988 | 1.3796 | 0.2042 | 9.999632e-10 | 276 |
| 1.3618 | 0.2941 | 1.3792 | 0.2042 | 9.99963e-10 | 277 |
| 1.3531 | 0.3129 | 1.3788 | 0.2183 | 9.999627e-10 | 278 |
| 1.3597 | 0.2894 | 1.3785 | 0.2183 | 9.999623e-10 | 279 |
| 1.3636 | 0.2729 | 1.3781 | 0.2183 | 9.99962e-10 | 280 |
| 1.3619 | 0.2706 | 1.3777 | 0.2183 | 9.999617e-10 | 281 |
| 1.3573 | 0.3059 | 1.3772 | 0.2183 | 9.999613e-10 | 282 |
| 1.3587 | 0.2635 | 1.3768 | 0.2183 | 9.99961e-10 | 283 |
| 1.3569 | 0.2776 | 1.3764 | 0.2183 | 9.999607e-10 | 284 |
| 1.3521 | 0.3200 | 1.3761 | 0.2183 | 9.999603e-10 | 285 |
| 1.3603 | 0.3176 | 1.3757 | 0.2183 | 9.9996e-10 | 286 |
| 1.3575 | 0.2894 | 1.3753 | 0.2183 | 9.999597e-10 | 287 |
| 1.3626 | 0.2565 | 1.3749 | 0.2183 | 9.999593e-10 | 288 |
| 1.3613 | 0.2565 | 1.3746 | 0.2183 | 9.99959e-10 | 289 |
| 1.3615 | 0.2706 | 1.3742 | 0.2183 | 9.999587e-10 | 290 |
| 1.3554 | 0.2706 | 1.3739 | 0.2183 | 9.999583e-10 | 291 |
| 1.3559 | 0.2988 | 1.3735 | 0.2183 | 9.99958e-10 | 292 |
| 1.3588 | 0.2682 | 1.3731 | 0.2254 | 9.999577e-10 | 293 |
| 1.3506 | 0.2824 | 1.3727 | 0.2254 | 9.999573e-10 | 294 |
| 1.3588 | 0.2706 | 1.3723 | 0.2324 | 9.99957e-10 | 295 |
| 1.3486 | 0.2824 | 1.3720 | 0.2254 | 9.999567e-10 | 296 |
| 1.3553 | 0.3012 | 1.3716 | 0.2254 | 9.999563e-10 | 297 |
| 1.3605 | 0.2447 | 1.3712 | 0.2254 | 9.99956e-10 | 298 |
| 1.3502 | 0.3176 | 1.3709 | 0.2254 | 9.999557e-10 | 299 |
| 1.3522 | 0.3012 | 1.3705 | 0.2254 | 9.999553e-10 | 300 |
| 1.3544 | 0.2824 | 1.3701 | 0.2183 | 9.99955e-10 | 301 |
| 1.3577 | 0.2494 | 1.3697 | 0.2183 | 9.999547e-10 | 302 |
| 1.3470 | 0.2918 | 1.3693 | 0.2183 | 9.999543e-10 | 303 |
| 1.3623 | 0.2871 | 1.3689 | 0.2183 | 9.99954e-10 | 304 |
| 1.3532 | 0.2776 | 1.3685 | 0.2183 | 9.999537e-10 | 305 |
| 1.3551 | 0.2753 | 1.3681 | 0.2183 | 9.999533e-10 | 306 |
| 1.3566 | 0.2659 | 1.3677 | 0.2183 | 9.99953e-10 | 307 |
| 1.3517 | 0.2965 | 1.3673 | 0.2113 | 9.999527e-10 | 308 |
| 1.3574 | 0.2988 | 1.3669 | 0.2113 | 9.999523e-10 | 309 |
| 1.3467 | 0.3200 | 1.3666 | 0.2113 | 9.99952e-10 | 310 |
| 1.3510 | 0.3082 | 1.3662 | 0.2113 | 9.999517e-10 | 311 |
| 1.3448 | 0.3129 | 1.3658 | 0.2113 | 9.999513e-10 | 312 |
| 1.3512 | 0.2800 | 1.3654 | 0.2113 | 9.99951e-10 | 313 |
| 1.3486 | 0.3082 | 1.3650 | 0.2113 | 9.999507e-10 | 314 |
| 1.3441 | 0.3106 | 1.3647 | 0.2113 | 9.999503e-10 | 315 |
| 1.3474 | 0.3176 | 1.3643 | 0.2113 | 9.9995e-10 | 316 |
| 1.3496 | 0.2965 | 1.3639 | 0.2113 | 9.999497e-10 | 317 |
| 1.3436 | 0.3200 | 1.3635 | 0.2183 | 9.999493e-10 | 318 |
| 1.3398 | 0.3318 | 1.3631 | 0.2183 | 9.99949e-10 | 319 |
| 1.3440 | 0.3318 | 1.3627 | 0.2183 | 9.999487e-10 | 320 |
| 1.3402 | 0.3294 | 1.3624 | 0.2254 | 9.999483e-10 | 321 |
| 1.3463 | 0.3247 | 1.3620 | 0.2254 | 9.99948e-10 | 322 |
| 1.3458 | 0.3012 | 1.3616 | 0.2254 | 9.999477e-10 | 323 |
| 1.3492 | 0.3153 | 1.3612 | 0.2254 | 9.999473e-10 | 324 |
| 1.3496 | 0.2941 | 1.3609 | 0.2324 | 9.99947e-10 | 325 |
| 1.3505 | 0.2776 | 1.3605 | 0.2394 | 9.999467e-10 | 326 |
| 1.3314 | 0.3200 | 1.3601 | 0.2394 | 9.999463e-10 | 327 |
| 1.3509 | 0.3082 | 1.3597 | 0.2394 | 9.99946e-10 | 328 |
| 1.3441 | 0.3318 | 1.3593 | 0.2465 | 9.999457e-10 | 329 |
| 1.3360 | 0.3365 | 1.3589 | 0.2535 | 9.999453e-10 | 330 |
| 1.3424 | 0.3271 | 1.3586 | 0.2606 | 9.99945e-10 | 331 |
| 1.3513 | 0.2824 | 1.3582 | 0.2606 | 9.999447e-10 | 332 |
| 1.3505 | 0.3106 | 1.3578 | 0.2606 | 9.999443e-10 | 333 |
| 1.3332 | 0.3176 | 1.3575 | 0.2606 | 9.99944e-10 | 334 |
| 1.3374 | 0.3341 | 1.3571 | 0.2606 | 9.999437e-10 | 335 |
| 1.3425 | 0.3106 | 1.3567 | 0.2606 | 9.999434e-10 | 336 |
| 1.3480 | 0.2988 | 1.3563 | 0.2606 | 9.99943e-10 | 337 |
| 1.3396 | 0.2894 | 1.3560 | 0.2606 | 9.999427e-10 | 338 |
| 1.3431 | 0.3271 | 1.3556 | 0.2676 | 9.999424e-10 | 339 |
| 1.3378 | 0.3271 | 1.3552 | 0.2676 | 9.99942e-10 | 340 |
| 1.3409 | 0.3318 | 1.3548 | 0.2676 | 9.999417e-10 | 341 |
| 1.3401 | 0.3506 | 1.3544 | 0.2676 | 9.999414e-10 | 342 |
| 1.3394 | 0.3153 | 1.3541 | 0.2746 | 9.99941e-10 | 343 |
| 1.3350 | 0.3412 | 1.3537 | 0.2746 | 9.999407e-10 | 344 |
| 1.3464 | 0.3200 | 1.3533 | 0.2817 | 9.999404e-10 | 345 |
| 1.3349 | 0.3412 | 1.3530 | 0.2817 | 9.9994e-10 | 346 |
| 1.3362 | 0.3318 | 1.3527 | 0.2817 | 9.999397e-10 | 347 |
| 1.3454 | 0.3153 | 1.3523 | 0.2817 | 9.999394e-10 | 348 |
| 1.3336 | 0.3459 | 1.3519 | 0.2817 | 9.99939e-10 | 349 |
| 1.3333 | 0.3812 | 1.3516 | 0.2817 | 9.999387e-10 | 350 |
| 1.3349 | 0.3459 | 1.3512 | 0.2817 | 9.999384e-10 | 351 |
| 1.3363 | 0.3388 | 1.3509 | 0.2817 | 9.99938e-10 | 352 |
| 1.3243 | 0.3553 | 1.3505 | 0.2887 | 9.999377e-10 | 353 |
| 1.3317 | 0.3529 | 1.3502 | 0.2817 | 9.999374e-10 | 354 |
| 1.3294 | 0.3388 | 1.3498 | 0.2887 | 9.99937e-10 | 355 |
| 1.3385 | 0.3459 | 1.3494 | 0.2887 | 9.999367e-10 | 356 |
| 1.3293 | 0.3624 | 1.3491 | 0.2887 | 9.999364e-10 | 357 |
| 1.3285 | 0.3694 | 1.3487 | 0.2887 | 9.99936e-10 | 358 |
| 1.3377 | 0.3271 | 1.3483 | 0.2887 | 9.999357e-10 | 359 |
| 1.3367 | 0.3271 | 1.3479 | 0.2887 | 9.999354e-10 | 360 |
| 1.3332 | 0.3341 | 1.3476 | 0.2887 | 9.99935e-10 | 361 |
| 1.3377 | 0.3600 | 1.3473 | 0.2887 | 9.999347e-10 | 362 |
| 1.3222 | 0.3953 | 1.3469 | 0.2887 | 9.999344e-10 | 363 |
| 1.3268 | 0.3553 | 1.3465 | 0.2887 | 9.99934e-10 | 364 |
| 1.3315 | 0.3412 | 1.3461 | 0.2887 | 9.999337e-10 | 365 |
| 1.3318 | 0.3365 | 1.3458 | 0.2887 | 9.999334e-10 | 366 |
| 1.3273 | 0.3671 | 1.3454 | 0.3028 | 9.99933e-10 | 367 |
| 1.3294 | 0.3576 | 1.3450 | 0.3028 | 9.999327e-10 | 368 |
| 1.3291 | 0.3694 | 1.3446 | 0.3028 | 9.999324e-10 | 369 |
| 1.3198 | 0.3600 | 1.3443 | 0.3028 | 9.99932e-10 | 370 |
| 1.3227 | 0.3741 | 1.3440 | 0.3028 | 9.999317e-10 | 371 |
| 1.3275 | 0.3553 | 1.3436 | 0.3028 | 9.999314e-10 | 372 |
| 1.3285 | 0.3388 | 1.3432 | 0.3028 | 9.99931e-10 | 373 |
| 1.3314 | 0.3671 | 1.3428 | 0.3028 | 9.999307e-10 | 374 |
| 1.3250 | 0.3812 | 1.3425 | 0.3028 | 9.999304e-10 | 375 |
| 1.3255 | 0.3553 | 1.3422 | 0.2958 | 9.9993e-10 | 376 |
| 1.3269 | 0.3906 | 1.3419 | 0.2958 | 9.999297e-10 | 377 |
| 1.3257 | 0.3694 | 1.3415 | 0.2958 | 9.999294e-10 | 378 |
| 1.3235 | 0.3624 | 1.3412 | 0.2958 | 9.99929e-10 | 379 |
| 1.3304 | 0.3224 | 1.3408 | 0.3028 | 9.999287e-10 | 380 |
| 1.3203 | 0.3694 | 1.3404 | 0.3028 | 9.999284e-10 | 381 |
| 1.3223 | 0.3694 | 1.3400 | 0.3169 | 9.99928e-10 | 382 |
| 1.3217 | 0.3953 | 1.3397 | 0.3169 | 9.999277e-10 | 383 |
| 1.3163 | 0.3882 | 1.3393 | 0.3169 | 9.999274e-10 | 384 |
| 1.3261 | 0.3718 | 1.3390 | 0.3169 | 9.99927e-10 | 385 |
| 1.3308 | 0.3624 | 1.3386 | 0.3169 | 9.999267e-10 | 386 |
| 1.3263 | 0.3482 | 1.3382 | 0.3239 | 9.999264e-10 | 387 |
| 1.3218 | 0.4094 | 1.3378 | 0.3239 | 9.99926e-10 | 388 |
| 1.3217 | 0.3788 | 1.3375 | 0.3239 | 9.999256e-10 | 389 |
| 1.3270 | 0.3482 | 1.3370 | 0.3239 | 9.999251e-10 | 390 |
| 1.3237 | 0.3600 | 1.3367 | 0.3239 | 9.999247e-10 | 391 |
| 1.3207 | 0.3741 | 1.3363 | 0.3239 | 9.999243e-10 | 392 |
| 1.3203 | 0.3835 | 1.3359 | 0.3239 | 9.999238e-10 | 393 |
| 1.3177 | 0.3671 | 1.3356 | 0.3169 | 9.999234e-10 | 394 |
| 1.3187 | 0.4000 | 1.3353 | 0.3169 | 9.999229e-10 | 395 |
| 1.3227 | 0.3529 | 1.3349 | 0.3169 | 9.999225e-10 | 396 |
| 1.3195 | 0.3624 | 1.3345 | 0.3239 | 9.99922e-10 | 397 |
| 1.3217 | 0.4141 | 1.3342 | 0.3239 | 9.999216e-10 | 398 |
| 1.3205 | 0.3906 | 1.3338 | 0.3239 | 9.999211e-10 | 399 |
| 1.3192 | 0.3812 | 1.3334 | 0.3239 | 9.999207e-10 | 400 |
| 1.3194 | 0.3812 | 1.3330 | 0.3239 | 9.999203e-10 | 401 |
| 1.3175 | 0.3741 | 1.3326 | 0.3239 | 9.999198e-10 | 402 |
| 1.3118 | 0.4306 | 1.3323 | 0.3239 | 9.999194e-10 | 403 |
| 1.3226 | 0.3788 | 1.3319 | 0.3239 | 9.999189e-10 | 404 |
| 1.3186 | 0.4047 | 1.3315 | 0.3239 | 9.999185e-10 | 405 |
| 1.3201 | 0.3671 | 1.3312 | 0.3239 | 9.99918e-10 | 406 |
| 1.3193 | 0.4000 | 1.3308 | 0.3310 | 9.999176e-10 | 407 |
| 1.3247 | 0.3718 | 1.3304 | 0.3310 | 9.999171e-10 | 408 |
| 1.3146 | 0.3906 | 1.3301 | 0.3310 | 9.999167e-10 | 409 |
| 1.3139 | 0.3812 | 1.3298 | 0.3380 | 9.999163e-10 | 410 |
| 1.3172 | 0.4165 | 1.3294 | 0.3451 | 9.999158e-10 | 411 |
| 1.3146 | 0.4071 | 1.3291 | 0.3451 | 9.999154e-10 | 412 |
| 1.3148 | 0.3859 | 1.3287 | 0.3451 | 9.999149e-10 | 413 |
| 1.3177 | 0.4024 | 1.3284 | 0.3521 | 9.999145e-10 | 414 |
| 1.3096 | 0.4329 | 1.3280 | 0.3662 | 9.99914e-10 | 415 |
| 1.3126 | 0.3929 | 1.3276 | 0.3662 | 9.999136e-10 | 416 |
| 1.3147 | 0.4235 | 1.3273 | 0.3662 | 9.999132e-10 | 417 |
| 1.3149 | 0.3600 | 1.3269 | 0.3732 | 9.999127e-10 | 418 |
| 1.3122 | 0.4259 | 1.3265 | 0.3732 | 9.999123e-10 | 419 |
| 1.3140 | 0.3929 | 1.3262 | 0.3732 | 9.999118e-10 | 420 |
| 1.3111 | 0.3835 | 1.3258 | 0.3873 | 9.999114e-10 | 421 |
| 1.3131 | 0.4094 | 1.3255 | 0.3944 | 9.999109e-10 | 422 |
| 1.3118 | 0.3859 | 1.3251 | 0.3944 | 9.999105e-10 | 423 |
| 1.3146 | 0.3671 | 1.3248 | 0.4014 | 9.9991e-10 | 424 |
| 1.3078 | 0.4188 | 1.3244 | 0.4085 | 9.999096e-10 | 425 |
| 1.3087 | 0.4188 | 1.3241 | 0.4085 | 9.999092e-10 | 426 |
| 1.3125 | 0.4188 | 1.3237 | 0.4155 | 9.999087e-10 | 427 |
| 1.3071 | 0.4024 | 1.3234 | 0.4225 | 9.999083e-10 | 428 |
| 1.3131 | 0.3929 | 1.3230 | 0.4296 | 9.999078e-10 | 429 |
| 1.3077 | 0.4424 | 1.3227 | 0.4296 | 9.999074e-10 | 430 |
| 1.3127 | 0.4024 | 1.3223 | 0.4296 | 9.999069e-10 | 431 |
| 1.3047 | 0.4518 | 1.3220 | 0.4296 | 9.999065e-10 | 432 |
| 1.2997 | 0.4329 | 1.3216 | 0.4296 | 9.99906e-10 | 433 |
| 1.3050 | 0.4329 | 1.3213 | 0.4296 | 9.999056e-10 | 434 |
| 1.3077 | 0.4329 | 1.3210 | 0.4296 | 9.999052e-10 | 435 |
| 1.3064 | 0.4329 | 1.3206 | 0.4296 | 9.999047e-10 | 436 |
| 1.3038 | 0.4424 | 1.3202 | 0.4296 | 9.999043e-10 | 437 |
| 1.3140 | 0.3976 | 1.3199 | 0.4366 | 9.999038e-10 | 438 |
| 1.3025 | 0.4235 | 1.3195 | 0.4366 | 9.999034e-10 | 439 |
| 1.3021 | 0.4282 | 1.3192 | 0.4296 | 9.999029e-10 | 440 |
| 1.3029 | 0.4235 | 1.3188 | 0.4366 | 9.999025e-10 | 441 |
| 1.2991 | 0.4682 | 1.3185 | 0.4366 | 9.99902e-10 | 442 |
| 1.3099 | 0.4165 | 1.3181 | 0.4366 | 9.999016e-10 | 443 |
| 1.3051 | 0.4376 | 1.3178 | 0.4366 | 9.999012e-10 | 444 |
| 1.2937 | 0.4353 | 1.3174 | 0.4437 | 9.999007e-10 | 445 |
| 1.3004 | 0.4235 | 1.3171 | 0.4507 | 9.999003e-10 | 446 |
| 1.2956 | 0.4682 | 1.3167 | 0.4507 | 9.998998e-10 | 447 |
| 1.3079 | 0.4329 | 1.3164 | 0.4577 | 9.998994e-10 | 448 |
| 1.3026 | 0.4376 | 1.3160 | 0.4577 | 9.998989e-10 | 449 |
| 1.3009 | 0.4400 | 1.3156 | 0.4648 | 9.998985e-10 | 450 |
| 1.3018 | 0.4353 | 1.3153 | 0.4648 | 9.99898e-10 | 451 |
| 1.3011 | 0.4329 | 1.3149 | 0.4648 | 9.998976e-10 | 452 |
| 1.3014 | 0.4259 | 1.3146 | 0.4648 | 9.998972e-10 | 453 |
| 1.3028 | 0.4659 | 1.3142 | 0.4648 | 9.998967e-10 | 454 |
| 1.2986 | 0.4329 | 1.3140 | 0.4648 | 9.998963e-10 | 455 |
| 1.2987 | 0.4376 | 1.3136 | 0.4718 | 9.998958e-10 | 456 |
| 1.3080 | 0.4188 | 1.3132 | 0.4718 | 9.998954e-10 | 457 |
| 1.2989 | 0.4282 | 1.3129 | 0.4718 | 9.99895e-10 | 458 |
| 1.3003 | 0.4447 | 1.3125 | 0.4718 | 9.998945e-10 | 459 |
| 1.2984 | 0.4494 | 1.3122 | 0.4718 | 9.998941e-10 | 460 |
| 1.2991 | 0.4306 | 1.3118 | 0.4859 | 9.998936e-10 | 461 |
| 1.3014 | 0.4588 | 1.3115 | 0.4930 | 9.998932e-10 | 462 |
| 1.3041 | 0.4118 | 1.3112 | 0.4930 | 9.998927e-10 | 463 |
| 1.3031 | 0.4306 | 1.3109 | 0.4930 | 9.998923e-10 | 464 |
| 1.2979 | 0.4329 | 1.3105 | 0.4930 | 9.998918e-10 | 465 |
| 1.3049 | 0.4424 | 1.3102 | 0.4930 | 9.998914e-10 | 466 |
| 1.3003 | 0.4541 | 1.3098 | 0.4930 | 9.99891e-10 | 467 |
| 1.2883 | 0.4518 | 1.3095 | 0.4930 | 9.998905e-10 | 468 |
| 1.2887 | 0.5012 | 1.3091 | 0.5000 | 9.998901e-10 | 469 |
| 1.3032 | 0.4541 | 1.3088 | 0.5000 | 9.998896e-10 | 470 |
| 1.2940 | 0.4518 | 1.3084 | 0.5000 | 9.998892e-10 | 471 |
| 1.2887 | 0.4894 | 1.3081 | 0.5000 | 9.998887e-10 | 472 |
| 1.2878 | 0.4753 | 1.3078 | 0.5000 | 9.998883e-10 | 473 |
| 1.2885 | 0.4941 | 1.3074 | 0.5000 | 9.998878e-10 | 474 |
| 1.2936 | 0.4612 | 1.3071 | 0.5000 | 9.998874e-10 | 475 |
| 1.2915 | 0.4659 | 1.3067 | 0.5000 | 9.99887e-10 | 476 |
| 1.2886 | 0.4518 | 1.3064 | 0.5000 | 9.998865e-10 | 477 |
| 1.2975 | 0.4376 | 1.3061 | 0.5000 | 9.998861e-10 | 478 |
| 1.2930 | 0.4635 | 1.3057 | 0.4930 | 9.998856e-10 | 479 |
| 1.2910 | 0.4894 | 1.3054 | 0.4930 | 9.998852e-10 | 480 |
| 1.2891 | 0.4682 | 1.3050 | 0.5000 | 9.998847e-10 | 481 |
| 1.2900 | 0.4965 | 1.3047 | 0.5000 | 9.998843e-10 | 482 |
| 1.2902 | 0.4682 | 1.3044 | 0.5000 | 9.998838e-10 | 483 |
| 1.2912 | 0.4965 | 1.3041 | 0.5000 | 9.998834e-10 | 484 |
| 1.2926 | 0.4541 | 1.3037 | 0.5000 | 9.99883e-10 | 485 |
| 1.2893 | 0.4706 | 1.3034 | 0.5070 | 9.998825e-10 | 486 |
| 1.2823 | 0.4965 | 1.3030 | 0.5070 | 9.998821e-10 | 487 |
| 1.2865 | 0.4894 | 1.3026 | 0.5000 | 9.998816e-10 | 488 |
| 1.2902 | 0.4682 | 1.3023 | 0.5000 | 9.998812e-10 | 489 |
| 1.2818 | 0.5082 | 1.3020 | 0.5000 | 9.998807e-10 | 490 |
| 1.2924 | 0.4424 | 1.3017 | 0.5000 | 9.998803e-10 | 491 |
| 1.2839 | 0.4918 | 1.3013 | 0.5000 | 9.998798e-10 | 492 |
| 1.2840 | 0.4635 | 1.3010 | 0.5000 | 9.998794e-10 | 493 |
| 1.2860 | 0.4800 | 1.3007 | 0.5000 | 9.99879e-10 | 494 |
| 1.2913 | 0.4424 | 1.3003 | 0.5000 | 9.998785e-10 | 495 |
| 1.2914 | 0.4988 | 1.2999 | 0.5070 | 9.998781e-10 | 496 |
| 1.2898 | 0.4635 | 1.2996 | 0.5070 | 9.998776e-10 | 497 |
| 1.2885 | 0.4635 | 1.2992 | 0.5141 | 9.998772e-10 | 498 |
| 1.2825 | 0.4847 | 1.2989 | 0.5141 | 9.998767e-10 | 499 |
| 1.2835 | 0.4682 | 1.2986 | 0.5141 | 9.998762e-10 | 500 |
| 1.2855 | 0.4894 | 1.2982 | 0.5141 | 9.998756e-10 | 501 |
| 1.2873 | 0.4729 | 1.2978 | 0.5141 | 9.998751e-10 | 502 |
| 1.2834 | 0.5106 | 1.2975 | 0.5141 | 9.998745e-10 | 503 |
| 1.2837 | 0.5153 | 1.2972 | 0.5211 | 9.99874e-10 | 504 |
| 1.2818 | 0.4941 | 1.2969 | 0.5211 | 9.998734e-10 | 505 |
| 1.2815 | 0.5082 | 1.2966 | 0.5211 | 9.998729e-10 | 506 |
| 1.2845 | 0.4800 | 1.2962 | 0.5211 | 9.998723e-10 | 507 |
| 1.2966 | 0.4376 | 1.2959 | 0.5211 | 9.998717e-10 | 508 |
| 1.2863 | 0.4941 | 1.2955 | 0.5282 | 9.998712e-10 | 509 |
| 1.2814 | 0.4871 | 1.2952 | 0.5282 | 9.998706e-10 | 510 |
| 1.2809 | 0.5224 | 1.2948 | 0.5282 | 9.998701e-10 | 511 |
| 1.2850 | 0.4682 | 1.2945 | 0.5352 | 9.998695e-10 | 512 |
| 1.2787 | 0.5035 | 1.2942 | 0.5352 | 9.99869e-10 | 513 |
| 1.2819 | 0.5059 | 1.2939 | 0.5352 | 9.998684e-10 | 514 |
| 1.2825 | 0.4729 | 1.2936 | 0.5423 | 9.998679e-10 | 515 |
| 1.2720 | 0.5341 | 1.2932 | 0.5423 | 9.998673e-10 | 516 |
| 1.2779 | 0.5153 | 1.2929 | 0.5423 | 9.998667e-10 | 517 |
| 1.2803 | 0.5176 | 1.2925 | 0.5563 | 9.998662e-10 | 518 |
| 1.2803 | 0.4706 | 1.2922 | 0.5563 | 9.998656e-10 | 519 |
| 1.2752 | 0.5059 | 1.2919 | 0.5563 | 9.998651e-10 | 520 |
| 1.2816 | 0.4894 | 1.2915 | 0.5634 | 9.998645e-10 | 521 |
| 1.2723 | 0.5459 | 1.2912 | 0.5634 | 9.99864e-10 | 522 |
| 1.2828 | 0.5012 | 1.2909 | 0.5775 | 9.998634e-10 | 523 |
| 1.2901 | 0.4871 | 1.2906 | 0.5775 | 9.998629e-10 | 524 |
| 1.2856 | 0.4800 | 1.2902 | 0.5775 | 9.998623e-10 | 525 |
| 1.2812 | 0.5176 | 1.2899 | 0.5775 | 9.998617e-10 | 526 |
| 1.2731 | 0.5176 | 1.2896 | 0.5775 | 9.998612e-10 | 527 |
| 1.2819 | 0.5082 | 1.2892 | 0.5775 | 9.998606e-10 | 528 |
| 1.2775 | 0.5106 | 1.2889 | 0.5775 | 9.998601e-10 | 529 |
| 1.2774 | 0.5012 | 1.2886 | 0.5775 | 9.998595e-10 | 530 |
| 1.2765 | 0.5294 | 1.2883 | 0.5775 | 9.99859e-10 | 531 |
| 1.2782 | 0.5176 | 1.2880 | 0.5775 | 9.998584e-10 | 532 |
| 1.2763 | 0.5082 | 1.2877 | 0.5775 | 9.998579e-10 | 533 |
| 1.2716 | 0.5082 | 1.2873 | 0.5775 | 9.998573e-10 | 534 |
| 1.2827 | 0.5035 | 1.2870 | 0.5775 | 9.998568e-10 | 535 |
| 1.2741 | 0.5106 | 1.2867 | 0.5775 | 9.998562e-10 | 536 |
| 1.2719 | 0.5294 | 1.2864 | 0.5775 | 9.998556e-10 | 537 |
| 1.2698 | 0.5153 | 1.2861 | 0.5775 | 9.998551e-10 | 538 |
| 1.2801 | 0.5294 | 1.2857 | 0.5775 | 9.998545e-10 | 539 |
| 1.2698 | 0.5459 | 1.2854 | 0.5775 | 9.99854e-10 | 540 |
| 1.2722 | 0.5129 | 1.2851 | 0.5775 | 9.998534e-10 | 541 |
| 1.2690 | 0.5176 | 1.2848 | 0.5775 | 9.998529e-10 | 542 |
| 1.2807 | 0.5106 | 1.2845 | 0.5775 | 9.998523e-10 | 543 |
| 1.2762 | 0.5153 | 1.2841 | 0.5845 | 9.998518e-10 | 544 |
| 1.2734 | 0.5365 | 1.2838 | 0.5915 | 9.998512e-10 | 545 |
| 1.2607 | 0.5459 | 1.2835 | 0.5915 | 9.998506e-10 | 546 |
| 1.2778 | 0.5035 | 1.2831 | 0.5915 | 9.998501e-10 | 547 |
| 1.2625 | 0.5271 | 1.2828 | 0.5986 | 9.998495e-10 | 548 |
| 1.2641 | 0.5318 | 1.2825 | 0.5986 | 9.99849e-10 | 549 |
| 1.2695 | 0.5341 | 1.2822 | 0.6056 | 9.998484e-10 | 550 |
| 1.2721 | 0.5459 | 1.2819 | 0.6056 | 9.998479e-10 | 551 |
| 1.2707 | 0.5271 | 1.2816 | 0.6056 | 9.998473e-10 | 552 |
| 1.2695 | 0.5247 | 1.2812 | 0.6056 | 9.998468e-10 | 553 |
| 1.2766 | 0.5035 | 1.2809 | 0.6056 | 9.998462e-10 | 554 |
| 1.2678 | 0.5482 | 1.2806 | 0.6056 | 9.998457e-10 | 555 |
| 1.2677 | 0.5318 | 1.2803 | 0.6056 | 9.998451e-10 | 556 |
| 1.2711 | 0.5271 | 1.2799 | 0.6056 | 9.998445e-10 | 557 |
| 1.2639 | 0.5529 | 1.2796 | 0.6056 | 9.99844e-10 | 558 |
| 1.2619 | 0.5906 | 1.2794 | 0.6056 | 9.998434e-10 | 559 |
| 1.2710 | 0.5271 | 1.2791 | 0.6056 | 9.998429e-10 | 560 |
| 1.2666 | 0.5647 | 1.2787 | 0.6056 | 9.998423e-10 | 561 |
| 1.2639 | 0.5388 | 1.2784 | 0.6056 | 9.998418e-10 | 562 |
| 1.2736 | 0.5200 | 1.2781 | 0.6056 | 9.998412e-10 | 563 |
| 1.2722 | 0.5271 | 1.2777 | 0.6056 | 9.998407e-10 | 564 |
| 1.2638 | 0.5482 | 1.2774 | 0.6056 | 9.998401e-10 | 565 |
| 1.2654 | 0.5318 | 1.2771 | 0.6056 | 9.998395e-10 | 566 |
| 1.2649 | 0.5459 | 1.2767 | 0.6056 | 9.99839e-10 | 567 |
| 1.2638 | 0.5412 | 1.2764 | 0.6056 | 9.998384e-10 | 568 |
| 1.2626 | 0.5694 | 1.2761 | 0.6056 | 9.998379e-10 | 569 |
| 1.2579 | 0.5576 | 1.2758 | 0.6056 | 9.998373e-10 | 570 |
| 1.2673 | 0.5671 | 1.2755 | 0.6056 | 9.998368e-10 | 571 |
| 1.2628 | 0.5224 | 1.2751 | 0.6056 | 9.998362e-10 | 572 |
| 1.2664 | 0.5247 | 1.2748 | 0.6056 | 9.998357e-10 | 573 |
| 1.2653 | 0.5247 | 1.2745 | 0.6056 | 9.998351e-10 | 574 |
| 1.2662 | 0.5294 | 1.2742 | 0.6056 | 9.998345e-10 | 575 |
| 1.2553 | 0.5459 | 1.2738 | 0.6056 | 9.99834e-10 | 576 |
| 1.2572 | 0.5765 | 1.2735 | 0.6056 | 9.998334e-10 | 577 |
| 1.2645 | 0.5271 | 1.2732 | 0.6056 | 9.998329e-10 | 578 |
| 1.2659 | 0.5388 | 1.2728 | 0.5986 | 9.998323e-10 | 579 |
| 1.2604 | 0.5482 | 1.2725 | 0.5986 | 9.998318e-10 | 580 |
| 1.2665 | 0.5012 | 1.2722 | 0.5986 | 9.998312e-10 | 581 |
| 1.2617 | 0.5388 | 1.2718 | 0.6056 | 9.998307e-10 | 582 |
| 1.2657 | 0.5200 | 1.2715 | 0.6056 | 9.998301e-10 | 583 |
| 1.2616 | 0.5412 | 1.2712 | 0.6127 | 9.998296e-10 | 584 |
| 1.2571 | 0.5624 | 1.2709 | 0.6127 | 9.99829e-10 | 585 |
| 1.2589 | 0.5482 | 1.2707 | 0.6127 | 9.998284e-10 | 586 |
| 1.2522 | 0.5671 | 1.2704 | 0.6056 | 9.998279e-10 | 587 |
| 1.2607 | 0.5553 | 1.2701 | 0.6056 | 9.998273e-10 | 588 |
| 1.2534 | 0.5624 | 1.2698 | 0.6056 | 9.998268e-10 | 589 |
| 1.2607 | 0.5624 | 1.2695 | 0.6056 | 9.998262e-10 | 590 |
| 1.2507 | 0.5812 | 1.2692 | 0.6056 | 9.998257e-10 | 591 |
| 1.2587 | 0.5506 | 1.2688 | 0.6056 | 9.998251e-10 | 592 |
| 1.2608 | 0.5506 | 1.2685 | 0.6056 | 9.998246e-10 | 593 |
| 1.2531 | 0.5553 | 1.2682 | 0.6056 | 9.99824e-10 | 594 |
| 1.2529 | 0.5953 | 1.2679 | 0.6056 | 9.998234e-10 | 595 |
| 1.2587 | 0.5435 | 1.2676 | 0.6056 | 9.998229e-10 | 596 |
| 1.2547 | 0.5459 | 1.2673 | 0.6056 | 9.998223e-10 | 597 |
| 1.2549 | 0.5694 | 1.2669 | 0.6056 | 9.998218e-10 | 598 |
| 1.2550 | 0.5576 | 1.2667 | 0.6127 | 9.998212e-10 | 599 |
| 1.2594 | 0.5741 | 1.2663 | 0.6127 | 9.998207e-10 | 600 |
| 1.2558 | 0.5435 | 1.2660 | 0.6127 | 9.998201e-10 | 601 |
| 1.2565 | 0.5576 | 1.2657 | 0.6127 | 9.998196e-10 | 602 |
| 1.2509 | 0.5671 | 1.2654 | 0.6127 | 9.99819e-10 | 603 |
| 1.2568 | 0.5765 | 1.2650 | 0.6127 | 9.998185e-10 | 604 |
| 1.2573 | 0.5529 | 1.2647 | 0.6197 | 9.998179e-10 | 605 |
| 1.2585 | 0.5388 | 1.2644 | 0.6197 | 9.998173e-10 | 606 |
| 1.2561 | 0.5647 | 1.2641 | 0.6197 | 9.998168e-10 | 607 |
| 1.2506 | 0.5459 | 1.2638 | 0.6197 | 9.998162e-10 | 608 |
| 1.2531 | 0.5765 | 1.2635 | 0.6197 | 9.998157e-10 | 609 |
| 1.2610 | 0.5506 | 1.2632 | 0.6197 | 9.998151e-10 | 610 |
| 1.2600 | 0.5553 | 1.2630 | 0.6197 | 9.998145e-10 | 611 |
| 1.2570 | 0.5788 | 1.2627 | 0.6197 | 9.998138e-10 | 612 |
| 1.2604 | 0.5600 | 1.2624 | 0.6197 | 9.998131e-10 | 613 |
| 1.2517 | 0.6000 | 1.2621 | 0.6197 | 9.998125e-10 | 614 |
| 1.2429 | 0.6141 | 1.2618 | 0.6268 | 9.998118e-10 | 615 |
| 1.2512 | 0.5718 | 1.2615 | 0.6268 | 9.998111e-10 | 616 |
| 1.2457 | 0.6047 | 1.2612 | 0.6268 | 9.998105e-10 | 617 |
| 1.2537 | 0.5718 | 1.2609 | 0.6268 | 9.998098e-10 | 618 |
| 1.2472 | 0.6047 | 1.2606 | 0.6268 | 9.998091e-10 | 619 |
| 1.2471 | 0.5953 | 1.2603 | 0.6268 | 9.998085e-10 | 620 |
| 1.2561 | 0.5765 | 1.2600 | 0.6268 | 9.998078e-10 | 621 |
| 1.2440 | 0.6000 | 1.2596 | 0.6268 | 9.998071e-10 | 622 |
| 1.2524 | 0.5671 | 1.2593 | 0.6268 | 9.998065e-10 | 623 |
| 1.2532 | 0.5835 | 1.2590 | 0.6268 | 9.998058e-10 | 624 |
| 1.2488 | 0.5576 | 1.2587 | 0.6268 | 9.998051e-10 | 625 |
| 1.2444 | 0.5976 | 1.2584 | 0.6268 | 9.998045e-10 | 626 |
| 1.2502 | 0.6094 | 1.2581 | 0.6268 | 9.998038e-10 | 627 |
| 1.2469 | 0.6024 | 1.2578 | 0.6268 | 9.998031e-10 | 628 |
| 1.2458 | 0.5718 | 1.2575 | 0.6338 | 9.998025e-10 | 629 |
| 1.2477 | 0.5953 | 1.2572 | 0.6338 | 9.998018e-10 | 630 |
| 1.2435 | 0.6024 | 1.2569 | 0.6338 | 9.998011e-10 | 631 |
| 1.2480 | 0.5788 | 1.2566 | 0.6268 | 9.998005e-10 | 632 |
| 1.2532 | 0.5412 | 1.2563 | 0.6268 | 9.997998e-10 | 633 |
| 1.2395 | 0.6047 | 1.2560 | 0.6268 | 9.997991e-10 | 634 |
| 1.2395 | 0.6259 | 1.2557 | 0.6268 | 9.997985e-10 | 635 |
| 1.2486 | 0.5788 | 1.2555 | 0.6268 | 9.997978e-10 | 636 |
| 1.2469 | 0.5835 | 1.2551 | 0.6338 | 9.997971e-10 | 637 |
| 1.2482 | 0.5647 | 1.2549 | 0.6338 | 9.997965e-10 | 638 |
| 1.2402 | 0.5765 | 1.2545 | 0.6338 | 9.997958e-10 | 639 |
| 1.2389 | 0.6047 | 1.2543 | 0.6408 | 9.997951e-10 | 640 |
| 1.2414 | 0.5953 | 1.2539 | 0.6408 | 9.997945e-10 | 641 |
| 1.2449 | 0.6071 | 1.2536 | 0.6408 | 9.997938e-10 | 642 |
| 1.2436 | 0.5929 | 1.2533 | 0.6408 | 9.997931e-10 | 643 |
| 1.2437 | 0.5929 | 1.2530 | 0.6408 | 9.997925e-10 | 644 |
| 1.2383 | 0.6094 | 1.2527 | 0.6408 | 9.997918e-10 | 645 |
| 1.2492 | 0.5859 | 1.2524 | 0.6408 | 9.997911e-10 | 646 |
| 1.2437 | 0.6047 | 1.2521 | 0.6408 | 9.997905e-10 | 647 |
| 1.2383 | 0.5882 | 1.2518 | 0.6408 | 9.997898e-10 | 648 |
| 1.2484 | 0.5694 | 1.2516 | 0.6408 | 9.997891e-10 | 649 |
| 1.2385 | 0.6000 | 1.2512 | 0.6408 | 9.997885e-10 | 650 |
| 1.2402 | 0.6094 | 1.2510 | 0.6408 | 9.997878e-10 | 651 |
| 1.2392 | 0.5953 | 1.2506 | 0.6408 | 9.997871e-10 | 652 |
| 1.2480 | 0.5788 | 1.2503 | 0.6408 | 9.997865e-10 | 653 |
| 1.2373 | 0.5929 | 1.2500 | 0.6408 | 9.997858e-10 | 654 |
| 1.2406 | 0.5882 | 1.2497 | 0.6408 | 9.997851e-10 | 655 |
| 1.2478 | 0.5506 | 1.2495 | 0.6408 | 9.997845e-10 | 656 |
| 1.2418 | 0.5906 | 1.2492 | 0.6408 | 9.997838e-10 | 657 |
| 1.2421 | 0.6071 | 1.2489 | 0.6408 | 9.997831e-10 | 658 |
| 1.2368 | 0.5976 | 1.2486 | 0.6408 | 9.997825e-10 | 659 |
| 1.2435 | 0.5600 | 1.2483 | 0.6408 | 9.997818e-10 | 660 |
| 1.2422 | 0.6024 | 1.2480 | 0.6408 | 9.997811e-10 | 661 |
| 1.2397 | 0.6094 | 1.2477 | 0.6479 | 9.997805e-10 | 662 |
| 1.2419 | 0.6000 | 1.2474 | 0.6479 | 9.997798e-10 | 663 |
| 1.2365 | 0.5812 | 1.2471 | 0.6479 | 9.997791e-10 | 664 |
| 1.2399 | 0.6024 | 1.2469 | 0.6549 | 9.997785e-10 | 665 |
| 1.2446 | 0.6047 | 1.2466 | 0.6549 | 9.997778e-10 | 666 |
| 1.2391 | 0.6047 | 1.2463 | 0.6549 | 9.997771e-10 | 667 |
| 1.2460 | 0.6165 | 1.2460 | 0.6549 | 9.997765e-10 | 668 |
| 1.2348 | 0.6141 | 1.2457 | 0.6479 | 9.997758e-10 | 669 |
| 1.2348 | 0.6024 | 1.2454 | 0.6479 | 9.997752e-10 | 670 |
| 1.2347 | 0.6094 | 1.2451 | 0.6479 | 9.997745e-10 | 671 |
| 1.2319 | 0.5953 | 1.2448 | 0.6479 | 9.997738e-10 | 672 |
| 1.2381 | 0.6118 | 1.2445 | 0.6479 | 9.997732e-10 | 673 |
| 1.2299 | 0.6141 | 1.2442 | 0.6479 | 9.997725e-10 | 674 |
| 1.2329 | 0.6071 | 1.2439 | 0.6479 | 9.997718e-10 | 675 |
| 1.2309 | 0.6259 | 1.2436 | 0.6479 | 9.997712e-10 | 676 |
| 1.2260 | 0.6259 | 1.2433 | 0.6479 | 9.997705e-10 | 677 |
| 1.2328 | 0.6212 | 1.2430 | 0.6479 | 9.997698e-10 | 678 |
| 1.2348 | 0.6024 | 1.2427 | 0.6479 | 9.997692e-10 | 679 |
| 1.2315 | 0.6047 | 1.2424 | 0.6479 | 9.997685e-10 | 680 |
| 1.2375 | 0.6235 | 1.2421 | 0.6479 | 9.997678e-10 | 681 |
| 1.2276 | 0.6376 | 1.2418 | 0.6479 | 9.997672e-10 | 682 |
| 1.2278 | 0.6165 | 1.2416 | 0.6408 | 9.997665e-10 | 683 |
| 1.2383 | 0.6188 | 1.2413 | 0.6408 | 9.997658e-10 | 684 |
| 1.2323 | 0.6071 | 1.2410 | 0.6408 | 9.997652e-10 | 685 |
| 1.2242 | 0.6094 | 1.2407 | 0.6408 | 9.997645e-10 | 686 |
| 1.2382 | 0.5976 | 1.2404 | 0.6408 | 9.997638e-10 | 687 |
| 1.2333 | 0.6212 | 1.2401 | 0.6479 | 9.997632e-10 | 688 |
| 1.2327 | 0.6094 | 1.2398 | 0.6479 | 9.997625e-10 | 689 |
| 1.2319 | 0.6259 | 1.2395 | 0.6479 | 9.997618e-10 | 690 |
| 1.2244 | 0.6329 | 1.2392 | 0.6479 | 9.997612e-10 | 691 |
| 1.2279 | 0.6118 | 1.2390 | 0.6479 | 9.997605e-10 | 692 |
| 1.2330 | 0.6212 | 1.2387 | 0.6479 | 9.997598e-10 | 693 |
| 1.2285 | 0.6306 | 1.2384 | 0.6479 | 9.997592e-10 | 694 |
| 1.2234 | 0.6188 | 1.2381 | 0.6479 | 9.997585e-10 | 695 |
| 1.2296 | 0.6282 | 1.2379 | 0.6479 | 9.997578e-10 | 696 |
| 1.2289 | 0.6353 | 1.2375 | 0.6479 | 9.997572e-10 | 697 |
| 1.2305 | 0.6259 | 1.2373 | 0.6479 | 9.997565e-10 | 698 |
| 1.2264 | 0.6329 | 1.2370 | 0.6479 | 9.997558e-10 | 699 |
| 1.2254 | 0.6165 | 1.2367 | 0.6479 | 9.997552e-10 | 700 |
| 1.2318 | 0.6188 | 1.2364 | 0.6479 | 9.997545e-10 | 701 |
| 1.2261 | 0.6094 | 1.2361 | 0.6479 | 9.997538e-10 | 702 |
| 1.2320 | 0.6094 | 1.2359 | 0.6479 | 9.997532e-10 | 703 |
| 1.2271 | 0.6188 | 1.2356 | 0.6479 | 9.997525e-10 | 704 |
| 1.2189 | 0.6282 | 1.2353 | 0.6479 | 9.997518e-10 | 705 |
| 1.2196 | 0.6329 | 1.2350 | 0.6479 | 9.997512e-10 | 706 |
| 1.2207 | 0.6376 | 1.2348 | 0.6479 | 9.997505e-10 | 707 |
| 1.2265 | 0.5929 | 1.2345 | 0.6479 | 9.997498e-10 | 708 |
| 1.2226 | 0.6400 | 1.2342 | 0.6479 | 9.997492e-10 | 709 |
| 1.2294 | 0.6212 | 1.2338 | 0.6479 | 9.997485e-10 | 710 |
| 1.2220 | 0.6235 | 1.2335 | 0.6479 | 9.997478e-10 | 711 |
| 1.2288 | 0.6165 | 1.2332 | 0.6479 | 9.997472e-10 | 712 |
| 1.2299 | 0.6376 | 1.2330 | 0.6479 | 9.997465e-10 | 713 |
| 1.2196 | 0.6212 | 1.2327 | 0.6479 | 9.997458e-10 | 714 |
| 1.2180 | 0.6282 | 1.2324 | 0.6479 | 9.997452e-10 | 715 |
| 1.2271 | 0.6494 | 1.2322 | 0.6479 | 9.997445e-10 | 716 |
| 1.2231 | 0.6188 | 1.2319 | 0.6479 | 9.997438e-10 | 717 |
| 1.2253 | 0.6212 | 1.2317 | 0.6479 | 9.997432e-10 | 718 |
| 1.2265 | 0.5976 | 1.2314 | 0.6479 | 9.997425e-10 | 719 |
| 1.2221 | 0.6071 | 1.2311 | 0.6479 | 9.997418e-10 | 720 |
| 1.2174 | 0.6306 | 1.2308 | 0.6479 | 9.997412e-10 | 721 |
| 1.2241 | 0.6282 | 1.2306 | 0.6479 | 9.997404e-10 | 722 |
| 1.2241 | 0.6259 | 1.2303 | 0.6479 | 9.997396e-10 | 723 |
| 1.2211 | 0.6118 | 1.2300 | 0.6479 | 9.997388e-10 | 724 |
| 1.2126 | 0.6259 | 1.2298 | 0.6549 | 9.997381e-10 | 725 |
| 1.2193 | 0.6541 | 1.2295 | 0.6549 | 9.997373e-10 | 726 |
| 1.2128 | 0.6471 | 1.2292 | 0.6549 | 9.997365e-10 | 727 |
| 1.2246 | 0.6141 | 1.2289 | 0.6549 | 9.997357e-10 | 728 |
| 1.2164 | 0.6282 | 1.2286 | 0.6549 | 9.99735e-10 | 729 |
| 1.2171 | 0.6282 | 1.2284 | 0.6549 | 9.997342e-10 | 730 |
| 1.2173 | 0.6447 | 1.2281 | 0.6549 | 9.997334e-10 | 731 |
| 1.2135 | 0.6353 | 1.2278 | 0.6549 | 9.997326e-10 | 732 |
| 1.2139 | 0.6329 | 1.2275 | 0.6549 | 9.997319e-10 | 733 |
| 1.2202 | 0.6353 | 1.2273 | 0.6549 | 9.997311e-10 | 734 |
| 1.2140 | 0.6541 | 1.2270 | 0.6549 | 9.997303e-10 | 735 |
| 1.2116 | 0.6400 | 1.2267 | 0.6549 | 9.997295e-10 | 736 |
| 1.2206 | 0.6282 | 1.2264 | 0.6549 | 9.997287e-10 | 737 |
| 1.2170 | 0.6235 | 1.2262 | 0.6549 | 9.99728e-10 | 738 |
| 1.2202 | 0.6329 | 1.2259 | 0.6549 | 9.997272e-10 | 739 |
| 1.2149 | 0.6424 | 1.2256 | 0.6549 | 9.997264e-10 | 740 |
| 1.2109 | 0.6329 | 1.2253 | 0.6549 | 9.997256e-10 | 741 |
| 1.2127 | 0.6235 | 1.2250 | 0.6549 | 9.997249e-10 | 742 |
| 1.2132 | 0.6447 | 1.2248 | 0.6549 | 9.997241e-10 | 743 |
| 1.2129 | 0.6165 | 1.2245 | 0.6549 | 9.997233e-10 | 744 |
| 1.2094 | 0.6494 | 1.2242 | 0.6549 | 9.997225e-10 | 745 |
| 1.2206 | 0.6118 | 1.2240 | 0.6549 | 9.997217e-10 | 746 |
| 1.2174 | 0.6376 | 1.2237 | 0.6549 | 9.99721e-10 | 747 |
| 1.2220 | 0.6047 | 1.2234 | 0.6549 | 9.997202e-10 | 748 |
| 1.2130 | 0.6424 | 1.2232 | 0.6549 | 9.997194e-10 | 749 |
| 1.2201 | 0.6259 | 1.2229 | 0.6549 | 9.997186e-10 | 750 |
| 1.2147 | 0.6329 | 1.2226 | 0.6549 | 9.997179e-10 | 751 |
| 1.2148 | 0.6235 | 1.2223 | 0.6549 | 9.997171e-10 | 752 |
| 1.2149 | 0.6329 | 1.2221 | 0.6549 | 9.997163e-10 | 753 |
| 1.2139 | 0.6329 | 1.2218 | 0.6549 | 9.997155e-10 | 754 |
| 1.2167 | 0.6400 | 1.2215 | 0.6549 | 9.997148e-10 | 755 |
| 1.2103 | 0.6518 | 1.2212 | 0.6549 | 9.99714e-10 | 756 |
| 1.2095 | 0.6471 | 1.2209 | 0.6549 | 9.997132e-10 | 757 |
| 1.2157 | 0.6259 | 1.2207 | 0.6549 | 9.997124e-10 | 758 |
| 1.2153 | 0.6424 | 1.2204 | 0.6549 | 9.997116e-10 | 759 |
| 1.2136 | 0.6400 | 1.2202 | 0.6549 | 9.997109e-10 | 760 |
| 1.2068 | 0.6353 | 1.2199 | 0.6549 | 9.997101e-10 | 761 |
| 1.2131 | 0.6329 | 1.2197 | 0.6549 | 9.997093e-10 | 762 |
| 1.2018 | 0.6494 | 1.2194 | 0.6549 | 9.997085e-10 | 763 |
| 1.2136 | 0.6353 | 1.2191 | 0.6549 | 9.997078e-10 | 764 |
| 1.2101 | 0.6306 | 1.2188 | 0.6549 | 9.99707e-10 | 765 |
| 1.2122 | 0.6447 | 1.2186 | 0.6549 | 9.997062e-10 | 766 |
| 1.2098 | 0.6353 | 1.2183 | 0.6549 | 9.997054e-10 | 767 |
| 1.2114 | 0.6518 | 1.2181 | 0.6549 | 9.997047e-10 | 768 |
| 1.2122 | 0.6400 | 1.2178 | 0.6620 | 9.997039e-10 | 769 |
| 1.2138 | 0.6235 | 1.2175 | 0.6690 | 9.997031e-10 | 770 |
| 1.2082 | 0.6588 | 1.2172 | 0.6761 | 9.997023e-10 | 771 |
| 1.2133 | 0.6518 | 1.2169 | 0.6761 | 9.997015e-10 | 772 |
| 1.2063 | 0.6329 | 1.2167 | 0.6761 | 9.997008e-10 | 773 |
| 1.2104 | 0.6541 | 1.2164 | 0.6761 | 9.997e-10 | 774 |
| 1.2060 | 0.6376 | 1.2161 | 0.6761 | 9.996992e-10 | 775 |
| 1.2030 | 0.6471 | 1.2158 | 0.6761 | 9.996984e-10 | 776 |
| 1.2076 | 0.6329 | 1.2155 | 0.6761 | 9.996977e-10 | 777 |
| 1.2008 | 0.6565 | 1.2153 | 0.6761 | 9.996969e-10 | 778 |
| 1.2092 | 0.6447 | 1.2150 | 0.6761 | 9.996961e-10 | 779 |
| 1.2116 | 0.6471 | 1.2147 | 0.6761 | 9.996953e-10 | 780 |
| 1.2111 | 0.6306 | 1.2144 | 0.6761 | 9.996945e-10 | 781 |
| 1.2123 | 0.6565 | 1.2142 | 0.6761 | 9.996938e-10 | 782 |
| 1.1970 | 0.6635 | 1.2139 | 0.6761 | 9.99693e-10 | 783 |
| 1.2024 | 0.6635 | 1.2136 | 0.6761 | 9.996922e-10 | 784 |
| 1.2029 | 0.6329 | 1.2134 | 0.6761 | 9.996914e-10 | 785 |
| 1.2050 | 0.6447 | 1.2131 | 0.6761 | 9.996907e-10 | 786 |
| 1.2117 | 0.6541 | 1.2128 | 0.6761 | 9.996899e-10 | 787 |
| 1.2021 | 0.6588 | 1.2126 | 0.6761 | 9.996891e-10 | 788 |
| 1.2075 | 0.6565 | 1.2123 | 0.6761 | 9.996883e-10 | 789 |
| 1.2131 | 0.6518 | 1.2120 | 0.6761 | 9.996876e-10 | 790 |
| 1.2062 | 0.6541 | 1.2118 | 0.6761 | 9.996868e-10 | 791 |
| 1.2005 | 0.6471 | 1.2115 | 0.6761 | 9.99686e-10 | 792 |
| 1.2104 | 0.6541 | 1.2112 | 0.6761 | 9.996852e-10 | 793 |
| 1.1939 | 0.6424 | 1.2110 | 0.6761 | 9.996844e-10 | 794 |
| 1.2017 | 0.6588 | 1.2107 | 0.6761 | 9.996837e-10 | 795 |
| 1.2061 | 0.6588 | 1.2105 | 0.6761 | 9.996829e-10 | 796 |
| 1.2084 | 0.6565 | 1.2102 | 0.6761 | 9.996821e-10 | 797 |
| 1.2063 | 0.6635 | 1.2099 | 0.6761 | 9.996813e-10 | 798 |
| 1.2001 | 0.6588 | 1.2096 | 0.6761 | 9.996806e-10 | 799 |
| 1.2047 | 0.6447 | 1.2094 | 0.6761 | 9.996798e-10 | 800 |
| 1.2034 | 0.6471 | 1.2092 | 0.6761 | 9.99679e-10 | 801 |
| 1.1968 | 0.6541 | 1.2089 | 0.6761 | 9.996782e-10 | 802 |
| 1.2095 | 0.6376 | 1.2086 | 0.6761 | 9.996775e-10 | 803 |
| 1.1969 | 0.6565 | 1.2083 | 0.6761 | 9.996767e-10 | 804 |
| 1.2043 | 0.6447 | 1.2080 | 0.6761 | 9.996759e-10 | 805 |
| 1.2058 | 0.6376 | 1.2078 | 0.6761 | 9.996751e-10 | 806 |
| 1.1986 | 0.6565 | 1.2075 | 0.6761 | 9.996743e-10 | 807 |
| 1.1983 | 0.6588 | 1.2073 | 0.6761 | 9.996736e-10 | 808 |
| 1.2041 | 0.6353 | 1.2070 | 0.6761 | 9.996728e-10 | 809 |
| 1.2055 | 0.6494 | 1.2068 | 0.6761 | 9.99672e-10 | 810 |
| 1.1934 | 0.6565 | 1.2065 | 0.6761 | 9.996712e-10 | 811 |
| 1.1971 | 0.6635 | 1.2063 | 0.6761 | 9.996705e-10 | 812 |
| 1.2028 | 0.6494 | 1.2060 | 0.6761 | 9.996697e-10 | 813 |
| 1.2042 | 0.6565 | 1.2058 | 0.6761 | 9.996689e-10 | 814 |
| 1.1954 | 0.6565 | 1.2055 | 0.6761 | 9.996681e-10 | 815 |
| 1.2005 | 0.6541 | 1.2052 | 0.6761 | 9.996673e-10 | 816 |
| 1.1996 | 0.6518 | 1.2050 | 0.6761 | 9.996666e-10 | 817 |
| 1.1968 | 0.6424 | 1.2047 | 0.6761 | 9.996658e-10 | 818 |
| 1.1947 | 0.6471 | 1.2045 | 0.6761 | 9.99665e-10 | 819 |
| 1.1982 | 0.6518 | 1.2042 | 0.6761 | 9.996642e-10 | 820 |
| 1.1967 | 0.6447 | 1.2039 | 0.6761 | 9.996635e-10 | 821 |
| 1.1976 | 0.6565 | 1.2037 | 0.6761 | 9.996627e-10 | 822 |
| 1.1990 | 0.6424 | 1.2034 | 0.6761 | 9.996619e-10 | 823 |
| 1.2013 | 0.6400 | 1.2032 | 0.6761 | 9.996611e-10 | 824 |
| 1.2046 | 0.6518 | 1.2029 | 0.6761 | 9.996604e-10 | 825 |
| 1.1975 | 0.6659 | 1.2027 | 0.6761 | 9.996596e-10 | 826 |
| 1.1907 | 0.6612 | 1.2025 | 0.6761 | 9.996588e-10 | 827 |
| 1.1963 | 0.6659 | 1.2022 | 0.6761 | 9.99658e-10 | 828 |
| 1.1901 | 0.6588 | 1.2019 | 0.6761 | 9.996572e-10 | 829 |
| 1.1920 | 0.6635 | 1.2017 | 0.6761 | 9.996565e-10 | 830 |
| 1.1900 | 0.6588 | 1.2014 | 0.6761 | 9.996557e-10 | 831 |
| 1.1954 | 0.6612 | 1.2012 | 0.6761 | 9.996549e-10 | 832 |
| 1.1956 | 0.6471 | 1.2010 | 0.6761 | 9.99654e-10 | 833 |
| 1.1882 | 0.6612 | 1.2007 | 0.6761 | 9.996531e-10 | 834 |
| 1.1963 | 0.6494 | 1.2004 | 0.6761 | 9.996522e-10 | 835 |
| 1.1932 | 0.6471 | 1.2002 | 0.6761 | 9.996514e-10 | 836 |
| 1.1955 | 0.6565 | 1.1999 | 0.6761 | 9.996505e-10 | 837 |
| 1.1932 | 0.6565 | 1.1997 | 0.6761 | 9.996496e-10 | 838 |
| 1.1943 | 0.6565 | 1.1994 | 0.6761 | 9.996487e-10 | 839 |
| 1.1885 | 0.6518 | 1.1991 | 0.6761 | 9.996478e-10 | 840 |
| 1.1975 | 0.6565 | 1.1989 | 0.6761 | 9.996469e-10 | 841 |
| 1.1930 | 0.6518 | 1.1986 | 0.6761 | 9.99646e-10 | 842 |
| 1.1836 | 0.6729 | 1.1984 | 0.6761 | 9.996451e-10 | 843 |
| 1.1839 | 0.6706 | 1.1982 | 0.6761 | 9.996443e-10 | 844 |
| 1.1870 | 0.6565 | 1.1979 | 0.6761 | 9.996434e-10 | 845 |
| 1.1919 | 0.6541 | 1.1976 | 0.6761 | 9.996425e-10 | 846 |
| 1.1877 | 0.6588 | 1.1974 | 0.6761 | 9.996416e-10 | 847 |
| 1.1914 | 0.6635 | 1.1971 | 0.6761 | 9.996407e-10 | 848 |
| 1.1953 | 0.6588 | 1.1969 | 0.6761 | 9.996398e-10 | 849 |
| 1.1865 | 0.6635 | 1.1966 | 0.6761 | 9.996389e-10 | 850 |
| 1.1927 | 0.6612 | 1.1964 | 0.6761 | 9.99638e-10 | 851 |
| 1.1831 | 0.6588 | 1.1961 | 0.6761 | 9.996372e-10 | 852 |
| 1.1877 | 0.6729 | 1.1959 | 0.6761 | 9.996363e-10 | 853 |
| 1.1787 | 0.6588 | 1.1956 | 0.6761 | 9.996354e-10 | 854 |
| 1.1773 | 0.6612 | 1.1954 | 0.6761 | 9.996345e-10 | 855 |
| 1.1871 | 0.6706 | 1.1951 | 0.6761 | 9.996336e-10 | 856 |
| 1.1812 | 0.6612 | 1.1949 | 0.6761 | 9.996327e-10 | 857 |
| 1.1870 | 0.6612 | 1.1946 | 0.6761 | 9.996318e-10 | 858 |
| 1.1824 | 0.6612 | 1.1944 | 0.6761 | 9.996309e-10 | 859 |
| 1.1842 | 0.6494 | 1.1942 | 0.6761 | 9.9963e-10 | 860 |
| 1.1800 | 0.6776 | 1.1939 | 0.6761 | 9.996292e-10 | 861 |
| 1.1848 | 0.6800 | 1.1937 | 0.6761 | 9.996283e-10 | 862 |
| 1.1904 | 0.6682 | 1.1934 | 0.6761 | 9.996274e-10 | 863 |
| 1.1798 | 0.6682 | 1.1932 | 0.6761 | 9.996265e-10 | 864 |
| 1.1813 | 0.6635 | 1.1930 | 0.6761 | 9.996256e-10 | 865 |
| 1.1847 | 0.6706 | 1.1927 | 0.6761 | 9.996247e-10 | 866 |
| 1.1915 | 0.6612 | 1.1925 | 0.6761 | 9.996238e-10 | 867 |
| 1.1793 | 0.6800 | 1.1923 | 0.6761 | 9.996229e-10 | 868 |
| 1.1836 | 0.6776 | 1.1920 | 0.6761 | 9.99622e-10 | 869 |
| 1.1884 | 0.6753 | 1.1918 | 0.6761 | 9.996212e-10 | 870 |
| 1.1780 | 0.6847 | 1.1916 | 0.6761 | 9.996203e-10 | 871 |
| 1.1850 | 0.6729 | 1.1913 | 0.6761 | 9.996194e-10 | 872 |
| 1.1930 | 0.6588 | 1.1911 | 0.6761 | 9.996185e-10 | 873 |
| 1.1882 | 0.6518 | 1.1908 | 0.6761 | 9.996176e-10 | 874 |
| 1.1870 | 0.6729 | 1.1906 | 0.6761 | 9.996167e-10 | 875 |
| 1.1886 | 0.6541 | 1.1903 | 0.6761 | 9.996158e-10 | 876 |
| 1.1785 | 0.6659 | 1.1901 | 0.6761 | 9.99615e-10 | 877 |
| 1.1861 | 0.6588 | 1.1898 | 0.6761 | 9.996141e-10 | 878 |
| 1.1864 | 0.6753 | 1.1896 | 0.6761 | 9.996132e-10 | 879 |
| 1.1904 | 0.6706 | 1.1893 | 0.6761 | 9.996123e-10 | 880 |
| 1.1829 | 0.6659 | 1.1890 | 0.6761 | 9.996114e-10 | 881 |
| 1.1840 | 0.6706 | 1.1888 | 0.6761 | 9.996105e-10 | 882 |
| 1.1742 | 0.6753 | 1.1886 | 0.6761 | 9.996096e-10 | 883 |
| 1.1818 | 0.6635 | 1.1883 | 0.6761 | 9.996087e-10 | 884 |
| 1.1794 | 0.6729 | 1.1881 | 0.6761 | 9.996078e-10 | 885 |
| 1.1860 | 0.6612 | 1.1879 | 0.6761 | 9.99607e-10 | 886 |
| 1.1812 | 0.6635 | 1.1876 | 0.6761 | 9.996061e-10 | 887 |
| 1.1820 | 0.6682 | 1.1874 | 0.6761 | 9.996052e-10 | 888 |
| 1.1819 | 0.6776 | 1.1871 | 0.6761 | 9.996043e-10 | 889 |
| 1.1871 | 0.6635 | 1.1869 | 0.6761 | 9.996034e-10 | 890 |
| 1.1799 | 0.6635 | 1.1867 | 0.6761 | 9.996025e-10 | 891 |
| 1.1803 | 0.6729 | 1.1864 | 0.6761 | 9.996016e-10 | 892 |
| 1.1827 | 0.6612 | 1.1861 | 0.6761 | 9.996007e-10 | 893 |
| 1.1818 | 0.6635 | 1.1859 | 0.6761 | 9.995998e-10 | 894 |
| 1.1818 | 0.6753 | 1.1857 | 0.6761 | 9.99599e-10 | 895 |
| 1.1763 | 0.6776 | 1.1854 | 0.6761 | 9.995981e-10 | 896 |
| 1.1753 | 0.6706 | 1.1852 | 0.6761 | 9.995972e-10 | 897 |
| 1.1783 | 0.6706 | 1.1849 | 0.6761 | 9.995963e-10 | 898 |
| 1.1787 | 0.6753 | 1.1847 | 0.6761 | 9.995954e-10 | 899 |
| 1.1771 | 0.6541 | 1.1845 | 0.6761 | 9.995945e-10 | 900 |
| 1.1735 | 0.6659 | 1.1842 | 0.6761 | 9.995936e-10 | 901 |
| 1.1812 | 0.6565 | 1.1840 | 0.6761 | 9.995927e-10 | 902 |
| 1.1791 | 0.6659 | 1.1837 | 0.6761 | 9.995919e-10 | 903 |
| 1.1768 | 0.6682 | 1.1835 | 0.6761 | 9.99591e-10 | 904 |
| 1.1781 | 0.6682 | 1.1833 | 0.6761 | 9.995901e-10 | 905 |
| 1.1747 | 0.6612 | 1.1830 | 0.6761 | 9.995892e-10 | 906 |
| 1.1791 | 0.6753 | 1.1828 | 0.6761 | 9.995883e-10 | 907 |
| 1.1805 | 0.6706 | 1.1825 | 0.6761 | 9.995874e-10 | 908 |
| 1.1753 | 0.6612 | 1.1823 | 0.6761 | 9.995865e-10 | 909 |
| 1.1684 | 0.6776 | 1.1820 | 0.6761 | 9.995856e-10 | 910 |
| 1.1760 | 0.6588 | 1.1818 | 0.6761 | 9.995847e-10 | 911 |
| 1.1827 | 0.6682 | 1.1815 | 0.6761 | 9.995839e-10 | 912 |
| 1.1749 | 0.6776 | 1.1813 | 0.6761 | 9.99583e-10 | 913 |
| 1.1826 | 0.6706 | 1.1810 | 0.6761 | 9.995821e-10 | 914 |
| 1.1789 | 0.6706 | 1.1808 | 0.6761 | 9.995812e-10 | 915 |
| 1.1759 | 0.6659 | 1.1806 | 0.6761 | 9.995803e-10 | 916 |
| 1.1679 | 0.6682 | 1.1804 | 0.6761 | 9.995794e-10 | 917 |
| 1.1653 | 0.6659 | 1.1801 | 0.6761 | 9.995785e-10 | 918 |
| 1.1746 | 0.6729 | 1.1799 | 0.6761 | 9.995776e-10 | 919 |
| 1.1765 | 0.6659 | 1.1796 | 0.6761 | 9.995768e-10 | 920 |
| 1.1719 | 0.6682 | 1.1794 | 0.6761 | 9.995759e-10 | 921 |
| 1.1728 | 0.6753 | 1.1791 | 0.6761 | 9.99575e-10 | 922 |
| 1.1680 | 0.6706 | 1.1789 | 0.6761 | 9.995741e-10 | 923 |
| 1.1740 | 0.6541 | 1.1786 | 0.6761 | 9.995732e-10 | 924 |
| 1.1794 | 0.6635 | 1.1784 | 0.6761 | 9.995723e-10 | 925 |
| 1.1689 | 0.6753 | 1.1782 | 0.6761 | 9.995714e-10 | 926 |
| 1.1742 | 0.6729 | 1.1780 | 0.6761 | 9.995705e-10 | 927 |
| 1.1682 | 0.6706 | 1.1777 | 0.6761 | 9.995696e-10 | 928 |
| 1.1695 | 0.6706 | 1.1775 | 0.6761 | 9.995688e-10 | 929 |
| 1.1724 | 0.6682 | 1.1773 | 0.6761 | 9.995679e-10 | 930 |
| 1.1782 | 0.6729 | 1.1770 | 0.6761 | 9.99567e-10 | 931 |
| 1.1631 | 0.6776 | 1.1768 | 0.6761 | 9.995661e-10 | 932 |
| 1.1734 | 0.6659 | 1.1766 | 0.6761 | 9.995652e-10 | 933 |
| 1.1639 | 0.6706 | 1.1763 | 0.6761 | 9.995643e-10 | 934 |
| 1.1755 | 0.6729 | 1.1761 | 0.6761 | 9.995634e-10 | 935 |
| 1.1706 | 0.6706 | 1.1759 | 0.6761 | 9.995625e-10 | 936 |
| 1.1671 | 0.6682 | 1.1757 | 0.6761 | 9.995617e-10 | 937 |
| 1.1684 | 0.6753 | 1.1754 | 0.6761 | 9.995608e-10 | 938 |
| 1.1744 | 0.6753 | 1.1752 | 0.6761 | 9.995599e-10 | 939 |
| 1.1667 | 0.6682 | 1.1750 | 0.6761 | 9.99559e-10 | 940 |
| 1.1703 | 0.6682 | 1.1748 | 0.6761 | 9.995581e-10 | 941 |
| 1.1656 | 0.6682 | 1.1746 | 0.6761 | 9.995572e-10 | 942 |
| 1.1696 | 0.6682 | 1.1744 | 0.6761 | 9.995563e-10 | 943 |
| 1.1650 | 0.6706 | 1.1741 | 0.6761 | 9.995554e-10 | 944 |
| 1.1644 | 0.6706 | 1.1739 | 0.6761 | 9.995544e-10 | 945 |
| 1.1701 | 0.6776 | 1.1737 | 0.6761 | 9.995534e-10 | 946 |
| 1.1635 | 0.6753 | 1.1734 | 0.6761 | 9.995524e-10 | 947 |
| 1.1717 | 0.6729 | 1.1732 | 0.6761 | 9.995514e-10 | 948 |
| 1.1740 | 0.6635 | 1.1730 | 0.6761 | 9.995504e-10 | 949 |
| 1.1675 | 0.6635 | 1.1727 | 0.6761 | 9.995494e-10 | 950 |
| 1.1670 | 0.6659 | 1.1725 | 0.6761 | 9.995484e-10 | 951 |
| 1.1695 | 0.6776 | 1.1723 | 0.6761 | 9.995474e-10 | 952 |
| 1.1651 | 0.6729 | 1.1720 | 0.6761 | 9.995464e-10 | 953 |
| 1.1642 | 0.6588 | 1.1718 | 0.6761 | 9.995454e-10 | 954 |
| 1.1652 | 0.6729 | 1.1716 | 0.6761 | 9.995444e-10 | 955 |
| 1.1673 | 0.6682 | 1.1714 | 0.6761 | 9.995434e-10 | 956 |
| 1.1649 | 0.6729 | 1.1712 | 0.6761 | 9.995424e-10 | 957 |
| 1.1665 | 0.6753 | 1.1710 | 0.6761 | 9.995414e-10 | 958 |
| 1.1633 | 0.6776 | 1.1707 | 0.6761 | 9.995405e-10 | 959 |
| 1.1625 | 0.6635 | 1.1705 | 0.6761 | 9.995395e-10 | 960 |
| 1.1668 | 0.6635 | 1.1703 | 0.6761 | 9.995385e-10 | 961 |
| 1.1607 | 0.6729 | 1.1701 | 0.6761 | 9.995375e-10 | 962 |
| 1.1697 | 0.6706 | 1.1699 | 0.6761 | 9.995365e-10 | 963 |
| 1.1637 | 0.6753 | 1.1696 | 0.6761 | 9.995355e-10 | 964 |
| 1.1644 | 0.6729 | 1.1694 | 0.6761 | 9.995345e-10 | 965 |
| 1.1613 | 0.6729 | 1.1692 | 0.6761 | 9.995335e-10 | 966 |
| 1.1685 | 0.6612 | 1.1690 | 0.6761 | 9.995325e-10 | 967 |
| 1.1595 | 0.6706 | 1.1688 | 0.6761 | 9.995315e-10 | 968 |
| 1.1650 | 0.6706 | 1.1686 | 0.6761 | 9.995305e-10 | 969 |
| 1.1582 | 0.6682 | 1.1684 | 0.6761 | 9.995295e-10 | 970 |
| 1.1609 | 0.6729 | 1.1682 | 0.6761 | 9.995285e-10 | 971 |
| 1.1619 | 0.6706 | 1.1679 | 0.6761 | 9.995275e-10 | 972 |
| 1.1618 | 0.6776 | 1.1677 | 0.6761 | 9.995265e-10 | 973 |
| 1.1594 | 0.6682 | 1.1675 | 0.6761 | 9.995255e-10 | 974 |
| 1.1572 | 0.6753 | 1.1673 | 0.6761 | 9.995245e-10 | 975 |
| 1.1591 | 0.6776 | 1.1670 | 0.6761 | 9.995235e-10 | 976 |
| 1.1600 | 0.6729 | 1.1668 | 0.6761 | 9.995225e-10 | 977 |
| 1.1590 | 0.6635 | 1.1666 | 0.6761 | 9.995215e-10 | 978 |
| 1.1570 | 0.6753 | 1.1664 | 0.6761 | 9.995205e-10 | 979 |
| 1.1615 | 0.6729 | 1.1662 | 0.6761 | 9.995195e-10 | 980 |
| 1.1601 | 0.6776 | 1.1660 | 0.6761 | 9.995185e-10 | 981 |
| 1.1605 | 0.6682 | 1.1658 | 0.6761 | 9.995175e-10 | 982 |
| 1.1557 | 0.6800 | 1.1656 | 0.6761 | 9.995165e-10 | 983 |
| 1.1575 | 0.6729 | 1.1653 | 0.6761 | 9.995155e-10 | 984 |
| 1.1531 | 0.6659 | 1.1651 | 0.6761 | 9.995145e-10 | 985 |
| 1.1654 | 0.6753 | 1.1649 | 0.6761 | 9.995135e-10 | 986 |
| 1.1555 | 0.6776 | 1.1647 | 0.6761 | 9.995125e-10 | 987 |
| 1.1603 | 0.6753 | 1.1645 | 0.6761 | 9.995115e-10 | 988 |
| 1.1605 | 0.6729 | 1.1643 | 0.6761 | 9.995105e-10 | 989 |
| 1.1575 | 0.6682 | 1.1640 | 0.6761 | 9.995095e-10 | 990 |
| 1.1633 | 0.6776 | 1.1638 | 0.6761 | 9.995085e-10 | 991 |
| 1.1637 | 0.6776 | 1.1636 | 0.6761 | 9.995075e-10 | 992 |
| 1.1583 | 0.6753 | 1.1634 | 0.6761 | 9.995065e-10 | 993 |
| 1.1557 | 0.6824 | 1.1632 | 0.6761 | 9.995055e-10 | 994 |
| 1.1611 | 0.6682 | 1.1629 | 0.6761 | 9.995045e-10 | 995 |
| 1.1580 | 0.6659 | 1.1627 | 0.6761 | 9.995035e-10 | 996 |
| 1.1599 | 0.6682 | 1.1625 | 0.6761 | 9.995025e-10 | 997 |
| 1.1575 | 0.6824 | 1.1623 | 0.6761 | 9.995015e-10 | 998 |
| 1.1645 | 0.6635 | 1.1621 | 0.6761 | 9.995005e-10 | 999 |
| 1.1536 | 0.6776 | 1.1619 | 0.6761 | 9.994995e-10 | 1000 |
| 1.1546 | 0.6729 | 1.1616 | 0.6761 | 9.994985e-10 | 1001 |
| 1.1577 | 0.6706 | 1.1614 | 0.6761 | 9.994975e-10 | 1002 |
| 1.1537 | 0.6753 | 1.1612 | 0.6761 | 9.994965e-10 | 1003 |
| 1.1464 | 0.6753 | 1.1610 | 0.6761 | 9.994955e-10 | 1004 |
| 1.1584 | 0.6753 | 1.1607 | 0.6761 | 9.994945e-10 | 1005 |
| 1.1504 | 0.6706 | 1.1605 | 0.6761 | 9.994935e-10 | 1006 |
| 1.1536 | 0.6753 | 1.1603 | 0.6761 | 9.994925e-10 | 1007 |
| 1.1583 | 0.6776 | 1.1601 | 0.6761 | 9.994915e-10 | 1008 |
| 1.1560 | 0.6753 | 1.1598 | 0.6761 | 9.994905e-10 | 1009 |
| 1.1489 | 0.6706 | 1.1596 | 0.6761 | 9.994895e-10 | 1010 |
| 1.1522 | 0.6729 | 1.1594 | 0.6761 | 9.994885e-10 | 1011 |
| 1.1557 | 0.6776 | 1.1592 | 0.6761 | 9.994875e-10 | 1012 |
| 1.1555 | 0.6729 | 1.1589 | 0.6761 | 9.994865e-10 | 1013 |
| 1.1496 | 0.6753 | 1.1587 | 0.6761 | 9.994855e-10 | 1014 |
| 1.1449 | 0.6800 | 1.1585 | 0.6761 | 9.994845e-10 | 1015 |
| 1.1449 | 0.6800 | 1.1583 | 0.6761 | 9.994835e-10 | 1016 |
| 1.1574 | 0.6753 | 1.1581 | 0.6761 | 9.994825e-10 | 1017 |
| 1.1486 | 0.6753 | 1.1579 | 0.6761 | 9.994815e-10 | 1018 |
| 1.1557 | 0.6776 | 1.1576 | 0.6761 | 9.994805e-10 | 1019 |
| 1.1534 | 0.6706 | 1.1574 | 0.6761 | 9.994795e-10 | 1020 |
| 1.1488 | 0.6753 | 1.1572 | 0.6761 | 9.994785e-10 | 1021 |
| 1.1551 | 0.6776 | 1.1570 | 0.6761 | 9.994775e-10 | 1022 |
| 1.1507 | 0.6753 | 1.1568 | 0.6761 | 9.994765e-10 | 1023 |
| 1.1526 | 0.6776 | 1.1566 | 0.6761 | 9.994755e-10 | 1024 |
| 1.1476 | 0.6753 | 1.1564 | 0.6761 | 9.994745e-10 | 1025 |
| 1.1520 | 0.6706 | 1.1562 | 0.6761 | 9.994735e-10 | 1026 |
| 1.1449 | 0.6729 | 1.1560 | 0.6761 | 9.994725e-10 | 1027 |
| 1.1529 | 0.6729 | 1.1558 | 0.6761 | 9.994715e-10 | 1028 |
| 1.1515 | 0.6753 | 1.1555 | 0.6761 | 9.994705e-10 | 1029 |
| 1.1511 | 0.6706 | 1.1553 | 0.6761 | 9.994695e-10 | 1030 |
| 1.1476 | 0.6706 | 1.1551 | 0.6761 | 9.994685e-10 | 1031 |
| 1.1532 | 0.6776 | 1.1549 | 0.6761 | 9.994675e-10 | 1032 |
| 1.1511 | 0.6776 | 1.1546 | 0.6761 | 9.994665e-10 | 1033 |
| 1.1515 | 0.6753 | 1.1545 | 0.6761 | 9.994655e-10 | 1034 |
| 1.1506 | 0.6753 | 1.1543 | 0.6761 | 9.994645e-10 | 1035 |
| 1.1508 | 0.6706 | 1.1541 | 0.6761 | 9.994635e-10 | 1036 |
| 1.1492 | 0.6729 | 1.1539 | 0.6761 | 9.994625e-10 | 1037 |
| 1.1504 | 0.6800 | 1.1536 | 0.6761 | 9.994615e-10 | 1038 |
| 1.1429 | 0.6753 | 1.1534 | 0.6761 | 9.994605e-10 | 1039 |
| 1.1528 | 0.6729 | 1.1532 | 0.6761 | 9.994595e-10 | 1040 |
| 1.1508 | 0.6753 | 1.1530 | 0.6761 | 9.994585e-10 | 1041 |
| 1.1535 | 0.6729 | 1.1528 | 0.6761 | 9.994575e-10 | 1042 |
| 1.1535 | 0.6706 | 1.1526 | 0.6761 | 9.994565e-10 | 1043 |
| 1.1453 | 0.6776 | 1.1524 | 0.6761 | 9.994555e-10 | 1044 |
| 1.1455 | 0.6706 | 1.1521 | 0.6761 | 9.994545e-10 | 1045 |
| 1.1488 | 0.6729 | 1.1519 | 0.6761 | 9.994535e-10 | 1046 |
| 1.1425 | 0.6729 | 1.1517 | 0.6761 | 9.994525e-10 | 1047 |
| 1.1435 | 0.6824 | 1.1515 | 0.6761 | 9.994515e-10 | 1048 |
| 1.1341 | 0.6824 | 1.1513 | 0.6761 | 9.994505e-10 | 1049 |
| 1.1458 | 0.6776 | 1.1511 | 0.6761 | 9.994495e-10 | 1050 |
| 1.1453 | 0.6776 | 1.1508 | 0.6761 | 9.994485e-10 | 1051 |
| 1.1430 | 0.6753 | 1.1506 | 0.6761 | 9.994475e-10 | 1052 |
| 1.1469 | 0.6753 | 1.1504 | 0.6761 | 9.994465e-10 | 1053 |
| 1.1414 | 0.6729 | 1.1502 | 0.6761 | 9.994455e-10 | 1054 |
| 1.1519 | 0.6800 | 1.1500 | 0.6761 | 9.994445e-10 | 1055 |
| 1.1509 | 0.6753 | 1.1498 | 0.6761 | 9.994434e-10 | 1056 |
| 1.1462 | 0.6776 | 1.1496 | 0.6761 | 9.994423e-10 | 1057 |
| 1.1448 | 0.6753 | 1.1494 | 0.6761 | 9.994412e-10 | 1058 |
| 1.1470 | 0.6706 | 1.1492 | 0.6761 | 9.994401e-10 | 1059 |
| 1.1411 | 0.6753 | 1.1490 | 0.6761 | 9.99439e-10 | 1060 |
| 1.1453 | 0.6753 | 1.1488 | 0.6761 | 9.994379e-10 | 1061 |
| 1.1431 | 0.6753 | 1.1486 | 0.6761 | 9.994368e-10 | 1062 |
| 1.1361 | 0.6753 | 1.1484 | 0.6761 | 9.994356e-10 | 1063 |
| 1.1469 | 0.6729 | 1.1481 | 0.6761 | 9.994345e-10 | 1064 |
| 1.1376 | 0.6753 | 1.1479 | 0.6761 | 9.994334e-10 | 1065 |
| 1.1399 | 0.6706 | 1.1477 | 0.6761 | 9.994323e-10 | 1066 |
| 1.1400 | 0.6776 | 1.1475 | 0.6761 | 9.994312e-10 | 1067 |
| 1.1438 | 0.6776 | 1.1473 | 0.6761 | 9.994301e-10 | 1068 |
| 1.1453 | 0.6729 | 1.1471 | 0.6761 | 9.99429e-10 | 1069 |
| 1.1422 | 0.6776 | 1.1469 | 0.6761 | 9.994279e-10 | 1070 |
| 1.1372 | 0.6753 | 1.1467 | 0.6761 | 9.994268e-10 | 1071 |
| 1.1368 | 0.6776 | 1.1465 | 0.6761 | 9.994257e-10 | 1072 |
| 1.1366 | 0.6753 | 1.1463 | 0.6761 | 9.994245e-10 | 1073 |
| 1.1398 | 0.6729 | 1.1461 | 0.6761 | 9.994234e-10 | 1074 |
| 1.1408 | 0.6824 | 1.1459 | 0.6761 | 9.994223e-10 | 1075 |
| 1.1345 | 0.6753 | 1.1457 | 0.6761 | 9.994212e-10 | 1076 |
| 1.1387 | 0.6776 | 1.1455 | 0.6761 | 9.994201e-10 | 1077 |
| 1.1359 | 0.6776 | 1.1453 | 0.6761 | 9.99419e-10 | 1078 |
| 1.1434 | 0.6776 | 1.1451 | 0.6761 | 9.994179e-10 | 1079 |
| 1.1286 | 0.6729 | 1.1449 | 0.6761 | 9.994168e-10 | 1080 |
| 1.1426 | 0.6800 | 1.1447 | 0.6761 | 9.994157e-10 | 1081 |
| 1.1433 | 0.6729 | 1.1445 | 0.6761 | 9.994146e-10 | 1082 |
| 1.1413 | 0.6776 | 1.1443 | 0.6761 | 9.994134e-10 | 1083 |
| 1.1435 | 0.6729 | 1.1441 | 0.6761 | 9.994123e-10 | 1084 |
| 1.1394 | 0.6776 | 1.1438 | 0.6761 | 9.994112e-10 | 1085 |
| 1.1420 | 0.6776 | 1.1436 | 0.6761 | 9.994101e-10 | 1086 |
| 1.1452 | 0.6753 | 1.1434 | 0.6761 | 9.99409e-10 | 1087 |
| 1.1370 | 0.6824 | 1.1432 | 0.6761 | 9.994079e-10 | 1088 |
| 1.1393 | 0.6776 | 1.1430 | 0.6761 | 9.994068e-10 | 1089 |
| 1.1353 | 0.6800 | 1.1428 | 0.6761 | 9.994057e-10 | 1090 |
| 1.1376 | 0.6753 | 1.1426 | 0.6761 | 9.994046e-10 | 1091 |
| 1.1362 | 0.6729 | 1.1424 | 0.6761 | 9.994034e-10 | 1092 |
| 1.1357 | 0.6800 | 1.1422 | 0.6761 | 9.994023e-10 | 1093 |
| 1.1313 | 0.6776 | 1.1419 | 0.6761 | 9.994012e-10 | 1094 |
| 1.1440 | 0.6753 | 1.1417 | 0.6761 | 9.994001e-10 | 1095 |
| 1.1427 | 0.6776 | 1.1415 | 0.6761 | 9.99399e-10 | 1096 |
| 1.1327 | 0.6800 | 1.1413 | 0.6761 | 9.993979e-10 | 1097 |
| 1.1346 | 0.6800 | 1.1411 | 0.6761 | 9.993968e-10 | 1098 |
| 1.1366 | 0.6729 | 1.1409 | 0.6761 | 9.993957e-10 | 1099 |
| 1.1365 | 0.6776 | 1.1408 | 0.6761 | 9.993946e-10 | 1100 |
| 1.1367 | 0.6800 | 1.1406 | 0.6761 | 9.993935e-10 | 1101 |
| 1.1240 | 0.6776 | 1.1404 | 0.6761 | 9.993923e-10 | 1102 |
| 1.1399 | 0.6776 | 1.1402 | 0.6761 | 9.993912e-10 | 1103 |
| 1.1375 | 0.6776 | 1.1400 | 0.6761 | 9.993901e-10 | 1104 |
| 1.1318 | 0.6776 | 1.1398 | 0.6761 | 9.99389e-10 | 1105 |
| 1.1355 | 0.6776 | 1.1396 | 0.6761 | 9.993879e-10 | 1106 |
| 1.1292 | 0.6729 | 1.1394 | 0.6761 | 9.993868e-10 | 1107 |
| 1.1354 | 0.6753 | 1.1392 | 0.6761 | 9.993857e-10 | 1108 |
| 1.1331 | 0.6800 | 1.1390 | 0.6761 | 9.993846e-10 | 1109 |
| 1.1378 | 0.6800 | 1.1388 | 0.6761 | 9.993835e-10 | 1110 |
| 1.1340 | 0.6800 | 1.1387 | 0.6761 | 9.993824e-10 | 1111 |
| 1.1348 | 0.6776 | 1.1385 | 0.6761 | 9.993812e-10 | 1112 |
| 1.1296 | 0.6824 | 1.1383 | 0.6761 | 9.993801e-10 | 1113 |
| 1.1321 | 0.6753 | 1.1381 | 0.6761 | 9.99379e-10 | 1114 |
| 1.1338 | 0.6776 | 1.1379 | 0.6761 | 9.993779e-10 | 1115 |
| 1.1406 | 0.6776 | 1.1376 | 0.6761 | 9.993768e-10 | 1116 |
| 1.1275 | 0.6776 | 1.1374 | 0.6761 | 9.993757e-10 | 1117 |
| 1.1299 | 0.6776 | 1.1372 | 0.6761 | 9.993746e-10 | 1118 |
| 1.1266 | 0.6753 | 1.1370 | 0.6761 | 9.993735e-10 | 1119 |
| 1.1338 | 0.6800 | 1.1368 | 0.6761 | 9.993724e-10 | 1120 |
| 1.1347 | 0.6753 | 1.1366 | 0.6761 | 9.993713e-10 | 1121 |
| 1.1209 | 0.6753 | 1.1364 | 0.6761 | 9.993701e-10 | 1122 |
| 1.1284 | 0.6776 | 1.1363 | 0.6761 | 9.99369e-10 | 1123 |
| 1.1299 | 0.6729 | 1.1360 | 0.6761 | 9.993679e-10 | 1124 |
| 1.1347 | 0.6776 | 1.1358 | 0.6761 | 9.993668e-10 | 1125 |
| 1.1312 | 0.6776 | 1.1356 | 0.6761 | 9.993657e-10 | 1126 |
| 1.1386 | 0.6753 | 1.1354 | 0.6761 | 9.993646e-10 | 1127 |
| 1.1308 | 0.6706 | 1.1352 | 0.6761 | 9.993635e-10 | 1128 |
| 1.1279 | 0.6776 | 1.1350 | 0.6761 | 9.993624e-10 | 1129 |
| 1.1326 | 0.6776 | 1.1348 | 0.6761 | 9.993613e-10 | 1130 |
| 1.1305 | 0.6776 | 1.1347 | 0.6761 | 9.993602e-10 | 1131 |
| 1.1316 | 0.6776 | 1.1344 | 0.6761 | 9.99359e-10 | 1132 |
| 1.1307 | 0.6800 | 1.1343 | 0.6761 | 9.993579e-10 | 1133 |
| 1.1344 | 0.6753 | 1.1341 | 0.6761 | 9.993568e-10 | 1134 |
| 1.1321 | 0.6776 | 1.1339 | 0.6761 | 9.993557e-10 | 1135 |
| 1.1265 | 0.6729 | 1.1337 | 0.6761 | 9.993546e-10 | 1136 |
| 1.1336 | 0.6753 | 1.1335 | 0.6761 | 9.993535e-10 | 1137 |
| 1.1257 | 0.6776 | 1.1333 | 0.6761 | 9.993524e-10 | 1138 |
| 1.1267 | 0.6824 | 1.1331 | 0.6761 | 9.993513e-10 | 1139 |
| 1.1225 | 0.6753 | 1.1329 | 0.6761 | 9.993502e-10 | 1140 |
| 1.1255 | 0.6753 | 1.1328 | 0.6761 | 9.99349e-10 | 1141 |
| 1.1233 | 0.6776 | 1.1325 | 0.6761 | 9.993479e-10 | 1142 |
| 1.1372 | 0.6753 | 1.1323 | 0.6761 | 9.993468e-10 | 1143 |
| 1.1197 | 0.6776 | 1.1321 | 0.6761 | 9.993457e-10 | 1144 |
| 1.1294 | 0.6776 | 1.1319 | 0.6761 | 9.993446e-10 | 1145 |
| 1.1205 | 0.6824 | 1.1317 | 0.6761 | 9.993435e-10 | 1146 |
| 1.1289 | 0.6729 | 1.1316 | 0.6761 | 9.993424e-10 | 1147 |
| 1.1295 | 0.6776 | 1.1314 | 0.6761 | 9.993413e-10 | 1148 |
| 1.1281 | 0.6776 | 1.1312 | 0.6761 | 9.993402e-10 | 1149 |
| 1.1301 | 0.6776 | 1.1310 | 0.6761 | 9.993391e-10 | 1150 |
| 1.1238 | 0.6776 | 1.1308 | 0.6761 | 9.99338e-10 | 1151 |
| 1.1318 | 0.6753 | 1.1306 | 0.6761 | 9.993368e-10 | 1152 |
| 1.1268 | 0.6776 | 1.1304 | 0.6761 | 9.993357e-10 | 1153 |
| 1.1250 | 0.6776 | 1.1302 | 0.6761 | 9.993346e-10 | 1154 |
| 1.1253 | 0.6776 | 1.1300 | 0.6761 | 9.993335e-10 | 1155 |
| 1.1315 | 0.6800 | 1.1298 | 0.6761 | 9.993324e-10 | 1156 |
| 1.1254 | 0.6776 | 1.1296 | 0.6761 | 9.993313e-10 | 1157 |
| 1.1263 | 0.6776 | 1.1294 | 0.6761 | 9.993302e-10 | 1158 |
| 1.1212 | 0.6753 | 1.1292 | 0.6761 | 9.993291e-10 | 1159 |
| 1.1247 | 0.6753 | 1.1290 | 0.6761 | 9.99328e-10 | 1160 |
| 1.1258 | 0.6800 | 1.1289 | 0.6761 | 9.993268e-10 | 1161 |
| 1.1262 | 0.6753 | 1.1287 | 0.6761 | 9.993257e-10 | 1162 |
| 1.1172 | 0.6776 | 1.1285 | 0.6761 | 9.993246e-10 | 1163 |
| 1.1232 | 0.6776 | 1.1283 | 0.6761 | 9.993235e-10 | 1164 |
| 1.1285 | 0.6776 | 1.1281 | 0.6761 | 9.993224e-10 | 1165 |
| 1.1163 | 0.6776 | 1.1279 | 0.6761 | 9.993213e-10 | 1166 |
| 1.1250 | 0.6776 | 1.1277 | 0.6761 | 9.993201e-10 | 1167 |
| 1.1218 | 0.6800 | 1.1275 | 0.6761 | 9.993188e-10 | 1168 |
| 1.1209 | 0.6753 | 1.1274 | 0.6761 | 9.993176e-10 | 1169 |
| 1.1265 | 0.6776 | 1.1272 | 0.6761 | 9.993164e-10 | 1170 |
| 1.1207 | 0.6753 | 1.1270 | 0.6761 | 9.993152e-10 | 1171 |
| 1.1299 | 0.6776 | 1.1268 | 0.6761 | 9.99314e-10 | 1172 |
| 1.1200 | 0.6776 | 1.1266 | 0.6761 | 9.993127e-10 | 1173 |
| 1.1281 | 0.6776 | 1.1264 | 0.6761 | 9.993115e-10 | 1174 |
| 1.1192 | 0.6776 | 1.1262 | 0.6761 | 9.993103e-10 | 1175 |
| 1.1209 | 0.6776 | 1.1261 | 0.6761 | 9.993091e-10 | 1176 |
| 1.1201 | 0.6776 | 1.1259 | 0.6761 | 9.993079e-10 | 1177 |
| 1.1158 | 0.6776 | 1.1257 | 0.6761 | 9.993066e-10 | 1178 |
| 1.1224 | 0.6776 | 1.1255 | 0.6761 | 9.993054e-10 | 1179 |
| 1.1221 | 0.6776 | 1.1254 | 0.6761 | 9.993042e-10 | 1180 |
| 1.1297 | 0.6776 | 1.1252 | 0.6761 | 9.99303e-10 | 1181 |
| 1.1234 | 0.6776 | 1.1250 | 0.6761 | 9.993018e-10 | 1182 |
| 1.1153 | 0.6753 | 1.1248 | 0.6761 | 9.993005e-10 | 1183 |
| 1.1264 | 0.6753 | 1.1246 | 0.6761 | 9.992993e-10 | 1184 |
| 1.1142 | 0.6776 | 1.1244 | 0.6761 | 9.992981e-10 | 1185 |
| 1.1175 | 0.6776 | 1.1242 | 0.6761 | 9.992969e-10 | 1186 |
| 1.1161 | 0.6776 | 1.1241 | 0.6761 | 9.992956e-10 | 1187 |
| 1.1172 | 0.6776 | 1.1239 | 0.6761 | 9.992944e-10 | 1188 |
| 1.1227 | 0.6776 | 1.1237 | 0.6761 | 9.992932e-10 | 1189 |
| 1.1178 | 0.6776 | 1.1236 | 0.6761 | 9.99292e-10 | 1190 |
| 1.1206 | 0.6753 | 1.1234 | 0.6761 | 9.992908e-10 | 1191 |
| 1.1169 | 0.6776 | 1.1232 | 0.6761 | 9.992895e-10 | 1192 |
| 1.1213 | 0.6800 | 1.1230 | 0.6761 | 9.992883e-10 | 1193 |
| 1.1254 | 0.6753 | 1.1228 | 0.6761 | 9.992871e-10 | 1194 |
| 1.1202 | 0.6753 | 1.1226 | 0.6761 | 9.992859e-10 | 1195 |
| 1.1176 | 0.6753 | 1.1225 | 0.6761 | 9.992847e-10 | 1196 |
| 1.1144 | 0.6776 | 1.1223 | 0.6761 | 9.992834e-10 | 1197 |
| 1.1186 | 0.6800 | 1.1221 | 0.6761 | 9.992822e-10 | 1198 |
| 1.1177 | 0.6776 | 1.1219 | 0.6761 | 9.99281e-10 | 1199 |
| 1.1174 | 0.6776 | 1.1218 | 0.6761 | 9.992798e-10 | 1200 |
| 1.1145 | 0.6776 | 1.1216 | 0.6761 | 9.992785e-10 | 1201 |
| 1.1176 | 0.6753 | 1.1214 | 0.6761 | 9.992773e-10 | 1202 |
| 1.1167 | 0.6776 | 1.1213 | 0.6761 | 9.992761e-10 | 1203 |
| 1.1224 | 0.6776 | 1.1211 | 0.6761 | 9.992749e-10 | 1204 |
| 1.1158 | 0.6800 | 1.1209 | 0.6761 | 9.992737e-10 | 1205 |
| 1.1184 | 0.6753 | 1.1207 | 0.6761 | 9.992724e-10 | 1206 |
| 1.1172 | 0.6776 | 1.1205 | 0.6761 | 9.992712e-10 | 1207 |
| 1.1127 | 0.6776 | 1.1204 | 0.6761 | 9.9927e-10 | 1208 |
| 1.1165 | 0.6776 | 1.1202 | 0.6761 | 9.992688e-10 | 1209 |
| 1.1136 | 0.6776 | 1.1200 | 0.6761 | 9.992676e-10 | 1210 |
| 1.1197 | 0.6729 | 1.1199 | 0.6761 | 9.992663e-10 | 1211 |
| 1.1157 | 0.6776 | 1.1197 | 0.6761 | 9.992651e-10 | 1212 |
| 1.1148 | 0.6800 | 1.1195 | 0.6761 | 9.992639e-10 | 1213 |
| 1.1177 | 0.6776 | 1.1193 | 0.6761 | 9.992627e-10 | 1214 |
| 1.1194 | 0.6753 | 1.1191 | 0.6761 | 9.992615e-10 | 1215 |
| 1.1105 | 0.6776 | 1.1190 | 0.6761 | 9.992602e-10 | 1216 |
| 1.1157 | 0.6776 | 1.1188 | 0.6761 | 9.99259e-10 | 1217 |
| 1.1129 | 0.6776 | 1.1186 | 0.6761 | 9.992578e-10 | 1218 |
| 1.1174 | 0.6776 | 1.1184 | 0.6761 | 9.992566e-10 | 1219 |
| 1.1133 | 0.6776 | 1.1182 | 0.6761 | 9.992553e-10 | 1220 |
| 1.1172 | 0.6776 | 1.1181 | 0.6761 | 9.992541e-10 | 1221 |
| 1.1153 | 0.6776 | 1.1179 | 0.6761 | 9.992529e-10 | 1222 |
| 1.1050 | 0.6776 | 1.1177 | 0.6761 | 9.992517e-10 | 1223 |
| 1.1142 | 0.6776 | 1.1175 | 0.6761 | 9.992505e-10 | 1224 |
| 1.1176 | 0.6776 | 1.1173 | 0.6761 | 9.992492e-10 | 1225 |
| 1.1128 | 0.6776 | 1.1172 | 0.6761 | 9.99248e-10 | 1226 |
| 1.1214 | 0.6776 | 1.1170 | 0.6761 | 9.992468e-10 | 1227 |
| 1.1194 | 0.6776 | 1.1168 | 0.6761 | 9.992456e-10 | 1228 |
| 1.1132 | 0.6776 | 1.1166 | 0.6761 | 9.992444e-10 | 1229 |
| 1.1130 | 0.6800 | 1.1165 | 0.6761 | 9.992431e-10 | 1230 |
| 1.1127 | 0.6776 | 1.1163 | 0.6761 | 9.992419e-10 | 1231 |
| 1.1121 | 0.6776 | 1.1161 | 0.6761 | 9.992407e-10 | 1232 |
| 1.1131 | 0.6776 | 1.1160 | 0.6761 | 9.992395e-10 | 1233 |
| 1.1124 | 0.6776 | 1.1158 | 0.6761 | 9.992382e-10 | 1234 |
| 1.1120 | 0.6800 | 1.1156 | 0.6761 | 9.99237e-10 | 1235 |
| 1.1054 | 0.6776 | 1.1155 | 0.6761 | 9.992358e-10 | 1236 |
| 1.1082 | 0.6776 | 1.1153 | 0.6761 | 9.992346e-10 | 1237 |
| 1.1159 | 0.6753 | 1.1151 | 0.6761 | 9.992334e-10 | 1238 |
| 1.1160 | 0.6776 | 1.1149 | 0.6761 | 9.992321e-10 | 1239 |
| 1.1107 | 0.6800 | 1.1148 | 0.6761 | 9.992309e-10 | 1240 |
| 1.1110 | 0.6776 | 1.1146 | 0.6761 | 9.992297e-10 | 1241 |
| 1.1148 | 0.6776 | 1.1144 | 0.6761 | 9.992285e-10 | 1242 |
| 1.1094 | 0.6776 | 1.1143 | 0.6761 | 9.992273e-10 | 1243 |
| 1.1062 | 0.6776 | 1.1141 | 0.6761 | 9.99226e-10 | 1244 |
| 1.1077 | 0.6776 | 1.1139 | 0.6761 | 9.992248e-10 | 1245 |
| 1.1069 | 0.6800 | 1.1137 | 0.6761 | 9.992236e-10 | 1246 |
| 1.1053 | 0.6776 | 1.1136 | 0.6761 | 9.992224e-10 | 1247 |
| 1.1068 | 0.6776 | 1.1134 | 0.6761 | 9.992212e-10 | 1248 |
| 1.1113 | 0.6800 | 1.1132 | 0.6761 | 9.992199e-10 | 1249 |
| 1.1025 | 0.6776 | 1.1131 | 0.6761 | 9.992187e-10 | 1250 |
| 1.1160 | 0.6776 | 1.1129 | 0.6761 | 9.992175e-10 | 1251 |
| 1.1088 | 0.6753 | 1.1127 | 0.6761 | 9.992163e-10 | 1252 |
| 1.1072 | 0.6776 | 1.1126 | 0.6761 | 9.99215e-10 | 1253 |
| 1.1026 | 0.6776 | 1.1124 | 0.6761 | 9.992138e-10 | 1254 |
| 1.1147 | 0.6776 | 1.1122 | 0.6761 | 9.992126e-10 | 1255 |
| 1.1075 | 0.6776 | 1.1121 | 0.6761 | 9.992114e-10 | 1256 |
| 1.1015 | 0.6776 | 1.1119 | 0.6761 | 9.992102e-10 | 1257 |
| 1.1071 | 0.6776 | 1.1117 | 0.6761 | 9.992089e-10 | 1258 |
| 1.1020 | 0.6776 | 1.1116 | 0.6761 | 9.992077e-10 | 1259 |
| 1.1129 | 0.6753 | 1.1114 | 0.6761 | 9.992065e-10 | 1260 |
| 1.1070 | 0.6776 | 1.1112 | 0.6761 | 9.992053e-10 | 1261 |
| 1.1001 | 0.6776 | 1.1111 | 0.6761 | 9.99204e-10 | 1262 |
| 1.0972 | 0.6776 | 1.1108 | 0.6761 | 9.992028e-10 | 1263 |
| 1.1102 | 0.6753 | 1.1107 | 0.6761 | 9.992016e-10 | 1264 |
| 1.1079 | 0.6776 | 1.1105 | 0.6761 | 9.992004e-10 | 1265 |
| 1.1092 | 0.6776 | 1.1104 | 0.6761 | 9.991992e-10 | 1266 |
| 1.1120 | 0.6776 | 1.1102 | 0.6761 | 9.99198e-10 | 1267 |
| 1.1117 | 0.6776 | 1.1100 | 0.6761 | 9.991967e-10 | 1268 |
| 1.1066 | 0.6776 | 1.1098 | 0.6761 | 9.991955e-10 | 1269 |
| 1.1101 | 0.6776 | 1.1097 | 0.6761 | 9.991943e-10 | 1270 |
| 1.1001 | 0.6776 | 1.1095 | 0.6761 | 9.991931e-10 | 1271 |
| 1.1073 | 0.6776 | 1.1093 | 0.6761 | 9.991918e-10 | 1272 |
| 1.1066 | 0.6776 | 1.1092 | 0.6761 | 9.991906e-10 | 1273 |
| 1.1089 | 0.6776 | 1.1090 | 0.6761 | 9.991894e-10 | 1274 |
| 1.1057 | 0.6776 | 1.1088 | 0.6761 | 9.991882e-10 | 1275 |
| 1.1085 | 0.6776 | 1.1087 | 0.6761 | 9.99187e-10 | 1276 |
| 1.1045 | 0.6776 | 1.1085 | 0.6761 | 9.991857e-10 | 1277 |
| 1.1029 | 0.6776 | 1.1083 | 0.6761 | 9.991844e-10 | 1278 |
| 1.1024 | 0.6776 | 1.1082 | 0.6761 | 9.991831e-10 | 1279 |
| 1.1022 | 0.6753 | 1.1080 | 0.6761 | 9.991817e-10 | 1280 |
| 1.0999 | 0.6776 | 1.1078 | 0.6761 | 9.991804e-10 | 1281 |
| 1.1084 | 0.6776 | 1.1077 | 0.6761 | 9.991791e-10 | 1282 |
| 1.1051 | 0.6776 | 1.1075 | 0.6761 | 9.991777e-10 | 1283 |
| 1.1010 | 0.6776 | 1.1073 | 0.6761 | 9.991764e-10 | 1284 |
| 1.1027 | 0.6776 | 1.1072 | 0.6761 | 9.991751e-10 | 1285 |
| 1.1140 | 0.6776 | 1.1070 | 0.6761 | 9.991737e-10 | 1286 |
| 1.1009 | 0.6776 | 1.1068 | 0.6761 | 9.991724e-10 | 1287 |
| 1.1096 | 0.6753 | 1.1067 | 0.6761 | 9.991711e-10 | 1288 |
| 1.1101 | 0.6776 | 1.1065 | 0.6761 | 9.991697e-10 | 1289 |
| 1.1054 | 0.6800 | 1.1063 | 0.6761 | 9.991684e-10 | 1290 |
| 1.1021 | 0.6776 | 1.1061 | 0.6761 | 9.991671e-10 | 1291 |
| 1.0990 | 0.6776 | 1.1060 | 0.6761 | 9.991658e-10 | 1292 |
| 1.1013 | 0.6776 | 1.1058 | 0.6761 | 9.991644e-10 | 1293 |
| 1.1073 | 0.6776 | 1.1056 | 0.6761 | 9.991631e-10 | 1294 |
| 1.1009 | 0.6776 | 1.1054 | 0.6761 | 9.991618e-10 | 1295 |
| 1.0973 | 0.6776 | 1.1053 | 0.6761 | 9.991604e-10 | 1296 |
| 1.1071 | 0.6776 | 1.1051 | 0.6761 | 9.991591e-10 | 1297 |
| 1.1029 | 0.6776 | 1.1049 | 0.6761 | 9.991578e-10 | 1298 |
| 1.1012 | 0.6776 | 1.1048 | 0.6761 | 9.991564e-10 | 1299 |
| 1.0993 | 0.6776 | 1.1046 | 0.6761 | 9.991551e-10 | 1300 |
| 1.0962 | 0.6776 | 1.1045 | 0.6761 | 9.991538e-10 | 1301 |
| 1.1020 | 0.6753 | 1.1043 | 0.6761 | 9.991524e-10 | 1302 |
| 1.0981 | 0.6776 | 1.1041 | 0.6761 | 9.991511e-10 | 1303 |
| 1.0974 | 0.6776 | 1.1040 | 0.6761 | 9.991498e-10 | 1304 |
| 1.0945 | 0.6776 | 1.1038 | 0.6761 | 9.991484e-10 | 1305 |
| 1.1022 | 0.6776 | 1.1037 | 0.6761 | 9.991471e-10 | 1306 |
| 1.1001 | 0.6776 | 1.1035 | 0.6761 | 9.991458e-10 | 1307 |
| 1.1029 | 0.6753 | 1.1034 | 0.6761 | 9.991444e-10 | 1308 |
| 1.0966 | 0.6776 | 1.1032 | 0.6761 | 9.991431e-10 | 1309 |
| 1.0932 | 0.6776 | 1.1031 | 0.6761 | 9.991418e-10 | 1310 |
| 1.0951 | 0.6776 | 1.1029 | 0.6761 | 9.991404e-10 | 1311 |
| 1.1039 | 0.6729 | 1.1027 | 0.6761 | 9.991391e-10 | 1312 |
| 1.0993 | 0.6776 | 1.1026 | 0.6761 | 9.991378e-10 | 1313 |
| 1.0978 | 0.6776 | 1.1024 | 0.6761 | 9.991364e-10 | 1314 |
| 1.1025 | 0.6776 | 1.1022 | 0.6761 | 9.991351e-10 | 1315 |
| 1.1008 | 0.6776 | 1.1021 | 0.6761 | 9.991338e-10 | 1316 |
| 1.1003 | 0.6776 | 1.1019 | 0.6761 | 9.991324e-10 | 1317 |
| 1.0956 | 0.6776 | 1.1018 | 0.6761 | 9.991311e-10 | 1318 |
| 1.0903 | 0.6776 | 1.1016 | 0.6761 | 9.991298e-10 | 1319 |
| 1.1005 | 0.6776 | 1.1015 | 0.6761 | 9.991284e-10 | 1320 |
| 1.0937 | 0.6776 | 1.1013 | 0.6761 | 9.991271e-10 | 1321 |
| 1.0979 | 0.6776 | 1.1012 | 0.6761 | 9.991258e-10 | 1322 |
| 1.0996 | 0.6776 | 1.1010 | 0.6761 | 9.991244e-10 | 1323 |
| 1.0903 | 0.6776 | 1.1008 | 0.6761 | 9.991231e-10 | 1324 |
| 1.0978 | 0.6776 | 1.1007 | 0.6761 | 9.991218e-10 | 1325 |
| 1.0988 | 0.6776 | 1.1005 | 0.6761 | 9.991205e-10 | 1326 |
| 1.0980 | 0.6776 | 1.1004 | 0.6761 | 9.991191e-10 | 1327 |
| 1.0951 | 0.6776 | 1.1002 | 0.6761 | 9.991178e-10 | 1328 |
| 1.0989 | 0.6776 | 1.1000 | 0.6761 | 9.991165e-10 | 1329 |
| 1.0927 | 0.6776 | 1.0999 | 0.6761 | 9.991151e-10 | 1330 |
| 1.0897 | 0.6776 | 1.0997 | 0.6761 | 9.991138e-10 | 1331 |
| 1.0971 | 0.6776 | 1.0996 | 0.6761 | 9.991125e-10 | 1332 |
| 1.0936 | 0.6776 | 1.0994 | 0.6761 | 9.991111e-10 | 1333 |
| 1.0961 | 0.6776 | 1.0993 | 0.6761 | 9.991098e-10 | 1334 |
| 1.0967 | 0.6776 | 1.0991 | 0.6761 | 9.991085e-10 | 1335 |
| 1.0967 | 0.6776 | 1.0989 | 0.6761 | 9.991071e-10 | 1336 |
| 1.0969 | 0.6776 | 1.0988 | 0.6761 | 9.991058e-10 | 1337 |
| 1.0955 | 0.6776 | 1.0986 | 0.6761 | 9.991045e-10 | 1338 |
| 1.0953 | 0.6776 | 1.0985 | 0.6761 | 9.991031e-10 | 1339 |
| 1.0908 | 0.6776 | 1.0983 | 0.6761 | 9.991018e-10 | 1340 |
| 1.0892 | 0.6776 | 1.0982 | 0.6761 | 9.991005e-10 | 1341 |
| 1.0952 | 0.6776 | 1.0980 | 0.6761 | 9.990991e-10 | 1342 |
| 1.0936 | 0.6776 | 1.0979 | 0.6761 | 9.990978e-10 | 1343 |
| 1.0861 | 0.6776 | 1.0977 | 0.6761 | 9.990965e-10 | 1344 |
| 1.0982 | 0.6776 | 1.0976 | 0.6761 | 9.990951e-10 | 1345 |
| 1.0917 | 0.6776 | 1.0974 | 0.6761 | 9.990938e-10 | 1346 |
| 1.0999 | 0.6776 | 1.0972 | 0.6761 | 9.990925e-10 | 1347 |
| 1.1003 | 0.6776 | 1.0971 | 0.6761 | 9.990911e-10 | 1348 |
| 1.0971 | 0.6776 | 1.0969 | 0.6761 | 9.990898e-10 | 1349 |
| 1.0964 | 0.6800 | 1.0968 | 0.6761 | 9.990885e-10 | 1350 |
| 1.0960 | 0.6753 | 1.0966 | 0.6761 | 9.990871e-10 | 1351 |
| 1.0934 | 0.6776 | 1.0965 | 0.6761 | 9.990858e-10 | 1352 |
| 1.0920 | 0.6776 | 1.0963 | 0.6761 | 9.990845e-10 | 1353 |
| 1.0894 | 0.6776 | 1.0961 | 0.6761 | 9.990831e-10 | 1354 |
| 1.0977 | 0.6776 | 1.0960 | 0.6761 | 9.990818e-10 | 1355 |
| 1.0929 | 0.6776 | 1.0958 | 0.6761 | 9.990805e-10 | 1356 |
| 1.0921 | 0.6776 | 1.0956 | 0.6761 | 9.990792e-10 | 1357 |
| 1.0914 | 0.6776 | 1.0955 | 0.6761 | 9.990778e-10 | 1358 |
| 1.0893 | 0.6776 | 1.0953 | 0.6761 | 9.990765e-10 | 1359 |
| 1.0921 | 0.6776 | 1.0951 | 0.6761 | 9.990752e-10 | 1360 |
| 1.0900 | 0.6776 | 1.0950 | 0.6761 | 9.990738e-10 | 1361 |
| 1.0957 | 0.6776 | 1.0948 | 0.6761 | 9.990725e-10 | 1362 |
| 1.0860 | 0.6776 | 1.0946 | 0.6761 | 9.990712e-10 | 1363 |
| 1.0897 | 0.6776 | 1.0945 | 0.6761 | 9.990698e-10 | 1364 |
| 1.0904 | 0.6776 | 1.0944 | 0.6761 | 9.990685e-10 | 1365 |
| 1.0813 | 0.6800 | 1.0942 | 0.6761 | 9.990672e-10 | 1366 |
| 1.0883 | 0.6776 | 1.0941 | 0.6761 | 9.990658e-10 | 1367 |
| 1.0852 | 0.6776 | 1.0939 | 0.6761 | 9.990645e-10 | 1368 |
| 1.0886 | 0.6776 | 1.0937 | 0.6761 | 9.990632e-10 | 1369 |
| 1.0886 | 0.6776 | 1.0936 | 0.6761 | 9.990618e-10 | 1370 |
| 1.0910 | 0.6776 | 1.0935 | 0.6761 | 9.990605e-10 | 1371 |
| 1.0850 | 0.6776 | 1.0933 | 0.6761 | 9.990592e-10 | 1372 |
| 1.0883 | 0.6776 | 1.0931 | 0.6761 | 9.990578e-10 | 1373 |
| 1.0869 | 0.6776 | 1.0930 | 0.6761 | 9.990565e-10 | 1374 |
| 1.0963 | 0.6776 | 1.0928 | 0.6761 | 9.990552e-10 | 1375 |
| 1.0957 | 0.6776 | 1.0927 | 0.6761 | 9.990538e-10 | 1376 |
| 1.0958 | 0.6776 | 1.0925 | 0.6761 | 9.990525e-10 | 1377 |
| 1.0871 | 0.6776 | 1.0924 | 0.6761 | 9.990512e-10 | 1378 |
| 1.0893 | 0.6776 | 1.0922 | 0.6761 | 9.990498e-10 | 1379 |
| 1.0895 | 0.6776 | 1.0921 | 0.6761 | 9.990485e-10 | 1380 |
| 1.0850 | 0.6776 | 1.0919 | 0.6761 | 9.990472e-10 | 1381 |
| 1.0873 | 0.6776 | 1.0918 | 0.6761 | 9.990458e-10 | 1382 |
| 1.0825 | 0.6776 | 1.0916 | 0.6761 | 9.990445e-10 | 1383 |
| 1.0843 | 0.6776 | 1.0915 | 0.6761 | 9.990432e-10 | 1384 |
| 1.0910 | 0.6776 | 1.0913 | 0.6761 | 9.990418e-10 | 1385 |
| 1.0804 | 0.6776 | 1.0911 | 0.6761 | 9.990405e-10 | 1386 |
| 1.0891 | 0.6776 | 1.0910 | 0.6761 | 9.990392e-10 | 1387 |
| 1.0881 | 0.6776 | 1.0909 | 0.6761 | 9.990379e-10 | 1388 |
| 1.0815 | 0.6776 | 1.0907 | 0.6761 | 9.990365e-10 | 1389 |
| 1.0883 | 0.6776 | 1.0906 | 0.6761 | 9.990351e-10 | 1390 |
| 1.0828 | 0.6776 | 1.0904 | 0.6761 | 9.990336e-10 | 1391 |
| 1.0886 | 0.6776 | 1.0903 | 0.6761 | 9.990322e-10 | 1392 |
| 1.0794 | 0.6776 | 1.0901 | 0.6761 | 9.990307e-10 | 1393 |
| 1.0863 | 0.6776 | 1.0900 | 0.6761 | 9.990293e-10 | 1394 |
| 1.0881 | 0.6776 | 1.0898 | 0.6761 | 9.990279e-10 | 1395 |
| 1.0957 | 0.6776 | 1.0897 | 0.6761 | 9.990264e-10 | 1396 |
| 1.0829 | 0.6776 | 1.0895 | 0.6761 | 9.99025e-10 | 1397 |
| 1.0841 | 0.6776 | 1.0894 | 0.6761 | 9.990235e-10 | 1398 |
| 1.0898 | 0.6776 | 1.0893 | 0.6761 | 9.990221e-10 | 1399 |
| 1.0893 | 0.6776 | 1.0891 | 0.6761 | 9.990206e-10 | 1400 |
| 1.0829 | 0.6776 | 1.0890 | 0.6761 | 9.990192e-10 | 1401 |
| 1.0873 | 0.6776 | 1.0888 | 0.6761 | 9.990178e-10 | 1402 |
| 1.0798 | 0.6776 | 1.0887 | 0.6761 | 9.990163e-10 | 1403 |
| 1.0830 | 0.6776 | 1.0885 | 0.6761 | 9.990149e-10 | 1404 |
| 1.0862 | 0.6776 | 1.0884 | 0.6761 | 9.990134e-10 | 1405 |
| 1.0864 | 0.6800 | 1.0882 | 0.6761 | 9.99012e-10 | 1406 |
| 1.0871 | 0.6776 | 1.0881 | 0.6761 | 9.990105e-10 | 1407 |
| 1.0865 | 0.6776 | 1.0880 | 0.6761 | 9.990091e-10 | 1408 |
| 1.0880 | 0.6776 | 1.0878 | 0.6761 | 9.990077e-10 | 1409 |
| 1.0814 | 0.6776 | 1.0877 | 0.6761 | 9.990062e-10 | 1410 |
| 1.0829 | 0.6776 | 1.0875 | 0.6761 | 9.990048e-10 | 1411 |
| 1.0859 | 0.6776 | 1.0874 | 0.6761 | 9.990033e-10 | 1412 |
| 1.0792 | 0.6776 | 1.0872 | 0.6761 | 9.990019e-10 | 1413 |
| 1.0849 | 0.6776 | 1.0871 | 0.6761 | 9.990004e-10 | 1414 |
| 1.0806 | 0.6776 | 1.0869 | 0.6761 | 9.98999e-10 | 1415 |
| 1.0845 | 0.6776 | 1.0868 | 0.6761 | 9.989976e-10 | 1416 |
| 1.0839 | 0.6776 | 1.0867 | 0.6761 | 9.989961e-10 | 1417 |
| 1.0806 | 0.6776 | 1.0865 | 0.6761 | 9.989947e-10 | 1418 |
| 1.0877 | 0.6776 | 1.0864 | 0.6761 | 9.989932e-10 | 1419 |
| 1.0852 | 0.6776 | 1.0862 | 0.6761 | 9.989918e-10 | 1420 |
| 1.0835 | 0.6776 | 1.0860 | 0.6761 | 9.989903e-10 | 1421 |
| 1.0792 | 0.6776 | 1.0859 | 0.6761 | 9.989889e-10 | 1422 |
| 1.0747 | 0.6776 | 1.0857 | 0.6761 | 9.989874e-10 | 1423 |
| 1.0839 | 0.6776 | 1.0856 | 0.6761 | 9.98986e-10 | 1424 |
| 1.0873 | 0.6776 | 1.0854 | 0.6761 | 9.989846e-10 | 1425 |
| 1.0782 | 0.6776 | 1.0853 | 0.6761 | 9.989831e-10 | 1426 |
| 1.0831 | 0.6776 | 1.0851 | 0.6761 | 9.989817e-10 | 1427 |
| 1.0837 | 0.6776 | 1.0850 | 0.6761 | 9.989802e-10 | 1428 |
| 1.0816 | 0.6776 | 1.0849 | 0.6761 | 9.989788e-10 | 1429 |
| 1.0799 | 0.6776 | 1.0847 | 0.6761 | 9.989773e-10 | 1430 |
| 1.0828 | 0.6776 | 1.0846 | 0.6761 | 9.989759e-10 | 1431 |
| 1.0823 | 0.6776 | 1.0845 | 0.6761 | 9.989745e-10 | 1432 |
| 1.0850 | 0.6776 | 1.0843 | 0.6761 | 9.98973e-10 | 1433 |
| 1.0770 | 0.6776 | 1.0842 | 0.6761 | 9.989716e-10 | 1434 |
| 1.0794 | 0.6776 | 1.0840 | 0.6761 | 9.989701e-10 | 1435 |
| 1.0746 | 0.6776 | 1.0839 | 0.6761 | 9.989687e-10 | 1436 |
| 1.0856 | 0.6776 | 1.0838 | 0.6761 | 9.989672e-10 | 1437 |
| 1.0883 | 0.6776 | 1.0836 | 0.6761 | 9.989658e-10 | 1438 |
| 1.0844 | 0.6776 | 1.0835 | 0.6761 | 9.989644e-10 | 1439 |
| 1.0852 | 0.6776 | 1.0833 | 0.6761 | 9.989629e-10 | 1440 |
| 1.0800 | 0.6776 | 1.0832 | 0.6761 | 9.989615e-10 | 1441 |
| 1.0741 | 0.6776 | 1.0830 | 0.6761 | 9.9896e-10 | 1442 |
| 1.0807 | 0.6776 | 1.0829 | 0.6761 | 9.989586e-10 | 1443 |
| 1.0822 | 0.6776 | 1.0828 | 0.6761 | 9.989571e-10 | 1444 |
| 1.0746 | 0.6776 | 1.0826 | 0.6761 | 9.989557e-10 | 1445 |
| 1.0850 | 0.6776 | 1.0825 | 0.6761 | 9.989543e-10 | 1446 |
| 1.0741 | 0.6776 | 1.0823 | 0.6761 | 9.989528e-10 | 1447 |
| 1.0780 | 0.6776 | 1.0822 | 0.6761 | 9.989514e-10 | 1448 |
| 1.0883 | 0.6776 | 1.0821 | 0.6761 | 9.989499e-10 | 1449 |
| 1.0746 | 0.6776 | 1.0819 | 0.6761 | 9.989485e-10 | 1450 |
| 1.0787 | 0.6776 | 1.0818 | 0.6761 | 9.98947e-10 | 1451 |
| 1.0776 | 0.6776 | 1.0816 | 0.6761 | 9.989456e-10 | 1452 |
| 1.0799 | 0.6776 | 1.0815 | 0.6761 | 9.989441e-10 | 1453 |
| 1.0829 | 0.6776 | 1.0813 | 0.6761 | 9.989427e-10 | 1454 |
| 1.0738 | 0.6776 | 1.0812 | 0.6761 | 9.989413e-10 | 1455 |
| 1.0773 | 0.6776 | 1.0811 | 0.6761 | 9.989398e-10 | 1456 |
| 1.0717 | 0.6776 | 1.0809 | 0.6761 | 9.989384e-10 | 1457 |
| 1.0703 | 0.6776 | 1.0808 | 0.6761 | 9.989369e-10 | 1458 |
| 1.0763 | 0.6776 | 1.0806 | 0.6761 | 9.989355e-10 | 1459 |
| 1.0753 | 0.6776 | 1.0805 | 0.6761 | 9.98934e-10 | 1460 |
| 1.0755 | 0.6776 | 1.0804 | 0.6761 | 9.989326e-10 | 1461 |
| 1.0786 | 0.6776 | 1.0802 | 0.6761 | 9.989312e-10 | 1462 |
| 1.0673 | 0.6776 | 1.0801 | 0.6761 | 9.989297e-10 | 1463 |
| 1.0755 | 0.6776 | 1.0799 | 0.6761 | 9.989283e-10 | 1464 |
| 1.0762 | 0.6776 | 1.0798 | 0.6761 | 9.989268e-10 | 1465 |
| 1.0810 | 0.6776 | 1.0797 | 0.6761 | 9.989254e-10 | 1466 |
| 1.0683 | 0.6776 | 1.0795 | 0.6761 | 9.989239e-10 | 1467 |
| 1.0749 | 0.6776 | 1.0794 | 0.6761 | 9.989225e-10 | 1468 |
| 1.0807 | 0.6776 | 1.0792 | 0.6761 | 9.989211e-10 | 1469 |
| 1.0775 | 0.6776 | 1.0791 | 0.6761 | 9.989196e-10 | 1470 |
| 1.0732 | 0.6776 | 1.0790 | 0.6761 | 9.989182e-10 | 1471 |
| 1.0827 | 0.6776 | 1.0788 | 0.6761 | 9.989167e-10 | 1472 |
| 1.0756 | 0.6776 | 1.0787 | 0.6761 | 9.989153e-10 | 1473 |
| 1.0787 | 0.6776 | 1.0785 | 0.6761 | 9.989138e-10 | 1474 |
| 1.0709 | 0.6776 | 1.0784 | 0.6761 | 9.989124e-10 | 1475 |
| 1.0780 | 0.6776 | 1.0783 | 0.6761 | 9.98911e-10 | 1476 |
| 1.0693 | 0.6776 | 1.0781 | 0.6761 | 9.989095e-10 | 1477 |
| 1.0748 | 0.6776 | 1.0780 | 0.6761 | 9.989081e-10 | 1478 |
| 1.0803 | 0.6776 | 1.0779 | 0.6761 | 9.989066e-10 | 1479 |
| 1.0708 | 0.6776 | 1.0777 | 0.6761 | 9.989052e-10 | 1480 |
| 1.0707 | 0.6776 | 1.0776 | 0.6761 | 9.989037e-10 | 1481 |
| 1.0772 | 0.6776 | 1.0774 | 0.6761 | 9.989023e-10 | 1482 |
| 1.0706 | 0.6776 | 1.0773 | 0.6761 | 9.989009e-10 | 1483 |
| 1.0739 | 0.6776 | 1.0772 | 0.6761 | 9.988994e-10 | 1484 |
| 1.0749 | 0.6776 | 1.0770 | 0.6761 | 9.98898e-10 | 1485 |
| 1.0661 | 0.6776 | 1.0769 | 0.6761 | 9.988965e-10 | 1486 |
| 1.0749 | 0.6776 | 1.0767 | 0.6761 | 9.988951e-10 | 1487 |
| 1.0769 | 0.6776 | 1.0766 | 0.6761 | 9.988936e-10 | 1488 |
| 1.0761 | 0.6776 | 1.0765 | 0.6761 | 9.988922e-10 | 1489 |
| 1.0666 | 0.6776 | 1.0763 | 0.6761 | 9.988907e-10 | 1490 |
| 1.0730 | 0.6776 | 1.0762 | 0.6761 | 9.988893e-10 | 1491 |
| 1.0783 | 0.6776 | 1.0761 | 0.6761 | 9.988879e-10 | 1492 |
| 1.0793 | 0.6776 | 1.0759 | 0.6761 | 9.988864e-10 | 1493 |
| 1.0773 | 0.6776 | 1.0758 | 0.6761 | 9.98885e-10 | 1494 |
| 1.0714 | 0.6776 | 1.0757 | 0.6761 | 9.988835e-10 | 1495 |
| 1.0797 | 0.6776 | 1.0755 | 0.6761 | 9.988821e-10 | 1496 |
| 1.0650 | 0.6776 | 1.0754 | 0.6761 | 9.988806e-10 | 1497 |
| 1.0620 | 0.6776 | 1.0753 | 0.6761 | 9.988792e-10 | 1498 |
| 1.0776 | 0.6776 | 1.0751 | 0.6761 | 9.988778e-10 | 1499 |
| 1.0742 | 0.6776 | 1.0750 | 0.6761 | 9.988763e-10 | 1500 |
| 1.0659 | 0.6776 | 1.0748 | 0.6761 | 9.988748e-10 | 1501 |
| 1.0591 | 0.6776 | 1.0747 | 0.6761 | 9.988732e-10 | 1502 |
| 1.0664 | 0.6776 | 1.0746 | 0.6761 | 9.988717e-10 | 1503 |
| 1.0702 | 0.6776 | 1.0744 | 0.6761 | 9.988701e-10 | 1504 |
| 1.0650 | 0.6776 | 1.0743 | 0.6761 | 9.988685e-10 | 1505 |
| 1.0730 | 0.6776 | 1.0742 | 0.6761 | 9.98867e-10 | 1506 |
| 1.0745 | 0.6776 | 1.0741 | 0.6761 | 9.988654e-10 | 1507 |
| 1.0697 | 0.6776 | 1.0739 | 0.6761 | 9.988639e-10 | 1508 |
| 1.0746 | 0.6776 | 1.0738 | 0.6761 | 9.988623e-10 | 1509 |
| 1.0678 | 0.6776 | 1.0737 | 0.6761 | 9.988608e-10 | 1510 |
| 1.0742 | 0.6776 | 1.0735 | 0.6761 | 9.988592e-10 | 1511 |
| 1.0745 | 0.6776 | 1.0734 | 0.6761 | 9.988577e-10 | 1512 |
| 1.0698 | 0.6776 | 1.0732 | 0.6761 | 9.988561e-10 | 1513 |
| 1.0669 | 0.6776 | 1.0731 | 0.6761 | 9.988546e-10 | 1514 |
| 1.0704 | 0.6776 | 1.0730 | 0.6761 | 9.98853e-10 | 1515 |
| 1.0751 | 0.6776 | 1.0728 | 0.6761 | 9.988514e-10 | 1516 |
| 1.0652 | 0.6776 | 1.0727 | 0.6761 | 9.988499e-10 | 1517 |
| 1.0696 | 0.6776 | 1.0726 | 0.6761 | 9.988483e-10 | 1518 |
| 1.0690 | 0.6776 | 1.0725 | 0.6761 | 9.988468e-10 | 1519 |
| 1.0723 | 0.6776 | 1.0723 | 0.6761 | 9.988452e-10 | 1520 |
| 1.0653 | 0.6776 | 1.0722 | 0.6761 | 9.988437e-10 | 1521 |
| 1.0638 | 0.6776 | 1.0721 | 0.6761 | 9.988421e-10 | 1522 |
| 1.0734 | 0.6776 | 1.0719 | 0.6761 | 9.988406e-10 | 1523 |
| 1.0714 | 0.6776 | 1.0718 | 0.6761 | 9.98839e-10 | 1524 |
| 1.0757 | 0.6776 | 1.0717 | 0.6761 | 9.988375e-10 | 1525 |
| 1.0666 | 0.6776 | 1.0715 | 0.6761 | 9.988359e-10 | 1526 |
| 1.0631 | 0.6776 | 1.0714 | 0.6761 | 9.988343e-10 | 1527 |
| 1.0668 | 0.6776 | 1.0713 | 0.6761 | 9.988328e-10 | 1528 |
| 1.0602 | 0.6776 | 1.0712 | 0.6761 | 9.988312e-10 | 1529 |
| 1.0670 | 0.6776 | 1.0710 | 0.6761 | 9.988297e-10 | 1530 |
| 1.0698 | 0.6776 | 1.0709 | 0.6761 | 9.988281e-10 | 1531 |
| 1.0684 | 0.6776 | 1.0708 | 0.6761 | 9.988266e-10 | 1532 |
| 1.0642 | 0.6776 | 1.0707 | 0.6761 | 9.98825e-10 | 1533 |
| 1.0659 | 0.6776 | 1.0705 | 0.6761 | 9.988235e-10 | 1534 |
| 1.0724 | 0.6776 | 1.0704 | 0.6761 | 9.988219e-10 | 1535 |
| 1.0723 | 0.6776 | 1.0703 | 0.6761 | 9.988204e-10 | 1536 |
| 1.0665 | 0.6776 | 1.0701 | 0.6761 | 9.988188e-10 | 1537 |
| 1.0752 | 0.6776 | 1.0700 | 0.6761 | 9.988173e-10 | 1538 |
| 1.0636 | 0.6776 | 1.0699 | 0.6761 | 9.988157e-10 | 1539 |
| 1.0631 | 0.6776 | 1.0698 | 0.6761 | 9.988141e-10 | 1540 |
| 1.0652 | 0.6776 | 1.0696 | 0.6761 | 9.988126e-10 | 1541 |
| 1.0651 | 0.6776 | 1.0695 | 0.6761 | 9.98811e-10 | 1542 |
| 1.0697 | 0.6776 | 1.0694 | 0.6761 | 9.988095e-10 | 1543 |
| 1.0676 | 0.6776 | 1.0692 | 0.6761 | 9.988079e-10 | 1544 |
| 1.0636 | 0.6776 | 1.0691 | 0.6761 | 9.988064e-10 | 1545 |
| 1.0557 | 0.6776 | 1.0690 | 0.6761 | 9.988048e-10 | 1546 |
| 1.0598 | 0.6776 | 1.0689 | 0.6761 | 9.988033e-10 | 1547 |
| 1.0648 | 0.6776 | 1.0687 | 0.6761 | 9.988017e-10 | 1548 |
| 1.0655 | 0.6776 | 1.0686 | 0.6761 | 9.988002e-10 | 1549 |
| 1.0632 | 0.6776 | 1.0685 | 0.6761 | 9.987986e-10 | 1550 |
| 1.0656 | 0.6776 | 1.0683 | 0.6761 | 9.98797e-10 | 1551 |
| 1.0694 | 0.6776 | 1.0682 | 0.6761 | 9.987955e-10 | 1552 |
| 1.0576 | 0.6776 | 1.0681 | 0.6761 | 9.987939e-10 | 1553 |
| 1.0724 | 0.6776 | 1.0679 | 0.6761 | 9.987924e-10 | 1554 |
| 1.0685 | 0.6776 | 1.0678 | 0.6761 | 9.987908e-10 | 1555 |
| 1.0603 | 0.6776 | 1.0676 | 0.6761 | 9.987893e-10 | 1556 |
| 1.0560 | 0.6776 | 1.0675 | 0.6761 | 9.987877e-10 | 1557 |
| 1.0724 | 0.6776 | 1.0674 | 0.6761 | 9.987862e-10 | 1558 |
| 1.0657 | 0.6776 | 1.0673 | 0.6761 | 9.987846e-10 | 1559 |
| 1.0633 | 0.6776 | 1.0671 | 0.6761 | 9.987831e-10 | 1560 |
| 1.0629 | 0.6776 | 1.0670 | 0.6761 | 9.987815e-10 | 1561 |
| 1.0608 | 0.6776 | 1.0669 | 0.6761 | 9.9878e-10 | 1562 |
| 1.0693 | 0.6776 | 1.0668 | 0.6761 | 9.987784e-10 | 1563 |
| 1.0568 | 0.6776 | 1.0667 | 0.6761 | 9.987768e-10 | 1564 |
| 1.0606 | 0.6776 | 1.0665 | 0.6761 | 9.987753e-10 | 1565 |
| 1.0658 | 0.6776 | 1.0664 | 0.6761 | 9.987737e-10 | 1566 |
| 1.0591 | 0.6776 | 1.0663 | 0.6761 | 9.987722e-10 | 1567 |
| 1.0644 | 0.6776 | 1.0662 | 0.6761 | 9.987706e-10 | 1568 |
| 1.0561 | 0.6776 | 1.0660 | 0.6761 | 9.987691e-10 | 1569 |
| 1.0650 | 0.6776 | 1.0659 | 0.6761 | 9.987675e-10 | 1570 |
| 1.0640 | 0.6776 | 1.0658 | 0.6761 | 9.98766e-10 | 1571 |
| 1.0596 | 0.6776 | 1.0657 | 0.6761 | 9.987644e-10 | 1572 |
| 1.0599 | 0.6776 | 1.0656 | 0.6761 | 9.987629e-10 | 1573 |
| 1.0636 | 0.6776 | 1.0654 | 0.6761 | 9.987613e-10 | 1574 |
| 1.0643 | 0.6776 | 1.0653 | 0.6761 | 9.987597e-10 | 1575 |
| 1.0606 | 0.6776 | 1.0652 | 0.6761 | 9.987582e-10 | 1576 |
| 1.0664 | 0.6776 | 1.0651 | 0.6761 | 9.987566e-10 | 1577 |
| 1.0673 | 0.6776 | 1.0649 | 0.6761 | 9.987551e-10 | 1578 |
| 1.0585 | 0.6776 | 1.0648 | 0.6761 | 9.987535e-10 | 1579 |
| 1.0593 | 0.6776 | 1.0647 | 0.6761 | 9.98752e-10 | 1580 |
| 1.0624 | 0.6776 | 1.0646 | 0.6761 | 9.987504e-10 | 1581 |
| 1.0590 | 0.6776 | 1.0644 | 0.6761 | 9.987489e-10 | 1582 |
| 1.0607 | 0.6776 | 1.0643 | 0.6761 | 9.987473e-10 | 1583 |
| 1.0612 | 0.6776 | 1.0642 | 0.6761 | 9.987458e-10 | 1584 |
| 1.0587 | 0.6776 | 1.0641 | 0.6761 | 9.987442e-10 | 1585 |
| 1.0631 | 0.6776 | 1.0640 | 0.6761 | 9.987426e-10 | 1586 |
| 1.0626 | 0.6776 | 1.0639 | 0.6761 | 9.987411e-10 | 1587 |
| 1.0675 | 0.6776 | 1.0637 | 0.6761 | 9.987395e-10 | 1588 |
| 1.0618 | 0.6776 | 1.0636 | 0.6761 | 9.98738e-10 | 1589 |
| 1.0542 | 0.6776 | 1.0635 | 0.6761 | 9.987364e-10 | 1590 |
| 1.0560 | 0.6776 | 1.0634 | 0.6761 | 9.987349e-10 | 1591 |
| 1.0617 | 0.6776 | 1.0633 | 0.6761 | 9.987333e-10 | 1592 |
| 1.0553 | 0.6776 | 1.0631 | 0.6761 | 9.987318e-10 | 1593 |
| 1.0613 | 0.6776 | 1.0630 | 0.6761 | 9.987302e-10 | 1594 |
| 1.0562 | 0.6776 | 1.0629 | 0.6761 | 9.987287e-10 | 1595 |
| 1.0539 | 0.6776 | 1.0628 | 0.6761 | 9.987271e-10 | 1596 |
| 1.0569 | 0.6776 | 1.0627 | 0.6761 | 9.987255e-10 | 1597 |
| 1.0600 | 0.6776 | 1.0626 | 0.6761 | 9.98724e-10 | 1598 |
| 1.0601 | 0.6776 | 1.0624 | 0.6761 | 9.987224e-10 | 1599 |
| 1.0600 | 0.6776 | 1.0623 | 0.6761 | 9.987209e-10 | 1600 |
| 1.0604 | 0.6776 | 1.0622 | 0.6761 | 9.987193e-10 | 1601 |
| 1.0583 | 0.6776 | 1.0621 | 0.6761 | 9.987178e-10 | 1602 |
| 1.0590 | 0.6776 | 1.0620 | 0.6761 | 9.987162e-10 | 1603 |
| 1.0651 | 0.6776 | 1.0618 | 0.6761 | 9.987147e-10 | 1604 |
| 1.0613 | 0.6776 | 1.0617 | 0.6761 | 9.987131e-10 | 1605 |
| 1.0558 | 0.6776 | 1.0616 | 0.6761 | 9.987116e-10 | 1606 |
| 1.0591 | 0.6776 | 1.0615 | 0.6761 | 9.9871e-10 | 1607 |
| 1.0558 | 0.6776 | 1.0614 | 0.6761 | 9.987084e-10 | 1608 |
| 1.0588 | 0.6776 | 1.0613 | 0.6761 | 9.987069e-10 | 1609 |
| 1.0537 | 0.6776 | 1.0612 | 0.6761 | 9.987053e-10 | 1610 |
| 1.0614 | 0.6776 | 1.0610 | 0.6761 | 9.987038e-10 | 1611 |
| 1.0573 | 0.6776 | 1.0609 | 0.6761 | 9.987021e-10 | 1612 |
| 1.0593 | 0.6776 | 1.0608 | 0.6761 | 9.987005e-10 | 1613 |
| 1.0532 | 0.6776 | 1.0607 | 0.6761 | 9.986988e-10 | 1614 |
| 1.0576 | 0.6776 | 1.0605 | 0.6761 | 9.986971e-10 | 1615 |
| 1.0578 | 0.6776 | 1.0604 | 0.6761 | 9.986955e-10 | 1616 |
| 1.0539 | 0.6776 | 1.0603 | 0.6761 | 9.986938e-10 | 1617 |
| 1.0490 | 0.6776 | 1.0602 | 0.6761 | 9.986921e-10 | 1618 |
| 1.0582 | 0.6776 | 1.0601 | 0.6761 | 9.986905e-10 | 1619 |
| 1.0604 | 0.6776 | 1.0600 | 0.6761 | 9.986888e-10 | 1620 |
| 1.0565 | 0.6776 | 1.0598 | 0.6761 | 9.986871e-10 | 1621 |
| 1.0541 | 0.6776 | 1.0597 | 0.6761 | 9.986855e-10 | 1622 |
| 1.0560 | 0.6776 | 1.0596 | 0.6761 | 9.986838e-10 | 1623 |
| 1.0582 | 0.6776 | 1.0595 | 0.6761 | 9.986821e-10 | 1624 |
| 1.0549 | 0.6776 | 1.0594 | 0.6761 | 9.986805e-10 | 1625 |
| 1.0571 | 0.6776 | 1.0593 | 0.6761 | 9.986788e-10 | 1626 |
| 1.0468 | 0.6776 | 1.0591 | 0.6761 | 9.986771e-10 | 1627 |
| 1.0489 | 0.6776 | 1.0590 | 0.6761 | 9.986755e-10 | 1628 |
| 1.0514 | 0.6776 | 1.0589 | 0.6761 | 9.986738e-10 | 1629 |
| 1.0626 | 0.6776 | 1.0588 | 0.6761 | 9.986721e-10 | 1630 |
| 1.0538 | 0.6776 | 1.0587 | 0.6761 | 9.986705e-10 | 1631 |
| 1.0548 | 0.6776 | 1.0586 | 0.6761 | 9.986688e-10 | 1632 |
| 1.0507 | 0.6776 | 1.0585 | 0.6761 | 9.986671e-10 | 1633 |
| 1.0601 | 0.6776 | 1.0583 | 0.6761 | 9.986655e-10 | 1634 |
| 1.0547 | 0.6776 | 1.0582 | 0.6761 | 9.986638e-10 | 1635 |
| 1.0492 | 0.6776 | 1.0581 | 0.6761 | 9.986622e-10 | 1636 |
| 1.0556 | 0.6776 | 1.0580 | 0.6761 | 9.986605e-10 | 1637 |
| 1.0500 | 0.6776 | 1.0578 | 0.6761 | 9.986588e-10 | 1638 |
| 1.0509 | 0.6776 | 1.0577 | 0.6761 | 9.986572e-10 | 1639 |
| 1.0614 | 0.6776 | 1.0576 | 0.6761 | 9.986555e-10 | 1640 |
| 1.0519 | 0.6776 | 1.0575 | 0.6761 | 9.986538e-10 | 1641 |
| 1.0521 | 0.6776 | 1.0574 | 0.6761 | 9.986522e-10 | 1642 |
| 1.0534 | 0.6776 | 1.0573 | 0.6761 | 9.986505e-10 | 1643 |
| 1.0604 | 0.6776 | 1.0572 | 0.6761 | 9.986488e-10 | 1644 |
| 1.0557 | 0.6776 | 1.0571 | 0.6761 | 9.986472e-10 | 1645 |
| 1.0628 | 0.6776 | 1.0569 | 0.6761 | 9.986455e-10 | 1646 |
| 1.0532 | 0.6776 | 1.0568 | 0.6761 | 9.986438e-10 | 1647 |
| 1.0564 | 0.6776 | 1.0567 | 0.6761 | 9.986422e-10 | 1648 |
| 1.0535 | 0.6776 | 1.0566 | 0.6761 | 9.986405e-10 | 1649 |
| 1.0499 | 0.6776 | 1.0565 | 0.6761 | 9.986388e-10 | 1650 |
| 1.0531 | 0.6776 | 1.0564 | 0.6761 | 9.986372e-10 | 1651 |
| 1.0532 | 0.6776 | 1.0562 | 0.6761 | 9.986355e-10 | 1652 |
| 1.0506 | 0.6776 | 1.0561 | 0.6761 | 9.986338e-10 | 1653 |
| 1.0492 | 0.6776 | 1.0560 | 0.6761 | 9.986322e-10 | 1654 |
| 1.0570 | 0.6776 | 1.0559 | 0.6761 | 9.986305e-10 | 1655 |
| 1.0514 | 0.6776 | 1.0558 | 0.6761 | 9.986288e-10 | 1656 |
| 1.0576 | 0.6776 | 1.0557 | 0.6761 | 9.986272e-10 | 1657 |
| 1.0493 | 0.6776 | 1.0556 | 0.6761 | 9.986255e-10 | 1658 |
| 1.0555 | 0.6776 | 1.0555 | 0.6761 | 9.986239e-10 | 1659 |
| 1.0452 | 0.6776 | 1.0553 | 0.6761 | 9.986222e-10 | 1660 |
| 1.0467 | 0.6776 | 1.0552 | 0.6761 | 9.986205e-10 | 1661 |
| 1.0566 | 0.6776 | 1.0551 | 0.6761 | 9.986189e-10 | 1662 |
| 1.0542 | 0.6776 | 1.0550 | 0.6761 | 9.986172e-10 | 1663 |
| 1.0523 | 0.6776 | 1.0549 | 0.6761 | 9.986155e-10 | 1664 |
| 1.0521 | 0.6776 | 1.0548 | 0.6761 | 9.986139e-10 | 1665 |
| 1.0470 | 0.6776 | 1.0547 | 0.6761 | 9.986122e-10 | 1666 |
| 1.0556 | 0.6776 | 1.0546 | 0.6761 | 9.986105e-10 | 1667 |
| 1.0511 | 0.6776 | 1.0544 | 0.6761 | 9.986089e-10 | 1668 |
| 1.0493 | 0.6776 | 1.0543 | 0.6761 | 9.986072e-10 | 1669 |
| 1.0565 | 0.6776 | 1.0542 | 0.6761 | 9.986055e-10 | 1670 |
| 1.0489 | 0.6776 | 1.0541 | 0.6761 | 9.986039e-10 | 1671 |
| 1.0481 | 0.6776 | 1.0540 | 0.6761 | 9.986022e-10 | 1672 |
| 1.0468 | 0.6776 | 1.0539 | 0.6761 | 9.986005e-10 | 1673 |
| 1.0496 | 0.6776 | 1.0538 | 0.6761 | 9.985989e-10 | 1674 |
| 1.0479 | 0.6776 | 1.0537 | 0.6761 | 9.985972e-10 | 1675 |
| 1.0415 | 0.6776 | 1.0536 | 0.6761 | 9.985955e-10 | 1676 |
| 1.0570 | 0.6776 | 1.0535 | 0.6761 | 9.985939e-10 | 1677 |
| 1.0503 | 0.6776 | 1.0534 | 0.6761 | 9.985922e-10 | 1678 |
| 1.0469 | 0.6776 | 1.0533 | 0.6761 | 9.985905e-10 | 1679 |
| 1.0553 | 0.6776 | 1.0532 | 0.6761 | 9.985889e-10 | 1680 |
| 1.0563 | 0.6776 | 1.0530 | 0.6761 | 9.985872e-10 | 1681 |
| 1.0430 | 0.6776 | 1.0529 | 0.6761 | 9.985855e-10 | 1682 |
| 1.0454 | 0.6776 | 1.0528 | 0.6761 | 9.985839e-10 | 1683 |
| 1.0465 | 0.6776 | 1.0527 | 0.6761 | 9.985822e-10 | 1684 |
| 1.0531 | 0.6776 | 1.0526 | 0.6761 | 9.985806e-10 | 1685 |
| 1.0488 | 0.6776 | 1.0525 | 0.6761 | 9.985789e-10 | 1686 |
| 1.0532 | 0.6776 | 1.0524 | 0.6761 | 9.985772e-10 | 1687 |
| 1.0489 | 0.6776 | 1.0523 | 0.6761 | 9.985756e-10 | 1688 |
| 1.0511 | 0.6776 | 1.0522 | 0.6761 | 9.985739e-10 | 1689 |
| 1.0520 | 0.6776 | 1.0521 | 0.6761 | 9.985722e-10 | 1690 |
| 1.0455 | 0.6776 | 1.0519 | 0.6761 | 9.985706e-10 | 1691 |
| 1.0410 | 0.6776 | 1.0518 | 0.6761 | 9.985689e-10 | 1692 |
| 1.0508 | 0.6776 | 1.0518 | 0.6761 | 9.985672e-10 | 1693 |
| 1.0484 | 0.6776 | 1.0516 | 0.6761 | 9.985656e-10 | 1694 |
| 1.0496 | 0.6776 | 1.0515 | 0.6761 | 9.985639e-10 | 1695 |
| 1.0462 | 0.6776 | 1.0514 | 0.6761 | 9.985622e-10 | 1696 |
| 1.0458 | 0.6776 | 1.0513 | 0.6761 | 9.985606e-10 | 1697 |
| 1.0484 | 0.6776 | 1.0512 | 0.6761 | 9.985589e-10 | 1698 |
| 1.0519 | 0.6776 | 1.0511 | 0.6761 | 9.985572e-10 | 1699 |
| 1.0415 | 0.6776 | 1.0510 | 0.6761 | 9.985556e-10 | 1700 |
| 1.0514 | 0.6776 | 1.0509 | 0.6761 | 9.985539e-10 | 1701 |
| 1.0433 | 0.6776 | 1.0508 | 0.6761 | 9.985522e-10 | 1702 |
| 1.0457 | 0.6776 | 1.0507 | 0.6761 | 9.985506e-10 | 1703 |
| 1.0499 | 0.6776 | 1.0506 | 0.6761 | 9.985489e-10 | 1704 |
| 1.0430 | 0.6776 | 1.0505 | 0.6761 | 9.985472e-10 | 1705 |
| 1.0438 | 0.6776 | 1.0504 | 0.6761 | 9.985456e-10 | 1706 |
| 1.0497 | 0.6776 | 1.0502 | 0.6761 | 9.985439e-10 | 1707 |
| 1.0465 | 0.6776 | 1.0501 | 0.6761 | 9.985422e-10 | 1708 |
| 1.0392 | 0.6776 | 1.0500 | 0.6761 | 9.985406e-10 | 1709 |
| 1.0446 | 0.6776 | 1.0499 | 0.6761 | 9.985389e-10 | 1710 |
| 1.0516 | 0.6776 | 1.0498 | 0.6761 | 9.985373e-10 | 1711 |
| 1.0495 | 0.6776 | 1.0497 | 0.6761 | 9.985356e-10 | 1712 |
| 1.0457 | 0.6776 | 1.0496 | 0.6761 | 9.985339e-10 | 1713 |
| 1.0496 | 0.6776 | 1.0495 | 0.6761 | 9.985323e-10 | 1714 |
| 1.0413 | 0.6776 | 1.0494 | 0.6761 | 9.985306e-10 | 1715 |
| 1.0501 | 0.6776 | 1.0493 | 0.6761 | 9.985289e-10 | 1716 |
| 1.0429 | 0.6776 | 1.0492 | 0.6761 | 9.985273e-10 | 1717 |
| 1.0417 | 0.6776 | 1.0491 | 0.6761 | 9.985256e-10 | 1718 |
| 1.0401 | 0.6776 | 1.0490 | 0.6761 | 9.985239e-10 | 1719 |
| 1.0422 | 0.6776 | 1.0489 | 0.6761 | 9.985223e-10 | 1720 |
| 1.0485 | 0.6776 | 1.0488 | 0.6761 | 9.985206e-10 | 1721 |
| 1.0480 | 0.6776 | 1.0487 | 0.6761 | 9.985189e-10 | 1722 |
| 1.0449 | 0.6776 | 1.0486 | 0.6761 | 9.985173e-10 | 1723 |
| 1.0470 | 0.6776 | 1.0484 | 0.6761 | 9.985155e-10 | 1724 |
| 1.0484 | 0.6776 | 1.0483 | 0.6761 | 9.985137e-10 | 1725 |
| 1.0477 | 0.6776 | 1.0482 | 0.6761 | 9.985119e-10 | 1726 |
| 1.0396 | 0.6776 | 1.0481 | 0.6761 | 9.985102e-10 | 1727 |
| 1.0486 | 0.6776 | 1.0480 | 0.6761 | 9.985084e-10 | 1728 |
| 1.0422 | 0.6776 | 1.0479 | 0.6761 | 9.985066e-10 | 1729 |
| 1.0380 | 0.6776 | 1.0478 | 0.6761 | 9.985048e-10 | 1730 |
| 1.0407 | 0.6776 | 1.0477 | 0.6761 | 9.985031e-10 | 1731 |
| 1.0408 | 0.6776 | 1.0476 | 0.6761 | 9.985013e-10 | 1732 |
| 1.0478 | 0.6776 | 1.0475 | 0.6761 | 9.984995e-10 | 1733 |
| 1.0402 | 0.6776 | 1.0474 | 0.6761 | 9.984977e-10 | 1734 |
| 1.0453 | 0.6776 | 1.0473 | 0.6761 | 9.98496e-10 | 1735 |
| 1.0468 | 0.6776 | 1.0472 | 0.6761 | 9.984942e-10 | 1736 |
| 1.0491 | 0.6776 | 1.0471 | 0.6761 | 9.984924e-10 | 1737 |
| 1.0377 | 0.6776 | 1.0470 | 0.6761 | 9.984906e-10 | 1738 |
| 1.0390 | 0.6776 | 1.0469 | 0.6761 | 9.984888e-10 | 1739 |
| 1.0458 | 0.6776 | 1.0468 | 0.6761 | 9.984871e-10 | 1740 |
| 1.0426 | 0.6776 | 1.0467 | 0.6761 | 9.984853e-10 | 1741 |
| 1.0418 | 0.6776 | 1.0466 | 0.6761 | 9.984835e-10 | 1742 |
| 1.0389 | 0.6776 | 1.0465 | 0.6761 | 9.984817e-10 | 1743 |
| 1.0421 | 0.6776 | 1.0464 | 0.6761 | 9.9848e-10 | 1744 |
| 1.0422 | 0.6776 | 1.0463 | 0.6761 | 9.984782e-10 | 1745 |
| 1.0421 | 0.6776 | 1.0462 | 0.6761 | 9.984764e-10 | 1746 |
| 1.0542 | 0.6776 | 1.0461 | 0.6761 | 9.984746e-10 | 1747 |
| 1.0395 | 0.6776 | 1.0460 | 0.6761 | 9.984729e-10 | 1748 |
| 1.0462 | 0.6776 | 1.0459 | 0.6761 | 9.984711e-10 | 1749 |
| 1.0360 | 0.6776 | 1.0457 | 0.6761 | 9.984693e-10 | 1750 |
| 1.0456 | 0.6776 | 1.0457 | 0.6761 | 9.984675e-10 | 1751 |
| 1.0490 | 0.6776 | 1.0455 | 0.6761 | 9.984658e-10 | 1752 |
| 1.0464 | 0.6776 | 1.0454 | 0.6761 | 9.98464e-10 | 1753 |
| 1.0449 | 0.6776 | 1.0453 | 0.6761 | 9.984622e-10 | 1754 |
| 1.0412 | 0.6776 | 1.0452 | 0.6761 | 9.984604e-10 | 1755 |
| 1.0328 | 0.6776 | 1.0451 | 0.6761 | 9.984586e-10 | 1756 |
| 1.0436 | 0.6776 | 1.0450 | 0.6761 | 9.984569e-10 | 1757 |
| 1.0437 | 0.6776 | 1.0449 | 0.6761 | 9.984551e-10 | 1758 |
| 1.0407 | 0.6776 | 1.0448 | 0.6761 | 9.984533e-10 | 1759 |
| 1.0371 | 0.6776 | 1.0447 | 0.6761 | 9.984515e-10 | 1760 |
| 1.0408 | 0.6776 | 1.0446 | 0.6761 | 9.984498e-10 | 1761 |
| 1.0392 | 0.6776 | 1.0445 | 0.6761 | 9.98448e-10 | 1762 |
| 1.0402 | 0.6776 | 1.0444 | 0.6761 | 9.984462e-10 | 1763 |
| 1.0456 | 0.6776 | 1.0443 | 0.6761 | 9.984444e-10 | 1764 |
| 1.0318 | 0.6776 | 1.0442 | 0.6761 | 9.984427e-10 | 1765 |
| 1.0345 | 0.6776 | 1.0441 | 0.6761 | 9.984409e-10 | 1766 |
| 1.0440 | 0.6776 | 1.0440 | 0.6761 | 9.984391e-10 | 1767 |
| 1.0371 | 0.6776 | 1.0439 | 0.6761 | 9.984373e-10 | 1768 |
| 1.0391 | 0.6776 | 1.0438 | 0.6761 | 9.984356e-10 | 1769 |
| 1.0431 | 0.6776 | 1.0437 | 0.6761 | 9.984338e-10 | 1770 |
| 1.0364 | 0.6776 | 1.0436 | 0.6761 | 9.98432e-10 | 1771 |
| 1.0388 | 0.6776 | 1.0435 | 0.6761 | 9.984302e-10 | 1772 |
| 1.0450 | 0.6776 | 1.0434 | 0.6761 | 9.984285e-10 | 1773 |
| 1.0418 | 0.6776 | 1.0433 | 0.6761 | 9.984267e-10 | 1774 |
| 1.0410 | 0.6776 | 1.0432 | 0.6761 | 9.984249e-10 | 1775 |
| 1.0421 | 0.6776 | 1.0431 | 0.6761 | 9.984231e-10 | 1776 |
| 1.0386 | 0.6776 | 1.0430 | 0.6761 | 9.984213e-10 | 1777 |
| 1.0388 | 0.6776 | 1.0429 | 0.6761 | 9.984196e-10 | 1778 |
| 1.0348 | 0.6776 | 1.0428 | 0.6761 | 9.984178e-10 | 1779 |
| 1.0361 | 0.6776 | 1.0427 | 0.6761 | 9.98416e-10 | 1780 |
| 1.0315 | 0.6776 | 1.0426 | 0.6761 | 9.984142e-10 | 1781 |
| 1.0418 | 0.6776 | 1.0425 | 0.6761 | 9.984125e-10 | 1782 |
| 1.0449 | 0.6776 | 1.0424 | 0.6761 | 9.984107e-10 | 1783 |
| 1.0396 | 0.6776 | 1.0423 | 0.6761 | 9.984089e-10 | 1784 |
| 1.0344 | 0.6776 | 1.0422 | 0.6761 | 9.984071e-10 | 1785 |
| 1.0364 | 0.6776 | 1.0421 | 0.6761 | 9.984054e-10 | 1786 |
| 1.0408 | 0.6776 | 1.0420 | 0.6761 | 9.984036e-10 | 1787 |
| 1.0333 | 0.6776 | 1.0419 | 0.6761 | 9.984018e-10 | 1788 |
| 1.0354 | 0.6776 | 1.0418 | 0.6761 | 9.984e-10 | 1789 |
| 1.0375 | 0.6776 | 1.0417 | 0.6761 | 9.983983e-10 | 1790 |
| 1.0402 | 0.6776 | 1.0416 | 0.6761 | 9.983965e-10 | 1791 |
| 1.0416 | 0.6776 | 1.0415 | 0.6761 | 9.983947e-10 | 1792 |
| 1.0337 | 0.6776 | 1.0414 | 0.6761 | 9.983929e-10 | 1793 |
| 1.0346 | 0.6776 | 1.0413 | 0.6761 | 9.983911e-10 | 1794 |
| 1.0343 | 0.6776 | 1.0412 | 0.6761 | 9.983894e-10 | 1795 |
| 1.0399 | 0.6776 | 1.0411 | 0.6761 | 9.983876e-10 | 1796 |
| 1.0364 | 0.6776 | 1.0410 | 0.6761 | 9.983858e-10 | 1797 |
| 1.0392 | 0.6776 | 1.0409 | 0.6761 | 9.98384e-10 | 1798 |
| 1.0379 | 0.6776 | 1.0409 | 0.6761 | 9.983823e-10 | 1799 |
| 1.0317 | 0.6776 | 1.0407 | 0.6761 | 9.983805e-10 | 1800 |
| 1.0305 | 0.6776 | 1.0407 | 0.6761 | 9.983787e-10 | 1801 |
| 1.0377 | 0.6776 | 1.0406 | 0.6761 | 9.983769e-10 | 1802 |
| 1.0397 | 0.6776 | 1.0405 | 0.6761 | 9.983752e-10 | 1803 |
| 1.0335 | 0.6776 | 1.0404 | 0.6761 | 9.983734e-10 | 1804 |
| 1.0353 | 0.6776 | 1.0403 | 0.6761 | 9.983716e-10 | 1805 |
| 1.0387 | 0.6776 | 1.0402 | 0.6761 | 9.983698e-10 | 1806 |
| 1.0323 | 0.6776 | 1.0401 | 0.6761 | 9.98368e-10 | 1807 |
| 1.0354 | 0.6776 | 1.0400 | 0.6761 | 9.983663e-10 | 1808 |
| 1.0327 | 0.6776 | 1.0399 | 0.6761 | 9.983645e-10 | 1809 |
| 1.0339 | 0.6776 | 1.0398 | 0.6761 | 9.983627e-10 | 1810 |
| 1.0344 | 0.6776 | 1.0397 | 0.6761 | 9.98361e-10 | 1811 |
| 1.0322 | 0.6776 | 1.0396 | 0.6761 | 9.983592e-10 | 1812 |
| 1.0320 | 0.6776 | 1.0395 | 0.6761 | 9.983574e-10 | 1813 |
| 1.0299 | 0.6776 | 1.0394 | 0.6761 | 9.983556e-10 | 1814 |
| 1.0396 | 0.6776 | 1.0393 | 0.6761 | 9.983538e-10 | 1815 |
| 1.0327 | 0.6776 | 1.0392 | 0.6761 | 9.983521e-10 | 1816 |
| 1.0327 | 0.6776 | 1.0391 | 0.6761 | 9.983503e-10 | 1817 |
| 1.0406 | 0.6776 | 1.0390 | 0.6761 | 9.983485e-10 | 1818 |
| 1.0292 | 0.6776 | 1.0389 | 0.6761 | 9.983467e-10 | 1819 |
| 1.0416 | 0.6776 | 1.0388 | 0.6761 | 9.98345e-10 | 1820 |
| 1.0326 | 0.6776 | 1.0388 | 0.6761 | 9.983432e-10 | 1821 |
| 1.0315 | 0.6776 | 1.0387 | 0.6761 | 9.983414e-10 | 1822 |
| 1.0299 | 0.6776 | 1.0386 | 0.6761 | 9.983396e-10 | 1823 |
| 1.0439 | 0.6776 | 1.0385 | 0.6761 | 9.983379e-10 | 1824 |
| 1.0347 | 0.6776 | 1.0384 | 0.6761 | 9.983361e-10 | 1825 |
| 1.0314 | 0.6776 | 1.0383 | 0.6761 | 9.983343e-10 | 1826 |
| 1.0363 | 0.6776 | 1.0382 | 0.6761 | 9.983325e-10 | 1827 |
| 1.0379 | 0.6776 | 1.0381 | 0.6761 | 9.983308e-10 | 1828 |
| 1.0226 | 0.6776 | 1.0380 | 0.6761 | 9.98329e-10 | 1829 |
| 1.0359 | 0.6776 | 1.0379 | 0.6761 | 9.983272e-10 | 1830 |
| 1.0352 | 0.6776 | 1.0378 | 0.6761 | 9.983254e-10 | 1831 |
| 1.0300 | 0.6776 | 1.0377 | 0.6761 | 9.983236e-10 | 1832 |
| 1.0363 | 0.6776 | 1.0376 | 0.6761 | 9.983219e-10 | 1833 |
| 1.0303 | 0.6776 | 1.0375 | 0.6761 | 9.983201e-10 | 1834 |
| 1.0336 | 0.6776 | 1.0374 | 0.6761 | 9.983182e-10 | 1835 |
| 1.0311 | 0.6776 | 1.0373 | 0.6761 | 9.983163e-10 | 1836 |
| 1.0299 | 0.6776 | 1.0373 | 0.6761 | 9.983144e-10 | 1837 |
| 1.0364 | 0.6776 | 1.0372 | 0.6761 | 9.983125e-10 | 1838 |
| 1.0275 | 0.6776 | 1.0371 | 0.6761 | 9.983107e-10 | 1839 |
| 1.0320 | 0.6776 | 1.0370 | 0.6761 | 9.983088e-10 | 1840 |
| 1.0343 | 0.6776 | 1.0369 | 0.6761 | 9.983069e-10 | 1841 |
| 1.0337 | 0.6776 | 1.0368 | 0.6761 | 9.98305e-10 | 1842 |
| 1.0346 | 0.6776 | 1.0367 | 0.6761 | 9.983031e-10 | 1843 |
| 1.0283 | 0.6776 | 1.0366 | 0.6761 | 9.983012e-10 | 1844 |
| 1.0301 | 0.6776 | 1.0365 | 0.6761 | 9.982993e-10 | 1845 |
| 1.0339 | 0.6776 | 1.0364 | 0.6761 | 9.982974e-10 | 1846 |
| 1.0316 | 0.6776 | 1.0363 | 0.6761 | 9.982956e-10 | 1847 |
| 1.0359 | 0.6776 | 1.0362 | 0.6761 | 9.982937e-10 | 1848 |
| 1.0308 | 0.6776 | 1.0361 | 0.6761 | 9.982918e-10 | 1849 |
| 1.0309 | 0.6776 | 1.0360 | 0.6761 | 9.982899e-10 | 1850 |
| 1.0335 | 0.6776 | 1.0360 | 0.6761 | 9.98288e-10 | 1851 |
| 1.0348 | 0.6776 | 1.0359 | 0.6761 | 9.982861e-10 | 1852 |
| 1.0365 | 0.6776 | 1.0358 | 0.6761 | 9.982842e-10 | 1853 |
| 1.0327 | 0.6776 | 1.0357 | 0.6761 | 9.982823e-10 | 1854 |
| 1.0298 | 0.6776 | 1.0356 | 0.6761 | 9.982805e-10 | 1855 |
| 1.0331 | 0.6776 | 1.0355 | 0.6761 | 9.982786e-10 | 1856 |
| 1.0328 | 0.6776 | 1.0354 | 0.6761 | 9.982767e-10 | 1857 |
| 1.0357 | 0.6776 | 1.0353 | 0.6761 | 9.982748e-10 | 1858 |
| 1.0336 | 0.6776 | 1.0352 | 0.6761 | 9.982729e-10 | 1859 |
| 1.0290 | 0.6776 | 1.0351 | 0.6761 | 9.98271e-10 | 1860 |
| 1.0313 | 0.6776 | 1.0350 | 0.6761 | 9.982691e-10 | 1861 |
| 1.0332 | 0.6776 | 1.0350 | 0.6761 | 9.982672e-10 | 1862 |
| 1.0327 | 0.6776 | 1.0349 | 0.6761 | 9.982654e-10 | 1863 |
| 1.0264 | 0.6776 | 1.0348 | 0.6761 | 9.982635e-10 | 1864 |
| 1.0270 | 0.6776 | 1.0347 | 0.6761 | 9.982616e-10 | 1865 |
| 1.0338 | 0.6776 | 1.0346 | 0.6761 | 9.982597e-10 | 1866 |
| 1.0244 | 0.6776 | 1.0345 | 0.6761 | 9.982578e-10 | 1867 |
| 1.0226 | 0.6776 | 1.0344 | 0.6761 | 9.982559e-10 | 1868 |
| 1.0293 | 0.6776 | 1.0343 | 0.6761 | 9.98254e-10 | 1869 |
| 1.0323 | 0.6776 | 1.0342 | 0.6761 | 9.982521e-10 | 1870 |
| 1.0278 | 0.6776 | 1.0342 | 0.6761 | 9.982503e-10 | 1871 |
| 1.0303 | 0.6776 | 1.0341 | 0.6761 | 9.982484e-10 | 1872 |
| 1.0277 | 0.6776 | 1.0340 | 0.6761 | 9.982465e-10 | 1873 |
| 1.0298 | 0.6776 | 1.0339 | 0.6761 | 9.982446e-10 | 1874 |
| 1.0285 | 0.6776 | 1.0338 | 0.6761 | 9.982427e-10 | 1875 |
| 1.0389 | 0.6776 | 1.0337 | 0.6761 | 9.982408e-10 | 1876 |
| 1.0222 | 0.6776 | 1.0336 | 0.6761 | 9.982389e-10 | 1877 |
| 1.0318 | 0.6776 | 1.0335 | 0.6761 | 9.98237e-10 | 1878 |
| 1.0306 | 0.6776 | 1.0334 | 0.6761 | 9.982352e-10 | 1879 |
| 1.0309 | 0.6776 | 1.0334 | 0.6761 | 9.982333e-10 | 1880 |
| 1.0326 | 0.6776 | 1.0333 | 0.6761 | 9.982314e-10 | 1881 |
| 1.0306 | 0.6776 | 1.0332 | 0.6761 | 9.982295e-10 | 1882 |
| 1.0346 | 0.6776 | 1.0331 | 0.6761 | 9.982276e-10 | 1883 |
| 1.0270 | 0.6776 | 1.0330 | 0.6761 | 9.982257e-10 | 1884 |
| 1.0334 | 0.6776 | 1.0329 | 0.6761 | 9.982238e-10 | 1885 |
| 1.0258 | 0.6776 | 1.0328 | 0.6761 | 9.98222e-10 | 1886 |
| 1.0238 | 0.6776 | 1.0327 | 0.6761 | 9.982201e-10 | 1887 |
| 1.0339 | 0.6776 | 1.0326 | 0.6761 | 9.982182e-10 | 1888 |
| 1.0246 | 0.6776 | 1.0326 | 0.6761 | 9.982163e-10 | 1889 |
| 1.0305 | 0.6776 | 1.0325 | 0.6761 | 9.982144e-10 | 1890 |
| 1.0241 | 0.6776 | 1.0324 | 0.6761 | 9.982125e-10 | 1891 |
| 1.0318 | 0.6776 | 1.0323 | 0.6761 | 9.982106e-10 | 1892 |
| 1.0270 | 0.6776 | 1.0322 | 0.6761 | 9.982087e-10 | 1893 |
| 1.0234 | 0.6776 | 1.0321 | 0.6761 | 9.982069e-10 | 1894 |
| 1.0250 | 0.6776 | 1.0320 | 0.6761 | 9.98205e-10 | 1895 |
| 1.0353 | 0.6776 | 1.0319 | 0.6761 | 9.982031e-10 | 1896 |
| 1.0191 | 0.6776 | 1.0319 | 0.6761 | 9.982012e-10 | 1897 |
| 1.0299 | 0.6776 | 1.0318 | 0.6761 | 9.981993e-10 | 1898 |
| 1.0271 | 0.6776 | 1.0317 | 0.6761 | 9.981974e-10 | 1899 |
| 1.0270 | 0.6776 | 1.0316 | 0.6761 | 9.981955e-10 | 1900 |
| 1.0240 | 0.6776 | 1.0315 | 0.6761 | 9.981936e-10 | 1901 |
| 1.0209 | 0.6776 | 1.0314 | 0.6761 | 9.981918e-10 | 1902 |
| 1.0339 | 0.6776 | 1.0313 | 0.6761 | 9.981899e-10 | 1903 |
| 1.0242 | 0.6776 | 1.0313 | 0.6761 | 9.98188e-10 | 1904 |
| 1.0244 | 0.6776 | 1.0312 | 0.6761 | 9.981861e-10 | 1905 |
| 1.0257 | 0.6776 | 1.0311 | 0.6761 | 9.981842e-10 | 1906 |
| 1.0328 | 0.6776 | 1.0310 | 0.6761 | 9.981823e-10 | 1907 |
| 1.0244 | 0.6776 | 1.0309 | 0.6761 | 9.981804e-10 | 1908 |
| 1.0271 | 0.6776 | 1.0308 | 0.6761 | 9.981785e-10 | 1909 |
| 1.0313 | 0.6776 | 1.0308 | 0.6761 | 9.981767e-10 | 1910 |
| 1.0282 | 0.6776 | 1.0307 | 0.6761 | 9.981748e-10 | 1911 |
| 1.0255 | 0.6776 | 1.0306 | 0.6761 | 9.981729e-10 | 1912 |
| 1.0159 | 0.6776 | 1.0305 | 0.6761 | 9.98171e-10 | 1913 |
| 1.0255 | 0.6776 | 1.0304 | 0.6761 | 9.981691e-10 | 1914 |
| 1.0289 | 0.6776 | 1.0303 | 0.6761 | 9.981672e-10 | 1915 |
| 1.0187 | 0.6776 | 1.0302 | 0.6761 | 9.981653e-10 | 1916 |
| 1.0241 | 0.6776 | 1.0302 | 0.6761 | 9.981634e-10 | 1917 |
| 1.0247 | 0.6776 | 1.0301 | 0.6761 | 9.981616e-10 | 1918 |
| 1.0228 | 0.6776 | 1.0300 | 0.6761 | 9.981597e-10 | 1919 |
| 1.0269 | 0.6776 | 1.0299 | 0.6761 | 9.981578e-10 | 1920 |
| 1.0251 | 0.6776 | 1.0298 | 0.6761 | 9.981559e-10 | 1921 |
| 1.0205 | 0.6776 | 1.0297 | 0.6761 | 9.98154e-10 | 1922 |
| 1.0234 | 0.6776 | 1.0296 | 0.6761 | 9.981521e-10 | 1923 |
| 1.0246 | 0.6776 | 1.0296 | 0.6761 | 9.981502e-10 | 1924 |
| 1.0274 | 0.6776 | 1.0295 | 0.6761 | 9.981483e-10 | 1925 |
| 1.0324 | 0.6776 | 1.0294 | 0.6761 | 9.981465e-10 | 1926 |
| 1.0269 | 0.6776 | 1.0293 | 0.6761 | 9.981446e-10 | 1927 |
| 1.0265 | 0.6776 | 1.0292 | 0.6761 | 9.981427e-10 | 1928 |
| 1.0298 | 0.6776 | 1.0292 | 0.6761 | 9.981408e-10 | 1929 |
| 1.0139 | 0.6776 | 1.0291 | 0.6761 | 9.981389e-10 | 1930 |
| 1.0174 | 0.6776 | 1.0290 | 0.6761 | 9.98137e-10 | 1931 |
| 1.0222 | 0.6776 | 1.0289 | 0.6761 | 9.981351e-10 | 1932 |
| 1.0263 | 0.6776 | 1.0288 | 0.6761 | 9.981332e-10 | 1933 |
| 1.0243 | 0.6776 | 1.0288 | 0.6761 | 9.981314e-10 | 1934 |
| 1.0251 | 0.6776 | 1.0287 | 0.6761 | 9.981295e-10 | 1935 |
| 1.0288 | 0.6776 | 1.0286 | 0.6761 | 9.981276e-10 | 1936 |
| 1.0250 | 0.6776 | 1.0285 | 0.6761 | 9.981257e-10 | 1937 |
| 1.0262 | 0.6776 | 1.0284 | 0.6761 | 9.981238e-10 | 1938 |
| 1.0222 | 0.6776 | 1.0284 | 0.6761 | 9.981219e-10 | 1939 |
| 1.0224 | 0.6776 | 1.0283 | 0.6761 | 9.9812e-10 | 1940 |
| 1.0173 | 0.6776 | 1.0282 | 0.6761 | 9.981181e-10 | 1941 |
| 1.0243 | 0.6776 | 1.0281 | 0.6761 | 9.981163e-10 | 1942 |
| 1.0217 | 0.6776 | 1.0280 | 0.6761 | 9.981144e-10 | 1943 |
| 1.0235 | 0.6776 | 1.0280 | 0.6761 | 9.981125e-10 | 1944 |
| 1.0231 | 0.6776 | 1.0279 | 0.6761 | 9.981106e-10 | 1945 |
| 1.0291 | 0.6776 | 1.0278 | 0.6761 | 9.981087e-10 | 1946 |
| 1.0256 | 0.6776 | 1.0277 | 0.6761 | 9.981067e-10 | 1947 |
| 1.0214 | 0.6776 | 1.0277 | 0.6761 | 9.981047e-10 | 1948 |
| 1.0248 | 0.6776 | 1.0276 | 0.6761 | 9.981027e-10 | 1949 |
| 1.0266 | 0.6776 | 1.0275 | 0.6761 | 9.981007e-10 | 1950 |
| 1.0214 | 0.6776 | 1.0274 | 0.6761 | 9.980987e-10 | 1951 |
| 1.0265 | 0.6776 | 1.0273 | 0.6761 | 9.980967e-10 | 1952 |
| 1.0235 | 0.6776 | 1.0272 | 0.6761 | 9.980947e-10 | 1953 |
| 1.0238 | 0.6776 | 1.0272 | 0.6761 | 9.980927e-10 | 1954 |
| 1.0266 | 0.6776 | 1.0271 | 0.6761 | 9.980907e-10 | 1955 |
| 1.0210 | 0.6776 | 1.0270 | 0.6761 | 9.980887e-10 | 1956 |
| 1.0294 | 0.6776 | 1.0269 | 0.6761 | 9.980867e-10 | 1957 |
| 1.0203 | 0.6776 | 1.0268 | 0.6761 | 9.980847e-10 | 1958 |
| 1.0262 | 0.6776 | 1.0268 | 0.6761 | 9.980827e-10 | 1959 |
| 1.0259 | 0.6776 | 1.0267 | 0.6761 | 9.980807e-10 | 1960 |
| 1.0248 | 0.6776 | 1.0266 | 0.6761 | 9.980787e-10 | 1961 |
| 1.0186 | 0.6776 | 1.0265 | 0.6761 | 9.980767e-10 | 1962 |
| 1.0275 | 0.6776 | 1.0264 | 0.6761 | 9.980747e-10 | 1963 |
| 1.0208 | 0.6776 | 1.0264 | 0.6761 | 9.980727e-10 | 1964 |
| 1.0226 | 0.6776 | 1.0263 | 0.6761 | 9.980707e-10 | 1965 |
| 1.0234 | 0.6776 | 1.0262 | 0.6761 | 9.980687e-10 | 1966 |
| 1.0237 | 0.6776 | 1.0261 | 0.6761 | 9.980667e-10 | 1967 |
| 1.0246 | 0.6776 | 1.0261 | 0.6761 | 9.980647e-10 | 1968 |
| 1.0205 | 0.6776 | 1.0260 | 0.6761 | 9.980627e-10 | 1969 |
| 1.0164 | 0.6776 | 1.0259 | 0.6761 | 9.980607e-10 | 1970 |
| 1.0292 | 0.6776 | 1.0258 | 0.6761 | 9.980587e-10 | 1971 |
| 1.0239 | 0.6776 | 1.0257 | 0.6761 | 9.980567e-10 | 1972 |
| 1.0164 | 0.6776 | 1.0257 | 0.6761 | 9.980547e-10 | 1973 |
| 1.0223 | 0.6776 | 1.0256 | 0.6761 | 9.980528e-10 | 1974 |
| 1.0229 | 0.6776 | 1.0255 | 0.6761 | 9.980508e-10 | 1975 |
| 1.0215 | 0.6776 | 1.0254 | 0.6761 | 9.980488e-10 | 1976 |
| 1.0219 | 0.6776 | 1.0254 | 0.6761 | 9.980468e-10 | 1977 |
| 1.0309 | 0.6776 | 1.0253 | 0.6761 | 9.980448e-10 | 1978 |
| 1.0250 | 0.6776 | 1.0252 | 0.6761 | 9.980428e-10 | 1979 |
| 1.0191 | 0.6776 | 1.0251 | 0.6761 | 9.980408e-10 | 1980 |
| 1.0210 | 0.6776 | 1.0251 | 0.6761 | 9.980388e-10 | 1981 |
| 1.0189 | 0.6776 | 1.0250 | 0.6761 | 9.980368e-10 | 1982 |
| 1.0212 | 0.6776 | 1.0249 | 0.6761 | 9.980348e-10 | 1983 |
| 1.0226 | 0.6776 | 1.0248 | 0.6761 | 9.980328e-10 | 1984 |
| 1.0226 | 0.6776 | 1.0247 | 0.6761 | 9.980308e-10 | 1985 |
| 1.0164 | 0.6776 | 1.0247 | 0.6761 | 9.980288e-10 | 1986 |
| 1.0182 | 0.6776 | 1.0246 | 0.6761 | 9.980268e-10 | 1987 |
| 1.0163 | 0.6776 | 1.0245 | 0.6761 | 9.980248e-10 | 1988 |
| 1.0262 | 0.6776 | 1.0244 | 0.6761 | 9.980228e-10 | 1989 |
| 1.0168 | 0.6776 | 1.0243 | 0.6761 | 9.980208e-10 | 1990 |
| 1.0246 | 0.6776 | 1.0243 | 0.6761 | 9.980188e-10 | 1991 |
| 1.0243 | 0.6776 | 1.0242 | 0.6761 | 9.980168e-10 | 1992 |
| 1.0211 | 0.6776 | 1.0241 | 0.6761 | 9.980148e-10 | 1993 |
| 1.0162 | 0.6776 | 1.0240 | 0.6761 | 9.980128e-10 | 1994 |
| 1.0250 | 0.6776 | 1.0240 | 0.6761 | 9.980108e-10 | 1995 |
| 1.0233 | 0.6776 | 1.0239 | 0.6761 | 9.980088e-10 | 1996 |
| 1.0217 | 0.6776 | 1.0238 | 0.6761 | 9.980068e-10 | 1997 |
| 1.0228 | 0.6776 | 1.0237 | 0.6761 | 9.980048e-10 | 1998 |
| 1.0194 | 0.6776 | 1.0236 | 0.6761 | 9.980028e-10 | 1999 |
| 1.0264 | 0.6776 | 1.0236 | 0.6761 | 9.980008e-10 | 2000 |
| 1.0155 | 0.6776 | 1.0235 | 0.6761 | 9.979988e-10 | 2001 |
| 1.0161 | 0.6776 | 1.0234 | 0.6761 | 9.979968e-10 | 2002 |
| 1.0184 | 0.6776 | 1.0233 | 0.6761 | 9.979948e-10 | 2003 |
| 1.0151 | 0.6776 | 1.0232 | 0.6761 | 9.979928e-10 | 2004 |
| 1.0137 | 0.6776 | 1.0232 | 0.6761 | 9.979908e-10 | 2005 |
| 1.0135 | 0.6776 | 1.0231 | 0.6761 | 9.979888e-10 | 2006 |
| 1.0203 | 0.6776 | 1.0230 | 0.6761 | 9.979868e-10 | 2007 |
| 1.0192 | 0.6776 | 1.0229 | 0.6761 | 9.979848e-10 | 2008 |
| 1.0151 | 0.6776 | 1.0228 | 0.6761 | 9.979828e-10 | 2009 |
| 1.0177 | 0.6776 | 1.0228 | 0.6761 | 9.979808e-10 | 2010 |
| 1.0122 | 0.6776 | 1.0227 | 0.6761 | 9.979788e-10 | 2011 |
| 1.0186 | 0.6776 | 1.0226 | 0.6761 | 9.979768e-10 | 2012 |
| 1.0223 | 0.6776 | 1.0225 | 0.6761 | 9.979748e-10 | 2013 |
| 1.0187 | 0.6776 | 1.0224 | 0.6761 | 9.979728e-10 | 2014 |
| 1.0211 | 0.6776 | 1.0224 | 0.6761 | 9.979708e-10 | 2015 |
| 1.0171 | 0.6776 | 1.0223 | 0.6761 | 9.979688e-10 | 2016 |
| 1.0140 | 0.6776 | 1.0222 | 0.6761 | 9.979668e-10 | 2017 |
| 1.0123 | 0.6776 | 1.0221 | 0.6761 | 9.979648e-10 | 2018 |
| 1.0172 | 0.6800 | 1.0220 | 0.6761 | 9.979628e-10 | 2019 |
| 1.0153 | 0.6776 | 1.0220 | 0.6761 | 9.979608e-10 | 2020 |
| 1.0185 | 0.6776 | 1.0219 | 0.6761 | 9.979588e-10 | 2021 |
| 1.0250 | 0.6776 | 1.0218 | 0.6761 | 9.979568e-10 | 2022 |
| 1.0214 | 0.6776 | 1.0217 | 0.6761 | 9.979548e-10 | 2023 |
| 1.0212 | 0.6776 | 1.0216 | 0.6761 | 9.979528e-10 | 2024 |
| 1.0177 | 0.6776 | 1.0216 | 0.6761 | 9.979508e-10 | 2025 |
| 1.0202 | 0.6776 | 1.0215 | 0.6761 | 9.979488e-10 | 2026 |
| 1.0105 | 0.6776 | 1.0214 | 0.6761 | 9.979468e-10 | 2027 |
| 1.0178 | 0.6776 | 1.0214 | 0.6761 | 9.979448e-10 | 2028 |
| 1.0126 | 0.6776 | 1.0213 | 0.6761 | 9.979428e-10 | 2029 |
| 1.0151 | 0.6776 | 1.0212 | 0.6761 | 9.979408e-10 | 2030 |
| 1.0174 | 0.6776 | 1.0211 | 0.6761 | 9.979388e-10 | 2031 |
| 1.0217 | 0.6776 | 1.0211 | 0.6761 | 9.979368e-10 | 2032 |
| 1.0174 | 0.6776 | 1.0210 | 0.6761 | 9.979348e-10 | 2033 |
| 1.0189 | 0.6776 | 1.0209 | 0.6761 | 9.979328e-10 | 2034 |
| 1.0125 | 0.6776 | 1.0208 | 0.6761 | 9.979308e-10 | 2035 |
| 1.0192 | 0.6776 | 1.0208 | 0.6761 | 9.979289e-10 | 2036 |
| 1.0169 | 0.6776 | 1.0207 | 0.6761 | 9.979269e-10 | 2037 |
| 1.0101 | 0.6776 | 1.0206 | 0.6761 | 9.979249e-10 | 2038 |
| 1.0161 | 0.6776 | 1.0206 | 0.6761 | 9.979229e-10 | 2039 |
| 1.0144 | 0.6776 | 1.0205 | 0.6761 | 9.979209e-10 | 2040 |
| 1.0189 | 0.6776 | 1.0204 | 0.6761 | 9.979189e-10 | 2041 |
| 1.0149 | 0.6776 | 1.0203 | 0.6761 | 9.979169e-10 | 2042 |
| 1.0161 | 0.6776 | 1.0203 | 0.6761 | 9.979149e-10 | 2043 |
| 1.0144 | 0.6776 | 1.0202 | 0.6761 | 9.979129e-10 | 2044 |
| 1.0162 | 0.6776 | 1.0201 | 0.6761 | 9.979109e-10 | 2045 |
| 1.0166 | 0.6776 | 1.0201 | 0.6761 | 9.979089e-10 | 2046 |
| 1.0147 | 0.6776 | 1.0200 | 0.6761 | 9.979069e-10 | 2047 |
| 1.0093 | 0.6776 | 1.0199 | 0.6761 | 9.979049e-10 | 2048 |
| 1.0179 | 0.6776 | 1.0198 | 0.6761 | 9.979029e-10 | 2049 |
| 1.0098 | 0.6776 | 1.0198 | 0.6761 | 9.979009e-10 | 2050 |
| 1.0123 | 0.6776 | 1.0197 | 0.6761 | 9.978989e-10 | 2051 |
| 1.0144 | 0.6776 | 1.0196 | 0.6761 | 9.978969e-10 | 2052 |
| 1.0106 | 0.6776 | 1.0195 | 0.6761 | 9.978949e-10 | 2053 |
| 1.0194 | 0.6776 | 1.0195 | 0.6761 | 9.978929e-10 | 2054 |
| 1.0160 | 0.6776 | 1.0194 | 0.6761 | 9.978909e-10 | 2055 |
| 1.0129 | 0.6776 | 1.0193 | 0.6761 | 9.978889e-10 | 2056 |
| 1.0139 | 0.6776 | 1.0192 | 0.6761 | 9.978869e-10 | 2057 |
| 1.0138 | 0.6776 | 1.0192 | 0.6761 | 9.978849e-10 | 2058 |
| 1.0124 | 0.6776 | 1.0191 | 0.6761 | 9.978828e-10 | 2059 |
| 1.0185 | 0.6776 | 1.0190 | 0.6761 | 9.978807e-10 | 2060 |
| 1.0123 | 0.6776 | 1.0190 | 0.6761 | 9.978786e-10 | 2061 |
| 1.0124 | 0.6776 | 1.0189 | 0.6761 | 9.978764e-10 | 2062 |
| 1.0147 | 0.6776 | 1.0188 | 0.6761 | 9.978743e-10 | 2063 |
| 1.0112 | 0.6776 | 1.0188 | 0.6761 | 9.978722e-10 | 2064 |
| 1.0138 | 0.6776 | 1.0187 | 0.6761 | 9.978701e-10 | 2065 |
| 1.0100 | 0.6776 | 1.0186 | 0.6761 | 9.97868e-10 | 2066 |
| 1.0118 | 0.6776 | 1.0185 | 0.6761 | 9.978659e-10 | 2067 |
| 1.0121 | 0.6776 | 1.0185 | 0.6761 | 9.978638e-10 | 2068 |
| 1.0190 | 0.6776 | 1.0184 | 0.6761 | 9.978617e-10 | 2069 |
| 1.0111 | 0.6776 | 1.0183 | 0.6761 | 9.978596e-10 | 2070 |
| 1.0150 | 0.6776 | 1.0183 | 0.6761 | 9.978575e-10 | 2071 |
| 1.0140 | 0.6776 | 1.0182 | 0.6761 | 9.978554e-10 | 2072 |
| 1.0100 | 0.6776 | 1.0181 | 0.6761 | 9.978532e-10 | 2073 |
| 1.0202 | 0.6776 | 1.0181 | 0.6761 | 9.978511e-10 | 2074 |
| 1.0102 | 0.6776 | 1.0180 | 0.6761 | 9.97849e-10 | 2075 |
| 1.0196 | 0.6776 | 1.0179 | 0.6761 | 9.978469e-10 | 2076 |
| 1.0130 | 0.6776 | 1.0179 | 0.6761 | 9.978448e-10 | 2077 |
| 1.0128 | 0.6776 | 1.0178 | 0.6761 | 9.978427e-10 | 2078 |
| 1.0147 | 0.6776 | 1.0177 | 0.6761 | 9.978406e-10 | 2079 |
| 1.0161 | 0.6776 | 1.0176 | 0.6761 | 9.978385e-10 | 2080 |
| 1.0114 | 0.6776 | 1.0176 | 0.6761 | 9.978364e-10 | 2081 |
| 1.0123 | 0.6776 | 1.0175 | 0.6761 | 9.978343e-10 | 2082 |
| 1.0107 | 0.6776 | 1.0174 | 0.6761 | 9.978322e-10 | 2083 |
| 1.0174 | 0.6776 | 1.0173 | 0.6761 | 9.9783e-10 | 2084 |
| 1.0131 | 0.6776 | 1.0173 | 0.6761 | 9.978279e-10 | 2085 |
| 1.0121 | 0.6776 | 1.0172 | 0.6761 | 9.978258e-10 | 2086 |
| 1.0174 | 0.6776 | 1.0171 | 0.6761 | 9.978237e-10 | 2087 |
| 1.0111 | 0.6776 | 1.0171 | 0.6761 | 9.978216e-10 | 2088 |
| 1.0122 | 0.6776 | 1.0170 | 0.6761 | 9.978195e-10 | 2089 |
| 1.0073 | 0.6776 | 1.0169 | 0.6761 | 9.978174e-10 | 2090 |
| 1.0142 | 0.6776 | 1.0169 | 0.6761 | 9.978153e-10 | 2091 |
| 1.0190 | 0.6776 | 1.0168 | 0.6761 | 9.978132e-10 | 2092 |
| 1.0172 | 0.6776 | 1.0167 | 0.6761 | 9.978111e-10 | 2093 |
| 1.0123 | 0.6776 | 1.0166 | 0.6761 | 9.97809e-10 | 2094 |
| 1.0136 | 0.6776 | 1.0166 | 0.6761 | 9.978068e-10 | 2095 |
| 1.0136 | 0.6776 | 1.0165 | 0.6761 | 9.978047e-10 | 2096 |
| 1.0116 | 0.6776 | 1.0164 | 0.6761 | 9.978026e-10 | 2097 |
| 1.0158 | 0.6776 | 1.0164 | 0.6761 | 9.978005e-10 | 2098 |
| 1.0038 | 0.6776 | 1.0163 | 0.6761 | 9.977984e-10 | 2099 |
| 1.0087 | 0.6776 | 1.0162 | 0.6761 | 9.977963e-10 | 2100 |
| 1.0121 | 0.6776 | 1.0162 | 0.6761 | 9.977942e-10 | 2101 |
| 1.0133 | 0.6776 | 1.0161 | 0.6761 | 9.977921e-10 | 2102 |
| 1.0115 | 0.6776 | 1.0160 | 0.6761 | 9.9779e-10 | 2103 |
| 1.0101 | 0.6776 | 1.0160 | 0.6761 | 9.977879e-10 | 2104 |
| 1.0117 | 0.6776 | 1.0159 | 0.6761 | 9.977857e-10 | 2105 |
| 1.0092 | 0.6776 | 1.0158 | 0.6761 | 9.977836e-10 | 2106 |
| 1.0116 | 0.6776 | 1.0158 | 0.6761 | 9.977815e-10 | 2107 |
| 1.0140 | 0.6776 | 1.0157 | 0.6761 | 9.977794e-10 | 2108 |
| 1.0154 | 0.6776 | 1.0156 | 0.6761 | 9.977773e-10 | 2109 |
| 1.0130 | 0.6776 | 1.0156 | 0.6761 | 9.977752e-10 | 2110 |
| 1.0109 | 0.6776 | 1.0155 | 0.6761 | 9.977731e-10 | 2111 |
| 1.0098 | 0.6776 | 1.0154 | 0.6761 | 9.97771e-10 | 2112 |
| 1.0142 | 0.6776 | 1.0154 | 0.6761 | 9.977689e-10 | 2113 |
| 1.0104 | 0.6776 | 1.0153 | 0.6761 | 9.977668e-10 | 2114 |
| 1.0085 | 0.6776 | 1.0152 | 0.6761 | 9.977646e-10 | 2115 |
| 1.0083 | 0.6776 | 1.0152 | 0.6761 | 9.977625e-10 | 2116 |
| 1.0060 | 0.6776 | 1.0151 | 0.6761 | 9.977604e-10 | 2117 |
| 1.0119 | 0.6776 | 1.0150 | 0.6761 | 9.977583e-10 | 2118 |
| 1.0063 | 0.6776 | 1.0149 | 0.6761 | 9.977562e-10 | 2119 |
| 1.0089 | 0.6776 | 1.0149 | 0.6761 | 9.977541e-10 | 2120 |
| 1.0130 | 0.6776 | 1.0148 | 0.6761 | 9.97752e-10 | 2121 |
| 1.0146 | 0.6776 | 1.0147 | 0.6761 | 9.977499e-10 | 2122 |
| 1.0159 | 0.6776 | 1.0147 | 0.6761 | 9.977478e-10 | 2123 |
| 1.0105 | 0.6776 | 1.0146 | 0.6761 | 9.977457e-10 | 2124 |
| 1.0132 | 0.6776 | 1.0146 | 0.6761 | 9.977436e-10 | 2125 |
| 1.0049 | 0.6776 | 1.0145 | 0.6761 | 9.977414e-10 | 2126 |
| 1.0131 | 0.6776 | 1.0144 | 0.6761 | 9.977393e-10 | 2127 |
| 1.0070 | 0.6776 | 1.0144 | 0.6761 | 9.977372e-10 | 2128 |
| 1.0109 | 0.6776 | 1.0143 | 0.6761 | 9.977351e-10 | 2129 |
| 1.0072 | 0.6776 | 1.0142 | 0.6761 | 9.97733e-10 | 2130 |
| 1.0140 | 0.6776 | 1.0142 | 0.6761 | 9.977309e-10 | 2131 |
| 1.0071 | 0.6776 | 1.0141 | 0.6761 | 9.977288e-10 | 2132 |
| 1.0123 | 0.6776 | 1.0140 | 0.6761 | 9.977267e-10 | 2133 |
| 1.0079 | 0.6776 | 1.0140 | 0.6761 | 9.977246e-10 | 2134 |
| 1.0133 | 0.6776 | 1.0139 | 0.6761 | 9.977225e-10 | 2135 |
| 1.0049 | 0.6776 | 1.0138 | 0.6761 | 9.977204e-10 | 2136 |
| 1.0123 | 0.6776 | 1.0138 | 0.6761 | 9.977182e-10 | 2137 |
| 1.0121 | 0.6776 | 1.0137 | 0.6761 | 9.977161e-10 | 2138 |
| 1.0087 | 0.6776 | 1.0137 | 0.6761 | 9.97714e-10 | 2139 |
| 1.0095 | 0.6776 | 1.0136 | 0.6761 | 9.977119e-10 | 2140 |
| 1.0112 | 0.6776 | 1.0135 | 0.6761 | 9.977098e-10 | 2141 |
| 1.0107 | 0.6776 | 1.0135 | 0.6761 | 9.977077e-10 | 2142 |
| 1.0149 | 0.6776 | 1.0134 | 0.6761 | 9.977056e-10 | 2143 |
| 1.0033 | 0.6776 | 1.0133 | 0.6761 | 9.977035e-10 | 2144 |
| 1.0118 | 0.6776 | 1.0133 | 0.6761 | 9.977014e-10 | 2145 |
| 1.0043 | 0.6776 | 1.0132 | 0.6761 | 9.976993e-10 | 2146 |
| 1.0111 | 0.6776 | 1.0131 | 0.6761 | 9.976971e-10 | 2147 |
| 1.0046 | 0.6776 | 1.0131 | 0.6761 | 9.97695e-10 | 2148 |
| 1.0089 | 0.6776 | 1.0130 | 0.6761 | 9.976929e-10 | 2149 |
| 1.0049 | 0.6776 | 1.0129 | 0.6761 | 9.976908e-10 | 2150 |
| 1.0083 | 0.6776 | 1.0129 | 0.6761 | 9.976887e-10 | 2151 |
| 1.0055 | 0.6776 | 1.0128 | 0.6761 | 9.976866e-10 | 2152 |
| 1.0071 | 0.6776 | 1.0127 | 0.6761 | 9.976845e-10 | 2153 |
| 1.0050 | 0.6776 | 1.0127 | 0.6761 | 9.976824e-10 | 2154 |
| 1.0067 | 0.6776 | 1.0126 | 0.6761 | 9.976803e-10 | 2155 |
| 1.0020 | 0.6776 | 1.0126 | 0.6761 | 9.976782e-10 | 2156 |
| 1.0101 | 0.6776 | 1.0125 | 0.6761 | 9.97676e-10 | 2157 |
| 1.0042 | 0.6776 | 1.0124 | 0.6761 | 9.976739e-10 | 2158 |
| 1.0055 | 0.6776 | 1.0124 | 0.6761 | 9.976718e-10 | 2159 |
| 1.0098 | 0.6776 | 1.0123 | 0.6761 | 9.976697e-10 | 2160 |
| 1.0136 | 0.6776 | 1.0122 | 0.6761 | 9.976676e-10 | 2161 |
| 0.9997 | 0.6776 | 1.0122 | 0.6761 | 9.976655e-10 | 2162 |
| 1.0097 | 0.6776 | 1.0121 | 0.6761 | 9.976634e-10 | 2163 |
| 1.0075 | 0.6776 | 1.0120 | 0.6761 | 9.976613e-10 | 2164 |
| 1.0094 | 0.6776 | 1.0120 | 0.6761 | 9.976592e-10 | 2165 |
| 1.0091 | 0.6776 | 1.0119 | 0.6761 | 9.976571e-10 | 2166 |
| 1.0055 | 0.6776 | 1.0119 | 0.6761 | 9.97655e-10 | 2167 |
| 1.0037 | 0.6776 | 1.0118 | 0.6761 | 9.976528e-10 | 2168 |
| 1.0038 | 0.6776 | 1.0117 | 0.6761 | 9.976507e-10 | 2169 |
| 1.0072 | 0.6776 | 1.0117 | 0.6761 | 9.976486e-10 | 2170 |
| 1.0075 | 0.6776 | 1.0116 | 0.6761 | 9.976464e-10 | 2171 |
| 1.0029 | 0.6776 | 1.0115 | 0.6761 | 9.976442e-10 | 2172 |
| 1.0082 | 0.6776 | 1.0115 | 0.6761 | 9.97642e-10 | 2173 |
| 1.0066 | 0.6776 | 1.0114 | 0.6761 | 9.976397e-10 | 2174 |
| 1.0119 | 0.6776 | 1.0114 | 0.6761 | 9.976375e-10 | 2175 |
| 1.0123 | 0.6776 | 1.0113 | 0.6761 | 9.976353e-10 | 2176 |
| 1.0089 | 0.6776 | 1.0112 | 0.6761 | 9.976331e-10 | 2177 |
| 1.0105 | 0.6776 | 1.0112 | 0.6761 | 9.976309e-10 | 2178 |
| 1.0026 | 0.6776 | 1.0111 | 0.6761 | 9.976286e-10 | 2179 |
| 1.0016 | 0.6776 | 1.0111 | 0.6761 | 9.976264e-10 | 2180 |
| 0.9997 | 0.6776 | 1.0110 | 0.6761 | 9.976242e-10 | 2181 |
| 1.0086 | 0.6776 | 1.0109 | 0.6761 | 9.97622e-10 | 2182 |
| 0.9995 | 0.6776 | 1.0109 | 0.6761 | 9.976198e-10 | 2183 |
| 1.0135 | 0.6776 | 1.0108 | 0.6761 | 9.976175e-10 | 2184 |
| 1.0070 | 0.6776 | 1.0107 | 0.6761 | 9.976153e-10 | 2185 |
| 0.9989 | 0.6776 | 1.0107 | 0.6761 | 9.976131e-10 | 2186 |
| 1.0019 | 0.6776 | 1.0106 | 0.6761 | 9.976109e-10 | 2187 |
| 1.0052 | 0.6776 | 1.0105 | 0.6761 | 9.976087e-10 | 2188 |
| 1.0058 | 0.6776 | 1.0105 | 0.6761 | 9.976064e-10 | 2189 |
| 1.0013 | 0.6776 | 1.0104 | 0.6761 | 9.976042e-10 | 2190 |
| 1.0087 | 0.6776 | 1.0104 | 0.6761 | 9.97602e-10 | 2191 |
| 1.0049 | 0.6776 | 1.0103 | 0.6761 | 9.975998e-10 | 2192 |
| 1.0089 | 0.6776 | 1.0102 | 0.6761 | 9.975976e-10 | 2193 |
| 1.0079 | 0.6776 | 1.0102 | 0.6761 | 9.975953e-10 | 2194 |
| 1.0009 | 0.6776 | 1.0101 | 0.6761 | 9.975931e-10 | 2195 |
| 0.9985 | 0.6776 | 1.0101 | 0.6761 | 9.975909e-10 | 2196 |
| 1.0059 | 0.6776 | 1.0100 | 0.6761 | 9.975887e-10 | 2197 |
| 1.0013 | 0.6776 | 1.0099 | 0.6761 | 9.975865e-10 | 2198 |
| 1.0028 | 0.6776 | 1.0099 | 0.6761 | 9.975842e-10 | 2199 |
| 0.9948 | 0.6776 | 1.0098 | 0.6761 | 9.97582e-10 | 2200 |
| 1.0003 | 0.6776 | 1.0098 | 0.6761 | 9.975798e-10 | 2201 |
| 1.0052 | 0.6776 | 1.0097 | 0.6761 | 9.975776e-10 | 2202 |
| 1.0028 | 0.6776 | 1.0096 | 0.6761 | 9.975754e-10 | 2203 |
| 1.0074 | 0.6776 | 1.0096 | 0.6761 | 9.975731e-10 | 2204 |
| 1.0032 | 0.6776 | 1.0095 | 0.6761 | 9.975709e-10 | 2205 |
| 1.0037 | 0.6776 | 1.0094 | 0.6761 | 9.975687e-10 | 2206 |
| 1.0116 | 0.6776 | 1.0094 | 0.6761 | 9.975665e-10 | 2207 |
| 1.0039 | 0.6776 | 1.0093 | 0.6761 | 9.975643e-10 | 2208 |
| 0.9952 | 0.6776 | 1.0092 | 0.6761 | 9.97562e-10 | 2209 |
| 1.0025 | 0.6776 | 1.0092 | 0.6761 | 9.975598e-10 | 2210 |
| 1.0042 | 0.6776 | 1.0091 | 0.6761 | 9.975576e-10 | 2211 |
| 0.9969 | 0.6776 | 1.0091 | 0.6761 | 9.975554e-10 | 2212 |
| 1.0018 | 0.6776 | 1.0090 | 0.6761 | 9.975532e-10 | 2213 |
| 0.9998 | 0.6776 | 1.0090 | 0.6761 | 9.975509e-10 | 2214 |
| 1.0006 | 0.6776 | 1.0089 | 0.6761 | 9.975487e-10 | 2215 |
| 1.0073 | 0.6776 | 1.0088 | 0.6761 | 9.975465e-10 | 2216 |
| 1.0034 | 0.6776 | 1.0088 | 0.6761 | 9.975443e-10 | 2217 |
| 0.9997 | 0.6776 | 1.0087 | 0.6761 | 9.97542e-10 | 2218 |
| 1.0005 | 0.6776 | 1.0087 | 0.6761 | 9.975398e-10 | 2219 |
| 1.0028 | 0.6776 | 1.0086 | 0.6761 | 9.975376e-10 | 2220 |
| 1.0079 | 0.6776 | 1.0085 | 0.6761 | 9.975354e-10 | 2221 |
| 1.0010 | 0.6776 | 1.0085 | 0.6761 | 9.975332e-10 | 2222 |
| 0.9997 | 0.6776 | 1.0084 | 0.6761 | 9.97531e-10 | 2223 |
| 1.0041 | 0.6776 | 1.0084 | 0.6761 | 9.975287e-10 | 2224 |
| 1.0010 | 0.6776 | 1.0083 | 0.6761 | 9.975265e-10 | 2225 |
| 1.0008 | 0.6776 | 1.0082 | 0.6761 | 9.975243e-10 | 2226 |
| 1.0048 | 0.6776 | 1.0082 | 0.6761 | 9.975221e-10 | 2227 |
| 1.0015 | 0.6776 | 1.0081 | 0.6761 | 9.975198e-10 | 2228 |
| 0.9948 | 0.6776 | 1.0081 | 0.6761 | 9.975176e-10 | 2229 |
| 1.0029 | 0.6776 | 1.0080 | 0.6761 | 9.975154e-10 | 2230 |
| 1.0018 | 0.6776 | 1.0080 | 0.6761 | 9.975132e-10 | 2231 |
| 1.0028 | 0.6776 | 1.0079 | 0.6761 | 9.97511e-10 | 2232 |
| 0.9981 | 0.6776 | 1.0078 | 0.6761 | 9.975087e-10 | 2233 |
| 1.0012 | 0.6776 | 1.0078 | 0.6761 | 9.975065e-10 | 2234 |
| 1.0047 | 0.6776 | 1.0077 | 0.6761 | 9.975043e-10 | 2235 |
| 1.0054 | 0.6776 | 1.0077 | 0.6761 | 9.975021e-10 | 2236 |
| 1.0012 | 0.6776 | 1.0076 | 0.6761 | 9.974999e-10 | 2237 |
| 1.0040 | 0.6776 | 1.0075 | 0.6761 | 9.974976e-10 | 2238 |
| 1.0034 | 0.6776 | 1.0075 | 0.6761 | 9.974954e-10 | 2239 |
| 1.0092 | 0.6776 | 1.0074 | 0.6761 | 9.974932e-10 | 2240 |
| 1.0079 | 0.6776 | 1.0074 | 0.6761 | 9.97491e-10 | 2241 |
| 1.0005 | 0.6776 | 1.0073 | 0.6761 | 9.974888e-10 | 2242 |
| 1.0011 | 0.6776 | 1.0073 | 0.6761 | 9.974865e-10 | 2243 |
| 1.0014 | 0.6776 | 1.0072 | 0.6761 | 9.974843e-10 | 2244 |
| 1.0006 | 0.6776 | 1.0071 | 0.6761 | 9.974821e-10 | 2245 |
| 0.9951 | 0.6776 | 1.0071 | 0.6761 | 9.974799e-10 | 2246 |
| 0.9989 | 0.6776 | 1.0070 | 0.6761 | 9.974777e-10 | 2247 |
| 0.9991 | 0.6776 | 1.0070 | 0.6761 | 9.974754e-10 | 2248 |
| 0.9974 | 0.6776 | 1.0069 | 0.6761 | 9.974732e-10 | 2249 |
| 1.0060 | 0.6776 | 1.0069 | 0.6761 | 9.97471e-10 | 2250 |
| 1.0053 | 0.6776 | 1.0068 | 0.6761 | 9.974688e-10 | 2251 |
| 1.0004 | 0.6776 | 1.0067 | 0.6761 | 9.974666e-10 | 2252 |
| 0.9997 | 0.6776 | 1.0067 | 0.6761 | 9.974643e-10 | 2253 |
| 0.9950 | 0.6776 | 1.0066 | 0.6761 | 9.974621e-10 | 2254 |
| 1.0064 | 0.6776 | 1.0066 | 0.6761 | 9.974599e-10 | 2255 |
| 1.0019 | 0.6776 | 1.0065 | 0.6761 | 9.974577e-10 | 2256 |
| 1.0090 | 0.6776 | 1.0065 | 0.6761 | 9.974555e-10 | 2257 |
| 1.0009 | 0.6776 | 1.0064 | 0.6761 | 9.974532e-10 | 2258 |
| 0.9976 | 0.6776 | 1.0063 | 0.6761 | 9.97451e-10 | 2259 |
| 1.0029 | 0.6776 | 1.0063 | 0.6761 | 9.974488e-10 | 2260 |
| 0.9938 | 0.6776 | 1.0062 | 0.6761 | 9.974466e-10 | 2261 |
| 0.9986 | 0.6776 | 1.0062 | 0.6761 | 9.974443e-10 | 2262 |
| 1.0055 | 0.6776 | 1.0061 | 0.6761 | 9.974421e-10 | 2263 |
| 1.0037 | 0.6776 | 1.0061 | 0.6761 | 9.974399e-10 | 2264 |
| 1.0078 | 0.6776 | 1.0060 | 0.6761 | 9.974377e-10 | 2265 |
| 0.9965 | 0.6776 | 1.0060 | 0.6761 | 9.974355e-10 | 2266 |
| 1.0037 | 0.6776 | 1.0059 | 0.6761 | 9.974332e-10 | 2267 |
| 1.0081 | 0.6776 | 1.0059 | 0.6761 | 9.97431e-10 | 2268 |
| 0.9967 | 0.6776 | 1.0058 | 0.6761 | 9.974288e-10 | 2269 |
| 1.0092 | 0.6776 | 1.0058 | 0.6761 | 9.974266e-10 | 2270 |
| 1.0011 | 0.6776 | 1.0057 | 0.6761 | 9.974244e-10 | 2271 |
| 1.0051 | 0.6776 | 1.0056 | 0.6761 | 9.974221e-10 | 2272 |
| 1.0005 | 0.6776 | 1.0056 | 0.6761 | 9.974199e-10 | 2273 |
| 0.9930 | 0.6776 | 1.0055 | 0.6761 | 9.974177e-10 | 2274 |
| 0.9996 | 0.6776 | 1.0055 | 0.6761 | 9.974155e-10 | 2275 |
| 0.9984 | 0.6776 | 1.0054 | 0.6761 | 9.974133e-10 | 2276 |
| 1.0024 | 0.6776 | 1.0054 | 0.6761 | 9.97411e-10 | 2277 |
| 0.9949 | 0.6776 | 1.0053 | 0.6761 | 9.974088e-10 | 2278 |
| 0.9994 | 0.6776 | 1.0052 | 0.6761 | 9.974066e-10 | 2279 |
| 0.9998 | 0.6776 | 1.0052 | 0.6761 | 9.974044e-10 | 2280 |
| 1.0032 | 0.6776 | 1.0051 | 0.6761 | 9.974022e-10 | 2281 |
| 1.0050 | 0.6776 | 1.0051 | 0.6761 | 9.973998e-10 | 2282 |
| 0.9969 | 0.6776 | 1.0050 | 0.6761 | 9.973975e-10 | 2283 |
| 1.0039 | 0.6776 | 1.0050 | 0.6761 | 9.973952e-10 | 2284 |
| 1.0016 | 0.6776 | 1.0049 | 0.6761 | 9.973928e-10 | 2285 |
| 0.9998 | 0.6776 | 1.0049 | 0.6761 | 9.973905e-10 | 2286 |
| 1.0016 | 0.6776 | 1.0048 | 0.6761 | 9.973882e-10 | 2287 |
| 0.9957 | 0.6776 | 1.0048 | 0.6761 | 9.973858e-10 | 2288 |
| 0.9936 | 0.6776 | 1.0047 | 0.6761 | 9.973835e-10 | 2289 |
| 0.9987 | 0.6776 | 1.0047 | 0.6761 | 9.973812e-10 | 2290 |
| 0.9972 | 0.6776 | 1.0046 | 0.6761 | 9.973788e-10 | 2291 |
| 0.9978 | 0.6776 | 1.0046 | 0.6761 | 9.973765e-10 | 2292 |
| 1.0054 | 0.6776 | 1.0045 | 0.6761 | 9.973742e-10 | 2293 |
| 1.0038 | 0.6776 | 1.0044 | 0.6761 | 9.973719e-10 | 2294 |
| 0.9963 | 0.6776 | 1.0044 | 0.6761 | 9.973695e-10 | 2295 |
| 1.0030 | 0.6776 | 1.0043 | 0.6761 | 9.973672e-10 | 2296 |
| 1.0049 | 0.6776 | 1.0043 | 0.6761 | 9.973649e-10 | 2297 |
| 1.0039 | 0.6776 | 1.0042 | 0.6761 | 9.973625e-10 | 2298 |
| 1.0001 | 0.6776 | 1.0042 | 0.6761 | 9.973602e-10 | 2299 |
| 1.0046 | 0.6776 | 1.0041 | 0.6761 | 9.973579e-10 | 2300 |
| 1.0027 | 0.6776 | 1.0041 | 0.6761 | 9.973555e-10 | 2301 |
| 0.9993 | 0.6776 | 1.0040 | 0.6761 | 9.973532e-10 | 2302 |
| 1.0016 | 0.6776 | 1.0040 | 0.6761 | 9.973509e-10 | 2303 |
| 0.9969 | 0.6776 | 1.0039 | 0.6761 | 9.973485e-10 | 2304 |
| 1.0023 | 0.6776 | 1.0038 | 0.6761 | 9.973462e-10 | 2305 |
| 1.0015 | 0.6776 | 1.0038 | 0.6761 | 9.973439e-10 | 2306 |
| 0.9924 | 0.6776 | 1.0037 | 0.6761 | 9.973415e-10 | 2307 |
| 1.0025 | 0.6776 | 1.0037 | 0.6761 | 9.973392e-10 | 2308 |
| 0.9972 | 0.6776 | 1.0036 | 0.6761 | 9.973369e-10 | 2309 |
| 0.9933 | 0.6776 | 1.0036 | 0.6761 | 9.973345e-10 | 2310 |
| 0.9949 | 0.6776 | 1.0035 | 0.6761 | 9.973322e-10 | 2311 |
| 1.0023 | 0.6776 | 1.0035 | 0.6761 | 9.973299e-10 | 2312 |
| 0.9961 | 0.6776 | 1.0034 | 0.6761 | 9.973276e-10 | 2313 |
| 0.9957 | 0.6776 | 1.0034 | 0.6761 | 9.973252e-10 | 2314 |
| 1.0023 | 0.6776 | 1.0033 | 0.6761 | 9.973229e-10 | 2315 |
| 0.9957 | 0.6776 | 1.0033 | 0.6761 | 9.973206e-10 | 2316 |
| 1.0004 | 0.6776 | 1.0032 | 0.6761 | 9.973182e-10 | 2317 |
| 0.9928 | 0.6776 | 1.0031 | 0.6761 | 9.973159e-10 | 2318 |
| 0.9987 | 0.6776 | 1.0031 | 0.6761 | 9.973136e-10 | 2319 |
| 1.0032 | 0.6776 | 1.0031 | 0.6761 | 9.973112e-10 | 2320 |
| 0.9993 | 0.6776 | 1.0030 | 0.6761 | 9.973089e-10 | 2321 |
| 1.0041 | 0.6776 | 1.0029 | 0.6761 | 9.973066e-10 | 2322 |
| 0.9930 | 0.6776 | 1.0029 | 0.6761 | 9.973042e-10 | 2323 |
| 0.9968 | 0.6776 | 1.0028 | 0.6761 | 9.973019e-10 | 2324 |
| 1.0037 | 0.6776 | 1.0028 | 0.6761 | 9.972996e-10 | 2325 |
| 1.0009 | 0.6776 | 1.0027 | 0.6761 | 9.972972e-10 | 2326 |
| 1.0020 | 0.6776 | 1.0027 | 0.6761 | 9.972949e-10 | 2327 |
| 0.9954 | 0.6776 | 1.0026 | 0.6761 | 9.972926e-10 | 2328 |
| 0.9942 | 0.6776 | 1.0026 | 0.6761 | 9.972903e-10 | 2329 |
| 0.9967 | 0.6776 | 1.0025 | 0.6761 | 9.972879e-10 | 2330 |
| 1.0002 | 0.6776 | 1.0025 | 0.6761 | 9.972856e-10 | 2331 |
| 0.9962 | 0.6776 | 1.0024 | 0.6761 | 9.972833e-10 | 2332 |
| 1.0007 | 0.6776 | 1.0024 | 0.6761 | 9.972809e-10 | 2333 |
| 1.0004 | 0.6776 | 1.0023 | 0.6761 | 9.972786e-10 | 2334 |
| 0.9976 | 0.6776 | 1.0023 | 0.6761 | 9.972763e-10 | 2335 |
| 0.9934 | 0.6776 | 1.0022 | 0.6761 | 9.972739e-10 | 2336 |
| 1.0000 | 0.6776 | 1.0022 | 0.6761 | 9.972716e-10 | 2337 |
| 0.9985 | 0.6776 | 1.0021 | 0.6761 | 9.972693e-10 | 2338 |
| 0.9958 | 0.6776 | 1.0021 | 0.6761 | 9.972669e-10 | 2339 |
| 0.9944 | 0.6776 | 1.0020 | 0.6761 | 9.972646e-10 | 2340 |
| 1.0025 | 0.6776 | 1.0020 | 0.6761 | 9.972623e-10 | 2341 |
| 0.9952 | 0.6776 | 1.0019 | 0.6761 | 9.972599e-10 | 2342 |
| 0.9980 | 0.6776 | 1.0019 | 0.6761 | 9.972576e-10 | 2343 |
| 1.0006 | 0.6776 | 1.0018 | 0.6761 | 9.972553e-10 | 2344 |
| 0.9976 | 0.6776 | 1.0017 | 0.6761 | 9.97253e-10 | 2345 |
| 0.9981 | 0.6776 | 1.0017 | 0.6761 | 9.972506e-10 | 2346 |
| 0.9984 | 0.6776 | 1.0016 | 0.6761 | 9.972483e-10 | 2347 |
| 1.0000 | 0.6776 | 1.0016 | 0.6761 | 9.97246e-10 | 2348 |
| 0.9927 | 0.6776 | 1.0015 | 0.6761 | 9.972436e-10 | 2349 |
| 0.9968 | 0.6776 | 1.0015 | 0.6761 | 9.972413e-10 | 2350 |
| 1.0024 | 0.6776 | 1.0014 | 0.6761 | 9.97239e-10 | 2351 |
| 1.0000 | 0.6776 | 1.0014 | 0.6761 | 9.972366e-10 | 2352 |
| 0.9983 | 0.6776 | 1.0013 | 0.6761 | 9.972343e-10 | 2353 |
| 0.9918 | 0.6776 | 1.0013 | 0.6761 | 9.97232e-10 | 2354 |
| 1.0018 | 0.6776 | 1.0013 | 0.6761 | 9.972296e-10 | 2355 |
| 0.9936 | 0.6776 | 1.0012 | 0.6761 | 9.972273e-10 | 2356 |
| 0.9929 | 0.6776 | 1.0012 | 0.6761 | 9.97225e-10 | 2357 |
| 0.9844 | 0.6776 | 1.0011 | 0.6761 | 9.972226e-10 | 2358 |
| 0.9949 | 0.6776 | 1.0011 | 0.6761 | 9.972203e-10 | 2359 |
| 0.9946 | 0.6776 | 1.0010 | 0.6761 | 9.97218e-10 | 2360 |
| 0.9923 | 0.6776 | 1.0010 | 0.6761 | 9.972156e-10 | 2361 |
| 0.9977 | 0.6776 | 1.0009 | 0.6761 | 9.972133e-10 | 2362 |
| 0.9940 | 0.6776 | 1.0008 | 0.6761 | 9.97211e-10 | 2363 |
| 0.9944 | 0.6776 | 1.0008 | 0.6761 | 9.972086e-10 | 2364 |
| 0.9896 | 0.6776 | 1.0008 | 0.6761 | 9.972063e-10 | 2365 |
| 1.0001 | 0.6776 | 1.0007 | 0.6761 | 9.97204e-10 | 2366 |
| 0.9971 | 0.6776 | 1.0007 | 0.6761 | 9.972017e-10 | 2367 |
| 0.9951 | 0.6776 | 1.0006 | 0.6761 | 9.971993e-10 | 2368 |
| 0.9981 | 0.6776 | 1.0006 | 0.6761 | 9.97197e-10 | 2369 |
| 0.9957 | 0.6776 | 1.0005 | 0.6761 | 9.971947e-10 | 2370 |
| 0.9941 | 0.6776 | 1.0005 | 0.6761 | 9.971923e-10 | 2371 |
| 0.9914 | 0.6776 | 1.0004 | 0.6761 | 9.9719e-10 | 2372 |
| 0.9886 | 0.6776 | 1.0004 | 0.6761 | 9.971877e-10 | 2373 |
| 0.9932 | 0.6776 | 1.0003 | 0.6761 | 9.971853e-10 | 2374 |
| 0.9985 | 0.6776 | 1.0003 | 0.6761 | 9.97183e-10 | 2375 |
| 0.9981 | 0.6776 | 1.0002 | 0.6761 | 9.971807e-10 | 2376 |
| 0.9952 | 0.6776 | 1.0002 | 0.6761 | 9.971783e-10 | 2377 |
| 0.9912 | 0.6776 | 1.0001 | 0.6761 | 9.97176e-10 | 2378 |
| 0.9949 | 0.6776 | 1.0001 | 0.6761 | 9.971737e-10 | 2379 |
| 0.9977 | 0.6776 | 1.0000 | 0.6761 | 9.971713e-10 | 2380 |
| 0.9991 | 0.6776 | 1.0000 | 0.6761 | 9.97169e-10 | 2381 |
| 0.9926 | 0.6776 | 0.9999 | 0.6761 | 9.971667e-10 | 2382 |
| 1.0038 | 0.6776 | 0.9999 | 0.6761 | 9.971644e-10 | 2383 |
| 0.9890 | 0.6776 | 0.9998 | 0.6761 | 9.97162e-10 | 2384 |
| 0.9982 | 0.6776 | 0.9998 | 0.6761 | 9.971597e-10 | 2385 |
| 0.9939 | 0.6776 | 0.9997 | 0.6761 | 9.971574e-10 | 2386 |
| 0.9953 | 0.6776 | 0.9997 | 0.6761 | 9.97155e-10 | 2387 |
| 1.0034 | 0.6776 | 0.9996 | 0.6761 | 9.971527e-10 | 2388 |
| 0.9985 | 0.6776 | 0.9996 | 0.6761 | 9.971504e-10 | 2389 |
| 0.9948 | 0.6776 | 0.9995 | 0.6761 | 9.97148e-10 | 2390 |
| 0.9911 | 0.6776 | 0.9995 | 0.6761 | 9.971457e-10 | 2391 |
| 0.9910 | 0.6776 | 0.9994 | 0.6761 | 9.971434e-10 | 2392 |
| 0.9845 | 0.6776 | 0.9994 | 0.6761 | 9.97141e-10 | 2393 |
| 0.9979 | 0.6776 | 0.9994 | 0.6761 | 9.971386e-10 | 2394 |
| 0.9927 | 0.6776 | 0.9993 | 0.6761 | 9.971362e-10 | 2395 |
| 0.9972 | 0.6776 | 0.9993 | 0.6761 | 9.971337e-10 | 2396 |
| 0.9945 | 0.6776 | 0.9992 | 0.6761 | 9.971313e-10 | 2397 |
| 0.9961 | 0.6776 | 0.9992 | 0.6761 | 9.971288e-10 | 2398 |
| 0.9938 | 0.6776 | 0.9991 | 0.6761 | 9.971264e-10 | 2399 |
| 0.9990 | 0.6776 | 0.9991 | 0.6761 | 9.971239e-10 | 2400 |
| 0.9935 | 0.6776 | 0.9990 | 0.6761 | 9.971215e-10 | 2401 |
| 0.9903 | 0.6776 | 0.9990 | 0.6761 | 9.97119e-10 | 2402 |
| 0.9894 | 0.6776 | 0.9989 | 0.6761 | 9.971166e-10 | 2403 |
| 0.9901 | 0.6776 | 0.9989 | 0.6761 | 9.971142e-10 | 2404 |
| 0.9945 | 0.6776 | 0.9988 | 0.6761 | 9.971117e-10 | 2405 |
| 0.9818 | 0.6776 | 0.9988 | 0.6761 | 9.971093e-10 | 2406 |
| 0.9883 | 0.6776 | 0.9987 | 0.6761 | 9.971068e-10 | 2407 |
| 0.9922 | 0.6776 | 0.9987 | 0.6761 | 9.971044e-10 | 2408 |
| 0.9896 | 0.6776 | 0.9986 | 0.6761 | 9.97102e-10 | 2409 |
| 0.9962 | 0.6776 | 0.9986 | 0.6761 | 9.970995e-10 | 2410 |
| 0.9865 | 0.6776 | 0.9985 | 0.6761 | 9.970971e-10 | 2411 |
| 0.9937 | 0.6776 | 0.9985 | 0.6761 | 9.970946e-10 | 2412 |
| 0.9911 | 0.6776 | 0.9984 | 0.6761 | 9.970922e-10 | 2413 |
| 0.9911 | 0.6776 | 0.9984 | 0.6761 | 9.970897e-10 | 2414 |
| 0.9944 | 0.6776 | 0.9983 | 0.6761 | 9.970873e-10 | 2415 |
| 0.9876 | 0.6776 | 0.9983 | 0.6761 | 9.970849e-10 | 2416 |
| 0.9932 | 0.6776 | 0.9982 | 0.6761 | 9.970824e-10 | 2417 |
| 0.9952 | 0.6776 | 0.9982 | 0.6761 | 9.9708e-10 | 2418 |
| 0.9900 | 0.6776 | 0.9982 | 0.6761 | 9.970775e-10 | 2419 |
| 0.9934 | 0.6776 | 0.9981 | 0.6761 | 9.970751e-10 | 2420 |
| 0.9943 | 0.6776 | 0.9981 | 0.6761 | 9.970726e-10 | 2421 |
| 0.9878 | 0.6776 | 0.9980 | 0.6761 | 9.970702e-10 | 2422 |
| 0.9949 | 0.6776 | 0.9980 | 0.6761 | 9.970678e-10 | 2423 |
| 0.9895 | 0.6776 | 0.9979 | 0.6761 | 9.970653e-10 | 2424 |
| 0.9931 | 0.6776 | 0.9979 | 0.6761 | 9.970629e-10 | 2425 |
| 0.9870 | 0.6776 | 0.9978 | 0.6761 | 9.970604e-10 | 2426 |
| 0.9921 | 0.6776 | 0.9978 | 0.6761 | 9.97058e-10 | 2427 |
| 0.9857 | 0.6776 | 0.9978 | 0.6761 | 9.970555e-10 | 2428 |
| 0.9924 | 0.6776 | 0.9977 | 0.6761 | 9.970531e-10 | 2429 |
| 0.9920 | 0.6776 | 0.9977 | 0.6761 | 9.970507e-10 | 2430 |
| 0.9936 | 0.6776 | 0.9976 | 0.6761 | 9.970482e-10 | 2431 |
| 0.9936 | 0.6776 | 0.9976 | 0.6761 | 9.970458e-10 | 2432 |
| 0.9898 | 0.6776 | 0.9975 | 0.6761 | 9.970433e-10 | 2433 |
| 0.9902 | 0.6776 | 0.9975 | 0.6761 | 9.970409e-10 | 2434 |
| 0.9905 | 0.6776 | 0.9974 | 0.6761 | 9.970385e-10 | 2435 |
| 0.9903 | 0.6776 | 0.9974 | 0.6761 | 9.97036e-10 | 2436 |
| 0.9945 | 0.6776 | 0.9973 | 0.6761 | 9.970336e-10 | 2437 |
| 0.9844 | 0.6776 | 0.9973 | 0.6761 | 9.970311e-10 | 2438 |
| 0.9929 | 0.6776 | 0.9972 | 0.6761 | 9.970287e-10 | 2439 |
| 0.9900 | 0.6776 | 0.9972 | 0.6761 | 9.970262e-10 | 2440 |
| 0.9870 | 0.6776 | 0.9972 | 0.6761 | 9.970238e-10 | 2441 |
| 0.9875 | 0.6776 | 0.9971 | 0.6761 | 9.970214e-10 | 2442 |
| 0.9903 | 0.6776 | 0.9971 | 0.6761 | 9.970189e-10 | 2443 |
| 0.9942 | 0.6776 | 0.9970 | 0.6761 | 9.970165e-10 | 2444 |
| 0.9963 | 0.6776 | 0.9970 | 0.6761 | 9.97014e-10 | 2445 |
| 0.9859 | 0.6776 | 0.9969 | 0.6761 | 9.970116e-10 | 2446 |
| 0.9920 | 0.6776 | 0.9969 | 0.6761 | 9.970091e-10 | 2447 |
| 0.9934 | 0.6776 | 0.9969 | 0.6761 | 9.970067e-10 | 2448 |
| 0.9901 | 0.6776 | 0.9968 | 0.6761 | 9.970043e-10 | 2449 |
| 1.0000 | 0.6776 | 0.9968 | 0.6761 | 9.970018e-10 | 2450 |
| 0.9920 | 0.6776 | 0.9967 | 0.6761 | 9.969994e-10 | 2451 |
| 0.9996 | 0.6776 | 0.9967 | 0.6761 | 9.969969e-10 | 2452 |
| 0.9921 | 0.6776 | 0.9966 | 0.6761 | 9.969945e-10 | 2453 |
| 0.9848 | 0.6776 | 0.9966 | 0.6761 | 9.96992e-10 | 2454 |
| 0.9893 | 0.6776 | 0.9965 | 0.6761 | 9.969896e-10 | 2455 |
| 0.9897 | 0.6776 | 0.9965 | 0.6761 | 9.969872e-10 | 2456 |
| 0.9864 | 0.6776 | 0.9965 | 0.6761 | 9.969847e-10 | 2457 |
| 0.9904 | 0.6776 | 0.9964 | 0.6761 | 9.969823e-10 | 2458 |
| 0.9886 | 0.6776 | 0.9964 | 0.6761 | 9.969798e-10 | 2459 |
| 0.9861 | 0.6776 | 0.9963 | 0.6761 | 9.969774e-10 | 2460 |
| 0.9882 | 0.6776 | 0.9963 | 0.6761 | 9.96975e-10 | 2461 |
| 0.9869 | 0.6776 | 0.9962 | 0.6761 | 9.969725e-10 | 2462 |
| 0.9831 | 0.6776 | 0.9962 | 0.6761 | 9.969701e-10 | 2463 |
| 0.9948 | 0.6776 | 0.9962 | 0.6761 | 9.969676e-10 | 2464 |
| 0.9870 | 0.6776 | 0.9961 | 0.6761 | 9.969652e-10 | 2465 |
| 0.9945 | 0.6776 | 0.9961 | 0.6761 | 9.969627e-10 | 2466 |
| 0.9927 | 0.6776 | 0.9960 | 0.6761 | 9.969603e-10 | 2467 |
| 0.9907 | 0.6776 | 0.9960 | 0.6761 | 9.969578e-10 | 2468 |
| 0.9950 | 0.6776 | 0.9959 | 0.6761 | 9.969554e-10 | 2469 |
| 0.9804 | 0.6776 | 0.9959 | 0.6761 | 9.96953e-10 | 2470 |
| 0.9965 | 0.6776 | 0.9958 | 0.6761 | 9.969505e-10 | 2471 |
| 0.9935 | 0.6776 | 0.9958 | 0.6761 | 9.969481e-10 | 2472 |
| 0.9918 | 0.6776 | 0.9958 | 0.6761 | 9.969456e-10 | 2473 |
| 0.9897 | 0.6776 | 0.9957 | 0.6761 | 9.969432e-10 | 2474 |
| 0.9877 | 0.6776 | 0.9957 | 0.6761 | 9.969408e-10 | 2475 |
| 0.9856 | 0.6776 | 0.9956 | 0.6761 | 9.969383e-10 | 2476 |
| 0.9875 | 0.6776 | 0.9956 | 0.6761 | 9.969359e-10 | 2477 |
| 0.9855 | 0.6776 | 0.9955 | 0.6761 | 9.969334e-10 | 2478 |
| 0.9848 | 0.6776 | 0.9955 | 0.6761 | 9.96931e-10 | 2479 |
| 0.9864 | 0.6776 | 0.9955 | 0.6761 | 9.969285e-10 | 2480 |
| 0.9901 | 0.6776 | 0.9954 | 0.6761 | 9.969261e-10 | 2481 |
| 0.9880 | 0.6776 | 0.9954 | 0.6761 | 9.969237e-10 | 2482 |
| 0.9890 | 0.6776 | 0.9953 | 0.6761 | 9.969212e-10 | 2483 |
| 0.9878 | 0.6776 | 0.9953 | 0.6761 | 9.969188e-10 | 2484 |
| 0.9990 | 0.6776 | 0.9952 | 0.6761 | 9.969163e-10 | 2485 |
| 0.9858 | 0.6776 | 0.9952 | 0.6761 | 9.969139e-10 | 2486 |
| 0.9834 | 0.6776 | 0.9952 | 0.6761 | 9.969114e-10 | 2487 |
| 0.9870 | 0.6776 | 0.9951 | 0.6761 | 9.96909e-10 | 2488 |
| 0.9923 | 0.6776 | 0.9951 | 0.6761 | 9.969066e-10 | 2489 |
| 0.9830 | 0.6776 | 0.9950 | 0.6761 | 9.969041e-10 | 2490 |
| 0.9903 | 0.6776 | 0.9950 | 0.6761 | 9.969017e-10 | 2491 |
| 0.9922 | 0.6776 | 0.9949 | 0.6761 | 9.968992e-10 | 2492 |
| 0.9858 | 0.6776 | 0.9949 | 0.6761 | 9.968968e-10 | 2493 |
| 0.9820 | 0.6776 | 0.9949 | 0.6761 | 9.968943e-10 | 2494 |
| 0.9921 | 0.6776 | 0.9948 | 0.6761 | 9.968919e-10 | 2495 |
| 0.9939 | 0.6776 | 0.9948 | 0.6761 | 9.968895e-10 | 2496 |
| 0.9864 | 0.6776 | 0.9947 | 0.6761 | 9.96887e-10 | 2497 |
| 0.9904 | 0.6776 | 0.9947 | 0.6761 | 9.968846e-10 | 2498 |
| 0.9948 | 0.6776 | 0.9947 | 0.6761 | 9.968821e-10 | 2499 |
| 0.9923 | 0.6776 | 0.9946 | 0.6761 | 9.968797e-10 | 2500 |
| 0.9845 | 0.6776 | 0.9946 | 0.6761 | 9.968772e-10 | 2501 |
| 0.9906 | 0.6776 | 0.9945 | 0.6761 | 9.968748e-10 | 2502 |
| 0.9835 | 0.6776 | 0.9945 | 0.6761 | 9.968724e-10 | 2503 |
| 0.9908 | 0.6776 | 0.9945 | 0.6761 | 9.968699e-10 | 2504 |
| 0.9862 | 0.6776 | 0.9944 | 0.6761 | 9.968675e-10 | 2505 |
| 0.9852 | 0.6776 | 0.9944 | 0.6761 | 9.968649e-10 | 2506 |
| 0.9889 | 0.6776 | 0.9943 | 0.6761 | 9.968624e-10 | 2507 |
| 0.9902 | 0.6776 | 0.9943 | 0.6761 | 9.968598e-10 | 2508 |
| 0.9948 | 0.6776 | 0.9942 | 0.6761 | 9.968573e-10 | 2509 |
| 0.9916 | 0.6776 | 0.9942 | 0.6761 | 9.968547e-10 | 2510 |
| 0.9904 | 0.6776 | 0.9942 | 0.6761 | 9.968522e-10 | 2511 |
| 0.9879 | 0.6776 | 0.9941 | 0.6761 | 9.968496e-10 | 2512 |
| 0.9857 | 0.6776 | 0.9941 | 0.6761 | 9.96847e-10 | 2513 |
| 0.9893 | 0.6776 | 0.9940 | 0.6761 | 9.968445e-10 | 2514 |
| 0.9796 | 0.6776 | 0.9940 | 0.6761 | 9.968419e-10 | 2515 |
| 0.9883 | 0.6776 | 0.9939 | 0.6761 | 9.968394e-10 | 2516 |
| 0.9886 | 0.6776 | 0.9939 | 0.6761 | 9.968368e-10 | 2517 |
| 0.9895 | 0.6776 | 0.9939 | 0.6761 | 9.968343e-10 | 2518 |
| 0.9895 | 0.6776 | 0.9938 | 0.6761 | 9.968317e-10 | 2519 |
| 0.9859 | 0.6776 | 0.9938 | 0.6761 | 9.968292e-10 | 2520 |
| 0.9881 | 0.6776 | 0.9937 | 0.6761 | 9.968266e-10 | 2521 |
| 0.9845 | 0.6776 | 0.9937 | 0.6761 | 9.968241e-10 | 2522 |
| 0.9816 | 0.6776 | 0.9937 | 0.6761 | 9.968215e-10 | 2523 |
| 0.9914 | 0.6776 | 0.9936 | 0.6761 | 9.96819e-10 | 2524 |
| 0.9892 | 0.6776 | 0.9936 | 0.6761 | 9.968164e-10 | 2525 |
| 0.9866 | 0.6776 | 0.9935 | 0.6761 | 9.968139e-10 | 2526 |
| 0.9852 | 0.6776 | 0.9935 | 0.6761 | 9.968113e-10 | 2527 |
| 0.9897 | 0.6776 | 0.9935 | 0.6761 | 9.968087e-10 | 2528 |
| 0.9900 | 0.6776 | 0.9934 | 0.6761 | 9.968062e-10 | 2529 |
| 0.9784 | 0.6776 | 0.9934 | 0.6761 | 9.968036e-10 | 2530 |
| 0.9916 | 0.6776 | 0.9933 | 0.6761 | 9.968011e-10 | 2531 |
| 0.9834 | 0.6776 | 0.9933 | 0.6761 | 9.967985e-10 | 2532 |
| 0.9922 | 0.6776 | 0.9933 | 0.6761 | 9.96796e-10 | 2533 |
| 0.9876 | 0.6776 | 0.9932 | 0.6761 | 9.967934e-10 | 2534 |
| 0.9738 | 0.6776 | 0.9932 | 0.6761 | 9.967909e-10 | 2535 |
| 0.9876 | 0.6776 | 0.9931 | 0.6761 | 9.967883e-10 | 2536 |
| 0.9851 | 0.6776 | 0.9931 | 0.6761 | 9.967858e-10 | 2537 |
| 0.9835 | 0.6776 | 0.9931 | 0.6761 | 9.967832e-10 | 2538 |
| 0.9830 | 0.6776 | 0.9930 | 0.6761 | 9.967807e-10 | 2539 |
| 0.9924 | 0.6776 | 0.9930 | 0.6761 | 9.967781e-10 | 2540 |
| 0.9869 | 0.6776 | 0.9929 | 0.6761 | 9.967756e-10 | 2541 |
| 0.9833 | 0.6776 | 0.9929 | 0.6761 | 9.96773e-10 | 2542 |
| 0.9892 | 0.6776 | 0.9929 | 0.6761 | 9.967704e-10 | 2543 |
| 0.9840 | 0.6776 | 0.9928 | 0.6761 | 9.967679e-10 | 2544 |
| 0.9811 | 0.6776 | 0.9928 | 0.6761 | 9.967653e-10 | 2545 |
| 0.9834 | 0.6776 | 0.9927 | 0.6761 | 9.967628e-10 | 2546 |
| 0.9887 | 0.6776 | 0.9927 | 0.6761 | 9.967602e-10 | 2547 |
| 0.9835 | 0.6776 | 0.9927 | 0.6761 | 9.967577e-10 | 2548 |
| 0.9890 | 0.6776 | 0.9926 | 0.6761 | 9.967551e-10 | 2549 |
| 0.9876 | 0.6776 | 0.9926 | 0.6761 | 9.967526e-10 | 2550 |
| 0.9887 | 0.6776 | 0.9926 | 0.6761 | 9.9675e-10 | 2551 |
| 0.9840 | 0.6776 | 0.9925 | 0.6761 | 9.967475e-10 | 2552 |
| 0.9866 | 0.6776 | 0.9925 | 0.6761 | 9.967449e-10 | 2553 |
| 0.9825 | 0.6776 | 0.9924 | 0.6761 | 9.967424e-10 | 2554 |
| 0.9843 | 0.6776 | 0.9924 | 0.6761 | 9.967398e-10 | 2555 |
| 0.9868 | 0.6776 | 0.9924 | 0.6761 | 9.967372e-10 | 2556 |
| 0.9806 | 0.6776 | 0.9923 | 0.6761 | 9.967347e-10 | 2557 |
| 0.9826 | 0.6776 | 0.9923 | 0.6761 | 9.967321e-10 | 2558 |
| 0.9844 | 0.6776 | 0.9922 | 0.6761 | 9.967296e-10 | 2559 |
| 0.9875 | 0.6776 | 0.9922 | 0.6761 | 9.96727e-10 | 2560 |
| 0.9919 | 0.6776 | 0.9922 | 0.6761 | 9.967245e-10 | 2561 |
| 0.9807 | 0.6776 | 0.9921 | 0.6761 | 9.967219e-10 | 2562 |
| 0.9825 | 0.6776 | 0.9921 | 0.6761 | 9.967194e-10 | 2563 |
| 0.9834 | 0.6776 | 0.9920 | 0.6761 | 9.967168e-10 | 2564 |
| 0.9847 | 0.6776 | 0.9920 | 0.6761 | 9.967143e-10 | 2565 |
| 0.9813 | 0.6776 | 0.9920 | 0.6761 | 9.967117e-10 | 2566 |
| 0.9857 | 0.6776 | 0.9919 | 0.6761 | 9.967092e-10 | 2567 |
| 0.9935 | 0.6776 | 0.9919 | 0.6761 | 9.967066e-10 | 2568 |
| 0.9852 | 0.6776 | 0.9919 | 0.6761 | 9.96704e-10 | 2569 |
| 0.9858 | 0.6776 | 0.9918 | 0.6761 | 9.967015e-10 | 2570 |
| 0.9805 | 0.6776 | 0.9918 | 0.6761 | 9.96699e-10 | 2571 |
| 0.9915 | 0.6776 | 0.9917 | 0.6761 | 9.966964e-10 | 2572 |
| 0.9866 | 0.6776 | 0.9917 | 0.6761 | 9.966938e-10 | 2573 |
| 0.9779 | 0.6776 | 0.9917 | 0.6761 | 9.966913e-10 | 2574 |
| 0.9810 | 0.6776 | 0.9916 | 0.6761 | 9.966887e-10 | 2575 |
| 0.9808 | 0.6776 | 0.9916 | 0.6761 | 9.966862e-10 | 2576 |
| 0.9861 | 0.6776 | 0.9916 | 0.6761 | 9.966836e-10 | 2577 |
| 0.9824 | 0.6776 | 0.9915 | 0.6761 | 9.966811e-10 | 2578 |
| 0.9888 | 0.6776 | 0.9915 | 0.6761 | 9.966785e-10 | 2579 |
| 0.9859 | 0.6776 | 0.9915 | 0.6761 | 9.96676e-10 | 2580 |
| 0.9879 | 0.6776 | 0.9914 | 0.6761 | 9.966734e-10 | 2581 |
| 0.9863 | 0.6776 | 0.9914 | 0.6761 | 9.966709e-10 | 2582 |
| 0.9879 | 0.6776 | 0.9913 | 0.6761 | 9.966683e-10 | 2583 |
| 0.9879 | 0.6776 | 0.9913 | 0.6761 | 9.966657e-10 | 2584 |
| 0.9851 | 0.6776 | 0.9913 | 0.6761 | 9.966632e-10 | 2585 |
| 0.9812 | 0.6776 | 0.9912 | 0.6761 | 9.966606e-10 | 2586 |
| 0.9886 | 0.6776 | 0.9912 | 0.6761 | 9.966581e-10 | 2587 |
| 0.9802 | 0.6776 | 0.9911 | 0.6761 | 9.966555e-10 | 2588 |
| 0.9863 | 0.6776 | 0.9911 | 0.6761 | 9.96653e-10 | 2589 |
| 0.9885 | 0.6776 | 0.9911 | 0.6761 | 9.966504e-10 | 2590 |
| 0.9833 | 0.6776 | 0.9910 | 0.6761 | 9.966479e-10 | 2591 |
| 0.9868 | 0.6776 | 0.9910 | 0.6761 | 9.966453e-10 | 2592 |
| 0.9825 | 0.6776 | 0.9910 | 0.6761 | 9.966428e-10 | 2593 |
| 0.9831 | 0.6776 | 0.9909 | 0.6761 | 9.966402e-10 | 2594 |
| 0.9852 | 0.6776 | 0.9909 | 0.6761 | 9.966377e-10 | 2595 |
| 0.9810 | 0.6776 | 0.9908 | 0.6761 | 9.966351e-10 | 2596 |
| 0.9866 | 0.6776 | 0.9908 | 0.6761 | 9.966326e-10 | 2597 |
| 0.9863 | 0.6776 | 0.9908 | 0.6761 | 9.9663e-10 | 2598 |
| 0.9868 | 0.6776 | 0.9907 | 0.6761 | 9.966274e-10 | 2599 |
| 0.9877 | 0.6776 | 0.9907 | 0.6761 | 9.966249e-10 | 2600 |
| 0.9844 | 0.6776 | 0.9906 | 0.6761 | 9.966223e-10 | 2601 |
| 0.9883 | 0.6776 | 0.9906 | 0.6761 | 9.966198e-10 | 2602 |
| 0.9854 | 0.6776 | 0.9906 | 0.6761 | 9.966172e-10 | 2603 |
| 0.9846 | 0.6776 | 0.9906 | 0.6761 | 9.966147e-10 | 2604 |
| 0.9850 | 0.6776 | 0.9905 | 0.6761 | 9.966121e-10 | 2605 |
| 0.9834 | 0.6776 | 0.9905 | 0.6761 | 9.966096e-10 | 2606 |
| 0.9881 | 0.6776 | 0.9904 | 0.6761 | 9.96607e-10 | 2607 |
| 0.9827 | 0.6776 | 0.9904 | 0.6761 | 9.966045e-10 | 2608 |
| 0.9812 | 0.6776 | 0.9904 | 0.6761 | 9.966019e-10 | 2609 |
| 0.9840 | 0.6776 | 0.9903 | 0.6761 | 9.965994e-10 | 2610 |
| 0.9763 | 0.6776 | 0.9903 | 0.6761 | 9.965968e-10 | 2611 |
| 0.9883 | 0.6776 | 0.9903 | 0.6761 | 9.965943e-10 | 2612 |
| 0.9896 | 0.6776 | 0.9902 | 0.6761 | 9.965917e-10 | 2613 |
| 0.9819 | 0.6776 | 0.9902 | 0.6761 | 9.965891e-10 | 2614 |
| 0.9874 | 0.6776 | 0.9902 | 0.6761 | 9.965866e-10 | 2615 |
| 0.9821 | 0.6776 | 0.9901 | 0.6761 | 9.96584e-10 | 2616 |
| 0.9803 | 0.6776 | 0.9901 | 0.6761 | 9.965815e-10 | 2617 |
| 0.9765 | 0.6776 | 0.9901 | 0.6761 | 9.965788e-10 | 2618 |
| 0.9872 | 0.6776 | 0.9900 | 0.6761 | 9.965762e-10 | 2619 |
| 0.9789 | 0.6776 | 0.9900 | 0.6761 | 9.965735e-10 | 2620 |
| 0.9831 | 0.6776 | 0.9899 | 0.6761 | 9.965708e-10 | 2621 |
| 0.9839 | 0.6776 | 0.9899 | 0.6761 | 9.965682e-10 | 2622 |
| 0.9849 | 0.6776 | 0.9899 | 0.6761 | 9.965655e-10 | 2623 |
| 0.9807 | 0.6776 | 0.9898 | 0.6761 | 9.965628e-10 | 2624 |
| 0.9798 | 0.6776 | 0.9898 | 0.6761 | 9.965602e-10 | 2625 |
| 0.9809 | 0.6776 | 0.9898 | 0.6761 | 9.965575e-10 | 2626 |
| 0.9816 | 0.6776 | 0.9897 | 0.6761 | 9.965548e-10 | 2627 |
| 0.9790 | 0.6776 | 0.9897 | 0.6761 | 9.965522e-10 | 2628 |
| 0.9855 | 0.6776 | 0.9897 | 0.6761 | 9.965495e-10 | 2629 |
| 0.9852 | 0.6776 | 0.9896 | 0.6761 | 9.965468e-10 | 2630 |
| 0.9775 | 0.6776 | 0.9896 | 0.6761 | 9.965442e-10 | 2631 |
| 0.9817 | 0.6776 | 0.9896 | 0.6761 | 9.965415e-10 | 2632 |
| 0.9833 | 0.6776 | 0.9895 | 0.6761 | 9.965389e-10 | 2633 |
| 0.9806 | 0.6776 | 0.9895 | 0.6761 | 9.965362e-10 | 2634 |
| 0.9819 | 0.6776 | 0.9895 | 0.6761 | 9.965335e-10 | 2635 |
| 0.9792 | 0.6776 | 0.9894 | 0.6761 | 9.965309e-10 | 2636 |
| 0.9808 | 0.6776 | 0.9894 | 0.6761 | 9.965282e-10 | 2637 |
| 0.9802 | 0.6776 | 0.9893 | 0.6761 | 9.965255e-10 | 2638 |
| 0.9866 | 0.6776 | 0.9893 | 0.6761 | 9.965229e-10 | 2639 |
| 0.9809 | 0.6776 | 0.9893 | 0.6761 | 9.965202e-10 | 2640 |
| 0.9781 | 0.6776 | 0.9892 | 0.6761 | 9.965175e-10 | 2641 |
| 0.9835 | 0.6776 | 0.9892 | 0.6761 | 9.965149e-10 | 2642 |
| 0.9805 | 0.6776 | 0.9892 | 0.6761 | 9.965122e-10 | 2643 |
| 0.9819 | 0.6776 | 0.9891 | 0.6761 | 9.965095e-10 | 2644 |
| 0.9864 | 0.6776 | 0.9891 | 0.6761 | 9.965069e-10 | 2645 |
| 0.9848 | 0.6776 | 0.9891 | 0.6761 | 9.965042e-10 | 2646 |
| 0.9878 | 0.6776 | 0.9890 | 0.6761 | 9.965015e-10 | 2647 |
| 0.9788 | 0.6776 | 0.9890 | 0.6761 | 9.964989e-10 | 2648 |
| 0.9839 | 0.6776 | 0.9890 | 0.6761 | 9.964962e-10 | 2649 |
| 0.9851 | 0.6776 | 0.9889 | 0.6761 | 9.964936e-10 | 2650 |
| 0.9832 | 0.6776 | 0.9889 | 0.6761 | 9.964909e-10 | 2651 |
| 0.9841 | 0.6776 | 0.9889 | 0.6761 | 9.964882e-10 | 2652 |
| 0.9858 | 0.6776 | 0.9888 | 0.6761 | 9.964856e-10 | 2653 |
| 0.9829 | 0.6776 | 0.9888 | 0.6761 | 9.964829e-10 | 2654 |
| 0.9861 | 0.6776 | 0.9888 | 0.6761 | 9.964802e-10 | 2655 |
| 0.9829 | 0.6776 | 0.9887 | 0.6761 | 9.964776e-10 | 2656 |
| 0.9798 | 0.6776 | 0.9887 | 0.6761 | 9.964749e-10 | 2657 |
| 0.9819 | 0.6776 | 0.9887 | 0.6761 | 9.964722e-10 | 2658 |
| 0.9828 | 0.6776 | 0.9886 | 0.6761 | 9.964696e-10 | 2659 |
| 0.9924 | 0.6776 | 0.9886 | 0.6761 | 9.964669e-10 | 2660 |
| 0.9799 | 0.6776 | 0.9886 | 0.6761 | 9.964642e-10 | 2661 |
| 0.9823 | 0.6776 | 0.9885 | 0.6761 | 9.964616e-10 | 2662 |
| 0.9820 | 0.6776 | 0.9885 | 0.6761 | 9.964589e-10 | 2663 |
| 0.9891 | 0.6776 | 0.9885 | 0.6761 | 9.964563e-10 | 2664 |
| 0.9851 | 0.6776 | 0.9884 | 0.6761 | 9.964536e-10 | 2665 |
| 0.9746 | 0.6776 | 0.9884 | 0.6761 | 9.964509e-10 | 2666 |
| 0.9725 | 0.6776 | 0.9884 | 0.6761 | 9.964483e-10 | 2667 |
| 0.9788 | 0.6776 | 0.9883 | 0.6761 | 9.964456e-10 | 2668 |
| 0.9786 | 0.6776 | 0.9883 | 0.6761 | 9.964429e-10 | 2669 |
| 0.9837 | 0.6776 | 0.9883 | 0.6761 | 9.964403e-10 | 2670 |
| 0.9723 | 0.6776 | 0.9882 | 0.6761 | 9.964376e-10 | 2671 |
| 0.9786 | 0.6776 | 0.9882 | 0.6761 | 9.964349e-10 | 2672 |
| 0.9782 | 0.6776 | 0.9882 | 0.6761 | 9.964323e-10 | 2673 |
| 0.9830 | 0.6776 | 0.9881 | 0.6761 | 9.964296e-10 | 2674 |
| 0.9839 | 0.6776 | 0.9881 | 0.6761 | 9.964269e-10 | 2675 |
| 0.9797 | 0.6776 | 0.9881 | 0.6761 | 9.964243e-10 | 2676 |
| 0.9783 | 0.6776 | 0.9880 | 0.6761 | 9.964216e-10 | 2677 |
| 0.9766 | 0.6776 | 0.9880 | 0.6761 | 9.96419e-10 | 2678 |
| 0.9820 | 0.6776 | 0.9880 | 0.6761 | 9.964163e-10 | 2679 |
| 0.9830 | 0.6776 | 0.9879 | 0.6761 | 9.964136e-10 | 2680 |
| 0.9812 | 0.6776 | 0.9879 | 0.6761 | 9.96411e-10 | 2681 |
| 0.9781 | 0.6776 | 0.9878 | 0.6761 | 9.964083e-10 | 2682 |
| 0.9815 | 0.6776 | 0.9878 | 0.6761 | 9.964056e-10 | 2683 |
| 0.9779 | 0.6776 | 0.9878 | 0.6761 | 9.96403e-10 | 2684 |
| 0.9857 | 0.6776 | 0.9878 | 0.6761 | 9.964003e-10 | 2685 |
| 0.9810 | 0.6776 | 0.9877 | 0.6761 | 9.963976e-10 | 2686 |
| 0.9835 | 0.6776 | 0.9877 | 0.6761 | 9.96395e-10 | 2687 |
| 0.9854 | 0.6776 | 0.9877 | 0.6761 | 9.963923e-10 | 2688 |
| 0.9809 | 0.6776 | 0.9876 | 0.6761 | 9.963896e-10 | 2689 |
| 0.9797 | 0.6776 | 0.9876 | 0.6761 | 9.96387e-10 | 2690 |
| 0.9815 | 0.6776 | 0.9876 | 0.6761 | 9.963843e-10 | 2691 |
| 0.9815 | 0.6776 | 0.9875 | 0.6761 | 9.963816e-10 | 2692 |
| 0.9780 | 0.6776 | 0.9875 | 0.6761 | 9.96379e-10 | 2693 |
| 0.9847 | 0.6776 | 0.9875 | 0.6761 | 9.963763e-10 | 2694 |
| 0.9771 | 0.6776 | 0.9874 | 0.6761 | 9.963737e-10 | 2695 |
| 0.9792 | 0.6776 | 0.9874 | 0.6761 | 9.96371e-10 | 2696 |
| 0.9782 | 0.6776 | 0.9874 | 0.6761 | 9.963683e-10 | 2697 |
| 0.9782 | 0.6776 | 0.9873 | 0.6761 | 9.963657e-10 | 2698 |
| 0.9792 | 0.6776 | 0.9873 | 0.6761 | 9.96363e-10 | 2699 |
| 0.9845 | 0.6776 | 0.9873 | 0.6761 | 9.963603e-10 | 2700 |
| 0.9761 | 0.6776 | 0.9872 | 0.6761 | 9.963577e-10 | 2701 |
| 0.9761 | 0.6776 | 0.9872 | 0.6761 | 9.96355e-10 | 2702 |
| 0.9834 | 0.6776 | 0.9872 | 0.6761 | 9.963523e-10 | 2703 |
| 0.9855 | 0.6776 | 0.9871 | 0.6761 | 9.963497e-10 | 2704 |
| 0.9796 | 0.6776 | 0.9871 | 0.6761 | 9.96347e-10 | 2705 |
| 0.9784 | 0.6776 | 0.9871 | 0.6761 | 9.963443e-10 | 2706 |
| 0.9759 | 0.6776 | 0.9870 | 0.6761 | 9.963417e-10 | 2707 |
| 0.9829 | 0.6776 | 0.9870 | 0.6761 | 9.96339e-10 | 2708 |
| 0.9791 | 0.6776 | 0.9870 | 0.6761 | 9.963363e-10 | 2709 |
| 0.9814 | 0.6776 | 0.9869 | 0.6761 | 9.963337e-10 | 2710 |
| 0.9857 | 0.6776 | 0.9869 | 0.6761 | 9.96331e-10 | 2711 |
| 0.9845 | 0.6776 | 0.9869 | 0.6761 | 9.963284e-10 | 2712 |
| 0.9769 | 0.6776 | 0.9868 | 0.6761 | 9.963257e-10 | 2713 |
| 0.9796 | 0.6776 | 0.9868 | 0.6761 | 9.96323e-10 | 2714 |
| 0.9810 | 0.6776 | 0.9868 | 0.6761 | 9.963204e-10 | 2715 |
| 0.9802 | 0.6776 | 0.9867 | 0.6761 | 9.963177e-10 | 2716 |
| 0.9841 | 0.6776 | 0.9867 | 0.6761 | 9.96315e-10 | 2717 |
| 0.9792 | 0.6776 | 0.9867 | 0.6761 | 9.963124e-10 | 2718 |
| 0.9800 | 0.6776 | 0.9866 | 0.6761 | 9.963097e-10 | 2719 |
| 0.9777 | 0.6776 | 0.9866 | 0.6761 | 9.96307e-10 | 2720 |
| 0.9767 | 0.6776 | 0.9866 | 0.6761 | 9.963044e-10 | 2721 |
| 0.9776 | 0.6776 | 0.9866 | 0.6761 | 9.963017e-10 | 2722 |
| 0.9766 | 0.6776 | 0.9865 | 0.6761 | 9.96299e-10 | 2723 |
| 0.9804 | 0.6776 | 0.9865 | 0.6761 | 9.962964e-10 | 2724 |
| 0.9825 | 0.6776 | 0.9865 | 0.6761 | 9.962937e-10 | 2725 |
| 0.9768 | 0.6776 | 0.9864 | 0.6761 | 9.96291e-10 | 2726 |
| 0.9730 | 0.6776 | 0.9864 | 0.6761 | 9.962884e-10 | 2727 |
| 0.9728 | 0.6776 | 0.9864 | 0.6761 | 9.962857e-10 | 2728 |
| 0.9742 | 0.6776 | 0.9863 | 0.6761 | 9.962831e-10 | 2729 |
| 0.9813 | 0.6776 | 0.9863 | 0.6761 | 9.962804e-10 | 2730 |
| 0.9759 | 0.6776 | 0.9863 | 0.6761 | 9.962776e-10 | 2731 |
| 0.9790 | 0.6776 | 0.9862 | 0.6761 | 9.962748e-10 | 2732 |
| 0.9810 | 0.6776 | 0.9862 | 0.6761 | 9.962721e-10 | 2733 |
| 0.9791 | 0.6776 | 0.9862 | 0.6761 | 9.962693e-10 | 2734 |
| 0.9806 | 0.6776 | 0.9861 | 0.6761 | 9.962665e-10 | 2735 |
| 0.9761 | 0.6776 | 0.9861 | 0.6761 | 9.962637e-10 | 2736 |
| 0.9783 | 0.6776 | 0.9861 | 0.6761 | 9.96261e-10 | 2737 |
| 0.9823 | 0.6776 | 0.9861 | 0.6761 | 9.962582e-10 | 2738 |
| 0.9770 | 0.6776 | 0.9860 | 0.6761 | 9.962554e-10 | 2739 |
| 0.9775 | 0.6776 | 0.9860 | 0.6761 | 9.962526e-10 | 2740 |
| 0.9772 | 0.6776 | 0.9860 | 0.6761 | 9.962499e-10 | 2741 |
| 0.9840 | 0.6776 | 0.9859 | 0.6761 | 9.962471e-10 | 2742 |
| 0.9798 | 0.6776 | 0.9859 | 0.6761 | 9.962443e-10 | 2743 |
| 0.9715 | 0.6776 | 0.9859 | 0.6761 | 9.962415e-10 | 2744 |
| 0.9738 | 0.6776 | 0.9858 | 0.6761 | 9.962388e-10 | 2745 |
| 0.9802 | 0.6776 | 0.9858 | 0.6761 | 9.96236e-10 | 2746 |
| 0.9786 | 0.6776 | 0.9858 | 0.6761 | 9.962332e-10 | 2747 |
| 0.9784 | 0.6776 | 0.9858 | 0.6761 | 9.962304e-10 | 2748 |
| 0.9823 | 0.6776 | 0.9857 | 0.6761 | 9.962277e-10 | 2749 |
| 0.9774 | 0.6776 | 0.9857 | 0.6761 | 9.962249e-10 | 2750 |
| 0.9792 | 0.6776 | 0.9857 | 0.6761 | 9.962221e-10 | 2751 |
| 0.9757 | 0.6776 | 0.9856 | 0.6761 | 9.962193e-10 | 2752 |
| 0.9773 | 0.6776 | 0.9856 | 0.6761 | 9.962166e-10 | 2753 |
| 0.9722 | 0.6776 | 0.9856 | 0.6761 | 9.962138e-10 | 2754 |
| 0.9806 | 0.6776 | 0.9855 | 0.6761 | 9.96211e-10 | 2755 |
| 0.9752 | 0.6776 | 0.9855 | 0.6761 | 9.962082e-10 | 2756 |
| 0.9727 | 0.6776 | 0.9855 | 0.6761 | 9.962055e-10 | 2757 |
| 0.9751 | 0.6776 | 0.9855 | 0.6761 | 9.962027e-10 | 2758 |
| 0.9800 | 0.6776 | 0.9854 | 0.6761 | 9.961999e-10 | 2759 |
| 0.9766 | 0.6776 | 0.9854 | 0.6761 | 9.961971e-10 | 2760 |
| 0.9721 | 0.6776 | 0.9854 | 0.6761 | 9.961943e-10 | 2761 |
| 0.9805 | 0.6776 | 0.9853 | 0.6761 | 9.961916e-10 | 2762 |
| 0.9847 | 0.6776 | 0.9853 | 0.6761 | 9.961888e-10 | 2763 |
| 0.9727 | 0.6776 | 0.9853 | 0.6761 | 9.96186e-10 | 2764 |
| 0.9811 | 0.6776 | 0.9852 | 0.6761 | 9.961832e-10 | 2765 |
| 0.9747 | 0.6776 | 0.9852 | 0.6761 | 9.961805e-10 | 2766 |
| 0.9793 | 0.6776 | 0.9852 | 0.6761 | 9.961777e-10 | 2767 |
| 0.9742 | 0.6776 | 0.9852 | 0.6761 | 9.961749e-10 | 2768 |
| 0.9795 | 0.6776 | 0.9851 | 0.6761 | 9.961721e-10 | 2769 |
| 0.9814 | 0.6776 | 0.9851 | 0.6761 | 9.961694e-10 | 2770 |
| 0.9738 | 0.6776 | 0.9851 | 0.6761 | 9.961666e-10 | 2771 |
| 0.9755 | 0.6776 | 0.9850 | 0.6761 | 9.961638e-10 | 2772 |
| 0.9806 | 0.6776 | 0.9850 | 0.6761 | 9.96161e-10 | 2773 |
| 0.9776 | 0.6776 | 0.9850 | 0.6761 | 9.961583e-10 | 2774 |
| 0.9718 | 0.6776 | 0.9850 | 0.6761 | 9.961555e-10 | 2775 |
| 0.9794 | 0.6776 | 0.9849 | 0.6761 | 9.961527e-10 | 2776 |
| 0.9708 | 0.6776 | 0.9849 | 0.6761 | 9.961499e-10 | 2777 |
| 0.9750 | 0.6776 | 0.9849 | 0.6761 | 9.961472e-10 | 2778 |
| 0.9703 | 0.6776 | 0.9848 | 0.6761 | 9.961444e-10 | 2779 |
| 0.9743 | 0.6776 | 0.9848 | 0.6761 | 9.961416e-10 | 2780 |
| 0.9788 | 0.6776 | 0.9848 | 0.6761 | 9.961388e-10 | 2781 |
| 0.9763 | 0.6776 | 0.9848 | 0.6761 | 9.961361e-10 | 2782 |
| 0.9715 | 0.6776 | 0.9847 | 0.6761 | 9.961333e-10 | 2783 |
| 0.9783 | 0.6776 | 0.9847 | 0.6761 | 9.961305e-10 | 2784 |
| 0.9815 | 0.6776 | 0.9847 | 0.6761 | 9.961277e-10 | 2785 |
| 0.9740 | 0.6776 | 0.9846 | 0.6761 | 9.96125e-10 | 2786 |
| 0.9818 | 0.6776 | 0.9846 | 0.6761 | 9.961222e-10 | 2787 |
| 0.9804 | 0.6776 | 0.9846 | 0.6761 | 9.961194e-10 | 2788 |
| 0.9816 | 0.6776 | 0.9845 | 0.6761 | 9.961166e-10 | 2789 |
| 0.9776 | 0.6776 | 0.9845 | 0.6761 | 9.961139e-10 | 2790 |
| 0.9801 | 0.6776 | 0.9845 | 0.6761 | 9.961111e-10 | 2791 |
| 0.9771 | 0.6776 | 0.9845 | 0.6761 | 9.961083e-10 | 2792 |
| 0.9771 | 0.6776 | 0.9844 | 0.6761 | 9.961055e-10 | 2793 |
| 0.9712 | 0.6776 | 0.9844 | 0.6761 | 9.961028e-10 | 2794 |
| 0.9781 | 0.6776 | 0.9844 | 0.6761 | 9.961e-10 | 2795 |
| 0.9726 | 0.6776 | 0.9843 | 0.6761 | 9.960972e-10 | 2796 |
| 0.9701 | 0.6776 | 0.9843 | 0.6761 | 9.960944e-10 | 2797 |
| 0.9739 | 0.6776 | 0.9843 | 0.6761 | 9.960917e-10 | 2798 |
| 0.9774 | 0.6776 | 0.9842 | 0.6761 | 9.960889e-10 | 2799 |
| 0.9781 | 0.6776 | 0.9842 | 0.6761 | 9.960861e-10 | 2800 |
| 0.9746 | 0.6776 | 0.9842 | 0.6761 | 9.960833e-10 | 2801 |
| 0.9756 | 0.6776 | 0.9842 | 0.6761 | 9.960806e-10 | 2802 |
| 0.9731 | 0.6776 | 0.9841 | 0.6761 | 9.960778e-10 | 2803 |
| 0.9723 | 0.6776 | 0.9841 | 0.6761 | 9.96075e-10 | 2804 |
| 0.9762 | 0.6776 | 0.9841 | 0.6761 | 9.960722e-10 | 2805 |
| 0.9753 | 0.6776 | 0.9841 | 0.6761 | 9.960694e-10 | 2806 |
| 0.9760 | 0.6776 | 0.9840 | 0.6761 | 9.960667e-10 | 2807 |
| 0.9794 | 0.6776 | 0.9840 | 0.6761 | 9.960639e-10 | 2808 |
| 0.9758 | 0.6776 | 0.9840 | 0.6761 | 9.960611e-10 | 2809 |
| 0.9738 | 0.6776 | 0.9839 | 0.6761 | 9.960583e-10 | 2810 |
| 0.9714 | 0.6776 | 0.9839 | 0.6761 | 9.960556e-10 | 2811 |
| 0.9752 | 0.6776 | 0.9839 | 0.6761 | 9.960528e-10 | 2812 |
| 0.9765 | 0.6776 | 0.9839 | 0.6761 | 9.9605e-10 | 2813 |
| 0.9746 | 0.6776 | 0.9838 | 0.6761 | 9.960472e-10 | 2814 |
| 0.9764 | 0.6776 | 0.9838 | 0.6761 | 9.960445e-10 | 2815 |
| 0.9770 | 0.6776 | 0.9838 | 0.6761 | 9.960417e-10 | 2816 |
| 0.9748 | 0.6776 | 0.9837 | 0.6761 | 9.960389e-10 | 2817 |
| 0.9771 | 0.6776 | 0.9837 | 0.6761 | 9.960361e-10 | 2818 |
| 0.9759 | 0.6776 | 0.9837 | 0.6761 | 9.960334e-10 | 2819 |
| 0.9729 | 0.6776 | 0.9837 | 0.6761 | 9.960306e-10 | 2820 |
| 0.9795 | 0.6776 | 0.9836 | 0.6761 | 9.960278e-10 | 2821 |
| 0.9739 | 0.6776 | 0.9836 | 0.6761 | 9.96025e-10 | 2822 |
| 0.9757 | 0.6776 | 0.9836 | 0.6761 | 9.960223e-10 | 2823 |
| 0.9773 | 0.6776 | 0.9836 | 0.6761 | 9.960195e-10 | 2824 |
| 0.9744 | 0.6776 | 0.9835 | 0.6761 | 9.960167e-10 | 2825 |
| 0.9753 | 0.6776 | 0.9835 | 0.6761 | 9.960139e-10 | 2826 |
| 0.9747 | 0.6776 | 0.9835 | 0.6761 | 9.960112e-10 | 2827 |
| 0.9697 | 0.6776 | 0.9834 | 0.6761 | 9.960084e-10 | 2828 |
| 0.9779 | 0.6776 | 0.9834 | 0.6761 | 9.960056e-10 | 2829 |
| 0.9720 | 0.6776 | 0.9834 | 0.6761 | 9.960028e-10 | 2830 |
| 0.9742 | 0.6776 | 0.9834 | 0.6761 | 9.960001e-10 | 2831 |
| 0.9784 | 0.6776 | 0.9833 | 0.6761 | 9.959973e-10 | 2832 |
| 0.9731 | 0.6776 | 0.9833 | 0.6761 | 9.959945e-10 | 2833 |
| 0.9785 | 0.6776 | 0.9833 | 0.6761 | 9.959917e-10 | 2834 |
| 0.9737 | 0.6776 | 0.9833 | 0.6761 | 9.95989e-10 | 2835 |
| 0.9770 | 0.6776 | 0.9832 | 0.6761 | 9.959862e-10 | 2836 |
| 0.9729 | 0.6776 | 0.9832 | 0.6761 | 9.959834e-10 | 2837 |
| 0.9774 | 0.6776 | 0.9832 | 0.6761 | 9.959806e-10 | 2838 |
| 0.9684 | 0.6776 | 0.9831 | 0.6761 | 9.959779e-10 | 2839 |
| 0.9777 | 0.6776 | 0.9831 | 0.6761 | 9.959751e-10 | 2840 |
| 0.9743 | 0.6776 | 0.9831 | 0.6761 | 9.959723e-10 | 2841 |
| 0.9705 | 0.6776 | 0.9831 | 0.6761 | 9.959695e-10 | 2842 |
| 0.9713 | 0.6776 | 0.9830 | 0.6761 | 9.959666e-10 | 2843 |
| 0.9754 | 0.6776 | 0.9830 | 0.6761 | 9.959638e-10 | 2844 |
| 0.9740 | 0.6776 | 0.9830 | 0.6761 | 9.959609e-10 | 2845 |
| 0.9754 | 0.6776 | 0.9829 | 0.6761 | 9.95958e-10 | 2846 |
| 0.9696 | 0.6776 | 0.9829 | 0.6761 | 9.959551e-10 | 2847 |
| 0.9836 | 0.6776 | 0.9829 | 0.6761 | 9.959522e-10 | 2848 |
| 0.9750 | 0.6776 | 0.9829 | 0.6761 | 9.959493e-10 | 2849 |
| 0.9744 | 0.6776 | 0.9828 | 0.6761 | 9.959464e-10 | 2850 |
| 0.9753 | 0.6776 | 0.9828 | 0.6761 | 9.959435e-10 | 2851 |
| 0.9763 | 0.6776 | 0.9828 | 0.6761 | 9.959407e-10 | 2852 |
| 0.9742 | 0.6776 | 0.9828 | 0.6761 | 9.959378e-10 | 2853 |
| 0.9723 | 0.6776 | 0.9827 | 0.6761 | 9.959349e-10 | 2854 |
| 0.9703 | 0.6776 | 0.9827 | 0.6761 | 9.95932e-10 | 2855 |
| 0.9742 | 0.6776 | 0.9827 | 0.6761 | 9.959291e-10 | 2856 |
| 0.9804 | 0.6776 | 0.9826 | 0.6761 | 9.959262e-10 | 2857 |
| 0.9799 | 0.6776 | 0.9826 | 0.6761 | 9.959233e-10 | 2858 |
| 0.9779 | 0.6776 | 0.9826 | 0.6761 | 9.959205e-10 | 2859 |
| 0.9754 | 0.6776 | 0.9826 | 0.6761 | 9.959176e-10 | 2860 |
| 0.9689 | 0.6776 | 0.9825 | 0.6761 | 9.959147e-10 | 2861 |
| 0.9715 | 0.6776 | 0.9825 | 0.6761 | 9.959118e-10 | 2862 |
| 0.9694 | 0.6776 | 0.9825 | 0.6761 | 9.959089e-10 | 2863 |
| 0.9720 | 0.6776 | 0.9824 | 0.6761 | 9.95906e-10 | 2864 |
| 0.9754 | 0.6776 | 0.9824 | 0.6761 | 9.959031e-10 | 2865 |
| 0.9758 | 0.6776 | 0.9824 | 0.6761 | 9.959003e-10 | 2866 |
| 0.9698 | 0.6776 | 0.9824 | 0.6761 | 9.958974e-10 | 2867 |
| 0.9777 | 0.6776 | 0.9823 | 0.6761 | 9.958945e-10 | 2868 |
| 0.9702 | 0.6776 | 0.9823 | 0.6761 | 9.958916e-10 | 2869 |
| 0.9784 | 0.6776 | 0.9823 | 0.6761 | 9.958887e-10 | 2870 |
| 0.9779 | 0.6776 | 0.9823 | 0.6761 | 9.958858e-10 | 2871 |
| 0.9746 | 0.6776 | 0.9822 | 0.6761 | 9.958829e-10 | 2872 |
| 0.9693 | 0.6776 | 0.9822 | 0.6761 | 9.9588e-10 | 2873 |
| 0.9721 | 0.6776 | 0.9822 | 0.6761 | 9.958772e-10 | 2874 |
| 0.9711 | 0.6776 | 0.9822 | 0.6761 | 9.958743e-10 | 2875 |
| 0.9735 | 0.6776 | 0.9821 | 0.6761 | 9.958714e-10 | 2876 |
| 0.9676 | 0.6776 | 0.9821 | 0.6761 | 9.958685e-10 | 2877 |
| 0.9750 | 0.6776 | 0.9821 | 0.6761 | 9.958656e-10 | 2878 |
| 0.9786 | 0.6776 | 0.9820 | 0.6761 | 9.958627e-10 | 2879 |
| 0.9687 | 0.6776 | 0.9820 | 0.6761 | 9.958598e-10 | 2880 |
| 0.9776 | 0.6776 | 0.9820 | 0.6761 | 9.95857e-10 | 2881 |
| 0.9794 | 0.6776 | 0.9820 | 0.6761 | 9.958541e-10 | 2882 |
| 0.9705 | 0.6776 | 0.9819 | 0.6761 | 9.958512e-10 | 2883 |
| 0.9743 | 0.6776 | 0.9819 | 0.6761 | 9.958483e-10 | 2884 |
| 0.9721 | 0.6776 | 0.9819 | 0.6761 | 9.958454e-10 | 2885 |
| 0.9717 | 0.6776 | 0.9819 | 0.6761 | 9.958425e-10 | 2886 |
| 0.9695 | 0.6776 | 0.9818 | 0.6761 | 9.958396e-10 | 2887 |
| 0.9729 | 0.6776 | 0.9818 | 0.6761 | 9.958367e-10 | 2888 |
| 0.9759 | 0.6776 | 0.9818 | 0.6761 | 9.958339e-10 | 2889 |
| 0.9799 | 0.6776 | 0.9818 | 0.6761 | 9.95831e-10 | 2890 |
| 0.9797 | 0.6776 | 0.9817 | 0.6761 | 9.958281e-10 | 2891 |
| 0.9709 | 0.6776 | 0.9817 | 0.6761 | 9.958252e-10 | 2892 |
| 0.9767 | 0.6776 | 0.9817 | 0.6761 | 9.958223e-10 | 2893 |
| 0.9733 | 0.6776 | 0.9817 | 0.6761 | 9.958194e-10 | 2894 |
| 0.9768 | 0.6776 | 0.9816 | 0.6761 | 9.958165e-10 | 2895 |
| 0.9763 | 0.6776 | 0.9816 | 0.6761 | 9.958137e-10 | 2896 |
| 0.9696 | 0.6776 | 0.9816 | 0.6761 | 9.958108e-10 | 2897 |
| 0.9765 | 0.6776 | 0.9816 | 0.6761 | 9.958079e-10 | 2898 |
| 0.9768 | 0.6776 | 0.9815 | 0.6761 | 9.95805e-10 | 2899 |
| 0.9754 | 0.6776 | 0.9815 | 0.6761 | 9.958021e-10 | 2900 |
| 0.9732 | 0.6776 | 0.9815 | 0.6761 | 9.957992e-10 | 2901 |
| 0.9735 | 0.6776 | 0.9815 | 0.6761 | 9.957963e-10 | 2902 |
| 0.9772 | 0.6776 | 0.9814 | 0.6761 | 9.957934e-10 | 2903 |
| 0.9729 | 0.6776 | 0.9814 | 0.6761 | 9.957906e-10 | 2904 |
| 0.9731 | 0.6776 | 0.9814 | 0.6761 | 9.957877e-10 | 2905 |
| 0.9724 | 0.6776 | 0.9814 | 0.6761 | 9.957848e-10 | 2906 |
| 0.9756 | 0.6776 | 0.9813 | 0.6761 | 9.957819e-10 | 2907 |
| 0.9676 | 0.6776 | 0.9813 | 0.6761 | 9.95779e-10 | 2908 |
| 0.9691 | 0.6776 | 0.9813 | 0.6761 | 9.957761e-10 | 2909 |
| 0.9737 | 0.6776 | 0.9812 | 0.6761 | 9.957732e-10 | 2910 |
| 0.9677 | 0.6776 | 0.9812 | 0.6761 | 9.957704e-10 | 2911 |
| 0.9717 | 0.6776 | 0.9812 | 0.6761 | 9.957675e-10 | 2912 |
| 0.9709 | 0.6776 | 0.9812 | 0.6761 | 9.957646e-10 | 2913 |
| 0.9707 | 0.6776 | 0.9811 | 0.6761 | 9.957617e-10 | 2914 |
| 0.9718 | 0.6776 | 0.9811 | 0.6761 | 9.957588e-10 | 2915 |
| 0.9753 | 0.6776 | 0.9811 | 0.6761 | 9.957559e-10 | 2916 |
| 0.9767 | 0.6776 | 0.9811 | 0.6761 | 9.95753e-10 | 2917 |
| 0.9707 | 0.6776 | 0.9810 | 0.6761 | 9.957501e-10 | 2918 |
| 0.9729 | 0.6776 | 0.9810 | 0.6761 | 9.957473e-10 | 2919 |
| 0.9740 | 0.6776 | 0.9810 | 0.6761 | 9.957444e-10 | 2920 |
| 0.9677 | 0.6776 | 0.9810 | 0.6761 | 9.957415e-10 | 2921 |
| 0.9758 | 0.6776 | 0.9809 | 0.6761 | 9.957386e-10 | 2922 |
| 0.9707 | 0.6776 | 0.9809 | 0.6761 | 9.957357e-10 | 2923 |
| 0.9715 | 0.6776 | 0.9809 | 0.6761 | 9.957328e-10 | 2924 |
| 0.9741 | 0.6776 | 0.9809 | 0.6761 | 9.957299e-10 | 2925 |
| 0.9759 | 0.6776 | 0.9808 | 0.6761 | 9.957271e-10 | 2926 |
| 0.9731 | 0.6776 | 0.9808 | 0.6761 | 9.957242e-10 | 2927 |
| 0.9724 | 0.6776 | 0.9808 | 0.6761 | 9.957213e-10 | 2928 |
| 0.9713 | 0.6776 | 0.9808 | 0.6761 | 9.957184e-10 | 2929 |
| 0.9792 | 0.6776 | 0.9807 | 0.6761 | 9.957155e-10 | 2930 |
| 0.9721 | 0.6776 | 0.9807 | 0.6761 | 9.957126e-10 | 2931 |
| 0.9750 | 0.6776 | 0.9807 | 0.6761 | 9.957097e-10 | 2932 |
| 0.9788 | 0.6776 | 0.9807 | 0.6761 | 9.957069e-10 | 2933 |
| 0.9718 | 0.6776 | 0.9806 | 0.6761 | 9.95704e-10 | 2934 |
| 0.9688 | 0.6776 | 0.9806 | 0.6761 | 9.957011e-10 | 2935 |
| 0.9747 | 0.6776 | 0.9806 | 0.6761 | 9.956982e-10 | 2936 |
| 0.9708 | 0.6776 | 0.9806 | 0.6761 | 9.956953e-10 | 2937 |
| 0.9642 | 0.6776 | 0.9805 | 0.6761 | 9.956924e-10 | 2938 |
| 0.9693 | 0.6776 | 0.9805 | 0.6761 | 9.956895e-10 | 2939 |
| 0.9644 | 0.6776 | 0.9805 | 0.6761 | 9.956866e-10 | 2940 |
| 0.9662 | 0.6776 | 0.9805 | 0.6761 | 9.956838e-10 | 2941 |
| 0.9684 | 0.6776 | 0.9804 | 0.6761 | 9.956809e-10 | 2942 |
| 0.9759 | 0.6776 | 0.9804 | 0.6761 | 9.95678e-10 | 2943 |
| 0.9737 | 0.6776 | 0.9804 | 0.6761 | 9.956751e-10 | 2944 |
| 0.9760 | 0.6776 | 0.9804 | 0.6761 | 9.956722e-10 | 2945 |
| 0.9682 | 0.6776 | 0.9803 | 0.6761 | 9.956693e-10 | 2946 |
| 0.9743 | 0.6776 | 0.9803 | 0.6761 | 9.956664e-10 | 2947 |
| 0.9765 | 0.6776 | 0.9803 | 0.6761 | 9.956636e-10 | 2948 |
| 0.9711 | 0.6776 | 0.9803 | 0.6761 | 9.956607e-10 | 2949 |
| 0.9746 | 0.6776 | 0.9802 | 0.6761 | 9.956578e-10 | 2950 |
| 0.9795 | 0.6776 | 0.9802 | 0.6761 | 9.956549e-10 | 2951 |
| 0.9634 | 0.6776 | 0.9802 | 0.6761 | 9.95652e-10 | 2952 |
| 0.9733 | 0.6776 | 0.9802 | 0.6761 | 9.956491e-10 | 2953 |
| 0.9748 | 0.6776 | 0.9801 | 0.6761 | 9.956462e-10 | 2954 |
| 0.9712 | 0.6776 | 0.9801 | 0.6761 | 9.956432e-10 | 2955 |
| 0.9740 | 0.6776 | 0.9801 | 0.6761 | 9.956402e-10 | 2956 |
| 0.9772 | 0.6776 | 0.9801 | 0.6761 | 9.956372e-10 | 2957 |
| 0.9707 | 0.6776 | 0.9801 | 0.6761 | 9.956342e-10 | 2958 |
| 0.9730 | 0.6776 | 0.9800 | 0.6761 | 9.956312e-10 | 2959 |
| 0.9669 | 0.6776 | 0.9800 | 0.6761 | 9.956282e-10 | 2960 |
| 0.9719 | 0.6776 | 0.9800 | 0.6761 | 9.956252e-10 | 2961 |
| 0.9726 | 0.6776 | 0.9800 | 0.6761 | 9.956223e-10 | 2962 |
| 0.9643 | 0.6776 | 0.9799 | 0.6761 | 9.956193e-10 | 2963 |
| 0.9724 | 0.6776 | 0.9799 | 0.6761 | 9.956163e-10 | 2964 |
| 0.9714 | 0.6776 | 0.9799 | 0.6761 | 9.956133e-10 | 2965 |
| 0.9706 | 0.6776 | 0.9799 | 0.6761 | 9.956103e-10 | 2966 |
| 0.9751 | 0.6776 | 0.9798 | 0.6761 | 9.956073e-10 | 2967 |
| 0.9724 | 0.6776 | 0.9798 | 0.6761 | 9.956043e-10 | 2968 |
| 0.9707 | 0.6776 | 0.9798 | 0.6761 | 9.956013e-10 | 2969 |
| 0.9693 | 0.6776 | 0.9798 | 0.6761 | 9.955983e-10 | 2970 |
| 0.9706 | 0.6776 | 0.9797 | 0.6761 | 9.955953e-10 | 2971 |
| 0.9689 | 0.6776 | 0.9797 | 0.6761 | 9.955923e-10 | 2972 |
| 0.9659 | 0.6776 | 0.9797 | 0.6761 | 9.955893e-10 | 2973 |
| 0.9719 | 0.6776 | 0.9797 | 0.6761 | 9.955863e-10 | 2974 |
| 0.9689 | 0.6776 | 0.9796 | 0.6761 | 9.955833e-10 | 2975 |
| 0.9719 | 0.6776 | 0.9796 | 0.6761 | 9.955803e-10 | 2976 |
| 0.9671 | 0.6776 | 0.9796 | 0.6761 | 9.955773e-10 | 2977 |
| 0.9711 | 0.6776 | 0.9796 | 0.6761 | 9.955743e-10 | 2978 |
| 0.9716 | 0.6776 | 0.9795 | 0.6761 | 9.955713e-10 | 2979 |
| 0.9703 | 0.6776 | 0.9795 | 0.6761 | 9.955683e-10 | 2980 |
| 0.9686 | 0.6776 | 0.9795 | 0.6761 | 9.955653e-10 | 2981 |
| 0.9729 | 0.6776 | 0.9795 | 0.6761 | 9.955623e-10 | 2982 |
| 0.9649 | 0.6776 | 0.9795 | 0.6761 | 9.955593e-10 | 2983 |
| 0.9675 | 0.6776 | 0.9794 | 0.6761 | 9.955563e-10 | 2984 |
| 0.9686 | 0.6776 | 0.9794 | 0.6761 | 9.955533e-10 | 2985 |
| 0.9680 | 0.6776 | 0.9794 | 0.6761 | 9.955503e-10 | 2986 |
| 0.9750 | 0.6776 | 0.9794 | 0.6761 | 9.955473e-10 | 2987 |
| 0.9697 | 0.6776 | 0.9793 | 0.6761 | 9.955443e-10 | 2988 |
| 0.9673 | 0.6776 | 0.9793 | 0.6761 | 9.955413e-10 | 2989 |
| 0.9692 | 0.6776 | 0.9793 | 0.6761 | 9.955383e-10 | 2990 |
| 0.9745 | 0.6776 | 0.9793 | 0.6761 | 9.955353e-10 | 2991 |
| 0.9735 | 0.6776 | 0.9792 | 0.6761 | 9.955323e-10 | 2992 |
| 0.9694 | 0.6776 | 0.9792 | 0.6761 | 9.955293e-10 | 2993 |
| 0.9694 | 0.6776 | 0.9792 | 0.6761 | 9.955263e-10 | 2994 |
| 0.9700 | 0.6776 | 0.9792 | 0.6761 | 9.955233e-10 | 2995 |
| 0.9702 | 0.6776 | 0.9791 | 0.6761 | 9.955203e-10 | 2996 |
| 0.9763 | 0.6776 | 0.9791 | 0.6761 | 9.955173e-10 | 2997 |
| 0.9598 | 0.6776 | 0.9791 | 0.6761 | 9.955143e-10 | 2998 |
| 0.9772 | 0.6776 | 0.9791 | 0.6761 | 9.955113e-10 | 2999 |
| 0.9726 | 0.6776 | 0.9790 | 0.6761 | 9.955083e-10 | 3000 |
| 0.9708 | 0.6776 | 0.9790 | 0.6761 | 9.955053e-10 | 3001 |
| 0.9744 | 0.6776 | 0.9790 | 0.6761 | 9.955023e-10 | 3002 |
| 0.9718 | 0.6776 | 0.9790 | 0.6761 | 9.954993e-10 | 3003 |
| 0.9648 | 0.6776 | 0.9790 | 0.6761 | 9.954964e-10 | 3004 |
| 0.9726 | 0.6776 | 0.9789 | 0.6761 | 9.954934e-10 | 3005 |
| 0.9684 | 0.6776 | 0.9789 | 0.6761 | 9.954904e-10 | 3006 |
| 0.9647 | 0.6776 | 0.9789 | 0.6761 | 9.954874e-10 | 3007 |
| 0.9733 | 0.6776 | 0.9789 | 0.6761 | 9.954844e-10 | 3008 |
| 0.9686 | 0.6776 | 0.9789 | 0.6761 | 9.954814e-10 | 3009 |
| 0.9647 | 0.6776 | 0.9788 | 0.6761 | 9.954784e-10 | 3010 |
| 0.9659 | 0.6776 | 0.9788 | 0.6761 | 9.954754e-10 | 3011 |
| 0.9735 | 0.6776 | 0.9788 | 0.6761 | 9.954724e-10 | 3012 |
| 0.9735 | 0.6776 | 0.9788 | 0.6761 | 9.954694e-10 | 3013 |
| 0.9747 | 0.6776 | 0.9787 | 0.6761 | 9.954664e-10 | 3014 |
| 0.9680 | 0.6776 | 0.9787 | 0.6761 | 9.954634e-10 | 3015 |
| 0.9701 | 0.6776 | 0.9787 | 0.6761 | 9.954604e-10 | 3016 |
| 0.9667 | 0.6776 | 0.9787 | 0.6761 | 9.954574e-10 | 3017 |
| 0.9734 | 0.6776 | 0.9786 | 0.6761 | 9.954544e-10 | 3018 |
| 0.9714 | 0.6776 | 0.9786 | 0.6761 | 9.954514e-10 | 3019 |
| 0.9725 | 0.6776 | 0.9786 | 0.6761 | 9.954484e-10 | 3020 |
| 0.9716 | 0.6776 | 0.9786 | 0.6761 | 9.954454e-10 | 3021 |
| 0.9626 | 0.6776 | 0.9785 | 0.6761 | 9.954424e-10 | 3022 |
| 0.9678 | 0.6776 | 0.9785 | 0.6761 | 9.954394e-10 | 3023 |
| 0.9692 | 0.6776 | 0.9785 | 0.6761 | 9.954364e-10 | 3024 |
| 0.9628 | 0.6776 | 0.9785 | 0.6761 | 9.954334e-10 | 3025 |
| 0.9650 | 0.6776 | 0.9785 | 0.6761 | 9.954304e-10 | 3026 |
| 0.9670 | 0.6776 | 0.9784 | 0.6761 | 9.954274e-10 | 3027 |
| 0.9662 | 0.6776 | 0.9784 | 0.6761 | 9.954244e-10 | 3028 |
| 0.9725 | 0.6776 | 0.9784 | 0.6761 | 9.954214e-10 | 3029 |
| 0.9703 | 0.6776 | 0.9784 | 0.6761 | 9.954184e-10 | 3030 |
| 0.9669 | 0.6776 | 0.9783 | 0.6761 | 9.954154e-10 | 3031 |
| 0.9707 | 0.6776 | 0.9783 | 0.6761 | 9.954124e-10 | 3032 |
| 0.9703 | 0.6776 | 0.9783 | 0.6761 | 9.954094e-10 | 3033 |
| 0.9671 | 0.6776 | 0.9783 | 0.6761 | 9.954064e-10 | 3034 |
| 0.9663 | 0.6776 | 0.9783 | 0.6761 | 9.954034e-10 | 3035 |
| 0.9723 | 0.6776 | 0.9782 | 0.6761 | 9.954004e-10 | 3036 |
| 0.9727 | 0.6776 | 0.9782 | 0.6761 | 9.953974e-10 | 3037 |
| 0.9730 | 0.6776 | 0.9782 | 0.6761 | 9.953944e-10 | 3038 |
| 0.9677 | 0.6776 | 0.9782 | 0.6761 | 9.953914e-10 | 3039 |
| 0.9710 | 0.6776 | 0.9782 | 0.6761 | 9.953884e-10 | 3040 |
| 0.9666 | 0.6776 | 0.9781 | 0.6761 | 9.953854e-10 | 3041 |
| 0.9705 | 0.6776 | 0.9781 | 0.6761 | 9.953824e-10 | 3042 |
| 0.9715 | 0.6776 | 0.9781 | 0.6761 | 9.953794e-10 | 3043 |
| 0.9731 | 0.6776 | 0.9781 | 0.6761 | 9.953764e-10 | 3044 |
| 0.9722 | 0.6776 | 0.9780 | 0.6761 | 9.953735e-10 | 3045 |
| 0.9662 | 0.6776 | 0.9780 | 0.6761 | 9.953705e-10 | 3046 |
| 0.9687 | 0.6776 | 0.9780 | 0.6761 | 9.953675e-10 | 3047 |
| 0.9717 | 0.6776 | 0.9780 | 0.6761 | 9.953645e-10 | 3048 |
| 0.9590 | 0.6776 | 0.9780 | 0.6761 | 9.953615e-10 | 3049 |
| 0.9667 | 0.6776 | 0.9780 | 0.6761 | 9.953585e-10 | 3050 |
| 0.9708 | 0.6776 | 0.9779 | 0.6761 | 9.953555e-10 | 3051 |
| 0.9573 | 0.6776 | 0.9779 | 0.6761 | 9.953525e-10 | 3052 |
| 0.9638 | 0.6776 | 0.9779 | 0.6761 | 9.953495e-10 | 3053 |
| 0.9660 | 0.6776 | 0.9779 | 0.6761 | 9.953465e-10 | 3054 |
| 0.9686 | 0.6776 | 0.9779 | 0.6761 | 9.953435e-10 | 3055 |
| 0.9705 | 0.6776 | 0.9778 | 0.6761 | 9.953405e-10 | 3056 |
| 0.9687 | 0.6776 | 0.9778 | 0.6761 | 9.953375e-10 | 3057 |
| 0.9740 | 0.6776 | 0.9778 | 0.6761 | 9.953345e-10 | 3058 |
| 0.9758 | 0.6776 | 0.9778 | 0.6761 | 9.953315e-10 | 3059 |
| 0.9684 | 0.6776 | 0.9777 | 0.6761 | 9.953285e-10 | 3060 |
| 0.9681 | 0.6776 | 0.9777 | 0.6761 | 9.953255e-10 | 3061 |
| 0.9653 | 0.6776 | 0.9777 | 0.6761 | 9.953225e-10 | 3062 |
| 0.9688 | 0.6776 | 0.9777 | 0.6761 | 9.953195e-10 | 3063 |
| 0.9702 | 0.6776 | 0.9777 | 0.6761 | 9.953165e-10 | 3064 |
| 0.9727 | 0.6776 | 0.9776 | 0.6761 | 9.953135e-10 | 3065 |
| 0.9675 | 0.6776 | 0.9776 | 0.6761 | 9.953105e-10 | 3066 |
| 0.9635 | 0.6776 | 0.9776 | 0.6761 | 9.953075e-10 | 3067 |
| 0.9725 | 0.6776 | 0.9776 | 0.6761 | 9.953044e-10 | 3068 |
| 0.9634 | 0.6776 | 0.9776 | 0.6761 | 9.953013e-10 | 3069 |
| 0.9686 | 0.6776 | 0.9775 | 0.6761 | 9.952982e-10 | 3070 |
| 0.9692 | 0.6776 | 0.9775 | 0.6761 | 9.952951e-10 | 3071 |
| 0.9656 | 0.6776 | 0.9775 | 0.6761 | 9.95292e-10 | 3072 |
| 0.9672 | 0.6776 | 0.9775 | 0.6761 | 9.952889e-10 | 3073 |
| 0.9691 | 0.6776 | 0.9774 | 0.6761 | 9.952857e-10 | 3074 |
| 0.9673 | 0.6776 | 0.9774 | 0.6761 | 9.952826e-10 | 3075 |
| 0.9621 | 0.6776 | 0.9774 | 0.6761 | 9.952795e-10 | 3076 |
| 0.9706 | 0.6776 | 0.9774 | 0.6761 | 9.952764e-10 | 3077 |
| 0.9652 | 0.6776 | 0.9774 | 0.6761 | 9.952733e-10 | 3078 |
| 0.9673 | 0.6776 | 0.9773 | 0.6761 | 9.952702e-10 | 3079 |
| 0.9658 | 0.6776 | 0.9773 | 0.6761 | 9.952671e-10 | 3080 |
| 0.9751 | 0.6776 | 0.9773 | 0.6761 | 9.95264e-10 | 3081 |
| 0.9629 | 0.6776 | 0.9773 | 0.6761 | 9.952609e-10 | 3082 |
| 0.9702 | 0.6776 | 0.9773 | 0.6761 | 9.952578e-10 | 3083 |
| 0.9703 | 0.6776 | 0.9772 | 0.6761 | 9.952547e-10 | 3084 |
| 0.9694 | 0.6776 | 0.9772 | 0.6761 | 9.952515e-10 | 3085 |
| 0.9703 | 0.6776 | 0.9772 | 0.6761 | 9.952484e-10 | 3086 |
| 0.9621 | 0.6776 | 0.9772 | 0.6761 | 9.952453e-10 | 3087 |
| 0.9646 | 0.6776 | 0.9772 | 0.6761 | 9.952422e-10 | 3088 |
| 0.9632 | 0.6776 | 0.9771 | 0.6761 | 9.952391e-10 | 3089 |
| 0.9651 | 0.6776 | 0.9771 | 0.6761 | 9.95236e-10 | 3090 |
| 0.9679 | 0.6776 | 0.9771 | 0.6761 | 9.952329e-10 | 3091 |
| 0.9695 | 0.6776 | 0.9771 | 0.6761 | 9.952298e-10 | 3092 |
| 0.9665 | 0.6776 | 0.9771 | 0.6761 | 9.952267e-10 | 3093 |
| 0.9672 | 0.6776 | 0.9770 | 0.6761 | 9.952236e-10 | 3094 |
| 0.9679 | 0.6776 | 0.9770 | 0.6761 | 9.952205e-10 | 3095 |
| 0.9636 | 0.6776 | 0.9770 | 0.6761 | 9.952174e-10 | 3096 |
| 0.9689 | 0.6776 | 0.9770 | 0.6761 | 9.952142e-10 | 3097 |
| 0.9737 | 0.6776 | 0.9770 | 0.6761 | 9.952111e-10 | 3098 |
| 0.9684 | 0.6776 | 0.9769 | 0.6761 | 9.95208e-10 | 3099 |
| 0.9690 | 0.6776 | 0.9769 | 0.6761 | 9.952049e-10 | 3100 |
| 0.9719 | 0.6776 | 0.9769 | 0.6761 | 9.952018e-10 | 3101 |
| 0.9644 | 0.6776 | 0.9769 | 0.6761 | 9.951987e-10 | 3102 |
| 0.9679 | 0.6776 | 0.9769 | 0.6761 | 9.951956e-10 | 3103 |
| 0.9583 | 0.6776 | 0.9768 | 0.6761 | 9.951925e-10 | 3104 |
| 0.9630 | 0.6776 | 0.9768 | 0.6761 | 9.951894e-10 | 3105 |
| 0.9644 | 0.6776 | 0.9768 | 0.6761 | 9.951863e-10 | 3106 |
| 0.9671 | 0.6776 | 0.9768 | 0.6761 | 9.951832e-10 | 3107 |
| 0.9722 | 0.6776 | 0.9768 | 0.6761 | 9.9518e-10 | 3108 |
| 0.9725 | 0.6776 | 0.9767 | 0.6761 | 9.951769e-10 | 3109 |
| 0.9680 | 0.6776 | 0.9767 | 0.6761 | 9.951738e-10 | 3110 |
| 0.9725 | 0.6776 | 0.9767 | 0.6761 | 9.951707e-10 | 3111 |
| 0.9652 | 0.6776 | 0.9767 | 0.6761 | 9.951676e-10 | 3112 |
| 0.9648 | 0.6776 | 0.9767 | 0.6761 | 9.951645e-10 | 3113 |
| 0.9672 | 0.6776 | 0.9766 | 0.6761 | 9.951614e-10 | 3114 |
| 0.9708 | 0.6776 | 0.9766 | 0.6761 | 9.951583e-10 | 3115 |
| 0.9643 | 0.6776 | 0.9766 | 0.6761 | 9.951552e-10 | 3116 |
| 0.9672 | 0.6776 | 0.9766 | 0.6761 | 9.951521e-10 | 3117 |
| 0.9693 | 0.6776 | 0.9766 | 0.6761 | 9.95149e-10 | 3118 |
| 0.9635 | 0.6776 | 0.9765 | 0.6761 | 9.951459e-10 | 3119 |
| 0.9657 | 0.6776 | 0.9765 | 0.6761 | 9.951427e-10 | 3120 |
| 0.9665 | 0.6776 | 0.9765 | 0.6761 | 9.951396e-10 | 3121 |
| 0.9689 | 0.6776 | 0.9765 | 0.6761 | 9.951365e-10 | 3122 |
| 0.9692 | 0.6776 | 0.9765 | 0.6761 | 9.951334e-10 | 3123 |
| 0.9672 | 0.6776 | 0.9764 | 0.6761 | 9.951303e-10 | 3124 |
| 0.9708 | 0.6776 | 0.9764 | 0.6761 | 9.951272e-10 | 3125 |
| 0.9675 | 0.6776 | 0.9764 | 0.6761 | 9.951241e-10 | 3126 |
| 0.9663 | 0.6776 | 0.9764 | 0.6761 | 9.95121e-10 | 3127 |
| 0.9627 | 0.6776 | 0.9764 | 0.6761 | 9.951179e-10 | 3128 |
| 0.9636 | 0.6776 | 0.9763 | 0.6761 | 9.951148e-10 | 3129 |
| 0.9654 | 0.6776 | 0.9763 | 0.6761 | 9.951117e-10 | 3130 |
| 0.9711 | 0.6776 | 0.9763 | 0.6761 | 9.951086e-10 | 3131 |
| 0.9691 | 0.6776 | 0.9763 | 0.6761 | 9.951054e-10 | 3132 |
| 0.9668 | 0.6776 | 0.9763 | 0.6761 | 9.951023e-10 | 3133 |
| 0.9693 | 0.6776 | 0.9762 | 0.6761 | 9.950992e-10 | 3134 |
| 0.9603 | 0.6776 | 0.9762 | 0.6761 | 9.950961e-10 | 3135 |
| 0.9754 | 0.6776 | 0.9762 | 0.6761 | 9.95093e-10 | 3136 |
| 0.9646 | 0.6776 | 0.9762 | 0.6761 | 9.950899e-10 | 3137 |
| 0.9697 | 0.6776 | 0.9762 | 0.6761 | 9.950868e-10 | 3138 |
| 0.9749 | 0.6776 | 0.9761 | 0.6761 | 9.950837e-10 | 3139 |
| 0.9689 | 0.6776 | 0.9761 | 0.6761 | 9.950806e-10 | 3140 |
| 0.9659 | 0.6776 | 0.9761 | 0.6761 | 9.950775e-10 | 3141 |
| 0.9681 | 0.6776 | 0.9761 | 0.6761 | 9.950744e-10 | 3142 |
| 0.9660 | 0.6776 | 0.9761 | 0.6761 | 9.950712e-10 | 3143 |
| 0.9703 | 0.6776 | 0.9760 | 0.6761 | 9.950681e-10 | 3144 |
| 0.9730 | 0.6776 | 0.9760 | 0.6761 | 9.95065e-10 | 3145 |
| 0.9643 | 0.6776 | 0.9760 | 0.6761 | 9.950619e-10 | 3146 |
| 0.9603 | 0.6776 | 0.9760 | 0.6761 | 9.950588e-10 | 3147 |
| 0.9702 | 0.6776 | 0.9760 | 0.6761 | 9.950557e-10 | 3148 |
| 0.9702 | 0.6776 | 0.9760 | 0.6761 | 9.950526e-10 | 3149 |
| 0.9644 | 0.6776 | 0.9759 | 0.6761 | 9.950495e-10 | 3150 |
| 0.9642 | 0.6776 | 0.9759 | 0.6761 | 9.950464e-10 | 3151 |
| 0.9715 | 0.6776 | 0.9759 | 0.6761 | 9.950433e-10 | 3152 |
| 0.9679 | 0.6776 | 0.9759 | 0.6761 | 9.950402e-10 | 3153 |
| 0.9679 | 0.6776 | 0.9759 | 0.6761 | 9.95037e-10 | 3154 |
| 0.9749 | 0.6776 | 0.9758 | 0.6761 | 9.950339e-10 | 3155 |
| 0.9664 | 0.6776 | 0.9758 | 0.6761 | 9.950308e-10 | 3156 |
| 0.9622 | 0.6776 | 0.9758 | 0.6761 | 9.950277e-10 | 3157 |
| 0.9556 | 0.6776 | 0.9758 | 0.6761 | 9.950246e-10 | 3158 |
| 0.9627 | 0.6776 | 0.9758 | 0.6761 | 9.950215e-10 | 3159 |
| 0.9664 | 0.6776 | 0.9757 | 0.6761 | 9.950184e-10 | 3160 |
| 0.9675 | 0.6776 | 0.9757 | 0.6761 | 9.950153e-10 | 3161 |
| 0.9693 | 0.6776 | 0.9757 | 0.6761 | 9.950122e-10 | 3162 |
| 0.9683 | 0.6776 | 0.9757 | 0.6761 | 9.950091e-10 | 3163 |
| 0.9698 | 0.6776 | 0.9757 | 0.6761 | 9.95006e-10 | 3164 |
| 0.9712 | 0.6776 | 0.9757 | 0.6761 | 9.950029e-10 | 3165 |
| 0.9693 | 0.6776 | 0.9756 | 0.6761 | 9.949997e-10 | 3166 |
| 0.9722 | 0.6776 | 0.9756 | 0.6761 | 9.949966e-10 | 3167 |
| 0.9643 | 0.6776 | 0.9756 | 0.6761 | 9.949935e-10 | 3168 |
| 0.9587 | 0.6776 | 0.9756 | 0.6761 | 9.949904e-10 | 3169 |
| 0.9680 | 0.6776 | 0.9756 | 0.6761 | 9.949873e-10 | 3170 |
| 0.9638 | 0.6776 | 0.9755 | 0.6761 | 9.949842e-10 | 3171 |
| 0.9671 | 0.6776 | 0.9755 | 0.6761 | 9.949811e-10 | 3172 |
| 0.9687 | 0.6776 | 0.9755 | 0.6761 | 9.94978e-10 | 3173 |
| 0.9667 | 0.6776 | 0.9755 | 0.6761 | 9.949749e-10 | 3174 |
| 0.9640 | 0.6776 | 0.9755 | 0.6761 | 9.949718e-10 | 3175 |
| 0.9674 | 0.6776 | 0.9754 | 0.6761 | 9.949687e-10 | 3176 |
| 0.9726 | 0.6776 | 0.9754 | 0.6761 | 9.949656e-10 | 3177 |
| 0.9681 | 0.6776 | 0.9754 | 0.6761 | 9.949624e-10 | 3178 |
| 0.9653 | 0.6776 | 0.9754 | 0.6761 | 9.949593e-10 | 3179 |
| 0.9633 | 0.6776 | 0.9754 | 0.6761 | 9.949562e-10 | 3180 |
| 0.9678 | 0.6776 | 0.9754 | 0.6761 | 9.94953e-10 | 3181 |
| 0.9618 | 0.6776 | 0.9753 | 0.6761 | 9.949498e-10 | 3182 |
| 0.9594 | 0.6776 | 0.9753 | 0.6761 | 9.949466e-10 | 3183 |
| 0.9562 | 0.6776 | 0.9753 | 0.6761 | 9.949433e-10 | 3184 |
| 0.9692 | 0.6776 | 0.9753 | 0.6761 | 9.949401e-10 | 3185 |
| 0.9627 | 0.6776 | 0.9753 | 0.6761 | 9.949369e-10 | 3186 |
| 0.9659 | 0.6776 | 0.9752 | 0.6761 | 9.949337e-10 | 3187 |
| 0.9659 | 0.6776 | 0.9752 | 0.6761 | 9.949305e-10 | 3188 |
| 0.9691 | 0.6776 | 0.9752 | 0.6761 | 9.949273e-10 | 3189 |
| 0.9620 | 0.6776 | 0.9752 | 0.6761 | 9.94924e-10 | 3190 |
| 0.9642 | 0.6776 | 0.9752 | 0.6761 | 9.949208e-10 | 3191 |
| 0.9683 | 0.6776 | 0.9752 | 0.6761 | 9.949176e-10 | 3192 |
| 0.9663 | 0.6776 | 0.9751 | 0.6761 | 9.949144e-10 | 3193 |
| 0.9605 | 0.6776 | 0.9751 | 0.6761 | 9.949112e-10 | 3194 |
| 0.9646 | 0.6776 | 0.9751 | 0.6761 | 9.949079e-10 | 3195 |
| 0.9678 | 0.6776 | 0.9751 | 0.6761 | 9.949047e-10 | 3196 |
| 0.9637 | 0.6776 | 0.9751 | 0.6761 | 9.949015e-10 | 3197 |
| 0.9596 | 0.6776 | 0.9750 | 0.6761 | 9.948983e-10 | 3198 |
| 0.9706 | 0.6776 | 0.9750 | 0.6761 | 9.94895e-10 | 3199 |
| 0.9654 | 0.6776 | 0.9750 | 0.6761 | 9.948918e-10 | 3200 |
| 0.9644 | 0.6776 | 0.9750 | 0.6761 | 9.948886e-10 | 3201 |
| 0.9677 | 0.6776 | 0.9750 | 0.6761 | 9.948854e-10 | 3202 |
| 0.9588 | 0.6776 | 0.9750 | 0.6761 | 9.948822e-10 | 3203 |
| 0.9640 | 0.6776 | 0.9749 | 0.6761 | 9.94879e-10 | 3204 |
| 0.9678 | 0.6776 | 0.9749 | 0.6761 | 9.948757e-10 | 3205 |
| 0.9603 | 0.6776 | 0.9749 | 0.6761 | 9.948725e-10 | 3206 |
| 0.9643 | 0.6776 | 0.9749 | 0.6761 | 9.948693e-10 | 3207 |
| 0.9628 | 0.6776 | 0.9749 | 0.6761 | 9.948661e-10 | 3208 |
| 0.9660 | 0.6776 | 0.9748 | 0.6761 | 9.948629e-10 | 3209 |
| 0.9678 | 0.6776 | 0.9748 | 0.6761 | 9.948596e-10 | 3210 |
| 0.9658 | 0.6776 | 0.9748 | 0.6761 | 9.948564e-10 | 3211 |
| 0.9674 | 0.6776 | 0.9748 | 0.6761 | 9.948532e-10 | 3212 |
| 0.9635 | 0.6776 | 0.9748 | 0.6761 | 9.9485e-10 | 3213 |
| 0.9674 | 0.6776 | 0.9748 | 0.6761 | 9.948468e-10 | 3214 |
| 0.9654 | 0.6776 | 0.9747 | 0.6761 | 9.948435e-10 | 3215 |
| 0.9698 | 0.6776 | 0.9747 | 0.6761 | 9.948403e-10 | 3216 |
| 0.9620 | 0.6776 | 0.9747 | 0.6761 | 9.948371e-10 | 3217 |
| 0.9687 | 0.6776 | 0.9747 | 0.6761 | 9.948339e-10 | 3218 |
| 0.9670 | 0.6776 | 0.9747 | 0.6761 | 9.948307e-10 | 3219 |
| 0.9665 | 0.6776 | 0.9747 | 0.6761 | 9.948274e-10 | 3220 |
| 0.9652 | 0.6776 | 0.9746 | 0.6761 | 9.948242e-10 | 3221 |
| 0.9639 | 0.6776 | 0.9746 | 0.6761 | 9.94821e-10 | 3222 |
| 0.9637 | 0.6776 | 0.9746 | 0.6761 | 9.948178e-10 | 3223 |
| 0.9692 | 0.6776 | 0.9746 | 0.6761 | 9.948146e-10 | 3224 |
| 0.9674 | 0.6776 | 0.9746 | 0.6761 | 9.948113e-10 | 3225 |
| 0.9631 | 0.6776 | 0.9746 | 0.6761 | 9.948081e-10 | 3226 |
| 0.9678 | 0.6776 | 0.9745 | 0.6761 | 9.948049e-10 | 3227 |
| 0.9654 | 0.6776 | 0.9745 | 0.6761 | 9.948017e-10 | 3228 |
| 0.9660 | 0.6776 | 0.9745 | 0.6761 | 9.947985e-10 | 3229 |
| 0.9591 | 0.6776 | 0.9745 | 0.6761 | 9.947952e-10 | 3230 |
| 0.9616 | 0.6776 | 0.9745 | 0.6761 | 9.94792e-10 | 3231 |
| 0.9614 | 0.6776 | 0.9745 | 0.6761 | 9.947888e-10 | 3232 |
| 0.9606 | 0.6776 | 0.9744 | 0.6761 | 9.947856e-10 | 3233 |
| 0.9634 | 0.6776 | 0.9744 | 0.6761 | 9.947824e-10 | 3234 |
| 0.9651 | 0.6776 | 0.9744 | 0.6761 | 9.947791e-10 | 3235 |
| 0.9671 | 0.6776 | 0.9744 | 0.6761 | 9.947759e-10 | 3236 |
| 0.9583 | 0.6776 | 0.9744 | 0.6761 | 9.947727e-10 | 3237 |
| 0.9638 | 0.6776 | 0.9744 | 0.6761 | 9.947695e-10 | 3238 |
| 0.9629 | 0.6776 | 0.9743 | 0.6761 | 9.947663e-10 | 3239 |
| 0.9654 | 0.6776 | 0.9743 | 0.6761 | 9.94763e-10 | 3240 |
| 0.9660 | 0.6776 | 0.9743 | 0.6761 | 9.947598e-10 | 3241 |
| 0.9625 | 0.6776 | 0.9743 | 0.6761 | 9.947566e-10 | 3242 |
| 0.9622 | 0.6776 | 0.9743 | 0.6761 | 9.947534e-10 | 3243 |
| 0.9629 | 0.6776 | 0.9742 | 0.6761 | 9.947502e-10 | 3244 |
| 0.9707 | 0.6776 | 0.9742 | 0.6761 | 9.94747e-10 | 3245 |
| 0.9660 | 0.6776 | 0.9742 | 0.6761 | 9.947437e-10 | 3246 |
| 0.9588 | 0.6776 | 0.9742 | 0.6761 | 9.947405e-10 | 3247 |
| 0.9602 | 0.6776 | 0.9742 | 0.6761 | 9.947373e-10 | 3248 |
| 0.9647 | 0.6776 | 0.9742 | 0.6761 | 9.947341e-10 | 3249 |
| 0.9646 | 0.6776 | 0.9741 | 0.6761 | 9.947309e-10 | 3250 |
| 0.9653 | 0.6776 | 0.9741 | 0.6761 | 9.947276e-10 | 3251 |
| 0.9707 | 0.6776 | 0.9741 | 0.6761 | 9.947244e-10 | 3252 |
| 0.9633 | 0.6776 | 0.9741 | 0.6761 | 9.947212e-10 | 3253 |
| 0.9625 | 0.6776 | 0.9741 | 0.6761 | 9.94718e-10 | 3254 |
| 0.9625 | 0.6776 | 0.9741 | 0.6761 | 9.947148e-10 | 3255 |
| 0.9651 | 0.6776 | 0.9740 | 0.6761 | 9.947115e-10 | 3256 |
| 0.9634 | 0.6776 | 0.9740 | 0.6761 | 9.947083e-10 | 3257 |
| 0.9664 | 0.6776 | 0.9740 | 0.6761 | 9.947051e-10 | 3258 |
| 0.9544 | 0.6776 | 0.9740 | 0.6761 | 9.947019e-10 | 3259 |
| 0.9646 | 0.6776 | 0.9740 | 0.6761 | 9.946987e-10 | 3260 |
| 0.9646 | 0.6776 | 0.9740 | 0.6761 | 9.946954e-10 | 3261 |
| 0.9684 | 0.6776 | 0.9739 | 0.6761 | 9.946922e-10 | 3262 |
| 0.9652 | 0.6776 | 0.9739 | 0.6761 | 9.94689e-10 | 3263 |
| 0.9630 | 0.6776 | 0.9739 | 0.6761 | 9.946858e-10 | 3264 |
| 0.9583 | 0.6776 | 0.9739 | 0.6761 | 9.946826e-10 | 3265 |
| 0.9664 | 0.6776 | 0.9739 | 0.6761 | 9.946793e-10 | 3266 |
| 0.9611 | 0.6776 | 0.9739 | 0.6761 | 9.946761e-10 | 3267 |
| 0.9640 | 0.6776 | 0.9738 | 0.6761 | 9.946729e-10 | 3268 |
| 0.9656 | 0.6776 | 0.9738 | 0.6761 | 9.946697e-10 | 3269 |
| 0.9626 | 0.6776 | 0.9738 | 0.6761 | 9.946665e-10 | 3270 |
| 0.9662 | 0.6776 | 0.9738 | 0.6761 | 9.946632e-10 | 3271 |
| 0.9641 | 0.6776 | 0.9738 | 0.6761 | 9.9466e-10 | 3272 |
| 0.9605 | 0.6776 | 0.9738 | 0.6761 | 9.946568e-10 | 3273 |
| 0.9628 | 0.6776 | 0.9737 | 0.6761 | 9.946536e-10 | 3274 |
| 0.9607 | 0.6776 | 0.9737 | 0.6761 | 9.946504e-10 | 3275 |
| 0.9577 | 0.6776 | 0.9737 | 0.6761 | 9.946471e-10 | 3276 |
| 0.9624 | 0.6776 | 0.9737 | 0.6761 | 9.946439e-10 | 3277 |
| 0.9662 | 0.6776 | 0.9737 | 0.6761 | 9.946407e-10 | 3278 |
| 0.9645 | 0.6776 | 0.9737 | 0.6761 | 9.946375e-10 | 3279 |
| 0.9541 | 0.6776 | 0.9736 | 0.6761 | 9.946343e-10 | 3280 |
| 0.9570 | 0.6776 | 0.9736 | 0.6761 | 9.94631e-10 | 3281 |
| 0.9645 | 0.6776 | 0.9736 | 0.6761 | 9.946278e-10 | 3282 |
| 0.9620 | 0.6776 | 0.9736 | 0.6761 | 9.946246e-10 | 3283 |
| 0.9649 | 0.6776 | 0.9736 | 0.6761 | 9.946214e-10 | 3284 |
| 0.9667 | 0.6776 | 0.9736 | 0.6761 | 9.946182e-10 | 3285 |
| 0.9616 | 0.6776 | 0.9735 | 0.6761 | 9.94615e-10 | 3286 |
| 0.9651 | 0.6776 | 0.9735 | 0.6761 | 9.946117e-10 | 3287 |
| 0.9595 | 0.6776 | 0.9735 | 0.6761 | 9.946085e-10 | 3288 |
| 0.9626 | 0.6776 | 0.9735 | 0.6761 | 9.946053e-10 | 3289 |
| 0.9668 | 0.6776 | 0.9735 | 0.6761 | 9.946021e-10 | 3290 |
| 0.9627 | 0.6776 | 0.9735 | 0.6761 | 9.945988e-10 | 3291 |
| 0.9652 | 0.6776 | 0.9734 | 0.6761 | 9.945956e-10 | 3292 |
| 0.9606 | 0.6776 | 0.9734 | 0.6761 | 9.945923e-10 | 3293 |
| 0.9612 | 0.6776 | 0.9734 | 0.6761 | 9.94589e-10 | 3294 |
| 0.9657 | 0.6776 | 0.9734 | 0.6761 | 9.945856e-10 | 3295 |
| 0.9601 | 0.6776 | 0.9734 | 0.6761 | 9.945823e-10 | 3296 |
| 0.9671 | 0.6776 | 0.9734 | 0.6761 | 9.94579e-10 | 3297 |
| 0.9605 | 0.6776 | 0.9733 | 0.6761 | 9.945756e-10 | 3298 |
| 0.9581 | 0.6776 | 0.9733 | 0.6761 | 9.945723e-10 | 3299 |
| 0.9627 | 0.6776 | 0.9733 | 0.6761 | 9.94569e-10 | 3300 |
| 0.9606 | 0.6776 | 0.9733 | 0.6761 | 9.945657e-10 | 3301 |
| 0.9664 | 0.6776 | 0.9733 | 0.6761 | 9.945623e-10 | 3302 |
| 0.9656 | 0.6776 | 0.9733 | 0.6761 | 9.94559e-10 | 3303 |
| 0.9645 | 0.6776 | 0.9732 | 0.6761 | 9.945557e-10 | 3304 |
| 0.9675 | 0.6776 | 0.9732 | 0.6761 | 9.945523e-10 | 3305 |
| 0.9616 | 0.6776 | 0.9732 | 0.6761 | 9.94549e-10 | 3306 |
| 0.9607 | 0.6776 | 0.9732 | 0.6761 | 9.945457e-10 | 3307 |
| 0.9593 | 0.6776 | 0.9732 | 0.6761 | 9.945423e-10 | 3308 |
| 0.9618 | 0.6776 | 0.9732 | 0.6761 | 9.94539e-10 | 3309 |
| 0.9617 | 0.6776 | 0.9732 | 0.6761 | 9.945357e-10 | 3310 |
| 0.9611 | 0.6776 | 0.9731 | 0.6761 | 9.945323e-10 | 3311 |
| 0.9662 | 0.6776 | 0.9731 | 0.6761 | 9.94529e-10 | 3312 |
| 0.9622 | 0.6776 | 0.9731 | 0.6761 | 9.945257e-10 | 3313 |
| 0.9608 | 0.6776 | 0.9731 | 0.6761 | 9.945224e-10 | 3314 |
| 0.9644 | 0.6776 | 0.9731 | 0.6761 | 9.94519e-10 | 3315 |
| 0.9600 | 0.6776 | 0.9731 | 0.6761 | 9.945157e-10 | 3316 |
| 0.9599 | 0.6776 | 0.9730 | 0.6761 | 9.945124e-10 | 3317 |
| 0.9656 | 0.6776 | 0.9730 | 0.6761 | 9.94509e-10 | 3318 |
| 0.9585 | 0.6776 | 0.9730 | 0.6761 | 9.945057e-10 | 3319 |
| 0.9629 | 0.6776 | 0.9730 | 0.6761 | 9.945024e-10 | 3320 |
| 0.9642 | 0.6776 | 0.9730 | 0.6761 | 9.94499e-10 | 3321 |
| 0.9667 | 0.6776 | 0.9730 | 0.6761 | 9.944957e-10 | 3322 |
| 0.9642 | 0.6776 | 0.9729 | 0.6761 | 9.944924e-10 | 3323 |
| 0.9584 | 0.6776 | 0.9729 | 0.6761 | 9.94489e-10 | 3324 |
| 0.9560 | 0.6776 | 0.9729 | 0.6761 | 9.944857e-10 | 3325 |
| 0.9587 | 0.6776 | 0.9729 | 0.6761 | 9.944824e-10 | 3326 |
| 0.9606 | 0.6776 | 0.9729 | 0.6761 | 9.94479e-10 | 3327 |
| 0.9584 | 0.6776 | 0.9729 | 0.6761 | 9.944757e-10 | 3328 |
| 0.9668 | 0.6776 | 0.9728 | 0.6761 | 9.944724e-10 | 3329 |
| 0.9604 | 0.6776 | 0.9728 | 0.6761 | 9.944691e-10 | 3330 |
| 0.9648 | 0.6776 | 0.9728 | 0.6761 | 9.944657e-10 | 3331 |
| 0.9694 | 0.6776 | 0.9728 | 0.6761 | 9.944624e-10 | 3332 |
| 0.9644 | 0.6776 | 0.9728 | 0.6761 | 9.944591e-10 | 3333 |
| 0.9729 | 0.6776 | 0.9728 | 0.6761 | 9.944557e-10 | 3334 |
| 0.9602 | 0.6776 | 0.9727 | 0.6761 | 9.944524e-10 | 3335 |
| 0.9596 | 0.6776 | 0.9727 | 0.6761 | 9.944491e-10 | 3336 |
| 0.9612 | 0.6776 | 0.9727 | 0.6761 | 9.944457e-10 | 3337 |
| 0.9623 | 0.6776 | 0.9727 | 0.6761 | 9.944424e-10 | 3338 |
| 0.9686 | 0.6776 | 0.9727 | 0.6761 | 9.944391e-10 | 3339 |
| 0.9597 | 0.6776 | 0.9727 | 0.6761 | 9.944358e-10 | 3340 |
| 0.9608 | 0.6776 | 0.9726 | 0.6761 | 9.944324e-10 | 3341 |
| 0.9589 | 0.6776 | 0.9726 | 0.6761 | 9.944291e-10 | 3342 |
| 0.9647 | 0.6776 | 0.9726 | 0.6761 | 9.944258e-10 | 3343 |
| 0.9613 | 0.6776 | 0.9726 | 0.6761 | 9.944224e-10 | 3344 |
| 0.9602 | 0.6776 | 0.9726 | 0.6761 | 9.944191e-10 | 3345 |
| 0.9569 | 0.6776 | 0.9726 | 0.6761 | 9.944158e-10 | 3346 |
| 0.9688 | 0.6776 | 0.9726 | 0.6761 | 9.944124e-10 | 3347 |
| 0.9595 | 0.6776 | 0.9725 | 0.6761 | 9.944091e-10 | 3348 |
| 0.9622 | 0.6776 | 0.9725 | 0.6761 | 9.944058e-10 | 3349 |
| 0.9618 | 0.6776 | 0.9725 | 0.6761 | 9.944024e-10 | 3350 |
| 0.9598 | 0.6776 | 0.9725 | 0.6761 | 9.943991e-10 | 3351 |
| 0.9649 | 0.6776 | 0.9725 | 0.6761 | 9.943958e-10 | 3352 |
| 0.9629 | 0.6776 | 0.9725 | 0.6761 | 9.943925e-10 | 3353 |
| 0.9685 | 0.6776 | 0.9725 | 0.6761 | 9.943891e-10 | 3354 |
| 0.9631 | 0.6776 | 0.9724 | 0.6761 | 9.943858e-10 | 3355 |
| 0.9641 | 0.6776 | 0.9724 | 0.6761 | 9.943825e-10 | 3356 |
| 0.9588 | 0.6776 | 0.9724 | 0.6761 | 9.943791e-10 | 3357 |
| 0.9630 | 0.6776 | 0.9724 | 0.6761 | 9.943758e-10 | 3358 |
| 0.9625 | 0.6776 | 0.9724 | 0.6761 | 9.943725e-10 | 3359 |
| 0.9622 | 0.6776 | 0.9724 | 0.6761 | 9.943691e-10 | 3360 |
| 0.9674 | 0.6776 | 0.9723 | 0.6761 | 9.943658e-10 | 3361 |
| 0.9657 | 0.6776 | 0.9723 | 0.6761 | 9.943625e-10 | 3362 |
| 0.9597 | 0.6776 | 0.9723 | 0.6761 | 9.943592e-10 | 3363 |
| 0.9581 | 0.6776 | 0.9723 | 0.6761 | 9.943558e-10 | 3364 |
| 0.9606 | 0.6776 | 0.9723 | 0.6761 | 9.943525e-10 | 3365 |
| 0.9623 | 0.6776 | 0.9723 | 0.6761 | 9.943492e-10 | 3366 |
| 0.9575 | 0.6776 | 0.9723 | 0.6761 | 9.943458e-10 | 3367 |
| 0.9671 | 0.6776 | 0.9722 | 0.6761 | 9.943425e-10 | 3368 |
| 0.9662 | 0.6776 | 0.9722 | 0.6761 | 9.943392e-10 | 3369 |
| 0.9681 | 0.6776 | 0.9722 | 0.6761 | 9.943358e-10 | 3370 |
| 0.9561 | 0.6776 | 0.9722 | 0.6761 | 9.943325e-10 | 3371 |
| 0.9647 | 0.6776 | 0.9722 | 0.6761 | 9.943292e-10 | 3372 |
| 0.9635 | 0.6776 | 0.9722 | 0.6761 | 9.943258e-10 | 3373 |
| 0.9638 | 0.6776 | 0.9722 | 0.6761 | 9.943225e-10 | 3374 |
| 0.9639 | 0.6776 | 0.9721 | 0.6761 | 9.943192e-10 | 3375 |
| 0.9635 | 0.6776 | 0.9721 | 0.6761 | 9.943159e-10 | 3376 |
| 0.9613 | 0.6776 | 0.9721 | 0.6761 | 9.943125e-10 | 3377 |
| 0.9638 | 0.6776 | 0.9721 | 0.6761 | 9.943092e-10 | 3378 |
| 0.9596 | 0.6776 | 0.9721 | 0.6761 | 9.943059e-10 | 3379 |
| 0.9606 | 0.6776 | 0.9721 | 0.6761 | 9.943025e-10 | 3380 |
| 0.9612 | 0.6776 | 0.9720 | 0.6761 | 9.942992e-10 | 3381 |
| 0.9575 | 0.6776 | 0.9720 | 0.6761 | 9.942959e-10 | 3382 |
| 0.9628 | 0.6776 | 0.9720 | 0.6761 | 9.942925e-10 | 3383 |
| 0.9632 | 0.6776 | 0.9720 | 0.6761 | 9.942892e-10 | 3384 |
| 0.9587 | 0.6776 | 0.9720 | 0.6761 | 9.942859e-10 | 3385 |
| 0.9602 | 0.6776 | 0.9720 | 0.6761 | 9.942825e-10 | 3386 |
| 0.9670 | 0.6776 | 0.9720 | 0.6761 | 9.942792e-10 | 3387 |
| 0.9557 | 0.6776 | 0.9719 | 0.6761 | 9.942759e-10 | 3388 |
| 0.9569 | 0.6776 | 0.9719 | 0.6761 | 9.942726e-10 | 3389 |
| 0.9603 | 0.6776 | 0.9719 | 0.6761 | 9.942692e-10 | 3390 |
| 0.9574 | 0.6776 | 0.9719 | 0.6761 | 9.942659e-10 | 3391 |
| 0.9605 | 0.6776 | 0.9719 | 0.6761 | 9.942626e-10 | 3392 |
| 0.9588 | 0.6776 | 0.9719 | 0.6761 | 9.942592e-10 | 3393 |
| 0.9610 | 0.6776 | 0.9719 | 0.6761 | 9.942559e-10 | 3394 |
| 0.9594 | 0.6776 | 0.9718 | 0.6761 | 9.942526e-10 | 3395 |
| 0.9605 | 0.6776 | 0.9718 | 0.6761 | 9.942492e-10 | 3396 |
| 0.9595 | 0.6776 | 0.9718 | 0.6761 | 9.942459e-10 | 3397 |
| 0.9566 | 0.6776 | 0.9718 | 0.6761 | 9.942426e-10 | 3398 |
| 0.9579 | 0.6776 | 0.9718 | 0.6761 | 9.942392e-10 | 3399 |
| 0.9583 | 0.6776 | 0.9718 | 0.6761 | 9.942359e-10 | 3400 |
| 0.9590 | 0.6776 | 0.9718 | 0.6761 | 9.942326e-10 | 3401 |
| 0.9572 | 0.6776 | 0.9717 | 0.6761 | 9.942293e-10 | 3402 |
| 0.9634 | 0.6776 | 0.9717 | 0.6761 | 9.942259e-10 | 3403 |
| 0.9620 | 0.6776 | 0.9717 | 0.6761 | 9.942226e-10 | 3404 |
| 0.9573 | 0.6776 | 0.9717 | 0.6761 | 9.942193e-10 | 3405 |
| 0.9674 | 0.6776 | 0.9717 | 0.6761 | 9.942158e-10 | 3406 |
| 0.9618 | 0.6776 | 0.9717 | 0.6761 | 9.942124e-10 | 3407 |
| 0.9575 | 0.6776 | 0.9717 | 0.6761 | 9.942089e-10 | 3408 |
| 0.9621 | 0.6776 | 0.9717 | 0.6761 | 9.942055e-10 | 3409 |
| 0.9679 | 0.6776 | 0.9716 | 0.6761 | 9.94202e-10 | 3410 |
| 0.9574 | 0.6776 | 0.9716 | 0.6761 | 9.941986e-10 | 3411 |
| 0.9527 | 0.6776 | 0.9716 | 0.6761 | 9.941952e-10 | 3412 |
| 0.9649 | 0.6776 | 0.9716 | 0.6761 | 9.941917e-10 | 3413 |
| 0.9542 | 0.6776 | 0.9716 | 0.6761 | 9.941883e-10 | 3414 |
| 0.9605 | 0.6776 | 0.9716 | 0.6761 | 9.941848e-10 | 3415 |
| 0.9587 | 0.6776 | 0.9715 | 0.6761 | 9.941814e-10 | 3416 |
| 0.9605 | 0.6776 | 0.9715 | 0.6761 | 9.94178e-10 | 3417 |
| 0.9615 | 0.6776 | 0.9715 | 0.6761 | 9.941745e-10 | 3418 |
| 0.9579 | 0.6776 | 0.9715 | 0.6761 | 9.941711e-10 | 3419 |
| 0.9592 | 0.6776 | 0.9715 | 0.6761 | 9.941676e-10 | 3420 |
| 0.9583 | 0.6776 | 0.9715 | 0.6761 | 9.941642e-10 | 3421 |
| 0.9576 | 0.6776 | 0.9715 | 0.6761 | 9.941608e-10 | 3422 |
| 0.9555 | 0.6776 | 0.9714 | 0.6761 | 9.941573e-10 | 3423 |
| 0.9653 | 0.6776 | 0.9714 | 0.6761 | 9.941539e-10 | 3424 |
| 0.9629 | 0.6776 | 0.9714 | 0.6761 | 9.941504e-10 | 3425 |
| 0.9633 | 0.6776 | 0.9714 | 0.6761 | 9.94147e-10 | 3426 |
| 0.9601 | 0.6776 | 0.9714 | 0.6761 | 9.941435e-10 | 3427 |
| 0.9581 | 0.6776 | 0.9714 | 0.6761 | 9.941401e-10 | 3428 |
| 0.9615 | 0.6776 | 0.9714 | 0.6761 | 9.941367e-10 | 3429 |
| 0.9599 | 0.6776 | 0.9714 | 0.6761 | 9.941332e-10 | 3430 |
| 0.9550 | 0.6776 | 0.9713 | 0.6761 | 9.941298e-10 | 3431 |
| 0.9523 | 0.6776 | 0.9713 | 0.6761 | 9.941263e-10 | 3432 |
| 0.9624 | 0.6776 | 0.9713 | 0.6761 | 9.941229e-10 | 3433 |
| 0.9635 | 0.6776 | 0.9713 | 0.6761 | 9.941195e-10 | 3434 |
| 0.9608 | 0.6776 | 0.9713 | 0.6761 | 9.94116e-10 | 3435 |
| 0.9635 | 0.6776 | 0.9713 | 0.6761 | 9.941126e-10 | 3436 |
| 0.9580 | 0.6776 | 0.9713 | 0.6761 | 9.941091e-10 | 3437 |
| 0.9572 | 0.6776 | 0.9712 | 0.6761 | 9.941057e-10 | 3438 |
| 0.9584 | 0.6776 | 0.9712 | 0.6761 | 9.941022e-10 | 3439 |
| 0.9603 | 0.6776 | 0.9712 | 0.6761 | 9.940988e-10 | 3440 |
| 0.9516 | 0.6776 | 0.9712 | 0.6761 | 9.940954e-10 | 3441 |
| 0.9642 | 0.6776 | 0.9712 | 0.6761 | 9.940919e-10 | 3442 |
| 0.9638 | 0.6776 | 0.9712 | 0.6761 | 9.940885e-10 | 3443 |
| 0.9597 | 0.6776 | 0.9712 | 0.6761 | 9.94085e-10 | 3444 |
| 0.9597 | 0.6776 | 0.9711 | 0.6761 | 9.940816e-10 | 3445 |
| 0.9613 | 0.6776 | 0.9711 | 0.6761 | 9.940782e-10 | 3446 |
| 0.9604 | 0.6776 | 0.9711 | 0.6761 | 9.940747e-10 | 3447 |
| 0.9542 | 0.6776 | 0.9711 | 0.6761 | 9.940713e-10 | 3448 |
| 0.9560 | 0.6776 | 0.9711 | 0.6761 | 9.940678e-10 | 3449 |
| 0.9616 | 0.6776 | 0.9711 | 0.6761 | 9.940644e-10 | 3450 |
| 0.9613 | 0.6776 | 0.9711 | 0.6761 | 9.94061e-10 | 3451 |
| 0.9656 | 0.6776 | 0.9711 | 0.6761 | 9.940575e-10 | 3452 |
| 0.9653 | 0.6776 | 0.9710 | 0.6761 | 9.940541e-10 | 3453 |
| 0.9619 | 0.6776 | 0.9710 | 0.6761 | 9.940506e-10 | 3454 |
| 0.9635 | 0.6776 | 0.9710 | 0.6761 | 9.940472e-10 | 3455 |
| 0.9579 | 0.6776 | 0.9710 | 0.6761 | 9.940437e-10 | 3456 |
| 0.9616 | 0.6776 | 0.9710 | 0.6761 | 9.940403e-10 | 3457 |
| 0.9649 | 0.6776 | 0.9710 | 0.6761 | 9.940369e-10 | 3458 |
| 0.9673 | 0.6776 | 0.9710 | 0.6761 | 9.940334e-10 | 3459 |
| 0.9509 | 0.6776 | 0.9709 | 0.6761 | 9.9403e-10 | 3460 |
| 0.9621 | 0.6776 | 0.9709 | 0.6761 | 9.940265e-10 | 3461 |
| 0.9673 | 0.6776 | 0.9709 | 0.6761 | 9.940231e-10 | 3462 |
| 0.9702 | 0.6776 | 0.9709 | 0.6761 | 9.940196e-10 | 3463 |
| 0.9703 | 0.6776 | 0.9709 | 0.6761 | 9.940162e-10 | 3464 |
| 0.9623 | 0.6776 | 0.9709 | 0.6761 | 9.940128e-10 | 3465 |
| 0.9566 | 0.6776 | 0.9709 | 0.6761 | 9.940093e-10 | 3466 |
| 0.9596 | 0.6776 | 0.9708 | 0.6761 | 9.940059e-10 | 3467 |
| 0.9584 | 0.6776 | 0.9708 | 0.6761 | 9.940024e-10 | 3468 |
| 0.9592 | 0.6776 | 0.9708 | 0.6761 | 9.93999e-10 | 3469 |
| 0.9609 | 0.6776 | 0.9708 | 0.6761 | 9.939956e-10 | 3470 |
| 0.9614 | 0.6776 | 0.9708 | 0.6761 | 9.939921e-10 | 3471 |
| 0.9616 | 0.6776 | 0.9708 | 0.6761 | 9.939887e-10 | 3472 |
| 0.9546 | 0.6776 | 0.9708 | 0.6761 | 9.939852e-10 | 3473 |
| 0.9630 | 0.6776 | 0.9708 | 0.6761 | 9.939818e-10 | 3474 |
| 0.9508 | 0.6776 | 0.9707 | 0.6761 | 9.939783e-10 | 3475 |
| 0.9614 | 0.6776 | 0.9707 | 0.6761 | 9.939749e-10 | 3476 |
| 0.9599 | 0.6776 | 0.9707 | 0.6761 | 9.939715e-10 | 3477 |
| 0.9634 | 0.6776 | 0.9707 | 0.6761 | 9.93968e-10 | 3478 |
| 0.9619 | 0.6776 | 0.9707 | 0.6761 | 9.939646e-10 | 3479 |
| 0.9616 | 0.6776 | 0.9707 | 0.6761 | 9.939611e-10 | 3480 |
| 0.9604 | 0.6776 | 0.9707 | 0.6761 | 9.939577e-10 | 3481 |
| 0.9530 | 0.6776 | 0.9707 | 0.6761 | 9.939543e-10 | 3482 |
| 0.9577 | 0.6776 | 0.9706 | 0.6761 | 9.939508e-10 | 3483 |
| 0.9569 | 0.6776 | 0.9706 | 0.6761 | 9.939474e-10 | 3484 |
| 0.9611 | 0.6776 | 0.9706 | 0.6761 | 9.939439e-10 | 3485 |
| 0.9644 | 0.6776 | 0.9706 | 0.6761 | 9.939405e-10 | 3486 |
| 0.9629 | 0.6776 | 0.9706 | 0.6761 | 9.93937e-10 | 3487 |
| 0.9528 | 0.6776 | 0.9706 | 0.6761 | 9.939336e-10 | 3488 |
| 0.9618 | 0.6776 | 0.9706 | 0.6761 | 9.939302e-10 | 3489 |
| 0.9591 | 0.6776 | 0.9705 | 0.6761 | 9.939267e-10 | 3490 |
| 0.9595 | 0.6776 | 0.9705 | 0.6761 | 9.939233e-10 | 3491 |
| 0.9572 | 0.6776 | 0.9705 | 0.6761 | 9.939198e-10 | 3492 |
| 0.9537 | 0.6776 | 0.9705 | 0.6761 | 9.939164e-10 | 3493 |
| 0.9575 | 0.6776 | 0.9705 | 0.6761 | 9.93913e-10 | 3494 |
| 0.9638 | 0.6776 | 0.9705 | 0.6761 | 9.939095e-10 | 3495 |
| 0.9621 | 0.6776 | 0.9705 | 0.6761 | 9.939061e-10 | 3496 |
| 0.9556 | 0.6776 | 0.9705 | 0.6761 | 9.939026e-10 | 3497 |
| 0.9610 | 0.6776 | 0.9704 | 0.6761 | 9.938992e-10 | 3498 |
| 0.9548 | 0.6776 | 0.9704 | 0.6761 | 9.938957e-10 | 3499 |
| 0.9606 | 0.6776 | 0.9704 | 0.6761 | 9.938923e-10 | 3500 |
| 0.9581 | 0.6776 | 0.9704 | 0.6761 | 9.938889e-10 | 3501 |
| 0.9599 | 0.6776 | 0.9704 | 0.6761 | 9.938854e-10 | 3502 |
| 0.9617 | 0.6776 | 0.9704 | 0.6761 | 9.93882e-10 | 3503 |
| 0.9623 | 0.6776 | 0.9704 | 0.6761 | 9.938785e-10 | 3504 |
| 0.9603 | 0.6776 | 0.9703 | 0.6761 | 9.938751e-10 | 3505 |
| 0.9570 | 0.6776 | 0.9703 | 0.6761 | 9.938717e-10 | 3506 |
| 0.9553 | 0.6776 | 0.9703 | 0.6761 | 9.938682e-10 | 3507 |
| 0.9573 | 0.6776 | 0.9703 | 0.6761 | 9.938648e-10 | 3508 |
| 0.9523 | 0.6776 | 0.9703 | 0.6761 | 9.938613e-10 | 3509 |
| 0.9557 | 0.6776 | 0.9703 | 0.6761 | 9.938579e-10 | 3510 |
| 0.9603 | 0.6776 | 0.9703 | 0.6761 | 9.938544e-10 | 3511 |
| 0.9628 | 0.6776 | 0.9703 | 0.6761 | 9.93851e-10 | 3512 |
| 0.9646 | 0.6776 | 0.9702 | 0.6761 | 9.938476e-10 | 3513 |
| 0.9649 | 0.6776 | 0.9702 | 0.6761 | 9.938441e-10 | 3514 |
| 0.9559 | 0.6776 | 0.9702 | 0.6761 | 9.938407e-10 | 3515 |
| 0.9597 | 0.6776 | 0.9702 | 0.6761 | 9.938372e-10 | 3516 |
| 0.9595 | 0.6776 | 0.9702 | 0.6761 | 9.938338e-10 | 3517 |
| 0.9647 | 0.6776 | 0.9702 | 0.6761 | 9.938304e-10 | 3518 |
| 0.9570 | 0.6776 | 0.9702 | 0.6761 | 9.938268e-10 | 3519 |
| 0.9549 | 0.6776 | 0.9702 | 0.6761 | 9.938232e-10 | 3520 |
| 0.9549 | 0.6776 | 0.9701 | 0.6761 | 9.938197e-10 | 3521 |
| 0.9623 | 0.6776 | 0.9701 | 0.6761 | 9.938161e-10 | 3522 |
| 0.9611 | 0.6776 | 0.9701 | 0.6761 | 9.938126e-10 | 3523 |
| 0.9581 | 0.6776 | 0.9701 | 0.6761 | 9.93809e-10 | 3524 |
| 0.9592 | 0.6776 | 0.9701 | 0.6761 | 9.938055e-10 | 3525 |
| 0.9578 | 0.6776 | 0.9701 | 0.6761 | 9.938019e-10 | 3526 |
| 0.9639 | 0.6776 | 0.9701 | 0.6761 | 9.937984e-10 | 3527 |
| 0.9520 | 0.6776 | 0.9700 | 0.6761 | 9.937948e-10 | 3528 |
| 0.9500 | 0.6776 | 0.9700 | 0.6761 | 9.937913e-10 | 3529 |
| 0.9586 | 0.6776 | 0.9700 | 0.6761 | 9.937877e-10 | 3530 |
| 0.9656 | 0.6776 | 0.9700 | 0.6761 | 9.937842e-10 | 3531 |
| 0.9585 | 0.6776 | 0.9700 | 0.6761 | 9.937806e-10 | 3532 |
| 0.9641 | 0.6776 | 0.9700 | 0.6761 | 9.937771e-10 | 3533 |
| 0.9661 | 0.6776 | 0.9700 | 0.6761 | 9.937735e-10 | 3534 |
| 0.9631 | 0.6776 | 0.9700 | 0.6761 | 9.9377e-10 | 3535 |
| 0.9624 | 0.6776 | 0.9700 | 0.6761 | 9.937664e-10 | 3536 |
| 0.9524 | 0.6776 | 0.9699 | 0.6761 | 9.937628e-10 | 3537 |
| 0.9595 | 0.6776 | 0.9699 | 0.6761 | 9.937593e-10 | 3538 |
| 0.9531 | 0.6776 | 0.9699 | 0.6761 | 9.937557e-10 | 3539 |
| 0.9651 | 0.6776 | 0.9699 | 0.6761 | 9.937522e-10 | 3540 |
| 0.9560 | 0.6776 | 0.9699 | 0.6761 | 9.937486e-10 | 3541 |
| 0.9579 | 0.6776 | 0.9699 | 0.6761 | 9.937451e-10 | 3542 |
| 0.9556 | 0.6776 | 0.9699 | 0.6761 | 9.937415e-10 | 3543 |
| 0.9586 | 0.6776 | 0.9698 | 0.6761 | 9.93738e-10 | 3544 |
| 0.9610 | 0.6776 | 0.9698 | 0.6761 | 9.937344e-10 | 3545 |
| 0.9595 | 0.6776 | 0.9698 | 0.6761 | 9.937309e-10 | 3546 |
| 0.9548 | 0.6776 | 0.9698 | 0.6761 | 9.937273e-10 | 3547 |
| 0.9474 | 0.6776 | 0.9698 | 0.6761 | 9.937238e-10 | 3548 |
| 0.9614 | 0.6776 | 0.9698 | 0.6761 | 9.937202e-10 | 3549 |
| 0.9595 | 0.6776 | 0.9698 | 0.6761 | 9.937167e-10 | 3550 |
| 0.9607 | 0.6776 | 0.9698 | 0.6761 | 9.937131e-10 | 3551 |
| 0.9578 | 0.6776 | 0.9698 | 0.6761 | 9.937096e-10 | 3552 |
| 0.9636 | 0.6776 | 0.9697 | 0.6761 | 9.93706e-10 | 3553 |
| 0.9537 | 0.6776 | 0.9697 | 0.6761 | 9.937025e-10 | 3554 |
| 0.9533 | 0.6776 | 0.9697 | 0.6761 | 9.936989e-10 | 3555 |
| 0.9587 | 0.6776 | 0.9697 | 0.6761 | 9.936953e-10 | 3556 |
| 0.9600 | 0.6776 | 0.9697 | 0.6761 | 9.936918e-10 | 3557 |
| 0.9595 | 0.6776 | 0.9697 | 0.6761 | 9.936882e-10 | 3558 |
| 0.9508 | 0.6776 | 0.9697 | 0.6761 | 9.936847e-10 | 3559 |
| 0.9606 | 0.6776 | 0.9697 | 0.6761 | 9.936811e-10 | 3560 |
| 0.9560 | 0.6776 | 0.9696 | 0.6761 | 9.936776e-10 | 3561 |
| 0.9588 | 0.6776 | 0.9696 | 0.6761 | 9.93674e-10 | 3562 |
| 0.9522 | 0.6776 | 0.9696 | 0.6761 | 9.936705e-10 | 3563 |
| 0.9597 | 0.6776 | 0.9696 | 0.6761 | 9.936669e-10 | 3564 |
| 0.9572 | 0.6776 | 0.9696 | 0.6761 | 9.936634e-10 | 3565 |
| 0.9493 | 0.6776 | 0.9696 | 0.6761 | 9.936598e-10 | 3566 |
| 0.9579 | 0.6776 | 0.9696 | 0.6761 | 9.936563e-10 | 3567 |
| 0.9618 | 0.6776 | 0.9696 | 0.6761 | 9.936527e-10 | 3568 |
| 0.9538 | 0.6776 | 0.9695 | 0.6761 | 9.936492e-10 | 3569 |
| 0.9600 | 0.6776 | 0.9695 | 0.6761 | 9.936456e-10 | 3570 |
| 0.9620 | 0.6776 | 0.9695 | 0.6761 | 9.936421e-10 | 3571 |
| 0.9575 | 0.6776 | 0.9695 | 0.6761 | 9.936385e-10 | 3572 |
| 0.9571 | 0.6776 | 0.9695 | 0.6761 | 9.93635e-10 | 3573 |
| 0.9588 | 0.6776 | 0.9695 | 0.6761 | 9.936314e-10 | 3574 |
| 0.9562 | 0.6776 | 0.9695 | 0.6761 | 9.936278e-10 | 3575 |
| 0.9614 | 0.6776 | 0.9695 | 0.6761 | 9.936243e-10 | 3576 |
| 0.9611 | 0.6776 | 0.9694 | 0.6761 | 9.936207e-10 | 3577 |
| 0.9563 | 0.6776 | 0.9694 | 0.6761 | 9.936172e-10 | 3578 |
| 0.9531 | 0.6776 | 0.9694 | 0.6761 | 9.936136e-10 | 3579 |
| 0.9569 | 0.6776 | 0.9694 | 0.6761 | 9.936101e-10 | 3580 |
| 0.9554 | 0.6776 | 0.9694 | 0.6761 | 9.936065e-10 | 3581 |
| 0.9609 | 0.6776 | 0.9694 | 0.6761 | 9.93603e-10 | 3582 |
| 0.9519 | 0.6776 | 0.9694 | 0.6761 | 9.935994e-10 | 3583 |
| 0.9561 | 0.6776 | 0.9694 | 0.6761 | 9.935959e-10 | 3584 |
| 0.9573 | 0.6776 | 0.9693 | 0.6761 | 9.935923e-10 | 3585 |
| 0.9595 | 0.6776 | 0.9693 | 0.6761 | 9.935888e-10 | 3586 |
| 0.9566 | 0.6776 | 0.9693 | 0.6761 | 9.935852e-10 | 3587 |
| 0.9532 | 0.6776 | 0.9693 | 0.6761 | 9.935817e-10 | 3588 |
| 0.9520 | 0.6776 | 0.9693 | 0.6761 | 9.935781e-10 | 3589 |
| 0.9559 | 0.6776 | 0.9693 | 0.6761 | 9.935746e-10 | 3590 |
| 0.9598 | 0.6776 | 0.9693 | 0.6761 | 9.93571e-10 | 3591 |
| 0.9570 | 0.6776 | 0.9693 | 0.6761 | 9.935675e-10 | 3592 |
| 0.9553 | 0.6776 | 0.9693 | 0.6761 | 9.935639e-10 | 3593 |
| 0.9593 | 0.6776 | 0.9692 | 0.6761 | 9.935603e-10 | 3594 |
| 0.9618 | 0.6776 | 0.9692 | 0.6761 | 9.935568e-10 | 3595 |
| 0.9565 | 0.6776 | 0.9692 | 0.6761 | 9.935532e-10 | 3596 |
| 0.9599 | 0.6776 | 0.9692 | 0.6761 | 9.935497e-10 | 3597 |
| 0.9595 | 0.6776 | 0.9692 | 0.6761 | 9.935461e-10 | 3598 |
| 0.9590 | 0.6776 | 0.9692 | 0.6761 | 9.935426e-10 | 3599 |
| 0.9512 | 0.6776 | 0.9692 | 0.6761 | 9.93539e-10 | 3600 |
| 0.9557 | 0.6776 | 0.9692 | 0.6761 | 9.935355e-10 | 3601 |
| 0.9608 | 0.6776 | 0.9691 | 0.6761 | 9.935319e-10 | 3602 |
| 0.9560 | 0.6776 | 0.9691 | 0.6761 | 9.935284e-10 | 3603 |
| 0.9587 | 0.6776 | 0.9691 | 0.6761 | 9.935248e-10 | 3604 |
| 0.9614 | 0.6776 | 0.9691 | 0.6761 | 9.935213e-10 | 3605 |
| 0.9480 | 0.6776 | 0.9691 | 0.6761 | 9.935177e-10 | 3606 |
| 0.9557 | 0.6776 | 0.9691 | 0.6761 | 9.935142e-10 | 3607 |
| 0.9607 | 0.6776 | 0.9691 | 0.6761 | 9.935106e-10 | 3608 |
| 0.9535 | 0.6776 | 0.9691 | 0.6761 | 9.93507e-10 | 3609 |
| 0.9556 | 0.6776 | 0.9690 | 0.6761 | 9.935035e-10 | 3610 |
| 0.9521 | 0.6776 | 0.9690 | 0.6761 | 9.935e-10 | 3611 |
| 0.9605 | 0.6776 | 0.9690 | 0.6761 | 9.934964e-10 | 3612 |
| 0.9531 | 0.6776 | 0.9690 | 0.6761 | 9.934928e-10 | 3613 |
| 0.9616 | 0.6776 | 0.9690 | 0.6761 | 9.934893e-10 | 3614 |
| 0.9575 | 0.6776 | 0.9690 | 0.6761 | 9.934857e-10 | 3615 |
| 0.9631 | 0.6776 | 0.9690 | 0.6761 | 9.934822e-10 | 3616 |
| 0.9621 | 0.6776 | 0.9690 | 0.6761 | 9.934786e-10 | 3617 |
| 0.9548 | 0.6776 | 0.9689 | 0.6761 | 9.934751e-10 | 3618 |
| 0.9551 | 0.6776 | 0.9689 | 0.6761 | 9.934715e-10 | 3619 |
| 0.9627 | 0.6776 | 0.9689 | 0.6761 | 9.93468e-10 | 3620 |
| 0.9587 | 0.6776 | 0.9689 | 0.6761 | 9.934644e-10 | 3621 |
| 0.9588 | 0.6776 | 0.9689 | 0.6761 | 9.934609e-10 | 3622 |
| 0.9547 | 0.6776 | 0.9689 | 0.6761 | 9.934573e-10 | 3623 |
| 0.9599 | 0.6776 | 0.9689 | 0.6761 | 9.934538e-10 | 3624 |
| 0.9619 | 0.6776 | 0.9689 | 0.6761 | 9.934502e-10 | 3625 |
| 0.9626 | 0.6776 | 0.9689 | 0.6761 | 9.934467e-10 | 3626 |
| 0.9527 | 0.6776 | 0.9688 | 0.6761 | 9.934431e-10 | 3627 |
| 0.9546 | 0.6776 | 0.9688 | 0.6761 | 9.934396e-10 | 3628 |
| 0.9567 | 0.6776 | 0.9688 | 0.6761 | 9.93436e-10 | 3629 |
| 0.9609 | 0.6776 | 0.9688 | 0.6761 | 9.934324e-10 | 3630 |
| 0.9571 | 0.6776 | 0.9688 | 0.6761 | 9.934289e-10 | 3631 |
| 0.9544 | 0.6776 | 0.9688 | 0.6761 | 9.934253e-10 | 3632 |
| 0.9504 | 0.6776 | 0.9688 | 0.6761 | 9.934217e-10 | 3633 |
| 0.9564 | 0.6776 | 0.9688 | 0.6761 | 9.93418e-10 | 3634 |
| 0.9544 | 0.6776 | 0.9687 | 0.6761 | 9.934144e-10 | 3635 |
| 0.9601 | 0.6776 | 0.9687 | 0.6761 | 9.934107e-10 | 3636 |
| 0.9582 | 0.6776 | 0.9687 | 0.6761 | 9.93407e-10 | 3637 |
| 0.9527 | 0.6776 | 0.9687 | 0.6761 | 9.934034e-10 | 3638 |
| 0.9508 | 0.6776 | 0.9687 | 0.6761 | 9.933997e-10 | 3639 |
| 0.9462 | 0.6776 | 0.9687 | 0.6761 | 9.93396e-10 | 3640 |
| 0.9607 | 0.6776 | 0.9687 | 0.6761 | 9.933924e-10 | 3641 |
| 0.9605 | 0.6776 | 0.9687 | 0.6761 | 9.933887e-10 | 3642 |
| 0.9551 | 0.6776 | 0.9687 | 0.6761 | 9.93385e-10 | 3643 |
| 0.9557 | 0.6776 | 0.9686 | 0.6761 | 9.933814e-10 | 3644 |
| 0.9575 | 0.6776 | 0.9686 | 0.6761 | 9.933777e-10 | 3645 |
| 0.9539 | 0.6776 | 0.9686 | 0.6761 | 9.93374e-10 | 3646 |
| 0.9559 | 0.6776 | 0.9686 | 0.6761 | 9.933704e-10 | 3647 |
| 0.9609 | 0.6776 | 0.9686 | 0.6761 | 9.933667e-10 | 3648 |
| 0.9561 | 0.6776 | 0.9686 | 0.6761 | 9.933631e-10 | 3649 |
| 0.9534 | 0.6776 | 0.9686 | 0.6761 | 9.933594e-10 | 3650 |
| 0.9561 | 0.6776 | 0.9686 | 0.6761 | 9.933557e-10 | 3651 |
| 0.9586 | 0.6776 | 0.9686 | 0.6761 | 9.933521e-10 | 3652 |
| 0.9525 | 0.6776 | 0.9685 | 0.6761 | 9.933484e-10 | 3653 |
| 0.9501 | 0.6776 | 0.9685 | 0.6761 | 9.933447e-10 | 3654 |
| 0.9589 | 0.6776 | 0.9685 | 0.6761 | 9.933411e-10 | 3655 |
| 0.9560 | 0.6776 | 0.9685 | 0.6761 | 9.933374e-10 | 3656 |
| 0.9558 | 0.6776 | 0.9685 | 0.6761 | 9.933337e-10 | 3657 |
| 0.9646 | 0.6776 | 0.9685 | 0.6761 | 9.933301e-10 | 3658 |
| 0.9567 | 0.6776 | 0.9685 | 0.6761 | 9.933264e-10 | 3659 |
| 0.9542 | 0.6776 | 0.9685 | 0.6761 | 9.933228e-10 | 3660 |
| 0.9526 | 0.6776 | 0.9684 | 0.6761 | 9.933191e-10 | 3661 |
| 0.9595 | 0.6776 | 0.9684 | 0.6761 | 9.933154e-10 | 3662 |
| 0.9588 | 0.6776 | 0.9684 | 0.6761 | 9.933118e-10 | 3663 |
| 0.9537 | 0.6776 | 0.9684 | 0.6761 | 9.933081e-10 | 3664 |
| 0.9543 | 0.6776 | 0.9684 | 0.6761 | 9.933044e-10 | 3665 |
| 0.9598 | 0.6776 | 0.9684 | 0.6761 | 9.933008e-10 | 3666 |
| 0.9607 | 0.6776 | 0.9684 | 0.6761 | 9.932971e-10 | 3667 |
| 0.9538 | 0.6776 | 0.9684 | 0.6761 | 9.932934e-10 | 3668 |
| 0.9570 | 0.6776 | 0.9684 | 0.6761 | 9.932898e-10 | 3669 |
| 0.9583 | 0.6776 | 0.9683 | 0.6761 | 9.932861e-10 | 3670 |
| 0.9578 | 0.6776 | 0.9683 | 0.6761 | 9.932825e-10 | 3671 |
| 0.9523 | 0.6776 | 0.9683 | 0.6761 | 9.932788e-10 | 3672 |
| 0.9562 | 0.6776 | 0.9683 | 0.6761 | 9.932751e-10 | 3673 |
| 0.9549 | 0.6776 | 0.9683 | 0.6761 | 9.932715e-10 | 3674 |
| 0.9625 | 0.6776 | 0.9683 | 0.6761 | 9.932678e-10 | 3675 |
| 0.9567 | 0.6776 | 0.9683 | 0.6761 | 9.932641e-10 | 3676 |
| 0.9538 | 0.6776 | 0.9683 | 0.6761 | 9.932605e-10 | 3677 |
| 0.9553 | 0.6776 | 0.9683 | 0.6761 | 9.932568e-10 | 3678 |
| 0.9534 | 0.6776 | 0.9682 | 0.6761 | 9.932531e-10 | 3679 |
| 0.9562 | 0.6776 | 0.9682 | 0.6761 | 9.932495e-10 | 3680 |
| 0.9537 | 0.6776 | 0.9682 | 0.6761 | 9.932458e-10 | 3681 |
| 0.9653 | 0.6776 | 0.9682 | 0.6761 | 9.932422e-10 | 3682 |
| 0.9599 | 0.6776 | 0.9682 | 0.6761 | 9.932385e-10 | 3683 |
| 0.9532 | 0.6776 | 0.9682 | 0.6761 | 9.932348e-10 | 3684 |
| 0.9553 | 0.6776 | 0.9682 | 0.6761 | 9.932312e-10 | 3685 |
| 0.9537 | 0.6776 | 0.9682 | 0.6761 | 9.932275e-10 | 3686 |
| 0.9536 | 0.6776 | 0.9682 | 0.6761 | 9.932238e-10 | 3687 |
| 0.9539 | 0.6776 | 0.9681 | 0.6761 | 9.932202e-10 | 3688 |
| 0.9563 | 0.6776 | 0.9681 | 0.6761 | 9.932165e-10 | 3689 |
| 0.9501 | 0.6776 | 0.9681 | 0.6761 | 9.932128e-10 | 3690 |
| 0.9565 | 0.6776 | 0.9681 | 0.6761 | 9.932092e-10 | 3691 |
| 0.9550 | 0.6776 | 0.9681 | 0.6761 | 9.932055e-10 | 3692 |
| 0.9602 | 0.6776 | 0.9681 | 0.6761 | 9.932019e-10 | 3693 |
| 0.9639 | 0.6776 | 0.9681 | 0.6761 | 9.931982e-10 | 3694 |
| 0.9566 | 0.6776 | 0.9681 | 0.6761 | 9.931945e-10 | 3695 |
| 0.9604 | 0.6776 | 0.9681 | 0.6761 | 9.931909e-10 | 3696 |
| 0.9570 | 0.6776 | 0.9680 | 0.6761 | 9.931872e-10 | 3697 |
| 0.9575 | 0.6776 | 0.9680 | 0.6761 | 9.931835e-10 | 3698 |
| 0.9523 | 0.6776 | 0.9680 | 0.6761 | 9.931799e-10 | 3699 |
| 0.9523 | 0.6776 | 0.9680 | 0.6761 | 9.931762e-10 | 3700 |
| 0.9508 | 0.6776 | 0.9680 | 0.6761 | 9.931725e-10 | 3701 |
| 0.9537 | 0.6776 | 0.9680 | 0.6761 | 9.931689e-10 | 3702 |
| 0.9598 | 0.6776 | 0.9680 | 0.6761 | 9.931652e-10 | 3703 |
| 0.9555 | 0.6776 | 0.9680 | 0.6761 | 9.931616e-10 | 3704 |
| 0.9551 | 0.6776 | 0.9680 | 0.6761 | 9.931579e-10 | 3705 |
| 0.9594 | 0.6776 | 0.9679 | 0.6761 | 9.931542e-10 | 3706 |
| 0.9569 | 0.6776 | 0.9679 | 0.6761 | 9.931506e-10 | 3707 |
| 0.9583 | 0.6776 | 0.9679 | 0.6761 | 9.931469e-10 | 3708 |
| 0.9578 | 0.6776 | 0.9679 | 0.6761 | 9.931432e-10 | 3709 |
| 0.9580 | 0.6776 | 0.9679 | 0.6761 | 9.931396e-10 | 3710 |
| 0.9517 | 0.6776 | 0.9679 | 0.6761 | 9.931359e-10 | 3711 |
| 0.9589 | 0.6776 | 0.9679 | 0.6761 | 9.931322e-10 | 3712 |
| 0.9639 | 0.6776 | 0.9679 | 0.6761 | 9.931286e-10 | 3713 |
| 0.9551 | 0.6776 | 0.9679 | 0.6761 | 9.931249e-10 | 3714 |
| 0.9544 | 0.6776 | 0.9678 | 0.6761 | 9.931213e-10 | 3715 |
| 0.9601 | 0.6776 | 0.9678 | 0.6761 | 9.931176e-10 | 3716 |
| 0.9544 | 0.6776 | 0.9678 | 0.6761 | 9.931139e-10 | 3717 |
| 0.9505 | 0.6776 | 0.9678 | 0.6761 | 9.931103e-10 | 3718 |
| 0.9562 | 0.6776 | 0.9678 | 0.6761 | 9.931066e-10 | 3719 |
| 0.9550 | 0.6776 | 0.9678 | 0.6761 | 9.931029e-10 | 3720 |
| 0.9591 | 0.6776 | 0.9678 | 0.6761 | 9.930993e-10 | 3721 |
| 0.9511 | 0.6776 | 0.9678 | 0.6761 | 9.930956e-10 | 3722 |
| 0.9580 | 0.6776 | 0.9678 | 0.6761 | 9.930919e-10 | 3723 |
| 0.9560 | 0.6776 | 0.9677 | 0.6761 | 9.930883e-10 | 3724 |
| 0.9550 | 0.6776 | 0.9677 | 0.6761 | 9.930846e-10 | 3725 |
| 0.9582 | 0.6776 | 0.9677 | 0.6761 | 9.93081e-10 | 3726 |
| 0.9595 | 0.6776 | 0.9677 | 0.6761 | 9.930773e-10 | 3727 |
| 0.9575 | 0.6776 | 0.9677 | 0.6761 | 9.930736e-10 | 3728 |
| 0.9589 | 0.6776 | 0.9677 | 0.6761 | 9.9307e-10 | 3729 |
| 0.9511 | 0.6776 | 0.9677 | 0.6761 | 9.930663e-10 | 3730 |
| 0.9572 | 0.6776 | 0.9677 | 0.6761 | 9.930626e-10 | 3731 |
| 0.9553 | 0.6776 | 0.9677 | 0.6761 | 9.93059e-10 | 3732 |
| 0.9527 | 0.6776 | 0.9677 | 0.6761 | 9.930553e-10 | 3733 |
| 0.9575 | 0.6776 | 0.9676 | 0.6761 | 9.930516e-10 | 3734 |
| 0.9569 | 0.6776 | 0.9676 | 0.6761 | 9.93048e-10 | 3735 |
| 0.9593 | 0.6776 | 0.9676 | 0.6761 | 9.930443e-10 | 3736 |
| 0.9574 | 0.6776 | 0.9676 | 0.6761 | 9.930406e-10 | 3737 |
| 0.9550 | 0.6776 | 0.9676 | 0.6761 | 9.93037e-10 | 3738 |
| 0.9553 | 0.6776 | 0.9676 | 0.6761 | 9.930333e-10 | 3739 |
| 0.9570 | 0.6776 | 0.9676 | 0.6761 | 9.930297e-10 | 3740 |
| 0.9510 | 0.6776 | 0.9676 | 0.6761 | 9.93026e-10 | 3741 |
| 0.9605 | 0.6776 | 0.9676 | 0.6761 | 9.930223e-10 | 3742 |
| 0.9617 | 0.6776 | 0.9675 | 0.6761 | 9.930187e-10 | 3743 |
| 0.9533 | 0.6776 | 0.9675 | 0.6761 | 9.93015e-10 | 3744 |
| 0.9503 | 0.6776 | 0.9675 | 0.6761 | 9.930113e-10 | 3745 |
| 0.9483 | 0.6776 | 0.9675 | 0.6761 | 9.930076e-10 | 3746 |
| 0.9589 | 0.6776 | 0.9675 | 0.6761 | 9.930038e-10 | 3747 |
| 0.9554 | 0.6776 | 0.9675 | 0.6761 | 9.93e-10 | 3748 |
| 0.9506 | 0.6776 | 0.9675 | 0.6761 | 9.929962e-10 | 3749 |
| 0.9577 | 0.6776 | 0.9675 | 0.6761 | 9.929925e-10 | 3750 |
| 0.9529 | 0.6776 | 0.9675 | 0.6761 | 9.929887e-10 | 3751 |
| 0.9513 | 0.6776 | 0.9675 | 0.6761 | 9.929849e-10 | 3752 |
| 0.9529 | 0.6776 | 0.9674 | 0.6761 | 9.929811e-10 | 3753 |
| 0.9567 | 0.6776 | 0.9674 | 0.6761 | 9.929774e-10 | 3754 |
| 0.9582 | 0.6776 | 0.9674 | 0.6761 | 9.929736e-10 | 3755 |
| 0.9570 | 0.6776 | 0.9674 | 0.6761 | 9.929698e-10 | 3756 |
| 0.9546 | 0.6776 | 0.9674 | 0.6761 | 9.92966e-10 | 3757 |
| 0.9593 | 0.6776 | 0.9674 | 0.6761 | 9.929623e-10 | 3758 |
| 0.9578 | 0.6776 | 0.9674 | 0.6761 | 9.929585e-10 | 3759 |
| 0.9564 | 0.6776 | 0.9674 | 0.6761 | 9.929547e-10 | 3760 |
| 0.9552 | 0.6776 | 0.9674 | 0.6761 | 9.929509e-10 | 3761 |
| 0.9554 | 0.6776 | 0.9674 | 0.6761 | 9.929472e-10 | 3762 |
| 0.9521 | 0.6776 | 0.9673 | 0.6761 | 9.929434e-10 | 3763 |
| 0.9528 | 0.6776 | 0.9673 | 0.6761 | 9.929396e-10 | 3764 |
| 0.9513 | 0.6776 | 0.9673 | 0.6761 | 9.929358e-10 | 3765 |
| 0.9553 | 0.6776 | 0.9673 | 0.6761 | 9.929321e-10 | 3766 |
| 0.9579 | 0.6776 | 0.9673 | 0.6761 | 9.929283e-10 | 3767 |
| 0.9491 | 0.6776 | 0.9673 | 0.6761 | 9.929245e-10 | 3768 |
| 0.9568 | 0.6776 | 0.9673 | 0.6761 | 9.929207e-10 | 3769 |
| 0.9523 | 0.6776 | 0.9673 | 0.6761 | 9.92917e-10 | 3770 |
| 0.9568 | 0.6776 | 0.9673 | 0.6761 | 9.929132e-10 | 3771 |
| 0.9528 | 0.6776 | 0.9672 | 0.6761 | 9.929094e-10 | 3772 |
| 0.9553 | 0.6776 | 0.9672 | 0.6761 | 9.929056e-10 | 3773 |
| 0.9577 | 0.6776 | 0.9672 | 0.6761 | 9.929019e-10 | 3774 |
| 0.9640 | 0.6776 | 0.9672 | 0.6761 | 9.928981e-10 | 3775 |
| 0.9549 | 0.6776 | 0.9672 | 0.6761 | 9.928943e-10 | 3776 |
| 0.9502 | 0.6776 | 0.9672 | 0.6761 | 9.928905e-10 | 3777 |
| 0.9536 | 0.6776 | 0.9672 | 0.6761 | 9.928868e-10 | 3778 |
| 0.9572 | 0.6776 | 0.9672 | 0.6761 | 9.92883e-10 | 3779 |
| 0.9502 | 0.6776 | 0.9672 | 0.6761 | 9.928792e-10 | 3780 |
| 0.9552 | 0.6776 | 0.9672 | 0.6761 | 9.928754e-10 | 3781 |
| 0.9550 | 0.6776 | 0.9672 | 0.6761 | 9.928717e-10 | 3782 |
| 0.9458 | 0.6776 | 0.9671 | 0.6761 | 9.928679e-10 | 3783 |
| 0.9534 | 0.6776 | 0.9671 | 0.6761 | 9.928641e-10 | 3784 |
| 0.9565 | 0.6776 | 0.9671 | 0.6761 | 9.928603e-10 | 3785 |
| 0.9590 | 0.6776 | 0.9671 | 0.6761 | 9.928566e-10 | 3786 |
| 0.9575 | 0.6776 | 0.9671 | 0.6761 | 9.928528e-10 | 3787 |
| 0.9522 | 0.6776 | 0.9671 | 0.6761 | 9.92849e-10 | 3788 |
| 0.9564 | 0.6776 | 0.9671 | 0.6761 | 9.928453e-10 | 3789 |
| 0.9561 | 0.6776 | 0.9671 | 0.6761 | 9.928415e-10 | 3790 |
| 0.9547 | 0.6776 | 0.9671 | 0.6761 | 9.928377e-10 | 3791 |
| 0.9549 | 0.6776 | 0.9670 | 0.6761 | 9.928339e-10 | 3792 |
| 0.9493 | 0.6776 | 0.9670 | 0.6761 | 9.928302e-10 | 3793 |
| 0.9557 | 0.6776 | 0.9670 | 0.6761 | 9.928264e-10 | 3794 |
| 0.9622 | 0.6776 | 0.9670 | 0.6761 | 9.928226e-10 | 3795 |
| 0.9548 | 0.6776 | 0.9670 | 0.6761 | 9.928188e-10 | 3796 |
| 0.9473 | 0.6776 | 0.9670 | 0.6761 | 9.92815e-10 | 3797 |
| 0.9561 | 0.6776 | 0.9670 | 0.6761 | 9.928113e-10 | 3798 |
| 0.9511 | 0.6776 | 0.9670 | 0.6761 | 9.928075e-10 | 3799 |
| 0.9534 | 0.6776 | 0.9670 | 0.6761 | 9.928037e-10 | 3800 |
| 0.9442 | 0.6776 | 0.9670 | 0.6761 | 9.928e-10 | 3801 |
| 0.9558 | 0.6776 | 0.9669 | 0.6761 | 9.927962e-10 | 3802 |
| 0.9494 | 0.6776 | 0.9669 | 0.6761 | 9.927924e-10 | 3803 |
| 0.9529 | 0.6776 | 0.9669 | 0.6761 | 9.927886e-10 | 3804 |
| 0.9584 | 0.6776 | 0.9669 | 0.6761 | 9.927849e-10 | 3805 |
| 0.9529 | 0.6776 | 0.9669 | 0.6761 | 9.927811e-10 | 3806 |
| 0.9601 | 0.6776 | 0.9669 | 0.6761 | 9.927773e-10 | 3807 |
| 0.9566 | 0.6776 | 0.9669 | 0.6761 | 9.927735e-10 | 3808 |
| 0.9557 | 0.6776 | 0.9669 | 0.6761 | 9.927698e-10 | 3809 |
| 0.9532 | 0.6776 | 0.9669 | 0.6761 | 9.92766e-10 | 3810 |
| 0.9550 | 0.6776 | 0.9669 | 0.6761 | 9.927622e-10 | 3811 |
| 0.9470 | 0.6776 | 0.9669 | 0.6761 | 9.927584e-10 | 3812 |
| 0.9474 | 0.6776 | 0.9668 | 0.6761 | 9.927547e-10 | 3813 |
| 0.9513 | 0.6776 | 0.9668 | 0.6761 | 9.927509e-10 | 3814 |
| 0.9549 | 0.6776 | 0.9668 | 0.6761 | 9.927471e-10 | 3815 |
| 0.9537 | 0.6776 | 0.9668 | 0.6761 | 9.927433e-10 | 3816 |
| 0.9532 | 0.6776 | 0.9668 | 0.6761 | 9.927396e-10 | 3817 |
| 0.9599 | 0.6776 | 0.9668 | 0.6761 | 9.927358e-10 | 3818 |
| 0.9574 | 0.6776 | 0.9668 | 0.6761 | 9.92732e-10 | 3819 |
| 0.9545 | 0.6776 | 0.9668 | 0.6761 | 9.927282e-10 | 3820 |
| 0.9576 | 0.6776 | 0.9668 | 0.6761 | 9.927245e-10 | 3821 |
| 0.9535 | 0.6776 | 0.9668 | 0.6761 | 9.927207e-10 | 3822 |
| 0.9477 | 0.6776 | 0.9667 | 0.6761 | 9.927169e-10 | 3823 |
| 0.9535 | 0.6776 | 0.9667 | 0.6761 | 9.927131e-10 | 3824 |
| 0.9491 | 0.6776 | 0.9667 | 0.6761 | 9.927094e-10 | 3825 |
| 0.9550 | 0.6776 | 0.9667 | 0.6761 | 9.927056e-10 | 3826 |
| 0.9538 | 0.6776 | 0.9667 | 0.6761 | 9.927018e-10 | 3827 |
| 0.9553 | 0.6776 | 0.9667 | 0.6761 | 9.92698e-10 | 3828 |
| 0.9544 | 0.6776 | 0.9667 | 0.6761 | 9.926943e-10 | 3829 |
| 0.9612 | 0.6776 | 0.9667 | 0.6761 | 9.926905e-10 | 3830 |
| 0.9587 | 0.6776 | 0.9667 | 0.6761 | 9.926867e-10 | 3831 |
| 0.9559 | 0.6776 | 0.9667 | 0.6761 | 9.926829e-10 | 3832 |
| 0.9583 | 0.6776 | 0.9666 | 0.6761 | 9.926792e-10 | 3833 |
| 0.9526 | 0.6776 | 0.9666 | 0.6761 | 9.926754e-10 | 3834 |
| 0.9529 | 0.6776 | 0.9666 | 0.6761 | 9.926716e-10 | 3835 |
| 0.9501 | 0.6776 | 0.9666 | 0.6761 | 9.926678e-10 | 3836 |
| 0.9476 | 0.6776 | 0.9666 | 0.6761 | 9.926641e-10 | 3837 |
| 0.9500 | 0.6776 | 0.9666 | 0.6761 | 9.926603e-10 | 3838 |
| 0.9553 | 0.6776 | 0.9666 | 0.6761 | 9.926565e-10 | 3839 |
| 0.9584 | 0.6776 | 0.9666 | 0.6761 | 9.926527e-10 | 3840 |
| 0.9529 | 0.6776 | 0.9666 | 0.6761 | 9.92649e-10 | 3841 |
| 0.9520 | 0.6776 | 0.9666 | 0.6761 | 9.926452e-10 | 3842 |
| 0.9519 | 0.6776 | 0.9665 | 0.6761 | 9.926414e-10 | 3843 |
| 0.9567 | 0.6776 | 0.9665 | 0.6761 | 9.926376e-10 | 3844 |
| 0.9519 | 0.6776 | 0.9665 | 0.6761 | 9.926339e-10 | 3845 |
| 0.9549 | 0.6776 | 0.9665 | 0.6761 | 9.926301e-10 | 3846 |
| 0.9562 | 0.6776 | 0.9665 | 0.6761 | 9.926263e-10 | 3847 |
| 0.9504 | 0.6776 | 0.9665 | 0.6761 | 9.926225e-10 | 3848 |
| 0.9542 | 0.6776 | 0.9665 | 0.6761 | 9.926188e-10 | 3849 |
| 0.9536 | 0.6776 | 0.9665 | 0.6761 | 9.92615e-10 | 3850 |
| 0.9554 | 0.6776 | 0.9665 | 0.6761 | 9.926112e-10 | 3851 |
| 0.9524 | 0.6776 | 0.9665 | 0.6761 | 9.926074e-10 | 3852 |
| 0.9485 | 0.6776 | 0.9665 | 0.6761 | 9.926037e-10 | 3853 |
| 0.9579 | 0.6776 | 0.9664 | 0.6761 | 9.925999e-10 | 3854 |
| 0.9496 | 0.6776 | 0.9664 | 0.6761 | 9.925961e-10 | 3855 |
| 0.9561 | 0.6776 | 0.9664 | 0.6761 | 9.925923e-10 | 3856 |
| 0.9514 | 0.6776 | 0.9664 | 0.6761 | 9.925886e-10 | 3857 |
| 0.9476 | 0.6776 | 0.9664 | 0.6761 | 9.925848e-10 | 3858 |
| 0.9565 | 0.6776 | 0.9664 | 0.6761 | 9.925809e-10 | 3859 |
| 0.9468 | 0.6776 | 0.9664 | 0.6761 | 9.92577e-10 | 3860 |
| 0.9548 | 0.6776 | 0.9664 | 0.6761 | 9.925731e-10 | 3861 |
| 0.9485 | 0.6776 | 0.9664 | 0.6761 | 9.925692e-10 | 3862 |
| 0.9512 | 0.6776 | 0.9664 | 0.6761 | 9.925654e-10 | 3863 |
| 0.9537 | 0.6776 | 0.9664 | 0.6761 | 9.925615e-10 | 3864 |
| 0.9551 | 0.6776 | 0.9663 | 0.6761 | 9.925576e-10 | 3865 |
| 0.9531 | 0.6776 | 0.9663 | 0.6761 | 9.925537e-10 | 3866 |
| 0.9529 | 0.6776 | 0.9663 | 0.6761 | 9.925498e-10 | 3867 |
| 0.9560 | 0.6776 | 0.9663 | 0.6761 | 9.925459e-10 | 3868 |
| 0.9560 | 0.6776 | 0.9663 | 0.6761 | 9.92542e-10 | 3869 |
| 0.9514 | 0.6776 | 0.9663 | 0.6761 | 9.925382e-10 | 3870 |
| 0.9539 | 0.6776 | 0.9663 | 0.6761 | 9.925343e-10 | 3871 |
| 0.9533 | 0.6776 | 0.9663 | 0.6761 | 9.925304e-10 | 3872 |
| 0.9481 | 0.6776 | 0.9663 | 0.6761 | 9.925265e-10 | 3873 |
| 0.9504 | 0.6776 | 0.9663 | 0.6761 | 9.925226e-10 | 3874 |
| 0.9518 | 0.6776 | 0.9663 | 0.6761 | 9.925187e-10 | 3875 |
| 0.9557 | 0.6776 | 0.9662 | 0.6761 | 9.925148e-10 | 3876 |
| 0.9445 | 0.6776 | 0.9662 | 0.6761 | 9.92511e-10 | 3877 |
| 0.9533 | 0.6776 | 0.9662 | 0.6761 | 9.925071e-10 | 3878 |
| 0.9574 | 0.6776 | 0.9662 | 0.6761 | 9.925032e-10 | 3879 |
| 0.9469 | 0.6776 | 0.9662 | 0.6761 | 9.924993e-10 | 3880 |
| 0.9532 | 0.6776 | 0.9662 | 0.6761 | 9.924954e-10 | 3881 |
| 0.9563 | 0.6776 | 0.9662 | 0.6761 | 9.924915e-10 | 3882 |
| 0.9568 | 0.6776 | 0.9662 | 0.6761 | 9.924876e-10 | 3883 |
| 0.9593 | 0.6776 | 0.9662 | 0.6761 | 9.924838e-10 | 3884 |
| 0.9496 | 0.6776 | 0.9662 | 0.6761 | 9.924799e-10 | 3885 |
| 0.9564 | 0.6776 | 0.9661 | 0.6761 | 9.92476e-10 | 3886 |
| 0.9526 | 0.6776 | 0.9661 | 0.6761 | 9.924721e-10 | 3887 |
| 0.9546 | 0.6776 | 0.9661 | 0.6761 | 9.924682e-10 | 3888 |
| 0.9609 | 0.6776 | 0.9661 | 0.6761 | 9.924643e-10 | 3889 |
| 0.9515 | 0.6776 | 0.9661 | 0.6761 | 9.924604e-10 | 3890 |
| 0.9535 | 0.6776 | 0.9661 | 0.6761 | 9.924566e-10 | 3891 |
| 0.9588 | 0.6776 | 0.9661 | 0.6761 | 9.924527e-10 | 3892 |
| 0.9462 | 0.6776 | 0.9661 | 0.6761 | 9.924488e-10 | 3893 |
| 0.9525 | 0.6776 | 0.9661 | 0.6761 | 9.924449e-10 | 3894 |
| 0.9575 | 0.6776 | 0.9661 | 0.6761 | 9.92441e-10 | 3895 |
| 0.9489 | 0.6776 | 0.9661 | 0.6761 | 9.924371e-10 | 3896 |
| 0.9521 | 0.6776 | 0.9660 | 0.6761 | 9.924332e-10 | 3897 |
| 0.9544 | 0.6776 | 0.9660 | 0.6761 | 9.924294e-10 | 3898 |
| 0.9565 | 0.6776 | 0.9660 | 0.6761 | 9.924255e-10 | 3899 |
| 0.9518 | 0.6776 | 0.9660 | 0.6761 | 9.924216e-10 | 3900 |
| 0.9602 | 0.6776 | 0.9660 | 0.6761 | 9.924177e-10 | 3901 |
| 0.9512 | 0.6776 | 0.9660 | 0.6761 | 9.924138e-10 | 3902 |
| 0.9567 | 0.6776 | 0.9660 | 0.6761 | 9.924099e-10 | 3903 |
| 0.9507 | 0.6776 | 0.9660 | 0.6761 | 9.92406e-10 | 3904 |
| 0.9490 | 0.6776 | 0.9660 | 0.6761 | 9.924022e-10 | 3905 |
| 0.9547 | 0.6776 | 0.9660 | 0.6761 | 9.923983e-10 | 3906 |
| 0.9535 | 0.6776 | 0.9660 | 0.6761 | 9.923944e-10 | 3907 |
| 0.9547 | 0.6776 | 0.9659 | 0.6761 | 9.923905e-10 | 3908 |
| 0.9539 | 0.6776 | 0.9659 | 0.6761 | 9.923866e-10 | 3909 |
| 0.9467 | 0.6776 | 0.9659 | 0.6761 | 9.923827e-10 | 3910 |
| 0.9511 | 0.6776 | 0.9659 | 0.6761 | 9.923788e-10 | 3911 |
| 0.9404 | 0.6776 | 0.9659 | 0.6761 | 9.92375e-10 | 3912 |
| 0.9469 | 0.6776 | 0.9659 | 0.6761 | 9.923711e-10 | 3913 |
| 0.9479 | 0.6776 | 0.9659 | 0.6761 | 9.923672e-10 | 3914 |
| 0.9517 | 0.6776 | 0.9659 | 0.6761 | 9.923633e-10 | 3915 |
| 0.9597 | 0.6776 | 0.9659 | 0.6761 | 9.923594e-10 | 3916 |
| 0.9527 | 0.6776 | 0.9659 | 0.6761 | 9.923555e-10 | 3917 |
| 0.9569 | 0.6776 | 0.9659 | 0.6761 | 9.923516e-10 | 3918 |
| 0.9539 | 0.6776 | 0.9658 | 0.6761 | 9.923478e-10 | 3919 |
| 0.9573 | 0.6776 | 0.9658 | 0.6761 | 9.923439e-10 | 3920 |
| 0.9469 | 0.6776 | 0.9658 | 0.6761 | 9.9234e-10 | 3921 |
| 0.9581 | 0.6776 | 0.9658 | 0.6761 | 9.923361e-10 | 3922 |
| 0.9540 | 0.6776 | 0.9658 | 0.6761 | 9.923322e-10 | 3923 |
| 0.9566 | 0.6776 | 0.9658 | 0.6761 | 9.923283e-10 | 3924 |
| 0.9530 | 0.6776 | 0.9658 | 0.6761 | 9.923244e-10 | 3925 |
| 0.9554 | 0.6776 | 0.9658 | 0.6761 | 9.923206e-10 | 3926 |
| 0.9463 | 0.6776 | 0.9658 | 0.6761 | 9.923167e-10 | 3927 |
| 0.9496 | 0.6776 | 0.9658 | 0.6761 | 9.923128e-10 | 3928 |
| 0.9523 | 0.6776 | 0.9658 | 0.6761 | 9.923089e-10 | 3929 |
| 0.9489 | 0.6776 | 0.9657 | 0.6761 | 9.92305e-10 | 3930 |
| 0.9556 | 0.6776 | 0.9657 | 0.6761 | 9.923011e-10 | 3931 |
| 0.9500 | 0.6776 | 0.9657 | 0.6761 | 9.922972e-10 | 3932 |
| 0.9494 | 0.6776 | 0.9657 | 0.6761 | 9.922934e-10 | 3933 |
| 0.9546 | 0.6776 | 0.9657 | 0.6761 | 9.922895e-10 | 3934 |
| 0.9519 | 0.6776 | 0.9657 | 0.6761 | 9.922856e-10 | 3935 |
| 0.9579 | 0.6776 | 0.9657 | 0.6761 | 9.922817e-10 | 3936 |
| 0.9526 | 0.6776 | 0.9657 | 0.6761 | 9.922778e-10 | 3937 |
| 0.9502 | 0.6776 | 0.9657 | 0.6761 | 9.922739e-10 | 3938 |
| 0.9446 | 0.6776 | 0.9657 | 0.6761 | 9.9227e-10 | 3939 |
| 0.9510 | 0.6776 | 0.9657 | 0.6761 | 9.922662e-10 | 3940 |
| 0.9495 | 0.6776 | 0.9656 | 0.6761 | 9.922623e-10 | 3941 |
| 0.9545 | 0.6776 | 0.9656 | 0.6761 | 9.922584e-10 | 3942 |
| 0.9557 | 0.6776 | 0.9656 | 0.6761 | 9.922545e-10 | 3943 |
| 0.9524 | 0.6776 | 0.9656 | 0.6761 | 9.922506e-10 | 3944 |
| 0.9492 | 0.6776 | 0.9656 | 0.6761 | 9.922467e-10 | 3945 |
| 0.9496 | 0.6776 | 0.9656 | 0.6761 | 9.922428e-10 | 3946 |
| 0.9507 | 0.6776 | 0.9656 | 0.6761 | 9.92239e-10 | 3947 |
| 0.9536 | 0.6776 | 0.9656 | 0.6761 | 9.922351e-10 | 3948 |
| 0.9500 | 0.6776 | 0.9656 | 0.6761 | 9.922312e-10 | 3949 |
| 0.9570 | 0.6776 | 0.9656 | 0.6761 | 9.922273e-10 | 3950 |
| 0.9500 | 0.6776 | 0.9656 | 0.6761 | 9.922234e-10 | 3951 |
| 0.9471 | 0.6776 | 0.9656 | 0.6761 | 9.922195e-10 | 3952 |
| 0.9480 | 0.6776 | 0.9655 | 0.6761 | 9.922156e-10 | 3953 |
| 0.9536 | 0.6776 | 0.9655 | 0.6761 | 9.922118e-10 | 3954 |
| 0.9557 | 0.6776 | 0.9655 | 0.6761 | 9.922079e-10 | 3955 |
| 0.9533 | 0.6776 | 0.9655 | 0.6761 | 9.92204e-10 | 3956 |
| 0.9476 | 0.6776 | 0.9655 | 0.6761 | 9.922001e-10 | 3957 |
| 0.9413 | 0.6776 | 0.9655 | 0.6761 | 9.921962e-10 | 3958 |
| 0.9476 | 0.6776 | 0.9655 | 0.6761 | 9.921923e-10 | 3959 |
| 0.9562 | 0.6776 | 0.9655 | 0.6761 | 9.921884e-10 | 3960 |
| 0.9548 | 0.6776 | 0.9655 | 0.6761 | 9.921846e-10 | 3961 |
| 0.9589 | 0.6776 | 0.9655 | 0.6761 | 9.921807e-10 | 3962 |
| 0.9526 | 0.6776 | 0.9655 | 0.6761 | 9.921768e-10 | 3963 |
| 0.9561 | 0.6776 | 0.9654 | 0.6761 | 9.921729e-10 | 3964 |
| 0.9544 | 0.6776 | 0.9654 | 0.6761 | 9.92169e-10 | 3965 |
| 0.9488 | 0.6776 | 0.9654 | 0.6761 | 9.921651e-10 | 3966 |
| 0.9525 | 0.6776 | 0.9654 | 0.6761 | 9.921612e-10 | 3967 |
| 0.9554 | 0.6776 | 0.9654 | 0.6761 | 9.921574e-10 | 3968 |
| 0.9478 | 0.6776 | 0.9654 | 0.6761 | 9.921535e-10 | 3969 |
| 0.9501 | 0.6776 | 0.9654 | 0.6761 | 9.921496e-10 | 3970 |
| 0.9476 | 0.6776 | 0.9654 | 0.6761 | 9.921457e-10 | 3971 |
| 0.9475 | 0.6776 | 0.9654 | 0.6761 | 9.921418e-10 | 3972 |
| 0.9470 | 0.6776 | 0.9654 | 0.6761 | 9.921378e-10 | 3973 |
| 0.9554 | 0.6776 | 0.9654 | 0.6761 | 9.921338e-10 | 3974 |
| 0.9512 | 0.6776 | 0.9653 | 0.6761 | 9.921298e-10 | 3975 |
| 0.9543 | 0.6776 | 0.9653 | 0.6761 | 9.921258e-10 | 3976 |
| 0.9506 | 0.6776 | 0.9653 | 0.6761 | 9.921218e-10 | 3977 |
| 0.9548 | 0.6776 | 0.9653 | 0.6761 | 9.921178e-10 | 3978 |
| 0.9482 | 0.6776 | 0.9653 | 0.6761 | 9.921138e-10 | 3979 |
| 0.9495 | 0.6776 | 0.9653 | 0.6761 | 9.921098e-10 | 3980 |
| 0.9560 | 0.6776 | 0.9653 | 0.6761 | 9.921058e-10 | 3981 |
| 0.9503 | 0.6776 | 0.9653 | 0.6761 | 9.921018e-10 | 3982 |
| 0.9499 | 0.6776 | 0.9653 | 0.6761 | 9.920978e-10 | 3983 |
| 0.9519 | 0.6776 | 0.9653 | 0.6761 | 9.920939e-10 | 3984 |
| 0.9480 | 0.6776 | 0.9653 | 0.6761 | 9.920899e-10 | 3985 |
| 0.9513 | 0.6776 | 0.9653 | 0.6761 | 9.920859e-10 | 3986 |
| 0.9508 | 0.6776 | 0.9652 | 0.6761 | 9.920819e-10 | 3987 |
| 0.9519 | 0.6776 | 0.9652 | 0.6761 | 9.920779e-10 | 3988 |
| 0.9465 | 0.6776 | 0.9652 | 0.6761 | 9.920739e-10 | 3989 |
| 0.9523 | 0.6776 | 0.9652 | 0.6761 | 9.920699e-10 | 3990 |
| 0.9546 | 0.6776 | 0.9652 | 0.6761 | 9.920659e-10 | 3991 |
| 0.9500 | 0.6776 | 0.9652 | 0.6761 | 9.920619e-10 | 3992 |
| 0.9499 | 0.6776 | 0.9652 | 0.6761 | 9.920579e-10 | 3993 |
| 0.9519 | 0.6776 | 0.9652 | 0.6761 | 9.920539e-10 | 3994 |
| 0.9478 | 0.6776 | 0.9652 | 0.6761 | 9.920499e-10 | 3995 |
| 0.9505 | 0.6776 | 0.9652 | 0.6761 | 9.920459e-10 | 3996 |
| 0.9509 | 0.6776 | 0.9652 | 0.6761 | 9.920419e-10 | 3997 |
| 0.9529 | 0.6776 | 0.9652 | 0.6761 | 9.920379e-10 | 3998 |
| 0.9438 | 0.6776 | 0.9651 | 0.6761 | 9.920339e-10 | 3999 |
### Framework versions
- Transformers 4.29.0.dev0
- TensorFlow 2.9.1
- Datasets 2.8.0
- Tokenizers 0.13.2
| 381,400 | [
[
-0.049530029296875,
-0.031005859375,
0.0273590087890625,
0.00799560546875,
-0.00014460086822509766,
0.008636474609375,
-0.000018775463104248047,
-0.006801605224609375,
0.056610107421875,
0.025115966796875,
-0.043975830078125,
-0.04595947265625,
-0.04443359375,
... |
rifatozkurt/bert-base-uncased-finetuned-cola | 2023-05-06T13:12:00.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | rifatozkurt | null | null | rifatozkurt/bert-base-uncased-finetuned-cola | 0 | 2 | transformers | 2023-05-06T11:50:38 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.5805514135255713
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4434
- Matthews Correlation: 0.5806
## Model description
bert-base-uncased fine-tuned as a single-sentence binary classifier for linguistic acceptability on CoLA (Corpus of Linguistic Acceptability), part of the GLUE benchmark.
## Intended uses & limitations
Intended for judging whether short English sentences are grammatically acceptable; a minimal inference sketch follows. Behaviour on out-of-domain or non-English text has not been evaluated.
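A minimal usage sketch, assuming the standard `transformers` pipeline API; the example sentence is illustrative and not from the card:

```python
from transformers import pipeline

# Load the fine-tuned checkpoint; the model id is taken from this card.
classifier = pipeline(
    "text-classification",
    model="rifatozkurt/bert-base-uncased-finetuned-cola",
)

# The card does not say whether label names were remapped, so the output
# likely uses the defaults LABEL_0 (unacceptable) / LABEL_1 (acceptable).
print(classifier("The book was written by the author."))
```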
## Training and evaluation data
Fine-tuned and evaluated on the `cola` configuration of the GLUE dataset; the metrics above are computed on its validation split, per the card metadata.
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 8.302384098327798e-06
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.5122 | 1.0 | 535 | 0.4803 | 0.4895 |
| 0.3629 | 2.0 | 1070 | 0.4434 | 0.5806 |
| 0.2857 | 3.0 | 1605 | 0.5283 | 0.5704 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,886 | [
[
-0.0262451171875,
-0.052032470703125,
0.00951385498046875,
0.018768310546875,
-0.023712158203125,
-0.020172119140625,
-0.01727294921875,
-0.01435089111328125,
0.02716064453125,
0.0171051025390625,
-0.050201416015625,
-0.030670166015625,
-0.05224609375,
-0.02... |
sepehrbakhshi/bert-base-uncased-finetuned-cola_sepehr_sepehr_sepehr_saturday_from_server | 2023-05-06T13:01:24.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | sepehrbakhshi | null | null | sepehrbakhshi/bert-base-uncased-finetuned-cola_sepehr_sepehr_sepehr_saturday_from_server | 0 | 2 | transformers | 2023-05-06T12:57:41 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola_sepehr_sepehr_sepehr_saturday_from_server
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.4863017578040948
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola_sepehr_sepehr_sepehr_saturday_from_server
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4730
- Matthews Correlation: 0.4863
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| No log | 1.0 | 268 | 0.4730 | 0.4863 |
### Framework versions
- Transformers 4.27.1
- Pytorch 2.0.0+cu117
- Datasets 2.12.0
- Tokenizers 0.13.2
| 1,806 | [
[
-0.0270843505859375,
-0.0517578125,
0.0109100341796875,
0.0208587646484375,
-0.031768798828125,
-0.0262298583984375,
-0.019775390625,
-0.0177459716796875,
0.023162841796875,
0.0172119140625,
-0.0516357421875,
-0.03607177734375,
-0.0518798828125,
-0.022308349... |
sepehrbakhshi/bert-base-uncased-finetuned-cola_HW2_sepehr_bakhshi_dropout_05 | 2023-05-06T16:52:44.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | sepehrbakhshi | null | null | sepehrbakhshi/bert-base-uncased-finetuned-cola_HW2_sepehr_bakhshi_dropout_05 | 0 | 2 | transformers | 2023-05-06T13:29:33 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola_HW2_sepehr_bakhshi_dropout_05
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.5779953180551635
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola_HW2_sepehr_bakhshi_dropout_05
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 1.4260
- Matthews Correlation: 0.5780
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 7.530341440816975e-06
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
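The `dropout_05` suffix suggests this run raised BERT's dropout probability to 0.5; a sketch of one way to do that at load time (an assumption — the card itself does not say which dropout was changed):
```python
from transformers import AutoModelForSequenceClassification

# Config overrides passed to from_pretrained update the model config; 0.1 is
# the bert-base-uncased default for both probabilities (0.5 is assumed here).
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased",
    num_labels=2,
    hidden_dropout_prob=0.5,
    attention_probs_dropout_prob=0.5,
)
```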
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:-----:|:---------------:|:--------------------:|
| 0.5137 | 1.0 | 535 | 0.4936 | 0.4808 |
| 0.362 | 2.0 | 1070 | 0.4270 | 0.5781 |
| 0.2679 | 3.0 | 1605 | 0.6409 | 0.5148 |
| 0.2046 | 4.0 | 2140 | 0.5658 | 0.5892 |
| 0.1736 | 5.0 | 2675 | 0.7711 | 0.5624 |
| 0.1378 | 6.0 | 3210 | 0.8053 | 0.5956 |
| 0.1137 | 7.0 | 3745 | 0.9714 | 0.5523 |
| 0.0903 | 8.0 | 4280 | 0.9119 | 0.5735 |
| 0.0839 | 9.0 | 4815 | 1.0448 | 0.5839 |
| 0.0629 | 10.0 | 5350 | 1.2056 | 0.5521 |
| 0.0577 | 11.0 | 5885 | 1.1880 | 0.5889 |
| 0.0505 | 12.0 | 6420 | 1.1722 | 0.5836 |
| 0.0519 | 13.0 | 6955 | 1.2863 | 0.5884 |
| 0.0369 | 14.0 | 7490 | 1.2971 | 0.5608 |
| 0.032 | 15.0 | 8025 | 1.3024 | 0.5785 |
| 0.0244 | 16.0 | 8560 | 1.3904 | 0.5737 |
| 0.0166 | 17.0 | 9095 | 1.4044 | 0.5778 |
| 0.0185 | 18.0 | 9630 | 1.4234 | 0.5650 |
| 0.0168 | 19.0 | 10165 | 1.4384 | 0.5727 |
| 0.0224 | 20.0 | 10700 | 1.4260 | 0.5780 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 3,227 | [
[
-0.038848876953125,
-0.042572021484375,
0.01152801513671875,
0.0098114013671875,
-0.01393890380859375,
-0.01202392578125,
-0.005100250244140625,
-0.00945281982421875,
0.035491943359375,
0.016510009765625,
-0.051849365234375,
-0.045745849609375,
-0.05050659179687... |
ilkekas/bert-base-uncased-mean-pooling-finetuned-cola | 2023-05-06T15:58:54.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | ilkekas | null | null | ilkekas/bert-base-uncased-mean-pooling-finetuned-cola | 0 | 2 | transformers | 2023-05-06T14:24:45 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-mean-pooling-finetuned-cola
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.5627810283916928
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-mean-pooling-finetuned-cola
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4983
- Matthews Correlation: 0.5628
## Model description
More information needed
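The model name indicates the classification head sits on a mean-pooled sentence representation rather than the `[CLS]` token; a sketch of one plausible implementation (an assumption — the card gives no architectural details):
```python
import torch
from torch import nn
from transformers import AutoModel

class MeanPoolingClassifier(nn.Module):
    """BERT encoder + mask-aware mean pooling + linear head (hypothetical)."""

    def __init__(self, num_labels: int = 2):
        super().__init__()
        self.encoder = AutoModel.from_pretrained("bert-base-uncased")
        self.classifier = nn.Linear(self.encoder.config.hidden_size, num_labels)

    def forward(self, input_ids: torch.Tensor, attention_mask: torch.Tensor):
        hidden = self.encoder(input_ids, attention_mask=attention_mask).last_hidden_state
        mask = attention_mask.unsqueeze(-1).float()
        # Average only over real tokens, not padding.
        pooled = (hidden * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)
        return self.classifier(pooled)
```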
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3.3487316926587096e-06
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 9
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.5613 | 1.0 | 535 | 0.4981 | 0.4273 |
| 0.43 | 2.0 | 1070 | 0.4379 | 0.5367 |
| 0.3647 | 3.0 | 1605 | 0.5213 | 0.5030 |
| 0.312 | 4.0 | 2140 | 0.5085 | 0.5391 |
| 0.2832 | 5.0 | 2675 | 0.4983 | 0.5628 |
| 0.245 | 6.0 | 3210 | 0.6061 | 0.5339 |
| 0.2291 | 7.0 | 3745 | 0.5835 | 0.5443 |
| 0.2065 | 8.0 | 4280 | 0.5907 | 0.5443 |
| 0.2032 | 9.0 | 4815 | 0.6072 | 0.5469 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 2,357 | [
[
-0.03277587890625,
-0.043426513671875,
0.004634857177734375,
0.01068878173828125,
-0.0234527587890625,
-0.0181884765625,
-0.0085601806640625,
-0.01119232177734375,
0.030975341796875,
0.01654052734375,
-0.05157470703125,
-0.0347900390625,
-0.053070068359375,
... |
xinyixiuxiu/albert-xxlarge-v2-SST2-incremental_pre_training-epoch1 | 2023-05-07T02:16:11.000Z | [
"transformers",
"tf",
"albert",
"text-classification",
"generated_from_keras_callback",
"endpoints_compatible",
"region:us"
] | text-classification | xinyixiuxiu | null | null | xinyixiuxiu/albert-xxlarge-v2-SST2-incremental_pre_training-epoch1 | 0 | 2 | transformers | 2023-05-06T15:08:36 | ---
tags:
- generated_from_keras_callback
model-index:
- name: xinyixiuxiu/albert-xxlarge-v2-SST2-incremental_pre_training-epoch1
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# xinyixiuxiu/albert-xxlarge-v2-SST2-incremental_pre_training-epoch1
This model was trained from scratch on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.2153
- Train Accuracy: 0.9144
- Validation Loss: 0.1911
- Validation Accuracy: 0.9243
- Epoch: 0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'Adam', 'learning_rate': 3e-06, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}
- training_precision: float32
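A minimal sketch of what this Keras setup might look like in code (the starting checkpoint and the toy batch are assumptions — the card only records the optimizer configuration):
```python
import tensorflow as tf
from transformers import AutoTokenizer, TFAutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("albert-xxlarge-v2")  # assumed checkpoint
model = TFAutoModelForSequenceClassification.from_pretrained("albert-xxlarge-v2",
                                                             num_labels=2)

# Tiny stand-in batch for illustration; the actual SST-2-style data isn't published here.
enc = tokenizer(["a gripping film", "a dull mess"], padding=True, return_tensors="tf")
labels = tf.constant([1, 0])

model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=3e-6, beta_1=0.9,
                                       beta_2=0.999, epsilon=1e-07, amsgrad=False),
    metrics=["accuracy"],
)  # with no explicit loss, the model's built-in loss computation is used
model.fit(dict(enc), labels, epochs=1, batch_size=2)
```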
### Training results
| Train Loss | Train Accuracy | Validation Loss | Validation Accuracy | Epoch |
|:----------:|:--------------:|:---------------:|:-------------------:|:-----:|
| 0.2153 | 0.9144 | 0.1911 | 0.9243 | 0 |
### Framework versions
- Transformers 4.28.1
- TensorFlow 2.7.0
- Datasets 2.10.1
- Tokenizers 0.12.1
| 1,437 | [
[
-0.0308380126953125,
-0.0321044921875,
0.02813720703125,
0.0112762451171875,
-0.03424072265625,
-0.029205322265625,
-0.004756927490234375,
-0.025726318359375,
0.0048370361328125,
0.0157623291015625,
-0.052459716796875,
-0.039306640625,
-0.055450439453125,
-0... |
HassoyKerem/bert-base-uncased-finetuned-cola | 2023-05-07T22:22:10.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | HassoyKerem | null | null | HassoyKerem/bert-base-uncased-finetuned-cola | 0 | 2 | transformers | 2023-05-06T15:16:02 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.512703445942988
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4545
- Matthews Correlation: 0.5127
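For reference, the Matthews correlation reported above can be reproduced with the 🤗 `evaluate` library (a sketch with toy labels — the card does not include its evaluation code):
```python
import evaluate

matthews = evaluate.load("matthews_correlation")
# Toy predictions/references for illustration only.
result = matthews.compute(predictions=[1, 0, 1, 1], references=[1, 0, 0, 1])
print(result)  # {'matthews_correlation': 0.577...}
```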
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.5085 | 1.0 | 535 | 0.4545 | 0.5127 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,721 | [
[
-0.025177001953125,
-0.053253173828125,
0.0116729736328125,
0.0200653076171875,
-0.027496337890625,
-0.022186279296875,
-0.0191650390625,
-0.0154266357421875,
0.025909423828125,
0.0168914794921875,
-0.049774169921875,
-0.030975341796875,
-0.050872802734375,
... |
cansurav/bert-base-uncased-finetuned-cola-batch-16 | 2023-05-06T16:34:33.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | cansurav | null | null | cansurav/bert-base-uncased-finetuned-cola-batch-16 | 0 | 2 | transformers | 2023-05-06T16:20:14 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola-batch-16
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.5992215466535732
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola-batch-16
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4502
- Matthews Correlation: 0.5992
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
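Note that the reported 0.5992 comes from epoch 2, not the final epoch (see the table below); a sketch of how one might keep that best checkpoint automatically (added for illustration — the listed config does not include these flags):
```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="bert-base-uncased-finetuned-cola-batch-16",  # assumed
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    num_train_epochs=10,
    seed=42,
    evaluation_strategy="epoch",
    save_strategy="epoch",
    load_best_model_at_end=True,                   # restore the best checkpoint at the end
    metric_for_best_model="matthews_correlation",  # needs a compute_metrics fn reporting it
    greater_is_better=True,
)
```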
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.4987 | 1.0 | 535 | 0.5145 | 0.4872 |
| 0.3065 | 2.0 | 1070 | 0.4502 | 0.5992 |
| 0.2059 | 3.0 | 1605 | 0.7547 | 0.5208 |
| 0.1467 | 4.0 | 2140 | 0.8557 | 0.5390 |
| 0.1006 | 5.0 | 2675 | 0.9277 | 0.5550 |
| 0.0796 | 6.0 | 3210 | 1.0832 | 0.5765 |
| 0.0532 | 7.0 | 3745 | 1.0337 | 0.5687 |
| 0.0367 | 8.0 | 4280 | 1.1539 | 0.5779 |
| 0.0276 | 9.0 | 4815 | 1.3224 | 0.5755 |
| 0.0192 | 10.0 | 5350 | 1.3055 | 0.5810 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 2,407 | [
[
-0.03277587890625,
-0.047027587890625,
0.0110321044921875,
0.0123748779296875,
-0.0198211669921875,
-0.016998291015625,
-0.01172637939453125,
-0.00951385498046875,
0.031524658203125,
0.0162506103515625,
-0.05328369140625,
-0.038665771484375,
-0.0531005859375,
... |
cansurav/bert-base-uncased-finetuned-cola-batch-32 | 2023-05-06T17:00:41.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | cansurav | null | null | cansurav/bert-base-uncased-finetuned-cola-batch-32 | 0 | 2 | transformers | 2023-05-06T16:34:40 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola-batch-32
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.5927736326773501
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola-batch-32
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8835
- Matthews Correlation: 0.5928
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| No log | 1.0 | 268 | 0.5093 | 0.5049 |
| 0.4202 | 2.0 | 536 | 0.4633 | 0.5600 |
| 0.4202 | 3.0 | 804 | 0.5369 | 0.5393 |
| 0.1814 | 4.0 | 1072 | 0.6271 | 0.5605 |
| 0.1814 | 5.0 | 1340 | 0.7427 | 0.5662 |
| 0.0947 | 6.0 | 1608 | 0.7794 | 0.5697 |
| 0.0947 | 7.0 | 1876 | 0.8835 | 0.5928 |
| 0.0566 | 8.0 | 2144 | 1.0182 | 0.5751 |
| 0.0566 | 9.0 | 2412 | 1.1300 | 0.5549 |
| 0.0296 | 10.0 | 2680 | 1.1266 | 0.5704 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 2,407 | [
[
-0.0302734375,
-0.04559326171875,
0.0106658935546875,
0.01291656494140625,
-0.0203399658203125,
-0.01849365234375,
-0.01169586181640625,
-0.01163482666015625,
0.027740478515625,
0.01540374755859375,
-0.0517578125,
-0.0389404296875,
-0.05419921875,
-0.0224456... |
Xenova/sam-vit-base | 2023-08-24T18:15:31.000Z | [
"transformers.js",
"onnx",
"sam",
"mask-generation",
"region:us"
] | null | Xenova | null | null | Xenova/sam-vit-base | 0 | 2 | transformers.js | 2023-05-06T16:40:37 | ---
library_name: "transformers.js"
---
https://huggingface.co/facebook/sam-vit-base with ONNX weights to be compatible with Transformers.js.
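A repo like this one is typically produced by exporting the original PyTorch checkpoint to ONNX; a minimal sketch using 🤗 Optimum's Python export API (whether the SAM architecture is supported by the exporter, and the task name it infers, should be checked against the Optimum docs — both are assumptions here):
```python
from optimum.exporters.onnx import main_export

# Export the PyTorch checkpoint; weights land in the `onnx/` subfolder
# that this repo layout expects.
main_export("facebook/sam-vit-base", output="onnx")
```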
Note: Having a separate repo for ONNX weights is intended to be a temporary solution until WebML gains more traction. If you would like to make your models web-ready, we recommend converting to ONNX using [🤗 Optimum](https://huggingface.co/docs/optimum/index) and structuring your repo like this one (with ONNX weights located in a subfolder named `onnx`). | 500 | [
[
-0.0209808349609375,
0.019012451171875,
0.031280517578125,
0.043609619140625,
-0.016387939453125,
0.005901336669921875,
0.0035915374755859375,
0.0003521442413330078,
0.035125732421875,
0.0305023193359375,
-0.060577392578125,
-0.03961181640625,
-0.04315185546875,... |
esragenc/bert-base-uncased-finetuned-cola | 2023-05-06T17:17:48.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | esragenc | null | null | esragenc/bert-base-uncased-finetuned-cola | 0 | 2 | transformers | 2023-05-06T16:44:55 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.24864597330745425
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5096
- Matthews Correlation: 0.2486
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1.312312768726691e-06
- train_batch_size: 8
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.5717 | 1.0 | 1069 | 0.5541 | 0.0696 |
| 0.4917 | 2.0 | 2138 | 0.5059 | 0.2335 |
| 0.4603 | 3.0 | 3207 | 0.5096 | 0.2486 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,886 | [
[
-0.0261993408203125,
-0.050506591796875,
0.0093841552734375,
0.0188751220703125,
-0.023529052734375,
-0.0192718505859375,
-0.015625,
-0.01505279541015625,
0.02685546875,
0.017181396484375,
-0.0518798828125,
-0.0307464599609375,
-0.0513916015625,
-0.021514892... |
cansurav/bert-base-uncased-finetuned-cola-batch-64 | 2023-05-06T17:24:58.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | cansurav | null | null | cansurav/bert-base-uncased-finetuned-cola-batch-64 | 0 | 2 | transformers | 2023-05-06T17:00:49 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola-batch-64
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.5835943612387946
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola-batch-64
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7651
- Matthews Correlation: 0.5836
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| No log | 1.0 | 134 | 0.4344 | 0.5367 |
| No log | 2.0 | 268 | 0.4313 | 0.5650 |
| No log | 3.0 | 402 | 0.5034 | 0.5495 |
| 0.3177 | 4.0 | 536 | 0.5733 | 0.5293 |
| 0.3177 | 5.0 | 670 | 0.6364 | 0.5498 |
| 0.3177 | 6.0 | 804 | 0.7316 | 0.5600 |
| 0.3177 | 7.0 | 938 | 0.7651 | 0.5836 |
| 0.0846 | 8.0 | 1072 | 0.8575 | 0.5625 |
| 0.0846 | 9.0 | 1206 | 0.8820 | 0.5573 |
| 0.0846 | 10.0 | 1340 | 0.8854 | 0.5704 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 2,407 | [
[
-0.0302886962890625,
-0.046630859375,
0.00836181640625,
0.01186370849609375,
-0.019073486328125,
-0.0181884765625,
-0.0126953125,
-0.0115509033203125,
0.028656005859375,
0.016387939453125,
-0.051025390625,
-0.039764404296875,
-0.05426025390625,
-0.0226287841... |
cansurav/bert-base-uncased-finetuned-best | 2023-05-06T18:36:13.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | cansurav | null | null | cansurav/bert-base-uncased-finetuned-best | 0 | 2 | transformers | 2023-05-06T17:27:00 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-best
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.6093514522222457
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-best
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4101
- Matthews Correlation: 0.6094
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2.9901559201237305e-05
- train_batch_size: 32
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
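The very specific learning rate suggests it came from a hyperparameter search; a sketch of how such a search might be run with the `Trainer`'s Optuna backend (an assumption — the card does not say how the value was chosen; `optuna` must be installed):
```python
import numpy as np
import evaluate
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

tok = AutoTokenizer.from_pretrained("bert-base-uncased")
encoded = load_dataset("glue", "cola").map(
    lambda b: tok(b["sentence"], truncation=True), batched=True)
mcc = evaluate.load("matthews_correlation")

def compute_metrics(p):
    preds = np.argmax(p.predictions, axis=1)
    return mcc.compute(predictions=preds, references=p.label_ids)

def model_init():
    # A fresh model per trial so runs don't share weights.
    return AutoModelForSequenceClassification.from_pretrained("bert-base-uncased",
                                                              num_labels=2)

def hp_space(trial):
    return {"learning_rate": trial.suggest_float("learning_rate", 1e-6, 5e-5, log=True)}

trainer = Trainer(
    model_init=model_init,
    args=TrainingArguments("hp-search", evaluation_strategy="epoch",
                           per_device_train_batch_size=32, num_train_epochs=5, seed=42),
    train_dataset=encoded["train"], eval_dataset=encoded["validation"],
    tokenizer=tok, compute_metrics=compute_metrics,
)
best = trainer.hyperparameter_search(direction="maximize", hp_space=hp_space,
                                     backend="optuna", n_trials=20)
print(best)
```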
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| No log | 1.0 | 268 | 0.4389 | 0.5041 |
| 0.3831 | 2.0 | 536 | 0.4101 | 0.6094 |
| 0.3831 | 3.0 | 804 | 0.5908 | 0.5854 |
| 0.1334 | 4.0 | 1072 | 0.7048 | 0.6012 |
| 0.1334 | 5.0 | 1340 | 0.7637 | 0.5809 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 2,035 | [
[
-0.0308990478515625,
-0.04644775390625,
0.01047515869140625,
0.01052093505859375,
-0.0249176025390625,
-0.025665283203125,
-0.01824951171875,
-0.01535797119140625,
0.022003173828125,
0.015869140625,
-0.05035400390625,
-0.040435791015625,
-0.053466796875,
-0.... |
sepehrbakhshi/bert-base-uncased-finetuned-cola_HW2_sepehr_bakhshi_dropout_00 | 2023-05-06T18:44:14.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | sepehrbakhshi | null | null | sepehrbakhshi/bert-base-uncased-finetuned-cola_HW2_sepehr_bakhshi_dropout_00 | 0 | 2 | transformers | 2023-05-06T17:39:33 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola_HW2_sepehr_bakhshi_dropout_00
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.5909585115904812
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola_HW2_sepehr_bakhshi_dropout_00
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 1.3418
- Matthews Correlation: 0.5910
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1.9628623388222396e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.484 | 1.0 | 535 | 0.4557 | 0.5053 |
| 0.3013 | 2.0 | 1070 | 0.4224 | 0.5711 |
| 0.1949 | 3.0 | 1605 | 0.8633 | 0.5523 |
| 0.1399 | 4.0 | 2140 | 0.7826 | 0.5858 |
| 0.0933 | 5.0 | 2675 | 0.9575 | 0.5846 |
| 0.0607 | 6.0 | 3210 | 1.0032 | 0.5694 |
| 0.0554 | 7.0 | 3745 | 1.2276 | 0.5702 |
| 0.0368 | 8.0 | 4280 | 1.2437 | 0.5761 |
| 0.0303 | 9.0 | 4815 | 1.2978 | 0.5889 |
| 0.0146 | 10.0 | 5350 | 1.3418 | 0.5910 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 2,466 | [
[
-0.0309295654296875,
-0.041595458984375,
0.00968170166015625,
0.01020050048828125,
-0.0218505859375,
-0.02008056640625,
-0.01309967041015625,
-0.01059722900390625,
0.0234527587890625,
0.0123443603515625,
-0.0518798828125,
-0.04034423828125,
-0.05267333984375,
... |
Jazielinho/filter_ai_news_didgest | 2023-05-08T17:13:39.000Z | [
"sentence-transformers",
"pytorch",
"bert",
"setfit",
"text-classification",
"arxiv:2209.11055",
"license:apache-2.0",
"region:us"
] | text-classification | Jazielinho | null | null | Jazielinho/filter_ai_news_didgest | 0 | 2 | sentence-transformers | 2023-05-06T18:41:22 | ---
license: apache-2.0
tags:
- setfit
- sentence-transformers
- text-classification
pipeline_tag: text-classification
---
# Jazielinho/filter_ai_news_didgest
This is a [SetFit model](https://github.com/huggingface/setfit) that can be used for text classification. The model has been trained using an efficient few-shot learning technique that involves:
1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning.
2. Training a classification head with features from the fine-tuned Sentence Transformer.
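A minimal training sketch for such a model (the base checkpoint shown is the SetFit library default and the two-example dataset is purely illustrative — the actual training data for this model is not published):
```python
from datasets import Dataset
from sentence_transformers.losses import CosineSimilarityLoss
from setfit import SetFitModel, SetFitTrainer

# Hypothetical few-shot examples (label 1 = relevant AI news).
train_ds = Dataset.from_dict({
    "text": ["New open-source LLM released", "Celebrity gossip roundup"],
    "label": [1, 0],
})

model = SetFitModel.from_pretrained("sentence-transformers/paraphrase-mpnet-base-v2")
trainer = SetFitTrainer(
    model=model,
    train_dataset=train_ds,
    loss_class=CosineSimilarityLoss,  # contrastive objective (step 1 above)
    num_iterations=20,                # number of text-pair generation rounds
)
trainer.train()                       # also fits the classification head (step 2 above)
```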
## Usage
To use this model for inference, first install the SetFit library:
```bash
python -m pip install setfit
```
You can then run inference as follows:
```python
from setfit import SetFitModel
# Download from Hub and run inference
model = SetFitModel.from_pretrained("Jazielinho/filter_ai_news_didgest")
# Run inference
preds = model(["i loved the spiderman movie!", "pineapple on pizza is the worst 🤮"])
```
## BibTeX entry and citation info
```bibtex
@article{https://doi.org/10.48550/arxiv.2209.11055,
doi = {10.48550/ARXIV.2209.11055},
url = {https://arxiv.org/abs/2209.11055},
author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
title = {Efficient Few-Shot Learning Without Prompts},
publisher = {arXiv},
year = {2022},
copyright = {Creative Commons Attribution 4.0 International}
}
```
| 1,555 | [
[
-0.0136260986328125,
-0.064697265625,
0.0321044921875,
-0.0182952880859375,
-0.019073486328125,
-0.017120361328125,
-0.018646240234375,
-0.01003265380859375,
-0.0011119842529296875,
0.03826904296875,
-0.04376220703125,
-0.02642822265625,
-0.04119873046875,
0... |