modelId stringlengths 4 111 | lastModified stringlengths 24 24 | tags list | pipeline_tag stringlengths 5 30 ⌀ | author stringlengths 2 34 ⌀ | config null | securityStatus null | id stringlengths 4 111 | likes int64 0 9.53k | downloads int64 2 73.6M | library_name stringlengths 2 84 ⌀ | created timestamp[us] | card stringlengths 101 901k | card_len int64 101 901k | embeddings list |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
tigindundar4/bert-base-uncased-finetuned-cola | 2023-05-07T14:49:26.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | tigindundar4 | null | null | tigindundar4/bert-base-uncased-finetuned-cola | 0 | 2 | transformers | 2023-05-06T18:57:43 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.5108235781406687
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4659
- Matthews Correlation: 0.5108
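The Matthews correlation reported by these CoLA cards can be reproduced from the binary confusion matrix. A minimal sketch (the function name and toy labels below are illustrative, not from the card):

```python
import math

def matthews_corr(y_true, y_pred):
    """Matthews correlation coefficient for binary labels (0/1).

    MCC = (TP*TN - FP*FN) / sqrt((TP+FP)(TP+FN)(TN+FP)(TN+FN));
    returns 0.0 when any marginal count is zero, matching common
    library conventions.
    """
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return 0.0 if denom == 0 else (tp * tn - fp * fn) / denom

# A perfect predictor scores 1.0; chance-level predictions score near 0.
print(matthews_corr([1, 0, 1, 0], [1, 0, 1, 0]))  # 1.0
```

Unlike plain accuracy, MCC stays informative on CoLA's imbalanced acceptable/unacceptable split, which is why the benchmark reports it.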
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
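The `linear` scheduler named above decays the learning rate from its initial value to zero over the run (with an optional warmup, unused here). A sketch of the per-step multiplier, assuming the conventional Trainer behavior; the step count of 535 matches the table below (CoLA's 8,551 training sentences at batch size 16):

```python
def linear_lr(step, total_steps, base_lr=2e-5, warmup_steps=0):
    """Linear schedule: ramp up to base_lr over warmup_steps,
    then decay linearly to 0 at total_steps."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

total = 535  # one epoch of CoLA at train_batch_size=16
print(linear_lr(0, total))      # 2e-05 at the first step
print(linear_lr(total, total))  # 0.0 at the last step
```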
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.4908 | 1.0 | 535 | 0.4659 | 0.5108 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,722 | [
[
-0.0252838134765625,
-0.0533447265625,
0.01160430908203125,
0.0211334228515625,
-0.027587890625,
-0.0226898193359375,
-0.0194854736328125,
-0.015289306640625,
0.0257110595703125,
0.0164794921875,
-0.049530029296875,
-0.03118896484375,
-0.050201416015625,
-0.... |
berkozcelik/bert-base-uncased-finetuned-cola | 2023-05-07T18:13:49.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | berkozcelik | null | null | berkozcelik/bert-base-uncased-finetuned-cola | 0 | 2 | transformers | 2023-05-06T19:02:13 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.5365723103616664
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4582
- Matthews Correlation: 0.5366
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.4912 | 1.0 | 535 | 0.4582 | 0.5366 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,722 | [
[
-0.025115966796875,
-0.052581787109375,
0.01226043701171875,
0.0208892822265625,
-0.02777099609375,
-0.0220489501953125,
-0.0189056396484375,
-0.0152435302734375,
0.025726318359375,
0.016571044921875,
-0.049957275390625,
-0.0311126708984375,
-0.05072021484375,
... |
sepehrbakhshi/bert-base-uncased-finetuned-cola_HW2_sepehr_bakhshi_dropout_05_16 | 2023-05-06T21:46:53.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | sepehrbakhshi | null | null | sepehrbakhshi/bert-base-uncased-finetuned-cola_HW2_sepehr_bakhshi_dropout_05_16 | 0 | 2 | transformers | 2023-05-06T19:08:51 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola_HW2_sepehr_bakhshi_dropout_05_16
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.5905209134554644
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola_HW2_sepehr_bakhshi_dropout_05_16
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7283
- Matthews Correlation: 0.5905
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1.0356344528514278e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.5012 | 1.0 | 535 | 0.4807 | 0.4912 |
| 0.3376 | 2.0 | 1070 | 0.4363 | 0.5882 |
| 0.2395 | 3.0 | 1605 | 0.6192 | 0.5351 |
| 0.1814 | 4.0 | 2140 | 0.6754 | 0.5931 |
| 0.1554 | 5.0 | 2675 | 0.7283 | 0.5905 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 2,101 | [
[
-0.02972412109375,
-0.04486083984375,
0.0103912353515625,
0.0146942138671875,
-0.024749755859375,
-0.0205841064453125,
-0.016815185546875,
-0.01282501220703125,
0.02313232421875,
0.0135498046875,
-0.0516357421875,
-0.037811279296875,
-0.053619384765625,
-0.0... |
KubraCaglar/bert-base-uncased-finetuned-cola | 2023-05-07T23:26:13.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | KubraCaglar | null | null | KubraCaglar/bert-base-uncased-finetuned-cola | 0 | 2 | transformers | 2023-05-06T20:05:23 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.49430354503894686
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4718
- Matthews Correlation: 0.4943
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.5023 | 1.0 | 535 | 0.4718 | 0.4943 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,723 | [
[
-0.0258331298828125,
-0.052886962890625,
0.01111602783203125,
0.020721435546875,
-0.0281524658203125,
-0.021636962890625,
-0.01953125,
-0.01509857177734375,
0.025482177734375,
0.0166168212890625,
-0.049346923828125,
-0.0309295654296875,
-0.05120849609375,
-0... |
sepehrbakhshi/bert-base-uncased-finetuned-cola_HW2_sepehr_bakhshi_dropout_00_16 | 2023-05-06T23:23:49.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | sepehrbakhshi | null | null | sepehrbakhshi/bert-base-uncased-finetuned-cola_HW2_sepehr_bakhshi_dropout_00_16 | 0 | 2 | transformers | 2023-05-06T21:50:14 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola_HW2_sepehr_bakhshi_dropout_00_16
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.6008788381144764
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola_HW2_sepehr_bakhshi_dropout_00_16
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 1.0825
- Matthews Correlation: 0.6009
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1.1204324670557534e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.4985 | 1.0 | 535 | 0.4773 | 0.4879 |
| 0.3349 | 2.0 | 1070 | 0.4213 | 0.6088 |
| 0.2322 | 3.0 | 1605 | 0.6781 | 0.5232 |
| 0.1763 | 4.0 | 2140 | 0.6570 | 0.5836 |
| 0.1367 | 5.0 | 2675 | 0.7957 | 0.5880 |
| 0.1047 | 6.0 | 3210 | 0.8028 | 0.6263 |
| 0.0823 | 7.0 | 3745 | 1.0014 | 0.5754 |
| 0.0614 | 8.0 | 4280 | 0.9796 | 0.6012 |
| 0.0576 | 9.0 | 4815 | 1.0651 | 0.6082 |
| 0.0394 | 10.0 | 5350 | 1.0825 | 0.6009 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 2,472 | [
[
-0.032257080078125,
-0.043304443359375,
0.00904083251953125,
0.011749267578125,
-0.020843505859375,
-0.02008056640625,
-0.0137176513671875,
-0.01290130615234375,
0.02386474609375,
0.013092041015625,
-0.052276611328125,
-0.040771484375,
-0.053253173828125,
-0... |
anilbayramg/bert-base-uncased-finetuned-cola | 2023-05-08T00:09:07.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | anilbayramg | null | null | anilbayramg/bert-base-uncased-finetuned-cola | 0 | 2 | transformers | 2023-05-06T22:33:57 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.4992111877160894
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4890
- Matthews Correlation: 0.4992
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.5381 | 1.0 | 535 | 0.4890 | 0.4992 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,722 | [
[
-0.0254669189453125,
-0.052764892578125,
0.01049041748046875,
0.0207366943359375,
-0.0281524658203125,
-0.021484375,
-0.01922607421875,
-0.01459503173828125,
0.025909423828125,
0.0163726806640625,
-0.049041748046875,
-0.0311126708984375,
-0.051483154296875,
... |
Gridflow/distilbert-base-uncased-finetuned-emotion2 | 2023-05-07T00:54:48.000Z | [
"transformers",
"pytorch",
"distilbert",
"text-classification",
"generated_from_trainer",
"dataset:emotion",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | Gridflow | null | null | Gridflow/distilbert-base-uncased-finetuned-emotion2 | 0 | 2 | transformers | 2023-05-07T00:50:31 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- emotion
metrics:
- accuracy
- f1
model-index:
- name: distilbert-base-uncased-finetuned-emotion2
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: emotion
type: emotion
config: split
split: validation
args: split
metrics:
- name: Accuracy
type: accuracy
value: 0.9275
- name: F1
type: f1
value: 0.9275719429504966
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-emotion2
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the emotion dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2226
- Accuracy: 0.9275
- F1: 0.9276
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| 0.8425 | 1.0 | 250 | 0.3132 | 0.9065 | 0.9038 |
| 0.2536 | 2.0 | 500 | 0.2226 | 0.9275 | 0.9276 |
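The emotion cards report both accuracy and an F1 that nearly matches it; this is consistent with support-weighted F1 over the six emotion classes, though the card does not state the averaging mode, so weighted averaging is an assumption here. A minimal sketch:

```python
from collections import Counter

def f1_weighted(y_true, y_pred):
    """Support-weighted F1: each class's F1 weighted by its
    true-label count (assumed averaging mode, not stated on the card)."""
    support = Counter(y_true)
    total = 0.0
    for c in sorted(set(y_true)):
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        pred_c = sum(1 for p in y_pred if p == c)
        prec = tp / pred_c if pred_c else 0.0
        rec = tp / support[c]
        f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
        total += f1 * support[c] / len(y_true)
    return total

print(f1_weighted([0, 0, 1, 1], [0, 1, 1, 1]))  # 0.7333...
```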
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.1+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,850 | [
[
-0.03271484375,
-0.03997802734375,
0.0131988525390625,
0.0211944580078125,
-0.02642822265625,
-0.020233154296875,
-0.012237548828125,
-0.01145172119140625,
0.004886627197265625,
0.00839996337890625,
-0.054168701171875,
-0.045135498046875,
-0.06085205078125,
... |
utkuden/bert-base-uncased-finetuned-cola | 2023-05-07T12:52:42.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | utkuden | null | null | utkuden/bert-base-uncased-finetuned-cola | 0 | 2 | transformers | 2023-05-07T01:22:52 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.5286883616838448
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4453
- Matthews Correlation: 0.5287
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.4849 | 1.0 | 535 | 0.4453 | 0.5287 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,722 | [
[
-0.0258331298828125,
-0.05242919921875,
0.0116729736328125,
0.0210418701171875,
-0.0279693603515625,
-0.022125244140625,
-0.0188446044921875,
-0.01544189453125,
0.0251312255859375,
0.0164947509765625,
-0.0496826171875,
-0.0308837890625,
-0.05023193359375,
-0... |
takeshiho0531/distilbert-base-uncased-finetuned-emotion | 2023-05-07T04:02:49.000Z | [
"transformers",
"pytorch",
"tensorboard",
"distilbert",
"text-classification",
"generated_from_trainer",
"dataset:emotion",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | takeshiho0531 | null | null | takeshiho0531/distilbert-base-uncased-finetuned-emotion | 0 | 2 | transformers | 2023-05-07T03:39:46 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- emotion
metrics:
- accuracy
- f1
model-index:
- name: distilbert-base-uncased-finetuned-emotion
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: emotion
type: emotion
config: split
split: validation
args: split
metrics:
- name: Accuracy
type: accuracy
value: 0.9295
- name: F1
type: f1
value: 0.9295553605965364
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-emotion
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the emotion dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2124
- Accuracy: 0.9295
- F1: 0.9296
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| 0.8137 | 1.0 | 250 | 0.3047 | 0.908 | 0.9041 |
| 0.2447 | 2.0 | 500 | 0.2124 | 0.9295 | 0.9296 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,848 | [
[
-0.038238525390625,
-0.041534423828125,
0.01495361328125,
0.0216217041015625,
-0.026275634765625,
-0.01885986328125,
-0.01276397705078125,
-0.00864410400390625,
0.01047515869140625,
0.00861358642578125,
-0.056976318359375,
-0.0517578125,
-0.05950927734375,
-... |
Pendo/finetuned-Sentiment-classfication-BERT-model | 2023-05-07T07:04:32.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] | text-classification | Pendo | null | null | Pendo/finetuned-Sentiment-classfication-BERT-model | 0 | 2 | transformers | 2023-05-07T05:56:16 | ---
license: apache-2.0
tags:
- generated_from_trainer
model-index:
- name: finetuned-Sentiment-classfication-BERT-model
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# finetuned-Sentiment-classfication-BERT-model
This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6056
- Rmse: 0.6890
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 10
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rmse |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 0.7754 | 2.0 | 500 | 0.6056 | 0.6890 |
| 0.3975 | 4.0 | 1000 | 0.6982 | 0.6452 |
| 0.1308 | 6.0 | 1500 | 1.0715 | 0.6643 |
| 0.0526 | 8.0 | 2000 | 1.3439 | 0.6571 |
| 0.0241 | 10.0 | 2500 | 1.4676 | 0.6695 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,722 | [
[
-0.052825927734375,
-0.052337646484375,
0.0078582763671875,
0.01451873779296875,
-0.029449462890625,
-0.026824951171875,
-0.0254058837890625,
-0.002597808837890625,
0.014923095703125,
0.023284912109375,
-0.06640625,
-0.048187255859375,
-0.050750732421875,
-0... |
vvmnnnkv/wine-quality | 2023-05-19T07:56:37.000Z | [
"sklearn",
"tabular-classification",
"dataset:wine-quality",
"dataset:lvwerra/red-wine",
"region:us"
] | tabular-classification | vvmnnnkv | null | null | vvmnnnkv/wine-quality | 1 | 2 | sklearn | 2023-05-07T07:50:26 | ---
tags:
- tabular-classification
- sklearn
datasets:
- wine-quality
- lvwerra/red-wine
widget:
structuredData:
fixed_acidity:
- 7.4
- 7.8
- 10.3
volatile_acidity:
- 0.7
- 0.88
- 0.32
citric_acid:
- 0
- 0
- 0.45
residual_sugar:
- 1.9
- 2.6
- 6.4
chlorides:
- 0.076
- 0.098
- 0.073
free_sulfur_dioxide:
- 11
- 25
- 5
total_sulfur_dioxide:
- 34
- 67
- 13
density:
- 0.9978
- 0.9968
- 0.9976
pH:
- 3.51
- 3.2
- 3.23
sulphates:
- 0.56
- 0.68
- 0.82
alcohol:
- 9.4
- 9.8
- 12.6
library_name: sklearn
pipeline_tag: tabular-classification
---
## Wine Quality classification
### A Simple Example of Scikit-learn Pipeline
> Inspired by https://towardsdatascience.com/a-simple-example-of-pipeline-in-machine-learning-with-scikit-learn-e726ffbb6976 by Saptashwa Bhattacharyya
### How to use
```python
from huggingface_hub import hf_hub_url, cached_download
import joblib
import pandas as pd

# Note: the card points at julien-c/wine-quality, the upstream repo this
# model card was adapted from; `cached_download` is the huggingface_hub
# API of that era (newer releases replace it with `hf_hub_download`).
REPO_ID = "julien-c/wine-quality"
FILENAME = "sklearn_model.joblib"

model = joblib.load(cached_download(hf_hub_url(REPO_ID, FILENAME)))
# model is a `sklearn.pipeline.Pipeline`
```
#### Get sample data from this repo
```python
data_file = cached_download(
hf_hub_url(REPO_ID, "winequality-red.csv")
)
winedf = pd.read_csv(data_file, sep=";")
X = winedf.drop(["quality"], axis=1)
Y = winedf["quality"]
print(X[:3])
```
| | fixed acidity | volatile acidity | citric acid | residual sugar | chlorides | free sulfur dioxide | total sulfur dioxide | density | pH | sulphates | alcohol |
|---:|----------------:|-------------------:|--------------:|-----------------:|------------:|----------------------:|-----------------------:|----------:|-----:|------------:|----------:|
| 0 | 7.4 | 0.7 | 0 | 1.9 | 0.076 | 11 | 34 | 0.9978 | 3.51 | 0.56 | 9.4 |
| 1 | 7.8 | 0.88 | 0 | 2.6 | 0.098 | 25 | 67 | 0.9968 | 3.2 | 0.68 | 9.8 |
| 2 | 7.8 | 0.76 | 0.04 | 2.3 | 0.092 | 15 | 54 | 0.997 | 3.26 | 0.65 | 9.8 |
#### Get your prediction
```python
labels = model.predict(X[:3])
# [5, 5, 5]
```
#### Eval
```python
model.score(X, Y)
# 0.6616635397123202
```
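The card loads the fitted pipeline via joblib without showing its steps. A toy stand-in built the same way — a scaler feeding a classifier — illustrates the pattern; the step choices and the two-feature data below are assumptions for illustration, not the repo's actual pipeline:

```python
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

# Two illustrative features (fixed acidity, volatile acidity) and
# quality labels; the real model is trained on all eleven columns.
X = [[7.4, 0.70], [7.8, 0.88], [10.3, 0.32], [9.1, 0.40]]
y = [5, 5, 6, 6]

pipe = Pipeline([
    ("scale", StandardScaler()),       # normalize feature ranges
    ("clf", LogisticRegression(max_iter=1000)),
])
pipe.fit(X, y)
print(pipe.predict([[10.0, 0.35]]))
```

Because the scaler and classifier live in one `Pipeline` object, `joblib.dump`/`joblib.load` round-trips the whole preprocessing-plus-model chain, which is what makes the single-file distribution above work.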
### 🍷 Disclaimer
No red wine was drunk (unfortunately) while training this model 🍷 | 2,681 | [
[
-0.0121307373046875,
-0.01593017578125,
0.00524139404296875,
0.01552581787109375,
-0.016021728515625,
-0.003986358642578125,
0.0085601806640625,
-0.007564544677734375,
0.02899169921875,
0.0350341796875,
-0.038177490234375,
-0.05462646484375,
-0.03924560546875,
... |
yagmurery/bert-base-uncased-finetuned-part2-cola | 2023-05-07T14:51:30.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | yagmurery | null | null | yagmurery/bert-base-uncased-finetuned-part2-cola | 0 | 2 | transformers | 2023-05-07T08:51:17 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-last-cola
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.5892439733711194
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-last-cola
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8731
- Matthews Correlation: 0.5892
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3.350326176009724e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 8
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.4834 | 1.0 | 535 | 0.4471 | 0.5024 |
| 0.287 | 2.0 | 1070 | 0.4596 | 0.5573 |
| 0.1848 | 3.0 | 1605 | 0.8394 | 0.5140 |
| 0.1257 | 4.0 | 2140 | 0.8731 | 0.5892 |
| 0.0719 | 5.0 | 2675 | 0.9607 | 0.5851 |
| 0.0467 | 6.0 | 3210 | 1.0737 | 0.5731 |
| 0.0339 | 7.0 | 3745 | 1.3356 | 0.5470 |
| 0.0216 | 8.0 | 4280 | 1.3521 | 0.5579 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 2,266 | [
[
-0.0270538330078125,
-0.046722412109375,
0.00989532470703125,
0.01235198974609375,
-0.0204315185546875,
-0.015960693359375,
-0.0122833251953125,
-0.01311492919921875,
0.0284881591796875,
0.01654052734375,
-0.050323486328125,
-0.037506103515625,
-0.052734375,
... |
senihylmz/bert-base-uncased-finetuned-cola | 2023-05-07T20:51:07.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | senihylmz | null | null | senihylmz/bert-base-uncased-finetuned-cola | 0 | 2 | transformers | 2023-05-07T09:03:35 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.4832216996895926
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4618
- Matthews Correlation: 0.4832
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.5065 | 1.0 | 535 | 0.4618 | 0.4832 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,722 | [
[
-0.0255126953125,
-0.052642822265625,
0.01025390625,
0.020904541015625,
-0.027587890625,
-0.0215911865234375,
-0.0193939208984375,
-0.01471710205078125,
0.025604248046875,
0.016082763671875,
-0.04949951171875,
-0.03076171875,
-0.050933837890625,
-0.020736694... |
EcemSimsek/bert-base-uncased-finetuned-cola | 2023-05-07T21:42:13.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | EcemSimsek | null | null | EcemSimsek/bert-base-uncased-finetuned-cola | 0 | 2 | transformers | 2023-05-07T09:22:32 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.5208528714430889
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4661
- Matthews Correlation: 0.5209
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1.13e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| No log | 1.0 | 268 | 0.4526 | 0.5206 |
| 0.4593 | 2.0 | 536 | 0.4661 | 0.5209 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,799 | [
[
-0.0251007080078125,
-0.05303955078125,
0.01081085205078125,
0.02032470703125,
-0.0257415771484375,
-0.0219879150390625,
-0.018463134765625,
-0.0161590576171875,
0.02532958984375,
0.0169525146484375,
-0.051422119140625,
-0.0302886962890625,
-0.050994873046875,
... |
DorukKaraman/bert-base-uncased-finetuned-cola | 2023-05-07T17:54:31.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | DorukKaraman | null | null | DorukKaraman/bert-base-uncased-finetuned-cola | 0 | 2 | transformers | 2023-05-07T09:43:16 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.54781790671712
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5338
- Matthews Correlation: 0.5478
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6.256549330223815e-06
- train_batch_size: 8
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.4533 | 1.0 | 1069 | 0.4844 | 0.4614 |
| 0.3347 | 2.0 | 2138 | 0.5338 | 0.5478 |
| 0.2847 | 3.0 | 3207 | 0.6569 | 0.5416 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,883 | [
[
-0.026824951171875,
-0.0506591796875,
0.0108489990234375,
0.01898193359375,
-0.0243682861328125,
-0.0203857421875,
-0.0171661376953125,
-0.01457977294921875,
0.0261688232421875,
0.01678466796875,
-0.051025390625,
-0.03082275390625,
-0.051666259765625,
-0.021... |
thomasavare/distilbert-ft-test3 | 2023-05-23T14:41:21.000Z | [
"transformers",
"tf",
"distilbert",
"text-classification",
"generated_from_keras_callback",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | text-classification | thomasavare | null | null | thomasavare/distilbert-ft-test3 | 0 | 2 | transformers | 2023-05-07T10:08:10 | ---
license: apache-2.0
tags:
- generated_from_keras_callback
model-index:
- name: distilbert-ft-test3
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# distilbert-ft-test3
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on [thomasavare/waste-classification-v2](https://huggingface.co/datasets/thomasavare/waste-classification-v2).
It is part of my master's thesis at Politecnico di Torino, in partnership with ReLearn.
It achieves the following results on the test set:
| accuracy | precision | recall | f1 |
|----------|-----------|--------|--------|
| 0.974 | 0.9805 | 0.9732 | 0.9725 |
## Model description
DistilBERT fine-tuned for waste classification on 50 different classes, as part of my master's thesis at Politecnico di Torino.
## Intended uses & limitations
Use for waste classification on 50 different waste classes (see [dataset](https://huggingface.co/datasets/thomasavare/waste-classification-v2))
## Training and evaluation data
[waste-classification-v2 dataset](https://huggingface.co/datasets/thomasavare/waste-classification-v2)
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'Adam', 'weight_decay': None, 'clipnorm': None, 'global_clipnorm': None, 'clipvalue': None, 'use_ema': False, 'ema_momentum': 0.99, 'ema_overwrite_frequency': None, 'jit_compile': True, 'is_legacy_optimizer': False, 'learning_rate': 5e-05, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}
- training_precision: float32
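The Adam configuration listed above corresponds to the standard update rule; a minimal single-parameter sketch with the listed `learning_rate`, betas, and epsilon (the gradient value is illustrative):

```python
import math

def adam_step(w, g, m, v, t, lr=5e-05, beta1=0.9, beta2=0.999, eps=1e-08):
    """One Adam update for a scalar parameter w with gradient g at step t (1-based)."""
    m = beta1 * m + (1 - beta1) * g          # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * g * g      # second-moment (uncentered variance) estimate
    m_hat = m / (1 - beta1 ** t)             # bias correction for zero initialization
    v_hat = v / (1 - beta2 ** t)
    w = w - lr * m_hat / (math.sqrt(v_hat) + eps)
    return w, m, v

w, m, v = adam_step(w=0.5, g=1.0, m=0.0, v=0.0, t=1)
# On the first step the bias-corrected update is ≈ lr, so w ≈ 0.49995
```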
### Training results
### Framework versions
- Transformers 4.28.1
- TensorFlow 2.12.0
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,874 | [
[
-0.03204345703125,
-0.0277099609375,
0.0269012451171875,
-0.012939453125,
-0.01849365234375,
-0.0264892578125,
0.00547027587890625,
-0.007122039794921875,
0.001312255859375,
0.01080322265625,
-0.024169921875,
-0.03802490234375,
-0.06390380859375,
-0.00073051... |
sharoz/codeparrot-small-custom-functions-dataset-python | 2023-05-07T10:44:48.000Z | [
"transformers",
"pytorch",
"gpt2",
"text-generation",
"generated_from_trainer",
"license:apache-2.0",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | text-generation | sharoz | null | null | sharoz/codeparrot-small-custom-functions-dataset-python | 0 | 2 | transformers | 2023-05-07T10:33:36 | ---
license: apache-2.0
tags:
- generated_from_trainer
model-index:
- name: codeparrot-small-custom-functions-dataset-python
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# codeparrot-small-custom-functions-dataset-python
This model is a fine-tuned version of [codeparrot/codeparrot-small](https://huggingface.co/codeparrot/codeparrot-small) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4238
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 1.216 | 0.12 | 1 | 1.0747 |
| 1.051 | 0.25 | 2 | 1.0005 |
| 0.9855 | 0.38 | 3 | 0.9462 |
| 0.9259 | 0.5 | 4 | 0.9042 |
| 0.9236 | 0.62 | 5 | 0.8675 |
| 0.8644 | 0.75 | 6 | 0.8331 |
| 0.8148 | 0.88 | 7 | 0.8030 |
| 0.7554 | 1.0 | 8 | 0.7800 |
| 0.7815 | 1.12 | 9 | 0.7600 |
| 0.784 | 1.25 | 10 | 0.7440 |
| 0.635 | 1.38 | 11 | 0.7309 |
| 0.6666 | 1.5 | 12 | 0.7170 |
| 0.7676 | 1.62 | 13 | 0.6993 |
| 0.6608 | 1.75 | 14 | 0.6835 |
| 0.6885 | 1.88 | 15 | 0.6696 |
| 0.69 | 2.0 | 16 | 0.6582 |
| 0.6343 | 2.12 | 17 | 0.6463 |
| 0.709 | 2.25 | 18 | 0.6324 |
| 0.5446 | 2.38 | 19 | 0.6206 |
| 0.5298 | 2.5 | 20 | 0.6102 |
| 0.6478 | 2.62 | 21 | 0.6016 |
| 0.546 | 2.75 | 22 | 0.5941 |
| 0.6297 | 2.88 | 23 | 0.5871 |
| 0.4518 | 3.0 | 24 | 0.5814 |
| 0.566 | 3.12 | 25 | 0.5769 |
| 0.6285 | 3.25 | 26 | 0.5702 |
| 0.5938 | 3.38 | 27 | 0.5631 |
| 0.514 | 3.5 | 28 | 0.5568 |
| 0.5113 | 3.62 | 29 | 0.5504 |
| 0.512 | 3.75 | 30 | 0.5451 |
| 0.4392 | 3.88 | 31 | 0.5407 |
| 0.5097 | 4.0 | 32 | 0.5370 |
| 0.4866 | 4.12 | 33 | 0.5326 |
| 0.5028 | 4.25 | 34 | 0.5285 |
| 0.5438 | 4.38 | 35 | 0.5228 |
| 0.5424 | 4.5 | 36 | 0.5166 |
| 0.5156 | 4.62 | 37 | 0.5108 |
| 0.4335 | 4.75 | 38 | 0.5056 |
| 0.4298 | 4.88 | 39 | 0.5013 |
| 0.5268 | 5.0 | 40 | 0.4978 |
| 0.4714 | 5.12 | 41 | 0.4938 |
| 0.4659 | 5.25 | 42 | 0.4907 |
| 0.4573 | 5.38 | 43 | 0.4874 |
| 0.4689 | 5.5 | 44 | 0.4847 |
| 0.4346 | 5.62 | 45 | 0.4824 |
| 0.4563 | 5.75 | 46 | 0.4794 |
| 0.4505 | 5.88 | 47 | 0.4761 |
| 0.7359 | 6.0 | 48 | 0.4732 |
| 0.4704 | 6.12 | 49 | 0.4706 |
| 0.4223 | 6.25 | 50 | 0.4685 |
| 0.4789 | 6.38 | 51 | 0.4651 |
| 0.4402 | 6.5 | 52 | 0.4624 |
| 0.4454 | 6.62 | 53 | 0.4597 |
| 0.4496 | 6.75 | 54 | 0.4566 |
| 0.3942 | 6.88 | 55 | 0.4539 |
| 0.2915 | 7.0 | 56 | 0.4515 |
| 0.3926 | 7.12 | 57 | 0.4496 |
| 0.4102 | 7.25 | 58 | 0.4474 |
| 0.4235 | 7.38 | 59 | 0.4456 |
| 0.4841 | 7.5 | 60 | 0.4441 |
| 0.3914 | 7.62 | 61 | 0.4423 |
| 0.4417 | 7.75 | 62 | 0.4404 |
| 0.4212 | 7.88 | 63 | 0.4384 |
| 0.4343 | 8.0 | 64 | 0.4369 |
| 0.4159 | 8.12 | 65 | 0.4355 |
| 0.4193 | 8.25 | 66 | 0.4343 |
| 0.4393 | 8.38 | 67 | 0.4333 |
| 0.4507 | 8.5 | 68 | 0.4319 |
| 0.3855 | 8.62 | 69 | 0.4305 |
| 0.4064 | 8.75 | 70 | 0.4293 |
| 0.4044 | 8.88 | 71 | 0.4283 |
| 0.2957 | 9.0 | 72 | 0.4275 |
| 0.4442 | 9.12 | 73 | 0.4266 |
| 0.4142 | 9.25 | 74 | 0.4260 |
| 0.4022 | 9.38 | 75 | 0.4253 |
| 0.4161 | 9.5 | 76 | 0.4248 |
| 0.3828 | 9.62 | 77 | 0.4244 |
| 0.384 | 9.75 | 78 | 0.4241 |
| 0.3985 | 9.88 | 79 | 0.4239 |
| 0.4912 | 10.0 | 80 | 0.4238 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 5,364 | [
[
-0.044219970703125,
-0.03955078125,
0.00873565673828125,
0.0019626617431640625,
-0.0013599395751953125,
0.004486083984375,
0.0009627342224121094,
0.005786895751953125,
0.051849365234375,
0.023468017578125,
-0.042144775390625,
-0.04315185546875,
-0.04071044921875... |
ilhanemirhan/bert-base-uncased-finetuned-cola | 2023-05-07T20:17:21.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | text-classification | ilhanemirhan | null | null | ilhanemirhan/bert-base-uncased-finetuned-cola | 0 | 2 | transformers | 2023-05-07T10:52:04 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
model-index:
- name: bert-base-uncased-finetuned-cola
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- eval_loss: 0.8619
- eval_matthews_correlation: 0.5625
- eval_runtime: 1.8285
- eval_samples_per_second: 570.412
- eval_steps_per_second: 71.643
- step: 0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,265 | [
[
-0.02606201171875,
-0.057342529296875,
0.007610321044921875,
0.0240936279296875,
-0.0278167724609375,
-0.019439697265625,
-0.0225830078125,
-0.0140228271484375,
0.0226593017578125,
0.0160369873046875,
-0.044769287109375,
-0.0281982421875,
-0.0479736328125,
-... |
keytiong/distilbert-base-uncased-finetuned-emotion | 2023-07-29T14:36:29.000Z | [
"transformers",
"pytorch",
"distilbert",
"text-classification",
"generated_from_trainer",
"dataset:emotion",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | keytiong | null | null | keytiong/distilbert-base-uncased-finetuned-emotion | 0 | 2 | transformers | 2023-05-07T10:59:04 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- emotion
metrics:
- accuracy
- f1
model-index:
- name: distilbert-base-uncased-finetuned-emotion
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: emotion
type: emotion
config: split
split: validation
args: split
metrics:
- name: Accuracy
type: accuracy
value: 0.923
- name: F1
type: f1
value: 0.9229063505545305
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-emotion
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the emotion dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2243
- Accuracy: 0.923
- F1: 0.9229
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| 0.8371 | 1.0 | 250 | 0.3205 | 0.9015 | 0.8987 |
| 0.2512 | 2.0 | 500 | 0.2243 | 0.923 | 0.9229 |
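The accuracy and (support-weighted) F1 reported above follow the usual multi-class definitions; a minimal sketch over illustrative labels, not the actual emotion validation set:

```python
from collections import Counter

def accuracy_and_weighted_f1(labels, preds):
    """Accuracy and support-weighted F1 for multi-class predictions."""
    n_total = len(labels)
    acc = sum(l == p for l, p in zip(labels, preds)) / n_total
    weighted_f1 = 0.0
    for c, support in Counter(labels).items():
        tp = sum(l == c and p == c for l, p in zip(labels, preds))
        fp = sum(l != c and p == c for l, p in zip(labels, preds))
        fn = sum(l == c and p != c for l, p in zip(labels, preds))
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
        weighted_f1 += support / n_total * f1  # weight each class F1 by its support

    return acc, weighted_f1

acc, f1 = accuracy_and_weighted_f1([0, 0, 1, 1, 2], [0, 1, 1, 1, 2])
# acc = 0.8; weighted F1 ≈ 0.787
```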
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.1+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,846 | [
[
-0.037689208984375,
-0.0419921875,
0.01514434814453125,
0.021759033203125,
-0.026214599609375,
-0.0189971923828125,
-0.013275146484375,
-0.00850677490234375,
0.01105499267578125,
0.008392333984375,
-0.05657958984375,
-0.052154541015625,
-0.060028076171875,
-... |
mazkooleg/digit-mask-wavlm-base-plus-ft | 2023-05-07T11:43:04.000Z | [
"transformers",
"pytorch",
"wavlm",
"audio-classification",
"generated_from_trainer",
"dataset:mazkooleg/digit_mask_augmented_raw",
"endpoints_compatible",
"region:us"
] | audio-classification | mazkooleg | null | null | mazkooleg/digit-mask-wavlm-base-plus-ft | 0 | 2 | transformers | 2023-05-07T11:07:17 | ---
tags:
- generated_from_trainer
metrics:
- accuracy
- f1
model-index:
- name: wavlm-base-plus-digit-mask-ft
results: []
datasets:
- mazkooleg/digit_mask_augmented_raw
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wavlm-base-plus-digit-mask-ft
This model is a fine-tuned version of [microsoft/wavlm-base-plus](https://huggingface.co/microsoft/wavlm-base-plus) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0068
- Accuracy: 0.9991
- F1: 0.9991
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3
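The linear scheduler with 10% warmup ramps the learning rate from 0 up to the base rate, then decays it linearly to 0; a minimal sketch of that rule (the step counts below are illustrative, not this run's):

```python
def linear_warmup_lr(step, base_lr=1e-05, total_steps=100, warmup_steps=10):
    """Learning rate at a given optimizer step: linear warmup, then linear decay to 0."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0, total_steps - step) / max(1, total_steps - warmup_steps)

print(linear_warmup_lr(5))    # halfway through warmup → 5e-06
print(linear_warmup_lr(10))   # warmup complete → 1e-05
print(linear_warmup_lr(100))  # end of training → 0.0
```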
### Training results
| Training Loss | Epoch | Step | Accuracy | F1 | Validation Loss |
|:-------------:|:-----:|:-----:|:--------:|:------:|:---------------:|
| 0.0091 | 1.0 | 14264 | 0.9991 | 0.9991 | 0.0068 |
| 0.0023 | 2.0 | 28528 | 0.9987 | 0.9987 | 0.0073 |
| 0.0003 | 3.0 | 42792 | 0.9983 | 0.9983 | 0.0101 |
### Framework versions
- Transformers 4.28.1
- Pytorch 1.13.0+cpu
- Datasets 2.12.0
- Tokenizers 0.13.2 | 1,679 | [
[
-0.033355712890625,
-0.037109375,
0.00835418701171875,
0.0169677734375,
-0.0212249755859375,
-0.020355224609375,
-0.005374908447265625,
-0.02960205078125,
0.0009098052978515625,
0.0290985107421875,
-0.06072998046875,
-0.05169677734375,
-0.045745849609375,
-0... |
Ayouta300/bert-base-uncased-finetuned-cola | 2023-05-07T20:04:32.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | Ayouta300 | null | null | Ayouta300/bert-base-uncased-finetuned-cola | 0 | 2 | transformers | 2023-05-07T11:14:30 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.5155383069979991
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4595
- Matthews Correlation: 0.5155
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.4923 | 1.0 | 535 | 0.4595 | 0.5155 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,722 | [
[
-0.0252685546875,
-0.05303955078125,
0.01195526123046875,
0.0204010009765625,
-0.0277557373046875,
-0.0217742919921875,
-0.0190277099609375,
-0.01538848876953125,
0.026092529296875,
0.0166015625,
-0.04974365234375,
-0.03106689453125,
-0.05096435546875,
-0.02... |
elifcen/bert-pooling-based | 2023-05-07T16:40:37.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | elifcen | null | null | elifcen/bert-pooling-based | 0 | 2 | transformers | 2023-05-07T12:17:15 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-pooling-based
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.40858564179092355
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-pooling-based
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5115
- Matthews Correlation: 0.4086
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1.7718352056354854e-06
- train_batch_size: 8
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.5491 | 1.0 | 1069 | 0.5340 | 0.2513 |
| 0.4726 | 2.0 | 2138 | 0.5115 | 0.4086 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,785 | [
[
-0.033782958984375,
-0.0401611328125,
0.0126495361328125,
0.01117706298828125,
-0.0289154052734375,
-0.0279998779296875,
-0.00992584228515625,
-0.0262451171875,
0.0169525146484375,
0.0210113525390625,
-0.054443359375,
-0.02642822265625,
-0.045745849609375,
-... |
kilgaz/bert-base-uncased-finetuned-cola | 2023-05-07T20:04:44.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | kilgaz | null | null | kilgaz/bert-base-uncased-finetuned-cola | 0 | 2 | transformers | 2023-05-07T12:45:37 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.547014428196921
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4514
- Matthews Correlation: 0.5470
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.4923 | 1.0 | 535 | 0.4514 | 0.5470 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,721 | [
[
-0.0245361328125,
-0.052978515625,
0.01218414306640625,
0.0202789306640625,
-0.028289794921875,
-0.022674560546875,
-0.0191802978515625,
-0.014984130859375,
0.0254058837890625,
0.0164794921875,
-0.048736572265625,
-0.0310516357421875,
-0.05108642578125,
-0.0... |
takeshiho0531/distilbert-base-uncased-finetuned-clinc | 2023-05-07T15:51:00.000Z | [
"transformers",
"pytorch",
"tensorboard",
"distilbert",
"text-classification",
"generated_from_trainer",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | text-classification | takeshiho0531 | null | null | takeshiho0531/distilbert-base-uncased-finetuned-clinc | 0 | 2 | transformers | 2023-05-07T13:47:58 | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: distilbert-base-uncased-finetuned-clinc
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-clinc
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7720
- Accuracy: 0.9181
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 48
- eval_batch_size: 48
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 318 | 3.2887 | 0.7419 |
| 3.7868 | 2.0 | 636 | 1.8753 | 0.8371 |
| 3.7868 | 3.0 | 954 | 1.1570 | 0.8961 |
| 1.6927 | 4.0 | 1272 | 0.8573 | 0.9129 |
| 0.9056 | 5.0 | 1590 | 0.7720 | 0.9181 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Tokenizers 0.13.3
| 1,614 | [
[
-0.03521728515625,
-0.0452880859375,
0.0148468017578125,
0.0126800537109375,
-0.0274200439453125,
-0.0212249755859375,
-0.00965118408203125,
-0.006221771240234375,
0.003421783447265625,
0.02032470703125,
-0.0504150390625,
-0.046539306640625,
-0.05889892578125,
... |
ilkekas/bert-base-uncased-mean-pooling-finetuned3-cola | 2023-05-07T14:09:56.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | ilkekas | null | null | ilkekas/bert-base-uncased-mean-pooling-finetuned3-cola | 0 | 2 | transformers | 2023-05-07T14:02:57 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-mean-pooling-finetuned3-cola
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.5687360893544328
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-mean-pooling-finetuned3-cola
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8274
- Matthews Correlation: 0.5687
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.4853 | 1.0 | 535 | 0.4786 | 0.5357 |
| 0.2851 | 2.0 | 1070 | 0.5102 | 0.5598 |
| 0.1849 | 3.0 | 1605 | 0.6688 | 0.5495 |
| 0.1206 | 4.0 | 2140 | 0.8274 | 0.5687 |
| 0.0927 | 5.0 | 2675 | 0.9249 | 0.5677 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 2,046 | [
[
-0.0318603515625,
-0.04351806640625,
0.00853729248046875,
0.0133056640625,
-0.027801513671875,
-0.0234527587890625,
-0.0106353759765625,
-0.01407623291015625,
0.0230712890625,
0.0169219970703125,
-0.048553466796875,
-0.034698486328125,
-0.052978515625,
-0.02... |
ilkekas/bert-base-uncased-mean-pooling-finetuned4-cola | 2023-05-07T14:53:20.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | ilkekas | null | null | ilkekas/bert-base-uncased-mean-pooling-finetuned4-cola | 0 | 2 | transformers | 2023-05-07T14:23:34 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-mean-pooling-finetuned4-cola
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.5624066288493853
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-mean-pooling-finetuned4-cola
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5150
- Matthews Correlation: 0.5624
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3.9012058362716625e-06
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 6
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.5521 | 1.0 | 535 | 0.4933 | 0.4312 |
| 0.4184 | 2.0 | 1070 | 0.4352 | 0.5291 |
| 0.3537 | 3.0 | 1605 | 0.5243 | 0.5055 |
| 0.3048 | 4.0 | 2140 | 0.5048 | 0.5573 |
| 0.2815 | 5.0 | 2675 | 0.5150 | 0.5624 |
| 0.2498 | 6.0 | 3210 | 0.5527 | 0.5495 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 2,137 | [
[
-0.03265380859375,
-0.041351318359375,
0.0076446533203125,
0.0119171142578125,
-0.025665283203125,
-0.0211334228515625,
-0.00992584228515625,
-0.01308441162109375,
0.02679443359375,
0.0172882080078125,
-0.051239013671875,
-0.034820556640625,
-0.05078125,
-0.... |
VitaRin/ProtBert-IS | 2023-05-30T16:54:55.000Z | [
"transformers",
"pytorch",
"tf",
"bert",
"text-classification",
"endpoints_compatible",
"region:us"
] | text-classification | VitaRin | null | null | VitaRin/ProtBert-IS | 0 | 2 | transformers | 2023-05-07T14:27:18 |
## ProtBert-IS
### Model Description
ProtBert-IS is a model fine-tuned from the pre-trained ProtBert model for sequence classification. It takes a protein sequence as input and predicts whether the protein is soluble or insoluble.
ProtBert-IS has been fine-tuned on three different training datasets.
**Finetuned from model:** Rostlab/prot_bert
GitHub repository with relevant files: https://github.com/VitaRin/ProtBert-IS
## Uses
It can be used directly with a pipeline on single sequences:
```
from transformers import AutoTokenizer, AutoModelForSequenceClassification, TextClassificationPipeline
import re

pipeline = TextClassificationPipeline(
    model=AutoModelForSequenceClassification.from_pretrained("VitaRin/ProtBert-IS"),
    tokenizer=AutoTokenizer.from_pretrained("VitaRin/ProtBert-IS"),
    device=0
)

sequence = "A E T C Z A O"
sequence = re.sub(r"[UZOB]", "X", sequence)  # map rare residues to X

output = pipeline(sequence)
```
Or read multiple sequences from a .fasta file:
```
from transformers import AutoTokenizer, AutoModelForSequenceClassification, TextClassificationPipeline
import re

pipeline = TextClassificationPipeline(
    model=AutoModelForSequenceClassification.from_pretrained("VitaRin/ProtBert-IS"),
    tokenizer=AutoTokenizer.from_pretrained("VitaRin/ProtBert-IS"),
    device=0
)

with open("input.fasta", "r") as f:
    data = f.read().split(">")
data.remove(data[0])  # drop the empty chunk before the first ">"

sequences = []
for d in data:
    # strip the header line, join the sequence lines, and space-separate the residues
    d = d.split('\n', 1)[-1].replace('\n', '').replace('', ' ')
    sequences.append(d)

sequences = [re.sub(r"[UZOB]", "X", sequence) for sequence in sequences]
print(pipeline(sequences))
```
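The header-stripping and residue-spacing logic above can be sketched as a standalone helper (the function name and sample FASTA text are illustrative; `" ".join` replaces the `replace('', ' ')` trick and differs only by surrounding whitespace):

```python
import re

def preprocess_fasta(text):
    """Turn a FASTA string into space-separated amino-acid sequences,
    mapping rare residues (U, Z, O, B) to X as the ProtBert vocabulary expects."""
    sequences = []
    for record in text.split(">")[1:]:          # drop anything before the first header
        seq = record.split("\n", 1)[-1].replace("\n", "")  # strip the header line
        seq = " ".join(seq)                     # ProtBert expects space-separated residues
        sequences.append(re.sub(r"[UZOB]", "X", seq))
    return sequences

fasta = ">sp|P1\nMKTAZ\nUVL\n>sp|P2\nGGOB\n"
print(preprocess_fasta(fasta))  # ['M K T A X X V L', 'G G X X']
```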
| 1,579 | [
[
-0.0223388671875,
-0.0205841064453125,
0.0131683349609375,
0.00850677490234375,
-0.037384033203125,
0.00614166259765625,
0.0157928466796875,
0.0005860328674316406,
0.020538330078125,
0.0294342041015625,
-0.040863037109375,
-0.031982421875,
-0.0628662109375,
... |
Gursoyy/bert-base-uncased-finetuned-cola | 2023-05-08T17:52:20.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | Gursoyy | null | null | Gursoyy/bert-base-uncased-finetuned-cola | 0 | 2 | transformers | 2023-05-07T14:30:37 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.512703445942988
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5138
- Matthews Correlation: 0.5127
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 4.5654407894015775e-06
- train_batch_size: 8
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.4654 | 1.0 | 1069 | 0.5029 | 0.4588 |
| 0.3684 | 2.0 | 2138 | 0.5138 | 0.5127 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,811 | [
[
-0.02557373046875,
-0.052764892578125,
0.011444091796875,
0.019775390625,
-0.025054931640625,
-0.020843505859375,
-0.018218994140625,
-0.0149383544921875,
0.0272979736328125,
0.0176239013671875,
-0.05072021484375,
-0.030029296875,
-0.051177978515625,
-0.0214... |
p0uy4/bert-base-uncased-finetuned-cola | 2023-05-07T19:10:17.000Z | [
"transformers",
"pytorch",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | p0uy4 | null | null | p0uy4/bert-base-uncased-finetuned-cola | 0 | 2 | transformers | 2023-05-07T14:32:09 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.5214716883534575
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4742
- Matthews Correlation: 0.5215
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2.468554830415339e-05
- train_batch_size: 8
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.4392 | 1.0 | 1069 | 0.4742 | 0.5215 |
### Framework versions
- Transformers 4.12.2
- Pytorch 2.0.0+cu117
- Datasets 2.12.0
- Tokenizers 0.10.3
| 1,694 | [
[
-0.0243682861328125,
-0.053558349609375,
0.00949859619140625,
0.0205078125,
-0.0267181396484375,
-0.021453857421875,
-0.0185089111328125,
-0.0164031982421875,
0.0263824462890625,
0.0169830322265625,
-0.04986572265625,
-0.029571533203125,
-0.050048828125,
-0.... |
VitaRin/ProtBert-BFD-IS | 2023-05-30T16:56:23.000Z | [
"transformers",
"pytorch",
"tf",
"bert",
"text-classification",
"endpoints_compatible",
"region:us"
] | text-classification | VitaRin | null | null | VitaRin/ProtBert-BFD-IS | 0 | 2 | transformers | 2023-05-07T14:33:53 |
## ProtBert-BFD-IS
### Model Description
ProtBert-BFD-IS is a model fine-tuned from the pre-trained ProtBert-BFD model for sequence classification. It takes a protein sequence as input and predicts whether the protein is soluble or insoluble.
ProtBert-BFD-IS has been fine-tuned on three different training datasets.
**Finetuned from model:** Rostlab/prot_bert_bfd
GitHub repository with relevant files: https://github.com/VitaRin/ProtBert-IS
## Uses
It can be used directly with a pipeline on single sequences:
```
from transformers import AutoTokenizer, AutoModelForSequenceClassification, TextClassificationPipeline
import re

pipeline = TextClassificationPipeline(
    model=AutoModelForSequenceClassification.from_pretrained("VitaRin/ProtBert-BFD-IS"),
    tokenizer=AutoTokenizer.from_pretrained("VitaRin/ProtBert-BFD-IS"),
    device=0
)

sequence = "A E T C Z A O"
sequence = re.sub(r"[UZOB]", "X", sequence)  # map rare residues to X

output = pipeline(sequence)
```
Or read multiple sequences from a .fasta file:
```
from transformers import AutoTokenizer, AutoModelForSequenceClassification, TextClassificationPipeline
import re

pipeline = TextClassificationPipeline(
    model=AutoModelForSequenceClassification.from_pretrained("VitaRin/ProtBert-BFD-IS"),
    tokenizer=AutoTokenizer.from_pretrained("VitaRin/ProtBert-BFD-IS"),
    device=0
)

with open("input.fasta", "r") as f:
    data = f.read().split(">")
data.remove(data[0])  # drop the empty chunk before the first ">"

sequences = []
for d in data:
    # strip the header line, join the sequence lines, and space-separate the residues
    d = d.split('\n', 1)[-1].replace('\n', '').replace('', ' ')
    sequences.append(d)

sequences = [re.sub(r"[UZOB]", "X", sequence) for sequence in sequences]
print(pipeline(sequences))
```
| 1,598 | [
[
-0.032745361328125,
-0.032012939453125,
0.00835418701171875,
0.0115814208984375,
-0.0369873046875,
0.001705169677734375,
0.016815185546875,
0.0012331008911132812,
0.00952911376953125,
0.031524658203125,
-0.043670654296875,
-0.0364990234375,
-0.062286376953125,
... |
ardanil7/bert-base-uncased-finetuned-cola | 2023-05-08T18:26:38.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | ardanil7 | null | null | ardanil7/bert-base-uncased-finetuned-cola | 0 | 2 | transformers | 2023-05-07T15:14:51 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.5154424505113391
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4694
- Matthews Correlation: 0.5154
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.4671 | 1.0 | 1069 | 0.4694 | 0.5154 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,720 | [
[
-0.0262451171875,
-0.053009033203125,
0.01114654541015625,
0.0215911865234375,
-0.0278167724609375,
-0.0216217041015625,
-0.019317626953125,
-0.0147857666015625,
0.0262451171875,
0.01666259765625,
-0.05023193359375,
-0.03070068359375,
-0.05084228515625,
-0.0... |
takeshiho0531/distilbert-base-uncased-distilled-clinc | 2023-05-07T16:11:37.000Z | [
"transformers",
"pytorch",
"distilbert",
"text-classification",
"generated_from_trainer",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | text-classification | takeshiho0531 | null | null | takeshiho0531/distilbert-base-uncased-distilled-clinc | 0 | 2 | transformers | 2023-05-07T15:59:16 | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: distilbert-base-uncased-distilled-clinc
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-distilled-clinc
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0999
- Accuracy: 0.9406
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 48
- eval_batch_size: 48
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 318 | 0.5777 | 0.7348 |
| 0.7588 | 2.0 | 636 | 0.2863 | 0.8845 |
| 0.7588 | 3.0 | 954 | 0.1794 | 0.9216 |
| 0.2787        | 4.0   | 1272 | 0.1386          | 0.9300   |
| 0.1598        | 5.0   | 1590 | 0.1208          | 0.9355   |
| 0.1598        | 6.0   | 1908 | 0.1111          | 0.9400   |
| 0.1245 | 7.0 | 2226 | 0.1057 | 0.9397 |
| 0.1096 | 8.0 | 2544 | 0.1024 | 0.9410 |
| 0.1096 | 9.0 | 2862 | 0.1005 | 0.9410 |
| 0.1034 | 10.0 | 3180 | 0.0999 | 0.9406 |
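This model was distilled from a fine-tuned teacher (the card does not record the teacher checkpoint or the distillation weights). The soft-target part of a typical distillation objective can be sketched in plain Python; the temperature and example logits below are illustrative, and in practice a Trainer would mix this term with the usual cross-entropy on the hard labels:

```python
import math

def softmax(logits, T=1.0):
    """Temperature-softened softmax over a list of logits."""
    exps = [math.exp(l / T) for l in logits]
    s = sum(exps)
    return [e / s for e in exps]

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """KL(teacher || student) on temperature-softened distributions,
    scaled by T^2 as in Hinton et al.'s distillation formulation."""
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return T * T * kl

print(distillation_loss([2.0, 0.5], [2.0, 0.5]))      # 0.0: student matches teacher
print(distillation_loss([2.0, 0.5], [0.5, 2.0]) > 0)  # True: divergence is penalized
```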
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Tokenizers 0.13.3
| 1,925 | [
[
-0.034637451171875,
-0.04132080078125,
0.017181396484375,
0.01038360595703125,
-0.0240020751953125,
-0.01464080810546875,
-0.004749298095703125,
-0.0029888153076171875,
0.01056671142578125,
0.0196075439453125,
-0.047332763671875,
-0.0472412109375,
-0.06161499023... |
tKah/Textclassification-Bert | 2023-05-17T03:01:48.000Z | [
"transformers",
"tf",
"bert",
"text-classification",
"generated_from_keras_callback",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | text-classification | tKah | null | null | tKah/Textclassification-Bert | 0 | 2 | transformers | 2023-05-07T16:39:11 | ---
license: apache-2.0
tags:
- generated_from_keras_callback
model-index:
- name: Textclassification-Bert
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# Textclassification-Bert
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.1439
- Validation Loss: 0.5583
- Train Matthews Correlation: 0.5803
- Epoch: 2
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'Adam', 'weight_decay': None, 'clipnorm': None, 'global_clipnorm': None, 'clipvalue': None, 'use_ema': False, 'ema_momentum': 0.99, 'ema_overwrite_frequency': None, 'jit_compile': True, 'is_legacy_optimizer': False, 'learning_rate': {'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 2e-05, 'decay_steps': 1602, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}
- training_precision: float32
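With `power: 1.0` and `cycle: False`, the PolynomialDecay schedule in the optimizer config above is simply a straight line from the initial learning rate to `end_learning_rate` over `decay_steps`, held constant afterwards. A plain-Python sketch of the same formula:

```python
def polynomial_decay(step, initial_lr, decay_steps, end_lr=0.0, power=1.0):
    """Keras-style PolynomialDecay with cycle=False: interpolate from
    initial_lr to end_lr over decay_steps, then hold end_lr. power=1.0
    makes the decay linear."""
    step = min(step, decay_steps)
    frac = (1 - step / decay_steps) ** power
    return (initial_lr - end_lr) * frac + end_lr

print(polynomial_decay(0, 2e-05, 1602))     # initial rate at step 0
print(polynomial_decay(801, 2e-05, 1602))   # half the initial rate at the midpoint
print(polynomial_decay(1602, 2e-05, 1602))  # 0.0 once decay_steps is reached
```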
### Training results
| Train Loss | Validation Loss | Train Matthews Correlation | Epoch |
|:----------:|:---------------:|:--------------------------:|:-----:|
| 0.4792 | 0.4276 | 0.5446 | 0 |
| 0.2664 | 0.4445 | 0.5602 | 1 |
| 0.1439 | 0.5583 | 0.5803 | 2 |
### Framework versions
- Transformers 4.29.2
- TensorFlow 2.12.0
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,885 | [
[
-0.03955078125,
-0.038970947265625,
0.02667236328125,
0.00411224365234375,
-0.0293121337890625,
-0.019683837890625,
-0.0202178955078125,
-0.0219268798828125,
0.0142974853515625,
0.00395965576171875,
-0.0501708984375,
-0.050537109375,
-0.0562744140625,
-0.024... |
OvgumSezen/bert-base-uncased-finetuned-cola | 2023-05-07T18:56:16.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | OvgumSezen | null | null | OvgumSezen/bert-base-uncased-finetuned-cola | 0 | 2 | transformers | 2023-05-07T16:55:31 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.5885471185335819
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7949
- Matthews Correlation: 0.5885
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 4
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.4879 | 1.0 | 535 | 0.5019 | 0.5089 |
| 0.2878 | 2.0 | 1070 | 0.4687 | 0.5708 |
| 0.1849 | 3.0 | 1605 | 0.6457 | 0.5685 |
| 0.1323 | 4.0 | 2140 | 0.7949 | 0.5885 |
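The Matthews correlation reported above is computed from the binary confusion matrix of acceptability predictions; it ranges from -1 (total disagreement) through 0 (chance level) to +1 (perfect prediction), which makes it robust to CoLA's class imbalance. A minimal sketch of the formula (the counts below are illustrative):

```python
def matthews_corrcoef(tp, tn, fp, fn):
    """Matthews correlation coefficient from binary confusion-matrix counts."""
    num = tp * tn - fp * fn
    den = ((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn)) ** 0.5
    return num / den if den else 0.0  # the degenerate all-zero case is defined as 0

print(matthews_corrcoef(tp=90, tn=10, fp=0, fn=0))  # 1.0: perfect prediction
print(matthews_corrcoef(tp=0, tn=0, fp=5, fn=5))    # -1.0: total disagreement
```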
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,944 | [
[
-0.026824951171875,
-0.049957275390625,
0.0095367431640625,
0.0172882080078125,
-0.0233917236328125,
-0.020355224609375,
-0.0167694091796875,
-0.013153076171875,
0.0260772705078125,
0.0166168212890625,
-0.050537109375,
-0.033203125,
-0.0528564453125,
-0.0205... |
uraskargi/bert-base-cased-fine-tuned-4 | 2023-05-07T18:31:24.000Z | [
"transformers",
"tf",
"bert",
"text-classification",
"generated_from_keras_callback",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | text-classification | uraskargi | null | null | uraskargi/bert-base-cased-fine-tuned-4 | 0 | 2 | transformers | 2023-05-07T18:28:53 | ---
license: apache-2.0
tags:
- generated_from_keras_callback
model-index:
- name: uraskargi/bert-base-cased-fine-tuned-4
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# uraskargi/bert-base-cased-fine-tuned-4
This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.1922
- Train Accuracy: 0.9310
- Validation Loss: 0.5247
- Validation Accuracy: 0.8303
- Train Matthews Correlation: 0.5830
- Epoch: 4
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'Adam', 'weight_decay': None, 'clipnorm': None, 'global_clipnorm': None, 'clipvalue': None, 'use_ema': False, 'ema_momentum': 0.99, 'ema_overwrite_frequency': None, 'jit_compile': True, 'is_legacy_optimizer': False, 'learning_rate': {'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 9.858432402113778e-06, 'decay_steps': 665, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Validation Loss | Validation Accuracy | Train Matthews Correlation | Epoch |
|:----------:|:--------------:|:---------------:|:-------------------:|:--------------------------:|:-----:|
| 0.6040 | 0.7007 | 0.5308 | 0.7191 | 0.2443 | 0 |
| 0.4246 | 0.8114 | 0.4163 | 0.8188 | 0.5525 | 1 |
| 0.2897 | 0.8848 | 0.5054 | 0.8121 | 0.5343 | 2 |
| 0.2224 | 0.9146 | 0.4868 | 0.8274 | 0.5754 | 3 |
| 0.1922 | 0.9310 | 0.5247 | 0.8303 | 0.5830 | 4 |
### Framework versions
- Transformers 4.28.1
- TensorFlow 2.12.0
- Datasets 2.12.0
- Tokenizers 0.13.3
| 2,394 | [
[
-0.048187255859375,
-0.04486083984375,
0.024505615234375,
0.002658843994140625,
-0.02618408203125,
-0.019195556640625,
-0.01329803466796875,
-0.0120391845703125,
0.01617431640625,
0.01788330078125,
-0.05108642578125,
-0.055023193359375,
-0.049224853515625,
-... |
hakankara/bert-base-uncased-finetuned-cola-v4 | 2023-05-07T20:09:40.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | hakankara | null | null | hakankara/bert-base-uncased-finetuned-cola-v4 | 0 | 2 | transformers | 2023-05-07T19:25:07 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola-v4
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.5520922661403441
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola-v4
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4868
- Matthews Correlation: 0.5521
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3.094072228622811e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.5115 | 1.0 | 535 | 0.5101 | 0.4999 |
| 0.2886 | 2.0 | 1070 | 0.4868 | 0.5521 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,818 | [
[
-0.0245819091796875,
-0.046478271484375,
0.0134124755859375,
0.01983642578125,
-0.0250244140625,
-0.0182952880859375,
-0.01520538330078125,
-0.01525115966796875,
0.026123046875,
0.0156402587890625,
-0.048675537109375,
-0.02880859375,
-0.050811767578125,
-0.0... |
p0uy4/MeanPoolingBert-finetuned-cola | 2023-05-08T01:40:03.000Z | [
"transformers",
"pytorch",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | p0uy4 | null | null | p0uy4/MeanPoolingBert-finetuned-cola | 0 | 2 | transformers | 2023-05-07T19:53:49 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: MeanPoolingBert-finetuned-cola
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.4340990431285672
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# MeanPoolingBert-finetuned-cola
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4949
- Matthews Correlation: 0.4341
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1.018367046954782e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| No log | 1.0 | 268 | 0.4949 | 0.4341 |
### Framework versions
- Transformers 4.12.2
- Pytorch 2.0.0+cu117
- Datasets 2.12.0
- Tokenizers 0.10.3
| 1,691 | [
[
-0.02734375,
-0.04620361328125,
0.00739288330078125,
0.01099395751953125,
-0.035247802734375,
-0.025115966796875,
-0.00901031494140625,
-0.01457977294921875,
0.0330810546875,
0.0146484375,
-0.055633544921875,
-0.027984619140625,
-0.053680419921875,
-0.019607... |
Hasimcan/bert-base-uncased-finetuned-cola | 2023-05-07T20:18:48.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | Hasimcan | null | null | Hasimcan/bert-base-uncased-finetuned-cola | 0 | 2 | transformers | 2023-05-07T19:59:10 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.4965380296929026
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4656
- Matthews Correlation: 0.4965
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.4965 | 1.0 | 535 | 0.4656 | 0.4965 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,722 | [
[
-0.0247039794921875,
-0.052764892578125,
0.01204681396484375,
0.02020263671875,
-0.0279541015625,
-0.0222015380859375,
-0.0192718505859375,
-0.015289306640625,
0.0257568359375,
0.0160064697265625,
-0.048614501953125,
-0.031005859375,
-0.050811767578125,
-0.0... |
OvgumSezen/bert-base-uncased-common-finetuned-cola | 2023-05-07T20:46:38.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | OvgumSezen | null | null | OvgumSezen/bert-base-uncased-common-finetuned-cola | 0 | 2 | transformers | 2023-05-07T20:03:34 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-common-finetuned-cola
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.5521390429003941
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-common-finetuned-cola
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6196
- Matthews Correlation: 0.5521
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3.0535648029673025e-05
- train_batch_size: 4
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.5514 | 1.0 | 2138 | 0.6196 | 0.5521 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,752 | [
[
-0.0250244140625,
-0.053680419921875,
0.01081085205078125,
0.02178955078125,
-0.02838134765625,
-0.0219573974609375,
-0.0191497802734375,
-0.015960693359375,
0.028076171875,
0.0164031982421875,
-0.049041748046875,
-0.030975341796875,
-0.051544189453125,
-0.0... |
ATA05/bert-base-uncased-finetuned-cola | 2023-05-07T20:35:52.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | ATA05 | null | null | ATA05/bert-base-uncased-finetuned-cola | 0 | 2 | transformers | 2023-05-07T20:11:47 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-finetuned-cola
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.5099519351292859
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4584
- Matthews Correlation: 0.5100
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.4985 | 1.0 | 535 | 0.4584 | 0.5100 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
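The Matthews correlation values reported by these CoLA cards can be reproduced from a confusion matrix. A minimal sketch, assuming binary labels (the `mcc` helper below is hypothetical, not part of any card's code):

```python
import math

def mcc(tp, tn, fp, fn):
    """Matthews correlation coefficient from confusion-matrix counts."""
    num = tp * tn - fp * fn
    den = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    # Degenerate matrices (a zero row or column) are conventionally scored 0.
    return num / den if den else 0.0

print(mcc(50, 50, 0, 0))    # perfect predictions -> 1.0
print(mcc(25, 25, 25, 25))  # chance level -> 0.0
```

MCC is the standard CoLA metric because the dataset is class-imbalanced, so plain accuracy rewards always predicting "acceptable".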
| 1,722 | [
[
-0.02520751953125,
-0.053253173828125,
0.01175689697265625,
0.020751953125,
-0.0272674560546875,
-0.0224456787109375,
-0.0189666748046875,
-0.0149993896484375,
0.0259246826171875,
0.0168914794921875,
-0.050018310546875,
-0.0312347412109375,
-0.05084228515625,
... |
berkozcelik/bert-base-uncased-lrc-finetuned-cola | 2023-05-07T20:37:45.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | berkozcelik | null | null | berkozcelik/bert-base-uncased-lrc-finetuned-cola | 0 | 2 | transformers | 2023-05-07T20:18:39 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-lrc-finetuned-cola
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.5729657494988228
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-lrc-finetuned-cola
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.9583
- Matthews Correlation: 0.5730
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.0638 | 1.0 | 535 | 0.9583 | 0.5730 |
| 0.0486 | 2.0 | 1070 | 1.1459 | 0.5496 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
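The 535 steps per epoch in the table above follow from the batch size and the CoLA training-split size (8,551 sentences — a GLUE benchmark figure, not stated in the card itself). A quick sanity check:

```python
import math

def steps_per_epoch(num_examples, batch_size):
    # One optimizer step per batch; a final partial batch still counts.
    return math.ceil(num_examples / batch_size)

print(steps_per_epoch(8551, 16))  # 535 — matches the batch-size-16 runs
print(steps_per_epoch(8551, 8))   # 1069 — matches the batch-size-8 run below
```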
| 1,804 | [
[
-0.0267181396484375,
-0.051239013671875,
0.012451171875,
0.0168914794921875,
-0.0265655517578125,
-0.02130126953125,
-0.020782470703125,
-0.01568603515625,
0.0250701904296875,
0.0162811279296875,
-0.050811767578125,
-0.03155517578125,
-0.04962158203125,
-0.0... |
berkozcelik/bert-base-uncased-bs-finetuned-cola | 2023-05-07T20:47:12.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | berkozcelik | null | null | berkozcelik/bert-base-uncased-bs-finetuned-cola | 0 | 2 | transformers | 2023-05-07T20:43:20 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-bs-finetuned-cola
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.5609903802347734
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-bs-finetuned-cola
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 1.1887
- Matthews Correlation: 0.5610
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.1909 | 1.0 | 1069 | 0.8341 | 0.5565 |
| 0.0898 | 2.0 | 2138 | 1.1887 | 0.5610 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
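All of these runs use `lr_scheduler_type: linear`, which decays the learning rate from its initial value to zero over the total number of training steps. A rough sketch of that schedule, assuming no warmup (none of these cards mentions warmup steps):

```python
def linear_lr(step, total_steps, base_lr=2e-05):
    # Linear decay from base_lr at step 0 down to 0 at total_steps.
    return base_lr * max(0.0, 1.0 - step / total_steps)

# For the 2-epoch, batch-size-8 run above: 2 * 1069 = 2138 total steps.
print(linear_lr(0, 2138))     # 2e-05, the configured learning_rate
print(linear_lr(1069, 2138))  # 1e-05, halfway through training
print(linear_lr(2138, 2138))  # 0.0 at the final step
```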
| 1,800 | [
[
-0.024810791015625,
-0.05133056640625,
0.012664794921875,
0.0207672119140625,
-0.0258636474609375,
-0.023162841796875,
-0.0169830322265625,
-0.01507568359375,
0.0247955322265625,
0.0156707763671875,
-0.04986572265625,
-0.0321044921875,
-0.05059814453125,
-0.... |
AskingAlex/exist-2023-task2 | 2023-05-08T07:02:04.000Z | [
"transformers",
"pytorch",
"bert",
"text-classification",
"generated_from_trainer",
"license:mit",
"endpoints_compatible",
"region:us"
] | text-classification | AskingAlex | null | null | AskingAlex/exist-2023-task2 | 0 | 2 | transformers | 2023-05-07T20:47:01 | ---
license: mit
tags:
- generated_from_trainer
metrics:
- f1
model-index:
- name: exist-2023-task2
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# exist-2023-task2
This model is a fine-tuned version of [microsoft/Multilingual-MiniLM-L12-H384](https://huggingface.co/microsoft/Multilingual-MiniLM-L12-H384) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4756
- F1: 0.7027
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
### Training results
| Training Loss | Epoch | Step | Validation Loss | F1 |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| No log | 1.0 | 97 | 1.0175 | 0.4991 |
| No log | 2.0 | 194 | 0.8374 | 0.5695 |
| No log | 3.0 | 291 | 0.7967 | 0.5876 |
| No log | 4.0 | 388 | 0.7797 | 0.5982 |
| No log | 5.0 | 485 | 0.7161 | 0.6424 |
| 0.8645 | 6.0 | 582 | 0.6662 | 0.6302 |
| 0.8645 | 7.0 | 679 | 0.6580 | 0.6385 |
| 0.8645 | 8.0 | 776 | 0.6465 | 0.6491 |
| 0.8645 | 9.0 | 873 | 0.8620 | 0.5650 |
| 0.8645 | 10.0 | 970 | 0.5704 | 0.6852 |
| 0.6764 | 11.0 | 1067 | 0.5434 | 0.6806 |
| 0.6764 | 12.0 | 1164 | 0.7109 | 0.6192 |
| 0.6764 | 13.0 | 1261 | 0.5411 | 0.6708 |
| 0.6764 | 14.0 | 1358 | 0.5557 | 0.6675 |
| 0.6764 | 15.0 | 1455 | 0.5483 | 0.6701 |
| 0.56 | 16.0 | 1552 | 0.5155 | 0.6817 |
| 0.56 | 17.0 | 1649 | 0.5375 | 0.6750 |
| 0.56 | 18.0 | 1746 | 0.4858 | 0.6984 |
| 0.56 | 19.0 | 1843 | 0.4571 | 0.7091 |
| 0.56 | 20.0 | 1940 | 0.4756 | 0.7027 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.1+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
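With 20 epochs and a non-monotonic F1, the final checkpoint is not necessarily the best one: the run above peaks at epoch 19 (F1 0.7091) before dipping at epoch 20. Picking the best epoch from such a log is a one-liner (scores excerpted here, not the full table):

```python
# F1 per epoch, excerpted from the training log above.
f1_by_epoch = {1: 0.4991, 18: 0.6984, 19: 0.7091, 20: 0.7027}

best_epoch = max(f1_by_epoch, key=f1_by_epoch.get)
print(best_epoch, f1_by_epoch[best_epoch])  # 19 0.7091 — beats the final epoch
```

In practice this is what `load_best_model_at_end` in the Transformers `Trainer` automates, though this card does not say whether that option was used.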
| 2,482 | [
[
-0.03765869140625,
-0.036285400390625,
0.00960540771484375,
0.0008130073547363281,
-0.00641632080078125,
-0.01441192626953125,
-0.0020732879638671875,
-0.010406494140625,
0.0251922607421875,
0.0167388916015625,
-0.059173583984375,
-0.045501708984375,
-0.04190063... |
guoluo/Bert_class_dropout_point2_1e-08 | 2023-05-07T22:18:04.000Z | [
"transformers",
"tf",
"bert",
"text-classification",
"generated_from_keras_callback",
"endpoints_compatible",
"region:us"
] | text-classification | guoluo | null | null | guoluo/Bert_class_dropout_point2_1e-08 | 0 | 2 | transformers | 2023-05-07T22:17:20 | ---
tags:
- generated_from_keras_callback
model-index:
- name: Bert_class_1e-08
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# Bert_class_1e-08
This model is a fine-tuned version of [guoluo/Bert_1.5e_07](https://huggingface.co/guoluo/Bert_1.5e_07) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.6953
- Train Accuracy: 0.7459
- Validation Loss: 0.8537
- Validation Accuracy: 0.7324
- Train Lr: 9.23136e-09
- Epoch: 3999
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'Adam', 'learning_rate': 9.23136e-09, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Validation Loss | Validation Accuracy | Train Lr | Epoch |
|:----------:|:--------------:|:---------------:|:-------------------:|:-------------:|:-----:|
| 1.4761 | 0.1294 | 1.4867 | 0.1056 | 1e-08 | 0 |
| 1.4716 | 0.1388 | 1.4809 | 0.1056 | 1e-08 | 1 |
| 1.4629 | 0.1576 | 1.4752 | 0.1056 | 1e-08 | 2 |
| 1.4593 | 0.1600 | 1.4696 | 0.1056 | 1e-08 | 3 |
| 1.4507 | 0.1647 | 1.4639 | 0.1197 | 1e-08 | 4 |
| 1.4551 | 0.1694 | 1.4584 | 0.1268 | 9.999999e-09 | 5 |
| 1.4377 | 0.1976 | 1.4530 | 0.1268 | 9.999998e-09 | 6 |
| 1.4515 | 0.1765 | 1.4477 | 0.1338 | 9.999997e-09 | 7 |
| 1.4423 | 0.1788 | 1.4421 | 0.1408 | 9.999996e-09 | 8 |
| 1.4282 | 0.1953 | 1.4367 | 0.1479 | 9.9999955e-09 | 9 |
| 1.4392 | 0.1506 | 1.4315 | 0.1690 | 9.999995e-09 | 10 |
| 1.4098 | 0.2494 | 1.4262 | 0.1690 | 9.999994e-09 | 11 |
| 1.4103 | 0.2447 | 1.4209 | 0.1972 | 9.999993e-09 | 12 |
| 1.4180 | 0.2259 | 1.4160 | 0.2183 | 9.999992e-09 | 13 |
| 1.4064 | 0.2188 | 1.4107 | 0.2394 | 9.99999e-09 | 14 |
| 1.4012 | 0.2400 | 1.4057 | 0.2606 | 9.999988e-09 | 15 |
| 1.3936 | 0.2729 | 1.4006 | 0.2887 | 9.999987e-09 | 16 |
| 1.3988 | 0.2682 | 1.3956 | 0.2958 | 9.999985e-09 | 17 |
| 1.3870 | 0.2824 | 1.3906 | 0.3169 | 9.999983e-09 | 18 |
| 1.3891 | 0.2847 | 1.3856 | 0.3310 | 9.999981e-09 | 19 |
| 1.3868 | 0.2588 | 1.3807 | 0.3310 | 9.9999795e-09 | 20 |
| 1.3873 | 0.2753 | 1.3757 | 0.3592 | 9.999978e-09 | 21 |
| 1.3668 | 0.2965 | 1.3707 | 0.3732 | 9.999976e-09 | 22 |
| 1.3692 | 0.3129 | 1.3657 | 0.3803 | 9.999973e-09 | 23 |
| 1.3771 | 0.3106 | 1.3608 | 0.3873 | 9.999971e-09 | 24 |
| 1.3719 | 0.3294 | 1.3561 | 0.4085 | 9.999968e-09 | 25 |
| 1.3529 | 0.4000 | 1.3512 | 0.4225 | 9.999965e-09 | 26 |
| 1.3613 | 0.3624 | 1.3467 | 0.4225 | 9.999963e-09 | 27 |
| 1.3388 | 0.3976 | 1.3421 | 0.4366 | 9.99996e-09 | 28 |
| 1.3422 | 0.3600 | 1.3377 | 0.4437 | 9.999957e-09 | 29 |
| 1.3398 | 0.3788 | 1.3330 | 0.4437 | 9.999955e-09 | 30 |
| 1.3454 | 0.3812 | 1.3285 | 0.4648 | 9.999952e-09 | 31 |
| 1.3416 | 0.3741 | 1.3241 | 0.4859 | 9.999948e-09 | 32 |
| 1.3457 | 0.3788 | 1.3196 | 0.4930 | 9.999945e-09 | 33 |
| 1.3383 | 0.4165 | 1.3152 | 0.5070 | 9.999941e-09 | 34 |
| 1.3169 | 0.4753 | 1.3108 | 0.5141 | 9.999938e-09 | 35 |
| 1.3286 | 0.4353 | 1.3066 | 0.5211 | 9.999934e-09 | 36 |
| 1.3141 | 0.4376 | 1.3023 | 0.5423 | 9.999931e-09 | 37 |
| 1.3238 | 0.4635 | 1.2984 | 0.5423 | 9.999927e-09 | 38 |
| 1.3031 | 0.4871 | 1.2945 | 0.5423 | 9.9999236e-09 | 39 |
| 1.3017 | 0.5082 | 1.2903 | 0.5634 | 9.999919e-09 | 40 |
| 1.2915 | 0.5271 | 1.2862 | 0.5845 | 9.999915e-09 | 41 |
| 1.3075 | 0.4729 | 1.2822 | 0.5845 | 9.99991e-09 | 42 |
| 1.2896 | 0.5129 | 1.2780 | 0.5845 | 9.999906e-09 | 43 |
| 1.2864 | 0.5224 | 1.2742 | 0.5986 | 9.999901e-09 | 44 |
| 1.2962 | 0.5153 | 1.2703 | 0.5915 | 9.999897e-09 | 45 |
| 1.2854 | 0.4918 | 1.2665 | 0.6127 | 9.9998925e-09 | 46 |
| 1.2776 | 0.5318 | 1.2625 | 0.6127 | 9.999888e-09 | 47 |
| 1.2619 | 0.5553 | 1.2588 | 0.6268 | 9.999884e-09 | 48 |
| 1.2648 | 0.5388 | 1.2551 | 0.6268 | 9.999878e-09 | 49 |
| 1.2735 | 0.5482 | 1.2514 | 0.6338 | 9.999873e-09 | 50 |
| 1.2713 | 0.5341 | 1.2478 | 0.6338 | 9.999868e-09 | 51 |
| 1.2612 | 0.5671 | 1.2442 | 0.6408 | 9.999862e-09 | 52 |
| 1.2532 | 0.6000 | 1.2406 | 0.6479 | 9.999857e-09 | 53 |
| 1.2714 | 0.5718 | 1.2369 | 0.6479 | 9.999852e-09 | 54 |
| 1.2505 | 0.6094 | 1.2333 | 0.6620 | 9.999846e-09 | 55 |
| 1.2510 | 0.5976 | 1.2298 | 0.6549 | 9.999841e-09 | 56 |
| 1.2475 | 0.6024 | 1.2263 | 0.6549 | 9.999836e-09 | 57 |
| 1.2411 | 0.6047 | 1.2228 | 0.6549 | 9.999829e-09 | 58 |
| 1.2428 | 0.5953 | 1.2194 | 0.6620 | 9.999823e-09 | 59 |
| 1.2362 | 0.6165 | 1.2161 | 0.6620 | 9.999817e-09 | 60 |
| 1.2334 | 0.6212 | 1.2128 | 0.6620 | 9.999811e-09 | 61 |
| 1.2281 | 0.6094 | 1.2096 | 0.6690 | 9.9998045e-09 | 62 |
| 1.2375 | 0.6259 | 1.2064 | 0.6690 | 9.999798e-09 | 63 |
| 1.2283 | 0.6235 | 1.2032 | 0.6690 | 9.999792e-09 | 64 |
| 1.2185 | 0.6424 | 1.2000 | 0.6761 | 9.999786e-09 | 65 |
| 1.2151 | 0.6212 | 1.1968 | 0.6761 | 9.99978e-09 | 66 |
| 1.2140 | 0.6376 | 1.1938 | 0.6761 | 9.999773e-09 | 67 |
| 1.2177 | 0.6329 | 1.1906 | 0.6761 | 9.9997655e-09 | 68 |
| 1.2142 | 0.6518 | 1.1875 | 0.6761 | 9.999758e-09 | 69 |
| 1.2051 | 0.6541 | 1.1844 | 0.6761 | 9.999751e-09 | 70 |
| 1.2120 | 0.6376 | 1.1814 | 0.6761 | 9.999744e-09 | 71 |
| 1.2027 | 0.6494 | 1.1784 | 0.6761 | 9.999737e-09 | 72 |
| 1.1968 | 0.6776 | 1.1755 | 0.6761 | 9.99973e-09 | 73 |
| 1.1915 | 0.6518 | 1.1727 | 0.6761 | 9.999723e-09 | 74 |
| 1.1874 | 0.6494 | 1.1698 | 0.6761 | 9.999716e-09 | 75 |
| 1.1905 | 0.6588 | 1.1670 | 0.6761 | 9.999708e-09 | 76 |
| 1.1880 | 0.6729 | 1.1643 | 0.6761 | 9.9997e-09 | 77 |
| 1.1827 | 0.6706 | 1.1616 | 0.6761 | 9.999692e-09 | 78 |
| 1.1842 | 0.6659 | 1.1588 | 0.6761 | 9.999684e-09 | 79 |
| 1.1803 | 0.6635 | 1.1561 | 0.6761 | 9.999676e-09 | 80 |
| 1.1788 | 0.6659 | 1.1535 | 0.6761 | 9.999668e-09 | 81 |
| 1.1718 | 0.6612 | 1.1510 | 0.6761 | 9.99966e-09 | 82 |
| 1.1776 | 0.6682 | 1.1482 | 0.6761 | 9.999652e-09 | 83 |
| 1.1745 | 0.6612 | 1.1456 | 0.6761 | 9.999644e-09 | 84 |
| 1.1608 | 0.6635 | 1.1432 | 0.6761 | 9.999635e-09 | 85 |
| 1.1649 | 0.6635 | 1.1408 | 0.6761 | 9.999626e-09 | 86 |
| 1.1525 | 0.6682 | 1.1383 | 0.6761 | 9.999617e-09 | 87 |
| 1.1691 | 0.6612 | 1.1358 | 0.6761 | 9.999608e-09 | 88 |
| 1.1659 | 0.6682 | 1.1334 | 0.6761 | 9.999599e-09 | 89 |
| 1.1451 | 0.6753 | 1.1311 | 0.6761 | 9.9995905e-09 | 90 |
| 1.1425 | 0.6659 | 1.1287 | 0.6761 | 9.999582e-09 | 91 |
| 1.1598 | 0.6635 | 1.1263 | 0.6761 | 9.999573e-09 | 92 |
| 1.1498 | 0.6729 | 1.1241 | 0.6761 | 9.999564e-09 | 93 |
| 1.1453 | 0.6706 | 1.1218 | 0.6761 | 9.999554e-09 | 94 |
| 1.1482 | 0.6682 | 1.1195 | 0.6761 | 9.999544e-09 | 95 |
| 1.1368 | 0.6753 | 1.1174 | 0.6761 | 9.9995345e-09 | 96 |
| 1.1397 | 0.6729 | 1.1153 | 0.6761 | 9.999525e-09 | 97 |
| 1.1377 | 0.6729 | 1.1132 | 0.6761 | 9.999515e-09 | 98 |
| 1.1389 | 0.6706 | 1.1110 | 0.6761 | 9.999505e-09 | 99 |
| 1.1307 | 0.6753 | 1.1090 | 0.6761 | 9.9994955e-09 | 100 |
| 1.1364 | 0.6729 | 1.1069 | 0.6761 | 9.999486e-09 | 101 |
| 1.1343 | 0.6753 | 1.1049 | 0.6761 | 9.999476e-09 | 102 |
| 1.1299 | 0.6706 | 1.1028 | 0.6761 | 9.999465e-09 | 103 |
| 1.1335 | 0.6753 | 1.1009 | 0.6761 | 9.999455e-09 | 104 |
| 1.1276 | 0.6776 | 1.0989 | 0.6761 | 9.999444e-09 | 105 |
| 1.1208 | 0.6776 | 1.0970 | 0.6761 | 9.999433e-09 | 106 |
| 1.1197 | 0.6753 | 1.0950 | 0.6761 | 9.999423e-09 | 107 |
| 1.1089 | 0.6776 | 1.0932 | 0.6761 | 9.999412e-09 | 108 |
| 1.1147 | 0.6824 | 1.0913 | 0.6761 | 9.999401e-09 | 109 |
| 1.1235 | 0.6776 | 1.0894 | 0.6761 | 9.999391e-09 | 110 |
| 1.1070 | 0.6776 | 1.0877 | 0.6761 | 9.99938e-09 | 111 |
| 1.1120 | 0.6729 | 1.0861 | 0.6761 | 9.9993684e-09 | 112 |
| 1.1162 | 0.6776 | 1.0843 | 0.6761 | 9.999357e-09 | 113 |
| 1.1038 | 0.6776 | 1.0826 | 0.6761 | 9.999345e-09 | 114 |
| 1.1041 | 0.6776 | 1.0808 | 0.6761 | 9.999334e-09 | 115 |
| 1.0974 | 0.6753 | 1.0791 | 0.6761 | 9.999322e-09 | 116 |
| 1.1025 | 0.6776 | 1.0775 | 0.6761 | 9.999311e-09 | 117 |
| 1.1008 | 0.6776 | 1.0759 | 0.6761 | 9.999299e-09 | 118 |
| 1.0958 | 0.6776 | 1.0741 | 0.6761 | 9.999288e-09 | 119 |
| 1.1005 | 0.6753 | 1.0725 | 0.6761 | 9.999275e-09 | 120 |
| 1.1051 | 0.6776 | 1.0709 | 0.6761 | 9.999263e-09 | 121 |
| 1.0817 | 0.6753 | 1.0693 | 0.6761 | 9.99925e-09 | 122 |
| 1.0924 | 0.6753 | 1.0679 | 0.6761 | 9.999238e-09 | 123 |
| 1.0938 | 0.6776 | 1.0662 | 0.6761 | 9.9992254e-09 | 124 |
| 1.0981 | 0.6776 | 1.0647 | 0.6761 | 9.999213e-09 | 125 |
| 1.0817 | 0.6776 | 1.0632 | 0.6761 | 9.999201e-09 | 126 |
| 1.0869 | 0.6776 | 1.0618 | 0.6761 | 9.999188e-09 | 127 |
| 1.0790 | 0.6776 | 1.0603 | 0.6761 | 9.999176e-09 | 128 |
| 1.0847 | 0.6776 | 1.0589 | 0.6761 | 9.999162e-09 | 129 |
| 1.0836 | 0.6776 | 1.0576 | 0.6761 | 9.999149e-09 | 130 |
| 1.0804 | 0.6776 | 1.0562 | 0.6761 | 9.999136e-09 | 131 |
| 1.0722 | 0.6776 | 1.0549 | 0.6761 | 9.999122e-09 | 132 |
| 1.0784 | 0.6776 | 1.0535 | 0.6761 | 9.999109e-09 | 133 |
| 1.0783 | 0.6776 | 1.0521 | 0.6761 | 9.999096e-09 | 134 |
| 1.0666 | 0.6776 | 1.0509 | 0.6761 | 9.9990825e-09 | 135 |
| 1.0669 | 0.6776 | 1.0497 | 0.6761 | 9.999069e-09 | 136 |
| 1.0718 | 0.6776 | 1.0483 | 0.6761 | 9.999056e-09 | 137 |
| 1.0702 | 0.6776 | 1.0471 | 0.6761 | 9.999042e-09 | 138 |
| 1.0801 | 0.6776 | 1.0459 | 0.6761 | 9.999027e-09 | 139 |
| 1.0798 | 0.6776 | 1.0447 | 0.6761 | 9.999013e-09 | 140 |
| 1.0620 | 0.6776 | 1.0435 | 0.6761 | 9.998999e-09 | 141 |
| 1.0700 | 0.6776 | 1.0424 | 0.6761 | 9.998985e-09 | 142 |
| 1.0657 | 0.6776 | 1.0412 | 0.6761 | 9.9989705e-09 | 143 |
| 1.0596 | 0.6776 | 1.0401 | 0.6761 | 9.998956e-09 | 144 |
| 1.0607 | 0.6776 | 1.0389 | 0.6761 | 9.998942e-09 | 145 |
| 1.0593 | 0.6776 | 1.0378 | 0.6761 | 9.998928e-09 | 146 |
| 1.0533 | 0.6776 | 1.0368 | 0.6761 | 9.998913e-09 | 147 |
| 1.0663 | 0.6776 | 1.0357 | 0.6761 | 9.998898e-09 | 148 |
| 1.0521 | 0.6776 | 1.0347 | 0.6761 | 9.998883e-09 | 149 |
| 1.0610 | 0.6776 | 1.0336 | 0.6761 | 9.9988675e-09 | 150 |
| 1.0571 | 0.6776 | 1.0325 | 0.6761 | 9.998852e-09 | 151 |
| 1.0517 | 0.6776 | 1.0315 | 0.6761 | 9.998837e-09 | 152 |
| 1.0563 | 0.6776 | 1.0305 | 0.6761 | 9.998822e-09 | 153 |
| 1.0491 | 0.6776 | 1.0295 | 0.6761 | 9.998807e-09 | 154 |
| 1.0537 | 0.6776 | 1.0286 | 0.6761 | 9.998792e-09 | 155 |
| 1.0483 | 0.6776 | 1.0277 | 0.6761 | 9.998776e-09 | 156 |
| 1.0554 | 0.6776 | 1.0267 | 0.6761 | 9.99876e-09 | 157 |
| 1.0520 | 0.6776 | 1.0258 | 0.6761 | 9.998744e-09 | 158 |
| 1.0507 | 0.6776 | 1.0249 | 0.6761 | 9.998728e-09 | 159 |
| 1.0490 | 0.6776 | 1.0239 | 0.6761 | 9.998712e-09 | 160 |
| 1.0464 | 0.6776 | 1.0231 | 0.6761 | 9.998696e-09 | 161 |
| 1.0438 | 0.6776 | 1.0222 | 0.6761 | 9.99868e-09 | 162 |
| 1.0417 | 0.6776 | 1.0213 | 0.6761 | 9.998664e-09 | 163 |
| 1.0340 | 0.6776 | 1.0205 | 0.6761 | 9.998648e-09 | 164 |
| 1.0366 | 0.6776 | 1.0197 | 0.6761 | 9.998631e-09 | 165 |
| 1.0417 | 0.6776 | 1.0189 | 0.6761 | 9.998614e-09 | 166 |
| 1.0447 | 0.6776 | 1.0180 | 0.6761 | 9.9985975e-09 | 167 |
| 1.0416 | 0.6776 | 1.0173 | 0.6761 | 9.998581e-09 | 168 |
| 1.0446 | 0.6776 | 1.0165 | 0.6761 | 9.998564e-09 | 169 |
| 1.0352 | 0.6776 | 1.0158 | 0.6761 | 9.998547e-09 | 170 |
| 1.0373 | 0.6776 | 1.0150 | 0.6761 | 9.99853e-09 | 171 |
| 1.0365 | 0.6776 | 1.0143 | 0.6761 | 9.998513e-09 | 172 |
| 1.0429 | 0.6776 | 1.0136 | 0.6761 | 9.998496e-09 | 173 |
| 1.0275 | 0.6776 | 1.0128 | 0.6761 | 9.9984785e-09 | 174 |
| 1.0325 | 0.6776 | 1.0121 | 0.6761 | 9.998461e-09 | 175 |
| 1.0349 | 0.6776 | 1.0113 | 0.6761 | 9.998443e-09 | 176 |
| 1.0380 | 0.6776 | 1.0106 | 0.6761 | 9.998425e-09 | 177 |
| 1.0220 | 0.6776 | 1.0100 | 0.6761 | 9.998407e-09 | 178 |
| 1.0292 | 0.6776 | 1.0093 | 0.6761 | 9.99839e-09 | 179 |
| 1.0296 | 0.6776 | 1.0086 | 0.6761 | 9.998372e-09 | 180 |
| 1.0295 | 0.6776 | 1.0080 | 0.6761 | 9.998354e-09 | 181 |
| 1.0269 | 0.6776 | 1.0073 | 0.6761 | 9.998336e-09 | 182 |
| 1.0279 | 0.6776 | 1.0067 | 0.6761 | 9.998318e-09 | 183 |
| 1.0236 | 0.6776 | 1.0061 | 0.6761 | 9.998299e-09 | 184 |
| 1.0225 | 0.6776 | 1.0055 | 0.6761 | 9.99828e-09 | 185 |
| 1.0233 | 0.6753 | 1.0049 | 0.6761 | 9.998262e-09 | 186 |
| 1.0170 | 0.6776 | 1.0043 | 0.6761 | 9.998243e-09 | 187 |
| 1.0240 | 0.6776 | 1.0037 | 0.6761 | 9.9982245e-09 | 188 |
| 1.0169 | 0.6776 | 1.0032 | 0.6761 | 9.998206e-09 | 189 |
| 1.0307 | 0.6776 | 1.0026 | 0.6761 | 9.998187e-09 | 190 |
| 1.0249 | 0.6776 | 1.0020 | 0.6761 | 9.998168e-09 | 191 |
| 1.0151 | 0.6776 | 1.0014 | 0.6761 | 9.998148e-09 | 192 |
| 1.0214 | 0.6776 | 1.0009 | 0.6761 | 9.9981285e-09 | 193 |
| 1.0256 | 0.6776 | 1.0004 | 0.6761 | 9.998109e-09 | 194 |
| 1.0157 | 0.6776 | 0.9998 | 0.6761 | 9.9980895e-09 | 195 |
| 1.0137 | 0.6776 | 0.9993 | 0.6761 | 9.99807e-09 | 196 |
| 1.0131 | 0.6776 | 0.9988 | 0.6761 | 9.99805e-09 | 197 |
| 1.0136 | 0.6776 | 0.9983 | 0.6761 | 9.998031e-09 | 198 |
| 1.0164 | 0.6776 | 0.9978 | 0.6761 | 9.998011e-09 | 199 |
| 1.0144 | 0.6776 | 0.9974 | 0.6761 | 9.997991e-09 | 200 |
| 1.0176 | 0.6776 | 0.9969 | 0.6761 | 9.9979705e-09 | 201 |
| 1.0096 | 0.6776 | 0.9964 | 0.6761 | 9.99795e-09 | 202 |
| 1.0091 | 0.6776 | 0.9959 | 0.6761 | 9.99793e-09 | 203 |
| 1.0148 | 0.6776 | 0.9954 | 0.6761 | 9.997909e-09 | 204 |
| 1.0078 | 0.6776 | 0.9950 | 0.6761 | 9.997889e-09 | 205 |
| 1.0164 | 0.6776 | 0.9945 | 0.6761 | 9.997868e-09 | 206 |
| 1.0044 | 0.6776 | 0.9941 | 0.6761 | 9.997848e-09 | 207 |
| 1.0151 | 0.6776 | 0.9936 | 0.6761 | 9.9978275e-09 | 208 |
| 1.0018 | 0.6776 | 0.9931 | 0.6761 | 9.997806e-09 | 209 |
| 1.0073 | 0.6776 | 0.9928 | 0.6761 | 9.997785e-09 | 210 |
| 0.9975 | 0.6776 | 0.9924 | 0.6761 | 9.9977635e-09 | 211 |
| 1.0090 | 0.6776 | 0.9920 | 0.6761 | 9.997742e-09 | 212 |
| 0.9984 | 0.6776 | 0.9916 | 0.6761 | 9.997721e-09 | 213 |
| 1.0106 | 0.6776 | 0.9912 | 0.6761 | 9.9977e-09 | 214 |
| 1.0172 | 0.6776 | 0.9907 | 0.6761 | 9.997678e-09 | 215 |
| 1.0035 | 0.6776 | 0.9904 | 0.6761 | 9.997657e-09 | 216 |
| 1.0072 | 0.6776 | 0.9900 | 0.6761 | 9.997636e-09 | 217 |
| 1.0108 | 0.6776 | 0.9897 | 0.6761 | 9.997613e-09 | 218 |
| 0.9962 | 0.6776 | 0.9893 | 0.6761 | 9.997591e-09 | 219 |
| 0.9902 | 0.6776 | 0.9890 | 0.6761 | 9.997569e-09 | 220 |
| 1.0010 | 0.6776 | 0.9886 | 0.6761 | 9.997547e-09 | 221 |
| 0.9988 | 0.6776 | 0.9883 | 0.6761 | 9.997525e-09 | 222 |
| 1.0033 | 0.6776 | 0.9879 | 0.6761 | 9.997502e-09 | 223 |
| 1.0084 | 0.6776 | 0.9876 | 0.6761 | 9.99748e-09 | 224 |
| 0.9926 | 0.6776 | 0.9872 | 0.6761 | 9.997458e-09 | 225 |
| 1.0007 | 0.6776 | 0.9869 | 0.6761 | 9.997436e-09 | 226 |
| 0.9969 | 0.6776 | 0.9866 | 0.6761 | 9.997413e-09 | 227 |
| 0.9945 | 0.6776 | 0.9863 | 0.6761 | 9.99739e-09 | 228 |
| 1.0045 | 0.6776 | 0.9860 | 0.6761 | 9.9973665e-09 | 229 |
| 1.0010 | 0.6776 | 0.9857 | 0.6761 | 9.997343e-09 | 230 |
| 0.9937 | 0.6776 | 0.9854 | 0.6761 | 9.99732e-09 | 231 |
| 0.9978 | 0.6776 | 0.9851 | 0.6761 | 9.997297e-09 | 232 |
| 0.9999 | 0.6776 | 0.9848 | 0.6761 | 9.997274e-09 | 233 |
| 1.0013 | 0.6776 | 0.9845 | 0.6761 | 9.997251e-09 | 234 |
| 0.9862 | 0.6776 | 0.9842 | 0.6761 | 9.997228e-09 | 235 |
| 0.9956 | 0.6776 | 0.9840 | 0.6761 | 9.997204e-09 | 236 |
| 1.0019 | 0.6776 | 0.9837 | 0.6761 | 9.99718e-09 | 237 |
| 0.9979 | 0.6776 | 0.9834 | 0.6761 | 9.997156e-09 | 238 |
| 0.9965 | 0.6776 | 0.9831 | 0.6761 | 9.997132e-09 | 239 |
| 1.0023 | 0.6776 | 0.9829 | 0.6761 | 9.997108e-09 | 240 |
| 0.9920 | 0.6776 | 0.9826 | 0.6761 | 9.997084e-09 | 241 |
| 1.0015 | 0.6776 | 0.9824 | 0.6761 | 9.99706e-09 | 242 |
| 0.9920 | 0.6776 | 0.9821 | 0.6761 | 9.997036e-09 | 243 |
| 1.0004 | 0.6776 | 0.9818 | 0.6761 | 9.997012e-09 | 244 |
| 0.9831 | 0.6776 | 0.9816 | 0.6761 | 9.996987e-09 | 245 |
| 0.9857 | 0.6776 | 0.9813 | 0.6761 | 9.996962e-09 | 246 |
| 0.9860 | 0.6776 | 0.9811 | 0.6761 | 9.9969375e-09 | 247 |
| 0.9885 | 0.6776 | 0.9808 | 0.6761 | 9.996913e-09 | 248 |
| 0.9904 | 0.6776 | 0.9806 | 0.6761 | 9.996888e-09 | 249 |
| 0.9992 | 0.6776 | 0.9804 | 0.6761 | 9.996863e-09 | 250 |
| 0.9859 | 0.6776 | 0.9802 | 0.6761 | 9.996838e-09 | 251 |
| 0.9921 | 0.6776 | 0.9799 | 0.6761 | 9.996813e-09 | 252 |
| 0.9956 | 0.6776 | 0.9797 | 0.6761 | 9.996788e-09 | 253 |
| 0.9800 | 0.6776 | 0.9795 | 0.6761 | 9.9967625e-09 | 254 |
| 0.9931 | 0.6776 | 0.9793 | 0.6761 | 9.996737e-09 | 255 |
| 0.9953 | 0.6776 | 0.9791 | 0.6761 | 9.996711e-09 | 256 |
| 0.9941 | 0.6776 | 0.9789 | 0.6761 | 9.996685e-09 | 257 |
| 0.9779 | 0.6776 | 0.9787 | 0.6761 | 9.9966595e-09 | 258 |
| 0.9825 | 0.6776 | 0.9785 | 0.6761 | 9.996634e-09 | 259 |
| 0.9821 | 0.6776 | 0.9783 | 0.6761 | 9.996608e-09 | 260 |
| 0.9986 | 0.6776 | 0.9781 | 0.6761 | 9.996582e-09 | 261 |
| 0.9849 | 0.6776 | 0.9779 | 0.6761 | 9.9965565e-09 | 262 |
| 0.9802 | 0.6776 | 0.9777 | 0.6761 | 9.99653e-09 | 263 |
| 0.9887 | 0.6776 | 0.9775 | 0.6761 | 9.996503e-09 | 264 |
| 0.9899 | 0.6776 | 0.9773 | 0.6761 | 9.9964765e-09 | 265 |
| 0.9858 | 0.6776 | 0.9772 | 0.6761 | 9.99645e-09 | 266 |
| 0.9844 | 0.6776 | 0.9770 | 0.6761 | 9.996423e-09 | 267 |
| 0.9841 | 0.6776 | 0.9768 | 0.6761 | 9.996397e-09 | 268 |
| 0.9876 | 0.6776 | 0.9766 | 0.6761 | 9.99637e-09 | 269 |
| 0.9946 | 0.6776 | 0.9765 | 0.6761 | 9.996343e-09 | 270 |
| 0.9738 | 0.6776 | 0.9763 | 0.6761 | 9.996316e-09 | 271 |
| 0.9792 | 0.6776 | 0.9761 | 0.6761 | 9.996288e-09 | 272 |
| 0.9736 | 0.6776 | 0.9760 | 0.6761 | 9.996261e-09 | 273 |
| 0.9835 | 0.6776 | 0.9758 | 0.6761 | 9.996233e-09 | 274 |
| 0.9800 | 0.6776 | 0.9757 | 0.6761 | 9.996206e-09 | 275 |
| 0.9849 | 0.6776 | 0.9755 | 0.6761 | 9.996178e-09 | 276 |
| 0.9811 | 0.6776 | 0.9753 | 0.6761 | 9.996151e-09 | 277 |
| 0.9791 | 0.6776 | 0.9752 | 0.6761 | 9.996123e-09 | 278 |
| 0.9814 | 0.6776 | 0.9750 | 0.6761 | 9.9960955e-09 | 279 |
| 0.9785 | 0.6776 | 0.9749 | 0.6761 | 9.996067e-09 | 280 |
| 0.9811 | 0.6776 | 0.9748 | 0.6761 | 9.996039e-09 | 281 |
| 0.9815 | 0.6776 | 0.9746 | 0.6761 | 9.99601e-09 | 282 |
| 0.9788 | 0.6776 | 0.9745 | 0.6761 | 9.995982e-09 | 283 |
| 0.9808 | 0.6776 | 0.9744 | 0.6761 | 9.995953e-09 | 284 |
| 0.9848 | 0.6776 | 0.9742 | 0.6761 | 9.995925e-09 | 285 |
| 0.9832 | 0.6776 | 0.9741 | 0.6761 | 9.9958966e-09 | 286 |
| 0.9860 | 0.6776 | 0.9740 | 0.6761 | 9.995868e-09 | 287 |
| 0.9783 | 0.6776 | 0.9739 | 0.6761 | 9.99584e-09 | 288 |
| 0.9769 | 0.6776 | 0.9737 | 0.6761 | 9.99581e-09 | 289 |
| 0.9765 | 0.6776 | 0.9736 | 0.6761 | 9.995781e-09 | 290 |
| 0.9727 | 0.6776 | 0.9735 | 0.6761 | 9.995752e-09 | 291 |
| 0.9785 | 0.6776 | 0.9733 | 0.6761 | 9.9957225e-09 | 292 |
| 0.9728 | 0.6776 | 0.9732 | 0.6761 | 9.995693e-09 | 293 |
| 0.9659 | 0.6776 | 0.9731 | 0.6761 | 9.995664e-09 | 294 |
| 0.9802 | 0.6776 | 0.9730 | 0.6761 | 9.9956345e-09 | 295 |
| 0.9754 | 0.6776 | 0.9729 | 0.6761 | 9.995605e-09 | 296 |
| 0.9696 | 0.6776 | 0.9727 | 0.6761 | 9.995576e-09 | 297 |
| 0.9851 | 0.6776 | 0.9726 | 0.6761 | 9.995546e-09 | 298 |
| 0.9846 | 0.6776 | 0.9725 | 0.6761 | 9.9955155e-09 | 299 |
| 0.9743 | 0.6776 | 0.9724 | 0.6761 | 9.995485e-09 | 300 |
| 0.9773 | 0.6776 | 0.9723 | 0.6761 | 9.995455e-09 | 301 |
| 0.9698 | 0.6776 | 0.9722 | 0.6761 | 9.995425e-09 | 302 |
| 0.9722 | 0.6776 | 0.9721 | 0.6761 | 9.995395e-09 | 303 |
| 0.9766 | 0.6776 | 0.9720 | 0.6761 | 9.9953645e-09 | 304 |
| 0.9809 | 0.6776 | 0.9718 | 0.6761 | 9.995334e-09 | 305 |
| 0.9695 | 0.6776 | 0.9717 | 0.6761 | 9.995304e-09 | 306 |
| 0.9826 | 0.6776 | 0.9716 | 0.6761 | 9.995273e-09 | 307 |
| 0.9713 | 0.6776 | 0.9716 | 0.6761 | 9.995242e-09 | 308 |
| 0.9857 | 0.6776 | 0.9715 | 0.6761 | 9.995211e-09 | 309 |
| 0.9662 | 0.6776 | 0.9714 | 0.6761 | 9.99518e-09 | 310 |
| 0.9736 | 0.6776 | 0.9713 | 0.6761 | 9.995149e-09 | 311 |
| 0.9752 | 0.6776 | 0.9712 | 0.6761 | 9.995118e-09 | 312 |
| 0.9701 | 0.6776 | 0.9711 | 0.6761 | 9.9950865e-09 | 313 |
| 0.9677 | 0.6776 | 0.9710 | 0.6761 | 9.9950554e-09 | 314 |
| 0.9724 | 0.6776 | 0.9709 | 0.6761 | 9.995024e-09 | 315 |
| 0.9769 | 0.6776 | 0.9708 | 0.6761 | 9.994992e-09 | 316 |
| 0.9711 | 0.6776 | 0.9707 | 0.6761 | 9.99496e-09 | 317 |
| 0.9693 | 0.6776 | 0.9706 | 0.6761 | 9.994928e-09 | 318 |
| 0.9724 | 0.6776 | 0.9705 | 0.6761 | 9.9948965e-09 | 319 |
| 0.9674 | 0.6776 | 0.9704 | 0.6761 | 9.9948645e-09 | 320 |
| 0.9678 | 0.6776 | 0.9703 | 0.6761 | 9.9948325e-09 | 321 |
| 0.9782 | 0.6776 | 0.9703 | 0.6761 | 9.9948005e-09 | 322 |
| 0.9723 | 0.6776 | 0.9702 | 0.6761 | 9.994769e-09 | 323 |
| 0.9656 | 0.6776 | 0.9701 | 0.6761 | 9.994737e-09 | 324 |
| 0.9724 | 0.6776 | 0.9700 | 0.6761 | 9.994704e-09 | 325 |
| 0.9759 | 0.6776 | 0.9700 | 0.6761 | 9.994671e-09 | 326 |
| 0.9735 | 0.6776 | 0.9699 | 0.6761 | 9.994638e-09 | 327 |
| 0.9666 | 0.6776 | 0.9698 | 0.6761 | 9.994605e-09 | 328 |
| 0.9766 | 0.6776 | 0.9697 | 0.6761 | 9.994572e-09 | 329 |
| 0.9737 | 0.6776 | 0.9697 | 0.6761 | 9.994539e-09 | 330 |
| 0.9668 | 0.6776 | 0.9696 | 0.6761 | 9.9945066e-09 | 331 |
| 0.9664 | 0.6776 | 0.9696 | 0.6761 | 9.994474e-09 | 332 |
| 0.9697 | 0.6776 | 0.9695 | 0.6761 | 9.994441e-09 | 333 |
| 0.9733 | 0.6776 | 0.9694 | 0.6761 | 9.994407e-09 | 334 |
| 0.9614 | 0.6776 | 0.9693 | 0.6761 | 9.994373e-09 | 335 |
| 0.9663 | 0.6776 | 0.9693 | 0.6761 | 9.99434e-09 | 336 |
| 0.9762 | 0.6776 | 0.9692 | 0.6761 | 9.994306e-09 | 337 |
| 0.9616 | 0.6776 | 0.9691 | 0.6761 | 9.994272e-09 | 338 |
| 0.9756 | 0.6776 | 0.9690 | 0.6761 | 9.994238e-09 | 339 |
| 0.9684 | 0.6776 | 0.9689 | 0.6761 | 9.994205e-09 | 340 |
| 0.9549 | 0.6776 | 0.9689 | 0.6761 | 9.994171e-09 | 341 |
| 0.9692 | 0.6776 | 0.9688 | 0.6761 | 9.994137e-09 | 342 |
| 0.9567 | 0.6776 | 0.9687 | 0.6761 | 9.994102e-09 | 343 |
| 0.9680 | 0.6776 | 0.9687 | 0.6761 | 9.994068e-09 | 344 |
| 0.9675 | 0.6776 | 0.9686 | 0.6761 | 9.994033e-09 | 345 |
| 0.9621 | 0.6776 | 0.9685 | 0.6761 | 9.9939985e-09 | 346 |
| 0.9760 | 0.6776 | 0.9684 | 0.6761 | 9.993964e-09 | 347 |
| 0.9701 | 0.6776 | 0.9683 | 0.6761 | 9.993929e-09 | 348 |
| 0.9716 | 0.6776 | 0.9683 | 0.6761 | 9.993895e-09 | 349 |
| 0.9621 | 0.6776 | 0.9682 | 0.6761 | 9.99386e-09 | 350 |
| 0.9699 | 0.6776 | 0.9681 | 0.6761 | 9.993825e-09 | 351 |
| 0.9641 | 0.6776 | 0.9681 | 0.6761 | 9.99379e-09 | 352 |
| 0.9702 | 0.6776 | 0.9680 | 0.6761 | 9.993754e-09 | 353 |
| 0.9752 | 0.6776 | 0.9679 | 0.6761 | 9.993719e-09 | 354 |
| 0.9638 | 0.6776 | 0.9678 | 0.6761 | 9.993683e-09 | 355 |
| 0.9706 | 0.6776 | 0.9678 | 0.6761 | 9.993648e-09 | 356 |
| 0.9686 | 0.6776 | 0.9677 | 0.6761 | 9.993612e-09 | 357 |
| 0.9645 | 0.6776 | 0.9676 | 0.6761 | 9.993577e-09 | 358 |
| 0.9646 | 0.6776 | 0.9676 | 0.6761 | 9.993541e-09 | 359 |
| 0.9756 | 0.6776 | 0.9675 | 0.6761 | 9.993505e-09 | 360 |
| 0.9703 | 0.6776 | 0.9674 | 0.6761 | 9.993468e-09 | 361 |
| 0.9673 | 0.6776 | 0.9674 | 0.6761 | 9.993432e-09 | 362 |
| 0.9579 | 0.6776 | 0.9673 | 0.6761 | 9.9933954e-09 | 363 |
| 0.9564 | 0.6776 | 0.9673 | 0.6761 | 9.993359e-09 | 364 |
| 0.9734 | 0.6776 | 0.9672 | 0.6761 | 9.993323e-09 | 365 |
| 0.9653 | 0.6776 | 0.9671 | 0.6761 | 9.993286e-09 | 366 |
| 0.9669 | 0.6776 | 0.9670 | 0.6761 | 9.99325e-09 | 367 |
| 0.9635 | 0.6776 | 0.9670 | 0.6761 | 9.993213e-09 | 368 |
| 0.9629 | 0.6776 | 0.9669 | 0.6761 | 9.993176e-09 | 369 |
| 0.9710 | 0.6776 | 0.9668 | 0.6761 | 9.993139e-09 | 370 |
| 0.9599 | 0.6776 | 0.9668 | 0.6761 | 9.9931015e-09 | 371 |
| 0.9599 | 0.6776 | 0.9667 | 0.6761 | 9.993064e-09 | 372 |
| 0.9672 | 0.6776 | 0.9667 | 0.6761 | 9.993027e-09 | 373 |
| 0.9658 | 0.6776 | 0.9666 | 0.6761 | 9.9929895e-09 | 374 |
| 0.9755 | 0.6776 | 0.9666 | 0.6761 | 9.992952e-09 | 375 |
| 0.9633 | 0.6776 | 0.9665 | 0.6761 | 9.992915e-09 | 376 |
| 0.9658 | 0.6776 | 0.9664 | 0.6761 | 9.992878e-09 | 377 |
| 0.9583 | 0.6776 | 0.9664 | 0.6761 | 9.9928394e-09 | 378 |
| 0.9653 | 0.6776 | 0.9663 | 0.6761 | 9.992801e-09 | 379 |
| 0.9676 | 0.6776 | 0.9662 | 0.6761 | 9.992763e-09 | 380 |
| 0.9614 | 0.6776 | 0.9662 | 0.6761 | 9.992725e-09 | 381 |
| 0.9650 | 0.6776 | 0.9662 | 0.6761 | 9.992687e-09 | 382 |
| 0.9608 | 0.6776 | 0.9661 | 0.6761 | 9.9926485e-09 | 383 |
| 0.9550 | 0.6776 | 0.9660 | 0.6761 | 9.99261e-09 | 384 |
| 0.9701 | 0.6776 | 0.9660 | 0.6761 | 9.992572e-09 | 385 |
| 0.9633 | 0.6776 | 0.9659 | 0.6761 | 9.992534e-09 | 386 |
| 0.9590 | 0.6776 | 0.9658 | 0.6761 | 9.992495e-09 | 387 |
| 0.9596 | 0.6776 | 0.9658 | 0.6761 | 9.992456e-09 | 388 |
| 0.9606 | 0.6776 | 0.9658 | 0.6761 | 9.992417e-09 | 389 |
| 0.9606 | 0.6776 | 0.9657 | 0.6761 | 9.992378e-09 | 390 |
| 0.9597 | 0.6776 | 0.9657 | 0.6761 | 9.9923385e-09 | 391 |
| 0.9665 | 0.6776 | 0.9656 | 0.6761 | 9.992299e-09 | 392 |
| 0.9592 | 0.6776 | 0.9656 | 0.6761 | 9.99226e-09 | 393 |
| 0.9527 | 0.6776 | 0.9656 | 0.6761 | 9.992221e-09 | 394 |
| 0.9617 | 0.6776 | 0.9655 | 0.6761 | 9.992182e-09 | 395 |
| 0.9540 | 0.6776 | 0.9654 | 0.6761 | 9.992142e-09 | 396 |
| 0.9581 | 0.6776 | 0.9654 | 0.6761 | 9.992102e-09 | 397 |
| 0.9627 | 0.6776 | 0.9654 | 0.6761 | 9.992062e-09 | 398 |
| 0.9675 | 0.6776 | 0.9653 | 0.6761 | 9.992022e-09 | 399 |
| 0.9583 | 0.6776 | 0.9653 | 0.6761 | 9.991982e-09 | 400 |
| 0.9746 | 0.6776 | 0.9652 | 0.6761 | 9.991942e-09 | 401 |
| 0.9574 | 0.6776 | 0.9652 | 0.6761 | 9.991902e-09 | 402 |
| 0.9604 | 0.6776 | 0.9652 | 0.6761 | 9.9918624e-09 | 403 |
| 0.9586 | 0.6776 | 0.9651 | 0.6761 | 9.9918225e-09 | 404 |
| 0.9524 | 0.6776 | 0.9651 | 0.6761 | 9.991782e-09 | 405 |
| 0.9600 | 0.6776 | 0.9650 | 0.6761 | 9.991741e-09 | 406 |
| 0.9589 | 0.6776 | 0.9649 | 0.6761 | 9.9917e-09 | 407 |
| 0.9607 | 0.6776 | 0.9649 | 0.6761 | 9.991659e-09 | 408 |
| 0.9580 | 0.6776 | 0.9649 | 0.6761 | 9.991618e-09 | 409 |
| 0.9608 | 0.6776 | 0.9648 | 0.6761 | 9.991577e-09 | 410 |
| 0.9518 | 0.6776 | 0.9647 | 0.6761 | 9.9915365e-09 | 411 |
| 0.9729 | 0.6776 | 0.9647 | 0.6761 | 9.991496e-09 | 412 |
| 0.9532 | 0.6776 | 0.9646 | 0.6761 | 9.991455e-09 | 413 |
| 0.9516 | 0.6776 | 0.9646 | 0.6761 | 9.991413e-09 | 414 |
| 0.9585 | 0.6776 | 0.9645 | 0.6761 | 9.991371e-09 | 415 |
| 0.9651 | 0.6776 | 0.9644 | 0.6761 | 9.9913295e-09 | 416 |
| 0.9584 | 0.6776 | 0.9644 | 0.6761 | 9.991288e-09 | 417 |
| 0.9594 | 0.6776 | 0.9643 | 0.6761 | 9.991246e-09 | 418 |
| 0.9630 | 0.6776 | 0.9643 | 0.6761 | 9.991204e-09 | 419 |
| 0.9597 | 0.6776 | 0.9643 | 0.6761 | 9.991163e-09 | 420 |
| 0.9521 | 0.6776 | 0.9642 | 0.6761 | 9.991121e-09 | 421 |
| 0.9655 | 0.6776 | 0.9642 | 0.6761 | 9.991079e-09 | 422 |
| 0.9542 | 0.6776 | 0.9642 | 0.6761 | 9.991036e-09 | 423 |
| 0.9590 | 0.6776 | 0.9641 | 0.6761 | 9.990994e-09 | 424 |
| 0.9642 | 0.6776 | 0.9640 | 0.6761 | 9.990951e-09 | 425 |
| 0.9620 | 0.6776 | 0.9640 | 0.6761 | 9.9909085e-09 | 426 |
| 0.9512 | 0.6776 | 0.9639 | 0.6761 | 9.990866e-09 | 427 |
| 0.9569 | 0.6776 | 0.9638 | 0.6761 | 9.990823e-09 | 428 |
| 0.9551 | 0.6776 | 0.9638 | 0.6761 | 9.990781e-09 | 429 |
| 0.9564 | 0.6776 | 0.9637 | 0.6761 | 9.990738e-09 | 430 |
| 0.9584 | 0.6776 | 0.9637 | 0.6761 | 9.990695e-09 | 431 |
| 0.9554 | 0.6776 | 0.9636 | 0.6761 | 9.990652e-09 | 432 |
| 0.9657 | 0.6776 | 0.9636 | 0.6761 | 9.990608e-09 | 433 |
| 0.9578 | 0.6776 | 0.9635 | 0.6761 | 9.990565e-09 | 434 |
| 0.9584 | 0.6776 | 0.9635 | 0.6761 | 9.990521e-09 | 435 |
| 0.9598 | 0.6776 | 0.9634 | 0.6761 | 9.990478e-09 | 436 |
| 0.9694 | 0.6776 | 0.9634 | 0.6761 | 9.990434e-09 | 437 |
| 0.9541 | 0.6776 | 0.9633 | 0.6761 | 9.990391e-09 | 438 |
| 0.9524 | 0.6776 | 0.9633 | 0.6761 | 9.990347e-09 | 439 |
| 0.9612 | 0.6776 | 0.9632 | 0.6761 | 9.990304e-09 | 440 |
| 0.9472 | 0.6776 | 0.9632 | 0.6761 | 9.990259e-09 | 441 |
| 0.9539 | 0.6776 | 0.9631 | 0.6761 | 9.990215e-09 | 442 |
| 0.9516 | 0.6776 | 0.9631 | 0.6761 | 9.9901705e-09 | 443 |
| 0.9586 | 0.6776 | 0.9631 | 0.6761 | 9.990126e-09 | 444 |
| 0.9516 | 0.6776 | 0.9630 | 0.6761 | 9.990082e-09 | 445 |
| 0.9470 | 0.6776 | 0.9630 | 0.6761 | 9.990037e-09 | 446 |
| 0.9553 | 0.6776 | 0.9629 | 0.6761 | 9.989993e-09 | 447 |
| 0.9591 | 0.6776 | 0.9629 | 0.6761 | 9.989948e-09 | 448 |
| 0.9484 | 0.6776 | 0.9629 | 0.6761 | 9.989903e-09 | 449 |
| 0.9590 | 0.6776 | 0.9628 | 0.6761 | 9.989858e-09 | 450 |
| 0.9551 | 0.6776 | 0.9628 | 0.6761 | 9.9898125e-09 | 451 |
| 0.9510 | 0.6776 | 0.9627 | 0.6761 | 9.989767e-09 | 452 |
| 0.9511 | 0.6776 | 0.9627 | 0.6761 | 9.989722e-09 | 453 |
| 0.9479 | 0.6776 | 0.9627 | 0.6761 | 9.989677e-09 | 454 |
| 0.9536 | 0.6776 | 0.9626 | 0.6761 | 9.989631e-09 | 455 |
| 0.9412 | 0.6776 | 0.9625 | 0.6761 | 9.989586e-09 | 456 |
| 0.9572 | 0.6776 | 0.9625 | 0.6761 | 9.989541e-09 | 457 |
| 0.9448 | 0.6776 | 0.9624 | 0.6761 | 9.989495e-09 | 458 |
| 0.9516 | 0.6776 | 0.9624 | 0.6761 | 9.989448e-09 | 459 |
| 0.9601 | 0.6776 | 0.9623 | 0.6761 | 9.989402e-09 | 460 |
| 0.9556 | 0.6776 | 0.9623 | 0.6761 | 9.989356e-09 | 461 |
| 0.9546 | 0.6776 | 0.9623 | 0.6761 | 9.98931e-09 | 462 |
| 0.9666 | 0.6776 | 0.9622 | 0.6761 | 9.989264e-09 | 463 |
| 0.9426 | 0.6776 | 0.9621 | 0.6761 | 9.9892175e-09 | 464 |
| 0.9657 | 0.6776 | 0.9621 | 0.6761 | 9.989171e-09 | 465 |
| 0.9538 | 0.6776 | 0.9621 | 0.6761 | 9.989125e-09 | 466 |
| 0.9457 | 0.6776 | 0.9620 | 0.6761 | 9.989078e-09 | 467 |
| 0.9473 | 0.6776 | 0.9620 | 0.6761 | 9.989031e-09 | 468 |
| 0.9519 | 0.6776 | 0.9619 | 0.6761 | 9.988984e-09 | 469 |
| 0.9501 | 0.6776 | 0.9619 | 0.6761 | 9.988937e-09 | 470 |
| 0.9626 | 0.6776 | 0.9619 | 0.6761 | 9.98889e-09 | 471 |
| 0.9525 | 0.6776 | 0.9618 | 0.6761 | 9.988843e-09 | 472 |
| 0.9510 | 0.6776 | 0.9618 | 0.6761 | 9.988796e-09 | 473 |
| 0.9522 | 0.6776 | 0.9617 | 0.6761 | 9.9887485e-09 | 474 |
| 0.9535 | 0.6776 | 0.9617 | 0.6761 | 9.988701e-09 | 475 |
| 0.9549 | 0.6776 | 0.9617 | 0.6761 | 9.9886535e-09 | 476 |
| 0.9476 | 0.6776 | 0.9616 | 0.6761 | 9.9886055e-09 | 477 |
| 0.9468 | 0.6776 | 0.9616 | 0.6761 | 9.9885575e-09 | 478 |
| 0.9527 | 0.6776 | 0.9616 | 0.6761 | 9.98851e-09 | 479 |
| 0.9569 | 0.6776 | 0.9615 | 0.6761 | 9.988462e-09 | 480 |
| 0.9556 | 0.6776 | 0.9614 | 0.6761 | 9.988414e-09 | 481 |
| 0.9517 | 0.6776 | 0.9614 | 0.6761 | 9.988366e-09 | 482 |
| 0.9396 | 0.6776 | 0.9613 | 0.6761 | 9.988318e-09 | 483 |
| 0.9524 | 0.6776 | 0.9613 | 0.6761 | 9.98827e-09 | 484 |
| 0.9470 | 0.6776 | 0.9612 | 0.6761 | 9.988221e-09 | 485 |
| 0.9474 | 0.6776 | 0.9612 | 0.6761 | 9.988172e-09 | 486 |
| 0.9673 | 0.6776 | 0.9612 | 0.6761 | 9.988123e-09 | 487 |
| 0.9419 | 0.6776 | 0.9611 | 0.6761 | 9.988074e-09 | 488 |
| 0.9508 | 0.6776 | 0.9611 | 0.6761 | 9.9880255e-09 | 489 |
| 0.9483 | 0.6776 | 0.9610 | 0.6761 | 9.987977e-09 | 490 |
| 0.9555 | 0.6776 | 0.9610 | 0.6761 | 9.987928e-09 | 491 |
| 0.9536 | 0.6776 | 0.9609 | 0.6761 | 9.987879e-09 | 492 |
| 0.9425 | 0.6776 | 0.9609 | 0.6761 | 9.98783e-09 | 493 |
| 0.9551 | 0.6776 | 0.9609 | 0.6761 | 9.98778e-09 | 494 |
| 0.9509 | 0.6776 | 0.9608 | 0.6761 | 9.987731e-09 | 495 |
| 0.9548 | 0.6776 | 0.9608 | 0.6761 | 9.987681e-09 | 496 |
| 0.9564 | 0.6776 | 0.9608 | 0.6761 | 9.987631e-09 | 497 |
| 0.9552 | 0.6776 | 0.9608 | 0.6761 | 9.987581e-09 | 498 |
| 0.9465 | 0.6776 | 0.9608 | 0.6761 | 9.987532e-09 | 499 |
| 0.9493 | 0.6776 | 0.9607 | 0.6761 | 9.987482e-09 | 500 |
| 0.9470 | 0.6776 | 0.9606 | 0.6761 | 9.987432e-09 | 501 |
| 0.9496 | 0.6776 | 0.9606 | 0.6761 | 9.9873825e-09 | 502 |
| 0.9415 | 0.6776 | 0.9606 | 0.6761 | 9.987332e-09 | 503 |
| 0.9543 | 0.6776 | 0.9605 | 0.6761 | 9.987281e-09 | 504 |
| 0.9519 | 0.6776 | 0.9604 | 0.6761 | 9.987231e-09 | 505 |
| 0.9494 | 0.6776 | 0.9604 | 0.6761 | 9.98718e-09 | 506 |
| 0.9416 | 0.6776 | 0.9604 | 0.6761 | 9.987129e-09 | 507 |
| 0.9473 | 0.6776 | 0.9603 | 0.6761 | 9.987079e-09 | 508 |
| 0.9442 | 0.6776 | 0.9603 | 0.6761 | 9.987028e-09 | 509 |
| 0.9431 | 0.6776 | 0.9602 | 0.6761 | 9.9869775e-09 | 510 |
| 0.9507 | 0.6776 | 0.9602 | 0.6761 | 9.986927e-09 | 511 |
| 0.9545 | 0.6776 | 0.9602 | 0.6761 | 9.986875e-09 | 512 |
| 0.9475 | 0.6776 | 0.9601 | 0.6761 | 9.986824e-09 | 513 |
| 0.9457 | 0.6776 | 0.9601 | 0.6761 | 9.986772e-09 | 514 |
| 0.9555 | 0.6776 | 0.9601 | 0.6761 | 9.986721e-09 | 515 |
| 0.9444 | 0.6776 | 0.9601 | 0.6761 | 9.986669e-09 | 516 |
| 0.9436 | 0.6776 | 0.9600 | 0.6761 | 9.986618e-09 | 517 |
| 0.9535 | 0.6776 | 0.9600 | 0.6761 | 9.986566e-09 | 518 |
| 0.9332 | 0.6776 | 0.9599 | 0.6761 | 9.986515e-09 | 519 |
| 0.9562 | 0.6776 | 0.9599 | 0.6761 | 9.986463e-09 | 520 |
| 0.9547 | 0.6776 | 0.9599 | 0.6761 | 9.986411e-09 | 521 |
| 0.9497 | 0.6776 | 0.9599 | 0.6761 | 9.986358e-09 | 522 |
| 0.9583 | 0.6776 | 0.9598 | 0.6761 | 9.986306e-09 | 523 |
| 0.9537 | 0.6776 | 0.9598 | 0.6761 | 9.986254e-09 | 524 |
| 0.9535 | 0.6776 | 0.9597 | 0.6761 | 9.986201e-09 | 525 |
| 0.9485 | 0.6776 | 0.9597 | 0.6761 | 9.986149e-09 | 526 |
| 0.9535 | 0.6776 | 0.9597 | 0.6761 | 9.986096e-09 | 527 |
| 0.9514 | 0.6776 | 0.9596 | 0.6761 | 9.986044e-09 | 528 |
| 0.9474 | 0.6776 | 0.9596 | 0.6761 | 9.985992e-09 | 529 |
| 0.9484 | 0.6776 | 0.9595 | 0.6761 | 9.985938e-09 | 530 |
| 0.9426 | 0.6776 | 0.9595 | 0.6761 | 9.985885e-09 | 531 |
| 0.9489 | 0.6776 | 0.9594 | 0.6761 | 9.985832e-09 | 532 |
| 0.9440 | 0.6776 | 0.9593 | 0.6761 | 9.985778e-09 | 533 |
| 0.9537 | 0.6776 | 0.9593 | 0.6761 | 9.985725e-09 | 534 |
| 0.9501 | 0.6776 | 0.9592 | 0.6761 | 9.985672e-09 | 535 |
| 0.9446 | 0.6776 | 0.9592 | 0.6761 | 9.9856186e-09 | 536 |
| 0.9387 | 0.6776 | 0.9592 | 0.6761 | 9.985565e-09 | 537 |
| 0.9473 | 0.6776 | 0.9591 | 0.6761 | 9.985512e-09 | 538 |
| 0.9500 | 0.6776 | 0.9591 | 0.6761 | 9.985458e-09 | 539 |
| 0.9396 | 0.6776 | 0.9590 | 0.6761 | 9.985404e-09 | 540 |
| 0.9539 | 0.6776 | 0.9589 | 0.6761 | 9.985349e-09 | 541 |
| 0.9539 | 0.6776 | 0.9589 | 0.6761 | 9.985295e-09 | 542 |
| 0.9446 | 0.6776 | 0.9588 | 0.6761 | 9.985241e-09 | 543 |
| 0.9510 | 0.6776 | 0.9588 | 0.6761 | 9.985187e-09 | 544 |
| 0.9457 | 0.6776 | 0.9587 | 0.6761 | 9.985133e-09 | 545 |
| 0.9431 | 0.6776 | 0.9587 | 0.6761 | 9.9850785e-09 | 546 |
| 0.9463 | 0.6776 | 0.9587 | 0.6761 | 9.985024e-09 | 547 |
| 0.9512 | 0.6776 | 0.9587 | 0.6761 | 9.984969e-09 | 548 |
| 0.9425 | 0.6776 | 0.9586 | 0.6761 | 9.984914e-09 | 549 |
| 0.9499 | 0.6776 | 0.9586 | 0.6761 | 9.984859e-09 | 550 |
| 0.9476 | 0.6776 | 0.9586 | 0.6761 | 9.984804e-09 | 551 |
| 0.9416 | 0.6776 | 0.9585 | 0.6761 | 9.984749e-09 | 552 |
| 0.9446 | 0.6776 | 0.9585 | 0.6761 | 9.984694e-09 | 553 |
| 0.9532 | 0.6776 | 0.9584 | 0.6761 | 9.984639e-09 | 554 |
| 0.9432 | 0.6776 | 0.9584 | 0.6761 | 9.984584e-09 | 555 |
| 0.9535 | 0.6776 | 0.9583 | 0.6761 | 9.984528e-09 | 556 |
| 0.9444 | 0.6776 | 0.9583 | 0.6761 | 9.984472e-09 | 557 |
| 0.9454 | 0.6776 | 0.9582 | 0.6761 | 9.984416e-09 | 558 |
| 0.9382 | 0.6776 | 0.9582 | 0.6761 | 9.98436e-09 | 559 |
| 0.9448 | 0.6776 | 0.9581 | 0.6761 | 9.984304e-09 | 560 |
| 0.9421 | 0.6776 | 0.9581 | 0.6761 | 9.984248e-09 | 561 |
| 0.9283 | 0.6776 | 0.9581 | 0.6761 | 9.984192e-09 | 562 |
| 0.9456 | 0.6776 | 0.9580 | 0.6761 | 9.984136e-09 | 563 |
| 0.9351 | 0.6776 | 0.9580 | 0.6761 | 9.98408e-09 | 564 |
| 0.9370 | 0.6776 | 0.9580 | 0.6761 | 9.984023e-09 | 565 |
| 0.9467 | 0.6776 | 0.9579 | 0.6761 | 9.9839665e-09 | 566 |
| 0.9520 | 0.6776 | 0.9579 | 0.6761 | 9.98391e-09 | 567 |
| 0.9370 | 0.6776 | 0.9579 | 0.6761 | 9.983853e-09 | 568 |
| 0.9443 | 0.6776 | 0.9578 | 0.6761 | 9.983796e-09 | 569 |
| 0.9449 | 0.6776 | 0.9578 | 0.6761 | 9.983739e-09 | 570 |
| 0.9456 | 0.6776 | 0.9577 | 0.6761 | 9.983682e-09 | 571 |
| 0.9396 | 0.6776 | 0.9577 | 0.6761 | 9.9836255e-09 | 572 |
| 0.9403 | 0.6776 | 0.9576 | 0.6761 | 9.983569e-09 | 573 |
| 0.9503 | 0.6776 | 0.9576 | 0.6761 | 9.983511e-09 | 574 |
| 0.9471 | 0.6776 | 0.9576 | 0.6761 | 9.983453e-09 | 575 |
| 0.9399 | 0.6776 | 0.9576 | 0.6761 | 9.983395e-09 | 576 |
| 0.9405 | 0.6776 | 0.9575 | 0.6761 | 9.983338e-09 | 577 |
| 0.9463 | 0.6776 | 0.9575 | 0.6761 | 9.98328e-09 | 578 |
| 0.9434 | 0.6776 | 0.9575 | 0.6761 | 9.983222e-09 | 579 |
| 0.9419 | 0.6776 | 0.9575 | 0.6761 | 9.9831645e-09 | 580 |
| 0.9466 | 0.6776 | 0.9574 | 0.6761 | 9.983107e-09 | 581 |
| 0.9402 | 0.6776 | 0.9574 | 0.6761 | 9.983049e-09 | 582 |
| 0.9430 | 0.6776 | 0.9573 | 0.6761 | 9.98299e-09 | 583 |
| 0.9494 | 0.6776 | 0.9573 | 0.6761 | 9.982932e-09 | 584 |
| 0.9385 | 0.6776 | 0.9573 | 0.6761 | 9.982873e-09 | 585 |
| 0.9402 | 0.6776 | 0.9572 | 0.6761 | 9.982815e-09 | 586 |
| 0.9367 | 0.6776 | 0.9572 | 0.6761 | 9.982756e-09 | 587 |
| 0.9445 | 0.6776 | 0.9571 | 0.6761 | 9.982697e-09 | 588 |
| 0.9444 | 0.6776 | 0.9571 | 0.6761 | 9.982639e-09 | 589 |
| 0.9334 | 0.6776 | 0.9570 | 0.6761 | 9.98258e-09 | 590 |
| 0.9483 | 0.6776 | 0.9570 | 0.6761 | 9.9825215e-09 | 591 |
| 0.9410 | 0.6776 | 0.9570 | 0.6761 | 9.982462e-09 | 592 |
| 0.9503 | 0.6776 | 0.9569 | 0.6761 | 9.9824025e-09 | 593 |
| 0.9433 | 0.6776 | 0.9569 | 0.6761 | 9.982343e-09 | 594 |
| 0.9381 | 0.6776 | 0.9568 | 0.6761 | 9.9822834e-09 | 595 |
| 0.9406 | 0.6776 | 0.9568 | 0.6761 | 9.982224e-09 | 596 |
| 0.9408 | 0.6776 | 0.9568 | 0.6761 | 9.982164e-09 | 597 |
| 0.9371 | 0.6776 | 0.9567 | 0.6761 | 9.982105e-09 | 598 |
| 0.9317 | 0.6776 | 0.9567 | 0.6761 | 9.982045e-09 | 599 |
| 0.9521 | 0.6776 | 0.9567 | 0.6761 | 9.981986e-09 | 600 |
| 0.9430 | 0.6776 | 0.9567 | 0.6761 | 9.9819255e-09 | 601 |
| 0.9417 | 0.6776 | 0.9566 | 0.6761 | 9.981865e-09 | 602 |
| 0.9415 | 0.6776 | 0.9566 | 0.6761 | 9.981805e-09 | 603 |
| 0.9316 | 0.6776 | 0.9566 | 0.6761 | 9.981744e-09 | 604 |
| 0.9418 | 0.6776 | 0.9565 | 0.6761 | 9.981684e-09 | 605 |
| 0.9433 | 0.6776 | 0.9565 | 0.6761 | 9.9816235e-09 | 606 |
| 0.9361 | 0.6776 | 0.9564 | 0.6761 | 9.981563e-09 | 607 |
| 0.9416 | 0.6776 | 0.9563 | 0.6761 | 9.981503e-09 | 608 |
| 0.9497 | 0.6776 | 0.9563 | 0.6761 | 9.981442e-09 | 609 |
| 0.9439 | 0.6776 | 0.9562 | 0.6761 | 9.981381e-09 | 610 |
| 0.9345 | 0.6776 | 0.9562 | 0.6761 | 9.98132e-09 | 611 |
| 0.9370 | 0.6776 | 0.9561 | 0.6761 | 9.9812585e-09 | 612 |
| 0.9362 | 0.6776 | 0.9561 | 0.6761 | 9.981197e-09 | 613 |
| 0.9421 | 0.6776 | 0.9560 | 0.6761 | 9.981136e-09 | 614 |
| 0.9327 | 0.6776 | 0.9560 | 0.6761 | 9.981075e-09 | 615 |
| 0.9372 | 0.6776 | 0.9560 | 0.6761 | 9.981013e-09 | 616 |
| 0.9389 | 0.6776 | 0.9560 | 0.6761 | 9.980952e-09 | 617 |
| 0.9440 | 0.6776 | 0.9559 | 0.6761 | 9.980891e-09 | 618 |
| 0.9400 | 0.6776 | 0.9559 | 0.6761 | 9.980829e-09 | 619 |
| 0.9354 | 0.6776 | 0.9559 | 0.6761 | 9.980766e-09 | 620 |
| 0.9434 | 0.6776 | 0.9558 | 0.6761 | 9.980704e-09 | 621 |
| 0.9443 | 0.6776 | 0.9558 | 0.6761 | 9.980642e-09 | 622 |
| 0.9405 | 0.6776 | 0.9557 | 0.6761 | 9.98058e-09 | 623 |
| 0.9373 | 0.6776 | 0.9557 | 0.6761 | 9.980518e-09 | 624 |
| 0.9389 | 0.6776 | 0.9556 | 0.6761 | 9.980456e-09 | 625 |
| 0.9451 | 0.6776 | 0.9556 | 0.6761 | 9.980393e-09 | 626 |
| 0.9334 | 0.6776 | 0.9555 | 0.6761 | 9.980331e-09 | 627 |
| 0.9365 | 0.6776 | 0.9555 | 0.6761 | 9.980268e-09 | 628 |
| 0.9491 | 0.6776 | 0.9554 | 0.6761 | 9.980205e-09 | 629 |
| 0.9414 | 0.6776 | 0.9554 | 0.6761 | 9.980142e-09 | 630 |
| 0.9377 | 0.6776 | 0.9553 | 0.6761 | 9.980079e-09 | 631 |
| 0.9367 | 0.6776 | 0.9553 | 0.6761 | 9.980016e-09 | 632 |
| 0.9399 | 0.6776 | 0.9552 | 0.6761 | 9.979953e-09 | 633 |
| 0.9333 | 0.6776 | 0.9552 | 0.6761 | 9.97989e-09 | 634 |
| 0.9274 | 0.6776 | 0.9551 | 0.6761 | 9.979827e-09 | 635 |
| 0.9354 | 0.6776 | 0.9551 | 0.6761 | 9.979764e-09 | 636 |
| 0.9338 | 0.6776 | 0.9550 | 0.6761 | 9.9797e-09 | 637 |
| 0.9321 | 0.6776 | 0.9550 | 0.6761 | 9.979636e-09 | 638 |
| 0.9376 | 0.6776 | 0.9549 | 0.6761 | 9.979572e-09 | 639 |
| 0.9387 | 0.6776 | 0.9549 | 0.6761 | 9.979508e-09 | 640 |
| 0.9414 | 0.6776 | 0.9549 | 0.6761 | 9.979444e-09 | 641 |
| 0.9388 | 0.6776 | 0.9548 | 0.6761 | 9.97938e-09 | 642 |
| 0.9467 | 0.6776 | 0.9548 | 0.6761 | 9.979316e-09 | 643 |
| 0.9441 | 0.6776 | 0.9548 | 0.6761 | 9.979252e-09 | 644 |
| 0.9372 | 0.6776 | 0.9547 | 0.6761 | 9.979188e-09 | 645 |
| 0.9422 | 0.6776 | 0.9547 | 0.6761 | 9.979123e-09 | 646 |
| 0.9479 | 0.6776 | 0.9546 | 0.6761 | 9.9790585e-09 | 647 |
| 0.9369 | 0.6776 | 0.9546 | 0.6761 | 9.978994e-09 | 648 |
| 0.9333 | 0.6776 | 0.9545 | 0.6761 | 9.978929e-09 | 649 |
| 0.9361 | 0.6776 | 0.9545 | 0.6761 | 9.978864e-09 | 650 |
| 0.9415 | 0.6776 | 0.9544 | 0.6761 | 9.978799e-09 | 651 |
| 0.9406 | 0.6776 | 0.9544 | 0.6761 | 9.978734e-09 | 652 |
| 0.9347 | 0.6776 | 0.9544 | 0.6761 | 9.9786694e-09 | 653 |
| 0.9468 | 0.6776 | 0.9544 | 0.6761 | 9.978605e-09 | 654 |
| 0.9398 | 0.6776 | 0.9543 | 0.6761 | 9.978539e-09 | 655 |
| 0.9397 | 0.6776 | 0.9543 | 0.6761 | 9.978473e-09 | 656 |
| 0.9415 | 0.6776 | 0.9542 | 0.6761 | 9.978407e-09 | 657 |
| 0.9323 | 0.6776 | 0.9542 | 0.6761 | 9.978342e-09 | 658 |
| 0.9311 | 0.6776 | 0.9541 | 0.6761 | 9.978276e-09 | 659 |
| 0.9390 | 0.6776 | 0.9541 | 0.6761 | 9.97821e-09 | 660 |
| 0.9533 | 0.6776 | 0.9540 | 0.6761 | 9.9781445e-09 | 661 |
| 0.9333 | 0.6776 | 0.9540 | 0.6761 | 9.978079e-09 | 662 |
| 0.9435 | 0.6776 | 0.9540 | 0.6761 | 9.978013e-09 | 663 |
| 0.9337 | 0.6776 | 0.9539 | 0.6761 | 9.9779465e-09 | 664 |
| 0.9369 | 0.6776 | 0.9539 | 0.6761 | 9.97788e-09 | 665 |
| 0.9300 | 0.6776 | 0.9538 | 0.6761 | 9.977813e-09 | 666 |
| 0.9405 | 0.6776 | 0.9538 | 0.6761 | 9.977747e-09 | 667 |
| 0.9321 | 0.6776 | 0.9537 | 0.6761 | 9.97768e-09 | 668 |
| 0.9296 | 0.6776 | 0.9537 | 0.6761 | 9.977613e-09 | 669 |
| 0.9357 | 0.6776 | 0.9536 | 0.6761 | 9.977547e-09 | 670 |
| 0.9377 | 0.6776 | 0.9536 | 0.6761 | 9.97748e-09 | 671 |
| 0.9295 | 0.6776 | 0.9536 | 0.6761 | 9.977414e-09 | 672 |
| 0.9351 | 0.6776 | 0.9535 | 0.6761 | 9.977346e-09 | 673 |
| 0.9288 | 0.6776 | 0.9535 | 0.6761 | 9.977279e-09 | 674 |
| 0.9381 | 0.6776 | 0.9535 | 0.6761 | 9.977211e-09 | 675 |
| 0.9283 | 0.6776 | 0.9534 | 0.6761 | 9.9771436e-09 | 676 |
| 0.9299 | 0.6776 | 0.9534 | 0.6761 | 9.977076e-09 | 677 |
| 0.9329 | 0.6776 | 0.9534 | 0.6761 | 9.9770086e-09 | 678 |
| 0.9351 | 0.6776 | 0.9534 | 0.6761 | 9.976941e-09 | 679 |
| 0.9319 | 0.6776 | 0.9533 | 0.6761 | 9.9768735e-09 | 680 |
| 0.9331 | 0.6776 | 0.9533 | 0.6761 | 9.976806e-09 | 681 |
| 0.9389 | 0.6776 | 0.9533 | 0.6761 | 9.976738e-09 | 682 |
| 0.9301 | 0.6776 | 0.9532 | 0.6761 | 9.976669e-09 | 683 |
| 0.9252 | 0.6776 | 0.9531 | 0.6761 | 9.976601e-09 | 684 |
| 0.9363 | 0.6776 | 0.9531 | 0.6761 | 9.9765325e-09 | 685 |
| 0.9327 | 0.6776 | 0.9531 | 0.6761 | 9.976464e-09 | 686 |
| 0.9373 | 0.6776 | 0.9531 | 0.6761 | 9.976396e-09 | 687 |
| 0.9379 | 0.6776 | 0.9530 | 0.6761 | 9.976327e-09 | 688 |
| 0.9360 | 0.6776 | 0.9530 | 0.6761 | 9.976259e-09 | 689 |
| 0.9484 | 0.6776 | 0.9530 | 0.6761 | 9.97619e-09 | 690 |
| 0.9369 | 0.6776 | 0.9529 | 0.6761 | 9.97612e-09 | 691 |
| 0.9303 | 0.6776 | 0.9529 | 0.6761 | 9.976051e-09 | 692 |
| 0.9339 | 0.6776 | 0.9528 | 0.6761 | 9.975982e-09 | 693 |
| 0.9513 | 0.6776 | 0.9528 | 0.6761 | 9.9759125e-09 | 694 |
| 0.9340 | 0.6776 | 0.9528 | 0.6761 | 9.975843e-09 | 695 |
| 0.9339 | 0.6776 | 0.9527 | 0.6761 | 9.975774e-09 | 696 |
| 0.9307 | 0.6776 | 0.9527 | 0.6761 | 9.975705e-09 | 697 |
| 0.9295 | 0.6776 | 0.9527 | 0.6761 | 9.975635e-09 | 698 |
| 0.9332 | 0.6776 | 0.9526 | 0.6761 | 9.975565e-09 | 699 |
| 0.9384 | 0.6776 | 0.9526 | 0.6761 | 9.975495e-09 | 700 |
| 0.9283 | 0.6776 | 0.9525 | 0.6761 | 9.975425e-09 | 701 |
| 0.9367 | 0.6776 | 0.9525 | 0.6761 | 9.975355e-09 | 702 |
| 0.9344 | 0.6776 | 0.9524 | 0.6761 | 9.975285e-09 | 703 |
| 0.9315 | 0.6776 | 0.9524 | 0.6761 | 9.975214e-09 | 704 |
| 0.9365 | 0.6776 | 0.9523 | 0.6761 | 9.975144e-09 | 705 |
| 0.9317 | 0.6776 | 0.9523 | 0.6761 | 9.975074e-09 | 706 |
| 0.9282 | 0.6776 | 0.9522 | 0.6761 | 9.975004e-09 | 707 |
| 0.9372 | 0.6776 | 0.9522 | 0.6761 | 9.974933e-09 | 708 |
| 0.9377 | 0.6776 | 0.9522 | 0.6761 | 9.974862e-09 | 709 |
| 0.9354 | 0.6776 | 0.9522 | 0.6761 | 9.974791e-09 | 710 |
| 0.9400 | 0.6776 | 0.9521 | 0.6761 | 9.97472e-09 | 711 |
| 0.9344 | 0.6776 | 0.9521 | 0.6761 | 9.974649e-09 | 712 |
| 0.9309 | 0.6776 | 0.9521 | 0.6761 | 9.974578e-09 | 713 |
| 0.9324 | 0.6776 | 0.9520 | 0.6761 | 9.9745066e-09 | 714 |
| 0.9252 | 0.6776 | 0.9520 | 0.6761 | 9.9744355e-09 | 715 |
| 0.9404 | 0.6776 | 0.9519 | 0.6761 | 9.9743644e-09 | 716 |
| 0.9336 | 0.6776 | 0.9519 | 0.6761 | 9.9742925e-09 | 717 |
| 0.9370 | 0.6776 | 0.9518 | 0.6761 | 9.974221e-09 | 718 |
| 0.9331 | 0.6776 | 0.9517 | 0.6761 | 9.974149e-09 | 719 |
| 0.9329 | 0.6776 | 0.9517 | 0.6761 | 9.974077e-09 | 720 |
| 0.9370 | 0.6776 | 0.9516 | 0.6761 | 9.974005e-09 | 721 |
| 0.9278 | 0.6776 | 0.9516 | 0.6761 | 9.973933e-09 | 722 |
| 0.9385 | 0.6776 | 0.9516 | 0.6761 | 9.973861e-09 | 723 |
| 0.9390 | 0.6776 | 0.9515 | 0.6761 | 9.973789e-09 | 724 |
| 0.9306 | 0.6776 | 0.9515 | 0.6761 | 9.973717e-09 | 725 |
| 0.9355 | 0.6776 | 0.9515 | 0.6761 | 9.973644e-09 | 726 |
| 0.9399 | 0.6776 | 0.9514 | 0.6761 | 9.973571e-09 | 727 |
| 0.9380 | 0.6776 | 0.9514 | 0.6761 | 9.9734985e-09 | 728 |
| 0.9283 | 0.6776 | 0.9513 | 0.6761 | 9.973426e-09 | 729 |
| 0.9293 | 0.6776 | 0.9513 | 0.6761 | 9.973353e-09 | 730 |
| 0.9383 | 0.6776 | 0.9513 | 0.6761 | 9.97328e-09 | 731 |
| 0.9391 | 0.6776 | 0.9512 | 0.6761 | 9.973207e-09 | 732 |
| 0.9281 | 0.6776 | 0.9512 | 0.6761 | 9.973134e-09 | 733 |
| 0.9311 | 0.6776 | 0.9512 | 0.6761 | 9.9730615e-09 | 734 |
| 0.9290 | 0.6776 | 0.9511 | 0.6761 | 9.972988e-09 | 735 |
| 0.9319 | 0.6776 | 0.9511 | 0.6761 | 9.972914e-09 | 736 |
| 0.9237 | 0.6776 | 0.9510 | 0.6761 | 9.97284e-09 | 737 |
| 0.9313 | 0.6776 | 0.9510 | 0.6761 | 9.972767e-09 | 738 |
| 0.9323 | 0.6776 | 0.9510 | 0.6761 | 9.972693e-09 | 739 |
| 0.9364 | 0.6776 | 0.9510 | 0.6761 | 9.972619e-09 | 740 |
| 0.9331 | 0.6776 | 0.9510 | 0.6761 | 9.9725455e-09 | 741 |
| 0.9325 | 0.6776 | 0.9509 | 0.6761 | 9.972472e-09 | 742 |
| 0.9307 | 0.6776 | 0.9509 | 0.6761 | 9.972398e-09 | 743 |
| 0.9315 | 0.6776 | 0.9509 | 0.6761 | 9.972323e-09 | 744 |
| 0.9322 | 0.6776 | 0.9509 | 0.6761 | 9.972249e-09 | 745 |
| 0.9349 | 0.6776 | 0.9508 | 0.6761 | 9.972174e-09 | 746 |
| 0.9273 | 0.6776 | 0.9508 | 0.6761 | 9.9721e-09 | 747 |
| 0.9314 | 0.6776 | 0.9507 | 0.6761 | 9.972025e-09 | 748 |
| 0.9342 | 0.6776 | 0.9506 | 0.6761 | 9.97195e-09 | 749 |
| 0.9316 | 0.6776 | 0.9506 | 0.6761 | 9.971876e-09 | 750 |
| 0.9328 | 0.6776 | 0.9506 | 0.6761 | 9.971801e-09 | 751 |
| 0.9439 | 0.6776 | 0.9506 | 0.6761 | 9.9717266e-09 | 752 |
| 0.9299 | 0.6776 | 0.9505 | 0.6761 | 9.971651e-09 | 753 |
| 0.9276 | 0.6776 | 0.9505 | 0.6761 | 9.971576e-09 | 754 |
| 0.9135 | 0.6776 | 0.9505 | 0.6761 | 9.9715e-09 | 755 |
| 0.9400 | 0.6776 | 0.9505 | 0.6761 | 9.971425e-09 | 756 |
| 0.9349 | 0.6776 | 0.9504 | 0.6761 | 9.971349e-09 | 757 |
| 0.9348 | 0.6776 | 0.9504 | 0.6761 | 9.971274e-09 | 758 |
| 0.9294 | 0.6776 | 0.9503 | 0.6761 | 9.971198e-09 | 759 |
| 0.9315 | 0.6776 | 0.9502 | 0.6761 | 9.971123e-09 | 760 |
| 0.9219 | 0.6776 | 0.9502 | 0.6761 | 9.971047e-09 | 761 |
| 0.9296 | 0.6776 | 0.9502 | 0.6761 | 9.970971e-09 | 762 |
| 0.9199 | 0.6776 | 0.9501 | 0.6761 | 9.970894e-09 | 763 |
| 0.9285 | 0.6776 | 0.9501 | 0.6761 | 9.970818e-09 | 764 |
| 0.9361 | 0.6776 | 0.9501 | 0.6761 | 9.970742e-09 | 765 |
| 0.9270 | 0.6776 | 0.9500 | 0.6761 | 9.970665e-09 | 766 |
| 0.9364 | 0.6776 | 0.9500 | 0.6761 | 9.970589e-09 | 767 |
| 0.9314 | 0.6776 | 0.9499 | 0.6761 | 9.970512e-09 | 768 |
| 0.9217 | 0.6776 | 0.9499 | 0.6761 | 9.970436e-09 | 769 |
| 0.9383 | 0.6776 | 0.9499 | 0.6761 | 9.97036e-09 | 770 |
| 0.9299 | 0.6776 | 0.9498 | 0.6761 | 9.970282e-09 | 771 |
| 0.9310 | 0.6776 | 0.9498 | 0.6761 | 9.970205e-09 | 772 |
| 0.9336 | 0.6776 | 0.9498 | 0.6761 | 9.970128e-09 | 773 |
| 0.9320 | 0.6776 | 0.9497 | 0.6761 | 9.970051e-09 | 774 |
| 0.9277 | 0.6776 | 0.9497 | 0.6761 | 9.969973e-09 | 775 |
| 0.9229 | 0.6776 | 0.9497 | 0.6761 | 9.969896e-09 | 776 |
| 0.9240 | 0.6776 | 0.9497 | 0.6761 | 9.969819e-09 | 777 |
| 0.9275 | 0.6776 | 0.9496 | 0.6761 | 9.9697415e-09 | 778 |
| 0.9337 | 0.6776 | 0.9496 | 0.6761 | 9.969664e-09 | 779 |
| 0.9259 | 0.6776 | 0.9496 | 0.6761 | 9.969586e-09 | 780 |
| 0.9273 | 0.6776 | 0.9495 | 0.6761 | 9.969508e-09 | 781 |
| 0.9257 | 0.6776 | 0.9495 | 0.6761 | 9.96943e-09 | 782 |
| 0.9303 | 0.6776 | 0.9495 | 0.6761 | 9.969352e-09 | 783 |
| 0.9328 | 0.6776 | 0.9495 | 0.6761 | 9.969273e-09 | 784 |
| 0.9205 | 0.6776 | 0.9494 | 0.6761 | 9.969195e-09 | 785 |
| 0.9350 | 0.6776 | 0.9494 | 0.6761 | 9.969117e-09 | 786 |
| 0.9249 | 0.6776 | 0.9494 | 0.6761 | 9.969039e-09 | 787 |
| 0.9154 | 0.6776 | 0.9493 | 0.6761 | 9.968961e-09 | 788 |
| 0.9251 | 0.6776 | 0.9493 | 0.6761 | 9.968882e-09 | 789 |
| 0.9260 | 0.6776 | 0.9493 | 0.6761 | 9.968803e-09 | 790 |
| 0.9210 | 0.6776 | 0.9492 | 0.6761 | 9.968724e-09 | 791 |
| 0.9229 | 0.6776 | 0.9492 | 0.6761 | 9.968645e-09 | 792 |
| 0.9308 | 0.6776 | 0.9491 | 0.6761 | 9.9685655e-09 | 793 |
| 0.9253 | 0.6776 | 0.9491 | 0.6761 | 9.9684865e-09 | 794 |
| 0.9263 | 0.6776 | 0.9490 | 0.6761 | 9.968407e-09 | 795 |
| 0.9271 | 0.6776 | 0.9490 | 0.6761 | 9.968328e-09 | 796 |
| 0.9214 | 0.6776 | 0.9490 | 0.6761 | 9.968249e-09 | 797 |
| 0.9409 | 0.6776 | 0.9490 | 0.6761 | 9.968169e-09 | 798 |
| 0.9263 | 0.6776 | 0.9490 | 0.6761 | 9.9680895e-09 | 799 |
| 0.9355 | 0.6776 | 0.9489 | 0.6761 | 9.9680095e-09 | 800 |
| 0.9303 | 0.6776 | 0.9489 | 0.6761 | 9.96793e-09 | 801 |
| 0.9304 | 0.6776 | 0.9489 | 0.6761 | 9.96785e-09 | 802 |
| 0.9261 | 0.6776 | 0.9489 | 0.6761 | 9.96777e-09 | 803 |
| 0.9315 | 0.6776 | 0.9488 | 0.6761 | 9.96769e-09 | 804 |
| 0.9261 | 0.6776 | 0.9488 | 0.6761 | 9.96761e-09 | 805 |
| 0.9261 | 0.6776 | 0.9488 | 0.6761 | 9.96753e-09 | 806 |
| 0.9211 | 0.6776 | 0.9488 | 0.6761 | 9.967449e-09 | 807 |
| 0.9266 | 0.6776 | 0.9487 | 0.6761 | 9.967368e-09 | 808 |
| 0.9280 | 0.6776 | 0.9487 | 0.6761 | 9.967287e-09 | 809 |
| 0.9273 | 0.6776 | 0.9487 | 0.6761 | 9.967207e-09 | 810 |
| 0.9220 | 0.6776 | 0.9486 | 0.6761 | 9.967126e-09 | 811 |
| 0.9303 | 0.6776 | 0.9485 | 0.6761 | 9.967045e-09 | 812 |
| 0.9309 | 0.6776 | 0.9485 | 0.6761 | 9.966964e-09 | 813 |
| 0.9222 | 0.6776 | 0.9484 | 0.6761 | 9.966883e-09 | 814 |
| 0.9198 | 0.6776 | 0.9484 | 0.6761 | 9.9668025e-09 | 815 |
| 0.9223 | 0.6776 | 0.9484 | 0.6761 | 9.966721e-09 | 816 |
| 0.9225 | 0.6776 | 0.9483 | 0.6761 | 9.966639e-09 | 817 |
| 0.9150 | 0.6776 | 0.9483 | 0.6761 | 9.966557e-09 | 818 |
| 0.9289 | 0.6776 | 0.9482 | 0.6761 | 9.966476e-09 | 819 |
| 0.9272 | 0.6776 | 0.9482 | 0.6761 | 9.966394e-09 | 820 |
| 0.9191 | 0.6776 | 0.9483 | 0.6761 | 9.966312e-09 | 821 |
| 0.9271 | 0.6776 | 0.9482 | 0.6761 | 9.9662305e-09 | 822 |
| 0.9136 | 0.6776 | 0.9482 | 0.6761 | 9.966149e-09 | 823 |
| 0.9227 | 0.6776 | 0.9481 | 0.6761 | 9.966067e-09 | 824 |
| 0.9297 | 0.6776 | 0.9480 | 0.6761 | 9.9659845e-09 | 825 |
| 0.9213 | 0.6776 | 0.9480 | 0.6761 | 9.965902e-09 | 826 |
| 0.9218 | 0.6776 | 0.9479 | 0.6761 | 9.965819e-09 | 827 |
| 0.9186 | 0.6776 | 0.9479 | 0.6761 | 9.965737e-09 | 828 |
| 0.9286 | 0.6776 | 0.9479 | 0.6761 | 9.965654e-09 | 829 |
| 0.9355 | 0.6776 | 0.9478 | 0.6761 | 9.9655715e-09 | 830 |
| 0.9264 | 0.6776 | 0.9478 | 0.6761 | 9.965489e-09 | 831 |
| 0.9218 | 0.6776 | 0.9477 | 0.6761 | 9.965406e-09 | 832 |
| 0.9312 | 0.6776 | 0.9476 | 0.6761 | 9.965324e-09 | 833 |
| 0.9155 | 0.6776 | 0.9476 | 0.6761 | 9.96524e-09 | 834 |
| 0.9244 | 0.6776 | 0.9476 | 0.6761 | 9.965157e-09 | 835 |
| 0.9234 | 0.6776 | 0.9476 | 0.6761 | 9.965073e-09 | 836 |
| 0.9359 | 0.6776 | 0.9475 | 0.6761 | 9.96499e-09 | 837 |
| 0.9310 | 0.6776 | 0.9475 | 0.6761 | 9.964906e-09 | 838 |
| 0.9238 | 0.6776 | 0.9474 | 0.6761 | 9.964823e-09 | 839 |
| 0.9289 | 0.6776 | 0.9474 | 0.6761 | 9.964739e-09 | 840 |
| 0.9223 | 0.6776 | 0.9473 | 0.6761 | 9.964656e-09 | 841 |
| 0.9323 | 0.6776 | 0.9473 | 0.6761 | 9.964572e-09 | 842 |
| 0.9291 | 0.6776 | 0.9473 | 0.6761 | 9.964488e-09 | 843 |
| 0.9327 | 0.6776 | 0.9472 | 0.6761 | 9.9644035e-09 | 844 |
| 0.9213 | 0.6776 | 0.9472 | 0.6761 | 9.964319e-09 | 845 |
| 0.9181 | 0.6776 | 0.9472 | 0.6761 | 9.964235e-09 | 846 |
| 0.9181 | 0.6776 | 0.9471 | 0.6761 | 9.96415e-09 | 847 |
| 0.9196 | 0.6776 | 0.9471 | 0.6761 | 9.964066e-09 | 848 |
| 0.9160 | 0.6776 | 0.9471 | 0.6761 | 9.963982e-09 | 849 |
| 0.9151 | 0.6776 | 0.9470 | 0.6761 | 9.963897e-09 | 850 |
| 0.9267 | 0.6776 | 0.9470 | 0.6761 | 9.963813e-09 | 851 |
| 0.9237 | 0.6776 | 0.9470 | 0.6761 | 9.963728e-09 | 852 |
| 0.9168 | 0.6776 | 0.9469 | 0.6761 | 9.963642e-09 | 853 |
| 0.9125 | 0.6776 | 0.9469 | 0.6761 | 9.963557e-09 | 854 |
| 0.9252 | 0.6776 | 0.9468 | 0.6761 | 9.963472e-09 | 855 |
| 0.9254 | 0.6776 | 0.9468 | 0.6761 | 9.963387e-09 | 856 |
| 0.9292 | 0.6776 | 0.9467 | 0.6761 | 9.963301e-09 | 857 |
| 0.9187 | 0.6776 | 0.9467 | 0.6761 | 9.963216e-09 | 858 |
| 0.9181 | 0.6776 | 0.9467 | 0.6761 | 9.963131e-09 | 859 |
| 0.9211 | 0.6776 | 0.9466 | 0.6761 | 9.9630455e-09 | 860 |
| 0.9206 | 0.6776 | 0.9466 | 0.6761 | 9.962959e-09 | 861 |
| 0.9183 | 0.6776 | 0.9465 | 0.6761 | 9.962873e-09 | 862 |
| 0.9181 | 0.6776 | 0.9465 | 0.6761 | 9.962787e-09 | 863 |
| 0.9210 | 0.6776 | 0.9464 | 0.6761 | 9.962701e-09 | 864 |
| 0.9219 | 0.6776 | 0.9464 | 0.6761 | 9.962615e-09 | 865 |
| 0.9251 | 0.6776 | 0.9464 | 0.6761 | 9.962529e-09 | 866 |
| 0.9147 | 0.6776 | 0.9463 | 0.6761 | 9.962442e-09 | 867 |
| 0.9277 | 0.6776 | 0.9462 | 0.6761 | 9.962356e-09 | 868 |
| 0.9283 | 0.6776 | 0.9462 | 0.6761 | 9.96227e-09 | 869 |
| 0.9168 | 0.6776 | 0.9462 | 0.6761 | 9.962183e-09 | 870 |
| 0.9212 | 0.6776 | 0.9462 | 0.6761 | 9.962096e-09 | 871 |
| 0.9173 | 0.6776 | 0.9461 | 0.6761 | 9.962009e-09 | 872 |
| 0.9250 | 0.6776 | 0.9461 | 0.6761 | 9.961922e-09 | 873 |
| 0.9177 | 0.6776 | 0.9461 | 0.6761 | 9.961835e-09 | 874 |
| 0.9142 | 0.6776 | 0.9460 | 0.6761 | 9.961748e-09 | 875 |
| 0.9231 | 0.6776 | 0.9460 | 0.6761 | 9.961661e-09 | 876 |
| 0.9173 | 0.6776 | 0.9460 | 0.6761 | 9.961574e-09 | 877 |
| 0.9223 | 0.6776 | 0.9459 | 0.6761 | 9.961487e-09 | 878 |
| 0.9212 | 0.6776 | 0.9459 | 0.6761 | 9.961399e-09 | 879 |
| 0.9209 | 0.6776 | 0.9459 | 0.6761 | 9.961311e-09 | 880 |
| 0.9173 | 0.6776 | 0.9459 | 0.6761 | 9.961223e-09 | 881 |
| 0.9184 | 0.6776 | 0.9458 | 0.6761 | 9.961135e-09 | 882 |
| 0.9156 | 0.6776 | 0.9457 | 0.6761 | 9.961047e-09 | 883 |
| 0.9144 | 0.6776 | 0.9457 | 0.6761 | 9.960959e-09 | 884 |
| 0.9230 | 0.6776 | 0.9457 | 0.6761 | 9.960871e-09 | 885 |
| 0.9253 | 0.6776 | 0.9456 | 0.6761 | 9.960783e-09 | 886 |
| 0.9216 | 0.6776 | 0.9456 | 0.6761 | 9.960695e-09 | 887 |
| 0.9162 | 0.6776 | 0.9456 | 0.6761 | 9.960607e-09 | 888 |
| 0.9157 | 0.6776 | 0.9455 | 0.6761 | 9.960518e-09 | 889 |
| 0.9162 | 0.6776 | 0.9455 | 0.6761 | 9.960429e-09 | 890 |
| 0.9124 | 0.6776 | 0.9454 | 0.6761 | 9.96034e-09 | 891 |
| 0.9181 | 0.6776 | 0.9454 | 0.6761 | 9.960251e-09 | 892 |
| 0.9221 | 0.6776 | 0.9454 | 0.6761 | 9.9601625e-09 | 893 |
| 0.9197 | 0.6776 | 0.9454 | 0.6761 | 9.960074e-09 | 894 |
| 0.9240 | 0.6776 | 0.9454 | 0.6761 | 9.959985e-09 | 895 |
| 0.9183 | 0.6776 | 0.9453 | 0.6761 | 9.959896e-09 | 896 |
| 0.9225 | 0.6776 | 0.9453 | 0.6761 | 9.959806e-09 | 897 |
| 0.9179 | 0.6776 | 0.9452 | 0.6761 | 9.959717e-09 | 898 |
| 0.9116 | 0.6776 | 0.9452 | 0.6761 | 9.959627e-09 | 899 |
| 0.9179 | 0.6776 | 0.9451 | 0.6761 | 9.959537e-09 | 900 |
| 0.9216 | 0.6776 | 0.9451 | 0.6761 | 9.9594475e-09 | 901 |
| 0.9225 | 0.6776 | 0.9451 | 0.6761 | 9.959358e-09 | 902 |
| 0.9251 | 0.6776 | 0.9451 | 0.6761 | 9.959268e-09 | 903 |
| 0.9139 | 0.6776 | 0.9450 | 0.6761 | 9.959178e-09 | 904 |
| 0.9314 | 0.6776 | 0.9450 | 0.6761 | 9.959089e-09 | 905 |
| 0.9220 | 0.6776 | 0.9449 | 0.6761 | 9.958998e-09 | 906 |
| 0.9211 | 0.6776 | 0.9449 | 0.6761 | 9.9589075e-09 | 907 |
| 0.9191 | 0.6776 | 0.9449 | 0.6761 | 9.958817e-09 | 908 |
| 0.9175 | 0.6776 | 0.9448 | 0.6761 | 9.958726e-09 | 909 |
| 0.9154 | 0.6776 | 0.9448 | 0.6761 | 9.958636e-09 | 910 |
| 0.9253 | 0.6776 | 0.9448 | 0.6761 | 9.958545e-09 | 911 |
| 0.9160 | 0.6776 | 0.9448 | 0.6761 | 9.9584545e-09 | 912 |
| 0.9290 | 0.6776 | 0.9447 | 0.6761 | 9.958364e-09 | 913 |
| 0.9152 | 0.6776 | 0.9446 | 0.6761 | 9.958273e-09 | 914 |
| 0.9273 | 0.6776 | 0.9446 | 0.6761 | 9.958182e-09 | 915 |
| 0.9065 | 0.6776 | 0.9446 | 0.6761 | 9.95809e-09 | 916 |
| 0.9147 | 0.6776 | 0.9445 | 0.6761 | 9.957999e-09 | 917 |
| 0.9091 | 0.6776 | 0.9445 | 0.6761 | 9.957907e-09 | 918 |
| 0.9175 | 0.6776 | 0.9444 | 0.6761 | 9.957816e-09 | 919 |
| 0.9242 | 0.6776 | 0.9444 | 0.6761 | 9.957724e-09 | 920 |
| 0.9269 | 0.6776 | 0.9444 | 0.6761 | 9.957633e-09 | 921 |
| 0.9117 | 0.6776 | 0.9444 | 0.6761 | 9.9575415e-09 | 922 |
| 0.9167 | 0.6776 | 0.9444 | 0.6761 | 9.95745e-09 | 923 |
| 0.9228 | 0.6776 | 0.9444 | 0.6761 | 9.957358e-09 | 924 |
| 0.9186 | 0.6776 | 0.9443 | 0.6761 | 9.957265e-09 | 925 |
| 0.9156 | 0.6776 | 0.9443 | 0.6761 | 9.957173e-09 | 926 |
| 0.9130 | 0.6776 | 0.9442 | 0.6761 | 9.9570805e-09 | 927 |
| 0.9200 | 0.6776 | 0.9441 | 0.6761 | 9.956988e-09 | 928 |
| 0.9159 | 0.6776 | 0.9441 | 0.6761 | 9.956896e-09 | 929 |
| 0.9267 | 0.6776 | 0.9441 | 0.6761 | 9.956803e-09 | 930 |
| 0.9218 | 0.6776 | 0.9440 | 0.6761 | 9.956711e-09 | 931 |
| 0.9167 | 0.6776 | 0.9440 | 0.6761 | 9.956619e-09 | 932 |
| 0.9228 | 0.6776 | 0.9439 | 0.6761 | 9.956525e-09 | 933 |
| 0.9056 | 0.6776 | 0.9439 | 0.6761 | 9.956432e-09 | 934 |
| 0.9115 | 0.6776 | 0.9438 | 0.6761 | 9.956339e-09 | 935 |
| 0.9314 | 0.6776 | 0.9437 | 0.6761 | 9.956246e-09 | 936 |
| 0.9182 | 0.6776 | 0.9437 | 0.6761 | 9.956152e-09 | 937 |
| 0.9169 | 0.6776 | 0.9437 | 0.6761 | 9.956059e-09 | 938 |
| 0.9060 | 0.6776 | 0.9436 | 0.6761 | 9.955966e-09 | 939 |
| 0.9156 | 0.6776 | 0.9435 | 0.6761 | 9.955873e-09 | 940 |
| 0.9196 | 0.6776 | 0.9435 | 0.6761 | 9.955779e-09 | 941 |
| 0.9208 | 0.6776 | 0.9434 | 0.6761 | 9.955685e-09 | 942 |
| 0.9043 | 0.6776 | 0.9434 | 0.6761 | 9.955591e-09 | 943 |
| 0.9169 | 0.6776 | 0.9433 | 0.6761 | 9.955497e-09 | 944 |
| 0.9171 | 0.6776 | 0.9433 | 0.6761 | 9.955403e-09 | 945 |
| 0.9203 | 0.6776 | 0.9433 | 0.6761 | 9.955309e-09 | 946 |
| 0.9209 | 0.6776 | 0.9433 | 0.6761 | 9.955214e-09 | 947 |
| 0.9250 | 0.6776 | 0.9432 | 0.6761 | 9.95512e-09 | 948 |
| 0.9218 | 0.6776 | 0.9432 | 0.6761 | 9.955026e-09 | 949 |
| 0.9124 | 0.6776 | 0.9432 | 0.6761 | 9.954932e-09 | 950 |
| 0.9076 | 0.6776 | 0.9432 | 0.6761 | 9.954837e-09 | 951 |
| 0.9154 | 0.6776 | 0.9431 | 0.6761 | 9.954742e-09 | 952 |
| 0.9106 | 0.6776 | 0.9431 | 0.6761 | 9.954647e-09 | 953 |
| 0.9123 | 0.6776 | 0.9430 | 0.6761 | 9.954552e-09 | 954 |
| 0.9142 | 0.6776 | 0.9430 | 0.6761 | 9.954457e-09 | 955 |
| 0.9117 | 0.6776 | 0.9429 | 0.6761 | 9.954362e-09 | 956 |
| 0.9136 | 0.6776 | 0.9429 | 0.6761 | 9.954267e-09 | 957 |
| 0.9127 | 0.6776 | 0.9428 | 0.6761 | 9.954172e-09 | 958 |
| 0.9022 | 0.6776 | 0.9428 | 0.6761 | 9.954077e-09 | 959 |
| 0.9206 | 0.6776 | 0.9427 | 0.6761 | 9.953981e-09 | 960 |
| 0.9147 | 0.6776 | 0.9426 | 0.6761 | 9.953885e-09 | 961 |
| 0.9134 | 0.6776 | 0.9426 | 0.6761 | 9.953789e-09 | 962 |
| 0.9204 | 0.6776 | 0.9425 | 0.6761 | 9.953693e-09 | 963 |
| 0.9055 | 0.6776 | 0.9425 | 0.6761 | 9.953597e-09 | 964 |
| 0.9146 | 0.6776 | 0.9424 | 0.6761 | 9.953501e-09 | 965 |
| 0.9129 | 0.6776 | 0.9424 | 0.6761 | 9.953405e-09 | 966 |
| 0.9197 | 0.6776 | 0.9423 | 0.6761 | 9.953309e-09 | 967 |
| 0.9140 | 0.6776 | 0.9423 | 0.6761 | 9.953213e-09 | 968 |
| 0.9079 | 0.6776 | 0.9423 | 0.6761 | 9.9531166e-09 | 969 |
| 0.9134 | 0.6776 | 0.9422 | 0.6761 | 9.95302e-09 | 970 |
| 0.9144 | 0.6776 | 0.9421 | 0.6761 | 9.952923e-09 | 971 |
| 0.9203 | 0.6776 | 0.9421 | 0.6761 | 9.952826e-09 | 972 |
| 0.9071 | 0.6776 | 0.9421 | 0.6761 | 9.952729e-09 | 973 |
| 0.9214 | 0.6776 | 0.9421 | 0.6761 | 9.9526325e-09 | 974 |
| 0.9184 | 0.6776 | 0.9421 | 0.6761 | 9.952536e-09 | 975 |
| 0.9114 | 0.6776 | 0.9420 | 0.6761 | 9.952439e-09 | 976 |
| 0.9094 | 0.6776 | 0.9420 | 0.6761 | 9.952342e-09 | 977 |
| 0.9195 | 0.6776 | 0.9419 | 0.6761 | 9.952244e-09 | 978 |
| 0.9166 | 0.6776 | 0.9418 | 0.6761 | 9.952147e-09 | 979 |
| 0.9218 | 0.6776 | 0.9418 | 0.6761 | 9.952049e-09 | 980 |
| 0.9112 | 0.6776 | 0.9418 | 0.6761 | 9.951951e-09 | 981 |
| 0.9267 | 0.6776 | 0.9417 | 0.6761 | 9.951854e-09 | 982 |
| 0.9155 | 0.6776 | 0.9417 | 0.6761 | 9.951756e-09 | 983 |
| 0.9054 | 0.6776 | 0.9416 | 0.6761 | 9.951658e-09 | 984 |
| 0.9067 | 0.6776 | 0.9416 | 0.6761 | 9.9515605e-09 | 985 |
| 0.9257 | 0.6776 | 0.9415 | 0.6761 | 9.951463e-09 | 986 |
| 0.9202 | 0.6776 | 0.9415 | 0.6761 | 9.951364e-09 | 987 |
| 0.9100 | 0.6776 | 0.9414 | 0.6761 | 9.951266e-09 | 988 |
| 0.9130 | 0.6776 | 0.9414 | 0.6761 | 9.951167e-09 | 989 |
| 0.9054 | 0.6776 | 0.9413 | 0.6761 | 9.951068e-09 | 990 |
| 0.9093 | 0.6776 | 0.9413 | 0.6761 | 9.95097e-09 | 991 |
| 0.9209 | 0.6776 | 0.9412 | 0.6761 | 9.950871e-09 | 992 |
| 0.9130 | 0.6776 | 0.9412 | 0.6761 | 9.950773e-09 | 993 |
| 0.9246 | 0.6776 | 0.9412 | 0.6761 | 9.950674e-09 | 994 |
| 0.9142 | 0.6776 | 0.9412 | 0.6761 | 9.9505755e-09 | 995 |
| 0.9165 | 0.6776 | 0.9411 | 0.6761 | 9.950476e-09 | 996 |
| 0.9144 | 0.6776 | 0.9411 | 0.6761 | 9.9503765e-09 | 997 |
| 0.9106 | 0.6776 | 0.9411 | 0.6761 | 9.950277e-09 | 998 |
| 0.9026 | 0.6776 | 0.9410 | 0.6761 | 9.950178e-09 | 999 |
| 0.9217 | 0.6776 | 0.9410 | 0.6761 | 9.950078e-09 | 1000 |
| 0.9179 | 0.6776 | 0.9410 | 0.6761 | 9.949979e-09 | 1001 |
| 0.9161 | 0.6776 | 0.9409 | 0.6761 | 9.949879e-09 | 1002 |
| 0.9193 | 0.6776 | 0.9409 | 0.6761 | 9.94978e-09 | 1003 |
| 0.9103 | 0.6776 | 0.9409 | 0.6761 | 9.94968e-09 | 1004 |
| 0.9109 | 0.6776 | 0.9408 | 0.6761 | 9.94958e-09 | 1005 |
| 0.9125 | 0.6776 | 0.9408 | 0.6761 | 9.9494795e-09 | 1006 |
| 0.9207 | 0.6776 | 0.9408 | 0.6761 | 9.949379e-09 | 1007 |
| 0.9085 | 0.6776 | 0.9408 | 0.6761 | 9.949279e-09 | 1008 |
| 0.9109 | 0.6776 | 0.9407 | 0.6761 | 9.949178e-09 | 1009 |
| 0.9090 | 0.6776 | 0.9407 | 0.6761 | 9.949078e-09 | 1010 |
| 0.9060 | 0.6776 | 0.9406 | 0.6761 | 9.948978e-09 | 1011 |
| 0.9073 | 0.6776 | 0.9406 | 0.6761 | 9.948877e-09 | 1012 |
| 0.9097 | 0.6776 | 0.9406 | 0.6761 | 9.948777e-09 | 1013 |
| 0.9048 | 0.6776 | 0.9405 | 0.6761 | 9.948676e-09 | 1014 |
| 0.9057 | 0.6776 | 0.9404 | 0.6761 | 9.948574e-09 | 1015 |
| 0.9131 | 0.6776 | 0.9404 | 0.6761 | 9.948473e-09 | 1016 |
| 0.9158 | 0.6776 | 0.9403 | 0.6761 | 9.948372e-09 | 1017 |
| 0.9119 | 0.6776 | 0.9403 | 0.6761 | 9.948271e-09 | 1018 |
| 0.9202 | 0.6776 | 0.9403 | 0.6761 | 9.948169e-09 | 1019 |
| 0.9126 | 0.6776 | 0.9403 | 0.6761 | 9.948068e-09 | 1020 |
| 0.9063 | 0.6776 | 0.9402 | 0.6761 | 9.947967e-09 | 1021 |
| 0.9055 | 0.6776 | 0.9402 | 0.6761 | 9.947866e-09 | 1022 |
| 0.9168 | 0.6776 | 0.9401 | 0.6761 | 9.9477635e-09 | 1023 |
| 0.9170 | 0.6776 | 0.9400 | 0.6761 | 9.947661e-09 | 1024 |
| 0.9121 | 0.6776 | 0.9399 | 0.6761 | 9.947559e-09 | 1025 |
| 0.9155 | 0.6776 | 0.9399 | 0.6761 | 9.947457e-09 | 1026 |
| 0.9097 | 0.6776 | 0.9398 | 0.6761 | 9.947355e-09 | 1027 |
| 0.9073 | 0.6776 | 0.9398 | 0.6761 | 9.947253e-09 | 1028 |
| 0.9123 | 0.6776 | 0.9397 | 0.6761 | 9.947151e-09 | 1029 |
| 0.8952 | 0.6776 | 0.9397 | 0.6761 | 9.9470485e-09 | 1030 |
| 0.9117 | 0.6776 | 0.9397 | 0.6761 | 9.946946e-09 | 1031 |
| 0.9115 | 0.6776 | 0.9397 | 0.6761 | 9.946843e-09 | 1032 |
| 0.9047 | 0.6776 | 0.9396 | 0.6761 | 9.94674e-09 | 1033 |
| 0.9027 | 0.6776 | 0.9395 | 0.6761 | 9.946637e-09 | 1034 |
| 0.9098 | 0.6776 | 0.9394 | 0.6761 | 9.946534e-09 | 1035 |
| 0.9090 | 0.6776 | 0.9394 | 0.6761 | 9.946431e-09 | 1036 |
| 0.9205 | 0.6776 | 0.9394 | 0.6761 | 9.946328e-09 | 1037 |
| 0.9045 | 0.6776 | 0.9393 | 0.6761 | 9.946225e-09 | 1038 |
| 0.9069 | 0.6776 | 0.9393 | 0.6761 | 9.946122e-09 | 1039 |
| 0.9089 | 0.6776 | 0.9393 | 0.6761 | 9.946019e-09 | 1040 |
| 0.9147 | 0.6776 | 0.9392 | 0.6761 | 9.945915e-09 | 1041 |
| 0.9043 | 0.6776 | 0.9392 | 0.6761 | 9.945811e-09 | 1042 |
| 0.9134 | 0.6776 | 0.9392 | 0.6761 | 9.945707e-09 | 1043 |
| 0.9071 | 0.6776 | 0.9391 | 0.6761 | 9.9456035e-09 | 1044 |
| 0.9105 | 0.6776 | 0.9391 | 0.6761 | 9.9454995e-09 | 1045 |
| 0.9033 | 0.6776 | 0.9391 | 0.6761 | 9.945396e-09 | 1046 |
| 0.9119 | 0.6776 | 0.9391 | 0.6761 | 9.945292e-09 | 1047 |
| 0.9111 | 0.6776 | 0.9391 | 0.6761 | 9.945188e-09 | 1048 |
| 0.8984 | 0.6776 | 0.9391 | 0.6761 | 9.945084e-09 | 1049 |
| 0.9092 | 0.6776 | 0.9390 | 0.6761 | 9.944979e-09 | 1050 |
| 0.9097 | 0.6776 | 0.9390 | 0.6761 | 9.944874e-09 | 1051 |
| 0.9027 | 0.6776 | 0.9390 | 0.6761 | 9.9447695e-09 | 1052 |
| 0.9057 | 0.6776 | 0.9389 | 0.6761 | 9.944665e-09 | 1053 |
| 0.9052 | 0.6776 | 0.9389 | 0.6761 | 9.94456e-09 | 1054 |
| 0.9121 | 0.6776 | 0.9389 | 0.6761 | 9.944455e-09 | 1055 |
| 0.9148 | 0.6776 | 0.9388 | 0.6761 | 9.94435e-09 | 1056 |
| 0.9171 | 0.6776 | 0.9388 | 0.6761 | 9.944245e-09 | 1057 |
| 0.9056 | 0.6776 | 0.9388 | 0.6761 | 9.944141e-09 | 1058 |
| 0.9133 | 0.6776 | 0.9388 | 0.6761 | 9.944035e-09 | 1059 |
| 0.9091 | 0.6776 | 0.9387 | 0.6761 | 9.943929e-09 | 1060 |
| 0.9053 | 0.6776 | 0.9387 | 0.6761 | 9.9438235e-09 | 1061 |
| 0.9065 | 0.6776 | 0.9386 | 0.6761 | 9.943718e-09 | 1062 |
| 0.9067 | 0.6776 | 0.9386 | 0.6761 | 9.943612e-09 | 1063 |
| 0.9047 | 0.6776 | 0.9386 | 0.6761 | 9.9435065e-09 | 1064 |
| 0.9039 | 0.6776 | 0.9385 | 0.6761 | 9.943401e-09 | 1065 |
| 0.9065 | 0.6776 | 0.9385 | 0.6761 | 9.943295e-09 | 1066 |
| 0.9034 | 0.6776 | 0.9385 | 0.6761 | 9.943189e-09 | 1067 |
| 0.9051 | 0.6776 | 0.9384 | 0.6761 | 9.943083e-09 | 1068 |
| 0.9001 | 0.6776 | 0.9384 | 0.6761 | 9.942976e-09 | 1069 |
| 0.9145 | 0.6776 | 0.9383 | 0.6761 | 9.94287e-09 | 1070 |
| 0.9113 | 0.6776 | 0.9383 | 0.6761 | 9.942763e-09 | 1071 |
| 0.9052 | 0.6776 | 0.9382 | 0.6761 | 9.9426565e-09 | 1072 |
| 0.9059 | 0.6776 | 0.9382 | 0.6761 | 9.94255e-09 | 1073 |
| 0.9157 | 0.6776 | 0.9382 | 0.6761 | 9.942443e-09 | 1074 |
| 0.9075 | 0.6776 | 0.9381 | 0.6761 | 9.942337e-09 | 1075 |
| 0.9023 | 0.6776 | 0.9380 | 0.6761 | 9.94223e-09 | 1076 |
| 0.9036 | 0.6776 | 0.9380 | 0.6761 | 9.942123e-09 | 1077 |
| 0.9033 | 0.6776 | 0.9380 | 0.6761 | 9.942015e-09 | 1078 |
| 0.9055 | 0.6776 | 0.9379 | 0.6761 | 9.941908e-09 | 1079 |
| 0.9100 | 0.6776 | 0.9379 | 0.6761 | 9.9418e-09 | 1080 |
| 0.8961 | 0.6776 | 0.9378 | 0.6761 | 9.941693e-09 | 1081 |
| 0.9052 | 0.6776 | 0.9378 | 0.6761 | 9.941585e-09 | 1082 |
| 0.9043 | 0.6776 | 0.9378 | 0.6761 | 9.941478e-09 | 1083 |
| 0.9017 | 0.6776 | 0.9377 | 0.6761 | 9.94137e-09 | 1084 |
| 0.8999 | 0.6776 | 0.9377 | 0.6761 | 9.941263e-09 | 1085 |
| 0.9031 | 0.6776 | 0.9377 | 0.6761 | 9.941155e-09 | 1086 |
| 0.9057 | 0.6776 | 0.9377 | 0.6761 | 9.941046e-09 | 1087 |
| 0.9026 | 0.6776 | 0.9376 | 0.6761 | 9.940938e-09 | 1088 |
| 0.9068 | 0.6776 | 0.9376 | 0.6761 | 9.9408295e-09 | 1089 |
| 0.9026 | 0.6776 | 0.9376 | 0.6761 | 9.940721e-09 | 1090 |
| 0.9030 | 0.6776 | 0.9375 | 0.6761 | 9.940613e-09 | 1091 |
| 0.9066 | 0.6776 | 0.9375 | 0.6761 | 9.940504e-09 | 1092 |
| 0.9161 | 0.6776 | 0.9375 | 0.6761 | 9.940396e-09 | 1093 |
| 0.9055 | 0.6776 | 0.9374 | 0.6761 | 9.940288e-09 | 1094 |
| 0.9106 | 0.6776 | 0.9374 | 0.6761 | 9.9401785e-09 | 1095 |
| 0.9116 | 0.6776 | 0.9374 | 0.6761 | 9.940069e-09 | 1096 |
| 0.9028 | 0.6776 | 0.9373 | 0.6761 | 9.93996e-09 | 1097 |
| 0.9061 | 0.6776 | 0.9373 | 0.6761 | 9.939851e-09 | 1098 |
| 0.9030 | 0.6776 | 0.9373 | 0.6761 | 9.9397415e-09 | 1099 |
| 0.8962 | 0.6776 | 0.9372 | 0.6761 | 9.939632e-09 | 1100 |
| 0.9123 | 0.6776 | 0.9371 | 0.6761 | 9.939523e-09 | 1101 |
| 0.9054 | 0.6776 | 0.9371 | 0.6761 | 9.939414e-09 | 1102 |
| 0.9002 | 0.6776 | 0.9370 | 0.6761 | 9.9393045e-09 | 1103 |
| 0.9125 | 0.6776 | 0.9370 | 0.6761 | 9.939194e-09 | 1104 |
| 0.9063 | 0.6776 | 0.9370 | 0.6761 | 9.939084e-09 | 1105 |
| 0.9021 | 0.6776 | 0.9369 | 0.6761 | 9.938974e-09 | 1106 |
| 0.9081 | 0.6776 | 0.9369 | 0.6761 | 9.938864e-09 | 1107 |
| 0.9005 | 0.6776 | 0.9368 | 0.6761 | 9.938754e-09 | 1108 |
| 0.9077 | 0.6776 | 0.9368 | 0.6761 | 9.938644e-09 | 1109 |
| 0.8992 | 0.6776 | 0.9368 | 0.6761 | 9.9385336e-09 | 1110 |
| 0.9107 | 0.6776 | 0.9367 | 0.6761 | 9.938423e-09 | 1111 |
| 0.9090 | 0.6776 | 0.9366 | 0.6761 | 9.938313e-09 | 1112 |
| 0.9055 | 0.6776 | 0.9366 | 0.6761 | 9.938202e-09 | 1113 |
| 0.8947 | 0.6776 | 0.9366 | 0.6761 | 9.938091e-09 | 1114 |
| 0.8961 | 0.6776 | 0.9365 | 0.6761 | 9.93798e-09 | 1115 |
| 0.9039 | 0.6776 | 0.9365 | 0.6761 | 9.937869e-09 | 1116 |
| 0.9044 | 0.6776 | 0.9365 | 0.6761 | 9.937758e-09 | 1117 |
| 0.9057 | 0.6776 | 0.9365 | 0.6761 | 9.937647e-09 | 1118 |
| 0.8984 | 0.6776 | 0.9365 | 0.6761 | 9.937536e-09 | 1119 |
| 0.8990 | 0.6776 | 0.9364 | 0.6761 | 9.937425e-09 | 1120 |
| 0.9085 | 0.6776 | 0.9364 | 0.6761 | 9.937314e-09 | 1121 |
| 0.9042 | 0.6776 | 0.9364 | 0.6761 | 9.937202e-09 | 1122 |
| 0.8934 | 0.6776 | 0.9363 | 0.6761 | 9.93709e-09 | 1123 |
| 0.9091 | 0.6776 | 0.9363 | 0.6761 | 9.936978e-09 | 1124 |
| 0.9133 | 0.6776 | 0.9363 | 0.6761 | 9.936866e-09 | 1125 |
| 0.9032 | 0.6776 | 0.9363 | 0.6761 | 9.9367545e-09 | 1126 |
| 0.9045 | 0.6776 | 0.9362 | 0.6761 | 9.936643e-09 | 1127 |
| 0.9044 | 0.6776 | 0.9362 | 0.6761 | 9.936531e-09 | 1128 |
| 0.9048 | 0.6776 | 0.9362 | 0.6761 | 9.936419e-09 | 1129 |
| 0.9014 | 0.6776 | 0.9362 | 0.6761 | 9.936307e-09 | 1130 |
| 0.9082 | 0.6776 | 0.9361 | 0.6761 | 9.936194e-09 | 1131 |
| 0.9032 | 0.6776 | 0.9361 | 0.6761 | 9.936081e-09 | 1132 |
| 0.9065 | 0.6776 | 0.9360 | 0.6761 | 9.9359685e-09 | 1133 |
| 0.8989 | 0.6776 | 0.9360 | 0.6761 | 9.935856e-09 | 1134 |
| 0.9077 | 0.6776 | 0.9360 | 0.6761 | 9.935743e-09 | 1135 |
| 0.8987 | 0.6776 | 0.9359 | 0.6761 | 9.93563e-09 | 1136 |
| 0.8963 | 0.6776 | 0.9359 | 0.6761 | 9.935517e-09 | 1137 |
| 0.9086 | 0.6776 | 0.9359 | 0.6761 | 9.9354045e-09 | 1138 |
| 0.9023 | 0.6776 | 0.9358 | 0.6761 | 9.935292e-09 | 1139 |
| 0.8930 | 0.6776 | 0.9358 | 0.6761 | 9.935178e-09 | 1140 |
| 0.9021 | 0.6776 | 0.9357 | 0.6761 | 9.935064e-09 | 1141 |
| 0.9007 | 0.6776 | 0.9357 | 0.6761 | 9.934951e-09 | 1142 |
| 0.9062 | 0.6776 | 0.9357 | 0.6761 | 9.934837e-09 | 1143 |
| 0.9097 | 0.6776 | 0.9357 | 0.6761 | 9.934723e-09 | 1144 |
| 0.8975 | 0.6776 | 0.9357 | 0.6761 | 9.93461e-09 | 1145 |
| 0.8971 | 0.6776 | 0.9356 | 0.6761 | 9.934496e-09 | 1146 |
| 0.8989 | 0.6776 | 0.9356 | 0.6761 | 9.934382e-09 | 1147 |
| 0.9030 | 0.6776 | 0.9355 | 0.6761 | 9.9342685e-09 | 1148 |
| 0.8989 | 0.6776 | 0.9355 | 0.6761 | 9.934154e-09 | 1149 |
| 0.8932 | 0.6776 | 0.9354 | 0.6761 | 9.934039e-09 | 1150 |
| 0.8967 | 0.6776 | 0.9354 | 0.6761 | 9.933925e-09 | 1151 |
| 0.9096 | 0.6776 | 0.9353 | 0.6761 | 9.93381e-09 | 1152 |
| 0.9086 | 0.6776 | 0.9353 | 0.6761 | 9.933696e-09 | 1153 |
| 0.8899 | 0.6776 | 0.9353 | 0.6761 | 9.933581e-09 | 1154 |
| 0.9088 | 0.6776 | 0.9353 | 0.6761 | 9.9334665e-09 | 1155 |
| 0.9054 | 0.6776 | 0.9353 | 0.6761 | 9.933352e-09 | 1156 |
| 0.9072 | 0.6776 | 0.9352 | 0.6761 | 9.933237e-09 | 1157 |
| 0.8917 | 0.6776 | 0.9352 | 0.6761 | 9.933122e-09 | 1158 |
| 0.8970 | 0.6776 | 0.9352 | 0.6761 | 9.933006e-09 | 1159 |
| 0.9011 | 0.6776 | 0.9351 | 0.6761 | 9.932891e-09 | 1160 |
| 0.8998 | 0.6776 | 0.9350 | 0.6761 | 9.9327755e-09 | 1161 |
| 0.9087 | 0.6776 | 0.9350 | 0.6761 | 9.93266e-09 | 1162 |
| 0.8982 | 0.6776 | 0.9349 | 0.6761 | 9.932545e-09 | 1163 |
| 0.9109 | 0.6776 | 0.9349 | 0.6761 | 9.932429e-09 | 1164 |
| 0.9079 | 0.6776 | 0.9349 | 0.6761 | 9.932314e-09 | 1165 |
| 0.9030 | 0.6776 | 0.9349 | 0.6761 | 9.932198e-09 | 1166 |
| 0.9018 | 0.6776 | 0.9348 | 0.6761 | 9.932082e-09 | 1167 |
| 0.8984 | 0.6776 | 0.9348 | 0.6761 | 9.9319655e-09 | 1168 |
| 0.8980 | 0.6776 | 0.9347 | 0.6761 | 9.931849e-09 | 1169 |
| 0.8946 | 0.6776 | 0.9347 | 0.6761 | 9.931733e-09 | 1170 |
| 0.9080 | 0.6776 | 0.9346 | 0.6761 | 9.931616e-09 | 1171 |
| 0.9064 | 0.6776 | 0.9346 | 0.6761 | 9.9315e-09 | 1172 |
| 0.9087 | 0.6776 | 0.9345 | 0.6761 | 9.931384e-09 | 1173 |
| 0.9041 | 0.6776 | 0.9345 | 0.6761 | 9.931267e-09 | 1174 |
| 0.8988 | 0.6776 | 0.9344 | 0.6761 | 9.931151e-09 | 1175 |
| 0.9119 | 0.6776 | 0.9344 | 0.6761 | 9.931035e-09 | 1176 |
| 0.9033 | 0.6776 | 0.9344 | 0.6761 | 9.930917e-09 | 1177 |
| 0.9042 | 0.6776 | 0.9343 | 0.6761 | 9.9308e-09 | 1178 |
| 0.9044 | 0.6776 | 0.9343 | 0.6761 | 9.930683e-09 | 1179 |
| 0.8964 | 0.6776 | 0.9342 | 0.6761 | 9.930566e-09 | 1180 |
| 0.8963 | 0.6776 | 0.9342 | 0.6761 | 9.9304485e-09 | 1181 |
| 0.9015 | 0.6776 | 0.9342 | 0.6761 | 9.930331e-09 | 1182 |
| 0.8996 | 0.6776 | 0.9342 | 0.6761 | 9.930214e-09 | 1183 |
| 0.8986 | 0.6776 | 0.9341 | 0.6761 | 9.930097e-09 | 1184 |
| 0.9005 | 0.6776 | 0.9341 | 0.6761 | 9.9299795e-09 | 1185 |
| 0.8975 | 0.6776 | 0.9340 | 0.6761 | 9.929861e-09 | 1186 |
| 0.9065 | 0.6776 | 0.9340 | 0.6761 | 9.929743e-09 | 1187 |
| 0.9050 | 0.6776 | 0.9339 | 0.6761 | 9.929625e-09 | 1188 |
| 0.8887 | 0.6776 | 0.9338 | 0.6761 | 9.929507e-09 | 1189 |
| 0.8999 | 0.6776 | 0.9338 | 0.6761 | 9.929389e-09 | 1190 |
| 0.8985 | 0.6776 | 0.9337 | 0.6761 | 9.929271e-09 | 1191 |
| 0.9022 | 0.6776 | 0.9337 | 0.6761 | 9.929153e-09 | 1192 |
| 0.8938 | 0.6776 | 0.9336 | 0.6761 | 9.9290345e-09 | 1193 |
| 0.8969 | 0.6776 | 0.9336 | 0.6761 | 9.928916e-09 | 1194 |
| 0.9023 | 0.6776 | 0.9336 | 0.6761 | 9.928797e-09 | 1195 |
| 0.8909 | 0.6776 | 0.9335 | 0.6761 | 9.928678e-09 | 1196 |
| 0.8993 | 0.6776 | 0.9335 | 0.6761 | 9.928559e-09 | 1197 |
| 0.8900 | 0.6776 | 0.9334 | 0.6761 | 9.92844e-09 | 1198 |
| 0.8846 | 0.6776 | 0.9334 | 0.6761 | 9.928321e-09 | 1199 |
| 0.8967 | 0.6776 | 0.9334 | 0.6761 | 9.928202e-09 | 1200 |
| 0.8916 | 0.6776 | 0.9333 | 0.6761 | 9.928083e-09 | 1201 |
| 0.9082 | 0.6776 | 0.9332 | 0.6761 | 9.927964e-09 | 1202 |
| 0.9067 | 0.6776 | 0.9332 | 0.6761 | 9.927845e-09 | 1203 |
| 0.8969 | 0.6776 | 0.9331 | 0.6761 | 9.927725e-09 | 1204 |
| 0.8977 | 0.6776 | 0.9331 | 0.6761 | 9.927605e-09 | 1205 |
| 0.8978 | 0.6776 | 0.9330 | 0.6761 | 9.9274855e-09 | 1206 |
| 0.8920 | 0.6776 | 0.9330 | 0.6761 | 9.927366e-09 | 1207 |
| 0.8992 | 0.6776 | 0.9329 | 0.6761 | 9.927246e-09 | 1208 |
| 0.8927 | 0.6776 | 0.9329 | 0.6761 | 9.927126e-09 | 1209 |
| 0.8952 | 0.6776 | 0.9328 | 0.6761 | 9.927006e-09 | 1210 |
| 0.8957 | 0.6776 | 0.9328 | 0.6761 | 9.926886e-09 | 1211 |
| 0.8951 | 0.6776 | 0.9327 | 0.6761 | 9.926766e-09 | 1212 |
| 0.9016 | 0.6776 | 0.9327 | 0.6761 | 9.926645e-09 | 1213 |
| 0.8931 | 0.6776 | 0.9327 | 0.6761 | 9.9265245e-09 | 1214 |
| 0.9006 | 0.6776 | 0.9327 | 0.6761 | 9.926404e-09 | 1215 |
| 0.9188 | 0.6776 | 0.9326 | 0.6761 | 9.926283e-09 | 1216 |
| 0.8923 | 0.6776 | 0.9326 | 0.6761 | 9.926162e-09 | 1217 |
| 0.8985 | 0.6776 | 0.9325 | 0.6761 | 9.926041e-09 | 1218 |
| 0.8956 | 0.6776 | 0.9325 | 0.6761 | 9.9259205e-09 | 1219 |
| 0.9038 | 0.6800 | 0.9325 | 0.6761 | 9.9258e-09 | 1220 |
| 0.8942 | 0.6776 | 0.9324 | 0.6761 | 9.925679e-09 | 1221 |
| 0.8975 | 0.6776 | 0.9323 | 0.6761 | 9.925557e-09 | 1222 |
| 0.8996 | 0.6776 | 0.9323 | 0.6761 | 9.925436e-09 | 1223 |
| 0.8949 | 0.6753 | 0.9324 | 0.6761 | 9.925314e-09 | 1224 |
| 0.8988 | 0.6776 | 0.9323 | 0.6761 | 9.925192e-09 | 1225 |
| 0.8928 | 0.6776 | 0.9322 | 0.6761 | 9.9250705e-09 | 1226 |
| 0.8980 | 0.6776 | 0.9322 | 0.6761 | 9.924949e-09 | 1227 |
| 0.8869 | 0.6776 | 0.9322 | 0.6761 | 9.924827e-09 | 1228 |
| 0.8921 | 0.6776 | 0.9321 | 0.6761 | 9.9247055e-09 | 1229 |
| 0.8996 | 0.6776 | 0.9321 | 0.6761 | 9.924584e-09 | 1230 |
| 0.8974 | 0.6776 | 0.9321 | 0.6761 | 9.924461e-09 | 1231 |
| 0.9053 | 0.6776 | 0.9320 | 0.6761 | 9.924339e-09 | 1232 |
| 0.8934 | 0.6776 | 0.9320 | 0.6761 | 9.924216e-09 | 1233 |
| 0.9023 | 0.6776 | 0.9320 | 0.6761 | 9.9240935e-09 | 1234 |
| 0.9037 | 0.6776 | 0.9319 | 0.6761 | 9.923971e-09 | 1235 |
| 0.9004 | 0.6776 | 0.9318 | 0.6761 | 9.923848e-09 | 1236 |
| 0.8916 | 0.6776 | 0.9317 | 0.6761 | 9.923726e-09 | 1237 |
| 0.9011 | 0.6776 | 0.9317 | 0.6761 | 9.923603e-09 | 1238 |
| 0.8920 | 0.6776 | 0.9317 | 0.6761 | 9.923481e-09 | 1239 |
| 0.8932 | 0.6776 | 0.9316 | 0.6761 | 9.923357e-09 | 1240 |
| 0.8896 | 0.6776 | 0.9316 | 0.6761 | 9.923234e-09 | 1241 |
| 0.9012 | 0.6776 | 0.9315 | 0.6761 | 9.92311e-09 | 1242 |
| 0.8889 | 0.6776 | 0.9315 | 0.6761 | 9.922987e-09 | 1243 |
| 0.8961 | 0.6776 | 0.9315 | 0.6761 | 9.922863e-09 | 1244 |
| 0.9024 | 0.6776 | 0.9314 | 0.6761 | 9.92274e-09 | 1245 |
| 0.8967 | 0.6776 | 0.9314 | 0.6761 | 9.9226165e-09 | 1246 |
| 0.8904 | 0.6776 | 0.9313 | 0.6761 | 9.922493e-09 | 1247 |
| 0.8933 | 0.6776 | 0.9313 | 0.6761 | 9.92237e-09 | 1248 |
| 0.8977 | 0.6776 | 0.9312 | 0.6761 | 9.922245e-09 | 1249 |
| 0.8942 | 0.6776 | 0.9312 | 0.6761 | 9.922121e-09 | 1250 |
| 0.8983 | 0.6776 | 0.9312 | 0.6761 | 9.921997e-09 | 1251 |
| 0.9029 | 0.6776 | 0.9311 | 0.6761 | 9.921872e-09 | 1252 |
| 0.8966 | 0.6776 | 0.9310 | 0.6761 | 9.921748e-09 | 1253 |
| 0.8833 | 0.6776 | 0.9309 | 0.6761 | 9.9216235e-09 | 1254 |
| 0.9004 | 0.6776 | 0.9309 | 0.6761 | 9.921499e-09 | 1255 |
| 0.8937 | 0.6776 | 0.9309 | 0.6761 | 9.921375e-09 | 1256 |
| 0.8871 | 0.6776 | 0.9308 | 0.6761 | 9.9212505e-09 | 1257 |
| 0.8971 | 0.6776 | 0.9308 | 0.6761 | 9.921125e-09 | 1258 |
| 0.8997 | 0.6776 | 0.9307 | 0.6761 | 9.921e-09 | 1259 |
| 0.9031 | 0.6776 | 0.9306 | 0.6761 | 9.920875e-09 | 1260 |
| 0.8832 | 0.6776 | 0.9306 | 0.6761 | 9.9207496e-09 | 1261 |
| 0.8903 | 0.6800 | 0.9305 | 0.6761 | 9.920624e-09 | 1262 |
| 0.8881 | 0.6776 | 0.9306 | 0.6761 | 9.920499e-09 | 1263 |
| 0.8978 | 0.6776 | 0.9305 | 0.6761 | 9.920374e-09 | 1264 |
| 0.8993 | 0.6776 | 0.9304 | 0.6761 | 9.920249e-09 | 1265 |
| 0.9109 | 0.6776 | 0.9304 | 0.6761 | 9.920123e-09 | 1266 |
| 0.8927 | 0.6776 | 0.9303 | 0.6761 | 9.919997e-09 | 1267 |
| 0.8922 | 0.6776 | 0.9303 | 0.6761 | 9.919871e-09 | 1268 |
| 0.8920 | 0.6776 | 0.9303 | 0.6761 | 9.919745e-09 | 1269 |
| 0.8935 | 0.6776 | 0.9302 | 0.6761 | 9.919619e-09 | 1270 |
| 0.8986 | 0.6776 | 0.9302 | 0.6761 | 9.919493e-09 | 1271 |
| 0.8926 | 0.6776 | 0.9301 | 0.6761 | 9.919367e-09 | 1272 |
| 0.8973 | 0.6776 | 0.9301 | 0.6761 | 9.9192405e-09 | 1273 |
| 0.8902 | 0.6776 | 0.9301 | 0.6761 | 9.919114e-09 | 1274 |
| 0.8858 | 0.6776 | 0.9300 | 0.6761 | 9.918988e-09 | 1275 |
| 0.8993 | 0.6776 | 0.9300 | 0.6761 | 9.918862e-09 | 1276 |
| 0.8979 | 0.6776 | 0.9299 | 0.6761 | 9.918735e-09 | 1277 |
| 0.8886 | 0.6776 | 0.9299 | 0.6761 | 9.918608e-09 | 1278 |
| 0.8927 | 0.6776 | 0.9298 | 0.6761 | 9.918481e-09 | 1279 |
| 0.8849 | 0.6776 | 0.9298 | 0.6761 | 9.918354e-09 | 1280 |
| 0.8824 | 0.6800 | 0.9298 | 0.6761 | 9.918227e-09 | 1281 |
| 0.8964 | 0.6776 | 0.9297 | 0.6761 | 9.9181e-09 | 1282 |
| 0.8906 | 0.6776 | 0.9296 | 0.6761 | 9.917973e-09 | 1283 |
| 0.8881 | 0.6776 | 0.9296 | 0.6761 | 9.917846e-09 | 1284 |
| 0.8825 | 0.6776 | 0.9296 | 0.6761 | 9.917719e-09 | 1285 |
| 0.9026 | 0.6776 | 0.9295 | 0.6761 | 9.917591e-09 | 1286 |
| 0.8882 | 0.6776 | 0.9295 | 0.6761 | 9.917463e-09 | 1287 |
| 0.8889 | 0.6776 | 0.9294 | 0.6761 | 9.917335e-09 | 1288 |
| 0.8937 | 0.6776 | 0.9294 | 0.6761 | 9.9172075e-09 | 1289 |
| 0.8922 | 0.6776 | 0.9294 | 0.6761 | 9.91708e-09 | 1290 |
| 0.8960 | 0.6776 | 0.9294 | 0.6761 | 9.916952e-09 | 1291 |
| 0.8890 | 0.6776 | 0.9293 | 0.6761 | 9.916824e-09 | 1292 |
| 0.8966 | 0.6776 | 0.9293 | 0.6761 | 9.916696e-09 | 1293 |
| 0.9002 | 0.6776 | 0.9293 | 0.6761 | 9.916568e-09 | 1294 |
| 0.8894 | 0.6776 | 0.9292 | 0.6761 | 9.916439e-09 | 1295 |
| 0.8884 | 0.6776 | 0.9292 | 0.6761 | 9.91631e-09 | 1296 |
| 0.8913 | 0.6776 | 0.9291 | 0.6761 | 9.916182e-09 | 1297 |
| 0.8939 | 0.6776 | 0.9291 | 0.6761 | 9.916053e-09 | 1298 |
| 0.8912 | 0.6776 | 0.9291 | 0.6761 | 9.915924e-09 | 1299 |
| 0.8888 | 0.6776 | 0.9291 | 0.6761 | 9.915795e-09 | 1300 |
| 0.8746 | 0.6776 | 0.9290 | 0.6761 | 9.9156665e-09 | 1301 |
| 0.8895 | 0.6776 | 0.9289 | 0.6761 | 9.915538e-09 | 1302 |
| 0.8919 | 0.6776 | 0.9289 | 0.6761 | 9.915409e-09 | 1303 |
| 0.8967 | 0.6776 | 0.9289 | 0.6761 | 9.915279e-09 | 1304 |
| 0.8898 | 0.6776 | 0.9288 | 0.6761 | 9.91515e-09 | 1305 |
| 0.8790 | 0.6776 | 0.9288 | 0.6761 | 9.91502e-09 | 1306 |
| 0.8973 | 0.6776 | 0.9287 | 0.6761 | 9.91489e-09 | 1307 |
| 0.8832 | 0.6776 | 0.9286 | 0.6761 | 9.914761e-09 | 1308 |
| 0.8852 | 0.6776 | 0.9286 | 0.6761 | 9.914631e-09 | 1309 |
| 0.9032 | 0.6776 | 0.9286 | 0.6761 | 9.914501e-09 | 1310 |
| 0.8839 | 0.6776 | 0.9285 | 0.6761 | 9.9143715e-09 | 1311 |
| 0.8992 | 0.6776 | 0.9285 | 0.6761 | 9.914242e-09 | 1312 |
| 0.8812 | 0.6776 | 0.9285 | 0.6761 | 9.914111e-09 | 1313 |
| 0.8838 | 0.6776 | 0.9284 | 0.6761 | 9.913981e-09 | 1314 |
| 0.8874 | 0.6776 | 0.9284 | 0.6761 | 9.91385e-09 | 1315 |
| 0.8918 | 0.6776 | 0.9283 | 0.6761 | 9.91372e-09 | 1316 |
| 0.8823 | 0.6776 | 0.9282 | 0.6761 | 9.913589e-09 | 1317 |
| 0.8962 | 0.6776 | 0.9281 | 0.6761 | 9.9134585e-09 | 1318 |
| 0.8900 | 0.6776 | 0.9281 | 0.6761 | 9.913328e-09 | 1319 |
| 0.8871 | 0.6776 | 0.9280 | 0.6761 | 9.913197e-09 | 1320 |
| 0.8877 | 0.6776 | 0.9280 | 0.6761 | 9.913067e-09 | 1321 |
| 0.8912 | 0.6776 | 0.9279 | 0.6761 | 9.912935e-09 | 1322 |
| 0.8896 | 0.6776 | 0.9279 | 0.6761 | 9.912804e-09 | 1323 |
| 0.8842 | 0.6776 | 0.9278 | 0.6761 | 9.9126725e-09 | 1324 |
| 0.8871 | 0.6776 | 0.9278 | 0.6761 | 9.912541e-09 | 1325 |
| 0.8782 | 0.6776 | 0.9277 | 0.6761 | 9.91241e-09 | 1326 |
| 0.8883 | 0.6776 | 0.9277 | 0.6761 | 9.912278e-09 | 1327 |
| 0.8834 | 0.6776 | 0.9276 | 0.6761 | 9.912147e-09 | 1328 |
| 0.8918 | 0.6776 | 0.9276 | 0.6761 | 9.912015e-09 | 1329 |
| 0.8977 | 0.6776 | 0.9275 | 0.6761 | 9.911884e-09 | 1330 |
| 0.8913 | 0.6776 | 0.9275 | 0.6761 | 9.911751e-09 | 1331 |
| 0.8855 | 0.6776 | 0.9274 | 0.6761 | 9.911619e-09 | 1332 |
| 0.8959 | 0.6776 | 0.9274 | 0.6761 | 9.911487e-09 | 1333 |
| 0.8877 | 0.6776 | 0.9274 | 0.6761 | 9.911354e-09 | 1334 |
| 0.8910 | 0.6776 | 0.9273 | 0.6761 | 9.911222e-09 | 1335 |
| 0.8985 | 0.6776 | 0.9273 | 0.6761 | 9.91109e-09 | 1336 |
| 0.8940 | 0.6776 | 0.9272 | 0.6761 | 9.910957e-09 | 1337 |
| 0.8925 | 0.6776 | 0.9271 | 0.6761 | 9.910825e-09 | 1338 |
| 0.8852 | 0.6776 | 0.9271 | 0.6761 | 9.910693e-09 | 1339 |
| 0.8819 | 0.6776 | 0.9271 | 0.6761 | 9.9105595e-09 | 1340 |
| 0.8889 | 0.6776 | 0.9270 | 0.6761 | 9.910426e-09 | 1341 |
| 0.8932 | 0.6776 | 0.9270 | 0.6761 | 9.910293e-09 | 1342 |
| 0.8891 | 0.6800 | 0.9269 | 0.6761 | 9.91016e-09 | 1343 |
| 0.8824 | 0.6776 | 0.9269 | 0.6761 | 9.910027e-09 | 1344 |
| 0.8850 | 0.6776 | 0.9269 | 0.6761 | 9.909893e-09 | 1345 |
| 0.8924 | 0.6776 | 0.9268 | 0.6761 | 9.90976e-09 | 1346 |
| 0.8874 | 0.6776 | 0.9268 | 0.6761 | 9.909627e-09 | 1347 |
| 0.8821 | 0.6776 | 0.9267 | 0.6761 | 9.909494e-09 | 1348 |
| 0.8938 | 0.6776 | 0.9267 | 0.6761 | 9.9093596e-09 | 1349 |
| 0.8871 | 0.6776 | 0.9267 | 0.6761 | 9.909225e-09 | 1350 |
| 0.8911 | 0.6776 | 0.9266 | 0.6761 | 9.909091e-09 | 1351 |
| 0.8720 | 0.6776 | 0.9266 | 0.6761 | 9.908957e-09 | 1352 |
| 0.8999 | 0.6776 | 0.9265 | 0.6761 | 9.908823e-09 | 1353 |
| 0.8843 | 0.6776 | 0.9265 | 0.6761 | 9.908689e-09 | 1354 |
| 0.8946 | 0.6776 | 0.9265 | 0.6761 | 9.908555e-09 | 1355 |
| 0.8888 | 0.6776 | 0.9265 | 0.6761 | 9.908421e-09 | 1356 |
| 0.8837 | 0.6776 | 0.9264 | 0.6761 | 9.908287e-09 | 1357 |
| 0.8821 | 0.6776 | 0.9264 | 0.6761 | 9.9081525e-09 | 1358 |
| 0.8868 | 0.6776 | 0.9264 | 0.6761 | 9.9080175e-09 | 1359 |
| 0.8799 | 0.6776 | 0.9264 | 0.6761 | 9.9078825e-09 | 1360 |
| 0.8883 | 0.6776 | 0.9263 | 0.6761 | 9.9077475e-09 | 1361 |
| 0.8906 | 0.6776 | 0.9263 | 0.6761 | 9.9076125e-09 | 1362 |
| 0.8831 | 0.6776 | 0.9263 | 0.6761 | 9.9074775e-09 | 1363 |
| 0.8873 | 0.6776 | 0.9262 | 0.6761 | 9.9073425e-09 | 1364 |
| 0.8866 | 0.6776 | 0.9261 | 0.6761 | 9.9072075e-09 | 1365 |
| 0.8888 | 0.6776 | 0.9261 | 0.6761 | 9.9070725e-09 | 1366 |
| 0.8889 | 0.6776 | 0.9261 | 0.6761 | 9.9069375e-09 | 1367 |
| 0.8958 | 0.6776 | 0.9260 | 0.6761 | 9.906802e-09 | 1368 |
| 0.8855 | 0.6776 | 0.9260 | 0.6761 | 9.906666e-09 | 1369 |
| 0.8825 | 0.6776 | 0.9259 | 0.6761 | 9.90653e-09 | 1370 |
| 0.8913 | 0.6776 | 0.9259 | 0.6761 | 9.906394e-09 | 1371 |
| 0.8837 | 0.6776 | 0.9258 | 0.6761 | 9.906258e-09 | 1372 |
| 0.8888 | 0.6776 | 0.9258 | 0.6761 | 9.906122e-09 | 1373 |
| 0.8824 | 0.6776 | 0.9258 | 0.6761 | 9.905986e-09 | 1374 |
| 0.8885 | 0.6776 | 0.9257 | 0.6761 | 9.90585e-09 | 1375 |
| 0.8822 | 0.6776 | 0.9257 | 0.6761 | 9.9057145e-09 | 1376 |
| 0.8834 | 0.6776 | 0.9257 | 0.6761 | 9.905578e-09 | 1377 |
| 0.8801 | 0.6776 | 0.9256 | 0.6761 | 9.905441e-09 | 1378 |
| 0.8793 | 0.6776 | 0.9255 | 0.6761 | 9.905304e-09 | 1379 |
| 0.8894 | 0.6776 | 0.9255 | 0.6761 | 9.905167e-09 | 1380 |
| 0.8796 | 0.6776 | 0.9255 | 0.6761 | 9.905031e-09 | 1381 |
| 0.8894 | 0.6776 | 0.9255 | 0.6761 | 9.904894e-09 | 1382 |
| 0.8797 | 0.6776 | 0.9254 | 0.6761 | 9.904757e-09 | 1383 |
| 0.8825 | 0.6776 | 0.9254 | 0.6761 | 9.90462e-09 | 1384 |
| 0.8960 | 0.6776 | 0.9253 | 0.6761 | 9.9044835e-09 | 1385 |
| 0.8777 | 0.6776 | 0.9253 | 0.6761 | 9.904346e-09 | 1386 |
| 0.8924 | 0.6776 | 0.9253 | 0.6761 | 9.904208e-09 | 1387 |
| 0.8918 | 0.6753 | 0.9252 | 0.6761 | 9.9040705e-09 | 1388 |
| 0.8807 | 0.6776 | 0.9252 | 0.6761 | 9.903933e-09 | 1389 |
| 0.8726 | 0.6776 | 0.9251 | 0.6761 | 9.903795e-09 | 1390 |
| 0.8928 | 0.6776 | 0.9250 | 0.6761 | 9.9036574e-09 | 1391 |
| 0.8778 | 0.6776 | 0.9250 | 0.6761 | 9.90352e-09 | 1392 |
| 0.8830 | 0.6776 | 0.9249 | 0.6761 | 9.903382e-09 | 1393 |
| 0.8844 | 0.6776 | 0.9249 | 0.6761 | 9.9032444e-09 | 1394 |
| 0.8790 | 0.6776 | 0.9249 | 0.6761 | 9.903106e-09 | 1395 |
| 0.8821 | 0.6776 | 0.9248 | 0.6761 | 9.902967e-09 | 1396 |
| 0.8803 | 0.6776 | 0.9247 | 0.6761 | 9.902829e-09 | 1397 |
| 0.8900 | 0.6776 | 0.9247 | 0.6761 | 9.90269e-09 | 1398 |
| 0.8839 | 0.6776 | 0.9247 | 0.6761 | 9.902552e-09 | 1399 |
| 0.8790 | 0.6776 | 0.9247 | 0.6761 | 9.902413e-09 | 1400 |
| 0.8774 | 0.6776 | 0.9246 | 0.6761 | 9.9022746e-09 | 1401 |
| 0.8764 | 0.6776 | 0.9246 | 0.6761 | 9.902136e-09 | 1402 |
| 0.8760 | 0.6776 | 0.9245 | 0.6761 | 9.9019974e-09 | 1403 |
| 0.8807 | 0.6776 | 0.9245 | 0.6761 | 9.901858e-09 | 1404 |
| 0.8881 | 0.6800 | 0.9244 | 0.6761 | 9.901719e-09 | 1405 |
| 0.8828 | 0.6776 | 0.9244 | 0.6761 | 9.901579e-09 | 1406 |
| 0.8779 | 0.6776 | 0.9243 | 0.6761 | 9.90144e-09 | 1407 |
| 0.8788 | 0.6776 | 0.9243 | 0.6761 | 9.9013e-09 | 1408 |
| 0.8869 | 0.6776 | 0.9242 | 0.6761 | 9.901161e-09 | 1409 |
| 0.8805 | 0.6776 | 0.9242 | 0.6761 | 9.901021e-09 | 1410 |
| 0.8844 | 0.6776 | 0.9242 | 0.6761 | 9.900882e-09 | 1411 |
| 0.8862 | 0.6776 | 0.9241 | 0.6761 | 9.9007424e-09 | 1412 |
| 0.8800 | 0.6800 | 0.9241 | 0.6761 | 9.900602e-09 | 1413 |
| 0.8773 | 0.6776 | 0.9241 | 0.6761 | 9.900462e-09 | 1414 |
| 0.8765 | 0.6776 | 0.9240 | 0.6761 | 9.9003215e-09 | 1415 |
| 0.8804 | 0.6776 | 0.9240 | 0.6761 | 9.900181e-09 | 1416 |
| 0.8774 | 0.6800 | 0.9240 | 0.6761 | 9.900041e-09 | 1417 |
| 0.8871 | 0.6776 | 0.9239 | 0.6761 | 9.8999005e-09 | 1418 |
| 0.8822 | 0.6776 | 0.9239 | 0.6761 | 9.89976e-09 | 1419 |
| 0.8821 | 0.6776 | 0.9239 | 0.6761 | 9.89962e-09 | 1420 |
| 0.8812 | 0.6776 | 0.9239 | 0.6761 | 9.8994795e-09 | 1421 |
| 0.8919 | 0.6776 | 0.9238 | 0.6761 | 9.899339e-09 | 1422 |
| 0.8835 | 0.6776 | 0.9238 | 0.6761 | 9.899198e-09 | 1423 |
| 0.8878 | 0.6776 | 0.9237 | 0.6761 | 9.899057e-09 | 1424 |
| 0.8840 | 0.6776 | 0.9237 | 0.6761 | 9.8989155e-09 | 1425 |
| 0.8897 | 0.6776 | 0.9237 | 0.6761 | 9.898774e-09 | 1426 |
| 0.8874 | 0.6776 | 0.9237 | 0.6761 | 9.898633e-09 | 1427 |
| 0.8887 | 0.6776 | 0.9236 | 0.6761 | 9.898492e-09 | 1428 |
| 0.8806 | 0.6776 | 0.9236 | 0.6761 | 9.898351e-09 | 1429 |
| 0.8883 | 0.6776 | 0.9236 | 0.6761 | 9.898209e-09 | 1430 |
| 0.8847 | 0.6776 | 0.9235 | 0.6761 | 9.898068e-09 | 1431 |
| 0.8762 | 0.6776 | 0.9234 | 0.6761 | 9.897926e-09 | 1432 |
| 0.8828 | 0.6776 | 0.9234 | 0.6761 | 9.897784e-09 | 1433 |
| 0.8833 | 0.6776 | 0.9233 | 0.6761 | 9.897642e-09 | 1434 |
| 0.8869 | 0.6776 | 0.9232 | 0.6761 | 9.8975e-09 | 1435 |
| 0.8829 | 0.6800 | 0.9232 | 0.6761 | 9.897358e-09 | 1436 |
| 0.8883 | 0.6776 | 0.9231 | 0.6761 | 9.8972155e-09 | 1437 |
| 0.8820 | 0.6776 | 0.9231 | 0.6761 | 9.897073e-09 | 1438 |
| 0.8887 | 0.6776 | 0.9230 | 0.6761 | 9.896931e-09 | 1439 |
| 0.8851 | 0.6776 | 0.9230 | 0.6761 | 9.896789e-09 | 1440 |
| 0.8778 | 0.6776 | 0.9230 | 0.6761 | 9.896646e-09 | 1441 |
| 0.8730 | 0.6776 | 0.9230 | 0.6761 | 9.896503e-09 | 1442 |
| 0.8799 | 0.6776 | 0.9229 | 0.6761 | 9.89636e-09 | 1443 |
| 0.8799 | 0.6776 | 0.9228 | 0.6761 | 9.896217e-09 | 1444 |
| 0.8702 | 0.6776 | 0.9228 | 0.6761 | 9.896074e-09 | 1445 |
| 0.8844 | 0.6776 | 0.9227 | 0.6761 | 9.895931e-09 | 1446 |
| 0.8821 | 0.6776 | 0.9227 | 0.6761 | 9.895788e-09 | 1447 |
| 0.8885 | 0.6776 | 0.9227 | 0.6761 | 9.895645e-09 | 1448 |
| 0.8802 | 0.6776 | 0.9226 | 0.6761 | 9.895502e-09 | 1449 |
| 0.8712 | 0.6776 | 0.9226 | 0.6761 | 9.895358e-09 | 1450 |
| 0.8776 | 0.6776 | 0.9225 | 0.6761 | 9.895214e-09 | 1451 |
| 0.8852 | 0.6776 | 0.9225 | 0.6761 | 9.8950705e-09 | 1452 |
| 0.8806 | 0.6776 | 0.9225 | 0.6761 | 9.894927e-09 | 1453 |
| 0.8700 | 0.6800 | 0.9224 | 0.6761 | 9.894783e-09 | 1454 |
| 0.8819 | 0.6776 | 0.9224 | 0.6761 | 9.894639e-09 | 1455 |
| 0.8844 | 0.6776 | 0.9224 | 0.6761 | 9.894495e-09 | 1456 |
| 0.8861 | 0.6776 | 0.9223 | 0.6761 | 9.894351e-09 | 1457 |
| 0.8780 | 0.6776 | 0.9223 | 0.6761 | 9.894207e-09 | 1458 |
| 0.8818 | 0.6776 | 0.9222 | 0.6761 | 9.8940625e-09 | 1459 |
| 0.8809 | 0.6800 | 0.9222 | 0.6761 | 9.893918e-09 | 1460 |
| 0.8862 | 0.6776 | 0.9222 | 0.6761 | 9.893773e-09 | 1461 |
| 0.8776 | 0.6800 | 0.9221 | 0.6761 | 9.893628e-09 | 1462 |
| 0.8711 | 0.6776 | 0.9221 | 0.6761 | 9.893483e-09 | 1463 |
| 0.8857 | 0.6776 | 0.9221 | 0.6761 | 9.893339e-09 | 1464 |
| 0.8788 | 0.6753 | 0.9221 | 0.6761 | 9.893194e-09 | 1465 |
| 0.8778 | 0.6776 | 0.9221 | 0.6761 | 9.893049e-09 | 1466 |
| 0.8715 | 0.6776 | 0.9220 | 0.6761 | 9.892904e-09 | 1467 |
| 0.8800 | 0.6776 | 0.9220 | 0.6761 | 9.892759e-09 | 1468 |
| 0.8728 | 0.6776 | 0.9220 | 0.6761 | 9.892613e-09 | 1469 |
| 0.8893 | 0.6776 | 0.9219 | 0.6761 | 9.892467e-09 | 1470 |
| 0.8772 | 0.6776 | 0.9219 | 0.6761 | 9.892322e-09 | 1471 |
| 0.8794 | 0.6776 | 0.9218 | 0.6761 | 9.892176e-09 | 1472 |
| 0.8751 | 0.6776 | 0.9218 | 0.6761 | 9.89203e-09 | 1473 |
| 0.8704 | 0.6776 | 0.9217 | 0.6761 | 9.891885e-09 | 1474 |
| 0.8834 | 0.6776 | 0.9217 | 0.6761 | 9.891739e-09 | 1475 |
| 0.8766 | 0.6776 | 0.9216 | 0.6761 | 9.891593e-09 | 1476 |
| 0.8778 | 0.6776 | 0.9216 | 0.6761 | 9.891448e-09 | 1477 |
| 0.8770 | 0.6776 | 0.9215 | 0.6761 | 9.891301e-09 | 1478 |
| 0.8886 | 0.6776 | 0.9215 | 0.6761 | 9.891155e-09 | 1479 |
| 0.8850 | 0.6776 | 0.9214 | 0.6761 | 9.891008e-09 | 1480 |
| 0.8703 | 0.6776 | 0.9214 | 0.6761 | 9.8908615e-09 | 1481 |
| 0.8781 | 0.6753 | 0.9213 | 0.6761 | 9.890715e-09 | 1482 |
| 0.8760 | 0.6776 | 0.9212 | 0.6761 | 9.890568e-09 | 1483 |
| 0.8701 | 0.6800 | 0.9211 | 0.6761 | 9.890422e-09 | 1484 |
| 0.8774 | 0.6776 | 0.9211 | 0.6761 | 9.890275e-09 | 1485 |
| 0.8769 | 0.6776 | 0.9211 | 0.6761 | 9.890129e-09 | 1486 |
| 0.8863 | 0.6776 | 0.9211 | 0.6761 | 9.889981e-09 | 1487 |
| 0.8659 | 0.6776 | 0.9210 | 0.6761 | 9.889834e-09 | 1488 |
| 0.8706 | 0.6800 | 0.9210 | 0.6761 | 9.889686e-09 | 1489 |
| 0.8789 | 0.6800 | 0.9209 | 0.6761 | 9.889539e-09 | 1490 |
| 0.8717 | 0.6776 | 0.9209 | 0.6761 | 9.8893915e-09 | 1491 |
| 0.8794 | 0.6753 | 0.9209 | 0.6761 | 9.889244e-09 | 1492 |
| 0.8715 | 0.6800 | 0.9209 | 0.6761 | 9.889097e-09 | 1493 |
| 0.8814 | 0.6776 | 0.9208 | 0.6761 | 9.888949e-09 | 1494 |
| 0.8801 | 0.6776 | 0.9208 | 0.6761 | 9.888802e-09 | 1495 |
| 0.8759 | 0.6776 | 0.9207 | 0.6761 | 9.8886535e-09 | 1496 |
| 0.8746 | 0.6800 | 0.9207 | 0.6761 | 9.888505e-09 | 1497 |
| 0.8760 | 0.6776 | 0.9207 | 0.6761 | 9.888357e-09 | 1498 |
| 0.8733 | 0.6776 | 0.9206 | 0.6761 | 9.8882085e-09 | 1499 |
| 0.8828 | 0.6776 | 0.9206 | 0.6761 | 9.88806e-09 | 1500 |
| 0.8702 | 0.6776 | 0.9206 | 0.6761 | 9.887912e-09 | 1501 |
| 0.8760 | 0.6800 | 0.9205 | 0.6761 | 9.8877635e-09 | 1502 |
| 0.8698 | 0.6800 | 0.9205 | 0.6761 | 9.887615e-09 | 1503 |
| 0.8810 | 0.6776 | 0.9205 | 0.6761 | 9.887467e-09 | 1504 |
| 0.8706 | 0.6776 | 0.9204 | 0.6761 | 9.887318e-09 | 1505 |
| 0.8710 | 0.6776 | 0.9204 | 0.6761 | 9.887168e-09 | 1506 |
| 0.8762 | 0.6776 | 0.9203 | 0.6761 | 9.887019e-09 | 1507 |
| 0.8774 | 0.6776 | 0.9202 | 0.6761 | 9.88687e-09 | 1508 |
| 0.8788 | 0.6800 | 0.9202 | 0.6761 | 9.886721e-09 | 1509 |
| 0.8723 | 0.6776 | 0.9201 | 0.6761 | 9.886572e-09 | 1510 |
| 0.8711 | 0.6753 | 0.9201 | 0.6761 | 9.886422e-09 | 1511 |
| 0.8760 | 0.6776 | 0.9201 | 0.6761 | 9.886273e-09 | 1512 |
| 0.8741 | 0.6776 | 0.9201 | 0.6761 | 9.886124e-09 | 1513 |
| 0.8730 | 0.6776 | 0.9200 | 0.6761 | 9.885974e-09 | 1514 |
| 0.8762 | 0.6753 | 0.9200 | 0.6761 | 9.885824e-09 | 1515 |
| 0.8900 | 0.6753 | 0.9199 | 0.6761 | 9.885674e-09 | 1516 |
| 0.8729 | 0.6776 | 0.9199 | 0.6761 | 9.8855235e-09 | 1517 |
| 0.8693 | 0.6800 | 0.9198 | 0.6761 | 9.885373e-09 | 1518 |
| 0.8749 | 0.6776 | 0.9198 | 0.6761 | 9.885223e-09 | 1519 |
| 0.8835 | 0.6776 | 0.9197 | 0.6761 | 9.885073e-09 | 1520 |
| 0.8797 | 0.6776 | 0.9196 | 0.6761 | 9.884923e-09 | 1521 |
| 0.8717 | 0.6776 | 0.9196 | 0.6761 | 9.884773e-09 | 1522 |
| 0.8748 | 0.6824 | 0.9196 | 0.6761 | 9.884623e-09 | 1523 |
| 0.8803 | 0.6776 | 0.9195 | 0.6761 | 9.884472e-09 | 1524 |
| 0.8768 | 0.6776 | 0.9195 | 0.6761 | 9.884321e-09 | 1525 |
| 0.8725 | 0.6800 | 0.9194 | 0.6761 | 9.88417e-09 | 1526 |
| 0.8738 | 0.6776 | 0.9193 | 0.6761 | 9.884019e-09 | 1527 |
| 0.8704 | 0.6776 | 0.9192 | 0.6761 | 9.883868e-09 | 1528 |
| 0.8771 | 0.6776 | 0.9191 | 0.6761 | 9.883717e-09 | 1529 |
| 0.8764 | 0.6776 | 0.9191 | 0.6761 | 9.883566e-09 | 1530 |
| 0.8800 | 0.6776 | 0.9190 | 0.6761 | 9.883415e-09 | 1531 |
| 0.8680 | 0.6776 | 0.9190 | 0.6761 | 9.883264e-09 | 1532 |
| 0.8688 | 0.6776 | 0.9189 | 0.6761 | 9.883112e-09 | 1533 |
| 0.8793 | 0.6776 | 0.9189 | 0.6761 | 9.88296e-09 | 1534 |
| 0.8872 | 0.6776 | 0.9188 | 0.6761 | 9.882808e-09 | 1535 |
| 0.8742 | 0.6800 | 0.9188 | 0.6761 | 9.8826565e-09 | 1536 |
| 0.8772 | 0.6776 | 0.9188 | 0.6761 | 9.882505e-09 | 1537 |
| 0.8790 | 0.6776 | 0.9188 | 0.6761 | 9.882353e-09 | 1538 |
| 0.8716 | 0.6800 | 0.9187 | 0.6761 | 9.882201e-09 | 1539 |
| 0.8718 | 0.6776 | 0.9187 | 0.6761 | 9.882049e-09 | 1540 |
| 0.8742 | 0.6776 | 0.9187 | 0.6761 | 9.881897e-09 | 1541 |
| 0.8691 | 0.6753 | 0.9187 | 0.6761 | 9.881744e-09 | 1542 |
| 0.8795 | 0.6776 | 0.9186 | 0.6761 | 9.8815915e-09 | 1543 |
| 0.8752 | 0.6776 | 0.9186 | 0.6761 | 9.881439e-09 | 1544 |
| 0.8717 | 0.6776 | 0.9186 | 0.6761 | 9.881286e-09 | 1545 |
| 0.8821 | 0.6800 | 0.9185 | 0.6761 | 9.881133e-09 | 1546 |
| 0.8767 | 0.6776 | 0.9185 | 0.6761 | 9.8809805e-09 | 1547 |
| 0.8818 | 0.6776 | 0.9184 | 0.6761 | 9.880828e-09 | 1548 |
| 0.8767 | 0.6800 | 0.9184 | 0.6761 | 9.880675e-09 | 1549 |
| 0.8747 | 0.6776 | 0.9183 | 0.6761 | 9.880522e-09 | 1550 |
| 0.8654 | 0.6776 | 0.9183 | 0.6761 | 9.8803685e-09 | 1551 |
| 0.8661 | 0.6776 | 0.9183 | 0.6761 | 9.880215e-09 | 1552 |
| 0.8728 | 0.6776 | 0.9182 | 0.6761 | 9.880061e-09 | 1553 |
| 0.8662 | 0.6776 | 0.9182 | 0.6761 | 9.879908e-09 | 1554 |
| 0.8696 | 0.6776 | 0.9182 | 0.6761 | 9.879754e-09 | 1555 |
| 0.8753 | 0.6800 | 0.9182 | 0.6761 | 9.8796e-09 | 1556 |
| 0.8704 | 0.6776 | 0.9181 | 0.6761 | 9.879447e-09 | 1557 |
| 0.8679 | 0.6776 | 0.9181 | 0.6761 | 9.879293e-09 | 1558 |
| 0.8705 | 0.6776 | 0.9180 | 0.6761 | 9.879139e-09 | 1559 |
| 0.8686 | 0.6776 | 0.9180 | 0.6761 | 9.878985e-09 | 1560 |
| 0.8682 | 0.6776 | 0.9180 | 0.6761 | 9.87883e-09 | 1561 |
| 0.8789 | 0.6776 | 0.9179 | 0.6761 | 9.878676e-09 | 1562 |
| 0.8745 | 0.6776 | 0.9179 | 0.6761 | 9.878521e-09 | 1563 |
| 0.8721 | 0.6776 | 0.9178 | 0.6761 | 9.878367e-09 | 1564 |
| 0.8721 | 0.6776 | 0.9177 | 0.6761 | 9.878212e-09 | 1565 |
| 0.8709 | 0.6776 | 0.9177 | 0.6761 | 9.8780575e-09 | 1566 |
| 0.8717 | 0.6776 | 0.9177 | 0.6761 | 9.877903e-09 | 1567 |
| 0.8756 | 0.6753 | 0.9176 | 0.6761 | 9.877748e-09 | 1568 |
| 0.8656 | 0.6776 | 0.9176 | 0.6761 | 9.877594e-09 | 1569 |
| 0.8762 | 0.6776 | 0.9175 | 0.6761 | 9.877438e-09 | 1570 |
| 0.8743 | 0.6776 | 0.9175 | 0.6761 | 9.877283e-09 | 1571 |
| 0.8737 | 0.6776 | 0.9175 | 0.6761 | 9.877128e-09 | 1572 |
| 0.8715 | 0.6776 | 0.9174 | 0.6761 | 9.876972e-09 | 1573 |
| 0.8662 | 0.6776 | 0.9174 | 0.6761 | 9.876817e-09 | 1574 |
| 0.8730 | 0.6776 | 0.9173 | 0.6761 | 9.876661e-09 | 1575 |
| 0.8692 | 0.6776 | 0.9172 | 0.6761 | 9.876506e-09 | 1576 |
| 0.8691 | 0.6753 | 0.9172 | 0.6761 | 9.87635e-09 | 1577 |
| 0.8652 | 0.6776 | 0.9171 | 0.6761 | 9.876195e-09 | 1578 |
| 0.8696 | 0.6800 | 0.9172 | 0.6761 | 9.876039e-09 | 1579 |
| 0.8759 | 0.6776 | 0.9171 | 0.6761 | 9.875882e-09 | 1580 |
| 0.8669 | 0.6776 | 0.9171 | 0.6761 | 9.875726e-09 | 1581 |
| 0.8731 | 0.6776 | 0.9171 | 0.6761 | 9.87557e-09 | 1582 |
| 0.8617 | 0.6776 | 0.9170 | 0.6761 | 9.875413e-09 | 1583 |
| 0.8716 | 0.6800 | 0.9170 | 0.6761 | 9.875257e-09 | 1584 |
| 0.8706 | 0.6776 | 0.9169 | 0.6761 | 9.875101e-09 | 1585 |
| 0.8645 | 0.6776 | 0.9169 | 0.6761 | 9.874944e-09 | 1586 |
| 0.8663 | 0.6824 | 0.9168 | 0.6761 | 9.874788e-09 | 1587 |
| 0.8677 | 0.6776 | 0.9168 | 0.6761 | 9.874631e-09 | 1588 |
| 0.8745 | 0.6776 | 0.9168 | 0.6761 | 9.874474e-09 | 1589 |
| 0.8679 | 0.6776 | 0.9167 | 0.6761 | 9.8743165e-09 | 1590 |
| 0.8668 | 0.6776 | 0.9167 | 0.6761 | 9.874159e-09 | 1591 |
| 0.8626 | 0.6753 | 0.9166 | 0.6761 | 9.874002e-09 | 1592 |
| 0.8746 | 0.6776 | 0.9166 | 0.6761 | 9.873845e-09 | 1593 |
| 0.8627 | 0.6800 | 0.9166 | 0.6761 | 9.873688e-09 | 1594 |
| 0.8739 | 0.6776 | 0.9165 | 0.6761 | 9.87353e-09 | 1595 |
| 0.8683 | 0.6800 | 0.9164 | 0.6761 | 9.873373e-09 | 1596 |
| 0.8742 | 0.6776 | 0.9164 | 0.6761 | 9.873215e-09 | 1597 |
| 0.8598 | 0.6776 | 0.9163 | 0.6761 | 9.873057e-09 | 1598 |
| 0.8725 | 0.6800 | 0.9162 | 0.6761 | 9.872899e-09 | 1599 |
| 0.8717 | 0.6753 | 0.9162 | 0.6761 | 9.872741e-09 | 1600 |
| 0.8675 | 0.6753 | 0.9161 | 0.6761 | 9.872583e-09 | 1601 |
| 0.8670 | 0.6776 | 0.9160 | 0.6761 | 9.872425e-09 | 1602 |
| 0.8681 | 0.6776 | 0.9160 | 0.6761 | 9.872267e-09 | 1603 |
| 0.8689 | 0.6776 | 0.9160 | 0.6761 | 9.8721085e-09 | 1604 |
| 0.8683 | 0.6753 | 0.9159 | 0.6761 | 9.87195e-09 | 1605 |
| 0.8626 | 0.6776 | 0.9159 | 0.6761 | 9.871791e-09 | 1606 |
| 0.8601 | 0.6776 | 0.9159 | 0.6761 | 9.871632e-09 | 1607 |
| 0.8665 | 0.6776 | 0.9158 | 0.6761 | 9.871473e-09 | 1608 |
| 0.8760 | 0.6753 | 0.9157 | 0.6761 | 9.871314e-09 | 1609 |
| 0.8738 | 0.6776 | 0.9156 | 0.6761 | 9.8711554e-09 | 1610 |
| 0.8753 | 0.6776 | 0.9156 | 0.6761 | 9.8709965e-09 | 1611 |
| 0.8653 | 0.6776 | 0.9156 | 0.6761 | 9.8708375e-09 | 1612 |
| 0.8693 | 0.6776 | 0.9155 | 0.6761 | 9.8706785e-09 | 1613 |
| 0.8647 | 0.6776 | 0.9155 | 0.6761 | 9.8705195e-09 | 1614 |
| 0.8732 | 0.6776 | 0.9155 | 0.6761 | 9.8703605e-09 | 1615 |
| 0.8708 | 0.6776 | 0.9154 | 0.6761 | 9.870201e-09 | 1616 |
| 0.8675 | 0.6753 | 0.9154 | 0.6761 | 9.870041e-09 | 1617 |
| 0.8695 | 0.6776 | 0.9153 | 0.6761 | 9.869881e-09 | 1618 |
| 0.8635 | 0.6800 | 0.9153 | 0.6761 | 9.869721e-09 | 1619 |
| 0.8616 | 0.6776 | 0.9153 | 0.6761 | 9.869561e-09 | 1620 |
| 0.8715 | 0.6776 | 0.9152 | 0.6761 | 9.869401e-09 | 1621 |
| 0.8606 | 0.6800 | 0.9152 | 0.6761 | 9.869241e-09 | 1622 |
| 0.8681 | 0.6800 | 0.9151 | 0.6761 | 9.8690816e-09 | 1623 |
| 0.8584 | 0.6800 | 0.9151 | 0.6761 | 9.868922e-09 | 1624 |
| 0.8632 | 0.6776 | 0.9150 | 0.6761 | 9.868761e-09 | 1625 |
| 0.8732 | 0.6776 | 0.9150 | 0.6761 | 9.8686e-09 | 1626 |
| 0.8706 | 0.6800 | 0.9150 | 0.6761 | 9.868439e-09 | 1627 |
| 0.8657 | 0.6800 | 0.9150 | 0.6761 | 9.868279e-09 | 1628 |
| 0.8632 | 0.6776 | 0.9149 | 0.6761 | 9.868118e-09 | 1629 |
| 0.8681 | 0.6776 | 0.9148 | 0.6761 | 9.867957e-09 | 1630 |
| 0.8691 | 0.6776 | 0.9147 | 0.6761 | 9.867796e-09 | 1631 |
| 0.8635 | 0.6776 | 0.9147 | 0.6761 | 9.867636e-09 | 1632 |
| 0.8671 | 0.6776 | 0.9146 | 0.6761 | 9.867475e-09 | 1633 |
| 0.8666 | 0.6776 | 0.9146 | 0.6761 | 9.867313e-09 | 1634 |
| 0.8589 | 0.6776 | 0.9146 | 0.6761 | 9.8671515e-09 | 1635 |
| 0.8682 | 0.6776 | 0.9145 | 0.6761 | 9.86699e-09 | 1636 |
| 0.8657 | 0.6776 | 0.9145 | 0.6761 | 9.866828e-09 | 1637 |
| 0.8705 | 0.6776 | 0.9145 | 0.6761 | 9.866667e-09 | 1638 |
| 0.8577 | 0.6800 | 0.9144 | 0.6761 | 9.866505e-09 | 1639 |
| 0.8570 | 0.6824 | 0.9144 | 0.6761 | 9.866343e-09 | 1640 |
| 0.8706 | 0.6776 | 0.9143 | 0.6761 | 9.866182e-09 | 1641 |
| 0.8625 | 0.6824 | 0.9142 | 0.6761 | 9.86602e-09 | 1642 |
| 0.8602 | 0.6776 | 0.9142 | 0.6761 | 9.8658575e-09 | 1643 |
| 0.8640 | 0.6824 | 0.9142 | 0.6761 | 9.865695e-09 | 1644 |
| 0.8614 | 0.6776 | 0.9141 | 0.6761 | 9.865532e-09 | 1645 |
| 0.8732 | 0.6776 | 0.9141 | 0.6761 | 9.86537e-09 | 1646 |
| 0.8616 | 0.6824 | 0.9140 | 0.6761 | 9.865207e-09 | 1647 |
| 0.8630 | 0.6776 | 0.9140 | 0.6761 | 9.865045e-09 | 1648 |
| 0.8764 | 0.6753 | 0.9140 | 0.6761 | 9.864882e-09 | 1649 |
| 0.8634 | 0.6776 | 0.9139 | 0.6761 | 9.86472e-09 | 1650 |
| 0.8675 | 0.6776 | 0.9139 | 0.6761 | 9.864557e-09 | 1651 |
| 0.8672 | 0.6800 | 0.9138 | 0.6761 | 9.864395e-09 | 1652 |
| 0.8628 | 0.6776 | 0.9138 | 0.6761 | 9.864231e-09 | 1653 |
| 0.8637 | 0.6776 | 0.9137 | 0.6761 | 9.864068e-09 | 1654 |
| 0.8690 | 0.6800 | 0.9137 | 0.6761 | 9.863904e-09 | 1655 |
| 0.8717 | 0.6753 | 0.9136 | 0.6761 | 9.863741e-09 | 1656 |
| 0.8603 | 0.6753 | 0.9136 | 0.6761 | 9.8635775e-09 | 1657 |
| 0.8586 | 0.6776 | 0.9136 | 0.6761 | 9.863414e-09 | 1658 |
| 0.8667 | 0.6776 | 0.9135 | 0.6761 | 9.863251e-09 | 1659 |
| 0.8657 | 0.6800 | 0.9134 | 0.6761 | 9.863087e-09 | 1660 |
| 0.8623 | 0.6753 | 0.9134 | 0.6761 | 9.862924e-09 | 1661 |
| 0.8683 | 0.6753 | 0.9134 | 0.6761 | 9.8627595e-09 | 1662 |
| 0.8546 | 0.6776 | 0.9133 | 0.6761 | 9.862595e-09 | 1663 |
| 0.8623 | 0.6824 | 0.9133 | 0.6761 | 9.862431e-09 | 1664 |
| 0.8731 | 0.6776 | 0.9133 | 0.6761 | 9.862267e-09 | 1665 |
| 0.8687 | 0.6824 | 0.9133 | 0.6761 | 9.862102e-09 | 1666 |
| 0.8685 | 0.6800 | 0.9132 | 0.6761 | 9.861938e-09 | 1667 |
| 0.8554 | 0.6824 | 0.9131 | 0.6761 | 9.861774e-09 | 1668 |
| 0.8586 | 0.6800 | 0.9131 | 0.6761 | 9.861609e-09 | 1669 |
| 0.8684 | 0.6776 | 0.9131 | 0.6761 | 9.861445e-09 | 1670 |
| 0.8668 | 0.6776 | 0.9130 | 0.6761 | 9.86128e-09 | 1671 |
| 0.8631 | 0.6824 | 0.9130 | 0.6761 | 9.861115e-09 | 1672 |
| 0.8750 | 0.6776 | 0.9130 | 0.6761 | 9.860949e-09 | 1673 |
| 0.8731 | 0.6800 | 0.9130 | 0.6761 | 9.860784e-09 | 1674 |
| 0.8740 | 0.6800 | 0.9129 | 0.6761 | 9.860619e-09 | 1675 |
| 0.8658 | 0.6776 | 0.9129 | 0.6761 | 9.860454e-09 | 1676 |
| 0.8759 | 0.6776 | 0.9128 | 0.6761 | 9.860289e-09 | 1677 |
| 0.8661 | 0.6776 | 0.9128 | 0.6761 | 9.860123e-09 | 1678 |
| 0.8630 | 0.6800 | 0.9127 | 0.6761 | 9.859958e-09 | 1679 |
| 0.8689 | 0.6776 | 0.9127 | 0.6761 | 9.859793e-09 | 1680 |
| 0.8632 | 0.6800 | 0.9126 | 0.6761 | 9.859627e-09 | 1681 |
| 0.8555 | 0.6776 | 0.9125 | 0.6761 | 9.859461e-09 | 1682 |
| 0.8557 | 0.6776 | 0.9125 | 0.6761 | 9.859295e-09 | 1683 |
| 0.8562 | 0.6776 | 0.9125 | 0.6761 | 9.859129e-09 | 1684 |
| 0.8576 | 0.6753 | 0.9125 | 0.6761 | 9.8589625e-09 | 1685 |
| 0.8598 | 0.6776 | 0.9124 | 0.6761 | 9.8587964e-09 | 1686 |
| 0.8609 | 0.6824 | 0.9124 | 0.6761 | 9.85863e-09 | 1687 |
| 0.8634 | 0.6753 | 0.9123 | 0.6761 | 9.858464e-09 | 1688 |
| 0.8653 | 0.6753 | 0.9123 | 0.6761 | 9.858298e-09 | 1689 |
| 0.8642 | 0.6729 | 0.9123 | 0.6761 | 9.858131e-09 | 1690 |
| 0.8575 | 0.6800 | 0.9122 | 0.6761 | 9.857964e-09 | 1691 |
| 0.8605 | 0.6776 | 0.9122 | 0.6761 | 9.857797e-09 | 1692 |
| 0.8582 | 0.6776 | 0.9121 | 0.6761 | 9.85763e-09 | 1693 |
| 0.8696 | 0.6776 | 0.9121 | 0.6761 | 9.857463e-09 | 1694 |
| 0.8676 | 0.6776 | 0.9121 | 0.6761 | 9.857296e-09 | 1695 |
| 0.8566 | 0.6776 | 0.9121 | 0.6761 | 9.857129e-09 | 1696 |
| 0.8537 | 0.6776 | 0.9120 | 0.6761 | 9.856962e-09 | 1697 |
| 0.8719 | 0.6753 | 0.9120 | 0.6761 | 9.856795e-09 | 1698 |
| 0.8591 | 0.6776 | 0.9120 | 0.6761 | 9.8566275e-09 | 1699 |
| 0.8608 | 0.6776 | 0.9119 | 0.6761 | 9.85646e-09 | 1700 |
| 0.8599 | 0.6776 | 0.9118 | 0.6761 | 9.856292e-09 | 1701 |
| 0.8512 | 0.6800 | 0.9118 | 0.6761 | 9.856124e-09 | 1702 |
| 0.8649 | 0.6776 | 0.9117 | 0.6761 | 9.855956e-09 | 1703 |
| 0.8727 | 0.6776 | 0.9117 | 0.6761 | 9.855788e-09 | 1704 |
| 0.8664 | 0.6824 | 0.9117 | 0.6761 | 9.85562e-09 | 1705 |
| 0.8650 | 0.6824 | 0.9117 | 0.6761 | 9.8554525e-09 | 1706 |
| 0.8715 | 0.6753 | 0.9116 | 0.6761 | 9.855285e-09 | 1707 |
| 0.8575 | 0.6753 | 0.9116 | 0.6761 | 9.855116e-09 | 1708 |
| 0.8536 | 0.6800 | 0.9116 | 0.6761 | 9.854947e-09 | 1709 |
| 0.8709 | 0.6847 | 0.9116 | 0.6761 | 9.854778e-09 | 1710 |
| 0.8624 | 0.6800 | 0.9115 | 0.6761 | 9.85461e-09 | 1711 |
| 0.8574 | 0.6824 | 0.9115 | 0.6761 | 9.854441e-09 | 1712 |
| 0.8650 | 0.6824 | 0.9115 | 0.6761 | 9.854272e-09 | 1713 |
| 0.8539 | 0.6800 | 0.9115 | 0.6761 | 9.854103e-09 | 1714 |
| 0.8549 | 0.6776 | 0.9115 | 0.6761 | 9.853935e-09 | 1715 |
| 0.8660 | 0.6847 | 0.9114 | 0.6761 | 9.853766e-09 | 1716 |
| 0.8625 | 0.6776 | 0.9113 | 0.6761 | 9.853597e-09 | 1717 |
| 0.8630 | 0.6824 | 0.9112 | 0.6761 | 9.853427e-09 | 1718 |
| 0.8616 | 0.6824 | 0.9112 | 0.6761 | 9.853258e-09 | 1719 |
| 0.8616 | 0.6800 | 0.9112 | 0.6761 | 9.853088e-09 | 1720 |
| 0.8672 | 0.6776 | 0.9111 | 0.6761 | 9.8529185e-09 | 1721 |
| 0.8507 | 0.6800 | 0.9110 | 0.6761 | 9.852749e-09 | 1722 |
| 0.8631 | 0.6776 | 0.9110 | 0.6761 | 9.852579e-09 | 1723 |
| 0.8570 | 0.6800 | 0.9110 | 0.6761 | 9.8524096e-09 | 1724 |
| 0.8532 | 0.6824 | 0.9110 | 0.6761 | 9.85224e-09 | 1725 |
| 0.8687 | 0.6800 | 0.9110 | 0.6761 | 9.85207e-09 | 1726 |
| 0.8579 | 0.6776 | 0.9109 | 0.6761 | 9.8519e-09 | 1727 |
| 0.8596 | 0.6800 | 0.9108 | 0.6761 | 9.851729e-09 | 1728 |
| 0.8600 | 0.6776 | 0.9108 | 0.6761 | 9.851559e-09 | 1729 |
| 0.8643 | 0.6776 | 0.9107 | 0.6761 | 9.851388e-09 | 1730 |
| 0.8446 | 0.6824 | 0.9106 | 0.6761 | 9.851218e-09 | 1731 |
| 0.8564 | 0.6753 | 0.9106 | 0.6761 | 9.851047e-09 | 1732 |
| 0.8490 | 0.6800 | 0.9105 | 0.6761 | 9.850877e-09 | 1733 |
| 0.8608 | 0.6776 | 0.9105 | 0.6761 | 9.850706e-09 | 1734 |
| 0.8603 | 0.6824 | 0.9105 | 0.6761 | 9.8505355e-09 | 1735 |
| 0.8529 | 0.6753 | 0.9105 | 0.6761 | 9.850364e-09 | 1736 |
| 0.8583 | 0.6800 | 0.9104 | 0.6761 | 9.850193e-09 | 1737 |
| 0.8494 | 0.6800 | 0.9104 | 0.6761 | 9.850021e-09 | 1738 |
| 0.8595 | 0.6776 | 0.9104 | 0.6761 | 9.84985e-09 | 1739 |
| 0.8507 | 0.6824 | 0.9103 | 0.6761 | 9.849678e-09 | 1740 |
| 0.8613 | 0.6800 | 0.9102 | 0.6761 | 9.849507e-09 | 1741 |
| 0.8488 | 0.6824 | 0.9102 | 0.6761 | 9.849336e-09 | 1742 |
| 0.8650 | 0.6753 | 0.9102 | 0.6761 | 9.849164e-09 | 1743 |
| 0.8606 | 0.6800 | 0.9102 | 0.6761 | 9.848993e-09 | 1744 |
| 0.8642 | 0.6753 | 0.9101 | 0.6761 | 9.848821e-09 | 1745 |
| 0.8625 | 0.6824 | 0.9100 | 0.6761 | 9.848649e-09 | 1746 |
| 0.8563 | 0.6776 | 0.9100 | 0.6761 | 9.848477e-09 | 1747 |
| 0.8508 | 0.6800 | 0.9099 | 0.6761 | 9.848304e-09 | 1748 |
| 0.8519 | 0.6800 | 0.9099 | 0.6761 | 9.848132e-09 | 1749 |
| 0.8524 | 0.6776 | 0.9099 | 0.6761 | 9.84796e-09 | 1750 |
| 0.8580 | 0.6824 | 0.9098 | 0.6761 | 9.8477875e-09 | 1751 |
| 0.8665 | 0.6824 | 0.9098 | 0.6761 | 9.847615e-09 | 1752 |
| 0.8600 | 0.6824 | 0.9097 | 0.6761 | 9.847443e-09 | 1753 |
| 0.8603 | 0.6800 | 0.9097 | 0.6761 | 9.8472706e-09 | 1754 |
| 0.8579 | 0.6800 | 0.9096 | 0.6761 | 9.847097e-09 | 1755 |
| 0.8503 | 0.6800 | 0.9096 | 0.6761 | 9.846924e-09 | 1756 |
| 0.8496 | 0.6800 | 0.9096 | 0.6761 | 9.846751e-09 | 1757 |
| 0.8585 | 0.6800 | 0.9095 | 0.6761 | 9.846578e-09 | 1758 |
| 0.8577 | 0.6800 | 0.9095 | 0.6761 | 9.846405e-09 | 1759 |
| 0.8597 | 0.6800 | 0.9094 | 0.6761 | 9.846231e-09 | 1760 |
| 0.8622 | 0.6800 | 0.9094 | 0.6761 | 9.846058e-09 | 1761 |
| 0.8519 | 0.6824 | 0.9093 | 0.6761 | 9.845885e-09 | 1762 |
| 0.8552 | 0.6776 | 0.9093 | 0.6761 | 9.845712e-09 | 1763 |
| 0.8683 | 0.6776 | 0.9093 | 0.6761 | 9.845538e-09 | 1764 |
| 0.8569 | 0.6824 | 0.9092 | 0.6761 | 9.845364e-09 | 1765 |
| 0.8561 | 0.6800 | 0.9092 | 0.6761 | 9.8451896e-09 | 1766 |
| 0.8519 | 0.6776 | 0.9091 | 0.6761 | 9.8450155e-09 | 1767 |
| 0.8563 | 0.6800 | 0.9091 | 0.6761 | 9.844841e-09 | 1768 |
| 0.8478 | 0.6800 | 0.9091 | 0.6761 | 9.844667e-09 | 1769 |
| 0.8555 | 0.6824 | 0.9090 | 0.6761 | 9.844493e-09 | 1770 |
| 0.8599 | 0.6800 | 0.9090 | 0.6761 | 9.844319e-09 | 1771 |
| 0.8610 | 0.6824 | 0.9090 | 0.6761 | 9.844145e-09 | 1772 |
| 0.8554 | 0.6776 | 0.9090 | 0.6761 | 9.84397e-09 | 1773 |
| 0.8603 | 0.6800 | 0.9089 | 0.6761 | 9.843795e-09 | 1774 |
| 0.8659 | 0.6776 | 0.9089 | 0.6761 | 9.84362e-09 | 1775 |
| 0.8587 | 0.6776 | 0.9089 | 0.6761 | 9.843445e-09 | 1776 |
| 0.8613 | 0.6824 | 0.9088 | 0.6761 | 9.84327e-09 | 1777 |
| 0.8547 | 0.6776 | 0.9088 | 0.6761 | 9.843095e-09 | 1778 |
| 0.8514 | 0.6753 | 0.9087 | 0.6761 | 9.84292e-09 | 1779 |
| 0.8548 | 0.6800 | 0.9087 | 0.6761 | 9.842745e-09 | 1780 |
| 0.8576 | 0.6800 | 0.9087 | 0.6761 | 9.84257e-09 | 1781 |
| 0.8576 | 0.6800 | 0.9087 | 0.6761 | 9.842395e-09 | 1782 |
| 0.8549 | 0.6800 | 0.9086 | 0.6761 | 9.8422195e-09 | 1783 |
| 0.8621 | 0.6800 | 0.9086 | 0.6761 | 9.842044e-09 | 1784 |
| 0.8595 | 0.6776 | 0.9085 | 0.6761 | 9.841868e-09 | 1785 |
| 0.8538 | 0.6776 | 0.9084 | 0.6761 | 9.841692e-09 | 1786 |
| 0.8507 | 0.6776 | 0.9084 | 0.6761 | 9.841516e-09 | 1787 |
| 0.8496 | 0.6800 | 0.9083 | 0.6761 | 9.84134e-09 | 1788 |
| 0.8599 | 0.6800 | 0.9083 | 0.6761 | 9.841164e-09 | 1789 |
| 0.8594 | 0.6776 | 0.9083 | 0.6761 | 9.8409885e-09 | 1790 |
| 0.8481 | 0.6800 | 0.9082 | 0.6761 | 9.840813e-09 | 1791 |
| 0.8550 | 0.6800 | 0.9082 | 0.6761 | 9.840636e-09 | 1792 |
| 0.8522 | 0.6847 | 0.9082 | 0.6761 | 9.840459e-09 | 1793 |
| 0.8491 | 0.6800 | 0.9082 | 0.6761 | 9.840282e-09 | 1794 |
| 0.8570 | 0.6753 | 0.9081 | 0.6761 | 9.840106e-09 | 1795 |
| 0.8674 | 0.6776 | 0.9080 | 0.6761 | 9.839929e-09 | 1796 |
| 0.8618 | 0.6800 | 0.9080 | 0.6761 | 9.839752e-09 | 1797 |
| 0.8440 | 0.6800 | 0.9079 | 0.6761 | 9.839575e-09 | 1798 |
| 0.8617 | 0.6800 | 0.9079 | 0.6761 | 9.839399e-09 | 1799 |
| 0.8620 | 0.6824 | 0.9079 | 0.6761 | 9.839222e-09 | 1800 |
| 0.8449 | 0.6824 | 0.9078 | 0.6761 | 9.839044e-09 | 1801 |
| 0.8566 | 0.6800 | 0.9077 | 0.6761 | 9.838867e-09 | 1802 |
| 0.8520 | 0.6800 | 0.9077 | 0.6761 | 9.838689e-09 | 1803 |
| 0.8605 | 0.6800 | 0.9077 | 0.6761 | 9.838511e-09 | 1804 |
| 0.8452 | 0.6800 | 0.9076 | 0.6761 | 9.838334e-09 | 1805 |
| 0.8493 | 0.6800 | 0.9076 | 0.6761 | 9.838156e-09 | 1806 |
| 0.8587 | 0.6776 | 0.9076 | 0.6761 | 9.837978e-09 | 1807 |
| 0.8527 | 0.6753 | 0.9075 | 0.6761 | 9.837801e-09 | 1808 |
| 0.8526 | 0.6824 | 0.9075 | 0.6761 | 9.837623e-09 | 1809 |
| 0.8420 | 0.6847 | 0.9074 | 0.6761 | 9.8374455e-09 | 1810 |
| 0.8603 | 0.6800 | 0.9074 | 0.6761 | 9.837267e-09 | 1811 |
| 0.8560 | 0.6776 | 0.9073 | 0.6761 | 9.8370885e-09 | 1812 |
| 0.8387 | 0.6847 | 0.9073 | 0.6761 | 9.83691e-09 | 1813 |
| 0.8542 | 0.6776 | 0.9072 | 0.6761 | 9.836731e-09 | 1814 |
| 0.8541 | 0.6824 | 0.9072 | 0.6761 | 9.836553e-09 | 1815 |
| 0.8527 | 0.6776 | 0.9071 | 0.6761 | 9.836374e-09 | 1816 |
| 0.8563 | 0.6800 | 0.9071 | 0.6761 | 9.836196e-09 | 1817 |
| 0.8537 | 0.6800 | 0.9071 | 0.6761 | 9.836017e-09 | 1818 |
| 0.8630 | 0.6824 | 0.9070 | 0.6761 | 9.835839e-09 | 1819 |
| 0.8507 | 0.6800 | 0.9070 | 0.6761 | 9.835659e-09 | 1820 |
| 0.8634 | 0.6776 | 0.9069 | 0.6761 | 9.83548e-09 | 1821 |
| 0.8499 | 0.6776 | 0.9069 | 0.6761 | 9.835301e-09 | 1822 |
| 0.8364 | 0.6847 | 0.9069 | 0.6761 | 9.835121e-09 | 1823 |
| 0.8486 | 0.6824 | 0.9068 | 0.6761 | 9.834942e-09 | 1824 |
| 0.8494 | 0.6776 | 0.9068 | 0.6761 | 9.834762e-09 | 1825 |
| 0.8483 | 0.6824 | 0.9067 | 0.6761 | 9.834583e-09 | 1826 |
| 0.8491 | 0.6847 | 0.9066 | 0.6761 | 9.8344035e-09 | 1827 |
| 0.8536 | 0.6800 | 0.9066 | 0.6761 | 9.834224e-09 | 1828 |
| 0.8544 | 0.6753 | 0.9066 | 0.6761 | 9.834044e-09 | 1829 |
| 0.8500 | 0.6800 | 0.9065 | 0.6761 | 9.8338635e-09 | 1830 |
| 0.8464 | 0.6800 | 0.9065 | 0.6761 | 9.833683e-09 | 1831 |
| 0.8570 | 0.6776 | 0.9064 | 0.6761 | 9.833503e-09 | 1832 |
| 0.8400 | 0.6847 | 0.9064 | 0.6761 | 9.833323e-09 | 1833 |
| 0.8494 | 0.6800 | 0.9063 | 0.6761 | 9.833142e-09 | 1834 |
| 0.8525 | 0.6824 | 0.9063 | 0.6761 | 9.832962e-09 | 1835 |
| 0.8518 | 0.6800 | 0.9062 | 0.6761 | 9.832782e-09 | 1836 |
| 0.8628 | 0.6824 | 0.9062 | 0.6761 | 9.832601e-09 | 1837 |
| 0.8569 | 0.6776 | 0.9062 | 0.6761 | 9.832421e-09 | 1838 |
| 0.8508 | 0.6800 | 0.9062 | 0.6761 | 9.83224e-09 | 1839 |
| 0.8645 | 0.6776 | 0.9061 | 0.6761 | 9.832059e-09 | 1840 |
| 0.8474 | 0.6847 | 0.9061 | 0.6761 | 9.8318775e-09 | 1841 |
| 0.8489 | 0.6800 | 0.9061 | 0.6761 | 9.831696e-09 | 1842 |
| 0.8390 | 0.6824 | 0.9061 | 0.6761 | 9.831515e-09 | 1843 |
| 0.8523 | 0.6847 | 0.9060 | 0.6761 | 9.831334e-09 | 1844 |
| 0.8537 | 0.6824 | 0.9060 | 0.6761 | 9.831153e-09 | 1845 |
| 0.8447 | 0.6824 | 0.9059 | 0.6761 | 9.830972e-09 | 1846 |
| 0.8505 | 0.6847 | 0.9059 | 0.6761 | 9.83079e-09 | 1847 |
| 0.8453 | 0.6776 | 0.9058 | 0.6761 | 9.830608e-09 | 1848 |
| 0.8598 | 0.6753 | 0.9058 | 0.6761 | 9.830426e-09 | 1849 |
| 0.8470 | 0.6847 | 0.9058 | 0.6761 | 9.830244e-09 | 1850 |
| 0.8518 | 0.6847 | 0.9057 | 0.6761 | 9.830062e-09 | 1851 |
| 0.8526 | 0.6776 | 0.9056 | 0.6761 | 9.82988e-09 | 1852 |
| 0.8474 | 0.6800 | 0.9056 | 0.6761 | 9.829698e-09 | 1853 |
| 0.8495 | 0.6800 | 0.9055 | 0.6761 | 9.829516e-09 | 1854 |
| 0.8458 | 0.6753 | 0.9055 | 0.6761 | 9.829334e-09 | 1855 |
| 0.8449 | 0.6824 | 0.9055 | 0.6761 | 9.829152e-09 | 1856 |
| 0.8447 | 0.6776 | 0.9054 | 0.6761 | 9.828969e-09 | 1857 |
| 0.8422 | 0.6847 | 0.9054 | 0.6761 | 9.828786e-09 | 1858 |
| 0.8464 | 0.6776 | 0.9054 | 0.6761 | 9.828603e-09 | 1859 |
| 0.8561 | 0.6753 | 0.9053 | 0.6761 | 9.82842e-09 | 1860 |
| 0.8508 | 0.6776 | 0.9053 | 0.6761 | 9.828237e-09 | 1861 |
| 0.8484 | 0.6800 | 0.9052 | 0.6761 | 9.828054e-09 | 1862 |
| 0.8485 | 0.6800 | 0.9052 | 0.6761 | 9.827871e-09 | 1863 |
| 0.8548 | 0.6847 | 0.9051 | 0.6761 | 9.827688e-09 | 1864 |
| 0.8552 | 0.6776 | 0.9051 | 0.6761 | 9.827505e-09 | 1865 |
| 0.8484 | 0.6800 | 0.9051 | 0.6761 | 9.827322e-09 | 1866 |
| 0.8513 | 0.6824 | 0.9050 | 0.6761 | 9.827138e-09 | 1867 |
| 0.8465 | 0.6847 | 0.9050 | 0.6761 | 9.826954e-09 | 1868 |
| 0.8522 | 0.6824 | 0.9050 | 0.6761 | 9.8267705e-09 | 1869 |
| 0.8556 | 0.6824 | 0.9049 | 0.6761 | 9.826587e-09 | 1870 |
| 0.8486 | 0.6800 | 0.9048 | 0.6761 | 9.826403e-09 | 1871 |
| 0.8497 | 0.6847 | 0.9047 | 0.6761 | 9.826219e-09 | 1872 |
| 0.8476 | 0.6800 | 0.9047 | 0.6761 | 9.826035e-09 | 1873 |
| 0.8554 | 0.6824 | 0.9046 | 0.6761 | 9.825851e-09 | 1874 |
| 0.8517 | 0.6776 | 0.9046 | 0.6761 | 9.825667e-09 | 1875 |
| 0.8390 | 0.6824 | 0.9046 | 0.6761 | 9.825483e-09 | 1876 |
| 0.8409 | 0.6800 | 0.9045 | 0.6761 | 9.825298e-09 | 1877 |
| 0.8433 | 0.6776 | 0.9045 | 0.6761 | 9.825113e-09 | 1878 |
| 0.8482 | 0.6800 | 0.9044 | 0.6761 | 9.824928e-09 | 1879 |
| 0.8519 | 0.6871 | 0.9044 | 0.6761 | 9.824744e-09 | 1880 |
| 0.8576 | 0.6776 | 0.9044 | 0.6761 | 9.824559e-09 | 1881 |
| 0.8539 | 0.6800 | 0.9044 | 0.6761 | 9.824374e-09 | 1882 |
| 0.8468 | 0.6776 | 0.9043 | 0.6761 | 9.8241895e-09 | 1883 |
| 0.8374 | 0.6824 | 0.9043 | 0.6761 | 9.824005e-09 | 1884 |
| 0.8502 | 0.6800 | 0.9042 | 0.6761 | 9.82382e-09 | 1885 |
| 0.8461 | 0.6800 | 0.9042 | 0.6761 | 9.823634e-09 | 1886 |
| 0.8375 | 0.6824 | 0.9042 | 0.6761 | 9.823449e-09 | 1887 |
| 0.8376 | 0.6847 | 0.9041 | 0.6761 | 9.823263e-09 | 1888 |
| 0.8440 | 0.6800 | 0.9041 | 0.6761 | 9.8230775e-09 | 1889 |
| 0.8580 | 0.6800 | 0.9040 | 0.6761 | 9.822892e-09 | 1890 |
| 0.8458 | 0.6800 | 0.9040 | 0.6761 | 9.822706e-09 | 1891 |
| 0.8477 | 0.6824 | 0.9040 | 0.6761 | 9.822521e-09 | 1892 |
| 0.8488 | 0.6776 | 0.9040 | 0.6761 | 9.822335e-09 | 1893 |
| 0.8426 | 0.6776 | 0.9040 | 0.6761 | 9.822149e-09 | 1894 |
| 0.8479 | 0.6753 | 0.9038 | 0.6761 | 9.821963e-09 | 1895 |
| 0.8417 | 0.6800 | 0.9038 | 0.6761 | 9.821776e-09 | 1896 |
| 0.8548 | 0.6847 | 0.9037 | 0.6761 | 9.82159e-09 | 1897 |
| 0.8581 | 0.6800 | 0.9037 | 0.6761 | 9.821403e-09 | 1898 |
| 0.8452 | 0.6871 | 0.9037 | 0.6761 | 9.821217e-09 | 1899 |
| 0.8514 | 0.6753 | 0.9036 | 0.6761 | 9.82103e-09 | 1900 |
| 0.8439 | 0.6847 | 0.9036 | 0.6761 | 9.820844e-09 | 1901 |
| 0.8528 | 0.6824 | 0.9036 | 0.6761 | 9.820657e-09 | 1902 |
| 0.8425 | 0.6800 | 0.9035 | 0.6761 | 9.820471e-09 | 1903 |
| 0.8475 | 0.6800 | 0.9034 | 0.6761 | 9.820283e-09 | 1904 |
| 0.8519 | 0.6776 | 0.9034 | 0.6761 | 9.820096e-09 | 1905 |
| 0.8378 | 0.6871 | 0.9033 | 0.6761 | 9.819908e-09 | 1906 |
| 0.8489 | 0.6800 | 0.9033 | 0.6761 | 9.819721e-09 | 1907 |
| 0.8317 | 0.6824 | 0.9032 | 0.6761 | 9.819534e-09 | 1908 |
| 0.8578 | 0.6776 | 0.9032 | 0.6761 | 9.819346e-09 | 1909 |
| 0.8485 | 0.6824 | 0.9031 | 0.6761 | 9.819159e-09 | 1910 |
| 0.8398 | 0.6776 | 0.9031 | 0.6761 | 9.818971e-09 | 1911 |
| 0.8452 | 0.6824 | 0.9030 | 0.6761 | 9.818784e-09 | 1912 |
| 0.8400 | 0.6871 | 0.9030 | 0.6761 | 9.818597e-09 | 1913 |
| 0.8432 | 0.6871 | 0.9030 | 0.6761 | 9.818408e-09 | 1914 |
| 0.8365 | 0.6824 | 0.9030 | 0.6761 | 9.81822e-09 | 1915 |
| 0.8396 | 0.6847 | 0.9029 | 0.6761 | 9.818032e-09 | 1916 |
| 0.8390 | 0.6824 | 0.9028 | 0.6761 | 9.817843e-09 | 1917 |
| 0.8574 | 0.6800 | 0.9028 | 0.6761 | 9.817655e-09 | 1918 |
| 0.8506 | 0.6800 | 0.9028 | 0.6761 | 9.817467e-09 | 1919 |
| 0.8528 | 0.6847 | 0.9027 | 0.6761 | 9.8172785e-09 | 1920 |
| 0.8449 | 0.6847 | 0.9027 | 0.6761 | 9.81709e-09 | 1921 |
| 0.8500 | 0.6800 | 0.9027 | 0.6761 | 9.816902e-09 | 1922 |
| 0.8434 | 0.6871 | 0.9026 | 0.6761 | 9.816713e-09 | 1923 |
| 0.8391 | 0.6776 | 0.9026 | 0.6761 | 9.816524e-09 | 1924 |
| 0.8463 | 0.6776 | 0.9025 | 0.6761 | 9.816334e-09 | 1925 |
| 0.8432 | 0.6824 | 0.9024 | 0.6761 | 9.816145e-09 | 1926 |
| 0.8474 | 0.6824 | 0.9024 | 0.6761 | 9.815956e-09 | 1927 |
| 0.8486 | 0.6824 | 0.9024 | 0.6761 | 9.815767e-09 | 1928 |
| 0.8466 | 0.6824 | 0.9023 | 0.6761 | 9.815578e-09 | 1929 |
| 0.8328 | 0.6800 | 0.9023 | 0.6761 | 9.8153885e-09 | 1930 |
| 0.8411 | 0.6847 | 0.9022 | 0.6761 | 9.815199e-09 | 1931 |
| 0.8462 | 0.6824 | 0.9021 | 0.6761 | 9.81501e-09 | 1932 |
| 0.8405 | 0.6847 | 0.9021 | 0.6761 | 9.81482e-09 | 1933 |
| 0.8456 | 0.6776 | 0.9020 | 0.6761 | 9.81463e-09 | 1934 |
| 0.8441 | 0.6800 | 0.9020 | 0.6761 | 9.81444e-09 | 1935 |
| 0.8484 | 0.6800 | 0.9020 | 0.6761 | 9.81425e-09 | 1936 |
| 0.8398 | 0.6800 | 0.9019 | 0.6761 | 9.81406e-09 | 1937 |
| 0.8328 | 0.6871 | 0.9019 | 0.6761 | 9.81387e-09 | 1938 |
| 0.8443 | 0.6824 | 0.9019 | 0.6761 | 9.81368e-09 | 1939 |
| 0.8341 | 0.6847 | 0.9018 | 0.6761 | 9.81349e-09 | 1940 |
| 0.8373 | 0.6847 | 0.9018 | 0.6761 | 9.8132995e-09 | 1941 |
| 0.8477 | 0.6824 | 0.9018 | 0.6761 | 9.8131085e-09 | 1942 |
| 0.8474 | 0.6847 | 0.9017 | 0.6761 | 9.812918e-09 | 1943 |
| 0.8421 | 0.6824 | 0.9016 | 0.6761 | 9.812727e-09 | 1944 |
| 0.8456 | 0.6847 | 0.9015 | 0.6761 | 9.812536e-09 | 1945 |
| 0.8371 | 0.6824 | 0.9014 | 0.6761 | 9.812345e-09 | 1946 |
| 0.8400 | 0.6847 | 0.9014 | 0.6761 | 9.812154e-09 | 1947 |
| 0.8482 | 0.6847 | 0.9013 | 0.6761 | 9.811963e-09 | 1948 |
| 0.8347 | 0.6824 | 0.9013 | 0.6761 | 9.811772e-09 | 1949 |
| 0.8306 | 0.6776 | 0.9013 | 0.6761 | 9.811581e-09 | 1950 |
| 0.8435 | 0.6776 | 0.9012 | 0.6761 | 9.811389e-09 | 1951 |
| 0.8502 | 0.6800 | 0.9012 | 0.6761 | 9.811197e-09 | 1952 |
| 0.8522 | 0.6800 | 0.9012 | 0.6761 | 9.811005e-09 | 1953 |
| 0.8351 | 0.6847 | 0.9011 | 0.6761 | 9.8108135e-09 | 1954 |
| 0.8382 | 0.6800 | 0.9011 | 0.6761 | 9.810622e-09 | 1955 |
| 0.8510 | 0.6800 | 0.9011 | 0.6761 | 9.81043e-09 | 1956 |
| 0.8460 | 0.6800 | 0.9010 | 0.6761 | 9.810238e-09 | 1957 |
| 0.8491 | 0.6824 | 0.9010 | 0.6761 | 9.810046e-09 | 1958 |
| 0.8461 | 0.6800 | 0.9010 | 0.6761 | 9.809854e-09 | 1959 |
| 0.8395 | 0.6800 | 0.9009 | 0.6761 | 9.809662e-09 | 1960 |
| 0.8427 | 0.6776 | 0.9009 | 0.6761 | 9.80947e-09 | 1961 |
| 0.8363 | 0.6847 | 0.9008 | 0.6761 | 9.809277e-09 | 1962 |
| 0.8403 | 0.6800 | 0.9008 | 0.6761 | 9.809084e-09 | 1963 |
| 0.8432 | 0.6800 | 0.9007 | 0.6761 | 9.8088915e-09 | 1964 |
| 0.8412 | 0.6824 | 0.9007 | 0.6761 | 9.808699e-09 | 1965 |
| 0.8430 | 0.6847 | 0.9006 | 0.6761 | 9.808506e-09 | 1966 |
| 0.8397 | 0.6776 | 0.9006 | 0.6761 | 9.808313e-09 | 1967 |
| 0.8359 | 0.6824 | 0.9005 | 0.6761 | 9.8081205e-09 | 1968 |
| 0.8273 | 0.6800 | 0.9005 | 0.6761 | 9.807928e-09 | 1969 |
| 0.8379 | 0.6871 | 0.9004 | 0.6761 | 9.807734e-09 | 1970 |
| 0.8465 | 0.6776 | 0.9004 | 0.6761 | 9.807541e-09 | 1971 |
| 0.8440 | 0.6800 | 0.9004 | 0.6761 | 9.807347e-09 | 1972 |
| 0.8437 | 0.6824 | 0.9003 | 0.6761 | 9.807153e-09 | 1973 |
| 0.8439 | 0.6776 | 0.9002 | 0.6761 | 9.80696e-09 | 1974 |
| 0.8520 | 0.6847 | 0.9002 | 0.6761 | 9.806766e-09 | 1975 |
| 0.8398 | 0.6871 | 0.9001 | 0.6761 | 9.806572e-09 | 1976 |
| 0.8463 | 0.6776 | 0.9001 | 0.6761 | 9.806379e-09 | 1977 |
| 0.8417 | 0.6824 | 0.9001 | 0.6761 | 9.806185e-09 | 1978 |
| 0.8424 | 0.6871 | 0.9000 | 0.6761 | 9.805992e-09 | 1979 |
| 0.8315 | 0.6800 | 0.9000 | 0.6761 | 9.805797e-09 | 1980 |
| 0.8494 | 0.6918 | 0.8999 | 0.6761 | 9.8056026e-09 | 1981 |
| 0.8497 | 0.6824 | 0.9000 | 0.6761 | 9.805408e-09 | 1982 |
| 0.8433 | 0.6824 | 0.8999 | 0.6761 | 9.8052135e-09 | 1983 |
| 0.8350 | 0.6847 | 0.8999 | 0.6761 | 9.805019e-09 | 1984 |
| 0.8364 | 0.6824 | 0.8998 | 0.6761 | 9.8048245e-09 | 1985 |
| 0.8465 | 0.6824 | 0.8998 | 0.6761 | 9.80463e-09 | 1986 |
| 0.8490 | 0.6847 | 0.8998 | 0.6761 | 9.8044355e-09 | 1987 |
| 0.8303 | 0.6776 | 0.8998 | 0.6761 | 9.804241e-09 | 1988 |
| 0.8393 | 0.6847 | 0.8997 | 0.6761 | 9.804046e-09 | 1989 |
| 0.8415 | 0.6871 | 0.8997 | 0.6761 | 9.80385e-09 | 1990 |
| 0.8383 | 0.6871 | 0.8996 | 0.6761 | 9.803655e-09 | 1991 |
| 0.8355 | 0.6753 | 0.8996 | 0.6761 | 9.803459e-09 | 1992 |
| 0.8381 | 0.6824 | 0.8995 | 0.6761 | 9.803264e-09 | 1993 |
| 0.8418 | 0.6753 | 0.8995 | 0.6761 | 9.803069e-09 | 1994 |
| 0.8323 | 0.6824 | 0.8994 | 0.6761 | 9.802873e-09 | 1995 |
| 0.8319 | 0.6824 | 0.8994 | 0.6761 | 9.802678e-09 | 1996 |
| 0.8372 | 0.6871 | 0.8994 | 0.6761 | 9.802482e-09 | 1997 |
| 0.8367 | 0.6847 | 0.8994 | 0.6761 | 9.802286e-09 | 1998 |
| 0.8393 | 0.6800 | 0.8994 | 0.6761 | 9.80209e-09 | 1999 |
| 0.8454 | 0.6824 | 0.8993 | 0.6761 | 9.8018935e-09 | 2000 |
| 0.8457 | 0.6800 | 0.8993 | 0.6761 | 9.801697e-09 | 2001 |
| 0.8384 | 0.6824 | 0.8993 | 0.6761 | 9.801501e-09 | 2002 |
| 0.8524 | 0.6753 | 0.8992 | 0.6761 | 9.801305e-09 | 2003 |
| 0.8382 | 0.6871 | 0.8992 | 0.6761 | 9.801108e-09 | 2004 |
| 0.8411 | 0.6847 | 0.8992 | 0.6761 | 9.800912e-09 | 2005 |
| 0.8232 | 0.6847 | 0.8992 | 0.6761 | 9.800716e-09 | 2006 |
| 0.8271 | 0.6847 | 0.8992 | 0.6761 | 9.8005195e-09 | 2007 |
| 0.8269 | 0.6776 | 0.8992 | 0.6761 | 9.800322e-09 | 2008 |
| 0.8360 | 0.6894 | 0.8991 | 0.6761 | 9.800125e-09 | 2009 |
| 0.8318 | 0.6847 | 0.8991 | 0.6761 | 9.799928e-09 | 2010 |
| 0.8400 | 0.6753 | 0.8991 | 0.6761 | 9.799731e-09 | 2011 |
| 0.8326 | 0.6824 | 0.8991 | 0.6761 | 9.799534e-09 | 2012 |
| 0.8410 | 0.6824 | 0.8990 | 0.6761 | 9.7993365e-09 | 2013 |
| 0.8457 | 0.6871 | 0.8990 | 0.6761 | 9.799139e-09 | 2014 |
| 0.8254 | 0.6847 | 0.8990 | 0.6761 | 9.798942e-09 | 2015 |
| 0.8277 | 0.6824 | 0.8990 | 0.6761 | 9.798745e-09 | 2016 |
| 0.8385 | 0.6847 | 0.8989 | 0.6761 | 9.798547e-09 | 2017 |
| 0.8303 | 0.6776 | 0.8989 | 0.6761 | 9.798349e-09 | 2018 |
| 0.8392 | 0.6824 | 0.8988 | 0.6761 | 9.798151e-09 | 2019 |
| 0.8393 | 0.6847 | 0.8987 | 0.6761 | 9.797953e-09 | 2020 |
| 0.8434 | 0.6800 | 0.8987 | 0.6761 | 9.797755e-09 | 2021 |
| 0.8481 | 0.6918 | 0.8987 | 0.6761 | 9.7975565e-09 | 2022 |
| 0.8348 | 0.6776 | 0.8987 | 0.6761 | 9.7973585e-09 | 2023 |
| 0.8412 | 0.6871 | 0.8987 | 0.6761 | 9.79716e-09 | 2024 |
| 0.8442 | 0.6800 | 0.8986 | 0.6761 | 9.796962e-09 | 2025 |
| 0.8394 | 0.6871 | 0.8986 | 0.6761 | 9.796764e-09 | 2026 |
| 0.8450 | 0.6847 | 0.8986 | 0.6761 | 9.796565e-09 | 2027 |
| 0.8447 | 0.6871 | 0.8985 | 0.6761 | 9.796366e-09 | 2028 |
| 0.8437 | 0.6847 | 0.8985 | 0.6761 | 9.796167e-09 | 2029 |
| 0.8357 | 0.6824 | 0.8984 | 0.6761 | 9.7959685e-09 | 2030 |
| 0.8348 | 0.6800 | 0.8984 | 0.6761 | 9.7957695e-09 | 2031 |
| 0.8226 | 0.6847 | 0.8983 | 0.6761 | 9.795571e-09 | 2032 |
| 0.8326 | 0.6918 | 0.8983 | 0.6761 | 9.795372e-09 | 2033 |
| 0.8359 | 0.6847 | 0.8983 | 0.6761 | 9.795173e-09 | 2034 |
| 0.8387 | 0.6847 | 0.8983 | 0.6761 | 9.794974e-09 | 2035 |
| 0.8258 | 0.6847 | 0.8982 | 0.6761 | 9.794774e-09 | 2036 |
| 0.8379 | 0.6871 | 0.8981 | 0.6761 | 9.794574e-09 | 2037 |
| 0.8408 | 0.6824 | 0.8981 | 0.6761 | 9.794374e-09 | 2038 |
| 0.8345 | 0.6847 | 0.8980 | 0.6761 | 9.794174e-09 | 2039 |
| 0.8324 | 0.6894 | 0.8979 | 0.6761 | 9.7939745e-09 | 2040 |
| 0.8377 | 0.6871 | 0.8979 | 0.6761 | 9.793775e-09 | 2041 |
| 0.8439 | 0.6847 | 0.8979 | 0.6761 | 9.793575e-09 | 2042 |
| 0.8338 | 0.6824 | 0.8978 | 0.6761 | 9.793375e-09 | 2043 |
| 0.8277 | 0.6824 | 0.8978 | 0.6761 | 9.793175e-09 | 2044 |
| 0.8434 | 0.6824 | 0.8978 | 0.6761 | 9.792975e-09 | 2045 |
| 0.8319 | 0.6894 | 0.8977 | 0.6761 | 9.792775e-09 | 2046 |
| 0.8320 | 0.6824 | 0.8977 | 0.6761 | 9.792574e-09 | 2047 |
| 0.8495 | 0.6871 | 0.8976 | 0.6761 | 9.792373e-09 | 2048 |
| 0.8467 | 0.6776 | 0.8976 | 0.6761 | 9.792172e-09 | 2049 |
| 0.8256 | 0.6894 | 0.8976 | 0.6761 | 9.791972e-09 | 2050 |
| 0.8409 | 0.6847 | 0.8975 | 0.6761 | 9.791771e-09 | 2051 |
| 0.8410 | 0.6894 | 0.8975 | 0.6761 | 9.79157e-09 | 2052 |
| 0.8295 | 0.6824 | 0.8975 | 0.6761 | 9.7913695e-09 | 2053 |
| 0.8384 | 0.6847 | 0.8975 | 0.6761 | 9.791169e-09 | 2054 |
| 0.8423 | 0.6776 | 0.8975 | 0.6761 | 9.790967e-09 | 2055 |
| 0.8348 | 0.6894 | 0.8975 | 0.6761 | 9.7907655e-09 | 2056 |
| 0.8335 | 0.6776 | 0.8974 | 0.6761 | 9.790564e-09 | 2057 |
| 0.8411 | 0.6871 | 0.8974 | 0.6761 | 9.790362e-09 | 2058 |
| 0.8298 | 0.6824 | 0.8973 | 0.6761 | 9.790161e-09 | 2059 |
| 0.8355 | 0.6871 | 0.8973 | 0.6761 | 9.789959e-09 | 2060 |
| 0.8313 | 0.6847 | 0.8972 | 0.6761 | 9.7897574e-09 | 2061 |
| 0.8266 | 0.6847 | 0.8972 | 0.6761 | 9.789556e-09 | 2062 |
| 0.8348 | 0.6824 | 0.8971 | 0.6761 | 9.789354e-09 | 2063 |
| 0.8307 | 0.6847 | 0.8971 | 0.6761 | 9.789153e-09 | 2064 |
| 0.8333 | 0.6824 | 0.8970 | 0.6761 | 9.78895e-09 | 2065 |
| 0.8318 | 0.6800 | 0.8970 | 0.6761 | 9.788748e-09 | 2066 |
| 0.8392 | 0.6847 | 0.8969 | 0.6761 | 9.788545e-09 | 2067 |
| 0.8372 | 0.6847 | 0.8969 | 0.6761 | 9.788343e-09 | 2068 |
| 0.8364 | 0.6824 | 0.8968 | 0.6761 | 9.78814e-09 | 2069 |
| 0.8357 | 0.6824 | 0.8968 | 0.6761 | 9.787938e-09 | 2070 |
| 0.8367 | 0.6847 | 0.8967 | 0.6761 | 9.787735e-09 | 2071 |
| 0.8309 | 0.6800 | 0.8967 | 0.6761 | 9.787533e-09 | 2072 |
| 0.8310 | 0.6871 | 0.8966 | 0.6761 | 9.78733e-09 | 2073 |
| 0.8252 | 0.6894 | 0.8965 | 0.6761 | 9.787127e-09 | 2074 |
| 0.8329 | 0.6847 | 0.8965 | 0.6761 | 9.786923e-09 | 2075 |
| 0.8296 | 0.6847 | 0.8964 | 0.6761 | 9.78672e-09 | 2076 |
| 0.8391 | 0.6871 | 0.8964 | 0.6761 | 9.7865165e-09 | 2077 |
| 0.8229 | 0.6847 | 0.8964 | 0.6761 | 9.786313e-09 | 2078 |
| 0.8487 | 0.6776 | 0.8963 | 0.6761 | 9.78611e-09 | 2079 |
| 0.8274 | 0.6847 | 0.8963 | 0.6761 | 9.785906e-09 | 2080 |
| 0.8308 | 0.6847 | 0.8963 | 0.6761 | 9.785703e-09 | 2081 |
| 0.8418 | 0.6776 | 0.8962 | 0.6761 | 9.7854995e-09 | 2082 |
| 0.8360 | 0.6800 | 0.8962 | 0.6761 | 9.785296e-09 | 2083 |
| 0.8374 | 0.6800 | 0.8962 | 0.6761 | 9.785092e-09 | 2084 |
| 0.8326 | 0.6871 | 0.8961 | 0.6761 | 9.784888e-09 | 2085 |
| 0.8337 | 0.6871 | 0.8961 | 0.6761 | 9.784683e-09 | 2086 |
| 0.8358 | 0.6847 | 0.8960 | 0.6761 | 9.784479e-09 | 2087 |
| 0.8351 | 0.6918 | 0.8960 | 0.6761 | 9.784275e-09 | 2088 |
| 0.8290 | 0.6847 | 0.8960 | 0.6761 | 9.78407e-09 | 2089 |
| 0.8320 | 0.6847 | 0.8959 | 0.6761 | 9.783866e-09 | 2090 |
| 0.8287 | 0.6871 | 0.8959 | 0.6761 | 9.783662e-09 | 2091 |
| 0.8370 | 0.6918 | 0.8959 | 0.6761 | 9.783458e-09 | 2092 |
| 0.8386 | 0.6776 | 0.8958 | 0.6761 | 9.783252e-09 | 2093 |
| 0.8301 | 0.6871 | 0.8958 | 0.6761 | 9.783047e-09 | 2094 |
| 0.8263 | 0.6847 | 0.8958 | 0.6761 | 9.782842e-09 | 2095 |
| 0.8358 | 0.6871 | 0.8957 | 0.6761 | 9.782637e-09 | 2096 |
| 0.8299 | 0.6847 | 0.8957 | 0.6761 | 9.782432e-09 | 2097 |
| 0.8366 | 0.6847 | 0.8957 | 0.6761 | 9.782227e-09 | 2098 |
| 0.8385 | 0.6871 | 0.8956 | 0.6761 | 9.782021e-09 | 2099 |
| 0.8295 | 0.6824 | 0.8956 | 0.6761 | 9.781816e-09 | 2100 |
| 0.8389 | 0.6894 | 0.8955 | 0.6761 | 9.781611e-09 | 2101 |
| 0.8306 | 0.6871 | 0.8955 | 0.6761 | 9.781406e-09 | 2102 |
| 0.8342 | 0.6871 | 0.8955 | 0.6761 | 9.7812e-09 | 2103 |
| 0.8238 | 0.6847 | 0.8954 | 0.6761 | 9.780994e-09 | 2104 |
| 0.8403 | 0.6894 | 0.8954 | 0.6761 | 9.780788e-09 | 2105 |
| 0.8325 | 0.6871 | 0.8953 | 0.6761 | 9.780582e-09 | 2106 |
| 0.8193 | 0.6847 | 0.8953 | 0.6761 | 9.780376e-09 | 2107 |
| 0.8278 | 0.6824 | 0.8952 | 0.6761 | 9.78017e-09 | 2108 |
| 0.8368 | 0.6847 | 0.8952 | 0.6761 | 9.7799635e-09 | 2109 |
| 0.8374 | 0.6871 | 0.8951 | 0.6761 | 9.7797574e-09 | 2110 |
| 0.8276 | 0.6847 | 0.8951 | 0.6761 | 9.779551e-09 | 2111 |
| 0.8261 | 0.6824 | 0.8951 | 0.6761 | 9.7793444e-09 | 2112 |
| 0.8437 | 0.6824 | 0.8950 | 0.6761 | 9.7791375e-09 | 2113 |
| 0.8261 | 0.6824 | 0.8950 | 0.6761 | 9.7789306e-09 | 2114 |
| 0.8206 | 0.6847 | 0.8950 | 0.6761 | 9.778724e-09 | 2115 |
| 0.8250 | 0.6800 | 0.8949 | 0.6761 | 9.778517e-09 | 2116 |
| 0.8229 | 0.6871 | 0.8949 | 0.6761 | 9.77831e-09 | 2117 |
| 0.8328 | 0.6824 | 0.8948 | 0.6761 | 9.778103e-09 | 2118 |
| 0.8336 | 0.6894 | 0.8948 | 0.6761 | 9.777896e-09 | 2119 |
| 0.8340 | 0.6800 | 0.8947 | 0.6761 | 9.777689e-09 | 2120 |
| 0.8335 | 0.6871 | 0.8947 | 0.6761 | 9.777482e-09 | 2121 |
| 0.8283 | 0.6847 | 0.8947 | 0.6761 | 9.777274e-09 | 2122 |
| 0.8353 | 0.6824 | 0.8946 | 0.6761 | 9.777066e-09 | 2123 |
| 0.8261 | 0.6800 | 0.8945 | 0.6761 | 9.776858e-09 | 2124 |
| 0.8334 | 0.6871 | 0.8945 | 0.6761 | 9.776651e-09 | 2125 |
| 0.8230 | 0.6894 | 0.8945 | 0.6761 | 9.776443e-09 | 2126 |
| 0.8304 | 0.6894 | 0.8944 | 0.6761 | 9.776235e-09 | 2127 |
| 0.8300 | 0.6894 | 0.8944 | 0.6761 | 9.776027e-09 | 2128 |
| 0.8324 | 0.6800 | 0.8944 | 0.6761 | 9.775819e-09 | 2129 |
| 0.8393 | 0.6847 | 0.8943 | 0.6761 | 9.775611e-09 | 2130 |
| 0.8195 | 0.6918 | 0.8943 | 0.6761 | 9.775403e-09 | 2131 |
| 0.8198 | 0.6871 | 0.8942 | 0.6761 | 9.775194e-09 | 2132 |
| 0.8311 | 0.6871 | 0.8942 | 0.6761 | 9.774985e-09 | 2133 |
| 0.8239 | 0.6941 | 0.8941 | 0.6761 | 9.7747765e-09 | 2134 |
| 0.8385 | 0.6800 | 0.8941 | 0.6761 | 9.774568e-09 | 2135 |
| 0.8331 | 0.6824 | 0.8941 | 0.6761 | 9.774359e-09 | 2136 |
| 0.8361 | 0.6824 | 0.8940 | 0.6761 | 9.77415e-09 | 2137 |
| 0.8259 | 0.6847 | 0.8940 | 0.6761 | 9.773942e-09 | 2138 |
| 0.8237 | 0.6824 | 0.8939 | 0.6761 | 9.773733e-09 | 2139 |
| 0.8182 | 0.6824 | 0.8939 | 0.6761 | 9.773524e-09 | 2140 |
| 0.8283 | 0.6871 | 0.8939 | 0.6761 | 9.773315e-09 | 2141 |
| 0.8283 | 0.6824 | 0.8938 | 0.6761 | 9.773105e-09 | 2142 |
| 0.8208 | 0.6894 | 0.8938 | 0.6761 | 9.772895e-09 | 2143 |
| 0.8257 | 0.6776 | 0.8938 | 0.6761 | 9.772686e-09 | 2144 |
| 0.8349 | 0.6871 | 0.8937 | 0.6761 | 9.772476e-09 | 2145 |
| 0.8317 | 0.6847 | 0.8937 | 0.6761 | 9.7722666e-09 | 2146 |
| 0.8243 | 0.6894 | 0.8936 | 0.6761 | 9.772057e-09 | 2147 |
| 0.8171 | 0.6824 | 0.8936 | 0.6761 | 9.771847e-09 | 2148 |
| 0.8265 | 0.6871 | 0.8935 | 0.6761 | 9.771638e-09 | 2149 |
| 0.8195 | 0.6824 | 0.8935 | 0.6761 | 9.771427e-09 | 2150 |
| 0.8257 | 0.6918 | 0.8935 | 0.6761 | 9.771217e-09 | 2151 |
| 0.8264 | 0.6918 | 0.8935 | 0.6761 | 9.771006e-09 | 2152 |
| 0.8269 | 0.6894 | 0.8934 | 0.6761 | 9.770796e-09 | 2153 |
| 0.8148 | 0.6894 | 0.8934 | 0.6761 | 9.770585e-09 | 2154 |
| 0.8247 | 0.6824 | 0.8933 | 0.6761 | 9.770375e-09 | 2155 |
| 0.8169 | 0.6871 | 0.8932 | 0.6761 | 9.770164e-09 | 2156 |
| 0.8333 | 0.6824 | 0.8932 | 0.6761 | 9.769954e-09 | 2157 |
| 0.8281 | 0.6894 | 0.8932 | 0.6761 | 9.769743e-09 | 2158 |
| 0.8234 | 0.6871 | 0.8932 | 0.6761 | 9.769533e-09 | 2159 |
| 0.8231 | 0.6824 | 0.8931 | 0.6761 | 9.769321e-09 | 2160 |
| 0.8141 | 0.6847 | 0.8931 | 0.6761 | 9.76911e-09 | 2161 |
| 0.8316 | 0.6918 | 0.8931 | 0.6761 | 9.768899e-09 | 2162 |
| 0.8204 | 0.6941 | 0.8930 | 0.6761 | 9.768687e-09 | 2163 |
| 0.8243 | 0.6894 | 0.8930 | 0.6761 | 9.768476e-09 | 2164 |
| 0.8203 | 0.6800 | 0.8930 | 0.6761 | 9.768264e-09 | 2165 |
| 0.8292 | 0.6753 | 0.8929 | 0.6761 | 9.768053e-09 | 2166 |
| 0.8301 | 0.6824 | 0.8929 | 0.6761 | 9.767842e-09 | 2167 |
| 0.8248 | 0.6800 | 0.8928 | 0.6761 | 9.76763e-09 | 2168 |
| 0.8337 | 0.6894 | 0.8928 | 0.6761 | 9.767418e-09 | 2169 |
| 0.8139 | 0.6847 | 0.8928 | 0.6761 | 9.767206e-09 | 2170 |
| 0.8206 | 0.6894 | 0.8928 | 0.6761 | 9.766993e-09 | 2171 |
| 0.8205 | 0.6918 | 0.8927 | 0.6761 | 9.766781e-09 | 2172 |
| 0.8229 | 0.6847 | 0.8927 | 0.6761 | 9.766569e-09 | 2173 |
| 0.8214 | 0.6894 | 0.8926 | 0.6761 | 9.766357e-09 | 2174 |
| 0.8236 | 0.6800 | 0.8926 | 0.6761 | 9.766144e-09 | 2175 |
| 0.8234 | 0.6847 | 0.8925 | 0.6761 | 9.765932e-09 | 2176 |
| 0.8295 | 0.6871 | 0.8924 | 0.6761 | 9.76572e-09 | 2177 |
| 0.8214 | 0.6918 | 0.8924 | 0.6761 | 9.7655075e-09 | 2178 |
| 0.8158 | 0.6800 | 0.8923 | 0.6761 | 9.765294e-09 | 2179 |
| 0.8289 | 0.6824 | 0.8923 | 0.6761 | 9.765081e-09 | 2180 |
| 0.8274 | 0.6871 | 0.8923 | 0.6761 | 9.764868e-09 | 2181 |
| 0.8217 | 0.6847 | 0.8922 | 0.6761 | 9.764655e-09 | 2182 |
| 0.8222 | 0.6965 | 0.8922 | 0.6761 | 9.764442e-09 | 2183 |
| 0.8358 | 0.6824 | 0.8921 | 0.6761 | 9.7642285e-09 | 2184 |
| 0.8185 | 0.6800 | 0.8922 | 0.6761 | 9.764015e-09 | 2185 |
| 0.8310 | 0.6871 | 0.8921 | 0.6761 | 9.763802e-09 | 2186 |
| 0.8249 | 0.6894 | 0.8921 | 0.6761 | 9.763589e-09 | 2187 |
| 0.8266 | 0.6824 | 0.8921 | 0.6761 | 9.763375e-09 | 2188 |
| 0.8204 | 0.6824 | 0.8920 | 0.6761 | 9.763161e-09 | 2189 |
| 0.8395 | 0.6847 | 0.8920 | 0.6761 | 9.762947e-09 | 2190 |
| 0.8271 | 0.6800 | 0.8919 | 0.6761 | 9.762733e-09 | 2191 |
| 0.8297 | 0.6871 | 0.8919 | 0.6761 | 9.762519e-09 | 2192 |
| 0.8181 | 0.6871 | 0.8919 | 0.6761 | 9.762305e-09 | 2193 |
| 0.8259 | 0.6800 | 0.8918 | 0.6761 | 9.762091e-09 | 2194 |
| 0.8216 | 0.6965 | 0.8918 | 0.6761 | 9.761877e-09 | 2195 |
| 0.8185 | 0.6918 | 0.8917 | 0.6761 | 9.761663e-09 | 2196 |
| 0.8270 | 0.6847 | 0.8917 | 0.6761 | 9.7614485e-09 | 2197 |
| 0.8254 | 0.6871 | 0.8916 | 0.6761 | 9.761234e-09 | 2198 |
| 0.8216 | 0.6824 | 0.8916 | 0.6761 | 9.761019e-09 | 2199 |
| 0.8277 | 0.6894 | 0.8916 | 0.6761 | 9.760804e-09 | 2200 |
| 0.8283 | 0.6871 | 0.8915 | 0.6761 | 9.760589e-09 | 2201 |
| 0.8257 | 0.6847 | 0.8915 | 0.6761 | 9.760374e-09 | 2202 |
| 0.8299 | 0.6824 | 0.8915 | 0.6761 | 9.760159e-09 | 2203 |
| 0.8234 | 0.6871 | 0.8915 | 0.6761 | 9.759944e-09 | 2204 |
| 0.8205 | 0.6894 | 0.8915 | 0.6761 | 9.759729e-09 | 2205 |
| 0.8398 | 0.6800 | 0.8915 | 0.6761 | 9.759514e-09 | 2206 |
| 0.8202 | 0.6824 | 0.8915 | 0.6761 | 9.759298e-09 | 2207 |
| 0.8149 | 0.6894 | 0.8915 | 0.6761 | 9.759082e-09 | 2208 |
| 0.8178 | 0.6871 | 0.8914 | 0.6761 | 9.758867e-09 | 2209 |
| 0.8268 | 0.6800 | 0.8914 | 0.6761 | 9.758651e-09 | 2210 |
| 0.8211 | 0.6824 | 0.8914 | 0.6761 | 9.758435e-09 | 2211 |
| 0.8147 | 0.6871 | 0.8914 | 0.6761 | 9.758219e-09 | 2212 |
| 0.8245 | 0.6847 | 0.8913 | 0.6761 | 9.758003e-09 | 2213 |
| 0.8205 | 0.6918 | 0.8913 | 0.6761 | 9.7577875e-09 | 2214 |
| 0.8273 | 0.6871 | 0.8913 | 0.6761 | 9.757572e-09 | 2215 |
| 0.8228 | 0.6847 | 0.8912 | 0.6761 | 9.757356e-09 | 2216 |
| 0.8229 | 0.6847 | 0.8912 | 0.6761 | 9.757139e-09 | 2217 |
| 0.8176 | 0.6871 | 0.8911 | 0.6761 | 9.756922e-09 | 2218 |
| 0.8236 | 0.6894 | 0.8911 | 0.6761 | 9.756706e-09 | 2219 |
| 0.8214 | 0.6894 | 0.8910 | 0.6761 | 9.756489e-09 | 2220 |
| 0.8262 | 0.6824 | 0.8910 | 0.6761 | 9.756272e-09 | 2221 |
| 0.8146 | 0.6847 | 0.8910 | 0.6761 | 9.7560555e-09 | 2222 |
| 0.8175 | 0.6847 | 0.8909 | 0.6761 | 9.755839e-09 | 2223 |
| 0.8193 | 0.6847 | 0.8909 | 0.6761 | 9.755622e-09 | 2224 |
| 0.8188 | 0.6894 | 0.8908 | 0.6761 | 9.755405e-09 | 2225 |
| 0.8237 | 0.6847 | 0.8908 | 0.6761 | 9.755189e-09 | 2226 |
| 0.8152 | 0.6824 | 0.8907 | 0.6761 | 9.754971e-09 | 2227 |
| 0.8263 | 0.6776 | 0.8907 | 0.6761 | 9.7547534e-09 | 2228 |
| 0.8205 | 0.6847 | 0.8907 | 0.6761 | 9.754536e-09 | 2229 |
| 0.8155 | 0.6918 | 0.8907 | 0.6761 | 9.754318e-09 | 2230 |
| 0.8150 | 0.6871 | 0.8906 | 0.6761 | 9.754101e-09 | 2231 |
| 0.8262 | 0.6824 | 0.8906 | 0.6761 | 9.753883e-09 | 2232 |
| 0.8085 | 0.6965 | 0.8905 | 0.6761 | 9.753665e-09 | 2233 |
| 0.8165 | 0.6824 | 0.8905 | 0.6761 | 9.753448e-09 | 2234 |
| 0.8291 | 0.6824 | 0.8904 | 0.6761 | 9.75323e-09 | 2235 |
| 0.8206 | 0.6918 | 0.8904 | 0.6761 | 9.753012e-09 | 2236 |
| 0.8209 | 0.6824 | 0.8904 | 0.6761 | 9.752793e-09 | 2237 |
| 0.8207 | 0.6965 | 0.8904 | 0.6761 | 9.752575e-09 | 2238 |
| 0.8200 | 0.6894 | 0.8904 | 0.6761 | 9.752356e-09 | 2239 |
| 0.8212 | 0.6894 | 0.8904 | 0.6761 | 9.752138e-09 | 2240 |
| 0.8228 | 0.6871 | 0.8903 | 0.6761 | 9.751919e-09 | 2241 |
| 0.8218 | 0.6776 | 0.8903 | 0.6761 | 9.751701e-09 | 2242 |
| 0.8228 | 0.6847 | 0.8902 | 0.6761 | 9.751482e-09 | 2243 |
| 0.8265 | 0.6894 | 0.8902 | 0.6761 | 9.751264e-09 | 2244 |
| 0.8143 | 0.6871 | 0.8901 | 0.6761 | 9.751045e-09 | 2245 |
| 0.8120 | 0.6824 | 0.8901 | 0.6761 | 9.750826e-09 | 2246 |
| 0.8224 | 0.6894 | 0.8901 | 0.6761 | 9.7506065e-09 | 2247 |
| 0.8117 | 0.6965 | 0.8900 | 0.6761 | 9.750387e-09 | 2248 |
| 0.8180 | 0.6871 | 0.8900 | 0.6761 | 9.750168e-09 | 2249 |
| 0.8058 | 0.6871 | 0.8899 | 0.6761 | 9.749948e-09 | 2250 |
| 0.8076 | 0.6918 | 0.8899 | 0.6761 | 9.749729e-09 | 2251 |
| 0.8255 | 0.6847 | 0.8899 | 0.6761 | 9.74951e-09 | 2252 |
| 0.8159 | 0.6847 | 0.8899 | 0.6761 | 9.74929e-09 | 2253 |
| 0.8221 | 0.6871 | 0.8898 | 0.6761 | 9.749071e-09 | 2254 |
| 0.8197 | 0.6941 | 0.8898 | 0.6761 | 9.748851e-09 | 2255 |
| 0.8213 | 0.6847 | 0.8897 | 0.6761 | 9.74863e-09 | 2256 |
| 0.8208 | 0.6824 | 0.8897 | 0.6761 | 9.74841e-09 | 2257 |
| 0.8273 | 0.6871 | 0.8896 | 0.6761 | 9.74819e-09 | 2258 |
| 0.8211 | 0.6918 | 0.8896 | 0.6761 | 9.7479695e-09 | 2259 |
| 0.8263 | 0.6894 | 0.8896 | 0.6761 | 9.747749e-09 | 2260 |
| 0.8178 | 0.6800 | 0.8895 | 0.6761 | 9.747529e-09 | 2261 |
| 0.8240 | 0.6847 | 0.8895 | 0.6761 | 9.747309e-09 | 2262 |
| 0.8195 | 0.6894 | 0.8894 | 0.6761 | 9.7470885e-09 | 2263 |
| 0.8220 | 0.6965 | 0.8894 | 0.6761 | 9.746868e-09 | 2264 |
| 0.8157 | 0.6847 | 0.8894 | 0.6761 | 9.746647e-09 | 2265 |
| 0.8101 | 0.6871 | 0.8893 | 0.6761 | 9.746426e-09 | 2266 |
| 0.8186 | 0.6988 | 0.8893 | 0.6761 | 9.746205e-09 | 2267 |
| 0.8342 | 0.6894 | 0.8893 | 0.6761 | 9.745984e-09 | 2268 |
| 0.8207 | 0.6894 | 0.8893 | 0.6761 | 9.745762e-09 | 2269 |
| 0.8267 | 0.6847 | 0.8892 | 0.6761 | 9.745541e-09 | 2270 |
| 0.8201 | 0.6918 | 0.8891 | 0.6761 | 9.74532e-09 | 2271 |
| 0.8077 | 0.6918 | 0.8891 | 0.6761 | 9.745099e-09 | 2272 |
| 0.8081 | 0.6988 | 0.8890 | 0.6761 | 9.744878e-09 | 2273 |
| 0.8115 | 0.6918 | 0.8890 | 0.6761 | 9.744657e-09 | 2274 |
| 0.8050 | 0.6941 | 0.8889 | 0.6761 | 9.744435e-09 | 2275 |
| 0.8173 | 0.6894 | 0.8889 | 0.6761 | 9.7442125e-09 | 2276 |
| 0.8287 | 0.6847 | 0.8889 | 0.6761 | 9.7439905e-09 | 2277 |
| 0.8180 | 0.6894 | 0.8888 | 0.6761 | 9.7437685e-09 | 2278 |
| 0.8125 | 0.6894 | 0.8888 | 0.6761 | 9.743546e-09 | 2279 |
| 0.8118 | 0.6847 | 0.8888 | 0.6761 | 9.743324e-09 | 2280 |
| 0.8130 | 0.6894 | 0.8888 | 0.6761 | 9.743102e-09 | 2281 |
| 0.8159 | 0.6894 | 0.8887 | 0.6761 | 9.74288e-09 | 2282 |
| 0.8140 | 0.6918 | 0.8886 | 0.6761 | 9.742658e-09 | 2283 |
| 0.8121 | 0.6824 | 0.8886 | 0.6761 | 9.742435e-09 | 2284 |
| 0.8133 | 0.6918 | 0.8886 | 0.6761 | 9.742212e-09 | 2285 |
| 0.8194 | 0.6965 | 0.8885 | 0.6761 | 9.741989e-09 | 2286 |
| 0.8143 | 0.6941 | 0.8885 | 0.6761 | 9.7417665e-09 | 2287 |
| 0.8187 | 0.6941 | 0.8885 | 0.6761 | 9.741544e-09 | 2288 |
| 0.8153 | 0.6918 | 0.8885 | 0.6761 | 9.741321e-09 | 2289 |
| 0.8151 | 0.6894 | 0.8884 | 0.6761 | 9.741098e-09 | 2290 |
| 0.8223 | 0.6824 | 0.8884 | 0.6761 | 9.740875e-09 | 2291 |
| 0.8148 | 0.6894 | 0.8884 | 0.6761 | 9.740652e-09 | 2292 |
| 0.8244 | 0.6894 | 0.8883 | 0.6761 | 9.740429e-09 | 2293 |
| 0.8137 | 0.6918 | 0.8883 | 0.6761 | 9.740205e-09 | 2294 |
| 0.8108 | 0.6847 | 0.8882 | 0.6761 | 9.739981e-09 | 2295 |
| 0.8205 | 0.6800 | 0.8882 | 0.6761 | 9.7397574e-09 | 2296 |
| 0.8172 | 0.6918 | 0.8881 | 0.6761 | 9.739534e-09 | 2297 |
| 0.7971 | 0.6988 | 0.8881 | 0.6761 | 9.73931e-09 | 2298 |
| 0.8201 | 0.6894 | 0.8881 | 0.6761 | 9.739086e-09 | 2299 |
| 0.8021 | 0.6894 | 0.8880 | 0.6761 | 9.738862e-09 | 2300 |
| 0.8095 | 0.6871 | 0.8880 | 0.6761 | 9.738638e-09 | 2301 |
| 0.8033 | 0.6894 | 0.8880 | 0.6761 | 9.7384145e-09 | 2302 |
| 0.8199 | 0.6941 | 0.8879 | 0.6761 | 9.73819e-09 | 2303 |
| 0.8202 | 0.6941 | 0.8879 | 0.6761 | 9.737965e-09 | 2304 |
| 0.8261 | 0.6847 | 0.8879 | 0.6761 | 9.73774e-09 | 2305 |
| 0.8120 | 0.6847 | 0.8878 | 0.6761 | 9.737516e-09 | 2306 |
| 0.8091 | 0.6894 | 0.8877 | 0.6761 | 9.737291e-09 | 2307 |
| 0.8190 | 0.6871 | 0.8877 | 0.6761 | 9.737066e-09 | 2308 |
| 0.8178 | 0.6871 | 0.8876 | 0.6761 | 9.7368416e-09 | 2309 |
| 0.8267 | 0.6894 | 0.8876 | 0.6761 | 9.736617e-09 | 2310 |
| 0.8241 | 0.6894 | 0.8876 | 0.6761 | 9.736392e-09 | 2311 |
| 0.8209 | 0.6894 | 0.8875 | 0.6761 | 9.736167e-09 | 2312 |
| 0.8194 | 0.6918 | 0.8875 | 0.6761 | 9.735942e-09 | 2313 |
| 0.8121 | 0.6941 | 0.8875 | 0.6761 | 9.735716e-09 | 2314 |
| 0.8154 | 0.6871 | 0.8875 | 0.6761 | 9.735491e-09 | 2315 |
| 0.8072 | 0.6847 | 0.8874 | 0.6761 | 9.735265e-09 | 2316 |
| 0.8181 | 0.6871 | 0.8875 | 0.6761 | 9.735039e-09 | 2317 |
| 0.8205 | 0.6918 | 0.8874 | 0.6761 | 9.734814e-09 | 2318 |
| 0.8140 | 0.6918 | 0.8873 | 0.6761 | 9.734588e-09 | 2319 |
| 0.8186 | 0.6941 | 0.8872 | 0.6761 | 9.734363e-09 | 2320 |
| 0.8158 | 0.6894 | 0.8872 | 0.6761 | 9.734137e-09 | 2321 |
| 0.8116 | 0.6894 | 0.8872 | 0.6761 | 9.7339115e-09 | 2322 |
| 0.8110 | 0.6918 | 0.8872 | 0.6761 | 9.733685e-09 | 2323 |
| 0.8046 | 0.6941 | 0.8872 | 0.6761 | 9.7334585e-09 | 2324 |
| 0.8096 | 0.6965 | 0.8871 | 0.6761 | 9.733232e-09 | 2325 |
| 0.8095 | 0.6918 | 0.8871 | 0.6761 | 9.7330055e-09 | 2326 |
| 0.8120 | 0.6918 | 0.8870 | 0.6761 | 9.732779e-09 | 2327 |
| 0.8148 | 0.6965 | 0.8870 | 0.6761 | 9.7325525e-09 | 2328 |
| 0.8182 | 0.6847 | 0.8870 | 0.6761 | 9.732326e-09 | 2329 |
| 0.8144 | 0.6894 | 0.8869 | 0.6690 | 9.7321e-09 | 2330 |
| 0.8080 | 0.6871 | 0.8869 | 0.6690 | 9.731873e-09 | 2331 |
| 0.8095 | 0.6918 | 0.8868 | 0.6690 | 9.731646e-09 | 2332 |
| 0.8191 | 0.6894 | 0.8867 | 0.6690 | 9.731418e-09 | 2333 |
| 0.8189 | 0.6871 | 0.8867 | 0.6690 | 9.731191e-09 | 2334 |
| 0.8060 | 0.6894 | 0.8866 | 0.6690 | 9.730964e-09 | 2335 |
| 0.8167 | 0.6918 | 0.8866 | 0.6690 | 9.730736e-09 | 2336 |
| 0.8107 | 0.6918 | 0.8866 | 0.6690 | 9.730509e-09 | 2337 |
| 0.8162 | 0.6871 | 0.8866 | 0.6690 | 9.7302815e-09 | 2338 |
| 0.8077 | 0.6894 | 0.8865 | 0.6690 | 9.730054e-09 | 2339 |
| 0.8244 | 0.6824 | 0.8865 | 0.6690 | 9.729827e-09 | 2340 |
| 0.8157 | 0.6941 | 0.8864 | 0.6690 | 9.729599e-09 | 2341 |
| 0.8205 | 0.6918 | 0.8864 | 0.6690 | 9.729371e-09 | 2342 |
| 0.8133 | 0.6918 | 0.8864 | 0.6690 | 9.729143e-09 | 2343 |
| 0.8082 | 0.6918 | 0.8864 | 0.6690 | 9.728915e-09 | 2344 |
| 0.8137 | 0.6918 | 0.8863 | 0.6690 | 9.728686e-09 | 2345 |
| 0.8173 | 0.6918 | 0.8863 | 0.6690 | 9.728458e-09 | 2346 |
| 0.8143 | 0.6941 | 0.8863 | 0.6690 | 9.72823e-09 | 2347 |
| 0.8082 | 0.6965 | 0.8863 | 0.6690 | 9.7280015e-09 | 2348 |
| 0.8086 | 0.6918 | 0.8862 | 0.6690 | 9.727773e-09 | 2349 |
| 0.8151 | 0.6965 | 0.8862 | 0.6690 | 9.727545e-09 | 2350 |
| 0.8040 | 0.6894 | 0.8862 | 0.6690 | 9.727317e-09 | 2351 |
| 0.8100 | 0.6824 | 0.8862 | 0.6690 | 9.727088e-09 | 2352 |
| 0.8126 | 0.6941 | 0.8861 | 0.6690 | 9.726858e-09 | 2353 |
| 0.8087 | 0.6871 | 0.8861 | 0.6690 | 9.726629e-09 | 2354 |
| 0.8151 | 0.6894 | 0.8861 | 0.6690 | 9.7264e-09 | 2355 |
| 0.8177 | 0.6871 | 0.8860 | 0.6690 | 9.726171e-09 | 2356 |
| 0.8105 | 0.6871 | 0.8859 | 0.6690 | 9.725942e-09 | 2357 |
| 0.8157 | 0.6894 | 0.8859 | 0.6690 | 9.725713e-09 | 2358 |
| 0.8096 | 0.6918 | 0.8859 | 0.6690 | 9.7254835e-09 | 2359 |
| 0.8128 | 0.6847 | 0.8858 | 0.6690 | 9.725254e-09 | 2360 |
| 0.8193 | 0.6918 | 0.8858 | 0.6690 | 9.725024e-09 | 2361 |
| 0.8174 | 0.6871 | 0.8857 | 0.6690 | 9.724794e-09 | 2362 |
| 0.8157 | 0.6918 | 0.8858 | 0.6690 | 9.724564e-09 | 2363 |
| 0.8045 | 0.6918 | 0.8857 | 0.6690 | 9.724334e-09 | 2364 |
| 0.8102 | 0.6988 | 0.8857 | 0.6690 | 9.724104e-09 | 2365 |
| 0.8170 | 0.6918 | 0.8856 | 0.6690 | 9.723874e-09 | 2366 |
| 0.8138 | 0.6894 | 0.8856 | 0.6690 | 9.723644e-09 | 2367 |
| 0.8123 | 0.6918 | 0.8855 | 0.6690 | 9.723414e-09 | 2368 |
| 0.8207 | 0.6941 | 0.8855 | 0.6690 | 9.723184e-09 | 2369 |
| 0.8095 | 0.6847 | 0.8855 | 0.6690 | 9.722954e-09 | 2370 |
| 0.8153 | 0.6871 | 0.8855 | 0.6690 | 9.722723e-09 | 2371 |
| 0.8021 | 0.6894 | 0.8855 | 0.6690 | 9.722492e-09 | 2372 |
| 0.8096 | 0.6965 | 0.8854 | 0.6690 | 9.722261e-09 | 2373 |
| 0.8218 | 0.6894 | 0.8854 | 0.6690 | 9.72203e-09 | 2374 |
| 0.8096 | 0.6894 | 0.8853 | 0.6690 | 9.721799e-09 | 2375 |
| 0.8125 | 0.6965 | 0.8853 | 0.6690 | 9.721568e-09 | 2376 |
| 0.8122 | 0.6965 | 0.8853 | 0.6690 | 9.7213375e-09 | 2377 |
| 0.8081 | 0.7012 | 0.8852 | 0.6690 | 9.721107e-09 | 2378 |
| 0.8077 | 0.6965 | 0.8852 | 0.6690 | 9.720876e-09 | 2379 |
| 0.8079 | 0.6918 | 0.8852 | 0.6690 | 9.720645e-09 | 2380 |
| 0.8151 | 0.6871 | 0.8852 | 0.6690 | 9.720413e-09 | 2381 |
| 0.8123 | 0.6965 | 0.8852 | 0.6690 | 9.720181e-09 | 2382 |
| 0.8053 | 0.6965 | 0.8851 | 0.6690 | 9.719949e-09 | 2383 |
| 0.8161 | 0.6894 | 0.8851 | 0.6690 | 9.7197175e-09 | 2384 |
| 0.8059 | 0.6871 | 0.8850 | 0.6690 | 9.719486e-09 | 2385 |
| 0.8109 | 0.6871 | 0.8849 | 0.6690 | 9.719254e-09 | 2386 |
| 0.8054 | 0.6894 | 0.8849 | 0.6690 | 9.719022e-09 | 2387 |
| 0.8115 | 0.6847 | 0.8848 | 0.6690 | 9.71879e-09 | 2388 |
| 0.8145 | 0.6941 | 0.8848 | 0.6690 | 9.718558e-09 | 2389 |
| 0.8058 | 0.6965 | 0.8848 | 0.6690 | 9.718326e-09 | 2390 |
| 0.8177 | 0.6871 | 0.8848 | 0.6690 | 9.718093e-09 | 2391 |
| 0.8169 | 0.6918 | 0.8847 | 0.6690 | 9.71786e-09 | 2392 |
| 0.8029 | 0.6918 | 0.8847 | 0.6690 | 9.717628e-09 | 2393 |
| 0.8164 | 0.6918 | 0.8847 | 0.6690 | 9.717395e-09 | 2394 |
| 0.8103 | 0.6894 | 0.8846 | 0.6690 | 9.717162e-09 | 2395 |
| 0.8115 | 0.6871 | 0.8845 | 0.6690 | 9.7169295e-09 | 2396 |
| 0.8056 | 0.6941 | 0.8845 | 0.6690 | 9.716697e-09 | 2397 |
| 0.8085 | 0.6871 | 0.8845 | 0.6690 | 9.716464e-09 | 2398 |
| 0.8114 | 0.6988 | 0.8845 | 0.6690 | 9.716231e-09 | 2399 |
| 0.8058 | 0.6941 | 0.8844 | 0.6690 | 9.715998e-09 | 2400 |
| 0.8114 | 0.6894 | 0.8844 | 0.6690 | 9.715764e-09 | 2401 |
| 0.8110 | 0.6988 | 0.8844 | 0.6690 | 9.715531e-09 | 2402 |
| 0.8040 | 0.6988 | 0.8844 | 0.6690 | 9.715297e-09 | 2403 |
| 0.7972 | 0.6918 | 0.8843 | 0.6690 | 9.715063e-09 | 2404 |
| 0.8081 | 0.6894 | 0.8842 | 0.6690 | 9.71483e-09 | 2405 |
| 0.8078 | 0.7012 | 0.8842 | 0.6690 | 9.714596e-09 | 2406 |
| 0.8149 | 0.6965 | 0.8842 | 0.6690 | 9.714363e-09 | 2407 |
| 0.8022 | 0.6965 | 0.8841 | 0.6690 | 9.714129e-09 | 2408 |
| 0.8048 | 0.6918 | 0.8841 | 0.6690 | 9.7138955e-09 | 2409 |
| 0.8141 | 0.6894 | 0.8841 | 0.6690 | 9.713661e-09 | 2410 |
| 0.8089 | 0.6965 | 0.8841 | 0.6690 | 9.7134265e-09 | 2411 |
| 0.8153 | 0.6918 | 0.8841 | 0.6690 | 9.713192e-09 | 2412 |
| 0.8075 | 0.6941 | 0.8841 | 0.6690 | 9.7129575e-09 | 2413 |
| 0.8045 | 0.6894 | 0.8840 | 0.6690 | 9.712723e-09 | 2414 |
| 0.8021 | 0.6894 | 0.8840 | 0.6690 | 9.712489e-09 | 2415 |
| 0.8123 | 0.7059 | 0.8839 | 0.6690 | 9.712254e-09 | 2416 |
| 0.8056 | 0.6918 | 0.8839 | 0.6690 | 9.71202e-09 | 2417 |
| 0.8069 | 0.6941 | 0.8839 | 0.6690 | 9.711785e-09 | 2418 |
| 0.8049 | 0.6965 | 0.8838 | 0.6690 | 9.711551e-09 | 2419 |
| 0.7983 | 0.6918 | 0.8837 | 0.6690 | 9.711315e-09 | 2420 |
| 0.8105 | 0.6988 | 0.8837 | 0.6690 | 9.71108e-09 | 2421 |
| 0.8068 | 0.6894 | 0.8837 | 0.6690 | 9.710845e-09 | 2422 |
| 0.8075 | 0.6918 | 0.8837 | 0.6690 | 9.710609e-09 | 2423 |
| 0.7976 | 0.6941 | 0.8837 | 0.6690 | 9.710374e-09 | 2424 |
| 0.8058 | 0.6894 | 0.8837 | 0.6690 | 9.7101385e-09 | 2425 |
| 0.8075 | 0.6941 | 0.8837 | 0.6690 | 9.709903e-09 | 2426 |
| 0.8046 | 0.6871 | 0.8836 | 0.6690 | 9.709668e-09 | 2427 |
| 0.8048 | 0.6894 | 0.8835 | 0.6690 | 9.709432e-09 | 2428 |
| 0.8072 | 0.6941 | 0.8835 | 0.6690 | 9.709196e-09 | 2429 |
| 0.7984 | 0.7035 | 0.8835 | 0.6690 | 9.70896e-09 | 2430 |
| 0.8201 | 0.6871 | 0.8835 | 0.6690 | 9.708724e-09 | 2431 |
| 0.8070 | 0.6988 | 0.8834 | 0.6690 | 9.708487e-09 | 2432 |
| 0.8015 | 0.6918 | 0.8834 | 0.6690 | 9.708251e-09 | 2433 |
| 0.8021 | 0.6894 | 0.8833 | 0.6690 | 9.708015e-09 | 2434 |
| 0.8007 | 0.6894 | 0.8833 | 0.6690 | 9.707779e-09 | 2435 |
| 0.8031 | 0.6871 | 0.8833 | 0.6690 | 9.707542e-09 | 2436 |
| 0.8112 | 0.6941 | 0.8832 | 0.6690 | 9.707306e-09 | 2437 |
| 0.8047 | 0.6918 | 0.8832 | 0.6690 | 9.70707e-09 | 2438 |
| 0.8088 | 0.6988 | 0.8831 | 0.6690 | 9.706833e-09 | 2439 |
| 0.8145 | 0.6941 | 0.8831 | 0.6690 | 9.7065955e-09 | 2440 |
| 0.8054 | 0.6847 | 0.8830 | 0.6690 | 9.706358e-09 | 2441 |
| 0.8100 | 0.6894 | 0.8830 | 0.6690 | 9.706121e-09 | 2442 |
| 0.8062 | 0.6941 | 0.8830 | 0.6690 | 9.705884e-09 | 2443 |
| 0.7980 | 0.6965 | 0.8831 | 0.6690 | 9.705647e-09 | 2444 |
| 0.8017 | 0.6918 | 0.8830 | 0.6690 | 9.70541e-09 | 2445 |
| 0.8161 | 0.6941 | 0.8829 | 0.6690 | 9.705173e-09 | 2446 |
| 0.8154 | 0.6894 | 0.8829 | 0.6690 | 9.7049355e-09 | 2447 |
| 0.8072 | 0.6965 | 0.8829 | 0.6690 | 9.704698e-09 | 2448 |
| 0.8112 | 0.6871 | 0.8828 | 0.6690 | 9.70446e-09 | 2449 |
| 0.8041 | 0.6941 | 0.8828 | 0.6690 | 9.704222e-09 | 2450 |
| 0.8145 | 0.6965 | 0.8827 | 0.6690 | 9.703984e-09 | 2451 |
| 0.8061 | 0.6918 | 0.8827 | 0.6690 | 9.703746e-09 | 2452 |
| 0.7980 | 0.6988 | 0.8827 | 0.6690 | 9.703508e-09 | 2453 |
| 0.8023 | 0.6941 | 0.8827 | 0.6690 | 9.70327e-09 | 2454 |
| 0.8055 | 0.6894 | 0.8827 | 0.6690 | 9.703032e-09 | 2455 |
| 0.8070 | 0.6918 | 0.8826 | 0.6690 | 9.702794e-09 | 2456 |
| 0.8080 | 0.6965 | 0.8826 | 0.6690 | 9.702556e-09 | 2457 |
| 0.7929 | 0.6871 | 0.8825 | 0.6690 | 9.702317e-09 | 2458 |
| 0.8127 | 0.6847 | 0.8825 | 0.6690 | 9.702078e-09 | 2459 |
| 0.8141 | 0.6894 | 0.8824 | 0.6690 | 9.701839e-09 | 2460 |
| 0.8078 | 0.6918 | 0.8824 | 0.6690 | 9.7016e-09 | 2461 |
| 0.8094 | 0.6941 | 0.8823 | 0.6690 | 9.7013615e-09 | 2462 |
| 0.8009 | 0.6988 | 0.8823 | 0.6690 | 9.701123e-09 | 2463 |
| 0.7958 | 0.7012 | 0.8823 | 0.6690 | 9.700884e-09 | 2464 |
| 0.8050 | 0.6941 | 0.8823 | 0.6690 | 9.700645e-09 | 2465 |
| 0.8146 | 0.6824 | 0.8822 | 0.6690 | 9.700406e-09 | 2466 |
| 0.7989 | 0.6918 | 0.8822 | 0.6690 | 9.700167e-09 | 2467 |
| 0.8017 | 0.6894 | 0.8822 | 0.6690 | 9.699927e-09 | 2468 |
| 0.8002 | 0.6847 | 0.8822 | 0.6690 | 9.699687e-09 | 2469 |
| 0.8067 | 0.6965 | 0.8821 | 0.6690 | 9.6994475e-09 | 2470 |
| 0.8065 | 0.6918 | 0.8821 | 0.6690 | 9.699208e-09 | 2471 |
| 0.8039 | 0.6918 | 0.8821 | 0.6690 | 9.698968e-09 | 2472 |
| 0.8011 | 0.6988 | 0.8821 | 0.6690 | 9.698728e-09 | 2473 |
| 0.8064 | 0.7012 | 0.8820 | 0.6690 | 9.698488e-09 | 2474 |
| 0.8073 | 0.6965 | 0.8821 | 0.6690 | 9.698248e-09 | 2475 |
| 0.8034 | 0.6941 | 0.8820 | 0.6690 | 9.698009e-09 | 2476 |
| 0.8003 | 0.6965 | 0.8820 | 0.6690 | 9.697769e-09 | 2477 |
| 0.7995 | 0.7035 | 0.8819 | 0.6690 | 9.697528e-09 | 2478 |
| 0.8024 | 0.6965 | 0.8819 | 0.6690 | 9.697287e-09 | 2479 |
| 0.8027 | 0.6918 | 0.8819 | 0.6690 | 9.697047e-09 | 2480 |
| 0.8031 | 0.6965 | 0.8818 | 0.6690 | 9.696806e-09 | 2481 |
| 0.7975 | 0.6871 | 0.8818 | 0.6690 | 9.696565e-09 | 2482 |
| 0.8000 | 0.6894 | 0.8818 | 0.6690 | 9.696325e-09 | 2483 |
| 0.8089 | 0.6894 | 0.8818 | 0.6690 | 9.696084e-09 | 2484 |
| 0.7936 | 0.6941 | 0.8817 | 0.6690 | 9.695843e-09 | 2485 |
| 0.8040 | 0.6918 | 0.8817 | 0.6690 | 9.6956025e-09 | 2486 |
| 0.7984 | 0.6988 | 0.8816 | 0.6690 | 9.695362e-09 | 2487 |
| 0.8097 | 0.6894 | 0.8816 | 0.6690 | 9.69512e-09 | 2488 |
| 0.8010 | 0.6965 | 0.8816 | 0.6690 | 9.694879e-09 | 2489 |
| 0.8058 | 0.6871 | 0.8816 | 0.6690 | 9.694637e-09 | 2490 |
| 0.8128 | 0.6847 | 0.8815 | 0.6690 | 9.6943955e-09 | 2491 |
| 0.8031 | 0.6941 | 0.8815 | 0.6690 | 9.694154e-09 | 2492 |
| 0.8033 | 0.7059 | 0.8814 | 0.6690 | 9.693912e-09 | 2493 |
| 0.7974 | 0.6871 | 0.8814 | 0.6690 | 9.693671e-09 | 2494 |
| 0.7996 | 0.6871 | 0.8813 | 0.6690 | 9.693429e-09 | 2495 |
| 0.7926 | 0.6941 | 0.8813 | 0.6690 | 9.693188e-09 | 2496 |
| 0.7988 | 0.6965 | 0.8813 | 0.6690 | 9.692945e-09 | 2497 |
| 0.8048 | 0.6988 | 0.8813 | 0.6690 | 9.692703e-09 | 2498 |
| 0.7994 | 0.7012 | 0.8812 | 0.6690 | 9.69246e-09 | 2499 |
| 0.7911 | 0.7035 | 0.8811 | 0.6690 | 9.692218e-09 | 2500 |
| 0.7900 | 0.6871 | 0.8811 | 0.6761 | 9.691975e-09 | 2501 |
| 0.8032 | 0.6965 | 0.8810 | 0.6761 | 9.691733e-09 | 2502 |
| 0.8019 | 0.6941 | 0.8810 | 0.6831 | 9.69149e-09 | 2503 |
| 0.7995 | 0.6988 | 0.8810 | 0.6831 | 9.691248e-09 | 2504 |
| 0.7959 | 0.6894 | 0.8809 | 0.6831 | 9.691005e-09 | 2505 |
| 0.8042 | 0.6918 | 0.8808 | 0.6831 | 9.690763e-09 | 2506 |
| 0.8090 | 0.6918 | 0.8808 | 0.6831 | 9.6905195e-09 | 2507 |
| 0.8008 | 0.6965 | 0.8808 | 0.6831 | 9.690276e-09 | 2508 |
| 0.8007 | 0.6871 | 0.8808 | 0.6831 | 9.690033e-09 | 2509 |
| 0.8058 | 0.7035 | 0.8807 | 0.6831 | 9.689789e-09 | 2510 |
| 0.8080 | 0.7012 | 0.8808 | 0.6831 | 9.689546e-09 | 2511 |
| 0.7956 | 0.6918 | 0.8808 | 0.6831 | 9.689303e-09 | 2512 |
| 0.8076 | 0.6965 | 0.8807 | 0.6831 | 9.689059e-09 | 2513 |
| 0.8033 | 0.6965 | 0.8807 | 0.6831 | 9.688816e-09 | 2514 |
| 0.8007 | 0.6965 | 0.8807 | 0.6831 | 9.688573e-09 | 2515 |
| 0.8047 | 0.6941 | 0.8807 | 0.6831 | 9.688329e-09 | 2516 |
| 0.8055 | 0.6847 | 0.8807 | 0.6831 | 9.688085e-09 | 2517 |
| 0.8063 | 0.6918 | 0.8807 | 0.6831 | 9.687841e-09 | 2518 |
| 0.7960 | 0.7012 | 0.8807 | 0.6831 | 9.6875965e-09 | 2519 |
| 0.8092 | 0.6941 | 0.8806 | 0.6831 | 9.687352e-09 | 2520 |
| 0.7939 | 0.7012 | 0.8806 | 0.6831 | 9.687108e-09 | 2521 |
| 0.8120 | 0.6871 | 0.8805 | 0.6831 | 9.686864e-09 | 2522 |
| 0.7963 | 0.6941 | 0.8805 | 0.6831 | 9.6866195e-09 | 2523 |
| 0.8076 | 0.6918 | 0.8805 | 0.6831 | 9.686375e-09 | 2524 |
| 0.8012 | 0.6894 | 0.8804 | 0.6831 | 9.686131e-09 | 2525 |
| 0.7962 | 0.6894 | 0.8804 | 0.6831 | 9.685887e-09 | 2526 |
| 0.7975 | 0.6965 | 0.8804 | 0.6831 | 9.685642e-09 | 2527 |
| 0.8012 | 0.6918 | 0.8803 | 0.6831 | 9.6853965e-09 | 2528 |
| 0.7933 | 0.7082 | 0.8802 | 0.6831 | 9.685151e-09 | 2529 |
| 0.8028 | 0.7012 | 0.8801 | 0.6831 | 9.684906e-09 | 2530 |
| 0.7932 | 0.7012 | 0.8801 | 0.6831 | 9.684661e-09 | 2531 |
| 0.8045 | 0.6918 | 0.8801 | 0.6831 | 9.684416e-09 | 2532 |
| 0.7918 | 0.6988 | 0.8801 | 0.6761 | 9.684171e-09 | 2533 |
| 0.7974 | 0.6918 | 0.8801 | 0.6761 | 9.683926e-09 | 2534 |
| 0.7918 | 0.7012 | 0.8800 | 0.6761 | 9.6836805e-09 | 2535 |
| 0.7961 | 0.6941 | 0.8800 | 0.6761 | 9.683435e-09 | 2536 |
| 0.8005 | 0.6871 | 0.8799 | 0.6761 | 9.683189e-09 | 2537 |
| 0.7958 | 0.6941 | 0.8799 | 0.6761 | 9.682943e-09 | 2538 |
| 0.7934 | 0.7035 | 0.8799 | 0.6761 | 9.682697e-09 | 2539 |
| 0.8002 | 0.7012 | 0.8798 | 0.6761 | 9.682451e-09 | 2540 |
| 0.8036 | 0.6988 | 0.8798 | 0.6761 | 9.682205e-09 | 2541 |
| 0.7972 | 0.7012 | 0.8798 | 0.6761 | 9.681959e-09 | 2542 |
| 0.7984 | 0.6871 | 0.8798 | 0.6761 | 9.681713e-09 | 2543 |
| 0.8067 | 0.7012 | 0.8797 | 0.6761 | 9.681467e-09 | 2544 |
| 0.7991 | 0.6941 | 0.8797 | 0.6761 | 9.681221e-09 | 2545 |
| 0.7994 | 0.6965 | 0.8797 | 0.6761 | 9.680974e-09 | 2546 |
| 0.7881 | 0.7035 | 0.8796 | 0.6761 | 9.680727e-09 | 2547 |
| 0.8028 | 0.6988 | 0.8796 | 0.6761 | 9.68048e-09 | 2548 |
| 0.7945 | 0.6941 | 0.8796 | 0.6761 | 9.6802335e-09 | 2549 |
| 0.8007 | 0.6918 | 0.8796 | 0.6761 | 9.679987e-09 | 2550 |
| 0.7925 | 0.6965 | 0.8795 | 0.6761 | 9.67974e-09 | 2551 |
| 0.8097 | 0.6894 | 0.8795 | 0.6761 | 9.679493e-09 | 2552 |
| 0.7995 | 0.7106 | 0.8794 | 0.6761 | 9.679246e-09 | 2553 |
| 0.7852 | 0.7012 | 0.8794 | 0.6761 | 9.678999e-09 | 2554 |
| 0.7993 | 0.7035 | 0.8794 | 0.6761 | 9.678752e-09 | 2555 |
| 0.7991 | 0.7035 | 0.8793 | 0.6761 | 9.678504e-09 | 2556 |
| 0.8073 | 0.6941 | 0.8793 | 0.6761 | 9.678256e-09 | 2557 |
| 0.7967 | 0.6988 | 0.8793 | 0.6761 | 9.678009e-09 | 2558 |
| 0.8005 | 0.6965 | 0.8793 | 0.6761 | 9.677761e-09 | 2559 |
| 0.7855 | 0.6965 | 0.8792 | 0.6761 | 9.677513e-09 | 2560 |
| 0.7903 | 0.6918 | 0.8792 | 0.6761 | 9.677265e-09 | 2561 |
| 0.7935 | 0.6894 | 0.8792 | 0.6761 | 9.677017e-09 | 2562 |
| 0.8018 | 0.6941 | 0.8792 | 0.6761 | 9.67677e-09 | 2563 |
| 0.7907 | 0.6918 | 0.8791 | 0.6761 | 9.676522e-09 | 2564 |
| 0.7914 | 0.7082 | 0.8791 | 0.6761 | 9.676274e-09 | 2565 |
| 0.7922 | 0.6965 | 0.8790 | 0.6761 | 9.676025e-09 | 2566 |
| 0.8182 | 0.6800 | 0.8790 | 0.6761 | 9.675777e-09 | 2567 |
| 0.8075 | 0.6894 | 0.8789 | 0.6761 | 9.675528e-09 | 2568 |
| 0.8062 | 0.6894 | 0.8789 | 0.6761 | 9.675279e-09 | 2569 |
| 0.8024 | 0.6988 | 0.8788 | 0.6761 | 9.6750306e-09 | 2570 |
| 0.7849 | 0.7059 | 0.8788 | 0.6761 | 9.674782e-09 | 2571 |
| 0.8011 | 0.7035 | 0.8787 | 0.6761 | 9.674533e-09 | 2572 |
| 0.8119 | 0.6918 | 0.8787 | 0.6761 | 9.6742845e-09 | 2573 |
| 0.7883 | 0.7035 | 0.8787 | 0.6761 | 9.674036e-09 | 2574 |
| 0.7990 | 0.6965 | 0.8787 | 0.6761 | 9.673787e-09 | 2575 |
| 0.7958 | 0.6894 | 0.8786 | 0.6831 | 9.6735375e-09 | 2576 |
| 0.7972 | 0.6988 | 0.8786 | 0.6831 | 9.673288e-09 | 2577 |
| 0.7971 | 0.6918 | 0.8785 | 0.6831 | 9.673038e-09 | 2578 |
| 0.7972 | 0.7059 | 0.8785 | 0.6831 | 9.672789e-09 | 2579 |
| 0.8059 | 0.6988 | 0.8784 | 0.6831 | 9.672539e-09 | 2580 |
| 0.7886 | 0.6965 | 0.8784 | 0.6831 | 9.67229e-09 | 2581 |
| 0.8115 | 0.7059 | 0.8784 | 0.6831 | 9.67204e-09 | 2582 |
| 0.7910 | 0.7082 | 0.8783 | 0.6831 | 9.6717905e-09 | 2583 |
| 0.7963 | 0.6988 | 0.8783 | 0.6831 | 9.671541e-09 | 2584 |
| 0.7938 | 0.6965 | 0.8783 | 0.6831 | 9.671291e-09 | 2585 |
| 0.7874 | 0.6941 | 0.8783 | 0.6831 | 9.671041e-09 | 2586 |
| 0.7963 | 0.6941 | 0.8782 | 0.6831 | 9.67079e-09 | 2587 |
| 0.7935 | 0.6894 | 0.8782 | 0.6831 | 9.67054e-09 | 2588 |
| 0.7950 | 0.6988 | 0.8781 | 0.6831 | 9.6702895e-09 | 2589 |
| 0.7949 | 0.6918 | 0.8781 | 0.6831 | 9.670039e-09 | 2590 |
| 0.7940 | 0.6988 | 0.8781 | 0.6831 | 9.6697885e-09 | 2591 |
| 0.7985 | 0.7012 | 0.8781 | 0.6831 | 9.669538e-09 | 2592 |
| 0.7912 | 0.6871 | 0.8780 | 0.6831 | 9.669288e-09 | 2593 |
| 0.7965 | 0.6965 | 0.8780 | 0.6831 | 9.669037e-09 | 2594 |
| 0.7923 | 0.7012 | 0.8780 | 0.6831 | 9.668787e-09 | 2595 |
| 0.7958 | 0.6941 | 0.8780 | 0.6831 | 9.668535e-09 | 2596 |
| 0.7976 | 0.7059 | 0.8780 | 0.6831 | 9.668284e-09 | 2597 |
| 0.7976 | 0.6894 | 0.8780 | 0.6831 | 9.668033e-09 | 2598 |
| 0.8046 | 0.6988 | 0.8779 | 0.6831 | 9.667781e-09 | 2599 |
| 0.7926 | 0.6965 | 0.8779 | 0.6831 | 9.66753e-09 | 2600 |
| 0.7908 | 0.7012 | 0.8779 | 0.6831 | 9.6672785e-09 | 2601 |
| 0.7974 | 0.6918 | 0.8779 | 0.6831 | 9.667027e-09 | 2602 |
| 0.7821 | 0.7059 | 0.8778 | 0.6831 | 9.666776e-09 | 2603 |
| 0.8016 | 0.7012 | 0.8778 | 0.6831 | 9.6665245e-09 | 2604 |
| 0.7912 | 0.6965 | 0.8777 | 0.6831 | 9.666272e-09 | 2605 |
| 0.7981 | 0.7059 | 0.8777 | 0.6831 | 9.66602e-09 | 2606 |
| 0.7958 | 0.6988 | 0.8776 | 0.6831 | 9.665768e-09 | 2607 |
| 0.7962 | 0.7059 | 0.8776 | 0.6831 | 9.6655155e-09 | 2608 |
| 0.7944 | 0.7035 | 0.8776 | 0.6831 | 9.665263e-09 | 2609 |
| 0.7934 | 0.6988 | 0.8775 | 0.6831 | 9.665011e-09 | 2610 |
| 0.7861 | 0.7012 | 0.8775 | 0.6831 | 9.664759e-09 | 2611 |
| 0.7956 | 0.6918 | 0.8774 | 0.6831 | 9.6645065e-09 | 2612 |
| 0.7899 | 0.7059 | 0.8774 | 0.6831 | 9.664254e-09 | 2613 |
| 0.7822 | 0.6988 | 0.8773 | 0.6831 | 9.664002e-09 | 2614 |
| 0.7953 | 0.6988 | 0.8773 | 0.6831 | 9.663749e-09 | 2615 |
| 0.7986 | 0.6918 | 0.8773 | 0.6831 | 9.663496e-09 | 2616 |
| 0.7944 | 0.7012 | 0.8772 | 0.6831 | 9.663243e-09 | 2617 |
| 0.7934 | 0.7082 | 0.8772 | 0.6831 | 9.6629895e-09 | 2618 |
| 0.7916 | 0.7082 | 0.8772 | 0.6831 | 9.662736e-09 | 2619 |
| 0.7973 | 0.6965 | 0.8772 | 0.6831 | 9.662483e-09 | 2620 |
| 0.8014 | 0.7035 | 0.8772 | 0.6831 | 9.66223e-09 | 2621 |
| 0.7979 | 0.6871 | 0.8772 | 0.6831 | 9.661977e-09 | 2622 |
| 0.7940 | 0.7035 | 0.8772 | 0.6831 | 9.661724e-09 | 2623 |
| 0.7825 | 0.6918 | 0.8771 | 0.6831 | 9.661471e-09 | 2624 |
| 0.8060 | 0.6965 | 0.8770 | 0.6831 | 9.661217e-09 | 2625 |
| 0.7915 | 0.7035 | 0.8770 | 0.6831 | 9.660963e-09 | 2626 |
| 0.7959 | 0.6965 | 0.8769 | 0.6831 | 9.660709e-09 | 2627 |
| 0.7850 | 0.7035 | 0.8769 | 0.6831 | 9.660455e-09 | 2628 |
| 0.7871 | 0.6988 | 0.8769 | 0.6831 | 9.660201e-09 | 2629 |
| 0.7857 | 0.7012 | 0.8768 | 0.6831 | 9.659947e-09 | 2630 |
| 0.7889 | 0.7035 | 0.8768 | 0.6831 | 9.659693e-09 | 2631 |
| 0.7909 | 0.7035 | 0.8768 | 0.6831 | 9.659439e-09 | 2632 |
| 0.7934 | 0.7035 | 0.8767 | 0.6831 | 9.659185e-09 | 2633 |
| 0.7999 | 0.6894 | 0.8768 | 0.6831 | 9.6589305e-09 | 2634 |
| 0.7855 | 0.6941 | 0.8768 | 0.6831 | 9.658676e-09 | 2635 |
| 0.8005 | 0.7059 | 0.8768 | 0.6831 | 9.658421e-09 | 2636 |
| 0.7835 | 0.7082 | 0.8767 | 0.6831 | 9.658166e-09 | 2637 |
| 0.7898 | 0.6965 | 0.8767 | 0.6831 | 9.657911e-09 | 2638 |
| 0.7905 | 0.6941 | 0.8767 | 0.6831 | 9.657656e-09 | 2639 |
| 0.7967 | 0.7106 | 0.8766 | 0.6831 | 9.657401e-09 | 2640 |
| 0.7823 | 0.6965 | 0.8766 | 0.6831 | 9.657146e-09 | 2641 |
| 0.7910 | 0.6941 | 0.8766 | 0.6831 | 9.656891e-09 | 2642 |
| 0.7883 | 0.6988 | 0.8766 | 0.6831 | 9.656636e-09 | 2643 |
| 0.7874 | 0.7012 | 0.8766 | 0.6831 | 9.6563815e-09 | 2644 |
| 0.8081 | 0.6800 | 0.8765 | 0.6831 | 9.656126e-09 | 2645 |
| 0.7957 | 0.7012 | 0.8765 | 0.6831 | 9.65587e-09 | 2646 |
| 0.7912 | 0.7082 | 0.8765 | 0.6831 | 9.655614e-09 | 2647 |
| 0.7949 | 0.6941 | 0.8764 | 0.6831 | 9.655358e-09 | 2648 |
| 0.7908 | 0.6894 | 0.8765 | 0.6831 | 9.6551025e-09 | 2649 |
| 0.7938 | 0.6941 | 0.8764 | 0.6831 | 9.654847e-09 | 2650 |
| 0.7906 | 0.6918 | 0.8764 | 0.6831 | 9.654591e-09 | 2651 |
| 0.7919 | 0.6988 | 0.8763 | 0.6831 | 9.654335e-09 | 2652 |
| 0.7857 | 0.6941 | 0.8763 | 0.6831 | 9.654079e-09 | 2653 |
| 0.7917 | 0.7035 | 0.8762 | 0.6831 | 9.6538235e-09 | 2654 |
| 0.7967 | 0.6965 | 0.8762 | 0.6831 | 9.653567e-09 | 2655 |
| 0.7837 | 0.7082 | 0.8762 | 0.6831 | 9.65331e-09 | 2656 |
| 0.7819 | 0.7082 | 0.8762 | 0.6831 | 9.6530535e-09 | 2657 |
| 0.7916 | 0.6988 | 0.8761 | 0.6831 | 9.652797e-09 | 2658 |
| 0.7884 | 0.7012 | 0.8761 | 0.6831 | 9.65254e-09 | 2659 |
| 0.7964 | 0.6941 | 0.8761 | 0.6831 | 9.652283e-09 | 2660 |
| 0.7932 | 0.6941 | 0.8760 | 0.6831 | 9.652027e-09 | 2661 |
| 0.7743 | 0.7176 | 0.8760 | 0.6831 | 9.65177e-09 | 2662 |
| 0.7887 | 0.7035 | 0.8760 | 0.6831 | 9.651513e-09 | 2663 |
| 0.7932 | 0.6988 | 0.8759 | 0.6831 | 9.651257e-09 | 2664 |
| 0.7999 | 0.6988 | 0.8759 | 0.6831 | 9.650999e-09 | 2665 |
| 0.7864 | 0.6894 | 0.8759 | 0.6831 | 9.6507415e-09 | 2666 |
| 0.7954 | 0.6988 | 0.8759 | 0.6831 | 9.650484e-09 | 2667 |
| 0.7830 | 0.7035 | 0.8758 | 0.6831 | 9.650226e-09 | 2668 |
| 0.7939 | 0.7059 | 0.8759 | 0.6831 | 9.649969e-09 | 2669 |
| 0.7887 | 0.7012 | 0.8758 | 0.6831 | 9.649711e-09 | 2670 |
| 0.7902 | 0.7012 | 0.8759 | 0.6831 | 9.649454e-09 | 2671 |
| 0.7915 | 0.7035 | 0.8759 | 0.6831 | 9.649196e-09 | 2672 |
| 0.7901 | 0.6965 | 0.8758 | 0.6831 | 9.6489385e-09 | 2673 |
| 0.7864 | 0.6965 | 0.8758 | 0.6831 | 9.648681e-09 | 2674 |
| 0.7900 | 0.7012 | 0.8757 | 0.6831 | 9.6484225e-09 | 2675 |
| 0.7956 | 0.7059 | 0.8757 | 0.6831 | 9.648164e-09 | 2676 |
| 0.7935 | 0.7082 | 0.8757 | 0.6831 | 9.647906e-09 | 2677 |
| 0.7830 | 0.7059 | 0.8757 | 0.6831 | 9.647647e-09 | 2678 |
| 0.7837 | 0.6941 | 0.8757 | 0.6831 | 9.647389e-09 | 2679 |
| 0.7903 | 0.7059 | 0.8757 | 0.6831 | 9.64713e-09 | 2680 |
| 0.7860 | 0.7106 | 0.8757 | 0.6831 | 9.646872e-09 | 2681 |
| 0.7780 | 0.7035 | 0.8756 | 0.6831 | 9.646613e-09 | 2682 |
| 0.7876 | 0.6941 | 0.8756 | 0.6831 | 9.646355e-09 | 2683 |
| 0.7884 | 0.6988 | 0.8755 | 0.6831 | 9.646096e-09 | 2684 |
| 0.7832 | 0.7035 | 0.8754 | 0.6831 | 9.645837e-09 | 2685 |
| 0.7863 | 0.7059 | 0.8754 | 0.6831 | 9.645578e-09 | 2686 |
| 0.7831 | 0.7059 | 0.8754 | 0.6831 | 9.645318e-09 | 2687 |
| 0.8011 | 0.6941 | 0.8753 | 0.6831 | 9.645059e-09 | 2688 |
| 0.7741 | 0.7012 | 0.8753 | 0.6831 | 9.6448e-09 | 2689 |
| 0.7911 | 0.6894 | 0.8753 | 0.6831 | 9.64454e-09 | 2690 |
| 0.7795 | 0.6965 | 0.8753 | 0.6831 | 9.644281e-09 | 2691 |
| 0.7766 | 0.7082 | 0.8753 | 0.6831 | 9.644022e-09 | 2692 |
| 0.7876 | 0.6988 | 0.8753 | 0.6831 | 9.643762e-09 | 2693 |
| 0.7859 | 0.6965 | 0.8752 | 0.6831 | 9.643502e-09 | 2694 |
| 0.7782 | 0.7059 | 0.8752 | 0.6831 | 9.643242e-09 | 2695 |
| 0.7887 | 0.7035 | 0.8752 | 0.6831 | 9.6429815e-09 | 2696 |
| 0.7943 | 0.6988 | 0.8752 | 0.6831 | 9.642721e-09 | 2697 |
| 0.7950 | 0.6965 | 0.8752 | 0.6831 | 9.642461e-09 | 2698 |
| 0.7874 | 0.7035 | 0.8751 | 0.6831 | 9.642201e-09 | 2699 |
| 0.7838 | 0.6941 | 0.8751 | 0.6831 | 9.641941e-09 | 2700 |
| 0.7761 | 0.7082 | 0.8751 | 0.6831 | 9.64168e-09 | 2701 |
| 0.7787 | 0.7035 | 0.8751 | 0.6831 | 9.64142e-09 | 2702 |
| 0.7823 | 0.6988 | 0.8751 | 0.6831 | 9.64116e-09 | 2703 |
| 0.7846 | 0.7082 | 0.8750 | 0.6831 | 9.640899e-09 | 2704 |
| 0.7828 | 0.6918 | 0.8750 | 0.6831 | 9.640638e-09 | 2705 |
| 0.7775 | 0.7059 | 0.8750 | 0.6831 | 9.6403765e-09 | 2706 |
| 0.7741 | 0.6918 | 0.8750 | 0.6831 | 9.640115e-09 | 2707 |
| 0.7876 | 0.7012 | 0.8749 | 0.6831 | 9.639854e-09 | 2708 |
| 0.7971 | 0.6965 | 0.8749 | 0.6831 | 9.639593e-09 | 2709 |
| 0.7855 | 0.7035 | 0.8749 | 0.6831 | 9.639332e-09 | 2710 |
| 0.7970 | 0.7012 | 0.8748 | 0.6831 | 9.639071e-09 | 2711 |
| 0.7945 | 0.7082 | 0.8748 | 0.6831 | 9.63881e-09 | 2712 |
| 0.7993 | 0.7129 | 0.8748 | 0.6831 | 9.638549e-09 | 2713 |
| 0.7777 | 0.7035 | 0.8748 | 0.6831 | 9.638287e-09 | 2714 |
| 0.7937 | 0.7012 | 0.8747 | 0.6831 | 9.638025e-09 | 2715 |
| 0.7886 | 0.6965 | 0.8747 | 0.6831 | 9.637763e-09 | 2716 |
| 0.8014 | 0.6941 | 0.8747 | 0.6831 | 9.637501e-09 | 2717 |
| 0.7840 | 0.7012 | 0.8746 | 0.6831 | 9.637239e-09 | 2718 |
| 0.7834 | 0.7035 | 0.8746 | 0.6831 | 9.6369766e-09 | 2719 |
| 0.7889 | 0.7059 | 0.8745 | 0.6831 | 9.6367145e-09 | 2720 |
| 0.7841 | 0.7059 | 0.8745 | 0.6831 | 9.6364525e-09 | 2721 |
| 0.7936 | 0.6965 | 0.8744 | 0.6831 | 9.6361905e-09 | 2722 |
| 0.7794 | 0.7035 | 0.8744 | 0.6831 | 9.6359285e-09 | 2723 |
| 0.7781 | 0.7176 | 0.8744 | 0.6831 | 9.635666e-09 | 2724 |
| 0.7943 | 0.6941 | 0.8744 | 0.6831 | 9.635403e-09 | 2725 |
| 0.7862 | 0.6988 | 0.8743 | 0.6831 | 9.63514e-09 | 2726 |
| 0.7877 | 0.6988 | 0.8743 | 0.6831 | 9.634877e-09 | 2727 |
| 0.7806 | 0.7012 | 0.8742 | 0.6831 | 9.634614e-09 | 2728 |
| 0.7823 | 0.7035 | 0.8742 | 0.6831 | 9.634351e-09 | 2729 |
| 0.7908 | 0.7012 | 0.8742 | 0.6831 | 9.634088e-09 | 2730 |
| 0.7798 | 0.6988 | 0.8742 | 0.6831 | 9.633825e-09 | 2731 |
| 0.7819 | 0.7106 | 0.8741 | 0.6831 | 9.633562e-09 | 2732 |
| 0.7790 | 0.7082 | 0.8741 | 0.6831 | 9.6332995e-09 | 2733 |
| 0.7858 | 0.7129 | 0.8741 | 0.6831 | 9.633036e-09 | 2734 |
| 0.7738 | 0.6918 | 0.8740 | 0.6831 | 9.632772e-09 | 2735 |
| 0.7861 | 0.6965 | 0.8740 | 0.6831 | 9.632508e-09 | 2736 |
| 0.7881 | 0.7035 | 0.8740 | 0.6831 | 9.632244e-09 | 2737 |
| 0.7840 | 0.7012 | 0.8740 | 0.6831 | 9.6319805e-09 | 2738 |
| 0.7920 | 0.7012 | 0.8740 | 0.6831 | 9.631717e-09 | 2739 |
| 0.7871 | 0.7082 | 0.8739 | 0.6831 | 9.631453e-09 | 2740 |
| 0.7917 | 0.7035 | 0.8739 | 0.6831 | 9.631189e-09 | 2741 |
| 0.7885 | 0.6988 | 0.8739 | 0.6831 | 9.630925e-09 | 2742 |
| 0.7797 | 0.7035 | 0.8738 | 0.6831 | 9.630662e-09 | 2743 |
| 0.7641 | 0.7129 | 0.8738 | 0.6831 | 9.630397e-09 | 2744 |
| 0.7859 | 0.6988 | 0.8738 | 0.6831 | 9.630132e-09 | 2745 |
| 0.7828 | 0.6941 | 0.8738 | 0.6831 | 9.629868e-09 | 2746 |
| 0.7924 | 0.7106 | 0.8737 | 0.6831 | 9.629603e-09 | 2747 |
| 0.7949 | 0.7012 | 0.8737 | 0.6831 | 9.629338e-09 | 2748 |
| 0.7914 | 0.7106 | 0.8736 | 0.6831 | 9.6290735e-09 | 2749 |
| 0.7945 | 0.7012 | 0.8736 | 0.6831 | 9.628809e-09 | 2750 |
| 0.7805 | 0.7129 | 0.8736 | 0.6831 | 9.628544e-09 | 2751 |
| 0.7888 | 0.6918 | 0.8736 | 0.6831 | 9.6282795e-09 | 2752 |
| 0.7887 | 0.7153 | 0.8736 | 0.6831 | 9.628015e-09 | 2753 |
| 0.7771 | 0.7035 | 0.8736 | 0.6831 | 9.627749e-09 | 2754 |
| 0.7827 | 0.6988 | 0.8735 | 0.6831 | 9.627484e-09 | 2755 |
| 0.7902 | 0.6894 | 0.8735 | 0.6831 | 9.627218e-09 | 2756 |
| 0.7779 | 0.7012 | 0.8735 | 0.6831 | 9.626953e-09 | 2757 |
| 0.7792 | 0.7176 | 0.8734 | 0.6831 | 9.626687e-09 | 2758 |
| 0.7916 | 0.7106 | 0.8734 | 0.6901 | 9.626421e-09 | 2759 |
| 0.7838 | 0.7059 | 0.8734 | 0.6831 | 9.626156e-09 | 2760 |
| 0.7849 | 0.7035 | 0.8734 | 0.6831 | 9.62589e-09 | 2761 |
| 0.7861 | 0.7012 | 0.8734 | 0.6831 | 9.625625e-09 | 2762 |
| 0.7863 | 0.6988 | 0.8733 | 0.6831 | 9.625359e-09 | 2763 |
| 0.7851 | 0.7059 | 0.8733 | 0.6831 | 9.625093e-09 | 2764 |
| 0.7887 | 0.7012 | 0.8732 | 0.6831 | 9.624826e-09 | 2765 |
| 0.7908 | 0.6894 | 0.8732 | 0.6901 | 9.62456e-09 | 2766 |
| 0.7790 | 0.7129 | 0.8731 | 0.6901 | 9.624293e-09 | 2767 |
| 0.7943 | 0.6918 | 0.8731 | 0.6901 | 9.624027e-09 | 2768 |
| 0.7857 | 0.7035 | 0.8731 | 0.6901 | 9.6237605e-09 | 2769 |
| 0.7759 | 0.6941 | 0.8731 | 0.6901 | 9.623494e-09 | 2770 |
| 0.7783 | 0.7012 | 0.8730 | 0.6901 | 9.6232275e-09 | 2771 |
| 0.7937 | 0.6965 | 0.8730 | 0.6901 | 9.622961e-09 | 2772 |
| 0.7875 | 0.7012 | 0.8729 | 0.6901 | 9.622695e-09 | 2773 |
| 0.7836 | 0.7035 | 0.8729 | 0.6901 | 9.622427e-09 | 2774 |
| 0.7796 | 0.7106 | 0.8729 | 0.6901 | 9.62216e-09 | 2775 |
| 0.7853 | 0.6965 | 0.8728 | 0.6901 | 9.621893e-09 | 2776 |
| 0.7796 | 0.7082 | 0.8728 | 0.6901 | 9.621625e-09 | 2777 |
| 0.7815 | 0.7059 | 0.8728 | 0.6901 | 9.621358e-09 | 2778 |
| 0.7861 | 0.6988 | 0.8728 | 0.6901 | 9.621091e-09 | 2779 |
| 0.7785 | 0.7200 | 0.8728 | 0.6901 | 9.620823e-09 | 2780 |
| 0.7746 | 0.7200 | 0.8728 | 0.6901 | 9.620556e-09 | 2781 |
| 0.7846 | 0.7012 | 0.8727 | 0.6901 | 9.620289e-09 | 2782 |
| 0.7762 | 0.7082 | 0.8727 | 0.6901 | 9.620021e-09 | 2783 |
| 0.7839 | 0.6988 | 0.8727 | 0.6901 | 9.619753e-09 | 2784 |
| 0.7904 | 0.7012 | 0.8727 | 0.6901 | 9.619485e-09 | 2785 |
| 0.7776 | 0.7012 | 0.8726 | 0.6901 | 9.6192165e-09 | 2786 |
| 0.7917 | 0.7035 | 0.8726 | 0.6901 | 9.618948e-09 | 2787 |
| 0.7727 | 0.7012 | 0.8725 | 0.6901 | 9.61868e-09 | 2788 |
| 0.7771 | 0.7059 | 0.8725 | 0.6901 | 9.618412e-09 | 2789 |
| 0.7815 | 0.7035 | 0.8725 | 0.6901 | 9.618144e-09 | 2790 |
| 0.7895 | 0.6871 | 0.8725 | 0.6901 | 9.617875e-09 | 2791 |
| 0.7786 | 0.7035 | 0.8725 | 0.6901 | 9.617607e-09 | 2792 |
| 0.7826 | 0.7082 | 0.8724 | 0.6901 | 9.617339e-09 | 2793 |
| 0.7797 | 0.7176 | 0.8724 | 0.6901 | 9.61707e-09 | 2794 |
| 0.7789 | 0.7082 | 0.8724 | 0.6901 | 9.616801e-09 | 2795 |
| 0.7827 | 0.6918 | 0.8724 | 0.6901 | 9.616532e-09 | 2796 |
| 0.7765 | 0.6988 | 0.8723 | 0.6901 | 9.6162625e-09 | 2797 |
| 0.7840 | 0.6988 | 0.8723 | 0.6901 | 9.615993e-09 | 2798 |
| 0.7625 | 0.7012 | 0.8723 | 0.6901 | 9.615724e-09 | 2799 |
| 0.7821 | 0.7012 | 0.8723 | 0.6901 | 9.615455e-09 | 2800 |
| 0.7721 | 0.7153 | 0.8722 | 0.6901 | 9.615186e-09 | 2801 |
| 0.7727 | 0.7012 | 0.8722 | 0.6901 | 9.614917e-09 | 2802 |
| 0.7748 | 0.7059 | 0.8722 | 0.6901 | 9.614648e-09 | 2803 |
| 0.7846 | 0.7035 | 0.8722 | 0.6901 | 9.614378e-09 | 2804 |
| 0.7738 | 0.7035 | 0.8722 | 0.6901 | 9.614108e-09 | 2805 |
| 0.7876 | 0.6965 | 0.8722 | 0.6901 | 9.613838e-09 | 2806 |
| 0.7863 | 0.6988 | 0.8721 | 0.6901 | 9.613568e-09 | 2807 |
| 0.7707 | 0.7035 | 0.8721 | 0.6901 | 9.613298e-09 | 2808 |
| 0.7819 | 0.7059 | 0.8721 | 0.6901 | 9.613028e-09 | 2809 |
| 0.7753 | 0.7153 | 0.8721 | 0.6901 | 9.612758e-09 | 2810 |
| 0.8003 | 0.7106 | 0.8720 | 0.6901 | 9.612488e-09 | 2811 |
| 0.7841 | 0.7012 | 0.8720 | 0.6901 | 9.612218e-09 | 2812 |
| 0.7747 | 0.7129 | 0.8719 | 0.6831 | 9.611948e-09 | 2813 |
| 0.7759 | 0.6965 | 0.8719 | 0.6831 | 9.611677e-09 | 2814 |
| 0.7895 | 0.6941 | 0.8718 | 0.6831 | 9.611406e-09 | 2815 |
| 0.7960 | 0.6965 | 0.8718 | 0.6831 | 9.611135e-09 | 2816 |
| 0.7692 | 0.7059 | 0.8718 | 0.6831 | 9.610864e-09 | 2817 |
| 0.7825 | 0.7082 | 0.8718 | 0.6831 | 9.610593e-09 | 2818 |
| 0.7675 | 0.7129 | 0.8717 | 0.6831 | 9.610322e-09 | 2819 |
| 0.7649 | 0.7059 | 0.8717 | 0.6831 | 9.610051e-09 | 2820 |
| 0.7916 | 0.7059 | 0.8716 | 0.6831 | 9.6097805e-09 | 2821 |
| 0.7697 | 0.7153 | 0.8716 | 0.6831 | 9.60951e-09 | 2822 |
| 0.7750 | 0.7176 | 0.8716 | 0.6831 | 9.609239e-09 | 2823 |
| 0.7876 | 0.7129 | 0.8716 | 0.6831 | 9.608967e-09 | 2824 |
| 0.7712 | 0.7129 | 0.8716 | 0.6831 | 9.608695e-09 | 2825 |
| 0.7801 | 0.7106 | 0.8715 | 0.6831 | 9.608423e-09 | 2826 |
| 0.7789 | 0.6941 | 0.8715 | 0.6831 | 9.608152e-09 | 2827 |
| 0.7704 | 0.7176 | 0.8715 | 0.6831 | 9.60788e-09 | 2828 |
| 0.7711 | 0.7200 | 0.8715 | 0.6831 | 9.607608e-09 | 2829 |
| 0.7837 | 0.7153 | 0.8714 | 0.6761 | 9.607336e-09 | 2830 |
| 0.7813 | 0.7200 | 0.8714 | 0.6761 | 9.6070645e-09 | 2831 |
| 0.7800 | 0.7082 | 0.8714 | 0.6761 | 9.606793e-09 | 2832 |
| 0.7722 | 0.7106 | 0.8713 | 0.6761 | 9.606521e-09 | 2833 |
| 0.7726 | 0.7106 | 0.8713 | 0.6761 | 9.606248e-09 | 2834 |
| 0.7752 | 0.7153 | 0.8713 | 0.6761 | 9.605976e-09 | 2835 |
| 0.7731 | 0.7176 | 0.8713 | 0.6761 | 9.605703e-09 | 2836 |
| 0.7878 | 0.7035 | 0.8712 | 0.6761 | 9.60543e-09 | 2837 |
| 0.7771 | 0.7082 | 0.8712 | 0.6761 | 9.605158e-09 | 2838 |
| 0.7677 | 0.7200 | 0.8712 | 0.6761 | 9.604885e-09 | 2839 |
| 0.7856 | 0.6965 | 0.8712 | 0.6761 | 9.604612e-09 | 2840 |
| 0.7747 | 0.7035 | 0.8712 | 0.6761 | 9.6043395e-09 | 2841 |
| 0.7735 | 0.7153 | 0.8711 | 0.6761 | 9.604067e-09 | 2842 |
| 0.7723 | 0.7059 | 0.8711 | 0.6761 | 9.603794e-09 | 2843 |
| 0.7730 | 0.7059 | 0.8711 | 0.6761 | 9.603521e-09 | 2844 |
| 0.7826 | 0.7106 | 0.8711 | 0.6761 | 9.603247e-09 | 2845 |
| 0.7694 | 0.7153 | 0.8710 | 0.6761 | 9.6029735e-09 | 2846 |
| 0.7789 | 0.7082 | 0.8710 | 0.6761 | 9.6027e-09 | 2847 |
| 0.7723 | 0.7035 | 0.8710 | 0.6761 | 9.602426e-09 | 2848 |
| 0.7734 | 0.7082 | 0.8710 | 0.6761 | 9.602153e-09 | 2849 |
| 0.7771 | 0.7012 | 0.8709 | 0.6761 | 9.601879e-09 | 2850 |
| 0.7757 | 0.7059 | 0.8709 | 0.6761 | 9.601606e-09 | 2851 |
| 0.7639 | 0.7106 | 0.8709 | 0.6761 | 9.601332e-09 | 2852 |
| 0.7830 | 0.7106 | 0.8709 | 0.6761 | 9.601059e-09 | 2853 |
| 0.7762 | 0.7059 | 0.8709 | 0.6761 | 9.600784e-09 | 2854 |
| 0.7751 | 0.7129 | 0.8708 | 0.6761 | 9.60051e-09 | 2855 |
| 0.7729 | 0.7200 | 0.8708 | 0.6761 | 9.600235e-09 | 2856 |
| 0.7640 | 0.7200 | 0.8708 | 0.6761 | 9.599961e-09 | 2857 |
| 0.7802 | 0.6918 | 0.8708 | 0.6761 | 9.599686e-09 | 2858 |
| 0.7761 | 0.7129 | 0.8707 | 0.6761 | 9.599412e-09 | 2859 |
| 0.7851 | 0.7035 | 0.8707 | 0.6761 | 9.5991375e-09 | 2860 |
| 0.7873 | 0.7059 | 0.8707 | 0.6761 | 9.598863e-09 | 2861 |
| 0.7683 | 0.7012 | 0.8706 | 0.6761 | 9.598589e-09 | 2862 |
| 0.7732 | 0.7176 | 0.8706 | 0.6761 | 9.598314e-09 | 2863 |
| 0.7721 | 0.7035 | 0.8706 | 0.6761 | 9.59804e-09 | 2864 |
| 0.7796 | 0.7059 | 0.8706 | 0.6761 | 9.597764e-09 | 2865 |
| 0.7717 | 0.7106 | 0.8706 | 0.6761 | 9.597489e-09 | 2866 |
| 0.7875 | 0.6965 | 0.8706 | 0.6761 | 9.597214e-09 | 2867 |
| 0.7782 | 0.7129 | 0.8706 | 0.6761 | 9.596938e-09 | 2868 |
| 0.7721 | 0.7106 | 0.8705 | 0.6761 | 9.596663e-09 | 2869 |
| 0.7718 | 0.7106 | 0.8705 | 0.6761 | 9.596388e-09 | 2870 |
| 0.7771 | 0.7012 | 0.8705 | 0.6761 | 9.596112e-09 | 2871 |
| 0.7625 | 0.7059 | 0.8704 | 0.6761 | 9.595837e-09 | 2872 |
| 0.7689 | 0.7247 | 0.8704 | 0.6761 | 9.595562e-09 | 2873 |
| 0.7728 | 0.7153 | 0.8704 | 0.6761 | 9.595286e-09 | 2874 |
| 0.7729 | 0.7035 | 0.8703 | 0.6761 | 9.59501e-09 | 2875 |
| 0.7831 | 0.7035 | 0.8703 | 0.6761 | 9.594734e-09 | 2876 |
| 0.7659 | 0.7035 | 0.8703 | 0.6761 | 9.594458e-09 | 2877 |
| 0.7601 | 0.7200 | 0.8702 | 0.6761 | 9.5941814e-09 | 2878 |
| 0.7685 | 0.7035 | 0.8702 | 0.6761 | 9.593905e-09 | 2879 |
| 0.7714 | 0.7153 | 0.8702 | 0.6761 | 9.593629e-09 | 2880 |
| 0.7698 | 0.7129 | 0.8702 | 0.6761 | 9.593353e-09 | 2881 |
| 0.7700 | 0.7035 | 0.8702 | 0.6761 | 9.5930766e-09 | 2882 |
| 0.7771 | 0.7035 | 0.8701 | 0.6761 | 9.5928e-09 | 2883 |
| 0.7706 | 0.7082 | 0.8701 | 0.6761 | 9.592524e-09 | 2884 |
| 0.7790 | 0.7059 | 0.8701 | 0.6761 | 9.592247e-09 | 2885 |
| 0.7861 | 0.7059 | 0.8700 | 0.6761 | 9.59197e-09 | 2886 |
| 0.7716 | 0.7035 | 0.8699 | 0.6761 | 9.591693e-09 | 2887 |
| 0.7836 | 0.7012 | 0.8699 | 0.6761 | 9.591416e-09 | 2888 |
| 0.7718 | 0.7176 | 0.8698 | 0.6761 | 9.5911386e-09 | 2889 |
| 0.7713 | 0.7153 | 0.8698 | 0.6761 | 9.590861e-09 | 2890 |
| 0.7755 | 0.7059 | 0.8698 | 0.6761 | 9.590584e-09 | 2891 |
| 0.7630 | 0.7176 | 0.8698 | 0.6761 | 9.590307e-09 | 2892 |
| 0.7732 | 0.7012 | 0.8698 | 0.6761 | 9.59003e-09 | 2893 |
| 0.7909 | 0.7082 | 0.8697 | 0.6761 | 9.589753e-09 | 2894 |
| 0.7846 | 0.6918 | 0.8697 | 0.6761 | 9.589475e-09 | 2895 |
| 0.7828 | 0.6965 | 0.8697 | 0.6761 | 9.589197e-09 | 2896 |
| 0.7745 | 0.7012 | 0.8697 | 0.6761 | 9.588919e-09 | 2897 |
| 0.7721 | 0.7106 | 0.8696 | 0.6761 | 9.588641e-09 | 2898 |
| 0.7690 | 0.7059 | 0.8696 | 0.6761 | 9.588363e-09 | 2899 |
| 0.7670 | 0.7224 | 0.8696 | 0.6761 | 9.588085e-09 | 2900 |
| 0.7805 | 0.7176 | 0.8696 | 0.6761 | 9.587807e-09 | 2901 |
| 0.7662 | 0.7176 | 0.8696 | 0.6761 | 9.587529e-09 | 2902 |
| 0.7671 | 0.7176 | 0.8695 | 0.6761 | 9.587251e-09 | 2903 |
| 0.7701 | 0.7082 | 0.8695 | 0.6761 | 9.586973e-09 | 2904 |
| 0.7733 | 0.7035 | 0.8695 | 0.6761 | 9.586694e-09 | 2905 |
| 0.7769 | 0.7082 | 0.8695 | 0.6761 | 9.586415e-09 | 2906 |
| 0.7826 | 0.7082 | 0.8695 | 0.6761 | 9.586136e-09 | 2907 |
| 0.7660 | 0.7224 | 0.8695 | 0.6761 | 9.585857e-09 | 2908 |
| 0.7686 | 0.7247 | 0.8695 | 0.6761 | 9.5855786e-09 | 2909 |
| 0.7746 | 0.7224 | 0.8695 | 0.6761 | 9.5853e-09 | 2910 |
| 0.7726 | 0.7153 | 0.8694 | 0.6761 | 9.585021e-09 | 2911 |
| 0.7746 | 0.7059 | 0.8694 | 0.6761 | 9.584742e-09 | 2912 |
| 0.7722 | 0.7082 | 0.8694 | 0.6761 | 9.584463e-09 | 2913 |
| 0.7748 | 0.7200 | 0.8693 | 0.6761 | 9.584184e-09 | 2914 |
| 0.7731 | 0.7106 | 0.8693 | 0.6761 | 9.583904e-09 | 2915 |
| 0.7807 | 0.7059 | 0.8692 | 0.6761 | 9.583625e-09 | 2916 |
| 0.7707 | 0.7082 | 0.8692 | 0.6761 | 9.583345e-09 | 2917 |
| 0.7713 | 0.7059 | 0.8692 | 0.6761 | 9.583065e-09 | 2918 |
| 0.7700 | 0.7059 | 0.8692 | 0.6831 | 9.582785e-09 | 2919 |
| 0.7787 | 0.7153 | 0.8692 | 0.6831 | 9.5825055e-09 | 2920 |
| 0.7699 | 0.7082 | 0.8691 | 0.6831 | 9.582226e-09 | 2921 |
| 0.7718 | 0.7059 | 0.8691 | 0.6831 | 9.581946e-09 | 2922 |
| 0.7739 | 0.6988 | 0.8691 | 0.6831 | 9.581666e-09 | 2923 |
| 0.7796 | 0.7035 | 0.8691 | 0.6831 | 9.581386e-09 | 2924 |
| 0.7630 | 0.7153 | 0.8691 | 0.6831 | 9.581106e-09 | 2925 |
| 0.7712 | 0.6941 | 0.8691 | 0.6831 | 9.580825e-09 | 2926 |
| 0.7619 | 0.7035 | 0.8691 | 0.6831 | 9.580544e-09 | 2927 |
| 0.7705 | 0.7035 | 0.8690 | 0.6831 | 9.580264e-09 | 2928 |
| 0.7800 | 0.7129 | 0.8690 | 0.6831 | 9.579983e-09 | 2929 |
| 0.7786 | 0.7106 | 0.8690 | 0.6831 | 9.579702e-09 | 2930 |
| 0.7884 | 0.7059 | 0.8689 | 0.6831 | 9.579422e-09 | 2931 |
| 0.7709 | 0.7059 | 0.8689 | 0.6831 | 9.579141e-09 | 2932 |
| 0.7773 | 0.7106 | 0.8689 | 0.6831 | 9.57886e-09 | 2933 |
| 0.7638 | 0.7176 | 0.8689 | 0.6831 | 9.57858e-09 | 2934 |
| 0.7684 | 0.7129 | 0.8689 | 0.6831 | 9.578298e-09 | 2935 |
| 0.7762 | 0.7059 | 0.8688 | 0.6831 | 9.578017e-09 | 2936 |
| 0.7765 | 0.7012 | 0.8688 | 0.6831 | 9.577735e-09 | 2937 |
| 0.7725 | 0.7294 | 0.8688 | 0.6831 | 9.5774535e-09 | 2938 |
| 0.7702 | 0.7153 | 0.8688 | 0.6831 | 9.577172e-09 | 2939 |
| 0.7734 | 0.7059 | 0.8688 | 0.6831 | 9.57689e-09 | 2940 |
| 0.7699 | 0.7176 | 0.8687 | 0.6831 | 9.576609e-09 | 2941 |
| 0.7668 | 0.7035 | 0.8687 | 0.6831 | 9.576327e-09 | 2942 |
| 0.7650 | 0.7247 | 0.8687 | 0.6831 | 9.576046e-09 | 2943 |
| 0.7789 | 0.6918 | 0.8687 | 0.6831 | 9.575764e-09 | 2944 |
| 0.7658 | 0.7153 | 0.8686 | 0.6831 | 9.575482e-09 | 2945 |
| 0.7794 | 0.7106 | 0.8686 | 0.6831 | 9.575199e-09 | 2946 |
| 0.7773 | 0.7200 | 0.8686 | 0.6831 | 9.574917e-09 | 2947 |
| 0.7768 | 0.7082 | 0.8686 | 0.6831 | 9.574634e-09 | 2948 |
| 0.7630 | 0.7082 | 0.8686 | 0.6831 | 9.574352e-09 | 2949 |
| 0.7707 | 0.7200 | 0.8685 | 0.6831 | 9.5740695e-09 | 2950 |
| 0.7598 | 0.7059 | 0.8685 | 0.6831 | 9.573787e-09 | 2951 |
| 0.7579 | 0.7106 | 0.8685 | 0.6831 | 9.573505e-09 | 2952 |
| 0.7712 | 0.6965 | 0.8685 | 0.6831 | 9.573222e-09 | 2953 |
| 0.7694 | 0.7129 | 0.8685 | 0.6831 | 9.57294e-09 | 2954 |
| 0.7819 | 0.6988 | 0.8685 | 0.6831 | 9.572657e-09 | 2955 |
| 0.7718 | 0.7012 | 0.8685 | 0.6831 | 9.572374e-09 | 2956 |
| 0.7682 | 0.7059 | 0.8684 | 0.6831 | 9.572091e-09 | 2957 |
| 0.7655 | 0.7176 | 0.8684 | 0.6831 | 9.571807e-09 | 2958 |
| 0.7694 | 0.7129 | 0.8684 | 0.6831 | 9.571524e-09 | 2959 |
| 0.7685 | 0.7200 | 0.8684 | 0.6831 | 9.571241e-09 | 2960 |
| 0.7753 | 0.7012 | 0.8684 | 0.6831 | 9.570957e-09 | 2961 |
| 0.7791 | 0.7176 | 0.8683 | 0.6831 | 9.570674e-09 | 2962 |
| 0.7869 | 0.7012 | 0.8683 | 0.6831 | 9.570391e-09 | 2963 |
| 0.7678 | 0.7129 | 0.8683 | 0.6831 | 9.570107e-09 | 2964 |
| 0.7786 | 0.7035 | 0.8683 | 0.6831 | 9.569824e-09 | 2965 |
| 0.7591 | 0.7082 | 0.8683 | 0.6831 | 9.56954e-09 | 2966 |
| 0.7756 | 0.7082 | 0.8683 | 0.6831 | 9.569256e-09 | 2967 |
| 0.7699 | 0.6965 | 0.8683 | 0.6831 | 9.568971e-09 | 2968 |
| 0.7703 | 0.7082 | 0.8682 | 0.6831 | 9.568687e-09 | 2969 |
| 0.7761 | 0.7224 | 0.8682 | 0.6831 | 9.568403e-09 | 2970 |
| 0.7562 | 0.7059 | 0.8682 | 0.6831 | 9.568119e-09 | 2971 |
| 0.7686 | 0.7176 | 0.8681 | 0.6831 | 9.5678345e-09 | 2972 |
| 0.7710 | 0.7059 | 0.8681 | 0.6831 | 9.56755e-09 | 2973 |
| 0.7660 | 0.7153 | 0.8681 | 0.6831 | 9.567266e-09 | 2974 |
| 0.7633 | 0.7035 | 0.8681 | 0.6831 | 9.566982e-09 | 2975 |
| 0.7609 | 0.7200 | 0.8681 | 0.6831 | 9.566697e-09 | 2976 |
| 0.7780 | 0.7129 | 0.8681 | 0.6831 | 9.566412e-09 | 2977 |
| 0.7675 | 0.7035 | 0.8681 | 0.6831 | 9.566127e-09 | 2978 |
| 0.7653 | 0.7200 | 0.8680 | 0.6901 | 9.5658415e-09 | 2979 |
| 0.7750 | 0.7059 | 0.8681 | 0.6901 | 9.565556e-09 | 2980 |
| 0.7616 | 0.7153 | 0.8680 | 0.6901 | 9.565271e-09 | 2981 |
| 0.7700 | 0.6988 | 0.8680 | 0.6901 | 9.564986e-09 | 2982 |
| 0.7760 | 0.7200 | 0.8680 | 0.6901 | 9.564701e-09 | 2983 |
| 0.7561 | 0.7200 | 0.8679 | 0.6901 | 9.564416e-09 | 2984 |
| 0.7751 | 0.7059 | 0.8679 | 0.6901 | 9.564131e-09 | 2985 |
| 0.7691 | 0.7059 | 0.8679 | 0.6901 | 9.563845e-09 | 2986 |
| 0.7706 | 0.7106 | 0.8678 | 0.6901 | 9.563559e-09 | 2987 |
| 0.7526 | 0.7271 | 0.8679 | 0.6901 | 9.563273e-09 | 2988 |
| 0.7736 | 0.7059 | 0.8678 | 0.6901 | 9.562987e-09 | 2989 |
| 0.7747 | 0.7200 | 0.8678 | 0.6901 | 9.562701e-09 | 2990 |
| 0.7592 | 0.7200 | 0.8678 | 0.6901 | 9.562415e-09 | 2991 |
| 0.7621 | 0.7106 | 0.8678 | 0.6901 | 9.562129e-09 | 2992 |
| 0.7703 | 0.7106 | 0.8677 | 0.6831 | 9.561843e-09 | 2993 |
| 0.7607 | 0.7153 | 0.8677 | 0.6901 | 9.561557e-09 | 2994 |
| 0.7649 | 0.7129 | 0.8677 | 0.6831 | 9.561271e-09 | 2995 |
| 0.7776 | 0.7012 | 0.8676 | 0.6831 | 9.560984e-09 | 2996 |
| 0.7682 | 0.7153 | 0.8676 | 0.6831 | 9.560697e-09 | 2997 |
| 0.7645 | 0.7106 | 0.8676 | 0.6831 | 9.56041e-09 | 2998 |
| 0.7581 | 0.7294 | 0.8676 | 0.6831 | 9.560123e-09 | 2999 |
| 0.7632 | 0.7082 | 0.8676 | 0.6831 | 9.5598365e-09 | 3000 |
| 0.7649 | 0.7012 | 0.8675 | 0.6831 | 9.55955e-09 | 3001 |
| 0.7686 | 0.7106 | 0.8675 | 0.6831 | 9.559263e-09 | 3002 |
| 0.7665 | 0.7200 | 0.8674 | 0.6831 | 9.558976e-09 | 3003 |
| 0.7633 | 0.7106 | 0.8674 | 0.6831 | 9.558689e-09 | 3004 |
| 0.7575 | 0.7082 | 0.8673 | 0.6831 | 9.558402e-09 | 3005 |
| 0.7673 | 0.7176 | 0.8673 | 0.6831 | 9.558115e-09 | 3006 |
| 0.7681 | 0.7082 | 0.8673 | 0.6831 | 9.557827e-09 | 3007 |
| 0.7600 | 0.7035 | 0.8672 | 0.6831 | 9.55754e-09 | 3008 |
| 0.7650 | 0.7176 | 0.8672 | 0.6831 | 9.557252e-09 | 3009 |
| 0.7783 | 0.6988 | 0.8672 | 0.6831 | 9.556964e-09 | 3010 |
| 0.7520 | 0.7200 | 0.8671 | 0.6831 | 9.556676e-09 | 3011 |
| 0.7645 | 0.7200 | 0.8671 | 0.6831 | 9.556389e-09 | 3012 |
| 0.7750 | 0.7035 | 0.8671 | 0.6831 | 9.556101e-09 | 3013 |
| 0.7678 | 0.7176 | 0.8671 | 0.6831 | 9.555813e-09 | 3014 |
| 0.7693 | 0.7106 | 0.8671 | 0.6831 | 9.555525e-09 | 3015 |
| 0.7667 | 0.7082 | 0.8670 | 0.6831 | 9.5552375e-09 | 3016 |
| 0.7630 | 0.7176 | 0.8670 | 0.6831 | 9.554949e-09 | 3017 |
| 0.7597 | 0.7271 | 0.8670 | 0.6831 | 9.55466e-09 | 3018 |
| 0.7679 | 0.7176 | 0.8669 | 0.6831 | 9.5543715e-09 | 3019 |
| 0.7633 | 0.7176 | 0.8669 | 0.6831 | 9.554083e-09 | 3020 |
| 0.7613 | 0.7082 | 0.8668 | 0.6901 | 9.553794e-09 | 3021 |
| 0.7702 | 0.7153 | 0.8668 | 0.6901 | 9.5535055e-09 | 3022 |
| 0.7571 | 0.7224 | 0.8668 | 0.6901 | 9.553217e-09 | 3023 |
| 0.7713 | 0.7035 | 0.8668 | 0.6901 | 9.552928e-09 | 3024 |
| 0.7669 | 0.7247 | 0.8668 | 0.6901 | 9.55264e-09 | 3025 |
| 0.7632 | 0.7059 | 0.8668 | 0.6901 | 9.552351e-09 | 3026 |
| 0.7631 | 0.7082 | 0.8668 | 0.6901 | 9.552061e-09 | 3027 |
| 0.7606 | 0.7059 | 0.8667 | 0.6901 | 9.551772e-09 | 3028 |
| 0.7552 | 0.7200 | 0.8668 | 0.6901 | 9.551482e-09 | 3029 |
| 0.7716 | 0.7106 | 0.8667 | 0.6901 | 9.551193e-09 | 3030 |
| 0.7643 | 0.7271 | 0.8667 | 0.6901 | 9.550903e-09 | 3031 |
| 0.7545 | 0.7106 | 0.8667 | 0.6901 | 9.550614e-09 | 3032 |
| 0.7733 | 0.7082 | 0.8666 | 0.6901 | 9.550324e-09 | 3033 |
| 0.7645 | 0.7176 | 0.8666 | 0.6901 | 9.5500345e-09 | 3034 |
| 0.7551 | 0.7200 | 0.8666 | 0.6901 | 9.549745e-09 | 3035 |
| 0.7685 | 0.7153 | 0.8666 | 0.6901 | 9.5494554e-09 | 3036 |
| 0.7742 | 0.7176 | 0.8666 | 0.6901 | 9.549165e-09 | 3037 |
| 0.7570 | 0.7129 | 0.8666 | 0.6901 | 9.548875e-09 | 3038 |
| 0.7696 | 0.7106 | 0.8666 | 0.6901 | 9.548584e-09 | 3039 |
| 0.7600 | 0.7129 | 0.8665 | 0.6901 | 9.548294e-09 | 3040 |
| 0.7658 | 0.7129 | 0.8665 | 0.6901 | 9.548003e-09 | 3041 |
| 0.7642 | 0.7176 | 0.8664 | 0.6901 | 9.547713e-09 | 3042 |
| 0.7690 | 0.7200 | 0.8664 | 0.6901 | 9.547422e-09 | 3043 |
| 0.7683 | 0.7247 | 0.8663 | 0.6901 | 9.547132e-09 | 3044 |
| 0.7661 | 0.7153 | 0.8664 | 0.6901 | 9.5468415e-09 | 3045 |
| 0.7662 | 0.7035 | 0.8663 | 0.6901 | 9.546551e-09 | 3046 |
| 0.7590 | 0.7106 | 0.8663 | 0.6901 | 9.546261e-09 | 3047 |
| 0.7622 | 0.7224 | 0.8663 | 0.6901 | 9.545969e-09 | 3048 |
| 0.7513 | 0.7318 | 0.8663 | 0.6901 | 9.545678e-09 | 3049 |
| 0.7664 | 0.7200 | 0.8662 | 0.6901 | 9.545387e-09 | 3050 |
| 0.7672 | 0.7200 | 0.8662 | 0.6901 | 9.545095e-09 | 3051 |
| 0.7648 | 0.7106 | 0.8662 | 0.6901 | 9.544804e-09 | 3052 |
| 0.7620 | 0.7224 | 0.8662 | 0.6901 | 9.544513e-09 | 3053 |
| 0.7520 | 0.7176 | 0.8662 | 0.6901 | 9.544221e-09 | 3054 |
| 0.7598 | 0.7153 | 0.8662 | 0.6901 | 9.54393e-09 | 3055 |
| 0.7709 | 0.7271 | 0.8661 | 0.6901 | 9.543639e-09 | 3056 |
| 0.7684 | 0.7106 | 0.8661 | 0.6901 | 9.5433474e-09 | 3057 |
| 0.7593 | 0.7082 | 0.8661 | 0.6901 | 9.543055e-09 | 3058 |
| 0.7584 | 0.7176 | 0.8661 | 0.6901 | 9.542763e-09 | 3059 |
| 0.7694 | 0.7106 | 0.8661 | 0.6901 | 9.542471e-09 | 3060 |
| 0.7565 | 0.7129 | 0.8661 | 0.6901 | 9.542179e-09 | 3061 |
| 0.7491 | 0.7153 | 0.8660 | 0.6901 | 9.541886e-09 | 3062 |
| 0.7574 | 0.7318 | 0.8660 | 0.6901 | 9.541594e-09 | 3063 |
| 0.7677 | 0.7247 | 0.8660 | 0.6901 | 9.541302e-09 | 3064 |
| 0.7590 | 0.7176 | 0.8660 | 0.6901 | 9.54101e-09 | 3065 |
| 0.7626 | 0.6988 | 0.8660 | 0.6901 | 9.5407175e-09 | 3066 |
| 0.7657 | 0.7129 | 0.8660 | 0.6901 | 9.540425e-09 | 3067 |
| 0.7609 | 0.7082 | 0.8660 | 0.6901 | 9.540132e-09 | 3068 |
| 0.7603 | 0.7129 | 0.8659 | 0.6901 | 9.539839e-09 | 3069 |
| 0.7607 | 0.7082 | 0.8659 | 0.6901 | 9.539546e-09 | 3070 |
| 0.7607 | 0.7176 | 0.8659 | 0.6901 | 9.539253e-09 | 3071 |
| 0.7582 | 0.7200 | 0.8659 | 0.6901 | 9.53896e-09 | 3072 |
| 0.7712 | 0.7176 | 0.8658 | 0.6901 | 9.538667e-09 | 3073 |
| 0.7623 | 0.7129 | 0.8658 | 0.6901 | 9.538374e-09 | 3074 |
| 0.7456 | 0.7271 | 0.8658 | 0.6901 | 9.5380805e-09 | 3075 |
| 0.7578 | 0.7271 | 0.8658 | 0.6901 | 9.5377874e-09 | 3076 |
| 0.7638 | 0.7224 | 0.8658 | 0.6901 | 9.537494e-09 | 3077 |
| 0.7696 | 0.7129 | 0.8658 | 0.6901 | 9.5372e-09 | 3078 |
| 0.7540 | 0.7153 | 0.8657 | 0.6901 | 9.536906e-09 | 3079 |
| 0.7544 | 0.7200 | 0.8657 | 0.6901 | 9.536612e-09 | 3080 |
| 0.7593 | 0.7200 | 0.8657 | 0.6901 | 9.536318e-09 | 3081 |
| 0.7659 | 0.7129 | 0.8657 | 0.6901 | 9.536024e-09 | 3082 |
| 0.7584 | 0.7176 | 0.8656 | 0.6901 | 9.53573e-09 | 3083 |
| 0.7625 | 0.7247 | 0.8656 | 0.6901 | 9.535436e-09 | 3084 |
| 0.7669 | 0.7271 | 0.8656 | 0.6901 | 9.5351425e-09 | 3085 |
| 0.7585 | 0.7271 | 0.8656 | 0.6901 | 9.5348485e-09 | 3086 |
| 0.7575 | 0.6988 | 0.8656 | 0.6901 | 9.5345545e-09 | 3087 |
| 0.7565 | 0.7176 | 0.8656 | 0.6901 | 9.5342605e-09 | 3088 |
| 0.7525 | 0.7035 | 0.8655 | 0.6901 | 9.533966e-09 | 3089 |
| 0.7544 | 0.7294 | 0.8655 | 0.6901 | 9.533671e-09 | 3090 |
| 0.7603 | 0.7200 | 0.8655 | 0.6901 | 9.533376e-09 | 3091 |
| 0.7627 | 0.7200 | 0.8655 | 0.6901 | 9.533081e-09 | 3092 |
| 0.7547 | 0.7129 | 0.8654 | 0.6901 | 9.532786e-09 | 3093 |
| 0.7532 | 0.7200 | 0.8655 | 0.6901 | 9.532491e-09 | 3094 |
| 0.7511 | 0.7247 | 0.8654 | 0.6901 | 9.532196e-09 | 3095 |
| 0.7619 | 0.7059 | 0.8654 | 0.6901 | 9.5319015e-09 | 3096 |
| 0.7728 | 0.6988 | 0.8654 | 0.6901 | 9.531607e-09 | 3097 |
| 0.7659 | 0.7176 | 0.8654 | 0.6901 | 9.531312e-09 | 3098 |
| 0.7556 | 0.7176 | 0.8653 | 0.6901 | 9.531016e-09 | 3099 |
| 0.7621 | 0.7294 | 0.8653 | 0.6901 | 9.53072e-09 | 3100 |
| 0.7578 | 0.7153 | 0.8653 | 0.6901 | 9.5304244e-09 | 3101 |
| 0.7575 | 0.7294 | 0.8653 | 0.6901 | 9.530129e-09 | 3102 |
| 0.7598 | 0.7224 | 0.8653 | 0.6901 | 9.529833e-09 | 3103 |
| 0.7689 | 0.7153 | 0.8653 | 0.6901 | 9.529537e-09 | 3104 |
| 0.7625 | 0.7176 | 0.8652 | 0.6901 | 9.529241e-09 | 3105 |
| 0.7624 | 0.7082 | 0.8652 | 0.6901 | 9.528946e-09 | 3106 |
| 0.7646 | 0.7200 | 0.8652 | 0.6901 | 9.52865e-09 | 3107 |
| 0.7583 | 0.7153 | 0.8652 | 0.6901 | 9.528354e-09 | 3108 |
| 0.7570 | 0.7294 | 0.8652 | 0.6901 | 9.5280575e-09 | 3109 |
| 0.7624 | 0.7059 | 0.8652 | 0.6901 | 9.527761e-09 | 3110 |
| 0.7724 | 0.7153 | 0.8651 | 0.6901 | 9.527464e-09 | 3111 |
| 0.7496 | 0.7153 | 0.8651 | 0.6901 | 9.5271675e-09 | 3112 |
| 0.7584 | 0.7271 | 0.8651 | 0.6901 | 9.526871e-09 | 3113 |
| 0.7625 | 0.7224 | 0.8651 | 0.6901 | 9.526574e-09 | 3114 |
| 0.7594 | 0.7106 | 0.8651 | 0.6901 | 9.5262775e-09 | 3115 |
| 0.7468 | 0.7247 | 0.8650 | 0.6901 | 9.525981e-09 | 3116 |
| 0.7572 | 0.7176 | 0.8650 | 0.6901 | 9.525684e-09 | 3117 |
| 0.7535 | 0.7224 | 0.8650 | 0.6901 | 9.525388e-09 | 3118 |
| 0.7647 | 0.7224 | 0.8650 | 0.6901 | 9.525091e-09 | 3119 |
| 0.7651 | 0.7153 | 0.8650 | 0.6901 | 9.524793e-09 | 3120 |
| 0.7586 | 0.7200 | 0.8650 | 0.6901 | 9.524496e-09 | 3121 |
| 0.7664 | 0.7153 | 0.8649 | 0.6901 | 9.524198e-09 | 3122 |
| 0.7564 | 0.7271 | 0.8649 | 0.6901 | 9.523901e-09 | 3123 |
| 0.7605 | 0.7176 | 0.8649 | 0.6901 | 9.523603e-09 | 3124 |
| 0.7589 | 0.7153 | 0.8648 | 0.6901 | 9.523306e-09 | 3125 |
| 0.7548 | 0.7224 | 0.8648 | 0.6901 | 9.523008e-09 | 3126 |
| 0.7582 | 0.7200 | 0.8648 | 0.6901 | 9.522711e-09 | 3127 |
| 0.7606 | 0.7059 | 0.8647 | 0.6901 | 9.522413e-09 | 3128 |
| 0.7618 | 0.7176 | 0.8647 | 0.6901 | 9.5221155e-09 | 3129 |
| 0.7490 | 0.7129 | 0.8647 | 0.6901 | 9.521817e-09 | 3130 |
| 0.7546 | 0.7153 | 0.8647 | 0.6901 | 9.521519e-09 | 3131 |
| 0.7569 | 0.7153 | 0.8646 | 0.6901 | 9.52122e-09 | 3132 |
| 0.7540 | 0.7176 | 0.8646 | 0.6901 | 9.520922e-09 | 3133 |
| 0.7556 | 0.7294 | 0.8646 | 0.6901 | 9.520623e-09 | 3134 |
| 0.7699 | 0.7176 | 0.8645 | 0.6901 | 9.520325e-09 | 3135 |
| 0.7502 | 0.7129 | 0.8645 | 0.6901 | 9.5200265e-09 | 3136 |
| 0.7598 | 0.7059 | 0.8645 | 0.6901 | 9.519728e-09 | 3137 |
| 0.7561 | 0.7294 | 0.8645 | 0.6901 | 9.51943e-09 | 3138 |
| 0.7566 | 0.7224 | 0.8645 | 0.6901 | 9.519131e-09 | 3139 |
| 0.7527 | 0.7200 | 0.8645 | 0.6901 | 9.518832e-09 | 3140 |
| 0.7573 | 0.7224 | 0.8645 | 0.6901 | 9.518533e-09 | 3141 |
| 0.7517 | 0.7200 | 0.8645 | 0.6901 | 9.518233e-09 | 3142 |
| 0.7593 | 0.7129 | 0.8644 | 0.6901 | 9.517934e-09 | 3143 |
| 0.7490 | 0.7129 | 0.8644 | 0.6901 | 9.517635e-09 | 3144 |
| 0.7632 | 0.7059 | 0.8644 | 0.6901 | 9.517335e-09 | 3145 |
| 0.7581 | 0.7106 | 0.8644 | 0.6901 | 9.517036e-09 | 3146 |
| 0.7671 | 0.7247 | 0.8644 | 0.6901 | 9.516737e-09 | 3147 |
| 0.7558 | 0.7271 | 0.8643 | 0.6901 | 9.516437e-09 | 3148 |
| 0.7501 | 0.7271 | 0.8643 | 0.6901 | 9.516138e-09 | 3149 |
| 0.7495 | 0.7271 | 0.8643 | 0.6901 | 9.515839e-09 | 3150 |
| 0.7630 | 0.7224 | 0.8643 | 0.6901 | 9.515539e-09 | 3151 |
| 0.7656 | 0.7176 | 0.8643 | 0.6901 | 9.515238e-09 | 3152 |
| 0.7567 | 0.7294 | 0.8643 | 0.6901 | 9.514938e-09 | 3153 |
| 0.7590 | 0.7129 | 0.8642 | 0.6901 | 9.514638e-09 | 3154 |
| 0.7580 | 0.7129 | 0.8642 | 0.6901 | 9.514338e-09 | 3155 |
| 0.7716 | 0.7106 | 0.8642 | 0.6901 | 9.514038e-09 | 3156 |
| 0.7533 | 0.7129 | 0.8642 | 0.6901 | 9.513737e-09 | 3157 |
| 0.7673 | 0.7106 | 0.8641 | 0.6901 | 9.513437e-09 | 3158 |
| 0.7719 | 0.7271 | 0.8641 | 0.6901 | 9.513137e-09 | 3159 |
| 0.7575 | 0.7200 | 0.8641 | 0.6901 | 9.512837e-09 | 3160 |
| 0.7494 | 0.7247 | 0.8641 | 0.6901 | 9.512536e-09 | 3161 |
| 0.7381 | 0.7294 | 0.8640 | 0.6901 | 9.5122346e-09 | 3162 |
| 0.7500 | 0.7153 | 0.8640 | 0.6901 | 9.5119335e-09 | 3163 |
| 0.7559 | 0.7153 | 0.8640 | 0.6901 | 9.511632e-09 | 3164 |
| 0.7520 | 0.7176 | 0.8640 | 0.6901 | 9.511331e-09 | 3165 |
| 0.7466 | 0.7318 | 0.8640 | 0.6901 | 9.51103e-09 | 3166 |
| 0.7626 | 0.7129 | 0.8639 | 0.6901 | 9.510729e-09 | 3167 |
| 0.7532 | 0.7271 | 0.8639 | 0.6901 | 9.510428e-09 | 3168 |
| 0.7467 | 0.7247 | 0.8639 | 0.6901 | 9.510127e-09 | 3169 |
| 0.7643 | 0.7247 | 0.8639 | 0.6901 | 9.509826e-09 | 3170 |
| 0.7380 | 0.7200 | 0.8639 | 0.6901 | 9.509524e-09 | 3171 |
| 0.7493 | 0.7153 | 0.8638 | 0.6901 | 9.509222e-09 | 3172 |
| 0.7448 | 0.7365 | 0.8638 | 0.6901 | 9.50892e-09 | 3173 |
| 0.7650 | 0.7153 | 0.8638 | 0.6901 | 9.508618e-09 | 3174 |
| 0.7608 | 0.7294 | 0.8638 | 0.6901 | 9.508316e-09 | 3175 |
| 0.7564 | 0.7200 | 0.8637 | 0.6901 | 9.508014e-09 | 3176 |
| 0.7412 | 0.7200 | 0.8637 | 0.6901 | 9.507712e-09 | 3177 |
| 0.7521 | 0.7129 | 0.8637 | 0.6901 | 9.50741e-09 | 3178 |
| 0.7546 | 0.7224 | 0.8637 | 0.6901 | 9.507108e-09 | 3179 |
| 0.7422 | 0.7341 | 0.8636 | 0.6901 | 9.506806e-09 | 3180 |
| 0.7490 | 0.7247 | 0.8636 | 0.6901 | 9.506504e-09 | 3181 |
| 0.7491 | 0.7247 | 0.8636 | 0.6901 | 9.506201e-09 | 3182 |
| 0.7565 | 0.7200 | 0.8636 | 0.6901 | 9.505898e-09 | 3183 |
| 0.7520 | 0.7176 | 0.8636 | 0.6901 | 9.505595e-09 | 3184 |
| 0.7520 | 0.7271 | 0.8636 | 0.6901 | 9.5052926e-09 | 3185 |
| 0.7563 | 0.7200 | 0.8636 | 0.6901 | 9.50499e-09 | 3186 |
| 0.7541 | 0.7224 | 0.8636 | 0.6901 | 9.504687e-09 | 3187 |
| 0.7514 | 0.7271 | 0.8635 | 0.6901 | 9.504384e-09 | 3188 |
| 0.7668 | 0.7247 | 0.8635 | 0.6901 | 9.504081e-09 | 3189 |
| 0.7644 | 0.7153 | 0.8635 | 0.6901 | 9.503778e-09 | 3190 |
| 0.7516 | 0.7247 | 0.8635 | 0.6901 | 9.503475e-09 | 3191 |
| 0.7437 | 0.7224 | 0.8635 | 0.6901 | 9.503172e-09 | 3192 |
| 0.7466 | 0.7176 | 0.8634 | 0.6901 | 9.502868e-09 | 3193 |
| 0.7449 | 0.7200 | 0.8634 | 0.6901 | 9.502564e-09 | 3194 |
| 0.7651 | 0.7200 | 0.8634 | 0.6901 | 9.50226e-09 | 3195 |
| 0.7534 | 0.7294 | 0.8634 | 0.6901 | 9.5019566e-09 | 3196 |
| 0.7543 | 0.7271 | 0.8633 | 0.6901 | 9.501653e-09 | 3197 |
| 0.7402 | 0.7153 | 0.8633 | 0.6901 | 9.501349e-09 | 3198 |
| 0.7460 | 0.7247 | 0.8633 | 0.6901 | 9.501045e-09 | 3199 |
| 0.7556 | 0.7318 | 0.8633 | 0.6901 | 9.5007415e-09 | 3200 |
| 0.7634 | 0.7176 | 0.8633 | 0.6901 | 9.500438e-09 | 3201 |
| 0.7461 | 0.7129 | 0.8633 | 0.6901 | 9.500134e-09 | 3202 |
| 0.7427 | 0.7153 | 0.8633 | 0.6901 | 9.499829e-09 | 3203 |
| 0.7520 | 0.7200 | 0.8633 | 0.6901 | 9.499525e-09 | 3204 |
| 0.7490 | 0.7153 | 0.8632 | 0.6901 | 9.49922e-09 | 3205 |
| 0.7557 | 0.7153 | 0.8632 | 0.6901 | 9.498915e-09 | 3206 |
| 0.7546 | 0.7200 | 0.8632 | 0.6901 | 9.498611e-09 | 3207 |
| 0.7515 | 0.7271 | 0.8632 | 0.6901 | 9.498306e-09 | 3208 |
| 0.7639 | 0.7059 | 0.8632 | 0.6901 | 9.4980015e-09 | 3209 |
| 0.7392 | 0.7247 | 0.8632 | 0.6901 | 9.497697e-09 | 3210 |
| 0.7572 | 0.7200 | 0.8631 | 0.6901 | 9.497392e-09 | 3211 |
| 0.7545 | 0.7224 | 0.8631 | 0.6901 | 9.497088e-09 | 3212 |
| 0.7532 | 0.7176 | 0.8630 | 0.6901 | 9.496782e-09 | 3213 |
| 0.7649 | 0.7106 | 0.8630 | 0.6901 | 9.4964765e-09 | 3214 |
| 0.7417 | 0.7247 | 0.8630 | 0.6901 | 9.496171e-09 | 3215 |
| 0.7468 | 0.7247 | 0.8629 | 0.6901 | 9.495865e-09 | 3216 |
| 0.7484 | 0.7271 | 0.8629 | 0.6901 | 9.49556e-09 | 3217 |
| 0.7569 | 0.7271 | 0.8629 | 0.6901 | 9.495254e-09 | 3218 |
| 0.7449 | 0.7247 | 0.8628 | 0.6901 | 9.494949e-09 | 3219 |
| 0.7485 | 0.7318 | 0.8628 | 0.6901 | 9.494643e-09 | 3220 |
| 0.7597 | 0.7129 | 0.8628 | 0.6901 | 9.494338e-09 | 3221 |
| 0.7535 | 0.7318 | 0.8627 | 0.6901 | 9.494032e-09 | 3222 |
| 0.7483 | 0.7318 | 0.8627 | 0.6901 | 9.493726e-09 | 3223 |
| 0.7485 | 0.7200 | 0.8627 | 0.6972 | 9.493419e-09 | 3224 |
| 0.7515 | 0.7341 | 0.8627 | 0.6972 | 9.493113e-09 | 3225 |
| 0.7603 | 0.7224 | 0.8627 | 0.6972 | 9.4928065e-09 | 3226 |
| 0.7789 | 0.7153 | 0.8627 | 0.6972 | 9.4925e-09 | 3227 |
| 0.7575 | 0.7106 | 0.8627 | 0.6972 | 9.492194e-09 | 3228 |
| 0.7471 | 0.7271 | 0.8627 | 0.6972 | 9.491887e-09 | 3229 |
| 0.7596 | 0.7271 | 0.8627 | 0.6972 | 9.491581e-09 | 3230 |
| 0.7425 | 0.7247 | 0.8626 | 0.6972 | 9.491274e-09 | 3231 |
| 0.7521 | 0.7271 | 0.8626 | 0.6972 | 9.490968e-09 | 3232 |
| 0.7478 | 0.7318 | 0.8627 | 0.6972 | 9.490662e-09 | 3233 |
| 0.7632 | 0.7200 | 0.8627 | 0.6972 | 9.490354e-09 | 3234 |
| 0.7540 | 0.7271 | 0.8627 | 0.6972 | 9.490047e-09 | 3235 |
| 0.7464 | 0.7271 | 0.8627 | 0.6972 | 9.48974e-09 | 3236 |
| 0.7383 | 0.7271 | 0.8627 | 0.6972 | 9.489432e-09 | 3237 |
| 0.7505 | 0.7224 | 0.8627 | 0.6972 | 9.489125e-09 | 3238 |
| 0.7634 | 0.7176 | 0.8626 | 0.6972 | 9.488818e-09 | 3239 |
| 0.7539 | 0.7271 | 0.8626 | 0.6972 | 9.48851e-09 | 3240 |
| 0.7620 | 0.7153 | 0.8626 | 0.6972 | 9.488203e-09 | 3241 |
| 0.7351 | 0.7318 | 0.8626 | 0.6972 | 9.487896e-09 | 3242 |
| 0.7525 | 0.7247 | 0.8626 | 0.6972 | 9.4875885e-09 | 3243 |
| 0.7522 | 0.7153 | 0.8626 | 0.6972 | 9.48728e-09 | 3244 |
| 0.7550 | 0.7129 | 0.8625 | 0.6972 | 9.486972e-09 | 3245 |
| 0.7571 | 0.7106 | 0.8625 | 0.6972 | 9.486664e-09 | 3246 |
| 0.7512 | 0.7271 | 0.8624 | 0.6972 | 9.486356e-09 | 3247 |
| 0.7510 | 0.7529 | 0.8624 | 0.6972 | 9.4860475e-09 | 3248 |
| 0.7583 | 0.7200 | 0.8624 | 0.6972 | 9.485739e-09 | 3249 |
| 0.7687 | 0.7082 | 0.8623 | 0.6972 | 9.485431e-09 | 3250 |
| 0.7576 | 0.7176 | 0.8623 | 0.6972 | 9.485123e-09 | 3251 |
| 0.7486 | 0.7176 | 0.8623 | 0.6972 | 9.484815e-09 | 3252 |
| 0.7419 | 0.7271 | 0.8623 | 0.6972 | 9.4845065e-09 | 3253 |
| 0.7462 | 0.7318 | 0.8623 | 0.6972 | 9.484198e-09 | 3254 |
| 0.7317 | 0.7247 | 0.8623 | 0.6972 | 9.483889e-09 | 3255 |
| 0.7506 | 0.7247 | 0.8623 | 0.6972 | 9.48358e-09 | 3256 |
| 0.7566 | 0.7224 | 0.8622 | 0.6972 | 9.483271e-09 | 3257 |
| 0.7543 | 0.7341 | 0.8622 | 0.6972 | 9.482962e-09 | 3258 |
| 0.7489 | 0.7271 | 0.8622 | 0.6972 | 9.482653e-09 | 3259 |
| 0.7525 | 0.7176 | 0.8622 | 0.6972 | 9.482344e-09 | 3260 |
| 0.7578 | 0.7176 | 0.8622 | 0.6972 | 9.482035e-09 | 3261 |
| 0.7478 | 0.7271 | 0.8622 | 0.6972 | 9.481726e-09 | 3262 |
| 0.7639 | 0.7129 | 0.8622 | 0.6972 | 9.4814165e-09 | 3263 |
| 0.7505 | 0.7341 | 0.8622 | 0.6972 | 9.4811075e-09 | 3264 |
| 0.7571 | 0.7294 | 0.8621 | 0.6972 | 9.4807975e-09 | 3265 |
| 0.7577 | 0.7294 | 0.8621 | 0.6972 | 9.4804875e-09 | 3266 |
| 0.7393 | 0.7318 | 0.8621 | 0.6972 | 9.4801775e-09 | 3267 |
| 0.7410 | 0.7318 | 0.8620 | 0.6972 | 9.479868e-09 | 3268 |
| 0.7547 | 0.7224 | 0.8620 | 0.6972 | 9.479558e-09 | 3269 |
| 0.7366 | 0.7271 | 0.8620 | 0.6972 | 9.479248e-09 | 3270 |
| 0.7483 | 0.7318 | 0.8620 | 0.6972 | 9.478938e-09 | 3271 |
| 0.7481 | 0.7294 | 0.8619 | 0.6972 | 9.478628e-09 | 3272 |
| 0.7477 | 0.7341 | 0.8619 | 0.6972 | 9.478318e-09 | 3273 |
| 0.7451 | 0.7294 | 0.8619 | 0.6972 | 9.478008e-09 | 3274 |
| 0.7561 | 0.7176 | 0.8619 | 0.6972 | 9.477698e-09 | 3275 |
| 0.7402 | 0.7200 | 0.8619 | 0.6972 | 9.477387e-09 | 3276 |
| 0.7416 | 0.7247 | 0.8619 | 0.6972 | 9.477076e-09 | 3277 |
| 0.7431 | 0.7224 | 0.8618 | 0.6972 | 9.476765e-09 | 3278 |
| 0.7595 | 0.7224 | 0.8618 | 0.6972 | 9.476454e-09 | 3279 |
| 0.7462 | 0.7153 | 0.8618 | 0.6972 | 9.476143e-09 | 3280 |
| 0.7665 | 0.7035 | 0.8618 | 0.6972 | 9.475833e-09 | 3281 |
| 0.7546 | 0.7176 | 0.8618 | 0.6972 | 9.475522e-09 | 3282 |
| 0.7484 | 0.7271 | 0.8618 | 0.6972 | 9.475211e-09 | 3283 |
| 0.7481 | 0.7200 | 0.8617 | 0.6972 | 9.4749e-09 | 3284 |
| 0.7506 | 0.7247 | 0.8617 | 0.6972 | 9.474589e-09 | 3285 |
| 0.7371 | 0.7341 | 0.8617 | 0.6972 | 9.474277e-09 | 3286 |
| 0.7695 | 0.7294 | 0.8617 | 0.6972 | 9.473966e-09 | 3287 |
| 0.7423 | 0.7318 | 0.8616 | 0.6972 | 9.473654e-09 | 3288 |
| 0.7481 | 0.7271 | 0.8616 | 0.6972 | 9.473342e-09 | 3289 |
| 0.7532 | 0.7224 | 0.8616 | 0.6972 | 9.47303e-09 | 3290 |
| 0.7444 | 0.7318 | 0.8616 | 0.6972 | 9.472719e-09 | 3291 |
| 0.7345 | 0.7200 | 0.8616 | 0.6972 | 9.472407e-09 | 3292 |
| 0.7493 | 0.7176 | 0.8616 | 0.6972 | 9.472095e-09 | 3293 |
| 0.7500 | 0.7153 | 0.8616 | 0.6972 | 9.471783e-09 | 3294 |
| 0.7493 | 0.7247 | 0.8616 | 0.6972 | 9.471472e-09 | 3295 |
| 0.7414 | 0.7294 | 0.8616 | 0.6972 | 9.47116e-09 | 3296 |
| 0.7539 | 0.7341 | 0.8615 | 0.6972 | 9.470847e-09 | 3297 |
| 0.7466 | 0.7271 | 0.8615 | 0.6972 | 9.470535e-09 | 3298 |
| 0.7485 | 0.7271 | 0.8615 | 0.6972 | 9.470222e-09 | 3299 |
| 0.7465 | 0.7271 | 0.8615 | 0.6972 | 9.469909e-09 | 3300 |
| 0.7346 | 0.7200 | 0.8615 | 0.6972 | 9.469597e-09 | 3301 |
| 0.7501 | 0.7153 | 0.8615 | 0.6972 | 9.469284e-09 | 3302 |
| 0.7416 | 0.7224 | 0.8614 | 0.6972 | 9.468971e-09 | 3303 |
| 0.7409 | 0.7247 | 0.8614 | 0.6972 | 9.468659e-09 | 3304 |
| 0.7476 | 0.7176 | 0.8614 | 0.6972 | 9.468346e-09 | 3305 |
| 0.7448 | 0.7388 | 0.8613 | 0.6972 | 9.4680335e-09 | 3306 |
| 0.7501 | 0.7224 | 0.8613 | 0.6972 | 9.46772e-09 | 3307 |
| 0.7433 | 0.7271 | 0.8613 | 0.6972 | 9.467406e-09 | 3308 |
| 0.7556 | 0.7176 | 0.8613 | 0.6972 | 9.467093e-09 | 3309 |
| 0.7489 | 0.7247 | 0.8613 | 0.6972 | 9.466779e-09 | 3310 |
| 0.7556 | 0.7247 | 0.8613 | 0.6972 | 9.466466e-09 | 3311 |
| 0.7318 | 0.7224 | 0.8613 | 0.6972 | 9.466152e-09 | 3312 |
| 0.7355 | 0.7200 | 0.8612 | 0.6972 | 9.465839e-09 | 3313 |
| 0.7454 | 0.7153 | 0.8612 | 0.6972 | 9.465525e-09 | 3314 |
| 0.7524 | 0.7318 | 0.8612 | 0.6972 | 9.465212e-09 | 3315 |
| 0.7578 | 0.7247 | 0.8612 | 0.6972 | 9.464898e-09 | 3316 |
| 0.7520 | 0.7247 | 0.8612 | 0.6972 | 9.464585e-09 | 3317 |
| 0.7461 | 0.7341 | 0.8612 | 0.6972 | 9.46427e-09 | 3318 |
| 0.7396 | 0.7271 | 0.8612 | 0.6972 | 9.463956e-09 | 3319 |
| 0.7502 | 0.7153 | 0.8612 | 0.6972 | 9.463641e-09 | 3320 |
| 0.7425 | 0.7153 | 0.8612 | 0.6972 | 9.463327e-09 | 3321 |
| 0.7533 | 0.7247 | 0.8611 | 0.6972 | 9.463013e-09 | 3322 |
| 0.7499 | 0.7224 | 0.8611 | 0.6972 | 9.462698e-09 | 3323 |
| 0.7496 | 0.7176 | 0.8611 | 0.6972 | 9.462384e-09 | 3324 |
| 0.7557 | 0.7200 | 0.8610 | 0.6972 | 9.462069e-09 | 3325 |
| 0.7399 | 0.7318 | 0.8610 | 0.6972 | 9.461755e-09 | 3326 |
| 0.7486 | 0.7271 | 0.8610 | 0.6972 | 9.4614405e-09 | 3327 |
| 0.7412 | 0.7294 | 0.8610 | 0.6972 | 9.461125e-09 | 3328 |
| 0.7515 | 0.7224 | 0.8610 | 0.6972 | 9.46081e-09 | 3329 |
| 0.7393 | 0.7271 | 0.8610 | 0.6972 | 9.460495e-09 | 3330 |
| 0.7358 | 0.7271 | 0.8610 | 0.6972 | 9.460179e-09 | 3331 |
| 0.7504 | 0.7224 | 0.8609 | 0.6972 | 9.459864e-09 | 3332 |
| 0.7591 | 0.7224 | 0.8609 | 0.6972 | 9.459549e-09 | 3333 |
| 0.7581 | 0.7247 | 0.8609 | 0.6972 | 9.459233e-09 | 3334 |
| 0.7493 | 0.7247 | 0.8609 | 0.6972 | 9.458918e-09 | 3335 |
| 0.7425 | 0.7200 | 0.8609 | 0.6972 | 9.458603e-09 | 3336 |
| 0.7395 | 0.7341 | 0.8609 | 0.6972 | 9.4582875e-09 | 3337 |
| 0.7438 | 0.7247 | 0.8609 | 0.6972 | 9.457972e-09 | 3338 |
| 0.7536 | 0.7176 | 0.8608 | 0.6972 | 9.457656e-09 | 3339 |
| 0.7541 | 0.7271 | 0.8608 | 0.6972 | 9.45734e-09 | 3340 |
| 0.7337 | 0.7271 | 0.8608 | 0.6972 | 9.457024e-09 | 3341 |
| 0.7509 | 0.7247 | 0.8608 | 0.6972 | 9.456707e-09 | 3342 |
| 0.7444 | 0.7294 | 0.8608 | 0.6972 | 9.456391e-09 | 3343 |
| 0.7490 | 0.7318 | 0.8607 | 0.6972 | 9.456075e-09 | 3344 |
| 0.7485 | 0.7247 | 0.8607 | 0.6972 | 9.455759e-09 | 3345 |
| 0.7465 | 0.7224 | 0.8607 | 0.6972 | 9.455443e-09 | 3346 |
| 0.7547 | 0.7200 | 0.8607 | 0.6972 | 9.4551265e-09 | 3347 |
| 0.7423 | 0.7224 | 0.8607 | 0.6972 | 9.45481e-09 | 3348 |
| 0.7570 | 0.7247 | 0.8607 | 0.6972 | 9.454494e-09 | 3349 |
| 0.7409 | 0.7294 | 0.8607 | 0.6972 | 9.454177e-09 | 3350 |
| 0.7358 | 0.7271 | 0.8607 | 0.6972 | 9.45386e-09 | 3351 |
| 0.7589 | 0.7176 | 0.8607 | 0.6972 | 9.453543e-09 | 3352 |
| 0.7383 | 0.7412 | 0.8607 | 0.6972 | 9.453226e-09 | 3353 |
| 0.7435 | 0.7318 | 0.8607 | 0.6972 | 9.452909e-09 | 3354 |
| 0.7439 | 0.7176 | 0.8607 | 0.6972 | 9.452592e-09 | 3355 |
| 0.7554 | 0.7294 | 0.8606 | 0.6972 | 9.4522745e-09 | 3356 |
| 0.7432 | 0.7247 | 0.8606 | 0.6972 | 9.451957e-09 | 3357 |
| 0.7455 | 0.7247 | 0.8606 | 0.6972 | 9.45164e-09 | 3358 |
| 0.7529 | 0.7247 | 0.8606 | 0.6972 | 9.451323e-09 | 3359 |
| 0.7320 | 0.7435 | 0.8606 | 0.6972 | 9.451005e-09 | 3360 |
| 0.7502 | 0.7365 | 0.8606 | 0.6972 | 9.450687e-09 | 3361 |
| 0.7351 | 0.7200 | 0.8606 | 0.6972 | 9.450369e-09 | 3362 |
| 0.7364 | 0.7294 | 0.8605 | 0.6972 | 9.450051e-09 | 3363 |
| 0.7435 | 0.7271 | 0.8605 | 0.6972 | 9.449733e-09 | 3364 |
| 0.7476 | 0.7271 | 0.8605 | 0.6972 | 9.4494155e-09 | 3365 |
| 0.7303 | 0.7271 | 0.8605 | 0.6972 | 9.4490975e-09 | 3366 |
| 0.7583 | 0.7176 | 0.8605 | 0.6972 | 9.4487795e-09 | 3367 |
| 0.7478 | 0.7294 | 0.8605 | 0.6972 | 9.448462e-09 | 3368 |
| 0.7432 | 0.7435 | 0.8604 | 0.6972 | 9.448144e-09 | 3369 |
| 0.7497 | 0.7200 | 0.8604 | 0.6972 | 9.447826e-09 | 3370 |
| 0.7267 | 0.7318 | 0.8604 | 0.6901 | 9.447507e-09 | 3371 |
| 0.7474 | 0.7200 | 0.8604 | 0.6901 | 9.447188e-09 | 3372 |
| 0.7312 | 0.7365 | 0.8603 | 0.6972 | 9.446869e-09 | 3373 |
| 0.7554 | 0.7200 | 0.8603 | 0.6972 | 9.44655e-09 | 3374 |
| 0.7435 | 0.7200 | 0.8603 | 0.6901 | 9.446231e-09 | 3375 |
| 0.7386 | 0.7365 | 0.8603 | 0.6972 | 9.4459125e-09 | 3376 |
| 0.7295 | 0.7318 | 0.8603 | 0.6972 | 9.445594e-09 | 3377 |
| 0.7497 | 0.7247 | 0.8602 | 0.6901 | 9.445275e-09 | 3378 |
| 0.7416 | 0.7365 | 0.8602 | 0.6901 | 9.444956e-09 | 3379 |
| 0.7428 | 0.7388 | 0.8601 | 0.6901 | 9.444637e-09 | 3380 |
| 0.7438 | 0.7224 | 0.8601 | 0.6901 | 9.444317e-09 | 3381 |
| 0.7433 | 0.7247 | 0.8601 | 0.6901 | 9.443998e-09 | 3382 |
| 0.7345 | 0.7271 | 0.8601 | 0.6901 | 9.443678e-09 | 3383 |
| 0.7472 | 0.7271 | 0.8601 | 0.6901 | 9.443358e-09 | 3384 |
| 0.7376 | 0.7271 | 0.8600 | 0.6972 | 9.443038e-09 | 3385 |
| 0.7397 | 0.7365 | 0.8600 | 0.6901 | 9.442719e-09 | 3386 |
| 0.7419 | 0.7412 | 0.8600 | 0.6901 | 9.442399e-09 | 3387 |
| 0.7337 | 0.7294 | 0.8599 | 0.6901 | 9.442079e-09 | 3388 |
| 0.7577 | 0.7200 | 0.8599 | 0.6972 | 9.441759e-09 | 3389 |
| 0.7311 | 0.7294 | 0.8599 | 0.6972 | 9.44144e-09 | 3390 |
| 0.7459 | 0.7271 | 0.8599 | 0.6972 | 9.44112e-09 | 3391 |
| 0.7493 | 0.7247 | 0.8599 | 0.6972 | 9.440799e-09 | 3392 |
| 0.7384 | 0.7294 | 0.8599 | 0.6972 | 9.440479e-09 | 3393 |
| 0.7323 | 0.7365 | 0.8599 | 0.6972 | 9.440158e-09 | 3394 |
| 0.7403 | 0.7294 | 0.8599 | 0.6972 | 9.439837e-09 | 3395 |
| 0.7485 | 0.7247 | 0.8599 | 0.6972 | 9.439517e-09 | 3396 |
| 0.7364 | 0.7388 | 0.8598 | 0.6972 | 9.439196e-09 | 3397 |
| 0.7379 | 0.7247 | 0.8598 | 0.6972 | 9.4388755e-09 | 3398 |
| 0.7379 | 0.7271 | 0.8598 | 0.6972 | 9.438555e-09 | 3399 |
| 0.7363 | 0.7271 | 0.8598 | 0.6972 | 9.438234e-09 | 3400 |
| 0.7315 | 0.7341 | 0.8598 | 0.6972 | 9.437914e-09 | 3401 |
| 0.7330 | 0.7224 | 0.8597 | 0.6972 | 9.437593e-09 | 3402 |
| 0.7467 | 0.7200 | 0.8597 | 0.6972 | 9.437271e-09 | 3403 |
| 0.7480 | 0.7318 | 0.8597 | 0.6972 | 9.43695e-09 | 3404 |
| 0.7453 | 0.7200 | 0.8597 | 0.6972 | 9.436628e-09 | 3405 |
| 0.7438 | 0.7271 | 0.8597 | 0.6972 | 9.436307e-09 | 3406 |
| 0.7369 | 0.7247 | 0.8597 | 0.6972 | 9.435985e-09 | 3407 |
| 0.7405 | 0.7341 | 0.8597 | 0.6972 | 9.435664e-09 | 3408 |
| 0.7355 | 0.7318 | 0.8596 | 0.6972 | 9.435342e-09 | 3409 |
| 0.7451 | 0.7224 | 0.8596 | 0.6972 | 9.435021e-09 | 3410 |
| 0.7322 | 0.7294 | 0.8596 | 0.6972 | 9.434699e-09 | 3411 |
| 0.7433 | 0.7247 | 0.8596 | 0.6972 | 9.434378e-09 | 3412 |
| 0.7467 | 0.7294 | 0.8596 | 0.6972 | 9.434055e-09 | 3413 |
| 0.7226 | 0.7435 | 0.8596 | 0.6972 | 9.433733e-09 | 3414 |
| 0.7371 | 0.7341 | 0.8596 | 0.6972 | 9.4334105e-09 | 3415 |
| 0.7319 | 0.7529 | 0.8596 | 0.6972 | 9.433088e-09 | 3416 |
| 0.7335 | 0.7294 | 0.8595 | 0.6972 | 9.432766e-09 | 3417 |
| 0.7495 | 0.7247 | 0.8595 | 0.6972 | 9.432443e-09 | 3418 |
| 0.7498 | 0.7224 | 0.8595 | 0.6972 | 9.432121e-09 | 3419 |
| 0.7305 | 0.7365 | 0.8595 | 0.6972 | 9.4317985e-09 | 3420 |
| 0.7463 | 0.7318 | 0.8595 | 0.6972 | 9.431476e-09 | 3421 |
| 0.7442 | 0.7271 | 0.8595 | 0.6972 | 9.431154e-09 | 3422 |
| 0.7390 | 0.7318 | 0.8595 | 0.6972 | 9.430831e-09 | 3423 |
| 0.7415 | 0.7459 | 0.8595 | 0.6972 | 9.430508e-09 | 3424 |
| 0.7363 | 0.7294 | 0.8595 | 0.6972 | 9.430185e-09 | 3425 |
| 0.7349 | 0.7412 | 0.8595 | 0.6972 | 9.429861e-09 | 3426 |
| 0.7328 | 0.7271 | 0.8595 | 0.6972 | 9.429538e-09 | 3427 |
| 0.7322 | 0.7318 | 0.8595 | 0.6972 | 9.429215e-09 | 3428 |
| 0.7259 | 0.7388 | 0.8595 | 0.6972 | 9.4288914e-09 | 3429 |
| 0.7321 | 0.7318 | 0.8595 | 0.6972 | 9.428568e-09 | 3430 |
| 0.7432 | 0.7318 | 0.8594 | 0.6972 | 9.428245e-09 | 3431 |
| 0.7400 | 0.7294 | 0.8594 | 0.6972 | 9.4279216e-09 | 3432 |
| 0.7244 | 0.7271 | 0.8594 | 0.6972 | 9.427598e-09 | 3433 |
| 0.7390 | 0.7271 | 0.8594 | 0.6972 | 9.427275e-09 | 3434 |
| 0.7405 | 0.7318 | 0.8594 | 0.6972 | 9.426951e-09 | 3435 |
| 0.7554 | 0.7153 | 0.8593 | 0.6972 | 9.426627e-09 | 3436 |
| 0.7428 | 0.7224 | 0.8593 | 0.6972 | 9.426302e-09 | 3437 |
| 0.7363 | 0.7341 | 0.8593 | 0.6972 | 9.425978e-09 | 3438 |
| 0.7286 | 0.7318 | 0.8593 | 0.6972 | 9.425654e-09 | 3439 |
| 0.7453 | 0.7365 | 0.8593 | 0.6972 | 9.42533e-09 | 3440 |
| 0.7266 | 0.7388 | 0.8592 | 0.6972 | 9.425006e-09 | 3441 |
| 0.7353 | 0.7247 | 0.8592 | 0.6972 | 9.4246815e-09 | 3442 |
| 0.7413 | 0.7271 | 0.8592 | 0.6972 | 9.424357e-09 | 3443 |
| 0.7378 | 0.7365 | 0.8591 | 0.6972 | 9.424033e-09 | 3444 |
| 0.7441 | 0.7341 | 0.8591 | 0.6972 | 9.423708e-09 | 3445 |
| 0.7490 | 0.7200 | 0.8591 | 0.6972 | 9.423383e-09 | 3446 |
| 0.7364 | 0.7294 | 0.8591 | 0.6972 | 9.423058e-09 | 3447 |
| 0.7481 | 0.7224 | 0.8591 | 0.6972 | 9.422733e-09 | 3448 |
| 0.7360 | 0.7200 | 0.8591 | 0.6972 | 9.422408e-09 | 3449 |
| 0.7357 | 0.7318 | 0.8591 | 0.6972 | 9.422083e-09 | 3450 |
| 0.7460 | 0.7129 | 0.8590 | 0.6972 | 9.421758e-09 | 3451 |
| 0.7513 | 0.7153 | 0.8590 | 0.6972 | 9.4214325e-09 | 3452 |
| 0.7504 | 0.7388 | 0.8590 | 0.6972 | 9.4211074e-09 | 3453 |
| 0.7534 | 0.7176 | 0.8589 | 0.6972 | 9.420782e-09 | 3454 |
| 0.7389 | 0.7271 | 0.8590 | 0.6972 | 9.420457e-09 | 3455 |
| 0.7312 | 0.7365 | 0.8589 | 0.6972 | 9.420131e-09 | 3456 |
| 0.7387 | 0.7224 | 0.8589 | 0.6972 | 9.419805e-09 | 3457 |
| 0.7414 | 0.7435 | 0.8589 | 0.6972 | 9.419479e-09 | 3458 |
| 0.7433 | 0.7365 | 0.8589 | 0.6972 | 9.4191535e-09 | 3459 |
| 0.7370 | 0.7482 | 0.8589 | 0.6972 | 9.4188275e-09 | 3460 |
| 0.7235 | 0.7388 | 0.8589 | 0.6972 | 9.4185015e-09 | 3461 |
| 0.7342 | 0.7341 | 0.8589 | 0.6972 | 9.418176e-09 | 3462 |
| 0.7451 | 0.7412 | 0.8588 | 0.6972 | 9.41785e-09 | 3463 |
| 0.7400 | 0.7294 | 0.8588 | 0.6972 | 9.417524e-09 | 3464 |
| 0.7439 | 0.7341 | 0.8588 | 0.6972 | 9.417198e-09 | 3465 |
| 0.7309 | 0.7482 | 0.8588 | 0.6972 | 9.416872e-09 | 3466 |
| 0.7416 | 0.7294 | 0.8588 | 0.6972 | 9.416545e-09 | 3467 |
| 0.7470 | 0.7224 | 0.8588 | 0.6972 | 9.416218e-09 | 3468 |
| 0.7338 | 0.7294 | 0.8588 | 0.6972 | 9.415891e-09 | 3469 |
| 0.7379 | 0.7271 | 0.8588 | 0.6972 | 9.415564e-09 | 3470 |
| 0.7436 | 0.7247 | 0.8589 | 0.6972 | 9.4152375e-09 | 3471 |
| 0.7319 | 0.7341 | 0.8589 | 0.6972 | 9.414911e-09 | 3472 |
| 0.7434 | 0.7271 | 0.8589 | 0.6972 | 9.414584e-09 | 3473 |
| 0.7366 | 0.7365 | 0.8589 | 0.6972 | 9.414257e-09 | 3474 |
| 0.7384 | 0.7388 | 0.8588 | 0.6972 | 9.41393e-09 | 3475 |
| 0.7412 | 0.7435 | 0.8588 | 0.6972 | 9.413603e-09 | 3476 |
| 0.7265 | 0.7459 | 0.8588 | 0.6972 | 9.4132755e-09 | 3477 |
| 0.7303 | 0.7318 | 0.8588 | 0.6972 | 9.412948e-09 | 3478 |
| 0.7353 | 0.7294 | 0.8588 | 0.6972 | 9.41262e-09 | 3479 |
| 0.7275 | 0.7318 | 0.8588 | 0.6972 | 9.412292e-09 | 3480 |
| 0.7356 | 0.7365 | 0.8588 | 0.6972 | 9.4119645e-09 | 3481 |
| 0.7319 | 0.7294 | 0.8588 | 0.6972 | 9.411637e-09 | 3482 |
| 0.7394 | 0.7318 | 0.8588 | 0.6972 | 9.411309e-09 | 3483 |
| 0.7351 | 0.7318 | 0.8588 | 0.6972 | 9.410981e-09 | 3484 |
| 0.7322 | 0.7224 | 0.8587 | 0.6972 | 9.410654e-09 | 3485 |
| 0.7333 | 0.7365 | 0.8587 | 0.6972 | 9.410326e-09 | 3486 |
| 0.7345 | 0.7341 | 0.8587 | 0.6972 | 9.409998e-09 | 3487 |
| 0.7346 | 0.7271 | 0.8587 | 0.6972 | 9.4096695e-09 | 3488 |
| 0.7321 | 0.7365 | 0.8586 | 0.6972 | 9.409341e-09 | 3489 |
| 0.7463 | 0.7365 | 0.8586 | 0.6972 | 9.409012e-09 | 3490 |
| 0.7404 | 0.7341 | 0.8586 | 0.6972 | 9.408684e-09 | 3491 |
| 0.7339 | 0.7341 | 0.8587 | 0.6972 | 9.408355e-09 | 3492 |
| 0.7409 | 0.7200 | 0.8586 | 0.6972 | 9.408026e-09 | 3493 |
| 0.7170 | 0.7318 | 0.8586 | 0.6972 | 9.407698e-09 | 3494 |
| 0.7418 | 0.7294 | 0.8586 | 0.7042 | 9.407369e-09 | 3495 |
| 0.7445 | 0.7318 | 0.8586 | 0.6972 | 9.4070405e-09 | 3496 |
| 0.7396 | 0.7294 | 0.8586 | 0.6972 | 9.406712e-09 | 3497 |
| 0.7311 | 0.7435 | 0.8586 | 0.6972 | 9.406383e-09 | 3498 |
| 0.7364 | 0.7294 | 0.8586 | 0.6972 | 9.406054e-09 | 3499 |
| 0.7418 | 0.7247 | 0.8585 | 0.6972 | 9.405724e-09 | 3500 |
| 0.7284 | 0.7341 | 0.8586 | 0.7042 | 9.405395e-09 | 3501 |
| 0.7295 | 0.7388 | 0.8586 | 0.7042 | 9.405065e-09 | 3502 |
| 0.7303 | 0.7318 | 0.8586 | 0.7042 | 9.404736e-09 | 3503 |
| 0.7353 | 0.7224 | 0.8586 | 0.6972 | 9.404406e-09 | 3504 |
| 0.7396 | 0.7435 | 0.8586 | 0.6972 | 9.404077e-09 | 3505 |
| 0.7321 | 0.7412 | 0.8586 | 0.6972 | 9.403747e-09 | 3506 |
| 0.7440 | 0.7176 | 0.8585 | 0.6972 | 9.403418e-09 | 3507 |
| 0.7309 | 0.7388 | 0.8585 | 0.6972 | 9.403088e-09 | 3508 |
| 0.7259 | 0.7365 | 0.8585 | 0.6972 | 9.402759e-09 | 3509 |
| 0.7419 | 0.7318 | 0.8585 | 0.6972 | 9.402428e-09 | 3510 |
| 0.7259 | 0.7318 | 0.8584 | 0.6972 | 9.402098e-09 | 3511 |
| 0.7341 | 0.7318 | 0.8584 | 0.6972 | 9.401767e-09 | 3512 |
| 0.7255 | 0.7412 | 0.8584 | 0.6972 | 9.401437e-09 | 3513 |
| 0.7341 | 0.7341 | 0.8584 | 0.6972 | 9.401107e-09 | 3514 |
| 0.7433 | 0.7341 | 0.8584 | 0.6972 | 9.400776e-09 | 3515 |
| 0.7324 | 0.7388 | 0.8584 | 0.6972 | 9.400446e-09 | 3516 |
| 0.7350 | 0.7529 | 0.8584 | 0.6972 | 9.400115e-09 | 3517 |
| 0.7296 | 0.7459 | 0.8583 | 0.6972 | 9.399785e-09 | 3518 |
| 0.7305 | 0.7271 | 0.8583 | 0.6972 | 9.3994545e-09 | 3519 |
| 0.7358 | 0.7459 | 0.8583 | 0.6972 | 9.399123e-09 | 3520 |
| 0.7515 | 0.7435 | 0.8583 | 0.6972 | 9.398792e-09 | 3521 |
| 0.7384 | 0.7176 | 0.8583 | 0.6972 | 9.398461e-09 | 3522 |
| 0.7371 | 0.7294 | 0.8583 | 0.6972 | 9.398129e-09 | 3523 |
| 0.7389 | 0.7412 | 0.8583 | 0.6972 | 9.397798e-09 | 3524 |
| 0.7512 | 0.7271 | 0.8583 | 0.6972 | 9.397467e-09 | 3525 |
| 0.7340 | 0.7318 | 0.8583 | 0.6972 | 9.3971355e-09 | 3526 |
| 0.7319 | 0.7318 | 0.8582 | 0.6972 | 9.396804e-09 | 3527 |
| 0.7315 | 0.7294 | 0.8582 | 0.6972 | 9.396473e-09 | 3528 |
| 0.7372 | 0.7341 | 0.8582 | 0.6972 | 9.396142e-09 | 3529 |
| 0.7382 | 0.7341 | 0.8582 | 0.6972 | 9.39581e-09 | 3530 |
| 0.7485 | 0.7247 | 0.8582 | 0.6972 | 9.395478e-09 | 3531 |
| 0.7426 | 0.7294 | 0.8582 | 0.6972 | 9.395146e-09 | 3532 |
| 0.7343 | 0.7388 | 0.8581 | 0.6972 | 9.394814e-09 | 3533 |
| 0.7448 | 0.7153 | 0.8582 | 0.6972 | 9.394482e-09 | 3534 |
| 0.7330 | 0.7318 | 0.8581 | 0.6972 | 9.3941495e-09 | 3535 |
| 0.7251 | 0.7341 | 0.8581 | 0.6972 | 9.393817e-09 | 3536 |
| 0.7368 | 0.7435 | 0.8581 | 0.6972 | 9.393485e-09 | 3537 |
| 0.7368 | 0.7294 | 0.8581 | 0.6972 | 9.393153e-09 | 3538 |
| 0.7294 | 0.7412 | 0.8581 | 0.6972 | 9.392821e-09 | 3539 |
| 0.7256 | 0.7365 | 0.8581 | 0.6972 | 9.392489e-09 | 3540 |
| 0.7386 | 0.7341 | 0.8581 | 0.6972 | 9.392156e-09 | 3541 |
| 0.7480 | 0.7271 | 0.8581 | 0.6972 | 9.391823e-09 | 3542 |
| 0.7339 | 0.7294 | 0.8582 | 0.6972 | 9.39149e-09 | 3543 |
| 0.7213 | 0.7294 | 0.8581 | 0.6972 | 9.391157e-09 | 3544 |
| 0.7450 | 0.7341 | 0.8581 | 0.6972 | 9.390824e-09 | 3545 |
| 0.7362 | 0.7247 | 0.8581 | 0.6972 | 9.390491e-09 | 3546 |
| 0.7208 | 0.7318 | 0.8581 | 0.6972 | 9.390158e-09 | 3547 |
| 0.7320 | 0.7247 | 0.8581 | 0.6972 | 9.389825e-09 | 3548 |
| 0.7267 | 0.7412 | 0.8581 | 0.6972 | 9.389492e-09 | 3549 |
| 0.7388 | 0.7294 | 0.8582 | 0.6972 | 9.389159e-09 | 3550 |
| 0.7249 | 0.7294 | 0.8582 | 0.6972 | 9.388826e-09 | 3551 |
| 0.7360 | 0.7412 | 0.8581 | 0.6972 | 9.388493e-09 | 3552 |
| 0.7360 | 0.7412 | 0.8581 | 0.6972 | 9.388159e-09 | 3553 |
| 0.7333 | 0.7318 | 0.8581 | 0.6972 | 9.387825e-09 | 3554 |
| 0.7409 | 0.7365 | 0.8580 | 0.6972 | 9.387491e-09 | 3555 |
| 0.7333 | 0.7365 | 0.8580 | 0.6972 | 9.387157e-09 | 3556 |
| 0.7316 | 0.7341 | 0.8580 | 0.6972 | 9.386823e-09 | 3557 |
| 0.7318 | 0.7341 | 0.8580 | 0.6972 | 9.386489e-09 | 3558 |
| 0.7239 | 0.7176 | 0.8580 | 0.6972 | 9.386155e-09 | 3559 |
| 0.7244 | 0.7365 | 0.8580 | 0.6972 | 9.385821e-09 | 3560 |
| 0.7184 | 0.7412 | 0.8580 | 0.6972 | 9.385487e-09 | 3561 |
| 0.7327 | 0.7365 | 0.8580 | 0.6972 | 9.385153e-09 | 3562 |
| 0.7366 | 0.7365 | 0.8579 | 0.6972 | 9.384819e-09 | 3563 |
| 0.7344 | 0.7365 | 0.8579 | 0.6972 | 9.384484e-09 | 3564 |
| 0.7283 | 0.7341 | 0.8579 | 0.6972 | 9.3841495e-09 | 3565 |
| 0.7317 | 0.7318 | 0.8579 | 0.7042 | 9.383815e-09 | 3566 |
| 0.7350 | 0.7247 | 0.8579 | 0.7042 | 9.38348e-09 | 3567 |
| 0.7318 | 0.7388 | 0.8579 | 0.7042 | 9.383145e-09 | 3568 |
| 0.7360 | 0.7294 | 0.8578 | 0.7042 | 9.38281e-09 | 3569 |
| 0.7180 | 0.7271 | 0.8578 | 0.7042 | 9.382475e-09 | 3570 |
| 0.7421 | 0.7294 | 0.8578 | 0.7042 | 9.38214e-09 | 3571 |
| 0.7422 | 0.7388 | 0.8578 | 0.7042 | 9.3818056e-09 | 3572 |
| 0.7350 | 0.7294 | 0.8577 | 0.7042 | 9.381471e-09 | 3573 |
| 0.7471 | 0.7365 | 0.8577 | 0.7042 | 9.381136e-09 | 3574 |
| 0.7350 | 0.7271 | 0.8577 | 0.7042 | 9.3808e-09 | 3575 |
| 0.7280 | 0.7412 | 0.8577 | 0.7042 | 9.380464e-09 | 3576 |
| 0.7356 | 0.7247 | 0.8577 | 0.7042 | 9.380129e-09 | 3577 |
| 0.7316 | 0.7388 | 0.8577 | 0.7042 | 9.379793e-09 | 3578 |
| 0.7246 | 0.7271 | 0.8577 | 0.7042 | 9.379457e-09 | 3579 |
| 0.7165 | 0.7482 | 0.8577 | 0.7042 | 9.3791215e-09 | 3580 |
| 0.7399 | 0.7341 | 0.8576 | 0.7042 | 9.378786e-09 | 3581 |
| 0.7312 | 0.7341 | 0.8576 | 0.7042 | 9.37845e-09 | 3582 |
| 0.7197 | 0.7388 | 0.8577 | 0.7042 | 9.378114e-09 | 3583 |
| 0.7208 | 0.7459 | 0.8576 | 0.7042 | 9.3777786e-09 | 3584 |
| 0.7285 | 0.7294 | 0.8577 | 0.7042 | 9.377442e-09 | 3585 |
| 0.7164 | 0.7341 | 0.8576 | 0.7042 | 9.377105e-09 | 3586 |
| 0.7238 | 0.7318 | 0.8576 | 0.7042 | 9.376769e-09 | 3587 |
| 0.7280 | 0.7459 | 0.8576 | 0.7042 | 9.376432e-09 | 3588 |
| 0.7304 | 0.7200 | 0.8576 | 0.7042 | 9.3760955e-09 | 3589 |
| 0.7322 | 0.7200 | 0.8576 | 0.7042 | 9.375759e-09 | 3590 |
| 0.7182 | 0.7435 | 0.8576 | 0.7042 | 9.375422e-09 | 3591 |
| 0.7306 | 0.7318 | 0.8576 | 0.7042 | 9.375086e-09 | 3592 |
| 0.7320 | 0.7341 | 0.8576 | 0.7042 | 9.374749e-09 | 3593 |
| 0.7409 | 0.7341 | 0.8575 | 0.7042 | 9.374412e-09 | 3594 |
| 0.7288 | 0.7318 | 0.8575 | 0.7042 | 9.374076e-09 | 3595 |
| 0.7272 | 0.7341 | 0.8575 | 0.7042 | 9.373738e-09 | 3596 |
| 0.7314 | 0.7271 | 0.8575 | 0.7042 | 9.373401e-09 | 3597 |
| 0.7302 | 0.7271 | 0.8575 | 0.7042 | 9.373063e-09 | 3598 |
| 0.7226 | 0.7459 | 0.8575 | 0.7042 | 9.372726e-09 | 3599 |
| 0.7212 | 0.7388 | 0.8574 | 0.7042 | 9.372388e-09 | 3600 |
| 0.7358 | 0.7271 | 0.8574 | 0.7042 | 9.372051e-09 | 3601 |
| 0.7356 | 0.7435 | 0.8574 | 0.7042 | 9.371713e-09 | 3602 |
| 0.7288 | 0.7271 | 0.8573 | 0.7042 | 9.371376e-09 | 3603 |
| 0.7252 | 0.7341 | 0.8573 | 0.7042 | 9.371038e-09 | 3604 |
| 0.7116 | 0.7624 | 0.8573 | 0.7042 | 9.370701e-09 | 3605 |
| 0.7162 | 0.7294 | 0.8573 | 0.7042 | 9.370363e-09 | 3606 |
| 0.7225 | 0.7388 | 0.8573 | 0.7042 | 9.370025e-09 | 3607 |
| 0.7308 | 0.7365 | 0.8573 | 0.7042 | 9.369686e-09 | 3608 |
| 0.7254 | 0.7482 | 0.8573 | 0.7042 | 9.369348e-09 | 3609 |
| 0.7302 | 0.7271 | 0.8573 | 0.7042 | 9.36901e-09 | 3610 |
| 0.7326 | 0.7365 | 0.8573 | 0.7042 | 9.368671e-09 | 3611 |
| 0.7245 | 0.7412 | 0.8573 | 0.7042 | 9.368333e-09 | 3612 |
| 0.7272 | 0.7388 | 0.8573 | 0.7042 | 9.367994e-09 | 3613 |
| 0.7206 | 0.7388 | 0.8573 | 0.7042 | 9.367656e-09 | 3614 |
| 0.7304 | 0.7294 | 0.8573 | 0.7042 | 9.367318e-09 | 3615 |
| 0.7287 | 0.7365 | 0.8573 | 0.7042 | 9.366979e-09 | 3616 |
| 0.7247 | 0.7341 | 0.8573 | 0.7042 | 9.366641e-09 | 3617 |
| 0.7302 | 0.7271 | 0.8573 | 0.7042 | 9.3663015e-09 | 3618 |
| 0.7248 | 0.7365 | 0.8572 | 0.7042 | 9.365962e-09 | 3619 |
| 0.7288 | 0.7553 | 0.8572 | 0.6972 | 9.365623e-09 | 3620 |
| 0.7215 | 0.7341 | 0.8572 | 0.6972 | 9.365284e-09 | 3621 |
| 0.7217 | 0.7318 | 0.8572 | 0.6972 | 9.364944e-09 | 3622 |
| 0.7384 | 0.7318 | 0.8572 | 0.6972 | 9.364605e-09 | 3623 |
| 0.7231 | 0.7388 | 0.8572 | 0.6972 | 9.364266e-09 | 3624 |
| 0.7348 | 0.7318 | 0.8572 | 0.6972 | 9.3639265e-09 | 3625 |
| 0.7367 | 0.7341 | 0.8571 | 0.6972 | 9.363587e-09 | 3626 |
| 0.7285 | 0.7365 | 0.8571 | 0.6972 | 9.363248e-09 | 3627 |
| 0.7399 | 0.7388 | 0.8571 | 0.6972 | 9.362909e-09 | 3628 |
| 0.7205 | 0.7388 | 0.8571 | 0.6972 | 9.3625685e-09 | 3629 |
| 0.7250 | 0.7271 | 0.8571 | 0.6972 | 9.362228e-09 | 3630 |
| 0.7359 | 0.7412 | 0.8571 | 0.6972 | 9.361888e-09 | 3631 |
| 0.7171 | 0.7412 | 0.8571 | 0.6972 | 9.361548e-09 | 3632 |
| 0.7342 | 0.7271 | 0.8570 | 0.6972 | 9.361208e-09 | 3633 |
| 0.7202 | 0.7435 | 0.8570 | 0.6972 | 9.360868e-09 | 3634 |
| 0.7379 | 0.7318 | 0.8569 | 0.6972 | 9.3605275e-09 | 3635 |
| 0.7248 | 0.7247 | 0.8569 | 0.6972 | 9.360187e-09 | 3636 |
| 0.7271 | 0.7388 | 0.8569 | 0.6972 | 9.359847e-09 | 3637 |
| 0.7317 | 0.7318 | 0.8569 | 0.6972 | 9.359507e-09 | 3638 |
| 0.7095 | 0.7388 | 0.8569 | 0.6972 | 9.359167e-09 | 3639 |
| 0.7248 | 0.7341 | 0.8568 | 0.6972 | 9.358826e-09 | 3640 |
| 0.7225 | 0.7341 | 0.8568 | 0.7042 | 9.358485e-09 | 3641 |
| 0.7285 | 0.7435 | 0.8568 | 0.6972 | 9.358144e-09 | 3642 |
| 0.7248 | 0.7435 | 0.8568 | 0.6972 | 9.3578025e-09 | 3643 |
| 0.7293 | 0.7341 | 0.8568 | 0.6972 | 9.3574615e-09 | 3644 |
| 0.7223 | 0.7412 | 0.8568 | 0.6972 | 9.35712e-09 | 3645 |
| 0.7361 | 0.7435 | 0.8568 | 0.6972 | 9.356779e-09 | 3646 |
| 0.7311 | 0.7365 | 0.8568 | 0.6972 | 9.356438e-09 | 3647 |
| 0.7329 | 0.7271 | 0.8568 | 0.6972 | 9.356097e-09 | 3648 |
| 0.7360 | 0.7412 | 0.8568 | 0.6972 | 9.355756e-09 | 3649 |
| 0.7362 | 0.7176 | 0.8568 | 0.6972 | 9.355415e-09 | 3650 |
| 0.7163 | 0.7412 | 0.8567 | 0.6972 | 9.355073e-09 | 3651 |
| 0.7239 | 0.7412 | 0.8567 | 0.6972 | 9.354731e-09 | 3652 |
| 0.7254 | 0.7365 | 0.8567 | 0.6972 | 9.354389e-09 | 3653 |
| 0.7249 | 0.7247 | 0.8567 | 0.6972 | 9.354047e-09 | 3654 |
| 0.7277 | 0.7435 | 0.8567 | 0.6972 | 9.353705e-09 | 3655 |
| 0.7247 | 0.7412 | 0.8567 | 0.6972 | 9.353363e-09 | 3656 |
| 0.7201 | 0.7271 | 0.8567 | 0.6972 | 9.3530215e-09 | 3657 |
| 0.7220 | 0.7341 | 0.8567 | 0.6972 | 9.3526795e-09 | 3658 |
| 0.7238 | 0.7365 | 0.8567 | 0.6972 | 9.352338e-09 | 3659 |
| 0.7343 | 0.7365 | 0.8566 | 0.6972 | 9.351996e-09 | 3660 |
| 0.7373 | 0.7294 | 0.8566 | 0.6972 | 9.351654e-09 | 3661 |
| 0.7302 | 0.7176 | 0.8566 | 0.6972 | 9.351311e-09 | 3662 |
| 0.7315 | 0.7271 | 0.8566 | 0.6972 | 9.350968e-09 | 3663 |
| 0.7345 | 0.7412 | 0.8566 | 0.6972 | 9.350625e-09 | 3664 |
| 0.7296 | 0.7318 | 0.8566 | 0.6972 | 9.350282e-09 | 3665 |
| 0.7228 | 0.7318 | 0.8566 | 0.6972 | 9.3499395e-09 | 3666 |
| 0.7211 | 0.7294 | 0.8566 | 0.6972 | 9.349597e-09 | 3667 |
| 0.7216 | 0.7365 | 0.8566 | 0.6972 | 9.349254e-09 | 3668 |
| 0.7255 | 0.7294 | 0.8566 | 0.6972 | 9.348911e-09 | 3669 |
| 0.7247 | 0.7365 | 0.8566 | 0.6972 | 9.348568e-09 | 3670 |
| 0.7152 | 0.7506 | 0.8566 | 0.6972 | 9.348225e-09 | 3671 |
| 0.7206 | 0.7318 | 0.8566 | 0.6972 | 9.3478825e-09 | 3672 |
| 0.7209 | 0.7341 | 0.8566 | 0.6972 | 9.347539e-09 | 3673 |
| 0.7255 | 0.7318 | 0.8566 | 0.6972 | 9.347195e-09 | 3674 |
| 0.7222 | 0.7412 | 0.8566 | 0.6972 | 9.346851e-09 | 3675 |
| 0.7272 | 0.7412 | 0.8566 | 0.6972 | 9.346508e-09 | 3676 |
| 0.7252 | 0.7341 | 0.8566 | 0.6972 | 9.346164e-09 | 3677 |
| 0.7192 | 0.7365 | 0.8566 | 0.6972 | 9.34582e-09 | 3678 |
| 0.7225 | 0.7435 | 0.8566 | 0.6972 | 9.345476e-09 | 3679 |
| 0.7303 | 0.7365 | 0.8566 | 0.6972 | 9.345133e-09 | 3680 |
| 0.7224 | 0.7318 | 0.8566 | 0.6972 | 9.344789e-09 | 3681 |
| 0.7323 | 0.7271 | 0.8566 | 0.6972 | 9.344445e-09 | 3682 |
| 0.7244 | 0.7294 | 0.8566 | 0.6972 | 9.3441015e-09 | 3683 |
| 0.7225 | 0.7412 | 0.8566 | 0.6972 | 9.343757e-09 | 3684 |
| 0.7150 | 0.7365 | 0.8566 | 0.6972 | 9.343412e-09 | 3685 |
| 0.7224 | 0.7435 | 0.8566 | 0.6972 | 9.343068e-09 | 3686 |
| 0.7356 | 0.7271 | 0.8566 | 0.6972 | 9.342723e-09 | 3687 |
| 0.7299 | 0.7365 | 0.8566 | 0.6972 | 9.342378e-09 | 3688 |
| 0.7330 | 0.7365 | 0.8566 | 0.6972 | 9.342034e-09 | 3689 |
| 0.7355 | 0.7247 | 0.8565 | 0.6972 | 9.341689e-09 | 3690 |
| 0.7273 | 0.7388 | 0.8565 | 0.6972 | 9.341345e-09 | 3691 |
| 0.7314 | 0.7341 | 0.8565 | 0.6972 | 9.341e-09 | 3692 |
| 0.7221 | 0.7435 | 0.8564 | 0.6972 | 9.340655e-09 | 3693 |
| 0.7242 | 0.7224 | 0.8564 | 0.6972 | 9.340311e-09 | 3694 |
| 0.7269 | 0.7247 | 0.8564 | 0.6972 | 9.339965e-09 | 3695 |
| 0.7257 | 0.7341 | 0.8564 | 0.6972 | 9.33962e-09 | 3696 |
| 0.7234 | 0.7529 | 0.8564 | 0.6972 | 9.339274e-09 | 3697 |
| 0.7083 | 0.7506 | 0.8563 | 0.6972 | 9.338929e-09 | 3698 |
| 0.7297 | 0.7294 | 0.8563 | 0.6972 | 9.338583e-09 | 3699 |
| 0.7233 | 0.7388 | 0.8563 | 0.6972 | 9.338238e-09 | 3700 |
| 0.7269 | 0.7482 | 0.8563 | 0.6972 | 9.337892e-09 | 3701 |
| 0.7357 | 0.7153 | 0.8563 | 0.6972 | 9.337547e-09 | 3702 |
| 0.7314 | 0.7365 | 0.8563 | 0.6972 | 9.337201e-09 | 3703 |
| 0.7287 | 0.7365 | 0.8563 | 0.6972 | 9.336856e-09 | 3704 |
| 0.7235 | 0.7318 | 0.8563 | 0.6972 | 9.33651e-09 | 3705 |
| 0.7298 | 0.7271 | 0.8563 | 0.6972 | 9.336164e-09 | 3706 |
| 0.7181 | 0.7459 | 0.8563 | 0.6972 | 9.3358175e-09 | 3707 |
| 0.7262 | 0.7271 | 0.8563 | 0.6972 | 9.335471e-09 | 3708 |
| 0.7326 | 0.7388 | 0.8563 | 0.6972 | 9.335125e-09 | 3709 |
| 0.7222 | 0.7459 | 0.8563 | 0.6972 | 9.334778e-09 | 3710 |
| 0.7249 | 0.7435 | 0.8562 | 0.6972 | 9.334432e-09 | 3711 |
| 0.7159 | 0.7341 | 0.8562 | 0.6972 | 9.3340855e-09 | 3712 |
| 0.7319 | 0.7294 | 0.8562 | 0.6972 | 9.333739e-09 | 3713 |
| 0.7311 | 0.7388 | 0.8562 | 0.6972 | 9.333393e-09 | 3714 |
| 0.7164 | 0.7271 | 0.8562 | 0.7042 | 9.333046e-09 | 3715 |
| 0.7201 | 0.7388 | 0.8561 | 0.7042 | 9.3327e-09 | 3716 |
| 0.7232 | 0.7435 | 0.8561 | 0.7042 | 9.332353e-09 | 3717 |
| 0.7169 | 0.7412 | 0.8561 | 0.7042 | 9.332005e-09 | 3718 |
| 0.7329 | 0.7341 | 0.8560 | 0.7042 | 9.331658e-09 | 3719 |
| 0.7180 | 0.7365 | 0.8560 | 0.7042 | 9.331311e-09 | 3720 |
| 0.7168 | 0.7482 | 0.8560 | 0.7042 | 9.330964e-09 | 3721 |
| 0.7171 | 0.7294 | 0.8561 | 0.7042 | 9.330616e-09 | 3722 |
| 0.7243 | 0.7482 | 0.8561 | 0.7042 | 9.330269e-09 | 3723 |
| 0.7153 | 0.7553 | 0.8560 | 0.7042 | 9.329922e-09 | 3724 |
| 0.7209 | 0.7435 | 0.8561 | 0.7042 | 9.3295744e-09 | 3725 |
| 0.7291 | 0.7412 | 0.8560 | 0.7042 | 9.329227e-09 | 3726 |
| 0.7081 | 0.7365 | 0.8560 | 0.7042 | 9.32888e-09 | 3727 |
| 0.7290 | 0.7435 | 0.8560 | 0.7042 | 9.328532e-09 | 3728 |
| 0.7269 | 0.7412 | 0.8561 | 0.7042 | 9.328184e-09 | 3729 |
| 0.7221 | 0.7365 | 0.8560 | 0.7042 | 9.327835e-09 | 3730 |
| 0.7241 | 0.7224 | 0.8560 | 0.7042 | 9.327487e-09 | 3731 |
| 0.7205 | 0.7318 | 0.8560 | 0.7042 | 9.327139e-09 | 3732 |
| 0.7200 | 0.7435 | 0.8560 | 0.7042 | 9.326791e-09 | 3733 |
| 0.7179 | 0.7459 | 0.8560 | 0.7042 | 9.326443e-09 | 3734 |
| 0.7168 | 0.7435 | 0.8559 | 0.7042 | 9.326095e-09 | 3735 |
| 0.7113 | 0.7412 | 0.8559 | 0.7042 | 9.325746e-09 | 3736 |
| 0.7167 | 0.7482 | 0.8559 | 0.7042 | 9.325398e-09 | 3737 |
| 0.7330 | 0.7365 | 0.8559 | 0.7042 | 9.32505e-09 | 3738 |
| 0.7099 | 0.7482 | 0.8559 | 0.7042 | 9.324701e-09 | 3739 |
| 0.7118 | 0.7506 | 0.8559 | 0.7042 | 9.324352e-09 | 3740 |
| 0.7161 | 0.7341 | 0.8559 | 0.7042 | 9.324003e-09 | 3741 |
| 0.7226 | 0.7341 | 0.8559 | 0.7042 | 9.323654e-09 | 3742 |
| 0.7226 | 0.7294 | 0.8559 | 0.7042 | 9.323305e-09 | 3743 |
| 0.7285 | 0.7341 | 0.8559 | 0.7042 | 9.322956e-09 | 3744 |
| 0.7297 | 0.7459 | 0.8559 | 0.7042 | 9.322607e-09 | 3745 |
| 0.7202 | 0.7459 | 0.8559 | 0.7042 | 9.322258e-09 | 3746 |
| 0.7300 | 0.7365 | 0.8559 | 0.7042 | 9.321909e-09 | 3747 |
| 0.7358 | 0.7365 | 0.8559 | 0.7042 | 9.3215595e-09 | 3748 |
| 0.7288 | 0.7529 | 0.8558 | 0.7042 | 9.3212105e-09 | 3749 |
| 0.7323 | 0.7365 | 0.8558 | 0.7042 | 9.3208605e-09 | 3750 |
| 0.7133 | 0.7153 | 0.8558 | 0.7042 | 9.320511e-09 | 3751 |
| 0.7160 | 0.7482 | 0.8558 | 0.7042 | 9.320161e-09 | 3752 |
| 0.7273 | 0.7435 | 0.8558 | 0.7042 | 9.319811e-09 | 3753 |
| 0.7129 | 0.7435 | 0.8558 | 0.7042 | 9.319461e-09 | 3754 |
| 0.7208 | 0.7529 | 0.8558 | 0.7113 | 9.319111e-09 | 3755 |
| 0.7311 | 0.7506 | 0.8558 | 0.7042 | 9.318761e-09 | 3756 |
| 0.7203 | 0.7341 | 0.8558 | 0.7042 | 9.318411e-09 | 3757 |
| 0.7241 | 0.7341 | 0.8558 | 0.7042 | 9.318061e-09 | 3758 |
| 0.7231 | 0.7435 | 0.8558 | 0.7042 | 9.317711e-09 | 3759 |
| 0.7176 | 0.7412 | 0.8558 | 0.7042 | 9.317361e-09 | 3760 |
| 0.7148 | 0.7412 | 0.8558 | 0.7113 | 9.31701e-09 | 3761 |
| 0.7243 | 0.7412 | 0.8558 | 0.7113 | 9.316659e-09 | 3762 |
| 0.7314 | 0.7365 | 0.8557 | 0.7113 | 9.316309e-09 | 3763 |
| 0.7275 | 0.7482 | 0.8557 | 0.7042 | 9.315958e-09 | 3764 |
| 0.7079 | 0.7318 | 0.8557 | 0.7042 | 9.315607e-09 | 3765 |
| 0.7145 | 0.7388 | 0.8557 | 0.7183 | 9.315256e-09 | 3766 |
| 0.7255 | 0.7341 | 0.8557 | 0.7183 | 9.314905e-09 | 3767 |
| 0.7183 | 0.7576 | 0.8557 | 0.7183 | 9.3145545e-09 | 3768 |
| 0.7188 | 0.7271 | 0.8557 | 0.7183 | 9.314204e-09 | 3769 |
| 0.7225 | 0.7435 | 0.8557 | 0.7183 | 9.313853e-09 | 3770 |
| 0.7150 | 0.7459 | 0.8557 | 0.7113 | 9.313502e-09 | 3771 |
| 0.7205 | 0.7318 | 0.8557 | 0.7183 | 9.31315e-09 | 3772 |
| 0.7286 | 0.7482 | 0.8557 | 0.7183 | 9.3127985e-09 | 3773 |
| 0.7231 | 0.7459 | 0.8557 | 0.7183 | 9.312447e-09 | 3774 |
| 0.7205 | 0.7341 | 0.8557 | 0.7113 | 9.312095e-09 | 3775 |
| 0.7239 | 0.7459 | 0.8556 | 0.7183 | 9.311743e-09 | 3776 |
| 0.7218 | 0.7388 | 0.8556 | 0.7183 | 9.311392e-09 | 3777 |
| 0.7274 | 0.7506 | 0.8556 | 0.7183 | 9.31104e-09 | 3778 |
| 0.7170 | 0.7365 | 0.8556 | 0.7183 | 9.310688e-09 | 3779 |
| 0.7220 | 0.7412 | 0.8556 | 0.7183 | 9.3103365e-09 | 3780 |
| 0.7129 | 0.7482 | 0.8556 | 0.7183 | 9.309985e-09 | 3781 |
| 0.7211 | 0.7318 | 0.8555 | 0.7183 | 9.309633e-09 | 3782 |
| 0.7135 | 0.7435 | 0.8555 | 0.7183 | 9.3092805e-09 | 3783 |
| 0.7165 | 0.7294 | 0.8555 | 0.7183 | 9.308928e-09 | 3784 |
| 0.7219 | 0.7459 | 0.8555 | 0.7183 | 9.308575e-09 | 3785 |
| 0.7271 | 0.7318 | 0.8555 | 0.7183 | 9.308223e-09 | 3786 |
| 0.7142 | 0.7506 | 0.8554 | 0.7183 | 9.30787e-09 | 3787 |
| 0.7174 | 0.7506 | 0.8554 | 0.7183 | 9.307517e-09 | 3788 |
| 0.7199 | 0.7412 | 0.8553 | 0.7183 | 9.307165e-09 | 3789 |
| 0.7167 | 0.7435 | 0.8553 | 0.7183 | 9.306812e-09 | 3790 |
| 0.7293 | 0.7459 | 0.8552 | 0.7183 | 9.30646e-09 | 3791 |
| 0.7137 | 0.7482 | 0.8552 | 0.7183 | 9.306107e-09 | 3792 |
| 0.7233 | 0.7412 | 0.8552 | 0.7183 | 9.305754e-09 | 3793 |
| 0.7149 | 0.7482 | 0.8551 | 0.7183 | 9.305402e-09 | 3794 |
| 0.7266 | 0.7341 | 0.8551 | 0.7183 | 9.305048e-09 | 3795 |
| 0.7288 | 0.7529 | 0.8551 | 0.7183 | 9.304695e-09 | 3796 |
| 0.7045 | 0.7435 | 0.8551 | 0.7183 | 9.304341e-09 | 3797 |
| 0.7106 | 0.7529 | 0.8551 | 0.7183 | 9.303988e-09 | 3798 |
| 0.7157 | 0.7388 | 0.8550 | 0.7183 | 9.303634e-09 | 3799 |
| 0.7148 | 0.7435 | 0.8550 | 0.7183 | 9.303281e-09 | 3800 |
| 0.7280 | 0.7435 | 0.8550 | 0.7183 | 9.302927e-09 | 3801 |
| 0.7200 | 0.7412 | 0.8551 | 0.7183 | 9.302574e-09 | 3802 |
| 0.7246 | 0.7318 | 0.8551 | 0.7183 | 9.30222e-09 | 3803 |
| 0.7049 | 0.7459 | 0.8551 | 0.7183 | 9.301867e-09 | 3804 |
| 0.7297 | 0.7365 | 0.8550 | 0.7183 | 9.301513e-09 | 3805 |
| 0.7209 | 0.7435 | 0.8550 | 0.7183 | 9.301159e-09 | 3806 |
| 0.7156 | 0.7482 | 0.8550 | 0.7183 | 9.300805e-09 | 3807 |
| 0.7076 | 0.7388 | 0.8550 | 0.7183 | 9.30045e-09 | 3808 |
| 0.7216 | 0.7247 | 0.8550 | 0.7183 | 9.300096e-09 | 3809 |
| 0.7132 | 0.7435 | 0.8550 | 0.7183 | 9.299741e-09 | 3810 |
| 0.7149 | 0.7318 | 0.8550 | 0.7183 | 9.299387e-09 | 3811 |
| 0.7240 | 0.7388 | 0.8550 | 0.7183 | 9.299033e-09 | 3812 |
| 0.7122 | 0.7600 | 0.8549 | 0.7183 | 9.298678e-09 | 3813 |
| 0.7181 | 0.7506 | 0.8549 | 0.7183 | 9.298324e-09 | 3814 |
| 0.7134 | 0.7506 | 0.8549 | 0.7183 | 9.2979695e-09 | 3815 |
| 0.7181 | 0.7412 | 0.8549 | 0.7183 | 9.297615e-09 | 3816 |
| 0.7120 | 0.7482 | 0.8549 | 0.7183 | 9.29726e-09 | 3817 |
| 0.7077 | 0.7435 | 0.8549 | 0.7183 | 9.296905e-09 | 3818 |
| 0.7216 | 0.7388 | 0.8549 | 0.7183 | 9.296549e-09 | 3819 |
| 0.7148 | 0.7435 | 0.8549 | 0.7183 | 9.296194e-09 | 3820 |
| 0.7168 | 0.7506 | 0.8548 | 0.7183 | 9.295839e-09 | 3821 |
| 0.7357 | 0.7506 | 0.8548 | 0.7183 | 9.2954835e-09 | 3822 |
| 0.7214 | 0.7482 | 0.8548 | 0.7183 | 9.295128e-09 | 3823 |
| 0.7169 | 0.7412 | 0.8548 | 0.7183 | 9.294773e-09 | 3824 |
| 0.7192 | 0.7482 | 0.8548 | 0.7183 | 9.294418e-09 | 3825 |
| 0.7262 | 0.7435 | 0.8548 | 0.7183 | 9.294062e-09 | 3826 |
| 0.7371 | 0.7365 | 0.8548 | 0.7183 | 9.293707e-09 | 3827 |
| 0.7113 | 0.7459 | 0.8548 | 0.7254 | 9.293351e-09 | 3828 |
| 0.7174 | 0.7459 | 0.8548 | 0.7254 | 9.292995e-09 | 3829 |
| 0.7133 | 0.7365 | 0.8548 | 0.7254 | 9.292639e-09 | 3830 |
| 0.7094 | 0.7506 | 0.8547 | 0.7254 | 9.2922825e-09 | 3831 |
| 0.7227 | 0.7412 | 0.8548 | 0.7254 | 9.291926e-09 | 3832 |
| 0.7111 | 0.7529 | 0.8548 | 0.7254 | 9.29157e-09 | 3833 |
| 0.7110 | 0.7412 | 0.8547 | 0.7254 | 9.291214e-09 | 3834 |
| 0.7163 | 0.7412 | 0.8548 | 0.7254 | 9.290858e-09 | 3835 |
| 0.7139 | 0.7553 | 0.8547 | 0.7254 | 9.290502e-09 | 3836 |
| 0.7273 | 0.7365 | 0.8547 | 0.7183 | 9.2901455e-09 | 3837 |
| 0.7096 | 0.7506 | 0.8547 | 0.7183 | 9.289789e-09 | 3838 |
| 0.7098 | 0.7459 | 0.8547 | 0.7183 | 9.289432e-09 | 3839 |
| 0.7142 | 0.7529 | 0.8547 | 0.7183 | 9.289075e-09 | 3840 |
| 0.7199 | 0.7435 | 0.8546 | 0.7254 | 9.288718e-09 | 3841 |
| 0.7113 | 0.7529 | 0.8546 | 0.7254 | 9.288361e-09 | 3842 |
| 0.7054 | 0.7412 | 0.8546 | 0.7254 | 9.288004e-09 | 3843 |
| 0.7125 | 0.7482 | 0.8546 | 0.7254 | 9.287647e-09 | 3844 |
| 0.7153 | 0.7482 | 0.8546 | 0.7254 | 9.28729e-09 | 3845 |
| 0.7090 | 0.7482 | 0.8546 | 0.7254 | 9.286933e-09 | 3846 |
| 0.7181 | 0.7388 | 0.8546 | 0.7324 | 9.286576e-09 | 3847 |
| 0.7099 | 0.7365 | 0.8545 | 0.7324 | 9.286219e-09 | 3848 |
| 0.7122 | 0.7435 | 0.8545 | 0.7324 | 9.285862e-09 | 3849 |
| 0.7284 | 0.7318 | 0.8546 | 0.7324 | 9.285504e-09 | 3850 |
| 0.7160 | 0.7388 | 0.8546 | 0.7324 | 9.285146e-09 | 3851 |
| 0.7230 | 0.7318 | 0.8545 | 0.7324 | 9.284788e-09 | 3852 |
| 0.7237 | 0.7529 | 0.8545 | 0.7254 | 9.28443e-09 | 3853 |
| 0.7186 | 0.7459 | 0.8545 | 0.7324 | 9.284072e-09 | 3854 |
| 0.7124 | 0.7294 | 0.8545 | 0.7324 | 9.283714e-09 | 3855 |
| 0.7166 | 0.7412 | 0.8545 | 0.7324 | 9.283356e-09 | 3856 |
| 0.7130 | 0.7459 | 0.8545 | 0.7324 | 9.282998e-09 | 3857 |
| 0.7267 | 0.7459 | 0.8545 | 0.7324 | 9.28264e-09 | 3858 |
| 0.7099 | 0.7435 | 0.8545 | 0.7324 | 9.2822825e-09 | 3859 |
| 0.7270 | 0.7318 | 0.8544 | 0.7324 | 9.281925e-09 | 3860 |
| 0.7113 | 0.7506 | 0.8545 | 0.7324 | 9.281567e-09 | 3861 |
| 0.7230 | 0.7365 | 0.8544 | 0.7324 | 9.281208e-09 | 3862 |
| 0.7039 | 0.7647 | 0.8544 | 0.7324 | 9.280849e-09 | 3863 |
| 0.7126 | 0.7388 | 0.8544 | 0.7324 | 9.28049e-09 | 3864 |
| 0.7022 | 0.7482 | 0.8545 | 0.7324 | 9.280131e-09 | 3865 |
| 0.7092 | 0.7388 | 0.8545 | 0.7324 | 9.2797725e-09 | 3866 |
| 0.7100 | 0.7553 | 0.8545 | 0.7324 | 9.279414e-09 | 3867 |
| 0.7193 | 0.7365 | 0.8545 | 0.7324 | 9.279055e-09 | 3868 |
| 0.7092 | 0.7388 | 0.8545 | 0.7324 | 9.278696e-09 | 3869 |
| 0.7224 | 0.7341 | 0.8545 | 0.7324 | 9.278337e-09 | 3870 |
| 0.7203 | 0.7365 | 0.8545 | 0.7324 | 9.277978e-09 | 3871 |
| 0.7111 | 0.7412 | 0.8545 | 0.7324 | 9.27762e-09 | 3872 |
| 0.7034 | 0.7459 | 0.8545 | 0.7324 | 9.27726e-09 | 3873 |
| 0.7313 | 0.7294 | 0.8545 | 0.7324 | 9.2769e-09 | 3874 |
| 0.7121 | 0.7412 | 0.8545 | 0.7324 | 9.27654e-09 | 3875 |
| 0.7122 | 0.7412 | 0.8545 | 0.7324 | 9.276181e-09 | 3876 |
| 0.7087 | 0.7365 | 0.8545 | 0.7324 | 9.275821e-09 | 3877 |
| 0.7265 | 0.7341 | 0.8545 | 0.7324 | 9.275461e-09 | 3878 |
| 0.7160 | 0.7435 | 0.8545 | 0.7324 | 9.275102e-09 | 3879 |
| 0.7074 | 0.7529 | 0.8545 | 0.7324 | 9.274742e-09 | 3880 |
| 0.7192 | 0.7482 | 0.8544 | 0.7324 | 9.274382e-09 | 3881 |
| 0.7156 | 0.7388 | 0.8545 | 0.7324 | 9.274022e-09 | 3882 |
| 0.7159 | 0.7482 | 0.8544 | 0.7324 | 9.273663e-09 | 3883 |
| 0.7063 | 0.7388 | 0.8545 | 0.7324 | 9.273302e-09 | 3884 |
| 0.7070 | 0.7341 | 0.8544 | 0.7324 | 9.2729415e-09 | 3885 |
| 0.7105 | 0.7576 | 0.8544 | 0.7324 | 9.272581e-09 | 3886 |
| 0.7272 | 0.7459 | 0.8544 | 0.7324 | 9.27222e-09 | 3887 |
| 0.7200 | 0.7482 | 0.8544 | 0.7324 | 9.27186e-09 | 3888 |
| 0.7157 | 0.7388 | 0.8544 | 0.7324 | 9.271499e-09 | 3889 |
| 0.7018 | 0.7482 | 0.8544 | 0.7324 | 9.2711385e-09 | 3890 |
| 0.7113 | 0.7412 | 0.8543 | 0.7324 | 9.270778e-09 | 3891 |
| 0.7151 | 0.7388 | 0.8543 | 0.7324 | 9.270417e-09 | 3892 |
| 0.7192 | 0.7365 | 0.8543 | 0.7324 | 9.270057e-09 | 3893 |
| 0.7075 | 0.7482 | 0.8543 | 0.7324 | 9.269696e-09 | 3894 |
| 0.7185 | 0.7388 | 0.8542 | 0.7324 | 9.2693355e-09 | 3895 |
| 0.7082 | 0.7365 | 0.8543 | 0.7324 | 9.268974e-09 | 3896 |
| 0.7128 | 0.7435 | 0.8543 | 0.7324 | 9.2686125e-09 | 3897 |
| 0.7077 | 0.7506 | 0.8543 | 0.7324 | 9.268251e-09 | 3898 |
| 0.7098 | 0.7482 | 0.8542 | 0.7324 | 9.26789e-09 | 3899 |
| 0.7156 | 0.7365 | 0.8542 | 0.7324 | 9.267528e-09 | 3900 |
| 0.7095 | 0.7435 | 0.8542 | 0.7324 | 9.267167e-09 | 3901 |
| 0.7084 | 0.7600 | 0.8542 | 0.7324 | 9.266805e-09 | 3902 |
| 0.7176 | 0.7482 | 0.8542 | 0.7324 | 9.266444e-09 | 3903 |
| 0.7161 | 0.7412 | 0.8542 | 0.7324 | 9.266082e-09 | 3904 |
| 0.7183 | 0.7459 | 0.8541 | 0.7324 | 9.265721e-09 | 3905 |
| 0.7042 | 0.7506 | 0.8541 | 0.7324 | 9.265359e-09 | 3906 |
| 0.7014 | 0.7435 | 0.8541 | 0.7324 | 9.264997e-09 | 3907 |
| 0.7138 | 0.7435 | 0.8541 | 0.7324 | 9.264634e-09 | 3908 |
| 0.7162 | 0.7529 | 0.8541 | 0.7324 | 9.264272e-09 | 3909 |
| 0.7162 | 0.7459 | 0.8541 | 0.7324 | 9.26391e-09 | 3910 |
| 0.7252 | 0.7482 | 0.8541 | 0.7324 | 9.263547e-09 | 3911 |
| 0.7019 | 0.7529 | 0.8541 | 0.7324 | 9.263185e-09 | 3912 |
| 0.7060 | 0.7388 | 0.8541 | 0.7324 | 9.2628225e-09 | 3913 |
| 0.7146 | 0.7506 | 0.8541 | 0.7324 | 9.26246e-09 | 3914 |
| 0.7037 | 0.7459 | 0.8541 | 0.7324 | 9.262098e-09 | 3915 |
| 0.7113 | 0.7459 | 0.8541 | 0.7324 | 9.261735e-09 | 3916 |
| 0.7092 | 0.7506 | 0.8541 | 0.7324 | 9.261373e-09 | 3917 |
| 0.7026 | 0.7459 | 0.8541 | 0.7324 | 9.26101e-09 | 3918 |
| 0.7201 | 0.7529 | 0.8541 | 0.7324 | 9.2606465e-09 | 3919 |
| 0.7017 | 0.7459 | 0.8541 | 0.7324 | 9.260283e-09 | 3920 |
| 0.7148 | 0.7506 | 0.8541 | 0.7324 | 9.25992e-09 | 3921 |
| 0.7217 | 0.7412 | 0.8541 | 0.7324 | 9.259557e-09 | 3922 |
| 0.7135 | 0.7435 | 0.8541 | 0.7324 | 9.259193e-09 | 3923 |
| 0.7138 | 0.7388 | 0.8541 | 0.7324 | 9.25883e-09 | 3924 |
| 0.7214 | 0.7435 | 0.8541 | 0.7324 | 9.258467e-09 | 3925 |
| 0.7012 | 0.7459 | 0.8540 | 0.7324 | 9.258104e-09 | 3926 |
| 0.7122 | 0.7529 | 0.8540 | 0.7324 | 9.25774e-09 | 3927 |
| 0.7197 | 0.7459 | 0.8540 | 0.7324 | 9.257377e-09 | 3928 |
| 0.7076 | 0.7600 | 0.8540 | 0.7324 | 9.257014e-09 | 3929 |
| 0.7052 | 0.7482 | 0.8540 | 0.7324 | 9.25665e-09 | 3930 |
| 0.7194 | 0.7435 | 0.8540 | 0.7324 | 9.2562855e-09 | 3931 |
| 0.7029 | 0.7506 | 0.8539 | 0.7324 | 9.255921e-09 | 3932 |
| 0.7056 | 0.7576 | 0.8539 | 0.7324 | 9.255557e-09 | 3933 |
| 0.7020 | 0.7341 | 0.8540 | 0.7324 | 9.255193e-09 | 3934 |
| 0.7144 | 0.7506 | 0.8540 | 0.7324 | 9.254829e-09 | 3935 |
| 0.7106 | 0.7412 | 0.8540 | 0.7324 | 9.254465e-09 | 3936 |
| 0.7176 | 0.7435 | 0.8540 | 0.7324 | 9.254101e-09 | 3937 |
| 0.7220 | 0.7482 | 0.8540 | 0.7324 | 9.253736e-09 | 3938 |
| 0.7059 | 0.7506 | 0.8540 | 0.7324 | 9.253372e-09 | 3939 |
| 0.7117 | 0.7388 | 0.8540 | 0.7324 | 9.253008e-09 | 3940 |
| 0.7092 | 0.7482 | 0.8540 | 0.7324 | 9.252643e-09 | 3941 |
| 0.6979 | 0.7600 | 0.8539 | 0.7324 | 9.252278e-09 | 3942 |
| 0.7003 | 0.7624 | 0.8539 | 0.7324 | 9.251913e-09 | 3943 |
| 0.7118 | 0.7412 | 0.8539 | 0.7324 | 9.251548e-09 | 3944 |
| 0.6967 | 0.7506 | 0.8539 | 0.7324 | 9.251183e-09 | 3945 |
| 0.7154 | 0.7435 | 0.8539 | 0.7324 | 9.250818e-09 | 3946 |
| 0.7274 | 0.7506 | 0.8539 | 0.7324 | 9.250453e-09 | 3947 |
| 0.7188 | 0.7506 | 0.8539 | 0.7324 | 9.250088e-09 | 3948 |
| 0.7144 | 0.7365 | 0.8539 | 0.7324 | 9.249723e-09 | 3949 |
| 0.7147 | 0.7506 | 0.8539 | 0.7324 | 9.249358e-09 | 3950 |
| 0.7118 | 0.7318 | 0.8539 | 0.7324 | 9.248993e-09 | 3951 |
| 0.6955 | 0.7529 | 0.8539 | 0.7324 | 9.248627e-09 | 3952 |
| 0.7150 | 0.7412 | 0.8539 | 0.7324 | 9.248261e-09 | 3953 |
| 0.7189 | 0.7388 | 0.8538 | 0.7324 | 9.247895e-09 | 3954 |
| 0.7089 | 0.7459 | 0.8538 | 0.7324 | 9.247529e-09 | 3955 |
| 0.7026 | 0.7576 | 0.8538 | 0.7324 | 9.247163e-09 | 3956 |
| 0.7145 | 0.7294 | 0.8538 | 0.7324 | 9.246797e-09 | 3957 |
| 0.7159 | 0.7482 | 0.8538 | 0.7324 | 9.246431e-09 | 3958 |
| 0.7182 | 0.7459 | 0.8537 | 0.7324 | 9.246065e-09 | 3959 |
| 0.7092 | 0.7341 | 0.8537 | 0.7324 | 9.245699e-09 | 3960 |
| 0.7059 | 0.7435 | 0.8537 | 0.7324 | 9.245333e-09 | 3961 |
| 0.7063 | 0.7529 | 0.8537 | 0.7324 | 9.2449675e-09 | 3962 |
| 0.7113 | 0.7459 | 0.8537 | 0.7324 | 9.2446015e-09 | 3963 |
| 0.7176 | 0.7318 | 0.8537 | 0.7324 | 9.244235e-09 | 3964 |
| 0.7230 | 0.7388 | 0.8537 | 0.7324 | 9.243868e-09 | 3965 |
| 0.7063 | 0.7294 | 0.8537 | 0.7324 | 9.243501e-09 | 3966 |
| 0.7223 | 0.7318 | 0.8537 | 0.7324 | 9.243134e-09 | 3967 |
| 0.7155 | 0.7341 | 0.8537 | 0.7324 | 9.242767e-09 | 3968 |
| 0.7188 | 0.7341 | 0.8536 | 0.7324 | 9.242401e-09 | 3969 |
| 0.7155 | 0.7482 | 0.8536 | 0.7324 | 9.242034e-09 | 3970 |
| 0.7222 | 0.7412 | 0.8536 | 0.7324 | 9.241667e-09 | 3971 |
| 0.7104 | 0.7412 | 0.8536 | 0.7324 | 9.2413e-09 | 3972 |
| 0.7107 | 0.7482 | 0.8537 | 0.7324 | 9.240933e-09 | 3973 |
| 0.7131 | 0.7435 | 0.8537 | 0.7324 | 9.2405665e-09 | 3974 |
| 0.7063 | 0.7388 | 0.8537 | 0.7324 | 9.240199e-09 | 3975 |
| 0.7072 | 0.7553 | 0.8537 | 0.7324 | 9.239831e-09 | 3976 |
| 0.7079 | 0.7388 | 0.8537 | 0.7324 | 9.239463e-09 | 3977 |
| 0.7084 | 0.7412 | 0.8537 | 0.7324 | 9.239096e-09 | 3978 |
| 0.7126 | 0.7388 | 0.8537 | 0.7324 | 9.238728e-09 | 3979 |
| 0.7033 | 0.7482 | 0.8536 | 0.7324 | 9.23836e-09 | 3980 |
| 0.7035 | 0.7482 | 0.8537 | 0.7324 | 9.237993e-09 | 3981 |
| 0.7087 | 0.7553 | 0.8537 | 0.7324 | 9.237625e-09 | 3982 |
| 0.7029 | 0.7412 | 0.8537 | 0.7324 | 9.237257e-09 | 3983 |
| 0.7127 | 0.7412 | 0.8537 | 0.7324 | 9.2368895e-09 | 3984 |
| 0.7112 | 0.7294 | 0.8537 | 0.7324 | 9.236522e-09 | 3985 |
| 0.7030 | 0.7600 | 0.8537 | 0.7324 | 9.236153e-09 | 3986 |
| 0.7078 | 0.7506 | 0.8537 | 0.7324 | 9.235785e-09 | 3987 |
| 0.7270 | 0.7459 | 0.8537 | 0.7324 | 9.235416e-09 | 3988 |
| 0.7012 | 0.7341 | 0.8537 | 0.7324 | 9.235047e-09 | 3989 |
| 0.7110 | 0.7459 | 0.8537 | 0.7324 | 9.234679e-09 | 3990 |
| 0.7204 | 0.7459 | 0.8537 | 0.7324 | 9.23431e-09 | 3991 |
| 0.7002 | 0.7506 | 0.8537 | 0.7324 | 9.233942e-09 | 3992 |
| 0.7061 | 0.7412 | 0.8537 | 0.7324 | 9.233573e-09 | 3993 |
| 0.7043 | 0.7388 | 0.8537 | 0.7324 | 9.233204e-09 | 3994 |
| 0.7127 | 0.7435 | 0.8537 | 0.7324 | 9.232836e-09 | 3995 |
| 0.7096 | 0.7576 | 0.8537 | 0.7324 | 9.232467e-09 | 3996 |
| 0.7124 | 0.7529 | 0.8537 | 0.7324 | 9.232099e-09 | 3997 |
| 0.6943 | 0.7576 | 0.8537 | 0.7324 | 9.231729e-09 | 3998 |
| 0.6953 | 0.7459 | 0.8537 | 0.7324 | 9.23136e-09 | 3999 |
### Framework versions
- Transformers 4.29.0.dev0
- TensorFlow 2.9.1
- Datasets 2.8.0
- Tokenizers 0.13.2
| 385,394 | [
[
-0.05255126953125,
-0.033416748046875,
0.0264892578125,
0.0074462890625,
-0.0014696121215820312,
0.00015211105346679688,
0.0005655288696289062,
-0.01073455810546875,
0.05322265625,
0.0212554931640625,
-0.040771484375,
-0.043548583984375,
-0.040802001953125,
... |
Sleoruiz/roberta-bne-fine-tuned-text-classification-SL-1200samples | 2023-05-08T01:10:02.000Z | [
"transformers",
"pytorch",
"roberta",
"text-classification",
"generated_from_trainer",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | text-classification | Sleoruiz | null | null | Sleoruiz/roberta-bne-fine-tuned-text-classification-SL-1200samples | 0 | 2 | transformers | 2023-05-07T23:23:42 | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- f1
- recall
- accuracy
- precision
model-index:
- name: roberta-bne-fine-tuned-text-classification-SL-1200samples
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# roberta-bne-fine-tuned-text-classification-SL-1200samples
This model is a fine-tuned version of [PlanTL-GOB-ES/roberta-base-bne](https://huggingface.co/PlanTL-GOB-ES/roberta-base-bne) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 2.5536
- F1: 0.4587
- Recall: 0.4697
- Accuracy: 0.4697
- Precision: 0.4773
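The Recall and Accuracy values above coincide (0.4697), which suggests recall was micro- or support-weighted averaged — in either case it reduces to correct predictions over total predictions, i.e. accuracy. A toy check of that identity (the labels below are illustrative, not this model's outputs):

```python
def accuracy(y_true, y_pred):
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def micro_recall(y_true, y_pred):
    # Micro-averaging pools TPs and FNs over all classes: every correct
    # prediction is a TP, and every miss is a FN for its true class.
    tp = sum(t == p for t, p in zip(y_true, y_pred))
    fn = sum(t != p for t, p in zip(y_true, y_pred))
    return tp / (tp + fn)

y_true = [0, 1, 2, 2, 1, 0, 2]
y_pred = [0, 2, 2, 1, 1, 0, 2]
print(accuracy(y_true, y_pred) == micro_recall(y_true, y_pred))  # True
```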
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | F1 | Recall | Accuracy | Precision |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:--------:|:---------:|
| 2.3608 | 1.0 | 1503 | 2.2771 | 0.3955 | 0.4385 | 0.4385 | 0.4415 |
| 1.9673 | 2.0 | 3006 | 2.0774 | 0.4439 | 0.4769 | 0.4769 | 0.4716 |
| 1.5479 | 3.0 | 4509 | 2.1167 | 0.4567 | 0.4767 | 0.4767 | 0.4719 |
| 1.0917 | 4.0 | 6012 | 2.4366 | 0.4512 | 0.4451 | 0.4451 | 0.4902 |
| 0.8063 | 5.0 | 7515 | 2.5536 | 0.4587 | 0.4697 | 0.4697 | 0.4773 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 2,000 | [
[
-0.032440185546875,
-0.0400390625,
0.0093841552734375,
-0.0010919570922851562,
-0.0173492431640625,
-0.026336669921875,
-0.0115509033203125,
-0.02154541015625,
0.016876220703125,
0.0264892578125,
-0.046295166015625,
-0.056182861328125,
-0.05535888671875,
-0.... |
Falguni/dqn-SpaceInvadersNoFrameskip-v4 | 2023-05-08T00:53:32.000Z | [
"stable-baselines3",
"SpaceInvadersNoFrameskip-v4",
"deep-reinforcement-learning",
"reinforcement-learning",
"model-index",
"region:us"
] | reinforcement-learning | Falguni | null | null | Falguni/dqn-SpaceInvadersNoFrameskip-v4 | 0 | 2 | stable-baselines3 | 2023-05-08T00:53:00 | ---
library_name: stable-baselines3
tags:
- SpaceInvadersNoFrameskip-v4
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: DQN
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: SpaceInvadersNoFrameskip-v4
type: SpaceInvadersNoFrameskip-v4
metrics:
- type: mean_reward
value: 480.00 +/- 132.78
name: mean_reward
verified: false
---
# **DQN** Agent playing **SpaceInvadersNoFrameskip-v4**
This is a trained model of a **DQN** agent playing **SpaceInvadersNoFrameskip-v4**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3)
and the [RL Zoo](https://github.com/DLR-RM/rl-baselines3-zoo).
The RL Zoo is a training framework for Stable Baselines3
reinforcement learning agents,
with hyperparameter optimization and pre-trained agents included.
## Usage (with SB3 RL Zoo)
RL Zoo: https://github.com/DLR-RM/rl-baselines3-zoo<br/>
SB3: https://github.com/DLR-RM/stable-baselines3<br/>
SB3 Contrib: https://github.com/Stable-Baselines-Team/stable-baselines3-contrib
Install the RL Zoo (with SB3 and SB3-Contrib):
```bash
pip install rl_zoo3
```
```bash
# Download model and save it into the logs/ folder
python -m rl_zoo3.load_from_hub --algo dqn --env SpaceInvadersNoFrameskip-v4 -orga Falguni -f logs/
python -m rl_zoo3.enjoy --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/
```
If you installed the RL Zoo3 via pip (`pip install rl_zoo3`), from anywhere you can do:
```bash
python -m rl_zoo3.load_from_hub --algo dqn --env SpaceInvadersNoFrameskip-v4 -orga Falguni -f logs/
python -m rl_zoo3.enjoy --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/
```
## Training (with the RL Zoo)
```bash
python -m rl_zoo3.train --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/
# Upload the model and generate video (when possible)
python -m rl_zoo3.push_to_hub --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/ -orga Falguni
```
## Hyperparameters
```python
OrderedDict([('batch_size', 32),
('buffer_size', 100000),
('env_wrapper',
['stable_baselines3.common.atari_wrappers.AtariWrapper']),
('exploration_final_eps', 0.01),
('exploration_fraction', 0.1),
('frame_stack', 4),
('gradient_steps', 1),
('learning_rate', 0.0001),
('learning_starts', 100000),
('n_timesteps', 1000000.0),
('optimize_memory_usage', False),
('policy', 'CnnPolicy'),
('target_update_interval', 1000),
('train_freq', 4),
('normalize', False)])
```
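With these settings, SB3's DQN uses a linear ε-greedy schedule: assuming the default `exploration_initial_eps` of 1.0, ε decays from 1.0 to `exploration_final_eps` (0.01) over the first `exploration_fraction` × `n_timesteps` = 100,000 steps, then stays constant. A plain-arithmetic sketch of that schedule (not SB3's own code):

```python
def epsilon(step, n_timesteps=1_000_000, fraction=0.1, final_eps=0.01, initial_eps=1.0):
    """Linear epsilon-greedy exploration schedule, as configured above."""
    progress = min(step / (fraction * n_timesteps), 1.0)
    return initial_eps + progress * (final_eps - initial_eps)

print(epsilon(0))        # 1.0: fully random at the start
print(epsilon(50_000))   # ~0.505: halfway through the decay window
print(epsilon(200_000))  # held near 0.01 after the first 100k steps
```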
| 2,688 | [
[
-0.0416259765625,
-0.0369873046875,
0.0207977294921875,
0.025115966796875,
-0.0102386474609375,
-0.0182342529296875,
0.01222991943359375,
-0.01314544677734375,
0.01337432861328125,
0.0245208740234375,
-0.0697021484375,
-0.0355224609375,
-0.027099609375,
-0.0... |
thuyentruong/dqn-SpaceInvadersNoFrameskip-v4 | 2023-05-08T03:48:50.000Z | [
"stable-baselines3",
"SpaceInvadersNoFrameskip-v4",
"deep-reinforcement-learning",
"reinforcement-learning",
"model-index",
"region:us"
] | reinforcement-learning | thuyentruong | null | null | thuyentruong/dqn-SpaceInvadersNoFrameskip-v4 | 0 | 2 | stable-baselines3 | 2023-05-08T01:04:35 | ---
library_name: stable-baselines3
tags:
- SpaceInvadersNoFrameskip-v4
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: DQN
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: SpaceInvadersNoFrameskip-v4
type: SpaceInvadersNoFrameskip-v4
metrics:
- type: mean_reward
value: 282.50 +/- 73.22
name: mean_reward
verified: false
---
# **DQN** Agent playing **SpaceInvadersNoFrameskip-v4**
This is a trained model of a **DQN** agent playing **SpaceInvadersNoFrameskip-v4**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3)
and the [RL Zoo](https://github.com/DLR-RM/rl-baselines3-zoo).
The RL Zoo is a training framework for Stable Baselines3
reinforcement learning agents,
with hyperparameter optimization and pre-trained agents included.
## Usage (with SB3 RL Zoo)
RL Zoo: https://github.com/DLR-RM/rl-baselines3-zoo<br/>
SB3: https://github.com/DLR-RM/stable-baselines3<br/>
SB3 Contrib: https://github.com/Stable-Baselines-Team/stable-baselines3-contrib
Install the RL Zoo (with SB3 and SB3-Contrib):
```bash
pip install rl_zoo3
```
```bash
# Download model and save it into the logs/ folder
python -m rl_zoo3.load_from_hub --algo dqn --env SpaceInvadersNoFrameskip-v4 -orga thuyentruong -f logs/
python -m rl_zoo3.enjoy --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/
```
If you installed the RL Zoo3 via pip (`pip install rl_zoo3`), from anywhere you can do:
```
python -m rl_zoo3.load_from_hub --algo dqn --env SpaceInvadersNoFrameskip-v4 -orga thuyentruong -f logs/
python -m rl_zoo3.enjoy --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/
```
## Training (with the RL Zoo)
```
python -m rl_zoo3.train --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/
# Upload the model and generate video (when possible)
python -m rl_zoo3.push_to_hub --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/ -orga thuyentruong
```
## Hyperparameters
```python
OrderedDict([('batch_size', 32),
('buffer_size', 100000),
('env_wrapper',
['stable_baselines3.common.atari_wrappers.AtariWrapper']),
('exploration_final_eps', 0.01),
('exploration_fraction', 0.1),
('frame_stack', 4),
('gradient_steps', 1),
('learning_rate', 0.0001),
('learning_starts', 100000),
('n_timesteps', 1000000.0),
('optimize_memory_usage', False),
('policy', 'CnnPolicy'),
('target_update_interval', 1000),
('train_freq', 256),
('normalize', False)])
```
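The exploration settings above define a linear ε-greedy schedule: ε decays from an initial value (1.0 by SB3 default, assumed here) down to `exploration_final_eps` over the first `exploration_fraction` of training — the first 100,000 of 1,000,000 timesteps — and stays constant afterwards. A minimal sketch of that schedule, not the library implementation:

```python
def epsilon(step, total_steps=1_000_000, fraction=0.1,
            initial_eps=1.0, final_eps=0.01):
    """Linear epsilon-greedy schedule matching the hyperparameters above."""
    # Fraction of the decay window completed, clamped to [0, 1]
    progress = min(step / (fraction * total_steps), 1.0)
    return initial_eps + progress * (final_eps - initial_eps)
```

After step 100,000 the agent acts greedily 99% of the time for the rest of training.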
| 2,704 | [
[
-0.041290283203125,
-0.036407470703125,
0.0211334228515625,
0.0247344970703125,
-0.00970458984375,
-0.017059326171875,
0.012725830078125,
-0.01457977294921875,
0.013519287109375,
0.02459716796875,
-0.07037353515625,
-0.034088134765625,
-0.026947021484375,
-0... |
AntoineBlanot/roberta-large-seq-classif | 2023-05-08T02:37:06.000Z | [
"transformers",
"pytorch",
"roberta",
"text-classification",
"endpoints_compatible",
"region:us"
] | text-classification | AntoineBlanot | null | null | AntoineBlanot/roberta-large-seq-classif | 0 | 2 | transformers | 2023-05-08T02:29:08 | ---
{}
---
# roberta-large-3way
This is the checkpoint for [roberta-large](https://huggingface.co/roberta-large) after training on a variety of tasks drawn from several datasets.
The datasets were transformed into a binary setting: **non-entailment** and **entailment**.
It can be used directly as an NLI inference model or as a zero-shot classifier. | 363 | [
[
-0.0111846923828125,
-0.061767578125,
0.06011962890625,
0.00872039794921875,
-0.0162506103515625,
-0.0190887451171875,
0.00968170166015625,
-0.0243072509765625,
0.023223876953125,
0.057373046875,
-0.0712890625,
-0.037628173828125,
-0.05029296875,
0.013732910... |
chaninder/trashtacks-model-v1 | 2023-05-08T04:43:00.000Z | [
"keras",
"region:us"
] | null | chaninder | null | null | chaninder/trashtacks-model-v1 | 0 | 2 | keras | 2023-05-08T04:42:30 | ---
library_name: keras
---
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
| Hyperparameters | Value |
| :-- | :-- |
| name | Adam |
| learning_rate | 0.0010000000474974513 |
| decay | 0.0 |
| beta_1 | 0.8999999761581421 |
| beta_2 | 0.9990000128746033 |
| epsilon | 1e-07 |
| amsgrad | False |
| training_precision | float32 |
## Model Plot
<details>
<summary>View Model Plot</summary>

</details> | 658 | [
[
-0.034637451171875,
-0.0401611328125,
0.0255584716796875,
0.00649261474609375,
-0.041046142578125,
-0.0197601318359375,
0.01187896728515625,
-0.0110015869140625,
0.0156707763671875,
0.033538818359375,
-0.035552978515625,
-0.053741455078125,
-0.0428466796875,
... |
AskingAlex/exist-2023-task3 | 2023-05-08T09:07:25.000Z | [
"transformers",
"pytorch",
"bert",
"text-classification",
"generated_from_trainer",
"license:mit",
"endpoints_compatible",
"region:us"
] | text-classification | AskingAlex | null | null | AskingAlex/exist-2023-task3 | 0 | 2 | transformers | 2023-05-08T07:50:59 | ---
license: mit
tags:
- generated_from_trainer
model-index:
- name: exist-2023-task3
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# exist-2023-task3
This model is a fine-tuned version of [microsoft/Multilingual-MiniLM-L12-H384](https://huggingface.co/microsoft/Multilingual-MiniLM-L12-H384) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3814
- Acc: 47.9045
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
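With `lr_scheduler_type: linear` and no warmup listed, the learning rate decays linearly from 2e-05 to zero over the full run (97 optimizer steps per epoch × 20 epochs = 1940 steps, per the training results). A minimal sketch of that schedule, not the library implementation:

```python
def linear_lr(step, total_steps=1940, base_lr=2e-5):
    """Linear decay from base_lr to zero with no warmup."""
    return base_lr * max(0.0, 1.0 - step / total_steps)
```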
### Training results
| Training Loss | Epoch | Step | Validation Loss | Acc |
|:-------------:|:-----:|:----:|:---------------:|:-------:|
| No log | 1.0 | 97 | 0.4243 | 28.9107 |
| No log | 2.0 | 194 | 0.4149 | 31.6012 |
| No log | 3.0 | 291 | 0.4112 | 32.5061 |
| No log | 4.0 | 388 | 0.4111 | 32.6858 |
| No log | 5.0 | 485 | 0.4049 | 34.8687 |
| 0.416 | 6.0 | 582 | 0.4023 | 35.9895 |
| 0.416 | 7.0 | 679 | 0.4005 | 36.6499 |
| 0.416 | 8.0 | 776 | 0.3978 | 38.6035 |
| 0.416 | 9.0 | 873 | 0.3964 | 38.5017 |
| 0.416 | 10.0 | 970 | 0.3931 | 40.5065 |
| 0.4029 | 11.0 | 1067 | 0.3912 | 42.2190 |
| 0.4029 | 12.0 | 1164 | 0.3891 | 43.2468 |
| 0.4029 | 13.0 | 1261 | 0.3888 | 42.6855 |
| 0.4029 | 14.0 | 1358 | 0.3861 | 44.5341 |
| 0.4029 | 15.0 | 1455 | 0.3851 | 44.8797 |
| 0.3932 | 16.0 | 1552 | 0.3841 | 46.3287 |
| 0.3932 | 17.0 | 1649 | 0.3832 | 46.5887 |
| 0.3932 | 18.0 | 1746 | 0.3820 | 47.4830 |
| 0.3932 | 19.0 | 1843 | 0.3817 | 47.8015 |
| 0.3932 | 20.0 | 1940 | 0.3814 | 47.9045 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.1+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 2,492 | [
[
-0.03753662109375,
-0.034912109375,
0.00995635986328125,
0.0014429092407226562,
-0.008636474609375,
-0.01593017578125,
-0.001010894775390625,
-0.01473236083984375,
0.0238037109375,
0.0197296142578125,
-0.059112548828125,
-0.045257568359375,
-0.04290771484375,
... |
P3ps/test-trainer-glue-mrpc | 2023-05-08T10:39:45.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | P3ps | null | null | P3ps/test-trainer-glue-mrpc | 0 | 2 | transformers | 2023-05-08T10:32:39 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- accuracy
- f1
model-index:
- name: test-trainer-glue-mrpc
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: mrpc
split: validation
args: mrpc
metrics:
- name: Accuracy
type: accuracy
      value: 0.8627450980392157
- name: F1
type: f1
value: 0.902439024390244
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# test-trainer-glue-mrpc
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6850
- Accuracy: 0.8627
- F1: 0.9024
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1     |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| No log        | 1.0   | 459  | 0.3762          | 0.8456   | 0.8873 |
| 0.4903        | 2.0   | 918  | 0.5500          | 0.8431   | 0.8923 |
| 0.2654        | 3.0   | 1377 | 0.6850          | 0.8627   | 0.9024 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 2,069 | [
[
-0.035614013671875,
-0.046539306640625,
0.004772186279296875,
0.0117950439453125,
-0.0225982666015625,
-0.029327392578125,
-0.01444244384765625,
-0.018310546875,
0.0201873779296875,
0.015869140625,
-0.054595947265625,
-0.0391845703125,
-0.05267333984375,
-0.... |
IRI2070/dal-bert-finetuned-address-v1 | 2023-05-08T13:45:12.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"fill-mask",
"generated_from_trainer",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | fill-mask | IRI2070 | null | null | IRI2070/dal-bert-finetuned-address-v1 | 0 | 2 | transformers | 2023-05-08T10:37:43 | ---
license: apache-2.0
tags:
- generated_from_trainer
model-index:
- name: dal-bert-finetuned-address-v1
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# dal-bert-finetuned-address-v1
This model is a fine-tuned version of [sharif-dal/dal-bert](https://huggingface.co/sharif-dal/dal-bert) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.1825
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:-----:|:---------------:|
| 1.6281 | 1.0 | 5455 | 1.3490 |
| 1.32 | 2.0 | 10910 | 1.2199 |
| 1.2409 | 3.0 | 16365 | 1.1815 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,432 | [
[
-0.042999267578125,
-0.048919677734375,
0.013916015625,
0.01486968994140625,
-0.0234527587890625,
-0.0300445556640625,
-0.00763702392578125,
-0.02166748046875,
0.00321197509765625,
0.0305328369140625,
-0.06292724609375,
-0.043670654296875,
-0.04669189453125,
... |
marco-c88/gpt2-large-finetuned-mstatmem_1ep_gpt2_no_valid_austen | 2023-05-08T12:27:14.000Z | [
"transformers",
"pytorch",
"tensorboard",
"gpt2",
"text-generation",
"generated_from_trainer",
"license:mit",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | text-generation | marco-c88 | null | null | marco-c88/gpt2-large-finetuned-mstatmem_1ep_gpt2_no_valid_austen | 0 | 2 | transformers | 2023-05-08T12:08:10 | ---
license: mit
tags:
- generated_from_trainer
model-index:
- name: gpt2-large-finetuned-mstatmem_1ep_gpt2_no_valid_austen
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# gpt2-large-finetuned-mstatmem_1ep_gpt2_no_valid_austen
This model is a fine-tuned version of [gpt2-large](https://huggingface.co/gpt2-large) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 2.9654
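For a causal language model, the evaluation cross-entropy loss maps directly to perplexity via `exp(loss)`; the loss of 2.9654 reported above corresponds to a perplexity of roughly 19.4:

```python
import math

eval_loss = 2.9654               # evaluation cross-entropy (nats per token) from above
perplexity = math.exp(eval_loss)
print(round(perplexity, 1))      # roughly 19.4
```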
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 3.2869 | 1.0 | 939 | 2.9654 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,305 | [
[
-0.027435302734375,
-0.046112060546875,
0.0259857177734375,
0.006870269775390625,
-0.03302001953125,
-0.042999267578125,
-0.01593017578125,
-0.019775390625,
-0.01018524169921875,
0.0187530517578125,
-0.04632568359375,
-0.0323486328125,
-0.05389404296875,
-0.... |
wTao1215/autotrain-it-case-classify-56514130987 | 2023-05-08T14:12:55.000Z | [
"transformers",
"pytorch",
"bert",
"text-classification",
"autotrain",
"unk",
"dataset:wTao1215/autotrain-data-it-case-classify",
"co2_eq_emissions",
"endpoints_compatible",
"region:us"
] | text-classification | wTao1215 | null | null | wTao1215/autotrain-it-case-classify-56514130987 | 0 | 2 | transformers | 2023-05-08T14:12:12 | ---
tags:
- autotrain
- text-classification
language:
- unk
widget:
- text: "I love AutoTrain 🤗"
datasets:
- wTao1215/autotrain-data-it-case-classify
co2_eq_emissions:
emissions: 0.0206199757216604
---
# Model Trained Using AutoTrain
- Problem type: Multi-class Classification
- Model ID: 56514130987
- CO2 Emissions (in grams): 0.0206
## Validation Metrics
- Loss: 2.740
- Accuracy: 0.303
- Macro F1: 0.141
- Micro F1: 0.303
- Weighted F1: 0.210
- Macro Precision: 0.135
- Micro Precision: 0.303
- Weighted Precision: 0.188
- Macro Recall: 0.167
- Micro Recall: 0.303
- Weighted Recall: 0.303
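Since this is single-label multi-class classification, the micro-averaged F1 equals accuracy (0.303 in all three micro metrics above), while macro F1 averages per-class F1 scores and is pulled down by classes the model rarely predicts correctly. A small self-contained sketch of the difference, on toy labels rather than the validation data:

```python
from collections import Counter

def f1_scores(y_true, y_pred):
    """Return (macro_f1, micro_f1) for single-label multi-class predictions."""
    labels = sorted(set(y_true) | set(y_pred))
    tp, fp, fn = Counter(), Counter(), Counter()
    for t, p in zip(y_true, y_pred):
        if t == p:
            tp[t] += 1
        else:
            fp[p] += 1
            fn[t] += 1
    per_class = []
    for c in labels:
        prec = tp[c] / (tp[c] + fp[c]) if tp[c] + fp[c] else 0.0
        rec = tp[c] / (tp[c] + fn[c]) if tp[c] + fn[c] else 0.0
        per_class.append(2 * prec * rec / (prec + rec) if prec + rec else 0.0)
    macro = sum(per_class) / len(per_class)
    # Micro-averaged F1 pools all counts; for single-label multi-class
    # classification it reduces to plain accuracy.
    TP, FP, FN = sum(tp.values()), sum(fp.values()), sum(fn.values())
    micro = 2 * TP / (2 * TP + FP + FN)
    return macro, micro
```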
## Usage
You can use cURL to access this model:
```
$ curl -X POST -H "Authorization: Bearer YOUR_API_KEY" -H "Content-Type: application/json" -d '{"inputs": "I love AutoTrain"}' https://api-inference.huggingface.co/models/wTao1215/autotrain-it-case-classify-56514130987
```
Or Python API:
```
from transformers import AutoModelForSequenceClassification, AutoTokenizer
model = AutoModelForSequenceClassification.from_pretrained("wTao1215/autotrain-it-case-classify-56514130987", use_auth_token=True)
tokenizer = AutoTokenizer.from_pretrained("wTao1215/autotrain-it-case-classify-56514130987", use_auth_token=True)
inputs = tokenizer("I love AutoTrain", return_tensors="pt")
outputs = model(**inputs)
``` | 1,313 | [
[
-0.03167724609375,
-0.022613525390625,
0.01171112060546875,
0.010772705078125,
-0.002819061279296875,
0.003597259521484375,
0.0031280517578125,
-0.00983428955078125,
-0.0031223297119140625,
0.0071868896484375,
-0.04864501953125,
-0.0328369140625,
-0.056091308593... |
AbrahamSanders/opt-2.7b-realtime-chat-v2 | 2023-05-21T19:09:02.000Z | [
"transformers",
"pytorch",
"tensorboard",
"opt",
"text-generation",
"generated_from_trainer",
"license:other",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | AbrahamSanders | null | null | AbrahamSanders/opt-2.7b-realtime-chat-v2 | 0 | 2 | transformers | 2023-05-08T17:12:04 | ---
license: other
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: opt-2.7b-realtime-chat-v2
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# opt-2.7b-realtime-chat-v2
This model is a fine-tuned version of [facebook/opt-2.7b](https://huggingface.co/facebook/opt-2.7b) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 2.0888
- Accuracy: 0.6870
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- gradient_accumulation_steps: 128
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3.0
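With a per-device batch size of 1 and 128 gradient-accumulation steps, gradients from 128 single-example forward/backward passes are averaged before each optimizer update, giving the effective batch size of 128 listed above. A minimal sketch of the accumulation logic on plain numbers, not the actual training loop:

```python
def accumulate(grads, accumulation_steps=128):
    """Average per-example gradients in groups of `accumulation_steps`,
    yielding one optimizer-step gradient per group."""
    step_grads = []
    for i in range(0, len(grads), accumulation_steps):
        chunk = grads[i:i + accumulation_steps]
        step_grads.append(sum(chunk) / len(chunk))
    return step_grads
```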
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 2.0974 | 0.5 | 51 | 2.1267 | 0.6826 |
| 2.0842 | 1.0 | 102 | 2.0968 | 0.6859 |
| 1.9624 | 1.49 | 153 | 2.0936 | 0.6863 |
| 1.9476 | 1.99 | 204 | 2.0888 | 0.6870 |
| 1.888 | 2.49 | 255 | 2.0993 | 0.6864 |
| 1.8687 | 2.99 | 306 | 2.0994 | 0.6865 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.1+cu118
- Datasets 2.7.1
- Tokenizers 0.12.1
| 1,746 | [
[
-0.025390625,
-0.058868408203125,
0.00716400146484375,
0.0187835693359375,
-0.02056884765625,
-0.00992584228515625,
-0.007366180419921875,
-0.033355712890625,
0.016265869140625,
0.0212860107421875,
-0.06903076171875,
-0.035675048828125,
-0.040069580078125,
-... |
guoluo/Bert_class_1e-07 | 2023-05-08T18:11:18.000Z | [
"transformers",
"tf",
"bert",
"text-classification",
"generated_from_keras_callback",
"endpoints_compatible",
"region:us"
] | text-classification | guoluo | null | null | guoluo/Bert_class_1e-07 | 0 | 2 | transformers | 2023-05-08T18:10:28 | ---
tags:
- generated_from_keras_callback
model-index:
- name: Bert_class_1e-07
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# Bert_class_1e-07
This model is a fine-tuned version of [guoluo/Bert_1.5e_07](https://huggingface.co/guoluo/Bert_1.5e_07) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0102
- Train Accuracy: 1.0
- Validation Loss: 1.7238
- Validation Accuracy: 0.6972
- Train Lr: 4.4946695e-08
- Epoch: 3999
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'Adam', 'learning_rate': 4.4946695e-08, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Validation Loss | Validation Accuracy | Train Lr | Epoch |
|:----------:|:--------------:|:---------------:|:-------------------:|:-------------:|:-----:|
| 1.4508 | 0.1647 | 1.4468 | 0.1408 | 1e-07 | 0 |
| 1.4039 | 0.1953 | 1.3961 | 0.1901 | 9.9999994e-08 | 1 |
| 1.3625 | 0.2612 | 1.3495 | 0.2817 | 9.999997e-08 | 2 |
| 1.3186 | 0.3788 | 1.3070 | 0.4930 | 9.9999944e-08 | 3 |
| 1.2774 | 0.5129 | 1.2667 | 0.6197 | 9.99999e-08 | 4 |
| 1.2385 | 0.5976 | 1.2318 | 0.6549 | 9.999985e-08 | 5 |
| 1.2131 | 0.6329 | 1.2005 | 0.6761 | 9.9999795e-08 | 6 |
| 1.1789 | 0.6565 | 1.1730 | 0.6761 | 9.9999724e-08 | 7 |
| 1.1624 | 0.6753 | 1.1462 | 0.6761 | 9.9999646e-08 | 8 |
| 1.1323 | 0.6753 | 1.1232 | 0.6761 | 9.999955e-08 | 9 |
| 1.1121 | 0.6776 | 1.1041 | 0.6761 | 9.9999454e-08 | 10 |
| 1.0925 | 0.6776 | 1.0864 | 0.6761 | 9.999935e-08 | 11 |
| 1.0686 | 0.6776 | 1.0705 | 0.6761 | 9.999923e-08 | 12 |
| 1.0624 | 0.6776 | 1.0586 | 0.6761 | 9.99991e-08 | 13 |
| 1.0507 | 0.6776 | 1.0460 | 0.6761 | 9.999896e-08 | 14 |
| 1.0419 | 0.6776 | 1.0358 | 0.6761 | 9.999881e-08 | 15 |
| 1.0323 | 0.6776 | 1.0266 | 0.6761 | 9.9998644e-08 | 16 |
| 1.0233 | 0.6776 | 1.0185 | 0.6761 | 9.999847e-08 | 17 |
| 1.0176 | 0.6776 | 1.0113 | 0.6761 | 9.9998296e-08 | 18 |
| 1.0026 | 0.6776 | 1.0049 | 0.6761 | 9.9998104e-08 | 19 |
| 1.0017 | 0.6776 | 0.9997 | 0.6761 | 9.9997905e-08 | 20 |
| 0.9869 | 0.6776 | 0.9946 | 0.6761 | 9.999769e-08 | 21 |
| 0.9874 | 0.6776 | 0.9902 | 0.6761 | 9.999747e-08 | 22 |
| 0.9813 | 0.6776 | 0.9862 | 0.6761 | 9.9997244e-08 | 23 |
| 0.9751 | 0.6776 | 0.9827 | 0.6761 | 9.9997e-08 | 24 |
| 0.9752 | 0.6776 | 0.9799 | 0.6761 | 9.9996754e-08 | 25 |
| 0.9753 | 0.6776 | 0.9771 | 0.6761 | 9.999649e-08 | 26 |
| 0.9704 | 0.6776 | 0.9752 | 0.6761 | 9.999622e-08 | 27 |
| 0.9629 | 0.6776 | 0.9731 | 0.6761 | 9.9995944e-08 | 28 |
| 0.9688 | 0.6776 | 0.9716 | 0.6761 | 9.999565e-08 | 29 |
| 0.9558 | 0.6776 | 0.9698 | 0.6761 | 9.9995354e-08 | 30 |
| 0.9666 | 0.6776 | 0.9681 | 0.6761 | 9.999504e-08 | 31 |
| 0.9599 | 0.6776 | 0.9667 | 0.6761 | 9.999472e-08 | 32 |
| 0.9532 | 0.6776 | 0.9653 | 0.6761 | 9.9994395e-08 | 33 |
| 0.9484 | 0.6776 | 0.9640 | 0.6761 | 9.9994054e-08 | 34 |
| 0.9447 | 0.6776 | 0.9629 | 0.6761 | 9.9993706e-08 | 35 |
| 0.9481 | 0.6776 | 0.9619 | 0.6761 | 9.999334e-08 | 36 |
| 0.9440 | 0.6776 | 0.9609 | 0.6761 | 9.9992974e-08 | 37 |
| 0.9474 | 0.6776 | 0.9599 | 0.6761 | 9.99926e-08 | 38 |
| 0.9468 | 0.6776 | 0.9591 | 0.6761 | 9.999221e-08 | 39 |
| 0.9512 | 0.6776 | 0.9582 | 0.6761 | 9.999181e-08 | 40 |
| 0.9437 | 0.6776 | 0.9574 | 0.6761 | 9.99914e-08 | 41 |
| 0.9430 | 0.6776 | 0.9566 | 0.6761 | 9.999098e-08 | 42 |
| 0.9372 | 0.6776 | 0.9560 | 0.6761 | 9.9990544e-08 | 43 |
| 0.9351 | 0.6776 | 0.9552 | 0.6761 | 9.99901e-08 | 44 |
| 0.9323 | 0.6776 | 0.9545 | 0.6761 | 9.9989656e-08 | 45 |
| 0.9300 | 0.6776 | 0.9538 | 0.6761 | 9.9989194e-08 | 46 |
| 0.9310 | 0.6776 | 0.9532 | 0.6761 | 9.9988725e-08 | 47 |
| 0.9332 | 0.6776 | 0.9527 | 0.6761 | 9.998824e-08 | 48 |
| 0.9280 | 0.6776 | 0.9521 | 0.6761 | 9.998775e-08 | 49 |
| 0.9335 | 0.6776 | 0.9515 | 0.6761 | 9.9987254e-08 | 50 |
| 0.9278 | 0.6776 | 0.9509 | 0.6761 | 9.998674e-08 | 51 |
| 0.9259 | 0.6776 | 0.9503 | 0.6761 | 9.9986224e-08 | 52 |
| 0.9329 | 0.6776 | 0.9496 | 0.6761 | 9.998569e-08 | 53 |
| 0.9235 | 0.6776 | 0.9491 | 0.6761 | 9.998515e-08 | 54 |
| 0.9306 | 0.6776 | 0.9485 | 0.6761 | 9.9984604e-08 | 55 |
| 0.9229 | 0.6776 | 0.9480 | 0.6761 | 9.998404e-08 | 56 |
| 0.9215 | 0.6776 | 0.9475 | 0.6761 | 9.9983474e-08 | 57 |
| 0.9220 | 0.6776 | 0.9469 | 0.6761 | 9.998289e-08 | 58 |
| 0.9236 | 0.6776 | 0.9464 | 0.6761 | 9.99823e-08 | 59 |
| 0.9212 | 0.6776 | 0.9460 | 0.6761 | 9.9981705e-08 | 60 |
| 0.9134 | 0.6776 | 0.9454 | 0.6761 | 9.9981094e-08 | 61 |
| 0.9215 | 0.6776 | 0.9448 | 0.6761 | 9.9980475e-08 | 62 |
| 0.9192 | 0.6776 | 0.9442 | 0.6761 | 9.997984e-08 | 63 |
| 0.9167 | 0.6776 | 0.9439 | 0.6761 | 9.9979204e-08 | 64 |
| 0.9194 | 0.6776 | 0.9433 | 0.6761 | 9.997856e-08 | 65 |
| 0.9142 | 0.6776 | 0.9428 | 0.6761 | 9.9977896e-08 | 66 |
| 0.9135 | 0.6776 | 0.9423 | 0.6761 | 9.997723e-08 | 67 |
| 0.9058 | 0.6776 | 0.9419 | 0.6761 | 9.9976546e-08 | 68 |
| 0.9134 | 0.6776 | 0.9415 | 0.6761 | 9.997586e-08 | 69 |
| 0.9129 | 0.6776 | 0.9411 | 0.6761 | 9.997516e-08 | 70 |
| 0.9128 | 0.6776 | 0.9407 | 0.6761 | 9.997445e-08 | 71 |
| 0.9099 | 0.6776 | 0.9404 | 0.6761 | 9.997373e-08 | 72 |
| 0.9110 | 0.6776 | 0.9400 | 0.6761 | 9.9973e-08 | 73 |
| 0.8994 | 0.6776 | 0.9394 | 0.6761 | 9.997226e-08 | 74 |
| 0.9065 | 0.6776 | 0.9388 | 0.6761 | 9.997151e-08 | 75 |
| 0.9038 | 0.6776 | 0.9382 | 0.6761 | 9.997075e-08 | 76 |
| 0.9062 | 0.6776 | 0.9376 | 0.6761 | 9.996998e-08 | 77 |
| 0.9011 | 0.6776 | 0.9370 | 0.6761 | 9.99692e-08 | 78 |
| 0.9015 | 0.6776 | 0.9366 | 0.6761 | 9.996841e-08 | 79 |
| 0.8978 | 0.6776 | 0.9361 | 0.6761 | 9.996761e-08 | 80 |
| 0.9003 | 0.6776 | 0.9355 | 0.6761 | 9.99668e-08 | 81 |
| 0.9023 | 0.6776 | 0.9349 | 0.6761 | 9.996598e-08 | 82 |
| 0.9083 | 0.6776 | 0.9345 | 0.6761 | 9.996515e-08 | 83 |
| 0.8979 | 0.6776 | 0.9341 | 0.6761 | 9.996431e-08 | 84 |
| 0.8943 | 0.6776 | 0.9334 | 0.6761 | 9.996346e-08 | 85 |
| 0.8877 | 0.6776 | 0.9328 | 0.6761 | 9.99626e-08 | 86 |
| 0.8946 | 0.6776 | 0.9322 | 0.6761 | 9.996173e-08 | 87 |
| 0.8964 | 0.6776 | 0.9318 | 0.6761 | 9.996085e-08 | 88 |
| 0.8905 | 0.6776 | 0.9313 | 0.6761 | 9.995996e-08 | 89 |
| 0.8941 | 0.6776 | 0.9307 | 0.6761 | 9.995906e-08 | 90 |
| 0.8883 | 0.6776 | 0.9302 | 0.6761 | 9.995815e-08 | 91 |
| 0.8906 | 0.6776 | 0.9297 | 0.6761 | 9.9957234e-08 | 92 |
| 0.8901 | 0.6776 | 0.9291 | 0.6761 | 9.99563e-08 | 93 |
| 0.8811 | 0.6776 | 0.9287 | 0.6761 | 9.9955365e-08 | 94 |
| 0.8866 | 0.6800 | 0.9283 | 0.6761 | 9.995441e-08 | 95 |
| 0.8830 | 0.6800 | 0.9278 | 0.6761 | 9.995345e-08 | 96 |
| 0.8810 | 0.6800 | 0.9272 | 0.6761 | 9.995249e-08 | 97 |
| 0.8823 | 0.6776 | 0.9266 | 0.6761 | 9.995151e-08 | 98 |
| 0.8852 | 0.6776 | 0.9259 | 0.6761 | 9.995052e-08 | 99 |
| 0.8770 | 0.6776 | 0.9253 | 0.6761 | 9.994952e-08 | 100 |
| 0.8847 | 0.6800 | 0.9246 | 0.6761 | 9.994851e-08 | 101 |
| 0.8823 | 0.6776 | 0.9241 | 0.6761 | 9.994749e-08 | 102 |
| 0.8843 | 0.6776 | 0.9237 | 0.6761 | 9.994646e-08 | 103 |
| 0.8753 | 0.6800 | 0.9229 | 0.6761 | 9.9945424e-08 | 104 |
| 0.8781 | 0.6824 | 0.9224 | 0.6761 | 9.994437e-08 | 105 |
| 0.8729 | 0.6800 | 0.9221 | 0.6761 | 9.9943314e-08 | 106 |
| 0.8797 | 0.6776 | 0.9217 | 0.6761 | 9.994224e-08 | 107 |
| 0.8728 | 0.6776 | 0.9211 | 0.6761 | 9.994116e-08 | 108 |
| 0.8768 | 0.6776 | 0.9207 | 0.6761 | 9.9940074e-08 | 109 |
| 0.8686 | 0.6776 | 0.9204 | 0.6761 | 9.993897e-08 | 110 |
| 0.8737 | 0.6824 | 0.9197 | 0.6761 | 9.9937864e-08 | 111 |
| 0.8722 | 0.6776 | 0.9190 | 0.6761 | 9.993674e-08 | 112 |
| 0.8702 | 0.6800 | 0.9185 | 0.6761 | 9.993561e-08 | 113 |
| 0.8663 | 0.6776 | 0.9179 | 0.6761 | 9.9934475e-08 | 114 |
| 0.8674 | 0.6800 | 0.9175 | 0.6761 | 9.9933324e-08 | 115 |
| 0.8639 | 0.6800 | 0.9171 | 0.6761 | 9.9932166e-08 | 116 |
| 0.8687 | 0.6800 | 0.9165 | 0.6761 | 9.993099e-08 | 117 |
| 0.8636 | 0.6800 | 0.9159 | 0.6761 | 9.9929814e-08 | 118 |
| 0.8623 | 0.6824 | 0.9156 | 0.6761 | 9.992863e-08 | 119 |
| 0.8685 | 0.6800 | 0.9154 | 0.6761 | 9.9927426e-08 | 120 |
| 0.8619 | 0.6800 | 0.9148 | 0.6761 | 9.992622e-08 | 121 |
| 0.8645 | 0.6800 | 0.9143 | 0.6761 | 9.9924996e-08 | 122 |
| 0.8535 | 0.6800 | 0.9135 | 0.6761 | 9.992377e-08 | 123 |
| 0.8547 | 0.6824 | 0.9131 | 0.6761 | 9.992253e-08 | 124 |
| 0.8631 | 0.6824 | 0.9126 | 0.6761 | 9.992128e-08 | 125 |
| 0.8538 | 0.6824 | 0.9118 | 0.6761 | 9.992002e-08 | 126 |
| 0.8532 | 0.6800 | 0.9112 | 0.6761 | 9.991875e-08 | 127 |
| 0.8595 | 0.6847 | 0.9107 | 0.6761 | 9.991747e-08 | 128 |
| 0.8527 | 0.6800 | 0.9100 | 0.6761 | 9.9916186e-08 | 129 |
| 0.8518 | 0.6776 | 0.9095 | 0.6761 | 9.9914885e-08 | 130 |
| 0.8459 | 0.6800 | 0.9088 | 0.6761 | 9.991358e-08 | 131 |
| 0.8501 | 0.6847 | 0.9082 | 0.6761 | 9.9912256e-08 | 132 |
| 0.8385 | 0.6824 | 0.9077 | 0.6761 | 9.991093e-08 | 133 |
| 0.8455 | 0.6776 | 0.9072 | 0.6761 | 9.990959e-08 | 134 |
| 0.8504 | 0.6824 | 0.9064 | 0.6761 | 9.990824e-08 | 135 |
| 0.8367 | 0.6824 | 0.9057 | 0.6761 | 9.9906885e-08 | 136 |
| 0.8402 | 0.6871 | 0.9054 | 0.6761 | 9.990551e-08 | 137 |
| 0.8430 | 0.6824 | 0.9047 | 0.6761 | 9.9904135e-08 | 138 |
| 0.8416 | 0.6847 | 0.9042 | 0.6761 | 9.990275e-08 | 139 |
| 0.8371 | 0.6824 | 0.9035 | 0.6761 | 9.990135e-08 | 140 |
| 0.8411 | 0.6871 | 0.9029 | 0.6761 | 9.989994e-08 | 141 |
| 0.8430 | 0.6824 | 0.9023 | 0.6761 | 9.989852e-08 | 142 |
| 0.8304 | 0.6847 | 0.9016 | 0.6761 | 9.989709e-08 | 143 |
| 0.8276 | 0.6847 | 0.9010 | 0.6761 | 9.989566e-08 | 144 |
| 0.8342 | 0.6847 | 0.9005 | 0.6761 | 9.989421e-08 | 145 |
| 0.8314 | 0.6824 | 0.9000 | 0.6761 | 9.989275e-08 | 146 |
| 0.8338 | 0.6847 | 0.8994 | 0.6761 | 9.989128e-08 | 147 |
| 0.8327 | 0.6847 | 0.8990 | 0.6761 | 9.98898e-08 | 148 |
| 0.8327 | 0.6847 | 0.8984 | 0.6761 | 9.988832e-08 | 149 |
| 0.8322 | 0.6847 | 0.8978 | 0.6761 | 9.988682e-08 | 150 |
| 0.8231 | 0.6894 | 0.8971 | 0.6761 | 9.988531e-08 | 151 |
| 0.8240 | 0.6871 | 0.8967 | 0.6761 | 9.988379e-08 | 152 |
| 0.8270 | 0.6847 | 0.8962 | 0.6761 | 9.9882264e-08 | 153 |
| 0.8216 | 0.6894 | 0.8958 | 0.6761 | 9.988073e-08 | 154 |
| 0.8283 | 0.6847 | 0.8953 | 0.6761 | 9.987918e-08 | 155 |
| 0.8211 | 0.6871 | 0.8944 | 0.6761 | 9.9877624e-08 | 156 |
| 0.8297 | 0.6918 | 0.8942 | 0.6761 | 9.9876054e-08 | 157 |
| 0.8211 | 0.6894 | 0.8936 | 0.6761 | 9.987448e-08 | 158 |
| 0.8155 | 0.6871 | 0.8929 | 0.6761 | 9.987289e-08 | 159 |
| 0.8119 | 0.6918 | 0.8927 | 0.6761 | 9.987129e-08 | 160 |
| 0.8152 | 0.6918 | 0.8919 | 0.6761 | 9.986969e-08 | 161 |
| 0.8116 | 0.6941 | 0.8913 | 0.6761 | 9.986807e-08 | 162 |
| 0.8142 | 0.6847 | 0.8906 | 0.6761 | 9.986644e-08 | 163 |
| 0.8187 | 0.6918 | 0.8901 | 0.6761 | 9.9864806e-08 | 164 |
| 0.8054 | 0.6918 | 0.8894 | 0.6761 | 9.986316e-08 | 165 |
| 0.8195 | 0.6894 | 0.8890 | 0.6761 | 9.98615e-08 | 166 |
| 0.8124 | 0.6894 | 0.8884 | 0.6761 | 9.985983e-08 | 167 |
| 0.8099 | 0.6847 | 0.8878 | 0.6761 | 9.9858156e-08 | 168 |
| 0.8060 | 0.6847 | 0.8872 | 0.6761 | 9.9856464e-08 | 169 |
| 0.8052 | 0.6918 | 0.8867 | 0.6761 | 9.9854766e-08 | 170 |
| 0.8073 | 0.6894 | 0.8864 | 0.6761 | 9.985306e-08 | 171 |
| 0.8077 | 0.6894 | 0.8858 | 0.6761 | 9.985134e-08 | 172 |
| 0.8022 | 0.6918 | 0.8853 | 0.6761 | 9.9849615e-08 | 173 |
| 0.8017 | 0.6894 | 0.8850 | 0.6761 | 9.9847874e-08 | 174 |
| 0.8025 | 0.6871 | 0.8846 | 0.6761 | 9.9846126e-08 | 175 |
| 0.7963 | 0.6965 | 0.8841 | 0.6761 | 9.984437e-08 | 176 |
| 0.8057 | 0.6941 | 0.8834 | 0.6690 | 9.98426e-08 | 177 |
| 0.7980 | 0.6871 | 0.8830 | 0.6690 | 9.9840825e-08 | 178 |
| 0.7916 | 0.6965 | 0.8823 | 0.6690 | 9.9839035e-08 | 179 |
| 0.7986 | 0.6988 | 0.8819 | 0.6690 | 9.983724e-08 | 180 |
| 0.7940 | 0.6941 | 0.8814 | 0.6690 | 9.983543e-08 | 181 |
| 0.7916 | 0.7035 | 0.8809 | 0.6690 | 9.983361e-08 | 182 |
| 0.7955 | 0.6941 | 0.8804 | 0.6690 | 9.983179e-08 | 183 |
| 0.7826 | 0.6871 | 0.8800 | 0.6690 | 9.982995e-08 | 184 |
| 0.7890 | 0.6965 | 0.8796 | 0.6690 | 9.98281e-08 | 185 |
| 0.7806 | 0.6894 | 0.8790 | 0.6690 | 9.9826245e-08 | 186 |
| 0.7863 | 0.6988 | 0.8787 | 0.6690 | 9.9824376e-08 | 187 |
| 0.7858 | 0.6941 | 0.8782 | 0.6690 | 9.98225e-08 | 188 |
| 0.7882 | 0.6988 | 0.8778 | 0.6690 | 9.982061e-08 | 189 |
| 0.7893 | 0.7012 | 0.8773 | 0.6690 | 9.981871e-08 | 190 |
| 0.7867 | 0.7012 | 0.8769 | 0.6690 | 9.981681e-08 | 191 |
| 0.7854 | 0.6941 | 0.8763 | 0.6690 | 9.981489e-08 | 192 |
| 0.7790 | 0.6894 | 0.8757 | 0.6761 | 9.9812965e-08 | 193 |
| 0.7874 | 0.7129 | 0.8752 | 0.6761 | 9.9811025e-08 | 194 |
| 0.7837 | 0.7012 | 0.8748 | 0.6761 | 9.980908e-08 | 195 |
| 0.7807 | 0.7035 | 0.8742 | 0.6761 | 9.9807124e-08 | 196 |
| 0.7797 | 0.7012 | 0.8738 | 0.6761 | 9.9805156e-08 | 197 |
| 0.7833 | 0.7106 | 0.8735 | 0.6761 | 9.980318e-08 | 198 |
| 0.7762 | 0.6988 | 0.8729 | 0.6761 | 9.980119e-08 | 199 |
| 0.7678 | 0.6988 | 0.8725 | 0.6761 | 9.9799195e-08 | 200 |
| 0.7771 | 0.7012 | 0.8722 | 0.6761 | 9.979719e-08 | 201 |
| 0.7729 | 0.7059 | 0.8717 | 0.6761 | 9.979517e-08 | 202 |
| 0.7729 | 0.7035 | 0.8714 | 0.6761 | 9.979315e-08 | 203 |
| 0.7722 | 0.7012 | 0.8710 | 0.6761 | 9.979111e-08 | 204 |
| 0.7705 | 0.7035 | 0.8706 | 0.6761 | 9.978906e-08 | 205 |
| 0.7588 | 0.7082 | 0.8704 | 0.6761 | 9.978701e-08 | 206 |
| 0.7616 | 0.7153 | 0.8699 | 0.6761 | 9.978494e-08 | 207 |
| 0.7722 | 0.7059 | 0.8695 | 0.6761 | 9.9782866e-08 | 208 |
| 0.7729 | 0.6988 | 0.8692 | 0.6761 | 9.9780785e-08 | 209 |
| 0.7601 | 0.6988 | 0.8687 | 0.6761 | 9.977869e-08 | 210 |
| 0.7627 | 0.7153 | 0.8684 | 0.6901 | 9.9776585e-08 | 211 |
| 0.7708 | 0.7059 | 0.8680 | 0.6901 | 9.977447e-08 | 212 |
| 0.7554 | 0.7153 | 0.8677 | 0.6901 | 9.977234e-08 | 213 |
| 0.7584 | 0.7059 | 0.8673 | 0.6901 | 9.977021e-08 | 214 |
| 0.7575 | 0.7176 | 0.8669 | 0.6901 | 9.9768066e-08 | 215 |
| 0.7501 | 0.7153 | 0.8665 | 0.6901 | 9.976591e-08 | 216 |
| 0.7515 | 0.7129 | 0.8661 | 0.6901 | 9.9763746e-08 | 217 |
| 0.7647 | 0.7176 | 0.8658 | 0.6831 | 9.976157e-08 | 218 |
| 0.7605 | 0.7318 | 0.8654 | 0.6831 | 9.975939e-08 | 219 |
| 0.7572 | 0.7129 | 0.8651 | 0.6831 | 9.9757195e-08 | 220 |
| 0.7531 | 0.7153 | 0.8647 | 0.6831 | 9.975499e-08 | 221 |
| 0.7501 | 0.7200 | 0.8644 | 0.6831 | 9.9752775e-08 | 222 |
| 0.7514 | 0.7129 | 0.8640 | 0.6831 | 9.975055e-08 | 223 |
| 0.7427 | 0.7318 | 0.8637 | 0.6831 | 9.974832e-08 | 224 |
| 0.7493 | 0.7106 | 0.8633 | 0.6831 | 9.9746075e-08 | 225 |
| 0.7533 | 0.7129 | 0.8628 | 0.6831 | 9.974382e-08 | 226 |
| 0.7429 | 0.7153 | 0.8625 | 0.6831 | 9.9741555e-08 | 227 |
| 0.7452 | 0.7294 | 0.8620 | 0.6831 | 9.973928e-08 | 228 |
| 0.7398 | 0.7200 | 0.8618 | 0.6901 | 9.9737e-08 | 229 |
| 0.7365 | 0.7271 | 0.8618 | 0.6972 | 9.9734706e-08 | 230 |
| 0.7439 | 0.7176 | 0.8614 | 0.6972 | 9.9732404e-08 | 231 |
| 0.7409 | 0.7271 | 0.8609 | 0.6972 | 9.973009e-08 | 232 |
| 0.7357 | 0.7271 | 0.8606 | 0.6901 | 9.9727764e-08 | 233 |
| 0.7455 | 0.7247 | 0.8602 | 0.6972 | 9.972543e-08 | 234 |
| 0.7384 | 0.7318 | 0.8598 | 0.6972 | 9.972309e-08 | 235 |
| 0.7438 | 0.7224 | 0.8595 | 0.6972 | 9.972074e-08 | 236 |
| 0.7346 | 0.7271 | 0.8592 | 0.6972 | 9.971837e-08 | 237 |
| 0.7324 | 0.7294 | 0.8588 | 0.6972 | 9.9716e-08 | 238 |
| 0.7358 | 0.7271 | 0.8585 | 0.6901 | 9.971362e-08 | 239 |
| 0.7464 | 0.7200 | 0.8583 | 0.6901 | 9.971122e-08 | 240 |
| 0.7282 | 0.7365 | 0.8580 | 0.6901 | 9.970882e-08 | 241 |
| 0.7292 | 0.7224 | 0.8577 | 0.6901 | 9.9706405e-08 | 242 |
| 0.7377 | 0.7294 | 0.8574 | 0.6901 | 9.970398e-08 | 243 |
| 0.7248 | 0.7412 | 0.8569 | 0.6901 | 9.970155e-08 | 244 |
| 0.7262 | 0.7365 | 0.8565 | 0.7042 | 9.969911e-08 | 245 |
| 0.7229 | 0.7200 | 0.8560 | 0.6972 | 9.9696656e-08 | 246 |
| 0.7181 | 0.7341 | 0.8557 | 0.6972 | 9.969419e-08 | 247 |
| 0.7273 | 0.7341 | 0.8554 | 0.7113 | 9.969172e-08 | 248 |
| 0.7272 | 0.7412 | 0.8550 | 0.7113 | 9.968924e-08 | 249 |
| 0.7245 | 0.7388 | 0.8547 | 0.7042 | 9.9686744e-08 | 250 |
| 0.7307 | 0.7271 | 0.8543 | 0.7113 | 9.968424e-08 | 251 |
| 0.7147 | 0.7388 | 0.8541 | 0.7113 | 9.968173e-08 | 252 |
| 0.7275 | 0.7435 | 0.8539 | 0.7183 | 9.9679205e-08 | 253 |
| 0.7246 | 0.7341 | 0.8538 | 0.7183 | 9.9676676e-08 | 254 |
| 0.7178 | 0.7412 | 0.8532 | 0.7183 | 9.967413e-08 | 255 |
| 0.7236 | 0.7365 | 0.8528 | 0.7183 | 9.967158e-08 | 256 |
| 0.7230 | 0.7365 | 0.8524 | 0.7183 | 9.966902e-08 | 257 |
| 0.7262 | 0.7294 | 0.8518 | 0.7183 | 9.966645e-08 | 258 |
| 0.7197 | 0.7365 | 0.8516 | 0.7183 | 9.966387e-08 | 259 |
| 0.7114 | 0.7388 | 0.8516 | 0.7183 | 9.966128e-08 | 260 |
| 0.7203 | 0.7294 | 0.8513 | 0.7183 | 9.965868e-08 | 261 |
| 0.7127 | 0.7506 | 0.8509 | 0.7183 | 9.965607e-08 | 262 |
| 0.7184 | 0.7294 | 0.8507 | 0.7183 | 9.965345e-08 | 263 |
| 0.7090 | 0.7529 | 0.8505 | 0.7183 | 9.965082e-08 | 264 |
| 0.7010 | 0.7388 | 0.8501 | 0.7183 | 9.9648176e-08 | 265 |
| 0.7103 | 0.7506 | 0.8497 | 0.7183 | 9.9645526e-08 | 266 |
| 0.7133 | 0.7435 | 0.8495 | 0.7183 | 9.964287e-08 | 267 |
| 0.7045 | 0.7576 | 0.8490 | 0.7183 | 9.96402e-08 | 268 |
| 0.7045 | 0.7318 | 0.8487 | 0.7183 | 9.963752e-08 | 269 |
| 0.7072 | 0.7271 | 0.8485 | 0.7183 | 9.9634825e-08 | 270 |
| 0.7033 | 0.7459 | 0.8483 | 0.7183 | 9.9632125e-08 | 271 |
| 0.7050 | 0.7553 | 0.8480 | 0.7183 | 9.962942e-08 | 272 |
| 0.7084 | 0.7388 | 0.8476 | 0.7183 | 9.9626696e-08 | 273 |
| 0.7123 | 0.7435 | 0.8476 | 0.7183 | 9.962397e-08 | 274 |
| 0.7054 | 0.7576 | 0.8480 | 0.7183 | 9.9621225e-08 | 275 |
| 0.6990 | 0.7459 | 0.8474 | 0.7254 | 9.9618475e-08 | 276 |
| 0.6995 | 0.7435 | 0.8472 | 0.7254 | 9.961572e-08 | 277 |
| 0.6885 | 0.7553 | 0.8471 | 0.7254 | 9.961295e-08 | 278 |
| 0.6993 | 0.7506 | 0.8469 | 0.7183 | 9.961017e-08 | 279 |
| 0.7039 | 0.7600 | 0.8465 | 0.7183 | 9.960738e-08 | 280 |
| 0.6966 | 0.7506 | 0.8457 | 0.7183 | 9.960458e-08 | 281 |
| 0.6908 | 0.7671 | 0.8453 | 0.7183 | 9.960177e-08 | 282 |
| 0.7020 | 0.7459 | 0.8453 | 0.7183 | 9.959895e-08 | 283 |
| 0.7047 | 0.7224 | 0.8449 | 0.7183 | 9.959612e-08 | 284 |
| 0.6943 | 0.7388 | 0.8449 | 0.7183 | 9.959329e-08 | 285 |
| 0.6984 | 0.7553 | 0.8448 | 0.7183 | 9.959044e-08 | 286 |
| 0.6862 | 0.7553 | 0.8445 | 0.7183 | 9.958758e-08 | 287 |
| 0.6907 | 0.7506 | 0.8444 | 0.7183 | 9.958471e-08 | 288 |
| 0.7013 | 0.7365 | 0.8441 | 0.7183 | 9.958183e-08 | 289 |
| 0.6907 | 0.7459 | 0.8440 | 0.7113 | 9.957895e-08 | 290 |
| 0.6824 | 0.7647 | 0.8438 | 0.7113 | 9.957605e-08 | 291 |
| 0.6784 | 0.7506 | 0.8433 | 0.7183 | 9.957314e-08 | 292 |
| 0.6933 | 0.7553 | 0.8429 | 0.7183 | 9.957022e-08 | 293 |
| 0.6799 | 0.7506 | 0.8428 | 0.7183 | 9.9567295e-08 | 294 |
| 0.6886 | 0.7600 | 0.8430 | 0.7113 | 9.956436e-08 | 295 |
| 0.6766 | 0.7600 | 0.8428 | 0.7113 | 9.956141e-08 | 296 |
| 0.6825 | 0.7482 | 0.8427 | 0.7113 | 9.9558456e-08 | 297 |
| 0.6797 | 0.7529 | 0.8428 | 0.7113 | 9.9555486e-08 | 298 |
| 0.6800 | 0.7576 | 0.8431 | 0.7183 | 9.955251e-08 | 299 |
| 0.6791 | 0.7553 | 0.8424 | 0.7183 | 9.9549524e-08 | 300 |
| 0.6857 | 0.7482 | 0.8419 | 0.7113 | 9.9546526e-08 | 301 |
| 0.6802 | 0.7482 | 0.8420 | 0.7183 | 9.954352e-08 | 302 |
| 0.6684 | 0.7482 | 0.8418 | 0.7183 | 9.954051e-08 | 303 |
| 0.6822 | 0.7482 | 0.8413 | 0.7113 | 9.953748e-08 | 304 |
| 0.6771 | 0.7600 | 0.8411 | 0.7113 | 9.953445e-08 | 305 |
| 0.6775 | 0.7553 | 0.8408 | 0.7113 | 9.95314e-08 | 306 |
| 0.6808 | 0.7600 | 0.8406 | 0.7113 | 9.952834e-08 | 307 |
| 0.6794 | 0.7529 | 0.8406 | 0.7113 | 9.952528e-08 | 308 |
| 0.6684 | 0.7718 | 0.8407 | 0.7183 | 9.9522204e-08 | 309 |
| 0.6757 | 0.7671 | 0.8408 | 0.7183 | 9.951912e-08 | 310 |
| 0.6698 | 0.7529 | 0.8407 | 0.7183 | 9.951602e-08 | 311 |
| 0.6625 | 0.7600 | 0.8403 | 0.7183 | 9.951292e-08 | 312 |
| 0.6626 | 0.7624 | 0.8398 | 0.7183 | 9.9509805e-08 | 313 |
| 0.6691 | 0.7529 | 0.8401 | 0.7183 | 9.950668e-08 | 314 |
| 0.6706 | 0.7718 | 0.8403 | 0.7113 | 9.9503545e-08 | 315 |
| 0.6716 | 0.7624 | 0.8401 | 0.7113 | 9.95004e-08 | 316 |
| 0.6713 | 0.7576 | 0.8399 | 0.7113 | 9.949724e-08 | 317 |
| 0.6576 | 0.7506 | 0.8398 | 0.7113 | 9.949408e-08 | 318 |
| 0.6596 | 0.7576 | 0.8392 | 0.7113 | 9.9490904e-08 | 319 |
| 0.6537 | 0.7788 | 0.8391 | 0.7113 | 9.948772e-08 | 320 |
| 0.6604 | 0.7624 | 0.8392 | 0.7113 | 9.948453e-08 | 321 |
| 0.6736 | 0.7600 | 0.8390 | 0.7113 | 9.9481326e-08 | 322 |
| 0.6524 | 0.7765 | 0.8386 | 0.7113 | 9.9478115e-08 | 323 |
| 0.6555 | 0.7741 | 0.8388 | 0.7042 | 9.947489e-08 | 324 |
| 0.6543 | 0.7741 | 0.8394 | 0.7042 | 9.9471656e-08 | 325 |
| 0.6643 | 0.7600 | 0.8384 | 0.7042 | 9.9468416e-08 | 326 |
| 0.6537 | 0.7671 | 0.8383 | 0.7042 | 9.946516e-08 | 327 |
| 0.6601 | 0.7718 | 0.8380 | 0.7042 | 9.94619e-08 | 328 |
| 0.6618 | 0.7647 | 0.8378 | 0.7042 | 9.9458624e-08 | 329 |
| 0.6571 | 0.7553 | 0.8377 | 0.7042 | 9.945534e-08 | 330 |
| 0.6575 | 0.7624 | 0.8379 | 0.7042 | 9.945205e-08 | 331 |
| 0.6616 | 0.7741 | 0.8373 | 0.7042 | 9.944875e-08 | 332 |
| 0.6515 | 0.7576 | 0.8372 | 0.7042 | 9.944544e-08 | 333 |
| 0.6510 | 0.7859 | 0.8369 | 0.7042 | 9.944212e-08 | 334 |
| 0.6486 | 0.7624 | 0.8364 | 0.7042 | 9.9438786e-08 | 335 |
| 0.6542 | 0.7624 | 0.8361 | 0.7042 | 9.943545e-08 | 336 |
| 0.6462 | 0.7694 | 0.8360 | 0.7042 | 9.943209e-08 | 337 |
| 0.6562 | 0.7576 | 0.8366 | 0.7042 | 9.942873e-08 | 338 |
| 0.6482 | 0.7741 | 0.8366 | 0.7042 | 9.9425364e-08 | 339 |
| 0.6529 | 0.7741 | 0.8363 | 0.7042 | 9.942198e-08 | 340 |
| 0.6430 | 0.7647 | 0.8354 | 0.7042 | 9.941859e-08 | 341 |
| 0.6554 | 0.7671 | 0.8354 | 0.7042 | 9.941519e-08 | 342 |
| 0.6419 | 0.7694 | 0.8356 | 0.7042 | 9.941178e-08 | 343 |
| 0.6402 | 0.7647 | 0.8355 | 0.7042 | 9.940836e-08 | 344 |
| 0.6568 | 0.7647 | 0.8355 | 0.7042 | 9.940493e-08 | 345 |
| 0.6463 | 0.7671 | 0.8364 | 0.6972 | 9.940149e-08 | 346 |
| 0.6481 | 0.7647 | 0.8360 | 0.7042 | 9.9398044e-08 | 347 |
| 0.6414 | 0.7694 | 0.8363 | 0.6972 | 9.939458e-08 | 348 |
| 0.6439 | 0.7647 | 0.8362 | 0.6972 | 9.9391116e-08 | 349 |
| 0.6385 | 0.7835 | 0.8360 | 0.6972 | 9.9387634e-08 | 350 |
| 0.6433 | 0.7671 | 0.8363 | 0.6972 | 9.9384145e-08 | 351 |
| 0.6433 | 0.7718 | 0.8370 | 0.6972 | 9.938065e-08 | 352 |
| 0.6339 | 0.7812 | 0.8365 | 0.6972 | 9.937714e-08 | 353 |
| 0.6388 | 0.7718 | 0.8362 | 0.6972 | 9.937362e-08 | 354 |
| 0.6290 | 0.7882 | 0.8354 | 0.6972 | 9.93701e-08 | 355 |
| 0.6343 | 0.7718 | 0.8354 | 0.6972 | 9.936656e-08 | 356 |
| 0.6247 | 0.7741 | 0.8355 | 0.6972 | 9.9363014e-08 | 357 |
| 0.6323 | 0.7741 | 0.8350 | 0.7113 | 9.9359454e-08 | 358 |
| 0.6401 | 0.7718 | 0.8351 | 0.6972 | 9.935589e-08 | 359 |
| 0.6339 | 0.7741 | 0.8348 | 0.6972 | 9.935231e-08 | 360 |
| 0.6250 | 0.7741 | 0.8352 | 0.6972 | 9.9348725e-08 | 361 |
| 0.6288 | 0.7788 | 0.8352 | 0.6972 | 9.934513e-08 | 362 |
| 0.6255 | 0.7765 | 0.8346 | 0.6972 | 9.934152e-08 | 363 |
| 0.6246 | 0.7788 | 0.8343 | 0.6972 | 9.93379e-08 | 364 |
| 0.6267 | 0.7765 | 0.8349 | 0.6972 | 9.933428e-08 | 365 |
| 0.6260 | 0.7859 | 0.8359 | 0.6972 | 9.933064e-08 | 366 |
| 0.6259 | 0.7788 | 0.8350 | 0.6972 | 9.9326996e-08 | 367 |
| 0.6224 | 0.7835 | 0.8343 | 0.6972 | 9.9323344e-08 | 368 |
| 0.6251 | 0.7882 | 0.8342 | 0.6972 | 9.931968e-08 | 369 |
| 0.6258 | 0.7906 | 0.8348 | 0.6972 | 9.9316004e-08 | 370 |
| 0.6202 | 0.7812 | 0.8356 | 0.6972 | 9.931232e-08 | 371 |
| 0.6260 | 0.7765 | 0.8349 | 0.6972 | 9.930862e-08 | 372 |
| 0.6243 | 0.7765 | 0.8344 | 0.6972 | 9.930492e-08 | 373 |
| 0.6274 | 0.7788 | 0.8339 | 0.6972 | 9.9301204e-08 | 374 |
| 0.6138 | 0.7788 | 0.8340 | 0.6972 | 9.929748e-08 | 375 |
| 0.6146 | 0.7788 | 0.8340 | 0.6972 | 9.929375e-08 | 376 |
| 0.6163 | 0.7741 | 0.8338 | 0.6972 | 9.9290006e-08 | 377 |
| 0.6137 | 0.7788 | 0.8341 | 0.6901 | 9.9286254e-08 | 378 |
| 0.6191 | 0.7765 | 0.8346 | 0.6972 | 9.928249e-08 | 379 |
| 0.6184 | 0.7835 | 0.8342 | 0.6901 | 9.9278715e-08 | 380 |
| 0.6177 | 0.8024 | 0.8337 | 0.6901 | 9.9274935e-08 | 381 |
| 0.6233 | 0.7741 | 0.8333 | 0.6901 | 9.927114e-08 | 382 |
| 0.6168 | 0.7953 | 0.8332 | 0.6901 | 9.926734e-08 | 383 |
| 0.6084 | 0.7953 | 0.8331 | 0.6901 | 9.926353e-08 | 384 |
| 0.6162 | 0.7812 | 0.8328 | 0.6901 | 9.925971e-08 | 385 |
| 0.6226 | 0.7906 | 0.8327 | 0.7042 | 9.925588e-08 | 386 |
| 0.6151 | 0.7835 | 0.8321 | 0.6901 | 9.9252034e-08 | 387 |
| 0.6160 | 0.7765 | 0.8316 | 0.6901 | 9.924818e-08 | 388 |
| 0.6201 | 0.7859 | 0.8317 | 0.6901 | 9.9244325e-08 | 389 |
| 0.6161 | 0.7812 | 0.8318 | 0.6972 | 9.924045e-08 | 390 |
| 0.6107 | 0.7765 | 0.8315 | 0.6972 | 9.923657e-08 | 391 |
| 0.6141 | 0.7765 | 0.8316 | 0.7042 | 9.9232686e-08 | 392 |
| 0.6166 | 0.7835 | 0.8322 | 0.7113 | 9.9228785e-08 | 393 |
| 0.6043 | 0.7882 | 0.8314 | 0.7113 | 9.922488e-08 | 394 |
| 0.6064 | 0.7788 | 0.8325 | 0.7183 | 9.9220955e-08 | 395 |
| 0.6040 | 0.7835 | 0.8323 | 0.7183 | 9.9217026e-08 | 396 |
| 0.6046 | 0.7812 | 0.8325 | 0.7183 | 9.921309e-08 | 397 |
| 0.6007 | 0.8071 | 0.8324 | 0.7183 | 9.920914e-08 | 398 |
| 0.6078 | 0.7835 | 0.8309 | 0.7113 | 9.920518e-08 | 399 |
| 0.6051 | 0.7929 | 0.8306 | 0.7042 | 9.9201216e-08 | 400 |
| 0.5952 | 0.7812 | 0.8306 | 0.7183 | 9.919724e-08 | 401 |
| 0.5973 | 0.7929 | 0.8310 | 0.7183 | 9.919325e-08 | 402 |
| 0.6055 | 0.7929 | 0.8311 | 0.7183 | 9.918925e-08 | 403 |
| 0.5996 | 0.7906 | 0.8302 | 0.7042 | 9.918524e-08 | 404 |
| 0.5921 | 0.7953 | 0.8299 | 0.7042 | 9.918123e-08 | 405 |
| 0.6025 | 0.7953 | 0.8311 | 0.7254 | 9.91772e-08 | 406 |
| 0.6109 | 0.7835 | 0.8311 | 0.7254 | 9.9173164e-08 | 407 |
| 0.6025 | 0.7906 | 0.8311 | 0.7254 | 9.916912e-08 | 408 |
| 0.5965 | 0.7882 | 0.8311 | 0.7254 | 9.9165064e-08 | 409 |
| 0.5990 | 0.7835 | 0.8306 | 0.7254 | 9.9161e-08 | 410 |
| 0.5870 | 0.7906 | 0.8307 | 0.7254 | 9.915692e-08 | 411 |
| 0.5908 | 0.7906 | 0.8302 | 0.7254 | 9.9152835e-08 | 412 |
| 0.5990 | 0.7929 | 0.8307 | 0.7324 | 9.914874e-08 | 413 |
| 0.5885 | 0.7976 | 0.8303 | 0.7324 | 9.9144636e-08 | 414 |
| 0.5916 | 0.7976 | 0.8300 | 0.7254 | 9.914052e-08 | 415 |
| 0.5923 | 0.7882 | 0.8302 | 0.7324 | 9.91364e-08 | 416 |
| 0.6001 | 0.7788 | 0.8302 | 0.7324 | 9.9132265e-08 | 417 |
| 0.5871 | 0.7859 | 0.8300 | 0.7324 | 9.912812e-08 | 418 |
| 0.5939 | 0.7929 | 0.8303 | 0.7324 | 9.9123966e-08 | 419 |
| 0.5956 | 0.7976 | 0.8298 | 0.7183 | 9.91198e-08 | 420 |
| 0.5913 | 0.7835 | 0.8295 | 0.7183 | 9.911563e-08 | 421 |
| 0.5963 | 0.7859 | 0.8299 | 0.7254 | 9.9111446e-08 | 422 |
| 0.5967 | 0.7765 | 0.8295 | 0.7254 | 9.9107254e-08 | 423 |
| 0.5910 | 0.7741 | 0.8297 | 0.7254 | 9.9103055e-08 | 424 |
| 0.5875 | 0.7835 | 0.8295 | 0.7254 | 9.909884e-08 | 425 |
| 0.5872 | 0.7906 | 0.8299 | 0.7254 | 9.909462e-08 | 426 |
| 0.5876 | 0.7882 | 0.8296 | 0.7254 | 9.909039e-08 | 427 |
| 0.5791 | 0.7906 | 0.8297 | 0.7254 | 9.908615e-08 | 428 |
| 0.6050 | 0.7788 | 0.8287 | 0.7254 | 9.90819e-08 | 429 |
| 0.5830 | 0.7906 | 0.8287 | 0.7254 | 9.907764e-08 | 430 |
| 0.5901 | 0.7906 | 0.8287 | 0.7254 | 9.907337e-08 | 431 |
| 0.5885 | 0.8000 | 0.8294 | 0.7254 | 9.906909e-08 | 432 |
| 0.5826 | 0.7859 | 0.8297 | 0.7254 | 9.90648e-08 | 433 |
| 0.5680 | 0.7906 | 0.8307 | 0.7254 | 9.90605e-08 | 434 |
| 0.5878 | 0.7906 | 0.8298 | 0.7324 | 9.9056194e-08 | 435 |
| 0.5839 | 0.7976 | 0.8295 | 0.7254 | 9.9051874e-08 | 436 |
| 0.5836 | 0.7835 | 0.8291 | 0.7324 | 9.904755e-08 | 437 |
| 0.5877 | 0.7976 | 0.8291 | 0.7324 | 9.9043206e-08 | 438 |
| 0.5726 | 0.7953 | 0.8280 | 0.7324 | 9.903886e-08 | 439 |
| 0.5726 | 0.8000 | 0.8285 | 0.7254 | 9.90345e-08 | 440 |
| 0.5738 | 0.7929 | 0.8288 | 0.7254 | 9.903013e-08 | 441 |
| 0.5836 | 0.7929 | 0.8294 | 0.7254 | 9.9025755e-08 | 442 |
| 0.5769 | 0.7953 | 0.8292 | 0.7254 | 9.902137e-08 | 443 |
| 0.5747 | 0.7953 | 0.8288 | 0.7254 | 9.901697e-08 | 444 |
| 0.5700 | 0.7976 | 0.8290 | 0.7254 | 9.901257e-08 | 445 |
| 0.5756 | 0.8094 | 0.8289 | 0.7254 | 9.9008155e-08 | 446 |
| 0.5776 | 0.7976 | 0.8281 | 0.7324 | 9.900373e-08 | 447 |
| 0.5757 | 0.7835 | 0.8287 | 0.7324 | 9.8999294e-08 | 448 |
| 0.5735 | 0.8000 | 0.8288 | 0.7324 | 9.8994846e-08 | 449 |
| 0.5719 | 0.7929 | 0.8287 | 0.7324 | 9.899039e-08 | 450 |
| 0.5804 | 0.8000 | 0.8283 | 0.7324 | 9.898593e-08 | 451 |
| 0.5756 | 0.8000 | 0.8280 | 0.7324 | 9.898145e-08 | 452 |
| 0.5651 | 0.8024 | 0.8280 | 0.7324 | 9.897697e-08 | 453 |
| 0.5587 | 0.8000 | 0.8290 | 0.7324 | 9.897248e-08 | 454 |
| 0.5730 | 0.7976 | 0.8309 | 0.7254 | 9.896797e-08 | 455 |
| 0.5596 | 0.8094 | 0.8304 | 0.7254 | 9.896346e-08 | 456 |
| 0.5719 | 0.8094 | 0.8297 | 0.7254 | 9.895894e-08 | 457 |
| 0.5621 | 0.8000 | 0.8299 | 0.7254 | 9.895441e-08 | 458 |
| 0.5619 | 0.8000 | 0.8298 | 0.7254 | 9.894987e-08 | 459 |
| 0.5708 | 0.7882 | 0.8289 | 0.7254 | 9.8945314e-08 | 460 |
| 0.5629 | 0.7859 | 0.8281 | 0.7254 | 9.894075e-08 | 461 |
| 0.5627 | 0.8094 | 0.8292 | 0.7254 | 9.8936184e-08 | 462 |
| 0.5616 | 0.8071 | 0.8297 | 0.7254 | 9.89316e-08 | 463 |
| 0.5652 | 0.8024 | 0.8302 | 0.7254 | 9.892701e-08 | 464 |
| 0.5720 | 0.8000 | 0.8305 | 0.7254 | 9.892241e-08 | 465 |
| 0.5713 | 0.7906 | 0.8297 | 0.7254 | 9.89178e-08 | 466 |
| 0.5643 | 0.8024 | 0.8294 | 0.7254 | 9.891318e-08 | 467 |
| 0.5478 | 0.8141 | 0.8288 | 0.7254 | 9.890856e-08 | 468 |
| 0.5510 | 0.8071 | 0.8287 | 0.7254 | 9.890392e-08 | 469 |
| 0.5560 | 0.8071 | 0.8290 | 0.7254 | 9.889927e-08 | 470 |
| 0.5532 | 0.8141 | 0.8279 | 0.7254 | 9.889461e-08 | 471 |
| 0.5564 | 0.8094 | 0.8294 | 0.7254 | 9.888994e-08 | 472 |
| 0.5629 | 0.7953 | 0.8301 | 0.7254 | 9.8885266e-08 | 473 |
| 0.5590 | 0.7976 | 0.8301 | 0.7254 | 9.888058e-08 | 474 |
| 0.5504 | 0.8071 | 0.8288 | 0.7254 | 9.887588e-08 | 475 |
| 0.5650 | 0.8047 | 0.8283 | 0.7254 | 9.8871176e-08 | 476 |
| 0.5545 | 0.8024 | 0.8280 | 0.7254 | 9.886646e-08 | 477 |
| 0.5631 | 0.7929 | 0.8282 | 0.7254 | 9.886173e-08 | 478 |
| 0.5557 | 0.8024 | 0.8272 | 0.7254 | 9.8857e-08 | 479 |
| 0.5582 | 0.8071 | 0.8282 | 0.7254 | 9.8852254e-08 | 480 |
| 0.5461 | 0.8094 | 0.8285 | 0.7254 | 9.88475e-08 | 481 |
| 0.5453 | 0.8071 | 0.8291 | 0.7254 | 9.884273e-08 | 482 |
| 0.5453 | 0.8071 | 0.8296 | 0.7254 | 9.883796e-08 | 483 |
| 0.5530 | 0.7976 | 0.8297 | 0.7254 | 9.8833176e-08 | 484 |
| 0.5531 | 0.8165 | 0.8307 | 0.7254 | 9.882838e-08 | 485 |
| 0.5662 | 0.8094 | 0.8309 | 0.7254 | 9.882358e-08 | 486 |
| 0.5379 | 0.8071 | 0.8291 | 0.7254 | 9.881877e-08 | 487 |
| 0.5464 | 0.8000 | 0.8280 | 0.7254 | 9.881394e-08 | 488 |
| 0.5493 | 0.7976 | 0.8294 | 0.7254 | 9.880911e-08 | 489 |
| 0.5465 | 0.7976 | 0.8303 | 0.7254 | 9.880427e-08 | 490 |
| 0.5508 | 0.8118 | 0.8305 | 0.7254 | 9.879942e-08 | 491 |
| 0.5359 | 0.8165 | 0.8303 | 0.7254 | 9.879456e-08 | 492 |
| 0.5356 | 0.8141 | 0.8314 | 0.7254 | 9.878969e-08 | 493 |
| 0.5428 | 0.8071 | 0.8310 | 0.7254 | 9.878481e-08 | 494 |
| 0.5380 | 0.8188 | 0.8304 | 0.7254 | 9.877992e-08 | 495 |
| 0.5548 | 0.7953 | 0.8293 | 0.7254 | 9.877502e-08 | 496 |
| 0.5428 | 0.8000 | 0.8290 | 0.7254 | 9.877011e-08 | 497 |
| 0.5586 | 0.7906 | 0.8293 | 0.7254 | 9.876519e-08 | 498 |
| 0.5342 | 0.8024 | 0.8290 | 0.7254 | 9.876026e-08 | 499 |
| 0.5394 | 0.8141 | 0.8294 | 0.7254 | 9.875532e-08 | 500 |
| 0.5517 | 0.8000 | 0.8293 | 0.7254 | 9.875038e-08 | 501 |
| 0.5428 | 0.8024 | 0.8288 | 0.7254 | 9.874542e-08 | 502 |
| 0.5427 | 0.8094 | 0.8302 | 0.7254 | 9.874045e-08 | 503 |
| 0.5443 | 0.8000 | 0.8297 | 0.7254 | 9.873548e-08 | 504 |
| 0.5440 | 0.8000 | 0.8300 | 0.7254 | 9.873049e-08 | 505 |
| 0.5308 | 0.8165 | 0.8299 | 0.7254 | 9.8725494e-08 | 506 |
| 0.5451 | 0.8024 | 0.8286 | 0.7254 | 9.872049e-08 | 507 |
| 0.5446 | 0.8141 | 0.8287 | 0.7254 | 9.8715475e-08 | 508 |
| 0.5460 | 0.8118 | 0.8290 | 0.7254 | 9.871045e-08 | 509 |
| 0.5279 | 0.8165 | 0.8292 | 0.7254 | 9.870542e-08 | 510 |
| 0.5259 | 0.8094 | 0.8294 | 0.7254 | 9.8700376e-08 | 511 |
| 0.5224 | 0.8165 | 0.8297 | 0.7254 | 9.8695324e-08 | 512 |
| 0.5349 | 0.8000 | 0.8295 | 0.7254 | 9.869026e-08 | 513 |
| 0.5475 | 0.8094 | 0.8290 | 0.7254 | 9.8685184e-08 | 514 |
| 0.5435 | 0.7906 | 0.8293 | 0.7254 | 9.8680104e-08 | 515 |
| 0.5251 | 0.8306 | 0.8287 | 0.7254 | 9.867501e-08 | 516 |
| 0.5340 | 0.8141 | 0.8290 | 0.7254 | 9.866991e-08 | 517 |
| 0.5263 | 0.8000 | 0.8287 | 0.7254 | 9.86648e-08 | 518 |
| 0.5279 | 0.8235 | 0.8291 | 0.7254 | 9.8659676e-08 | 519 |
| 0.5363 | 0.8118 | 0.8292 | 0.7254 | 9.8654546e-08 | 520 |
| 0.5272 | 0.8071 | 0.8291 | 0.7254 | 9.864941e-08 | 521 |
| 0.5168 | 0.8141 | 0.8288 | 0.7254 | 9.864426e-08 | 522 |
| 0.5306 | 0.8118 | 0.8292 | 0.7254 | 9.86391e-08 | 523 |
| 0.5360 | 0.8071 | 0.8304 | 0.7254 | 9.863393e-08 | 524 |
| 0.5358 | 0.8141 | 0.8295 | 0.7254 | 9.862875e-08 | 525 |
| 0.5307 | 0.8118 | 0.8285 | 0.7254 | 9.8623566e-08 | 526 |
| 0.5272 | 0.8047 | 0.8289 | 0.7254 | 9.861837e-08 | 527 |
| 0.5349 | 0.8212 | 0.8293 | 0.7254 | 9.8613164e-08 | 528 |
| 0.5281 | 0.8118 | 0.8302 | 0.7254 | 9.860795e-08 | 529 |
| 0.5248 | 0.8024 | 0.8297 | 0.7254 | 9.8602726e-08 | 530 |
| 0.5296 | 0.8047 | 0.8303 | 0.7254 | 9.859749e-08 | 531 |
| 0.5337 | 0.8141 | 0.8307 | 0.7183 | 9.8592245e-08 | 532 |
| 0.5235 | 0.8212 | 0.8310 | 0.7183 | 9.858699e-08 | 533 |
| 0.5081 | 0.8165 | 0.8299 | 0.7254 | 9.858172e-08 | 534 |
| 0.5359 | 0.8024 | 0.8291 | 0.7254 | 9.857645e-08 | 535 |
| 0.5138 | 0.8118 | 0.8292 | 0.7254 | 9.8571164e-08 | 536 |
| 0.5239 | 0.8071 | 0.8292 | 0.7254 | 9.856587e-08 | 537 |
| 0.5142 | 0.8047 | 0.8299 | 0.7254 | 9.856057e-08 | 538 |
| 0.5290 | 0.8094 | 0.8294 | 0.7254 | 9.8555255e-08 | 539 |
| 0.5135 | 0.8141 | 0.8292 | 0.7254 | 9.854993e-08 | 540 |
| 0.5158 | 0.8141 | 0.8304 | 0.7254 | 9.8544604e-08 | 541 |
| 0.5086 | 0.8141 | 0.8302 | 0.7254 | 9.853926e-08 | 542 |
| 0.5305 | 0.8094 | 0.8309 | 0.7254 | 9.853391e-08 | 543 |
| 0.5179 | 0.8047 | 0.8310 | 0.7254 | 9.852855e-08 | 544 |
| 0.5171 | 0.8141 | 0.8314 | 0.7183 | 9.852318e-08 | 545 |
| 0.5053 | 0.8212 | 0.8313 | 0.7183 | 9.85178e-08 | 546 |
| 0.5223 | 0.8212 | 0.8314 | 0.7183 | 9.8512416e-08 | 547 |
| 0.5084 | 0.8141 | 0.8308 | 0.7254 | 9.8507016e-08 | 548 |
| 0.5072 | 0.8212 | 0.8313 | 0.7254 | 9.850161e-08 | 549 |
| 0.5174 | 0.8071 | 0.8301 | 0.7254 | 9.8496194e-08 | 550 |
| 0.5128 | 0.8188 | 0.8295 | 0.7254 | 9.8490766e-08 | 551 |
| 0.5044 | 0.8071 | 0.8313 | 0.7183 | 9.848533e-08 | 552 |
| 0.4974 | 0.8259 | 0.8311 | 0.7254 | 9.847989e-08 | 553 |
| 0.5189 | 0.8165 | 0.8314 | 0.7183 | 9.847443e-08 | 554 |
| 0.5161 | 0.8141 | 0.8314 | 0.7183 | 9.8468966e-08 | 555 |
| 0.4974 | 0.8141 | 0.8316 | 0.7183 | 9.8463495e-08 | 556 |
| 0.5077 | 0.8282 | 0.8315 | 0.7183 | 9.845801e-08 | 557 |
| 0.5084 | 0.8094 | 0.8331 | 0.7113 | 9.845252e-08 | 558 |
| 0.4988 | 0.8259 | 0.8331 | 0.7113 | 9.844701e-08 | 559 |
| 0.5178 | 0.8188 | 0.8330 | 0.7113 | 9.84415e-08 | 560 |
| 0.5063 | 0.8259 | 0.8318 | 0.7183 | 9.8435976e-08 | 561 |
| 0.5036 | 0.8165 | 0.8322 | 0.7183 | 9.843044e-08 | 562 |
| 0.5046 | 0.8259 | 0.8317 | 0.7183 | 9.84249e-08 | 563 |
| 0.5053 | 0.8165 | 0.8301 | 0.7254 | 9.841935e-08 | 564 |
| 0.4978 | 0.8118 | 0.8310 | 0.7254 | 9.8413786e-08 | 565 |
| 0.4986 | 0.8165 | 0.8316 | 0.7183 | 9.8408215e-08 | 566 |
| 0.4996 | 0.8259 | 0.8318 | 0.7183 | 9.840264e-08 | 567 |
| 0.5046 | 0.8212 | 0.8323 | 0.7042 | 9.8397045e-08 | 568 |
| 0.5058 | 0.8188 | 0.8321 | 0.7113 | 9.8391446e-08 | 569 |
| 0.4927 | 0.8188 | 0.8327 | 0.7042 | 9.838584e-08 | 570 |
| 0.4856 | 0.8306 | 0.8335 | 0.7113 | 9.838022e-08 | 571 |
| 0.4980 | 0.8306 | 0.8328 | 0.7042 | 9.837459e-08 | 572 |
| 0.4948 | 0.8235 | 0.8324 | 0.7042 | 9.836896e-08 | 573 |
| 0.4987 | 0.8188 | 0.8322 | 0.7113 | 9.836331e-08 | 574 |
| 0.4920 | 0.8306 | 0.8326 | 0.7113 | 9.835765e-08 | 575 |
| 0.5005 | 0.8235 | 0.8327 | 0.7113 | 9.835199e-08 | 576 |
| 0.4951 | 0.8235 | 0.8321 | 0.7113 | 9.834631e-08 | 577 |
| 0.5081 | 0.8235 | 0.8315 | 0.7113 | 9.834063e-08 | 578 |
| 0.4888 | 0.8235 | 0.8314 | 0.7113 | 9.833494e-08 | 579 |
| 0.4969 | 0.8165 | 0.8310 | 0.7113 | 9.832923e-08 | 580 |
| 0.5023 | 0.8165 | 0.8315 | 0.7113 | 9.832352e-08 | 581 |
| 0.4897 | 0.8306 | 0.8317 | 0.7113 | 9.83178e-08 | 582 |
| 0.4984 | 0.8188 | 0.8325 | 0.7183 | 9.8312064e-08 | 583 |
| 0.5020 | 0.8259 | 0.8326 | 0.7183 | 9.830632e-08 | 584 |
| 0.4950 | 0.8188 | 0.8337 | 0.7113 | 9.8300575e-08 | 585 |
| 0.5045 | 0.8188 | 0.8350 | 0.7042 | 9.829481e-08 | 586 |
| 0.4893 | 0.8212 | 0.8347 | 0.7042 | 9.828904e-08 | 587 |
| 0.4852 | 0.8165 | 0.8331 | 0.7183 | 9.8283266e-08 | 588 |
| 0.4781 | 0.8306 | 0.8328 | 0.7183 | 9.8277475e-08 | 589 |
| 0.4934 | 0.8165 | 0.8332 | 0.7113 | 9.827168e-08 | 590 |
| 0.4840 | 0.8094 | 0.8330 | 0.7183 | 9.826587e-08 | 591 |
| 0.4915 | 0.8306 | 0.8322 | 0.7183 | 9.826005e-08 | 592 |
| 0.4846 | 0.8329 | 0.8341 | 0.7042 | 9.8254226e-08 | 593 |
| 0.4825 | 0.8235 | 0.8343 | 0.7042 | 9.824839e-08 | 594 |
| 0.4826 | 0.8353 | 0.8352 | 0.7042 | 9.8242545e-08 | 595 |
| 0.4741 | 0.8376 | 0.8354 | 0.7042 | 9.823669e-08 | 596 |
| 0.4946 | 0.8212 | 0.8346 | 0.7042 | 9.823083e-08 | 597 |
| 0.4850 | 0.8282 | 0.8333 | 0.7113 | 9.822495e-08 | 598 |
| 0.4932 | 0.8235 | 0.8341 | 0.7042 | 9.821907e-08 | 599 |
| 0.4809 | 0.8259 | 0.8336 | 0.7113 | 9.821318e-08 | 600 |
| 0.4901 | 0.8235 | 0.8349 | 0.7042 | 9.820727e-08 | 601 |
| 0.4806 | 0.8259 | 0.8333 | 0.7113 | 9.820136e-08 | 602 |
| 0.4831 | 0.8282 | 0.8328 | 0.7113 | 9.819544e-08 | 603 |
| 0.4845 | 0.8235 | 0.8319 | 0.7042 | 9.818951e-08 | 604 |
| 0.4851 | 0.8235 | 0.8330 | 0.7113 | 9.818357e-08 | 605 |
| 0.4920 | 0.8188 | 0.8330 | 0.7113 | 9.817762e-08 | 606 |
| 0.4853 | 0.8376 | 0.8341 | 0.7113 | 9.817166e-08 | 607 |
| 0.4862 | 0.8212 | 0.8345 | 0.7113 | 9.816569e-08 | 608 |
| 0.4754 | 0.8400 | 0.8349 | 0.7113 | 9.815972e-08 | 609 |
| 0.4828 | 0.8188 | 0.8360 | 0.7042 | 9.815373e-08 | 610 |
| 0.4769 | 0.8329 | 0.8363 | 0.7042 | 9.814773e-08 | 611 |
| 0.4778 | 0.8329 | 0.8368 | 0.7042 | 9.8141726e-08 | 612 |
| 0.4709 | 0.8353 | 0.8366 | 0.7042 | 9.813571e-08 | 613 |
| 0.4735 | 0.8306 | 0.8378 | 0.7042 | 9.812968e-08 | 614 |
| 0.4682 | 0.8353 | 0.8379 | 0.7042 | 9.812365e-08 | 615 |
| 0.4767 | 0.8329 | 0.8365 | 0.7042 | 9.81176e-08 | 616 |
| 0.4774 | 0.8259 | 0.8363 | 0.7042 | 9.811155e-08 | 617 |
| 0.4668 | 0.8353 | 0.8363 | 0.7042 | 9.810549e-08 | 618 |
| 0.4607 | 0.8329 | 0.8365 | 0.7042 | 9.809941e-08 | 619 |
| 0.4601 | 0.8447 | 0.8370 | 0.7042 | 9.809333e-08 | 620 |
| 0.4801 | 0.8282 | 0.8362 | 0.7113 | 9.808724e-08 | 621 |
| 0.4694 | 0.8376 | 0.8349 | 0.7042 | 9.808114e-08 | 622 |
| 0.4862 | 0.8400 | 0.8352 | 0.7113 | 9.807503e-08 | 623 |
| 0.4802 | 0.8259 | 0.8349 | 0.7042 | 9.806891e-08 | 624 |
| 0.4902 | 0.8141 | 0.8355 | 0.7042 | 9.806278e-08 | 625 |
| 0.4697 | 0.8447 | 0.8378 | 0.7042 | 9.805664e-08 | 626 |
| 0.4583 | 0.8494 | 0.8382 | 0.7042 | 9.805049e-08 | 627 |
| 0.4711 | 0.8376 | 0.8371 | 0.7042 | 9.804433e-08 | 628 |
| 0.4596 | 0.8376 | 0.8368 | 0.7042 | 9.8038164e-08 | 629 |
| 0.4716 | 0.8306 | 0.8360 | 0.7113 | 9.803199e-08 | 630 |
| 0.4625 | 0.8400 | 0.8371 | 0.7042 | 9.80258e-08 | 631 |
| 0.4625 | 0.8259 | 0.8373 | 0.7042 | 9.8019605e-08 | 632 |
| 0.4678 | 0.8353 | 0.8372 | 0.7042 | 9.80134e-08 | 633 |
| 0.4554 | 0.8424 | 0.8375 | 0.7042 | 9.800719e-08 | 634 |
| 0.4602 | 0.8424 | 0.8368 | 0.7113 | 9.800097e-08 | 635 |
| 0.4754 | 0.8141 | 0.8362 | 0.7042 | 9.7994736e-08 | 636 |
| 0.4659 | 0.8282 | 0.8364 | 0.7113 | 9.79885e-08 | 637 |
| 0.4613 | 0.8259 | 0.8383 | 0.7042 | 9.7982245e-08 | 638 |
| 0.4642 | 0.8400 | 0.8379 | 0.7042 | 9.7975985e-08 | 639 |
| 0.4566 | 0.8306 | 0.8401 | 0.7042 | 9.796972e-08 | 640 |
| 0.4574 | 0.8282 | 0.8396 | 0.7042 | 9.796344e-08 | 641 |
| 0.4641 | 0.8353 | 0.8401 | 0.7042 | 9.795715e-08 | 642 |
| 0.4656 | 0.8235 | 0.8390 | 0.7042 | 9.795085e-08 | 643 |
| 0.4536 | 0.8282 | 0.8398 | 0.7042 | 9.794454e-08 | 644 |
| 0.4539 | 0.8400 | 0.8398 | 0.7042 | 9.7938226e-08 | 645 |
| 0.4553 | 0.8353 | 0.8402 | 0.7042 | 9.79319e-08 | 646 |
| 0.4639 | 0.8424 | 0.8405 | 0.7042 | 9.7925565e-08 | 647 |
| 0.4593 | 0.8424 | 0.8397 | 0.7042 | 9.791922e-08 | 648 |
| 0.4550 | 0.8471 | 0.8398 | 0.7042 | 9.791287e-08 | 649 |
| 0.4437 | 0.8471 | 0.8378 | 0.7042 | 9.79065e-08 | 650 |
| 0.4563 | 0.8494 | 0.8388 | 0.7042 | 9.790013e-08 | 651 |
| 0.4554 | 0.8376 | 0.8378 | 0.7042 | 9.7893746e-08 | 652 |
| 0.4592 | 0.8353 | 0.8392 | 0.7042 | 9.788735e-08 | 653 |
| 0.4589 | 0.8306 | 0.8395 | 0.7042 | 9.788095e-08 | 654 |
| 0.4574 | 0.8376 | 0.8395 | 0.7042 | 9.787454e-08 | 655 |
| 0.4632 | 0.8282 | 0.8404 | 0.6972 | 9.786812e-08 | 656 |
| 0.4576 | 0.8376 | 0.8405 | 0.6972 | 9.786169e-08 | 657 |
| 0.4461 | 0.8306 | 0.8403 | 0.7042 | 9.785525e-08 | 658 |
| 0.4552 | 0.8376 | 0.8402 | 0.7042 | 9.78488e-08 | 659 |
| 0.4497 | 0.8447 | 0.8408 | 0.7042 | 9.784234e-08 | 660 |
| 0.4513 | 0.8447 | 0.8404 | 0.7042 | 9.783587e-08 | 661 |
| 0.4519 | 0.8447 | 0.8403 | 0.7042 | 9.78294e-08 | 662 |
| 0.4727 | 0.8329 | 0.8405 | 0.7042 | 9.782291e-08 | 663 |
| 0.4550 | 0.8353 | 0.8428 | 0.7042 | 9.781642e-08 | 664 |
| 0.4558 | 0.8353 | 0.8429 | 0.7042 | 9.780992e-08 | 665 |
| 0.4412 | 0.8376 | 0.8443 | 0.7113 | 9.78034e-08 | 666 |
| 0.4488 | 0.8376 | 0.8418 | 0.6972 | 9.779688e-08 | 667 |
| 0.4579 | 0.8376 | 0.8421 | 0.7042 | 9.779035e-08 | 668 |
| 0.4394 | 0.8306 | 0.8425 | 0.6972 | 9.7783804e-08 | 669 |
| 0.4387 | 0.8494 | 0.8414 | 0.7042 | 9.777725e-08 | 670 |
| 0.4549 | 0.8329 | 0.8417 | 0.7042 | 9.7770695e-08 | 671 |
| 0.4465 | 0.8424 | 0.8423 | 0.6972 | 9.776412e-08 | 672 |
| 0.4462 | 0.8447 | 0.8415 | 0.7042 | 9.775754e-08 | 673 |
| 0.4538 | 0.8353 | 0.8410 | 0.7042 | 9.7750956e-08 | 674 |
| 0.4575 | 0.8376 | 0.8427 | 0.6972 | 9.7744355e-08 | 675 |
| 0.4509 | 0.8353 | 0.8430 | 0.6972 | 9.773775e-08 | 676 |
| 0.4323 | 0.8424 | 0.8422 | 0.7042 | 9.773113e-08 | 677 |
| 0.4323 | 0.8518 | 0.8406 | 0.7042 | 9.772451e-08 | 678 |
| 0.4442 | 0.8212 | 0.8417 | 0.7042 | 9.771787e-08 | 679 |
| 0.4421 | 0.8471 | 0.8429 | 0.7042 | 9.771123e-08 | 680 |
| 0.4448 | 0.8376 | 0.8438 | 0.7042 | 9.770458e-08 | 681 |
| 0.4349 | 0.8400 | 0.8440 | 0.7042 | 9.7697914e-08 | 682 |
| 0.4410 | 0.8424 | 0.8448 | 0.6972 | 9.769124e-08 | 683 |
| 0.4390 | 0.8282 | 0.8459 | 0.6972 | 9.768456e-08 | 684 |
| 0.4446 | 0.8565 | 0.8463 | 0.6972 | 9.767787e-08 | 685 |
| 0.4330 | 0.8518 | 0.8436 | 0.7042 | 9.767117e-08 | 686 |
| 0.4463 | 0.8400 | 0.8427 | 0.7042 | 9.766446e-08 | 687 |
| 0.4541 | 0.8424 | 0.8433 | 0.7042 | 9.765774e-08 | 688 |
| 0.4355 | 0.8400 | 0.8419 | 0.7042 | 9.765101e-08 | 689 |
| 0.4466 | 0.8329 | 0.8427 | 0.7042 | 9.7644275e-08 | 690 |
| 0.4253 | 0.8400 | 0.8434 | 0.7042 | 9.7637525e-08 | 691 |
| 0.4356 | 0.8400 | 0.8444 | 0.7042 | 9.763077e-08 | 692 |
| 0.4318 | 0.8518 | 0.8448 | 0.7042 | 9.7624e-08 | 693 |
| 0.4417 | 0.8447 | 0.8442 | 0.7042 | 9.761723e-08 | 694 |
| 0.4277 | 0.8518 | 0.8456 | 0.7042 | 9.7610446e-08 | 695 |
| 0.4415 | 0.8400 | 0.8452 | 0.7042 | 9.760365e-08 | 696 |
| 0.4317 | 0.8471 | 0.8451 | 0.7042 | 9.759685e-08 | 697 |
| 0.4297 | 0.8400 | 0.8449 | 0.7042 | 9.759004e-08 | 698 |
| 0.4178 | 0.8494 | 0.8463 | 0.7042 | 9.758322e-08 | 699 |
| 0.4357 | 0.8400 | 0.8465 | 0.7042 | 9.757639e-08 | 700 |
| 0.4407 | 0.8376 | 0.8471 | 0.7042 | 9.756955e-08 | 701 |
| 0.4238 | 0.8565 | 0.8475 | 0.7113 | 9.75627e-08 | 702 |
| 0.4273 | 0.8518 | 0.8490 | 0.7042 | 9.755584e-08 | 703 |
| 0.4220 | 0.8447 | 0.8484 | 0.7113 | 9.754897e-08 | 704 |
| 0.4213 | 0.8588 | 0.8462 | 0.7042 | 9.754209e-08 | 705 |
| 0.4352 | 0.8494 | 0.8466 | 0.7042 | 9.753521e-08 | 706 |
| 0.4237 | 0.8447 | 0.8479 | 0.7113 | 9.7528314e-08 | 707 |
| 0.4331 | 0.8447 | 0.8463 | 0.7042 | 9.752141e-08 | 708 |
| 0.4306 | 0.8447 | 0.8460 | 0.7042 | 9.7514494e-08 | 709 |
| 0.4230 | 0.8494 | 0.8452 | 0.7042 | 9.7507574e-08 | 710 |
| 0.4268 | 0.8541 | 0.8454 | 0.7042 | 9.750064e-08 | 711 |
| 0.4261 | 0.8612 | 0.8454 | 0.7042 | 9.74937e-08 | 712 |
| 0.4398 | 0.8376 | 0.8463 | 0.7042 | 9.748675e-08 | 713 |
| 0.4180 | 0.8424 | 0.8475 | 0.7042 | 9.7479784e-08 | 714 |
| 0.4239 | 0.8471 | 0.8470 | 0.7042 | 9.7472814e-08 | 715 |
| 0.4353 | 0.8424 | 0.8480 | 0.7113 | 9.7465836e-08 | 716 |
| 0.4131 | 0.8447 | 0.8491 | 0.7113 | 9.745885e-08 | 717 |
| 0.4324 | 0.8424 | 0.8525 | 0.7113 | 9.745185e-08 | 718 |
| 0.4242 | 0.8518 | 0.8513 | 0.7183 | 9.744485e-08 | 719 |
| 0.4216 | 0.8400 | 0.8493 | 0.7113 | 9.7437834e-08 | 720 |
| 0.4212 | 0.8400 | 0.8482 | 0.7113 | 9.743081e-08 | 721 |
| 0.4161 | 0.8518 | 0.8482 | 0.7113 | 9.742377e-08 | 722 |
| 0.4133 | 0.8494 | 0.8489 | 0.7113 | 9.741673e-08 | 723 |
| 0.4118 | 0.8518 | 0.8508 | 0.7113 | 9.7409675e-08 | 724 |
| 0.4073 | 0.8659 | 0.8509 | 0.7113 | 9.740261e-08 | 725 |
| 0.4153 | 0.8494 | 0.8502 | 0.7113 | 9.739554e-08 | 726 |
| 0.4097 | 0.8541 | 0.8500 | 0.7113 | 9.7388465e-08 | 727 |
| 0.4221 | 0.8400 | 0.8493 | 0.7113 | 9.7381374e-08 | 728 |
| 0.4040 | 0.8635 | 0.8506 | 0.7113 | 9.7374276e-08 | 729 |
| 0.4070 | 0.8612 | 0.8508 | 0.7113 | 9.736717e-08 | 730 |
| 0.4144 | 0.8565 | 0.8493 | 0.7113 | 9.736005e-08 | 731 |
| 0.4260 | 0.8494 | 0.8496 | 0.7113 | 9.7352924e-08 | 732 |
| 0.4081 | 0.8612 | 0.8497 | 0.7113 | 9.734579e-08 | 733 |
| 0.4242 | 0.8494 | 0.8500 | 0.7113 | 9.733864e-08 | 734 |
| 0.4070 | 0.8565 | 0.8501 | 0.7113 | 9.733149e-08 | 735 |
| 0.4194 | 0.8518 | 0.8512 | 0.7113 | 9.7324325e-08 | 736 |
| 0.4279 | 0.8518 | 0.8519 | 0.7113 | 9.7317155e-08 | 737 |
| 0.4119 | 0.8588 | 0.8517 | 0.7113 | 9.730997e-08 | 738 |
| 0.4126 | 0.8471 | 0.8529 | 0.7113 | 9.730278e-08 | 739 |
| 0.4193 | 0.8400 | 0.8523 | 0.7113 | 9.729558e-08 | 740 |
| 0.4114 | 0.8447 | 0.8529 | 0.7113 | 9.728837e-08 | 741 |
| 0.4142 | 0.8447 | 0.8543 | 0.7183 | 9.728115e-08 | 742 |
| 0.4097 | 0.8612 | 0.8547 | 0.7183 | 9.7273926e-08 | 743 |
| 0.4014 | 0.8635 | 0.8531 | 0.7113 | 9.726669e-08 | 744 |
| 0.3902 | 0.8635 | 0.8525 | 0.7113 | 9.7259445e-08 | 745 |
| 0.4114 | 0.8494 | 0.8539 | 0.7113 | 9.725219e-08 | 746 |
| 0.4179 | 0.8565 | 0.8542 | 0.7183 | 9.724493e-08 | 747 |
| 0.3993 | 0.8753 | 0.8546 | 0.7183 | 9.723765e-08 | 748 |
| 0.4003 | 0.8541 | 0.8559 | 0.7113 | 9.723037e-08 | 749 |
| 0.4246 | 0.8400 | 0.8561 | 0.7113 | 9.722308e-08 | 750 |
| 0.3973 | 0.8612 | 0.8551 | 0.7183 | 9.7215775e-08 | 751 |
| 0.4115 | 0.8494 | 0.8544 | 0.7113 | 9.720846e-08 | 752 |
| 0.4088 | 0.8424 | 0.8545 | 0.7113 | 9.7201145e-08 | 753 |
| 0.4154 | 0.8400 | 0.8543 | 0.7113 | 9.719382e-08 | 754 |
| 0.4215 | 0.8518 | 0.8549 | 0.7113 | 9.718648e-08 | 755 |
| 0.4047 | 0.8565 | 0.8547 | 0.7113 | 9.717913e-08 | 756 |
| 0.4058 | 0.8424 | 0.8560 | 0.7183 | 9.717178e-08 | 757 |
| 0.4080 | 0.8376 | 0.8558 | 0.7183 | 9.716441e-08 | 758 |
| 0.4080 | 0.8541 | 0.8562 | 0.7113 | 9.7157034e-08 | 759 |
| 0.3968 | 0.8635 | 0.8570 | 0.7113 | 9.714965e-08 | 760 |
| 0.3936 | 0.8612 | 0.8557 | 0.7183 | 9.714226e-08 | 761 |
| 0.4100 | 0.8565 | 0.8570 | 0.7183 | 9.713486e-08 | 762 |
| 0.3994 | 0.8588 | 0.8564 | 0.7113 | 9.712745e-08 | 763 |
| 0.4114 | 0.8400 | 0.8548 | 0.7183 | 9.712003e-08 | 764 |
| 0.4050 | 0.8518 | 0.8562 | 0.7113 | 9.71126e-08 | 765 |
| 0.3991 | 0.8588 | 0.8579 | 0.7113 | 9.710516e-08 | 766 |
| 0.3984 | 0.8659 | 0.8582 | 0.7113 | 9.709771e-08 | 767 |
| 0.3865 | 0.8659 | 0.8597 | 0.7113 | 9.709026e-08 | 768 |
| 0.4004 | 0.8541 | 0.8581 | 0.7183 | 9.708279e-08 | 769 |
| 0.4130 | 0.8471 | 0.8582 | 0.7254 | 9.7075315e-08 | 770 |
| 0.4086 | 0.8565 | 0.8576 | 0.7254 | 9.706783e-08 | 771 |
| 0.3977 | 0.8612 | 0.8579 | 0.7254 | 9.706034e-08 | 772 |
| 0.3905 | 0.8471 | 0.8592 | 0.7113 | 9.705283e-08 | 773 |
| 0.3977 | 0.8682 | 0.8596 | 0.7183 | 9.704532e-08 | 774 |
| 0.3773 | 0.8682 | 0.8586 | 0.7254 | 9.7037805e-08 | 775 |
| 0.3895 | 0.8612 | 0.8593 | 0.7183 | 9.7030274e-08 | 776 |
| 0.3903 | 0.8635 | 0.8601 | 0.7183 | 9.7022735e-08 | 777 |
| 0.3972 | 0.8494 | 0.8599 | 0.7183 | 9.701519e-08 | 778 |
| 0.3899 | 0.8588 | 0.8598 | 0.7254 | 9.700763e-08 | 779 |
| 0.3972 | 0.8635 | 0.8599 | 0.7254 | 9.700006e-08 | 780 |
| 0.3873 | 0.8612 | 0.8599 | 0.7254 | 9.699249e-08 | 781 |
| 0.3941 | 0.8541 | 0.8604 | 0.7183 | 9.6984905e-08 | 782 |
| 0.3858 | 0.8682 | 0.8599 | 0.7254 | 9.697731e-08 | 783 |
| 0.3691 | 0.8635 | 0.8602 | 0.7183 | 9.696971e-08 | 784 |
| 0.3879 | 0.8682 | 0.8609 | 0.7183 | 9.69621e-08 | 785 |
| 0.3892 | 0.8565 | 0.8612 | 0.7183 | 9.695447e-08 | 786 |
| 0.3818 | 0.8753 | 0.8620 | 0.7113 | 9.694684e-08 | 787 |
| 0.3798 | 0.8706 | 0.8625 | 0.7113 | 9.69392e-08 | 788 |
| 0.3828 | 0.8612 | 0.8627 | 0.7183 | 9.693156e-08 | 789 |
| 0.4055 | 0.8447 | 0.8618 | 0.7183 | 9.69239e-08 | 790 |
| 0.4016 | 0.8635 | 0.8625 | 0.7183 | 9.691623e-08 | 791 |
| 0.3952 | 0.8659 | 0.8629 | 0.7183 | 9.690856e-08 | 792 |
| 0.3878 | 0.8753 | 0.8649 | 0.7042 | 9.690088e-08 | 793 |
| 0.3724 | 0.8871 | 0.8650 | 0.7042 | 9.689318e-08 | 794 |
| 0.3746 | 0.8682 | 0.8640 | 0.7183 | 9.688548e-08 | 795 |
| 0.3752 | 0.8682 | 0.8635 | 0.7183 | 9.687777e-08 | 796 |
| 0.3817 | 0.8682 | 0.8638 | 0.7183 | 9.6870046e-08 | 797 |
| 0.3891 | 0.8729 | 0.8636 | 0.7183 | 9.6862316e-08 | 798 |
| 0.3775 | 0.8635 | 0.8626 | 0.7183 | 9.685458e-08 | 799 |
| 0.3968 | 0.8447 | 0.8634 | 0.7183 | 9.684683e-08 | 800 |
| 0.3826 | 0.8635 | 0.8633 | 0.7183 | 9.6839074e-08 | 801 |
| 0.3809 | 0.8471 | 0.8632 | 0.7183 | 9.683131e-08 | 802 |
| 0.3811 | 0.8659 | 0.8636 | 0.7183 | 9.6823534e-08 | 803 |
| 0.3647 | 0.8682 | 0.8636 | 0.7183 | 9.6815754e-08 | 804 |
| 0.3752 | 0.8800 | 0.8632 | 0.7254 | 9.680796e-08 | 805 |
| 0.3823 | 0.8753 | 0.8636 | 0.7183 | 9.680016e-08 | 806 |
| 0.4058 | 0.8424 | 0.8643 | 0.7183 | 9.679235e-08 | 807 |
| 0.3703 | 0.8871 | 0.8650 | 0.7183 | 9.6784525e-08 | 808 |
| 0.3668 | 0.8871 | 0.8660 | 0.7183 | 9.6776695e-08 | 809 |
| 0.3709 | 0.8729 | 0.8677 | 0.7183 | 9.676886e-08 | 810 |
| 0.3715 | 0.8776 | 0.8698 | 0.7042 | 9.676101e-08 | 811 |
| 0.3838 | 0.8729 | 0.8687 | 0.7183 | 9.6753155e-08 | 812 |
| 0.3827 | 0.8706 | 0.8676 | 0.7183 | 9.674529e-08 | 813 |
| 0.3873 | 0.8682 | 0.8661 | 0.7183 | 9.6737416e-08 | 814 |
| 0.3668 | 0.8659 | 0.8672 | 0.7183 | 9.6729536e-08 | 815 |
| 0.3785 | 0.8776 | 0.8667 | 0.7183 | 9.672164e-08 | 816 |
| 0.3693 | 0.8729 | 0.8669 | 0.7183 | 9.671374e-08 | 817 |
| 0.3739 | 0.8729 | 0.8673 | 0.7183 | 9.670583e-08 | 818 |
| 0.3728 | 0.8800 | 0.8679 | 0.7183 | 9.669791e-08 | 819 |
| 0.3747 | 0.8706 | 0.8673 | 0.7183 | 9.668998e-08 | 820 |
| 0.3659 | 0.8635 | 0.8676 | 0.7183 | 9.6682044e-08 | 821 |
| 0.3742 | 0.8612 | 0.8686 | 0.7183 | 9.66741e-08 | 822 |
| 0.3672 | 0.8753 | 0.8702 | 0.7113 | 9.666614e-08 | 823 |
| 0.3876 | 0.8635 | 0.8702 | 0.7113 | 9.665818e-08 | 824 |
| 0.3816 | 0.8706 | 0.8700 | 0.7183 | 9.6650204e-08 | 825 |
| 0.3764 | 0.8682 | 0.8706 | 0.7183 | 9.6642225e-08 | 826 |
| 0.3863 | 0.8682 | 0.8716 | 0.7183 | 9.663423e-08 | 827 |
| 0.3608 | 0.8682 | 0.8719 | 0.7113 | 9.662623e-08 | 828 |
| 0.3592 | 0.8729 | 0.8713 | 0.7113 | 9.661822e-08 | 829 |
| 0.3594 | 0.8565 | 0.8719 | 0.7113 | 9.66102e-08 | 830 |
| 0.3772 | 0.8659 | 0.8714 | 0.7183 | 9.660217e-08 | 831 |
| 0.3771 | 0.8541 | 0.8726 | 0.7113 | 9.6594135e-08 | 832 |
| 0.3803 | 0.8565 | 0.8735 | 0.7113 | 9.658609e-08 | 833 |
| 0.3558 | 0.8871 | 0.8728 | 0.7183 | 9.6578034e-08 | 834 |
| 0.3758 | 0.8659 | 0.8718 | 0.7183 | 9.656997e-08 | 835 |
| 0.3712 | 0.8706 | 0.8722 | 0.7183 | 9.65619e-08 | 836 |
| 0.3721 | 0.8565 | 0.8731 | 0.7113 | 9.655382e-08 | 837 |
| 0.3659 | 0.8871 | 0.8736 | 0.7113 | 9.6545726e-08 | 838 |
| 0.3747 | 0.8659 | 0.8717 | 0.7183 | 9.6537626e-08 | 839 |
| 0.3522 | 0.8871 | 0.8715 | 0.7183 | 9.652952e-08 | 840 |
| 0.3715 | 0.8659 | 0.8717 | 0.7183 | 9.6521404e-08 | 841 |
| 0.3718 | 0.8706 | 0.8724 | 0.7183 | 9.6513276e-08 | 842 |
| 0.3643 | 0.8682 | 0.8729 | 0.7183 | 9.650514e-08 | 843 |
| 0.3596 | 0.8729 | 0.8750 | 0.7113 | 9.6497e-08 | 844 |
| 0.3653 | 0.8776 | 0.8752 | 0.7113 | 9.648885e-08 | 845 |
| 0.3606 | 0.8776 | 0.8741 | 0.7183 | 9.648068e-08 | 846 |
| 0.3604 | 0.8659 | 0.8737 | 0.7113 | 9.647251e-08 | 847 |
| 0.3661 | 0.8776 | 0.8746 | 0.7113 | 9.646433e-08 | 848 |
| 0.3663 | 0.8659 | 0.8740 | 0.7183 | 9.645614e-08 | 849 |
| 0.3568 | 0.8847 | 0.8745 | 0.7113 | 9.644794e-08 | 850 |
| 0.3718 | 0.8565 | 0.8758 | 0.7113 | 9.6439734e-08 | 851 |
| 0.3603 | 0.8659 | 0.8750 | 0.7183 | 9.643152e-08 | 852 |
| 0.3610 | 0.8918 | 0.8767 | 0.7113 | 9.642329e-08 | 853 |
| 0.3629 | 0.8706 | 0.8752 | 0.7183 | 9.641506e-08 | 854 |
| 0.3577 | 0.8800 | 0.8744 | 0.7183 | 9.6406815e-08 | 855 |
| 0.3556 | 0.8659 | 0.8745 | 0.7254 | 9.6398566e-08 | 856 |
| 0.3613 | 0.8776 | 0.8748 | 0.7183 | 9.63903e-08 | 857 |
| 0.3626 | 0.8659 | 0.8749 | 0.7254 | 9.638203e-08 | 858 |
| 0.3538 | 0.8729 | 0.8748 | 0.7254 | 9.637375e-08 | 859 |
| 0.3545 | 0.8706 | 0.8746 | 0.7254 | 9.636547e-08 | 860 |
| 0.3545 | 0.8824 | 0.8749 | 0.7254 | 9.635717e-08 | 861 |
| 0.3431 | 0.8776 | 0.8754 | 0.7254 | 9.634886e-08 | 862 |
| 0.3612 | 0.8706 | 0.8766 | 0.7183 | 9.634055e-08 | 863 |
| 0.3533 | 0.8729 | 0.8782 | 0.7113 | 9.633223e-08 | 864 |
| 0.3695 | 0.8659 | 0.8779 | 0.7183 | 9.6323895e-08 | 865 |
| 0.3466 | 0.8847 | 0.8776 | 0.7183 | 9.631555e-08 | 866 |
| 0.3493 | 0.8753 | 0.8790 | 0.7042 | 9.6307204e-08 | 867 |
| 0.3409 | 0.8847 | 0.8785 | 0.7042 | 9.629885e-08 | 868 |
| 0.3423 | 0.8894 | 0.8800 | 0.7042 | 9.629048e-08 | 869 |
| 0.3529 | 0.8753 | 0.8810 | 0.6972 | 9.62821e-08 | 870 |
| 0.3539 | 0.8682 | 0.8800 | 0.6972 | 9.6273716e-08 | 871 |
| 0.3528 | 0.8706 | 0.8793 | 0.7183 | 9.6265325e-08 | 872 |
| 0.3525 | 0.8729 | 0.8784 | 0.7254 | 9.625692e-08 | 873 |
| 0.3503 | 0.8824 | 0.8777 | 0.7254 | 9.6248506e-08 | 874 |
| 0.3529 | 0.8824 | 0.8783 | 0.7254 | 9.6240086e-08 | 875 |
| 0.3444 | 0.8918 | 0.8797 | 0.7183 | 9.623166e-08 | 876 |
| 0.3491 | 0.8800 | 0.8791 | 0.7254 | 9.622322e-08 | 877 |
| 0.3457 | 0.8871 | 0.8797 | 0.7183 | 9.621477e-08 | 878 |
| 0.3449 | 0.8824 | 0.8792 | 0.7254 | 9.6206314e-08 | 879 |
| 0.3548 | 0.8847 | 0.8803 | 0.7183 | 9.619785e-08 | 880 |
| 0.3499 | 0.8776 | 0.8810 | 0.7183 | 9.6189375e-08 | 881 |
| 0.3426 | 0.9012 | 0.8843 | 0.6972 | 9.618089e-08 | 882 |
| 0.3376 | 0.8894 | 0.8836 | 0.7042 | 9.61724e-08 | 883 |
| 0.3337 | 0.8800 | 0.8828 | 0.7113 | 9.61639e-08 | 884 |
| 0.3528 | 0.8729 | 0.8842 | 0.7113 | 9.615539e-08 | 885 |
| 0.3576 | 0.8682 | 0.8831 | 0.7183 | 9.614687e-08 | 886 |
| 0.3467 | 0.8894 | 0.8841 | 0.7183 | 9.613834e-08 | 887 |
| 0.3433 | 0.8824 | 0.8834 | 0.7183 | 9.612981e-08 | 888 |
| 0.3427 | 0.8871 | 0.8835 | 0.7254 | 9.612126e-08 | 889 |
| 0.3516 | 0.8753 | 0.8836 | 0.7183 | 9.611271e-08 | 890 |
| 0.3336 | 0.8824 | 0.8837 | 0.7254 | 9.6104145e-08 | 891 |
| 0.3516 | 0.8753 | 0.8836 | 0.7254 | 9.6095576e-08 | 892 |
| 0.3448 | 0.8824 | 0.8838 | 0.7254 | 9.608699e-08 | 893 |
| 0.3412 | 0.8847 | 0.8838 | 0.7254 | 9.60784e-08 | 894 |
| 0.3568 | 0.8776 | 0.8845 | 0.7254 | 9.6069805e-08 | 895 |
| 0.3175 | 0.8941 | 0.8856 | 0.7183 | 9.60612e-08 | 896 |
| 0.3414 | 0.8871 | 0.8857 | 0.7113 | 9.605258e-08 | 897 |
| 0.3430 | 0.8847 | 0.8865 | 0.7113 | 9.6043955e-08 | 898 |
| 0.3461 | 0.8776 | 0.8877 | 0.7042 | 9.603532e-08 | 899 |
| 0.3415 | 0.8894 | 0.8856 | 0.7254 | 9.602668e-08 | 900 |
| 0.3332 | 0.8847 | 0.8854 | 0.7254 | 9.601803e-08 | 901 |
| 0.3473 | 0.8776 | 0.8856 | 0.7254 | 9.6009366e-08 | 902 |
| 0.3374 | 0.8941 | 0.8870 | 0.7254 | 9.60007e-08 | 903 |
| 0.3351 | 0.8729 | 0.8881 | 0.7113 | 9.599202e-08 | 904 |
| 0.3468 | 0.8706 | 0.8887 | 0.7113 | 9.598333e-08 | 905 |
| 0.3393 | 0.8941 | 0.8882 | 0.7254 | 9.5974634e-08 | 906 |
| 0.3379 | 0.8800 | 0.8872 | 0.7254 | 9.596593e-08 | 907 |
| 0.3416 | 0.8894 | 0.8872 | 0.7254 | 9.595722e-08 | 908 |
| 0.3199 | 0.8965 | 0.8881 | 0.7254 | 9.59485e-08 | 909 |
| 0.3392 | 0.8776 | 0.8877 | 0.7254 | 9.593977e-08 | 910 |
| 0.3356 | 0.8871 | 0.8896 | 0.7113 | 9.593103e-08 | 911 |
| 0.3379 | 0.8729 | 0.8892 | 0.7113 | 9.592228e-08 | 912 |
| 0.3472 | 0.8918 | 0.8906 | 0.7113 | 9.591353e-08 | 913 |
| 0.3394 | 0.8776 | 0.8927 | 0.6972 | 9.590476e-08 | 914 |
| 0.3438 | 0.8729 | 0.8928 | 0.6972 | 9.5895984e-08 | 915 |
| 0.3303 | 0.8800 | 0.8912 | 0.7183 | 9.58872e-08 | 916 |
| 0.3288 | 0.8894 | 0.8921 | 0.6972 | 9.587841e-08 | 917 |
| 0.3187 | 0.8988 | 0.8910 | 0.7183 | 9.586961e-08 | 918 |
| 0.3390 | 0.8800 | 0.8907 | 0.7183 | 9.58608e-08 | 919 |
| 0.3385 | 0.8776 | 0.8911 | 0.7183 | 9.585198e-08 | 920 |
| 0.3257 | 0.8871 | 0.8903 | 0.7183 | 9.5843156e-08 | 921 |
| 0.3233 | 0.8847 | 0.8908 | 0.7183 | 9.583432e-08 | 922 |
| 0.3289 | 0.8847 | 0.8899 | 0.7254 | 9.582547e-08 | 923 |
| 0.3232 | 0.8894 | 0.8916 | 0.7183 | 9.581662e-08 | 924 |
| 0.3434 | 0.8659 | 0.8942 | 0.7113 | 9.5807756e-08 | 925 |
| 0.3175 | 0.8965 | 0.8936 | 0.7183 | 9.579889e-08 | 926 |
| 0.3317 | 0.8941 | 0.8947 | 0.7042 | 9.579001e-08 | 927 |
| 0.3095 | 0.9059 | 0.8930 | 0.7183 | 9.578112e-08 | 928 |
| 0.3422 | 0.8753 | 0.8912 | 0.7254 | 9.577222e-08 | 929 |
| 0.3369 | 0.8918 | 0.8919 | 0.7183 | 9.576332e-08 | 930 |
| 0.3316 | 0.8753 | 0.8933 | 0.7183 | 9.57544e-08 | 931 |
| 0.3050 | 0.9106 | 0.8939 | 0.7183 | 9.574548e-08 | 932 |
| 0.3229 | 0.8894 | 0.8941 | 0.7183 | 9.5736546e-08 | 933 |
| 0.3361 | 0.8941 | 0.8931 | 0.7183 | 9.572761e-08 | 934 |
| 0.3267 | 0.8941 | 0.8952 | 0.7183 | 9.5718654e-08 | 935 |
| 0.3158 | 0.8965 | 0.8962 | 0.7042 | 9.5709694e-08 | 936 |
| 0.3282 | 0.8847 | 0.8957 | 0.7113 | 9.570073e-08 | 937 |
| 0.3287 | 0.8800 | 0.8958 | 0.7113 | 9.569175e-08 | 938 |
| 0.3242 | 0.8988 | 0.8963 | 0.7042 | 9.568277e-08 | 939 |
| 0.3318 | 0.8753 | 0.8957 | 0.7183 | 9.567378e-08 | 940 |
| 0.3343 | 0.8800 | 0.8965 | 0.7183 | 9.5664774e-08 | 941 |
| 0.3278 | 0.8871 | 0.8958 | 0.7183 | 9.5655764e-08 | 942 |
| 0.3299 | 0.8824 | 0.8955 | 0.7183 | 9.564675e-08 | 943 |
| 0.3231 | 0.8918 | 0.8963 | 0.7183 | 9.5637716e-08 | 944 |
| 0.3265 | 0.8941 | 0.8969 | 0.7042 | 9.562868e-08 | 945 |
| 0.3301 | 0.8847 | 0.8957 | 0.7113 | 9.561963e-08 | 946 |
| 0.3099 | 0.9035 | 0.8963 | 0.7183 | 9.561058e-08 | 947 |
| 0.3200 | 0.9012 | 0.8969 | 0.7183 | 9.5601514e-08 | 948 |
| 0.3235 | 0.8847 | 0.8963 | 0.7113 | 9.559244e-08 | 949 |
| 0.3194 | 0.8753 | 0.8963 | 0.7113 | 9.558336e-08 | 950 |
| 0.3224 | 0.8800 | 0.8968 | 0.7113 | 9.557427e-08 | 951 |
| 0.3229 | 0.8871 | 0.8976 | 0.7183 | 9.556518e-08 | 952 |
| 0.3283 | 0.8800 | 0.9004 | 0.7042 | 9.555607e-08 | 953 |
| 0.3196 | 0.8824 | 0.9018 | 0.6972 | 9.554695e-08 | 954 |
| 0.3207 | 0.8894 | 0.9019 | 0.6901 | 9.553783e-08 | 955 |
| 0.3244 | 0.8824 | 0.9030 | 0.6901 | 9.55287e-08 | 956 |
| 0.3301 | 0.8988 | 0.8994 | 0.7183 | 9.551955e-08 | 957 |
| 0.3086 | 0.9012 | 0.8994 | 0.7183 | 9.55104e-08 | 958 |
| 0.3111 | 0.9059 | 0.8996 | 0.7183 | 9.550124e-08 | 959 |
| 0.3198 | 0.8800 | 0.8997 | 0.7113 | 9.549208e-08 | 960 |
| 0.3367 | 0.8824 | 0.9017 | 0.7042 | 9.54829e-08 | 961 |
| 0.3287 | 0.8871 | 0.9016 | 0.7042 | 9.5473716e-08 | 962 |
| 0.3195 | 0.8941 | 0.9029 | 0.6972 | 9.546452e-08 | 963 |
| 0.3192 | 0.8941 | 0.9037 | 0.6831 | 9.545532e-08 | 964 |
| 0.3191 | 0.8988 | 0.9035 | 0.6831 | 9.544611e-08 | 965 |
| 0.3378 | 0.8824 | 0.9007 | 0.7113 | 9.5436896e-08 | 966 |
| 0.3276 | 0.8871 | 0.9021 | 0.7042 | 9.5427666e-08 | 967 |
| 0.3155 | 0.8871 | 0.9007 | 0.7113 | 9.541843e-08 | 968 |
| 0.3221 | 0.8776 | 0.9006 | 0.7113 | 9.5409185e-08 | 969 |
| 0.3085 | 0.9035 | 0.9023 | 0.7042 | 9.539993e-08 | 970 |
| 0.3081 | 0.9035 | 0.9031 | 0.7042 | 9.539067e-08 | 971 |
| 0.3084 | 0.9012 | 0.9023 | 0.7113 | 9.5381395e-08 | 972 |
| 0.3048 | 0.8918 | 0.9026 | 0.6972 | 9.5372116e-08 | 973 |
| 0.3216 | 0.8847 | 0.9040 | 0.6901 | 9.536283e-08 | 974 |
| 0.3060 | 0.8965 | 0.9033 | 0.6972 | 9.5353535e-08 | 975 |
| 0.3197 | 0.8706 | 0.9025 | 0.7113 | 9.534423e-08 | 976 |
| 0.3110 | 0.8894 | 0.9038 | 0.6972 | 9.533491e-08 | 977 |
| 0.3092 | 0.8965 | 0.9055 | 0.6831 | 9.532559e-08 | 978 |
| 0.3142 | 0.8871 | 0.9067 | 0.6901 | 9.531626e-08 | 979 |
| 0.3116 | 0.8988 | 0.9044 | 0.6831 | 9.530692e-08 | 980 |
| 0.3130 | 0.8965 | 0.9052 | 0.6831 | 9.529757e-08 | 981 |
| 0.3138 | 0.8988 | 0.9049 | 0.7042 | 9.5288215e-08 | 982 |
| 0.2931 | 0.8965 | 0.9047 | 0.7042 | 9.527885e-08 | 983 |
| 0.3097 | 0.8941 | 0.9052 | 0.7042 | 9.526948e-08 | 984 |
| 0.3083 | 0.8941 | 0.9047 | 0.7042 | 9.526009e-08 | 985 |
| 0.2876 | 0.9106 | 0.9053 | 0.7042 | 9.52507e-08 | 986 |
| 0.2991 | 0.8965 | 0.9055 | 0.7042 | 9.52413e-08 | 987 |
| 0.3027 | 0.9035 | 0.9063 | 0.7113 | 9.523189e-08 | 988 |
| 0.3063 | 0.8894 | 0.9077 | 0.7042 | 9.5222475e-08 | 989 |
| 0.3036 | 0.8941 | 0.9075 | 0.6972 | 9.5213046e-08 | 990 |
| 0.3033 | 0.9082 | 0.9088 | 0.6901 | 9.520361e-08 | 991 |
| 0.3197 | 0.8753 | 0.9079 | 0.7042 | 9.519417e-08 | 992 |
| 0.3021 | 0.9035 | 0.9092 | 0.6972 | 9.518472e-08 | 993 |
| 0.3144 | 0.8847 | 0.9107 | 0.6972 | 9.517526e-08 | 994 |
| 0.3085 | 0.8918 | 0.9085 | 0.6972 | 9.516579e-08 | 995 |
| 0.2938 | 0.9012 | 0.9079 | 0.7042 | 9.515631e-08 | 996 |
| 0.3006 | 0.9059 | 0.9085 | 0.7042 | 9.5146824e-08 | 997 |
| 0.3031 | 0.8965 | 0.9091 | 0.6972 | 9.513733e-08 | 998 |
| 0.3031 | 0.9035 | 0.9112 | 0.6831 | 9.512783e-08 | 999 |
| 0.2973 | 0.9012 | 0.9105 | 0.6831 | 9.511832e-08 | 1000 |
| 0.2860 | 0.9012 | 0.9103 | 0.6901 | 9.5108796e-08 | 1001 |
| 0.2966 | 0.9106 | 0.9122 | 0.6831 | 9.509927e-08 | 1002 |
| 0.2915 | 0.9012 | 0.9114 | 0.6901 | 9.508973e-08 | 1003 |
| 0.2913 | 0.9059 | 0.9105 | 0.7042 | 9.508019e-08 | 1004 |
| 0.3020 | 0.9082 | 0.9118 | 0.6901 | 9.507063e-08 | 1005 |
| 0.2910 | 0.9082 | 0.9124 | 0.6831 | 9.506107e-08 | 1006 |
| 0.3047 | 0.8965 | 0.9112 | 0.6972 | 9.50515e-08 | 1007 |
| 0.2942 | 0.8894 | 0.9103 | 0.7042 | 9.504192e-08 | 1008 |
| 0.2864 | 0.9200 | 0.9124 | 0.6901 | 9.5032334e-08 | 1009 |
| 0.2805 | 0.9224 | 0.9128 | 0.6901 | 9.5022735e-08 | 1010 |
| 0.2943 | 0.8918 | 0.9116 | 0.7042 | 9.501313e-08 | 1011 |
| 0.3138 | 0.8824 | 0.9122 | 0.7042 | 9.5003514e-08 | 1012 |
| 0.2957 | 0.8965 | 0.9130 | 0.7042 | 9.4993894e-08 | 1013 |
| 0.2907 | 0.9012 | 0.9166 | 0.6901 | 9.4984266e-08 | 1014 |
| 0.2776 | 0.9106 | 0.9167 | 0.6831 | 9.4974624e-08 | 1015 |
| 0.3045 | 0.9012 | 0.9147 | 0.6972 | 9.4964975e-08 | 1016 |
| 0.2965 | 0.9059 | 0.9151 | 0.6901 | 9.495532e-08 | 1017 |
| 0.2927 | 0.9082 | 0.9160 | 0.6901 | 9.4945655e-08 | 1018 |
| 0.3016 | 0.8988 | 0.9162 | 0.6901 | 9.4935984e-08 | 1019 |
| 0.2937 | 0.9012 | 0.9166 | 0.6901 | 9.49263e-08 | 1020 |
| 0.2989 | 0.9035 | 0.9173 | 0.6831 | 9.491661e-08 | 1021 |
| 0.2873 | 0.9035 | 0.9181 | 0.6901 | 9.490691e-08 | 1022 |
| 0.3089 | 0.8941 | 0.9200 | 0.6901 | 9.48972e-08 | 1023 |
| 0.2910 | 0.9035 | 0.9191 | 0.6972 | 9.488749e-08 | 1024 |
| 0.2783 | 0.9106 | 0.9193 | 0.6972 | 9.487776e-08 | 1025 |
| 0.2792 | 0.9035 | 0.9183 | 0.6901 | 9.486803e-08 | 1026 |
| 0.2868 | 0.9082 | 0.9171 | 0.6972 | 9.485829e-08 | 1027 |
| 0.2870 | 0.9129 | 0.9168 | 0.6972 | 9.484854e-08 | 1028 |
| 0.2867 | 0.9106 | 0.9161 | 0.6972 | 9.483878e-08 | 1029 |
| 0.2814 | 0.8988 | 0.9159 | 0.6972 | 9.482901e-08 | 1030 |
| 0.2835 | 0.9106 | 0.9154 | 0.7042 | 9.4819235e-08 | 1031 |
| 0.2868 | 0.9059 | 0.9163 | 0.7042 | 9.480945e-08 | 1032 |
| 0.2995 | 0.8941 | 0.9172 | 0.6972 | 9.479966e-08 | 1033 |
| 0.2943 | 0.9012 | 0.9186 | 0.6972 | 9.478986e-08 | 1034 |
| 0.2939 | 0.9012 | 0.9232 | 0.6972 | 9.478005e-08 | 1035 |
| 0.2913 | 0.9012 | 0.9204 | 0.7113 | 9.477023e-08 | 1036 |
| 0.2953 | 0.9082 | 0.9197 | 0.7042 | 9.47604e-08 | 1037 |
| 0.2967 | 0.8918 | 0.9193 | 0.7042 | 9.475057e-08 | 1038 |
| 0.2780 | 0.9012 | 0.9210 | 0.7113 | 9.474073e-08 | 1039 |
| 0.2915 | 0.9059 | 0.9217 | 0.7113 | 9.473087e-08 | 1040 |
| 0.3084 | 0.8894 | 0.9219 | 0.7113 | 9.472101e-08 | 1041 |
| 0.2769 | 0.9106 | 0.9219 | 0.7113 | 9.471114e-08 | 1042 |
| 0.2918 | 0.9035 | 0.9219 | 0.7113 | 9.4701264e-08 | 1043 |
| 0.2802 | 0.9106 | 0.9230 | 0.7113 | 9.469138e-08 | 1044 |
| 0.2767 | 0.9200 | 0.9225 | 0.7113 | 9.468149e-08 | 1045 |
| 0.2888 | 0.8918 | 0.9215 | 0.7113 | 9.4671584e-08 | 1046 |
| 0.2719 | 0.9082 | 0.9215 | 0.7042 | 9.466167e-08 | 1047 |
| 0.2806 | 0.9153 | 0.9223 | 0.7113 | 9.465175e-08 | 1048 |
| 0.2766 | 0.9129 | 0.9241 | 0.7042 | 9.464183e-08 | 1049 |
| 0.2850 | 0.9106 | 0.9232 | 0.7113 | 9.463189e-08 | 1050 |
| 0.2749 | 0.9106 | 0.9229 | 0.7113 | 9.4621946e-08 | 1051 |
| 0.2945 | 0.8918 | 0.9200 | 0.6972 | 9.461199e-08 | 1052 |
| 0.2927 | 0.8988 | 0.9216 | 0.6972 | 9.460203e-08 | 1053 |
| 0.2851 | 0.9012 | 0.9221 | 0.7042 | 9.459206e-08 | 1054 |
| 0.2741 | 0.9035 | 0.9221 | 0.7042 | 9.4582084e-08 | 1055 |
| 0.2769 | 0.9082 | 0.9254 | 0.7113 | 9.4572094e-08 | 1056 |
| 0.2841 | 0.9059 | 0.9251 | 0.7113 | 9.45621e-08 | 1057 |
| 0.2817 | 0.9012 | 0.9262 | 0.7113 | 9.455209e-08 | 1058 |
| 0.2920 | 0.8988 | 0.9266 | 0.7042 | 9.454208e-08 | 1059 |
| 0.2618 | 0.9129 | 0.9264 | 0.7113 | 9.453206e-08 | 1060 |
| 0.2861 | 0.9012 | 0.9252 | 0.7113 | 9.4522036e-08 | 1061 |
| 0.2805 | 0.9153 | 0.9279 | 0.7113 | 9.4512e-08 | 1062 |
| 0.2810 | 0.9200 | 0.9284 | 0.7113 | 9.450195e-08 | 1063 |
| 0.2737 | 0.9106 | 0.9277 | 0.7113 | 9.4491895e-08 | 1064 |
| 0.2802 | 0.9059 | 0.9270 | 0.7113 | 9.4481834e-08 | 1065 |
| 0.2756 | 0.9082 | 0.9259 | 0.7113 | 9.4471766e-08 | 1066 |
| 0.2669 | 0.9200 | 0.9262 | 0.7113 | 9.446168e-08 | 1067 |
| 0.2906 | 0.9106 | 0.9263 | 0.7113 | 9.445159e-08 | 1068 |
| 0.2823 | 0.9035 | 0.9258 | 0.7042 | 9.4441496e-08 | 1069 |
| 0.2815 | 0.9129 | 0.9277 | 0.7113 | 9.443139e-08 | 1070 |
| 0.2768 | 0.9082 | 0.9287 | 0.7113 | 9.442128e-08 | 1071 |
| 0.2663 | 0.9129 | 0.9294 | 0.7113 | 9.441116e-08 | 1072 |
| 0.2664 | 0.9200 | 0.9296 | 0.7113 | 9.440103e-08 | 1073 |
| 0.2668 | 0.9153 | 0.9294 | 0.7113 | 9.439089e-08 | 1074 |
| 0.2728 | 0.9129 | 0.9297 | 0.7113 | 9.4380745e-08 | 1075 |
| 0.2684 | 0.9106 | 0.9313 | 0.7113 | 9.437059e-08 | 1076 |
| 0.2757 | 0.9224 | 0.9321 | 0.7113 | 9.436043e-08 | 1077 |
| 0.2775 | 0.9082 | 0.9306 | 0.7113 | 9.435026e-08 | 1078 |
| 0.2593 | 0.9224 | 0.9317 | 0.7113 | 9.434008e-08 | 1079 |
| 0.2745 | 0.8988 | 0.9317 | 0.7113 | 9.432989e-08 | 1080 |
| 0.2679 | 0.9224 | 0.9320 | 0.7113 | 9.4319695e-08 | 1081 |
| 0.2713 | 0.9059 | 0.9311 | 0.7113 | 9.430949e-08 | 1082 |
| 0.2679 | 0.8918 | 0.9352 | 0.7113 | 9.429928e-08 | 1083 |
| 0.2847 | 0.9224 | 0.9355 | 0.7113 | 9.4289064e-08 | 1084 |
| 0.2707 | 0.9059 | 0.9338 | 0.7113 | 9.427883e-08 | 1085 |
| 0.2781 | 0.9082 | 0.9337 | 0.7113 | 9.426859e-08 | 1086 |
| 0.2635 | 0.9129 | 0.9347 | 0.7113 | 9.425835e-08 | 1087 |
| 0.2748 | 0.9082 | 0.9348 | 0.7113 | 9.4248094e-08 | 1088 |
| 0.2536 | 0.9365 | 0.9344 | 0.7113 | 9.423783e-08 | 1089 |
| 0.2537 | 0.9153 | 0.9361 | 0.7113 | 9.4227566e-08 | 1090 |
| 0.2717 | 0.9082 | 0.9372 | 0.7113 | 9.4217285e-08 | 1091 |
| 0.2643 | 0.9224 | 0.9385 | 0.7113 | 9.4206996e-08 | 1092 |
| 0.2681 | 0.9082 | 0.9365 | 0.7113 | 9.41967e-08 | 1093 |
| 0.2651 | 0.9153 | 0.9363 | 0.7113 | 9.41864e-08 | 1094 |
| 0.2702 | 0.9247 | 0.9352 | 0.7113 | 9.417609e-08 | 1095 |
| 0.2628 | 0.9176 | 0.9373 | 0.7113 | 9.416577e-08 | 1096 |
| 0.2636 | 0.9200 | 0.9363 | 0.7113 | 9.415544e-08 | 1097 |
| 0.2675 | 0.9082 | 0.9374 | 0.7113 | 9.41451e-08 | 1098 |
| 0.2577 | 0.9271 | 0.9392 | 0.7113 | 9.4134755e-08 | 1099 |
| 0.2600 | 0.9247 | 0.9403 | 0.7042 | 9.41244e-08 | 1100 |
| 0.2653 | 0.9153 | 0.9413 | 0.7042 | 9.411404e-08 | 1101 |
| 0.2505 | 0.9247 | 0.9396 | 0.7113 | 9.4103676e-08 | 1102 |
| 0.2722 | 0.9035 | 0.9419 | 0.6972 | 9.4093295e-08 | 1103 |
| 0.2658 | 0.9129 | 0.9390 | 0.7113 | 9.408291e-08 | 1104 |
| 0.2596 | 0.9271 | 0.9416 | 0.7042 | 9.407251e-08 | 1105 |
| 0.2642 | 0.9224 | 0.9413 | 0.7113 | 9.406211e-08 | 1106 |
| 0.2773 | 0.9059 | 0.9435 | 0.6972 | 9.40517e-08 | 1107 |
| 0.2484 | 0.9224 | 0.9425 | 0.7113 | 9.404128e-08 | 1108 |
| 0.2715 | 0.9106 | 0.9410 | 0.7113 | 9.403085e-08 | 1109 |
| 0.2612 | 0.9176 | 0.9406 | 0.7113 | 9.4020415e-08 | 1110 |
| 0.2572 | 0.9035 | 0.9406 | 0.7113 | 9.400997e-08 | 1111 |
| 0.2633 | 0.9153 | 0.9406 | 0.7113 | 9.399952e-08 | 1112 |
| 0.2381 | 0.9294 | 0.9427 | 0.7113 | 9.398906e-08 | 1113 |
| 0.2642 | 0.9035 | 0.9419 | 0.7113 | 9.397859e-08 | 1114 |
| 0.2674 | 0.8988 | 0.9416 | 0.7113 | 9.396811e-08 | 1115 |
| 0.2556 | 0.9035 | 0.9432 | 0.7113 | 9.3957624e-08 | 1116 |
| 0.2655 | 0.9200 | 0.9442 | 0.7113 | 9.394713e-08 | 1117 |
| 0.2529 | 0.9271 | 0.9428 | 0.7113 | 9.393663e-08 | 1118 |
| 0.2625 | 0.9106 | 0.9428 | 0.7113 | 9.392612e-08 | 1119 |
| 0.2498 | 0.9106 | 0.9429 | 0.7113 | 9.39156e-08 | 1120 |
| 0.2595 | 0.9129 | 0.9438 | 0.7113 | 9.390507e-08 | 1121 |
| 0.2535 | 0.9176 | 0.9449 | 0.7113 | 9.3894535e-08 | 1122 |
| 0.2571 | 0.9176 | 0.9443 | 0.7113 | 9.388399e-08 | 1123 |
| 0.2678 | 0.9129 | 0.9439 | 0.7113 | 9.387344e-08 | 1124 |
| 0.2471 | 0.9176 | 0.9451 | 0.7324 | 9.386288e-08 | 1125 |
| 0.2562 | 0.9153 | 0.9471 | 0.7113 | 9.3852314e-08 | 1126 |
| 0.2471 | 0.9200 | 0.9470 | 0.7113 | 9.384174e-08 | 1127 |
| 0.2644 | 0.9200 | 0.9479 | 0.7113 | 9.3831154e-08 | 1128 |
| 0.2619 | 0.9012 | 0.9461 | 0.7113 | 9.382056e-08 | 1129 |
| 0.2551 | 0.9271 | 0.9464 | 0.7113 | 9.380996e-08 | 1130 |
| 0.2423 | 0.9388 | 0.9464 | 0.7113 | 9.379935e-08 | 1131 |
| 0.2455 | 0.9176 | 0.9468 | 0.7113 | 9.3788735e-08 | 1132 |
| 0.2505 | 0.9153 | 0.9474 | 0.7113 | 9.377811e-08 | 1133 |
| 0.2494 | 0.9200 | 0.9478 | 0.7113 | 9.376748e-08 | 1134 |
| 0.2559 | 0.9153 | 0.9494 | 0.7113 | 9.375684e-08 | 1135 |
| 0.2606 | 0.9082 | 0.9528 | 0.6972 | 9.374619e-08 | 1136 |
| 0.2511 | 0.9200 | 0.9529 | 0.6972 | 9.373553e-08 | 1137 |
| 0.2521 | 0.9176 | 0.9516 | 0.7042 | 9.3724864e-08 | 1138 |
| 0.2458 | 0.9082 | 0.9527 | 0.7042 | 9.371419e-08 | 1139 |
| 0.2501 | 0.9153 | 0.9510 | 0.7113 | 9.370351e-08 | 1140 |
| 0.2432 | 0.9200 | 0.9507 | 0.7113 | 9.369282e-08 | 1141 |
| 0.2555 | 0.9059 | 0.9501 | 0.7183 | 9.368212e-08 | 1142 |
| 0.2393 | 0.9271 | 0.9499 | 0.7113 | 9.367141e-08 | 1143 |
| 0.2549 | 0.9200 | 0.9496 | 0.7183 | 9.3660695e-08 | 1144 |
| 0.2536 | 0.9153 | 0.9511 | 0.7113 | 9.364997e-08 | 1145 |
| 0.2327 | 0.9271 | 0.9532 | 0.7113 | 9.3639244e-08 | 1146 |
| 0.2494 | 0.9247 | 0.9572 | 0.7042 | 9.362851e-08 | 1147 |
| 0.2580 | 0.9153 | 0.9569 | 0.7042 | 9.361776e-08 | 1148 |
| 0.2483 | 0.9153 | 0.9547 | 0.7113 | 9.3607e-08 | 1149 |
| 0.2426 | 0.9318 | 0.9547 | 0.7113 | 9.3596235e-08 | 1150 |
| 0.2398 | 0.9271 | 0.9513 | 0.7254 | 9.358546e-08 | 1151 |
| 0.2547 | 0.9059 | 0.9517 | 0.7183 | 9.3574684e-08 | 1152 |
| 0.2446 | 0.9200 | 0.9543 | 0.7113 | 9.35639e-08 | 1153 |
| 0.2435 | 0.9224 | 0.9539 | 0.7113 | 9.3553105e-08 | 1154 |
| 0.2454 | 0.9129 | 0.9544 | 0.7113 | 9.35423e-08 | 1155 |
| 0.2479 | 0.9153 | 0.9540 | 0.7113 | 9.353148e-08 | 1156 |
| 0.2547 | 0.9129 | 0.9547 | 0.7113 | 9.352066e-08 | 1157 |
| 0.2590 | 0.9035 | 0.9549 | 0.7113 | 9.350983e-08 | 1158 |
| 0.2516 | 0.9200 | 0.9567 | 0.7113 | 9.3499e-08 | 1159 |
| 0.2468 | 0.9082 | 0.9582 | 0.7113 | 9.3488154e-08 | 1160 |
| 0.2355 | 0.9388 | 0.9594 | 0.7113 | 9.3477304e-08 | 1161 |
| 0.2323 | 0.9388 | 0.9574 | 0.7183 | 9.346644e-08 | 1162 |
| 0.2483 | 0.9059 | 0.9581 | 0.7113 | 9.345557e-08 | 1163 |
| 0.2390 | 0.9224 | 0.9585 | 0.7113 | 9.344469e-08 | 1164 |
| 0.2611 | 0.9129 | 0.9594 | 0.7113 | 9.3433805e-08 | 1165 |
| 0.2302 | 0.9200 | 0.9591 | 0.7113 | 9.342291e-08 | 1166 |
| 0.2513 | 0.9129 | 0.9588 | 0.7113 | 9.341201e-08 | 1167 |
| 0.2431 | 0.9271 | 0.9593 | 0.7113 | 9.3401106e-08 | 1168 |
| 0.2486 | 0.9082 | 0.9609 | 0.7113 | 9.339019e-08 | 1169 |
| 0.2446 | 0.9176 | 0.9599 | 0.7113 | 9.337926e-08 | 1170 |
| 0.2397 | 0.9176 | 0.9605 | 0.7113 | 9.336833e-08 | 1171 |
| 0.2423 | 0.9224 | 0.9629 | 0.7042 | 9.3357386e-08 | 1172 |
| 0.2190 | 0.9553 | 0.9634 | 0.6972 | 9.3346436e-08 | 1173 |
| 0.2391 | 0.9294 | 0.9605 | 0.7113 | 9.333548e-08 | 1174 |
| 0.2438 | 0.9200 | 0.9617 | 0.7113 | 9.3324516e-08 | 1175 |
| 0.2436 | 0.9176 | 0.9644 | 0.7042 | 9.3313545e-08 | 1176 |
| 0.2474 | 0.9153 | 0.9624 | 0.7113 | 9.330256e-08 | 1177 |
| 0.2578 | 0.9153 | 0.9625 | 0.7113 | 9.329157e-08 | 1178 |
| 0.2458 | 0.9200 | 0.9613 | 0.7113 | 9.328057e-08 | 1179 |
| 0.2436 | 0.9318 | 0.9637 | 0.7113 | 9.326956e-08 | 1180 |
| 0.2387 | 0.9247 | 0.9627 | 0.7113 | 9.325855e-08 | 1181 |
| 0.2460 | 0.9224 | 0.9629 | 0.7113 | 9.324753e-08 | 1182 |
| 0.2386 | 0.9224 | 0.9627 | 0.7254 | 9.32365e-08 | 1183 |
| 0.2290 | 0.9247 | 0.9640 | 0.7183 | 9.322547e-08 | 1184 |
| 0.2250 | 0.9294 | 0.9636 | 0.7254 | 9.321442e-08 | 1185 |
| 0.2285 | 0.9412 | 0.9653 | 0.7113 | 9.320336e-08 | 1186 |
| 0.2429 | 0.9247 | 0.9657 | 0.7183 | 9.31923e-08 | 1187 |
| 0.2284 | 0.9294 | 0.9655 | 0.7254 | 9.318123e-08 | 1188 |
| 0.2303 | 0.9365 | 0.9651 | 0.7254 | 9.317015e-08 | 1189 |
| 0.2245 | 0.9247 | 0.9655 | 0.7254 | 9.3159066e-08 | 1190 |
| 0.2342 | 0.9365 | 0.9677 | 0.7113 | 9.3147975e-08 | 1191 |
| 0.2419 | 0.9247 | 0.9683 | 0.7113 | 9.3136876e-08 | 1192 |
| 0.2358 | 0.9271 | 0.9665 | 0.7254 | 9.312576e-08 | 1193 |
| 0.2376 | 0.9200 | 0.9678 | 0.7254 | 9.311464e-08 | 1194 |
| 0.2253 | 0.9365 | 0.9688 | 0.7183 | 9.3103516e-08 | 1195 |
| 0.2237 | 0.9365 | 0.9689 | 0.7113 | 9.309238e-08 | 1196 |
| 0.2383 | 0.9200 | 0.9685 | 0.7183 | 9.308124e-08 | 1197 |
| 0.2505 | 0.9012 | 0.9701 | 0.7113 | 9.307009e-08 | 1198 |
| 0.2348 | 0.9365 | 0.9707 | 0.7113 | 9.305894e-08 | 1199 |
| 0.2364 | 0.9082 | 0.9715 | 0.7113 | 9.3047774e-08 | 1200 |
| 0.2289 | 0.9412 | 0.9727 | 0.7113 | 9.30366e-08 | 1201 |
| 0.2374 | 0.9318 | 0.9732 | 0.7113 | 9.302541e-08 | 1202 |
| 0.2459 | 0.9294 | 0.9730 | 0.7113 | 9.301422e-08 | 1203 |
| 0.2354 | 0.9271 | 0.9720 | 0.7113 | 9.3003024e-08 | 1204 |
| 0.2285 | 0.9341 | 0.9721 | 0.7113 | 9.299182e-08 | 1205 |
| 0.2364 | 0.9318 | 0.9718 | 0.7113 | 9.2980606e-08 | 1206 |
| 0.2338 | 0.9318 | 0.9739 | 0.7113 | 9.296939e-08 | 1207 |
| 0.2227 | 0.9388 | 0.9731 | 0.7113 | 9.295816e-08 | 1208 |
| 0.2391 | 0.9012 | 0.9723 | 0.7113 | 9.294692e-08 | 1209 |
| 0.2329 | 0.9153 | 0.9725 | 0.7113 | 9.293567e-08 | 1210 |
| 0.2191 | 0.9459 | 0.9739 | 0.7113 | 9.292442e-08 | 1211 |
| 0.2319 | 0.9271 | 0.9733 | 0.7113 | 9.2913155e-08 | 1212 |
| 0.2258 | 0.9271 | 0.9725 | 0.7113 | 9.2901885e-08 | 1213 |
| 0.2352 | 0.9318 | 0.9718 | 0.7183 | 9.289061e-08 | 1214 |
| 0.2363 | 0.9153 | 0.9740 | 0.7113 | 9.2879326e-08 | 1215 |
| 0.2253 | 0.9200 | 0.9765 | 0.7113 | 9.2868035e-08 | 1216 |
| 0.2248 | 0.9224 | 0.9735 | 0.7113 | 9.285674e-08 | 1217 |
| 0.2306 | 0.9224 | 0.9745 | 0.7113 | 9.2845426e-08 | 1218 |
| 0.2360 | 0.9200 | 0.9761 | 0.7113 | 9.283411e-08 | 1219 |
| 0.2379 | 0.9153 | 0.9748 | 0.7113 | 9.282278e-08 | 1220 |
| 0.2225 | 0.9247 | 0.9765 | 0.7113 | 9.281145e-08 | 1221 |
| 0.2213 | 0.9459 | 0.9778 | 0.7113 | 9.280011e-08 | 1222 |
| 0.2238 | 0.9341 | 0.9751 | 0.7254 | 9.278876e-08 | 1223 |
| 0.2351 | 0.9153 | 0.9754 | 0.7254 | 9.2777405e-08 | 1224 |
| 0.2278 | 0.9200 | 0.9763 | 0.7113 | 9.2766044e-08 | 1225 |
| 0.2249 | 0.9271 | 0.9776 | 0.7113 | 9.2754675e-08 | 1226 |
| 0.2130 | 0.9271 | 0.9767 | 0.7113 | 9.274329e-08 | 1227 |
| 0.2119 | 0.9341 | 0.9769 | 0.7113 | 9.27319e-08 | 1228 |
| 0.2259 | 0.9318 | 0.9777 | 0.7113 | 9.2720505e-08 | 1229 |
| 0.2307 | 0.9318 | 0.9775 | 0.7113 | 9.27091e-08 | 1230 |
| 0.2153 | 0.9224 | 0.9777 | 0.7113 | 9.269769e-08 | 1231 |
| 0.2193 | 0.9388 | 0.9772 | 0.7113 | 9.268627e-08 | 1232 |
| 0.2136 | 0.9247 | 0.9779 | 0.7113 | 9.2674846e-08 | 1233 |
| 0.2272 | 0.9153 | 0.9805 | 0.7113 | 9.266341e-08 | 1234 |
| 0.2243 | 0.9318 | 0.9814 | 0.7113 | 9.265197e-08 | 1235 |
| 0.2124 | 0.9365 | 0.9803 | 0.7113 | 9.2640526e-08 | 1236 |
| 0.2327 | 0.9271 | 0.9790 | 0.7183 | 9.2629065e-08 | 1237 |
| 0.2261 | 0.9365 | 0.9806 | 0.7113 | 9.26176e-08 | 1238 |
| 0.2088 | 0.9365 | 0.9827 | 0.7113 | 9.260612e-08 | 1239 |
| 0.2325 | 0.9224 | 0.9826 | 0.7113 | 9.259464e-08 | 1240 |
| 0.2165 | 0.9412 | 0.9795 | 0.7254 | 9.258315e-08 | 1241 |
| 0.2066 | 0.9412 | 0.9809 | 0.7254 | 9.257165e-08 | 1242 |
| 0.1951 | 0.9482 | 0.9822 | 0.7254 | 9.256015e-08 | 1243 |
| 0.2166 | 0.9365 | 0.9821 | 0.7254 | 9.254864e-08 | 1244 |
| 0.2245 | 0.9247 | 0.9822 | 0.7254 | 9.253712e-08 | 1245 |
| 0.2042 | 0.9435 | 0.9830 | 0.7254 | 9.252559e-08 | 1246 |
| 0.2177 | 0.9365 | 0.9855 | 0.7113 | 9.251405e-08 | 1247 |
| 0.2168 | 0.9341 | 0.9850 | 0.7113 | 9.25025e-08 | 1248 |
| 0.2245 | 0.9294 | 0.9852 | 0.7254 | 9.249095e-08 | 1249 |
| 0.2080 | 0.9365 | 0.9843 | 0.7254 | 9.247939e-08 | 1250 |
| 0.2174 | 0.9365 | 0.9839 | 0.7254 | 9.246782e-08 | 1251 |
| 0.2246 | 0.9247 | 0.9867 | 0.7113 | 9.245625e-08 | 1252 |
| 0.2139 | 0.9365 | 0.9870 | 0.7113 | 9.2444665e-08 | 1253 |
| 0.2153 | 0.9388 | 0.9846 | 0.7254 | 9.2433076e-08 | 1254 |
| 0.2191 | 0.9365 | 0.9842 | 0.7254 | 9.242148e-08 | 1255 |
| 0.2219 | 0.9247 | 0.9858 | 0.7254 | 9.240988e-08 | 1256 |
| 0.2072 | 0.9412 | 0.9888 | 0.7113 | 9.239826e-08 | 1257 |
| 0.2312 | 0.9200 | 0.9862 | 0.7254 | 9.2386635e-08 | 1258 |
| 0.2133 | 0.9294 | 0.9870 | 0.7254 | 9.2375004e-08 | 1259 |
| 0.2126 | 0.9388 | 0.9889 | 0.7113 | 9.2363365e-08 | 1260 |
| 0.2068 | 0.9271 | 0.9927 | 0.7113 | 9.235172e-08 | 1261 |
| 0.1979 | 0.9482 | 0.9914 | 0.7042 | 9.2340066e-08 | 1262 |
| 0.1986 | 0.9341 | 0.9886 | 0.7113 | 9.2328406e-08 | 1263 |
| 0.2181 | 0.9341 | 0.9892 | 0.7113 | 9.231674e-08 | 1264 |
| 0.2152 | 0.9294 | 0.9888 | 0.7113 | 9.2305065e-08 | 1265 |
| 0.2085 | 0.9247 | 0.9884 | 0.7254 | 9.2293384e-08 | 1266 |
| 0.2147 | 0.9294 | 0.9894 | 0.7183 | 9.228169e-08 | 1267 |
| 0.2213 | 0.9318 | 0.9927 | 0.7042 | 9.2269985e-08 | 1268 |
| 0.2132 | 0.9365 | 0.9934 | 0.7042 | 9.2258276e-08 | 1269 |
| 0.2294 | 0.9341 | 0.9925 | 0.7113 | 9.224656e-08 | 1270 |
| 0.2104 | 0.9318 | 0.9930 | 0.7042 | 9.2234835e-08 | 1271 |
| 0.1949 | 0.9459 | 0.9918 | 0.7113 | 9.2223104e-08 | 1272 |
| 0.2225 | 0.9294 | 0.9916 | 0.7113 | 9.2211366e-08 | 1273 |
| 0.2177 | 0.9294 | 0.9896 | 0.7254 | 9.219962e-08 | 1274 |
| 0.1972 | 0.9482 | 0.9891 | 0.7254 | 9.218787e-08 | 1275 |
| 0.2041 | 0.9412 | 0.9913 | 0.7254 | 9.217611e-08 | 1276 |
| 0.2056 | 0.9341 | 0.9935 | 0.7254 | 9.216434e-08 | 1277 |
| 0.1910 | 0.9553 | 0.9922 | 0.7254 | 9.215257e-08 | 1278 |
| 0.2137 | 0.9247 | 0.9917 | 0.7254 | 9.214078e-08 | 1279 |
| 0.2177 | 0.9247 | 0.9928 | 0.7254 | 9.2128985e-08 | 1280 |
| 0.2114 | 0.9388 | 0.9939 | 0.7254 | 9.211718e-08 | 1281 |
| 0.2036 | 0.9388 | 0.9956 | 0.7113 | 9.2105374e-08 | 1282 |
| 0.2217 | 0.9412 | 0.9960 | 0.7113 | 9.209356e-08 | 1283 |
| 0.1949 | 0.9435 | 0.9953 | 0.7113 | 9.2081734e-08 | 1284 |
| 0.1983 | 0.9365 | 0.9955 | 0.7254 | 9.2069904e-08 | 1285 |
| 0.2023 | 0.9482 | 0.9951 | 0.7254 | 9.2058066e-08 | 1286 |
| 0.2109 | 0.9247 | 0.9956 | 0.7254 | 9.204622e-08 | 1287 |
| 0.2113 | 0.9224 | 0.9979 | 0.7113 | 9.203437e-08 | 1288 |
| 0.2112 | 0.9365 | 0.9979 | 0.7254 | 9.202251e-08 | 1289 |
| 0.2085 | 0.9294 | 0.9973 | 0.7254 | 9.2010644e-08 | 1290 |
| 0.1924 | 0.9529 | 0.9955 | 0.7254 | 9.1998764e-08 | 1291 |
| 0.1916 | 0.9388 | 0.9967 | 0.7254 | 9.198688e-08 | 1292 |
| 0.2088 | 0.9412 | 0.9973 | 0.7254 | 9.197498e-08 | 1293 |
| 0.2008 | 0.9529 | 0.9973 | 0.7254 | 9.196308e-08 | 1294 |
| 0.2044 | 0.9341 | 0.9979 | 0.7254 | 9.195117e-08 | 1295 |
| 0.2097 | 0.9388 | 0.9997 | 0.7254 | 9.1939256e-08 | 1296 |
| 0.1950 | 0.9412 | 1.0000 | 0.7254 | 9.192733e-08 | 1297 |
| 0.2109 | 0.9365 | 0.9989 | 0.7254 | 9.19154e-08 | 1298 |
| 0.2064 | 0.9365 | 0.9989 | 0.7254 | 9.1903466e-08 | 1299 |
| 0.2026 | 0.9412 | 0.9991 | 0.7254 | 9.189152e-08 | 1300 |
| 0.2060 | 0.9341 | 1.0000 | 0.7254 | 9.187957e-08 | 1301 |
| 0.1943 | 0.9435 | 1.0036 | 0.7183 | 9.186761e-08 | 1302 |
| 0.2008 | 0.9388 | 1.0042 | 0.7183 | 9.185565e-08 | 1303 |
| 0.2004 | 0.9435 | 1.0036 | 0.7254 | 9.184367e-08 | 1304 |
| 0.2002 | 0.9365 | 1.0023 | 0.7254 | 9.183168e-08 | 1305 |
| 0.1976 | 0.9435 | 1.0007 | 0.7254 | 9.1819686e-08 | 1306 |
| 0.1907 | 0.9412 | 1.0020 | 0.7254 | 9.1807685e-08 | 1307 |
| 0.1964 | 0.9435 | 1.0034 | 0.7254 | 9.179568e-08 | 1308 |
| 0.1935 | 0.9388 | 1.0040 | 0.7254 | 9.178366e-08 | 1309 |
| 0.2107 | 0.9271 | 1.0063 | 0.7254 | 9.177164e-08 | 1310 |
| 0.1962 | 0.9388 | 1.0065 | 0.7254 | 9.175961e-08 | 1311 |
| 0.2016 | 0.9506 | 1.0056 | 0.7254 | 9.174757e-08 | 1312 |
| 0.2024 | 0.9294 | 1.0051 | 0.7254 | 9.173553e-08 | 1313 |
| 0.1935 | 0.9341 | 1.0057 | 0.7254 | 9.172348e-08 | 1314 |
| 0.1939 | 0.9412 | 1.0076 | 0.7183 | 9.171142e-08 | 1315 |
| 0.1883 | 0.9435 | 1.0083 | 0.7183 | 9.1699356e-08 | 1316 |
| 0.2000 | 0.9247 | 1.0071 | 0.7254 | 9.168728e-08 | 1317 |
| 0.2031 | 0.9224 | 1.0069 | 0.7254 | 9.16752e-08 | 1318 |
| 0.1831 | 0.9553 | 1.0081 | 0.7254 | 9.1663104e-08 | 1319 |
| 0.1891 | 0.9459 | 1.0100 | 0.7254 | 9.1651e-08 | 1320 |
| 0.1932 | 0.9412 | 1.0093 | 0.7254 | 9.1638896e-08 | 1321 |
| 0.1950 | 0.9247 | 1.0084 | 0.7254 | 9.162678e-08 | 1322 |
| 0.1996 | 0.9271 | 1.0092 | 0.7254 | 9.161466e-08 | 1323 |
| 0.1958 | 0.9365 | 1.0095 | 0.7254 | 9.160253e-08 | 1324 |
| 0.1900 | 0.9412 | 1.0106 | 0.7254 | 9.1590394e-08 | 1325 |
| 0.1812 | 0.9529 | 1.0127 | 0.7324 | 9.157825e-08 | 1326 |
| 0.1889 | 0.9388 | 1.0112 | 0.7254 | 9.15661e-08 | 1327 |
| 0.1918 | 0.9412 | 1.0123 | 0.7254 | 9.155394e-08 | 1328 |
| 0.2004 | 0.9388 | 1.0136 | 0.7254 | 9.154178e-08 | 1329 |
| 0.2025 | 0.9341 | 1.0151 | 0.7183 | 9.152961e-08 | 1330 |
| 0.1811 | 0.9459 | 1.0149 | 0.7254 | 9.151743e-08 | 1331 |
| 0.1892 | 0.9388 | 1.0145 | 0.7324 | 9.150524e-08 | 1332 |
| 0.1909 | 0.9365 | 1.0140 | 0.7254 | 9.149305e-08 | 1333 |
| 0.1840 | 0.9553 | 1.0139 | 0.7324 | 9.148085e-08 | 1334 |
| 0.1746 | 0.9553 | 1.0149 | 0.7324 | 9.1468635e-08 | 1335 |
| 0.1936 | 0.9412 | 1.0162 | 0.7324 | 9.1456414e-08 | 1336 |
| 0.1862 | 0.9506 | 1.0184 | 0.7042 | 9.1444186e-08 | 1337 |
| 0.1906 | 0.9365 | 1.0184 | 0.7183 | 9.143195e-08 | 1338 |
| 0.1874 | 0.9553 | 1.0147 | 0.7254 | 9.141971e-08 | 1339 |
| 0.1932 | 0.9435 | 1.0158 | 0.7254 | 9.140746e-08 | 1340 |
| 0.1944 | 0.9412 | 1.0173 | 0.7254 | 9.13952e-08 | 1341 |
| 0.1976 | 0.9294 | 1.0169 | 0.7254 | 9.138294e-08 | 1342 |
| 0.1951 | 0.9388 | 1.0180 | 0.7324 | 9.1370666e-08 | 1343 |
| 0.1801 | 0.9412 | 1.0165 | 0.7254 | 9.135839e-08 | 1344 |
| 0.2004 | 0.9412 | 1.0172 | 0.7254 | 9.13461e-08 | 1345 |
| 0.1866 | 0.9435 | 1.0198 | 0.7324 | 9.133381e-08 | 1346 |
| 0.1853 | 0.9412 | 1.0211 | 0.7254 | 9.132151e-08 | 1347 |
| 0.1965 | 0.9435 | 1.0243 | 0.7042 | 9.1309204e-08 | 1348 |
| 0.1969 | 0.9365 | 1.0242 | 0.7113 | 9.129689e-08 | 1349 |
| 0.1845 | 0.9506 | 1.0226 | 0.7183 | 9.128457e-08 | 1350 |
| 0.1907 | 0.9459 | 1.0214 | 0.7324 | 9.127224e-08 | 1351 |
| 0.1808 | 0.9459 | 1.0203 | 0.7254 | 9.1259906e-08 | 1352 |
| 0.1736 | 0.9553 | 1.0219 | 0.7324 | 9.124756e-08 | 1353 |
| 0.1864 | 0.9435 | 1.0236 | 0.7254 | 9.12352e-08 | 1354 |
| 0.1728 | 0.9459 | 1.0229 | 0.7324 | 9.122284e-08 | 1355 |
| 0.1958 | 0.9365 | 1.0232 | 0.7254 | 9.121047e-08 | 1356 |
| 0.1869 | 0.9412 | 1.0203 | 0.7254 | 9.119809e-08 | 1357 |
| 0.1802 | 0.9482 | 1.0218 | 0.7254 | 9.1185704e-08 | 1358 |
| 0.1880 | 0.9388 | 1.0218 | 0.7254 | 9.117331e-08 | 1359 |
| 0.1771 | 0.9459 | 1.0234 | 0.7324 | 9.116091e-08 | 1360 |
| 0.1952 | 0.9506 | 1.0243 | 0.7324 | 9.114851e-08 | 1361 |
| 0.1929 | 0.9506 | 1.0240 | 0.7324 | 9.1136094e-08 | 1362 |
| 0.1711 | 0.9624 | 1.0228 | 0.7254 | 9.1123674e-08 | 1363 |
| 0.1873 | 0.9435 | 1.0248 | 0.7324 | 9.111125e-08 | 1364 |
| 0.1767 | 0.9459 | 1.0286 | 0.7254 | 9.109881e-08 | 1365 |
| 0.1765 | 0.9529 | 1.0275 | 0.7254 | 9.108637e-08 | 1366 |
| 0.1737 | 0.9529 | 1.0265 | 0.7254 | 9.107392e-08 | 1367 |
| 0.1832 | 0.9412 | 1.0277 | 0.7254 | 9.1061466e-08 | 1368 |
| 0.1941 | 0.9388 | 1.0270 | 0.7324 | 9.1049e-08 | 1369 |
| 0.1786 | 0.9506 | 1.0287 | 0.7254 | 9.103653e-08 | 1370 |
| 0.1782 | 0.9506 | 1.0302 | 0.7254 | 9.1024056e-08 | 1371 |
| 0.1734 | 0.9529 | 1.0296 | 0.7254 | 9.101157e-08 | 1372 |
| 0.1692 | 0.9553 | 1.0286 | 0.7324 | 9.099908e-08 | 1373 |
| 0.1765 | 0.9459 | 1.0303 | 0.7254 | 9.098658e-08 | 1374 |
| 0.1754 | 0.9412 | 1.0304 | 0.7254 | 9.0974076e-08 | 1375 |
| 0.1664 | 0.9553 | 1.0325 | 0.7254 | 9.096156e-08 | 1376 |
| 0.1919 | 0.9412 | 1.0308 | 0.7183 | 9.094903e-08 | 1377 |
| 0.1773 | 0.9529 | 1.0319 | 0.7254 | 9.0936496e-08 | 1378 |
| 0.1794 | 0.9412 | 1.0310 | 0.7324 | 9.0923955e-08 | 1379 |
| 0.1799 | 0.9482 | 1.0301 | 0.7254 | 9.0911406e-08 | 1380 |
| 0.1820 | 0.9412 | 1.0300 | 0.7254 | 9.089885e-08 | 1381 |
| 0.1707 | 0.9459 | 1.0346 | 0.7254 | 9.088629e-08 | 1382 |
| 0.1738 | 0.9529 | 1.0366 | 0.7183 | 9.087372e-08 | 1383 |
| 0.1762 | 0.9459 | 1.0378 | 0.7042 | 9.086114e-08 | 1384 |
| 0.1683 | 0.9435 | 1.0380 | 0.6972 | 9.084856e-08 | 1385 |
| 0.1785 | 0.9506 | 1.0364 | 0.7183 | 9.083597e-08 | 1386 |
| 0.1845 | 0.9459 | 1.0360 | 0.7254 | 9.082337e-08 | 1387 |
| 0.1769 | 0.9459 | 1.0362 | 0.7254 | 9.0810765e-08 | 1388 |
| 0.1754 | 0.9459 | 1.0375 | 0.7183 | 9.079815e-08 | 1389 |
| 0.1753 | 0.9459 | 1.0390 | 0.7183 | 9.0785534e-08 | 1390 |
| 0.1765 | 0.9482 | 1.0408 | 0.7113 | 9.077291e-08 | 1391 |
| 0.1650 | 0.9506 | 1.0416 | 0.7113 | 9.0760274e-08 | 1392 |
| 0.1967 | 0.9435 | 1.0399 | 0.7254 | 9.074763e-08 | 1393 |
| 0.1748 | 0.9506 | 1.0352 | 0.7254 | 9.0734986e-08 | 1394 |
| 0.1779 | 0.9506 | 1.0348 | 0.7254 | 9.072233e-08 | 1395 |
| 0.1720 | 0.9459 | 1.0367 | 0.7254 | 9.070967e-08 | 1396 |
| 0.1583 | 0.9624 | 1.0407 | 0.7254 | 9.0697e-08 | 1397 |
| 0.1808 | 0.9459 | 1.0443 | 0.7113 | 9.0684324e-08 | 1398 |
| 0.1708 | 0.9529 | 1.0441 | 0.7254 | 9.067164e-08 | 1399 |
| 0.1833 | 0.9553 | 1.0443 | 0.7183 | 9.065895e-08 | 1400 |
| 0.1805 | 0.9435 | 1.0441 | 0.7183 | 9.064625e-08 | 1401 |
| 0.1692 | 0.9482 | 1.0414 | 0.7324 | 9.063355e-08 | 1402 |
| 0.1686 | 0.9553 | 1.0412 | 0.7324 | 9.062084e-08 | 1403 |
| 0.1690 | 0.9482 | 1.0416 | 0.7254 | 9.060812e-08 | 1404 |
| 0.1886 | 0.9388 | 1.0438 | 0.7183 | 9.059539e-08 | 1405 |
| 0.1642 | 0.9506 | 1.0460 | 0.7113 | 9.058266e-08 | 1406 |
| 0.1801 | 0.9529 | 1.0468 | 0.7113 | 9.056992e-08 | 1407 |
| 0.1819 | 0.9529 | 1.0474 | 0.7113 | 9.055717e-08 | 1408 |
| 0.1622 | 0.9600 | 1.0458 | 0.7113 | 9.054442e-08 | 1409 |
| 0.1557 | 0.9647 | 1.0429 | 0.7254 | 9.053165e-08 | 1410 |
| 0.1789 | 0.9388 | 1.0432 | 0.7324 | 9.0518874e-08 | 1411 |
| 0.1712 | 0.9435 | 1.0430 | 0.7324 | 9.050609e-08 | 1412 |
| 0.1741 | 0.9435 | 1.0438 | 0.7324 | 9.04933e-08 | 1413 |
| 0.1649 | 0.9553 | 1.0453 | 0.7324 | 9.0480505e-08 | 1414 |
| 0.1648 | 0.9529 | 1.0475 | 0.7254 | 9.04677e-08 | 1415 |
| 0.1668 | 0.9459 | 1.0482 | 0.7254 | 9.045489e-08 | 1416 |
| 0.1659 | 0.9576 | 1.0463 | 0.7324 | 9.044207e-08 | 1417 |
| 0.1602 | 0.9600 | 1.0448 | 0.7324 | 9.0429246e-08 | 1418 |
| 0.1707 | 0.9412 | 1.0457 | 0.7324 | 9.0416414e-08 | 1419 |
| 0.1730 | 0.9459 | 1.0466 | 0.7324 | 9.0403574e-08 | 1420 |
| 0.1536 | 0.9647 | 1.0476 | 0.7254 | 9.039073e-08 | 1421 |
| 0.1781 | 0.9388 | 1.0515 | 0.7183 | 9.0377874e-08 | 1422 |
| 0.1720 | 0.9388 | 1.0485 | 0.7324 | 9.036501e-08 | 1423 |
| 0.1746 | 0.9482 | 1.0511 | 0.7183 | 9.0352145e-08 | 1424 |
| 0.1659 | 0.9435 | 1.0528 | 0.7113 | 9.033927e-08 | 1425 |
| 0.1643 | 0.9647 | 1.0544 | 0.7042 | 9.032639e-08 | 1426 |
| 0.1786 | 0.9459 | 1.0533 | 0.7183 | 9.03135e-08 | 1427 |
| 0.1646 | 0.9482 | 1.0516 | 0.7183 | 9.03006e-08 | 1428 |
| 0.1749 | 0.9388 | 1.0539 | 0.7183 | 9.02877e-08 | 1429 |
| 0.1636 | 0.9529 | 1.0529 | 0.7183 | 9.027479e-08 | 1430 |
| 0.1692 | 0.9506 | 1.0542 | 0.7183 | 9.026187e-08 | 1431 |
| 0.1616 | 0.9529 | 1.0531 | 0.7183 | 9.0248946e-08 | 1432 |
| 0.1764 | 0.9459 | 1.0513 | 0.7254 | 9.0236014e-08 | 1433 |
| 0.1660 | 0.9529 | 1.0528 | 0.7183 | 9.0223075e-08 | 1434 |
| 0.1613 | 0.9506 | 1.0531 | 0.7183 | 9.021013e-08 | 1435 |
| 0.1502 | 0.9600 | 1.0546 | 0.7183 | 9.0197176e-08 | 1436 |
| 0.1513 | 0.9671 | 1.0550 | 0.7183 | 9.0184216e-08 | 1437 |
| 0.1745 | 0.9482 | 1.0541 | 0.7254 | 9.017125e-08 | 1438 |
| 0.1661 | 0.9482 | 1.0567 | 0.7183 | 9.0158274e-08 | 1439 |
| 0.1683 | 0.9553 | 1.0572 | 0.7183 | 9.014529e-08 | 1440 |
| 0.1560 | 0.9671 | 1.0564 | 0.7254 | 9.01323e-08 | 1441 |
| 0.1726 | 0.9459 | 1.0539 | 0.7324 | 9.011931e-08 | 1442 |
| 0.1599 | 0.9553 | 1.0587 | 0.7113 | 9.0106305e-08 | 1443 |
| 0.1592 | 0.9576 | 1.0603 | 0.7113 | 9.0093295e-08 | 1444 |
| 0.1693 | 0.9506 | 1.0643 | 0.7042 | 9.008028e-08 | 1445 |
| 0.1633 | 0.9600 | 1.0648 | 0.7113 | 9.006725e-08 | 1446 |
| 0.1589 | 0.9624 | 1.0624 | 0.7113 | 9.005422e-08 | 1447 |
| 0.1641 | 0.9576 | 1.0601 | 0.7254 | 9.004118e-08 | 1448 |
| 0.1573 | 0.9529 | 1.0570 | 0.7254 | 9.002814e-08 | 1449 |
| 0.1656 | 0.9412 | 1.0562 | 0.7324 | 9.0015085e-08 | 1450 |
| 0.1560 | 0.9600 | 1.0579 | 0.7324 | 9.0002025e-08 | 1451 |
| 0.1703 | 0.9482 | 1.0593 | 0.7324 | 8.998896e-08 | 1452 |
| 0.1633 | 0.9482 | 1.0581 | 0.7324 | 8.9975885e-08 | 1453 |
| 0.1763 | 0.9435 | 1.0597 | 0.7324 | 8.99628e-08 | 1454 |
| 0.1617 | 0.9482 | 1.0603 | 0.7254 | 8.9949715e-08 | 1455 |
| 0.1767 | 0.9482 | 1.0615 | 0.7254 | 8.993662e-08 | 1456 |
| 0.1545 | 0.9694 | 1.0614 | 0.7254 | 8.992352e-08 | 1457 |
| 0.1516 | 0.9600 | 1.0628 | 0.7183 | 8.991041e-08 | 1458 |
| 0.1547 | 0.9529 | 1.0636 | 0.7183 | 8.989729e-08 | 1459 |
| 0.1487 | 0.9718 | 1.0634 | 0.7183 | 8.988417e-08 | 1460 |
| 0.1627 | 0.9529 | 1.0644 | 0.7183 | 8.987104e-08 | 1461 |
| 0.1572 | 0.9529 | 1.0635 | 0.7254 | 8.98579e-08 | 1462 |
| 0.1525 | 0.9553 | 1.0649 | 0.7183 | 8.9844754e-08 | 1463 |
| 0.1567 | 0.9576 | 1.0652 | 0.7183 | 8.98316e-08 | 1464 |
| 0.1742 | 0.9412 | 1.0648 | 0.7254 | 8.981844e-08 | 1465 |
| 0.1678 | 0.9506 | 1.0660 | 0.7183 | 8.9805276e-08 | 1466 |
| 0.1418 | 0.9671 | 1.0667 | 0.7183 | 8.97921e-08 | 1467 |
| 0.1671 | 0.9365 | 1.0673 | 0.7183 | 8.977892e-08 | 1468 |
| 0.1572 | 0.9459 | 1.0664 | 0.7324 | 8.9765734e-08 | 1469 |
| 0.1621 | 0.9529 | 1.0665 | 0.7324 | 8.975254e-08 | 1470 |
| 0.1604 | 0.9624 | 1.0671 | 0.7254 | 8.973934e-08 | 1471 |
| 0.1701 | 0.9435 | 1.0681 | 0.7254 | 8.972613e-08 | 1472 |
| 0.1569 | 0.9529 | 1.0696 | 0.7183 | 8.971291e-08 | 1473 |
| 0.1551 | 0.9624 | 1.0700 | 0.7183 | 8.969969e-08 | 1474 |
| 0.1599 | 0.9482 | 1.0732 | 0.7113 | 8.968646e-08 | 1475 |
| 0.1634 | 0.9529 | 1.0745 | 0.7183 | 8.967322e-08 | 1476 |
| 0.1454 | 0.9671 | 1.0722 | 0.7183 | 8.965998e-08 | 1477 |
| 0.1454 | 0.9553 | 1.0715 | 0.7183 | 8.9646726e-08 | 1478 |
| 0.1540 | 0.9576 | 1.0700 | 0.7254 | 8.963347e-08 | 1479 |
| 0.1474 | 0.9647 | 1.0707 | 0.7254 | 8.96202e-08 | 1480 |
| 0.1478 | 0.9553 | 1.0728 | 0.7183 | 8.960693e-08 | 1481 |
| 0.1599 | 0.9506 | 1.0724 | 0.7183 | 8.959365e-08 | 1482 |
| 0.1524 | 0.9600 | 1.0742 | 0.7183 | 8.958036e-08 | 1483 |
| 0.1530 | 0.9506 | 1.0745 | 0.7183 | 8.956707e-08 | 1484 |
| 0.1543 | 0.9506 | 1.0729 | 0.7254 | 8.9553765e-08 | 1485 |
| 0.1465 | 0.9600 | 1.0729 | 0.7254 | 8.954046e-08 | 1486 |
| 0.1555 | 0.9553 | 1.0745 | 0.7183 | 8.952714e-08 | 1487 |
| 0.1644 | 0.9553 | 1.0752 | 0.7183 | 8.951382e-08 | 1488 |
| 0.1644 | 0.9435 | 1.0752 | 0.7183 | 8.950049e-08 | 1489 |
| 0.1445 | 0.9647 | 1.0755 | 0.7183 | 8.948715e-08 | 1490 |
| 0.1544 | 0.9600 | 1.0757 | 0.7183 | 8.947381e-08 | 1491 |
| 0.1517 | 0.9624 | 1.0758 | 0.7183 | 8.9460464e-08 | 1492 |
| 0.1486 | 0.9718 | 1.0755 | 0.7254 | 8.944711e-08 | 1493 |
| 0.1765 | 0.9388 | 1.0777 | 0.7183 | 8.9433755e-08 | 1494 |
| 0.1448 | 0.9576 | 1.0780 | 0.7183 | 8.942039e-08 | 1495 |
| 0.1549 | 0.9506 | 1.0777 | 0.7183 | 8.940702e-08 | 1496 |
| 0.1570 | 0.9576 | 1.0770 | 0.7254 | 8.939364e-08 | 1497 |
| 0.1568 | 0.9576 | 1.0757 | 0.7254 | 8.938025e-08 | 1498 |
| 0.1500 | 0.9482 | 1.0762 | 0.7254 | 8.936686e-08 | 1499 |
| 0.1397 | 0.9647 | 1.0781 | 0.7183 | 8.9353456e-08 | 1500 |
| 0.1537 | 0.9506 | 1.0780 | 0.7254 | 8.934005e-08 | 1501 |
| 0.1521 | 0.9624 | 1.0799 | 0.7183 | 8.932663e-08 | 1502 |
| 0.1587 | 0.9482 | 1.0813 | 0.7183 | 8.931321e-08 | 1503 |
| 0.1529 | 0.9600 | 1.0790 | 0.7254 | 8.929978e-08 | 1504 |
| 0.1551 | 0.9482 | 1.0797 | 0.7254 | 8.9286345e-08 | 1505 |
| 0.1576 | 0.9459 | 1.0813 | 0.7183 | 8.92729e-08 | 1506 |
| 0.1568 | 0.9576 | 1.0845 | 0.7254 | 8.925945e-08 | 1507 |
| 0.1631 | 0.9459 | 1.0865 | 0.7254 | 8.9245994e-08 | 1508 |
| 0.1432 | 0.9671 | 1.0861 | 0.7254 | 8.923253e-08 | 1509 |
| 0.1363 | 0.9647 | 1.0856 | 0.7254 | 8.921906e-08 | 1510 |
| 0.1366 | 0.9624 | 1.0863 | 0.7254 | 8.920558e-08 | 1511 |
| 0.1444 | 0.9647 | 1.0839 | 0.7254 | 8.919209e-08 | 1512 |
| 0.1530 | 0.9576 | 1.0846 | 0.7183 | 8.91786e-08 | 1513 |
| 0.1471 | 0.9529 | 1.0859 | 0.7183 | 8.91651e-08 | 1514 |
| 0.1505 | 0.9694 | 1.0888 | 0.7254 | 8.915159e-08 | 1515 |
| 0.1629 | 0.9529 | 1.0886 | 0.7254 | 8.913808e-08 | 1516 |
| 0.1630 | 0.9529 | 1.0866 | 0.7254 | 8.9124555e-08 | 1517 |
| 0.1591 | 0.9506 | 1.0862 | 0.7254 | 8.9111026e-08 | 1518 |
| 0.1472 | 0.9553 | 1.0850 | 0.7254 | 8.909749e-08 | 1519 |
| 0.1482 | 0.9624 | 1.0862 | 0.7254 | 8.908395e-08 | 1520 |
| 0.1501 | 0.9553 | 1.0870 | 0.7183 | 8.90704e-08 | 1521 |
| 0.1469 | 0.9529 | 1.0870 | 0.7254 | 8.905684e-08 | 1522 |
| 0.1413 | 0.9576 | 1.0865 | 0.7254 | 8.9043276e-08 | 1523 |
| 0.1402 | 0.9647 | 1.0860 | 0.7183 | 8.9029704e-08 | 1524 |
| 0.1320 | 0.9624 | 1.0878 | 0.7254 | 8.9016126e-08 | 1525 |
| 0.1528 | 0.9553 | 1.0905 | 0.7254 | 8.900255e-08 | 1526 |
| 0.1335 | 0.9694 | 1.0899 | 0.7183 | 8.898896e-08 | 1527 |
| 0.1478 | 0.9600 | 1.0919 | 0.7254 | 8.897537e-08 | 1528 |
| 0.1374 | 0.9671 | 1.0929 | 0.7254 | 8.896177e-08 | 1529 |
| 0.1417 | 0.9600 | 1.0931 | 0.7254 | 8.894816e-08 | 1530 |
| 0.1387 | 0.9647 | 1.0934 | 0.7254 | 8.893455e-08 | 1531 |
| 0.1373 | 0.9671 | 1.0955 | 0.7254 | 8.892093e-08 | 1532 |
| 0.1383 | 0.9576 | 1.0947 | 0.7254 | 8.89073e-08 | 1533 |
| 0.1452 | 0.9482 | 1.0946 | 0.7254 | 8.8893664e-08 | 1534 |
| 0.1411 | 0.9506 | 1.0939 | 0.7254 | 8.888002e-08 | 1535 |
| 0.1574 | 0.9482 | 1.0936 | 0.7254 | 8.886637e-08 | 1536 |
| 0.1365 | 0.9671 | 1.0917 | 0.7254 | 8.8852715e-08 | 1537 |
| 0.1452 | 0.9624 | 1.0925 | 0.7254 | 8.883905e-08 | 1538 |
| 0.1477 | 0.9482 | 1.0937 | 0.7254 | 8.882538e-08 | 1539 |
| 0.1412 | 0.9671 | 1.0956 | 0.7394 | 8.88117e-08 | 1540 |
| 0.1447 | 0.9624 | 1.0952 | 0.7324 | 8.879802e-08 | 1541 |
| 0.1358 | 0.9647 | 1.0966 | 0.7254 | 8.8784326e-08 | 1542 |
| 0.1489 | 0.9506 | 1.0997 | 0.7254 | 8.8770626e-08 | 1543 |
| 0.1573 | 0.9506 | 1.0987 | 0.7254 | 8.875692e-08 | 1544 |
| 0.1374 | 0.9624 | 1.0982 | 0.7254 | 8.874321e-08 | 1545 |
| 0.1322 | 0.9718 | 1.0994 | 0.7254 | 8.8729486e-08 | 1546 |
| 0.1292 | 0.9718 | 1.0992 | 0.7254 | 8.871576e-08 | 1547 |
| 0.1480 | 0.9576 | 1.1002 | 0.7254 | 8.870203e-08 | 1548 |
| 0.1340 | 0.9718 | 1.1005 | 0.7254 | 8.8688296e-08 | 1549 |
| 0.1332 | 0.9671 | 1.0997 | 0.7254 | 8.8674554e-08 | 1550 |
| 0.1416 | 0.9624 | 1.0983 | 0.7254 | 8.8660805e-08 | 1551 |
| 0.1288 | 0.9624 | 1.1002 | 0.7324 | 8.864705e-08 | 1552 |
| 0.1382 | 0.9671 | 1.0999 | 0.7254 | 8.8633286e-08 | 1553 |
| 0.1328 | 0.9576 | 1.1012 | 0.7254 | 8.8619515e-08 | 1554 |
| 0.1306 | 0.9694 | 1.1011 | 0.7183 | 8.860574e-08 | 1555 |
| 0.1248 | 0.9694 | 1.1021 | 0.7254 | 8.859195e-08 | 1556 |
| 0.1341 | 0.9600 | 1.1020 | 0.7254 | 8.857816e-08 | 1557 |
| 0.1343 | 0.9600 | 1.1034 | 0.7324 | 8.856436e-08 | 1558 |
| 0.1347 | 0.9647 | 1.1069 | 0.7254 | 8.855056e-08 | 1559 |
| 0.1447 | 0.9529 | 1.1065 | 0.7254 | 8.8536744e-08 | 1560 |
| 0.1443 | 0.9553 | 1.1063 | 0.7254 | 8.8522924e-08 | 1561 |
| 0.1355 | 0.9788 | 1.1063 | 0.7254 | 8.85091e-08 | 1562 |
| 0.1538 | 0.9506 | 1.1061 | 0.7254 | 8.849526e-08 | 1563 |
| 0.1308 | 0.9694 | 1.1082 | 0.7254 | 8.848142e-08 | 1564 |
| 0.1412 | 0.9600 | 1.1090 | 0.7254 | 8.846757e-08 | 1565 |
| 0.1550 | 0.9459 | 1.1087 | 0.7254 | 8.8453724e-08 | 1566 |
| 0.1511 | 0.9506 | 1.1094 | 0.7254 | 8.843987e-08 | 1567 |
| 0.1532 | 0.9506 | 1.1089 | 0.7254 | 8.8426006e-08 | 1568 |
| 0.1265 | 0.9671 | 1.1068 | 0.7324 | 8.8412136e-08 | 1569 |
| 0.1408 | 0.9600 | 1.1067 | 0.7324 | 8.839826e-08 | 1570 |
| 0.1349 | 0.9671 | 1.1071 | 0.7324 | 8.8384375e-08 | 1571 |
| 0.1224 | 0.9624 | 1.1064 | 0.7394 | 8.8370484e-08 | 1572 |
| 0.1375 | 0.9553 | 1.1103 | 0.7254 | 8.8356586e-08 | 1573 |
| 0.1281 | 0.9671 | 1.1114 | 0.7254 | 8.834268e-08 | 1574 |
| 0.1262 | 0.9671 | 1.1130 | 0.7254 | 8.832877e-08 | 1575 |
| 0.1472 | 0.9624 | 1.1121 | 0.7254 | 8.831485e-08 | 1576 |
| 0.1381 | 0.9600 | 1.1114 | 0.7254 | 8.830092e-08 | 1577 |
| 0.1331 | 0.9694 | 1.1113 | 0.7254 | 8.828699e-08 | 1578 |
| 0.1401 | 0.9506 | 1.1104 | 0.7324 | 8.827305e-08 | 1579 |
| 0.1446 | 0.9600 | 1.1117 | 0.7254 | 8.82591e-08 | 1580 |
| 0.1349 | 0.9647 | 1.1115 | 0.7254 | 8.8245145e-08 | 1581 |
| 0.1345 | 0.9576 | 1.1125 | 0.7183 | 8.823119e-08 | 1582 |
| 0.1328 | 0.9694 | 1.1152 | 0.7254 | 8.821723e-08 | 1583 |
| 0.1387 | 0.9576 | 1.1151 | 0.7254 | 8.820326e-08 | 1584 |
| 0.1325 | 0.9671 | 1.1147 | 0.7254 | 8.818928e-08 | 1585 |
| 0.1310 | 0.9624 | 1.1132 | 0.7324 | 8.81753e-08 | 1586 |
| 0.1347 | 0.9718 | 1.1140 | 0.7254 | 8.816131e-08 | 1587 |
| 0.1217 | 0.9741 | 1.1141 | 0.7254 | 8.814731e-08 | 1588 |
| 0.1282 | 0.9694 | 1.1152 | 0.7254 | 8.8133305e-08 | 1589 |
| 0.1285 | 0.9647 | 1.1169 | 0.7254 | 8.811929e-08 | 1590 |
| 0.1195 | 0.9671 | 1.1163 | 0.7254 | 8.8105274e-08 | 1591 |
| 0.1294 | 0.9694 | 1.1152 | 0.7324 | 8.809125e-08 | 1592 |
| 0.1335 | 0.9624 | 1.1145 | 0.7254 | 8.8077215e-08 | 1593 |
| 0.1324 | 0.9647 | 1.1148 | 0.7254 | 8.8063175e-08 | 1594 |
| 0.1263 | 0.9671 | 1.1165 | 0.7254 | 8.8049134e-08 | 1595 |
| 0.1281 | 0.9671 | 1.1191 | 0.7254 | 8.803509e-08 | 1596 |
| 0.1297 | 0.9671 | 1.1209 | 0.7254 | 8.802103e-08 | 1597 |
| 0.1220 | 0.9765 | 1.1206 | 0.7254 | 8.800697e-08 | 1598 |
| 0.1384 | 0.9647 | 1.1212 | 0.7254 | 8.79929e-08 | 1599 |
| 0.1315 | 0.9600 | 1.1241 | 0.7254 | 8.7978826e-08 | 1600 |
| 0.1456 | 0.9624 | 1.1247 | 0.7254 | 8.796474e-08 | 1601 |
| 0.1328 | 0.9576 | 1.1258 | 0.7254 | 8.795065e-08 | 1602 |
| 0.1232 | 0.9671 | 1.1241 | 0.7254 | 8.7936556e-08 | 1603 |
| 0.1323 | 0.9624 | 1.1219 | 0.7254 | 8.792245e-08 | 1604 |
| 0.1262 | 0.9671 | 1.1219 | 0.7254 | 8.790834e-08 | 1605 |
| 0.1256 | 0.9624 | 1.1227 | 0.7254 | 8.789422e-08 | 1606 |
| 0.1276 | 0.9576 | 1.1235 | 0.7254 | 8.7880096e-08 | 1607 |
| 0.1399 | 0.9624 | 1.1283 | 0.7183 | 8.786597e-08 | 1608 |
| 0.1276 | 0.9671 | 1.1302 | 0.7183 | 8.785184e-08 | 1609 |
| 0.1258 | 0.9718 | 1.1299 | 0.7183 | 8.78377e-08 | 1610 |
| 0.1364 | 0.9624 | 1.1261 | 0.7254 | 8.782355e-08 | 1611 |
| 0.1127 | 0.9765 | 1.1252 | 0.7254 | 8.78094e-08 | 1612 |
| 0.1248 | 0.9647 | 1.1253 | 0.7254 | 8.7795236e-08 | 1613 |
| 0.1292 | 0.9694 | 1.1265 | 0.7254 | 8.778107e-08 | 1614 |
| 0.1249 | 0.9529 | 1.1285 | 0.7183 | 8.776689e-08 | 1615 |
| 0.1284 | 0.9647 | 1.1278 | 0.7254 | 8.775271e-08 | 1616 |
| 0.1259 | 0.9624 | 1.1269 | 0.7254 | 8.773852e-08 | 1617 |
| 0.1256 | 0.9718 | 1.1267 | 0.7254 | 8.7724324e-08 | 1618 |
| 0.1254 | 0.9765 | 1.1273 | 0.7254 | 8.771012e-08 | 1619 |
| 0.1293 | 0.9624 | 1.1324 | 0.7183 | 8.7695916e-08 | 1620 |
| 0.1189 | 0.9647 | 1.1301 | 0.7254 | 8.7681705e-08 | 1621 |
| 0.1284 | 0.9600 | 1.1281 | 0.7254 | 8.766749e-08 | 1622 |
| 0.1182 | 0.9741 | 1.1276 | 0.7254 | 8.765326e-08 | 1623 |
| 0.1270 | 0.9624 | 1.1270 | 0.7254 | 8.763903e-08 | 1624 |
| 0.1270 | 0.9624 | 1.1285 | 0.7254 | 8.762479e-08 | 1625 |
| 0.1169 | 0.9741 | 1.1295 | 0.7254 | 8.7610545e-08 | 1626 |
| 0.1223 | 0.9694 | 1.1292 | 0.7254 | 8.759629e-08 | 1627 |
| 0.1205 | 0.9671 | 1.1298 | 0.7254 | 8.758203e-08 | 1628 |
| 0.1441 | 0.9600 | 1.1322 | 0.7254 | 8.756776e-08 | 1629 |
| 0.1316 | 0.9647 | 1.1325 | 0.7254 | 8.7553495e-08 | 1630 |
| 0.1219 | 0.9694 | 1.1322 | 0.7254 | 8.753922e-08 | 1631 |
| 0.1128 | 0.9765 | 1.1316 | 0.7254 | 8.752494e-08 | 1632 |
| 0.1249 | 0.9765 | 1.1334 | 0.7254 | 8.751065e-08 | 1633 |
| 0.1221 | 0.9624 | 1.1344 | 0.7254 | 8.749635e-08 | 1634 |
| 0.1132 | 0.9741 | 1.1352 | 0.7254 | 8.748205e-08 | 1635 |
| 0.1342 | 0.9647 | 1.1360 | 0.7183 | 8.746774e-08 | 1636 |
| 0.1208 | 0.9718 | 1.1358 | 0.7254 | 8.745342e-08 | 1637 |
| 0.1263 | 0.9718 | 1.1344 | 0.7324 | 8.74391e-08 | 1638 |
| 0.1176 | 0.9671 | 1.1344 | 0.7254 | 8.7424766e-08 | 1639 |
| 0.1344 | 0.9647 | 1.1350 | 0.7254 | 8.741043e-08 | 1640 |
| 0.1163 | 0.9694 | 1.1371 | 0.7254 | 8.739609e-08 | 1641 |
| 0.1142 | 0.9718 | 1.1379 | 0.7254 | 8.738174e-08 | 1642 |
| 0.1274 | 0.9624 | 1.1398 | 0.7183 | 8.736739e-08 | 1643 |
| 0.1384 | 0.9624 | 1.1408 | 0.7183 | 8.735303e-08 | 1644 |
| 0.1294 | 0.9600 | 1.1395 | 0.7183 | 8.733866e-08 | 1645 |
| 0.1344 | 0.9600 | 1.1396 | 0.7183 | 8.732429e-08 | 1646 |
| 0.1055 | 0.9741 | 1.1396 | 0.7183 | 8.730991e-08 | 1647 |
| 0.1294 | 0.9647 | 1.1404 | 0.7183 | 8.729552e-08 | 1648 |
| 0.1117 | 0.9741 | 1.1413 | 0.7254 | 8.728112e-08 | 1649 |
| 0.1131 | 0.9671 | 1.1411 | 0.7183 | 8.726673e-08 | 1650 |
| 0.1155 | 0.9741 | 1.1447 | 0.7254 | 8.7252324e-08 | 1651 |
| 0.1164 | 0.9671 | 1.1462 | 0.7183 | 8.7237915e-08 | 1652 |
| 0.1061 | 0.9694 | 1.1447 | 0.7254 | 8.72235e-08 | 1653 |
| 0.1167 | 0.9741 | 1.1431 | 0.7183 | 8.7209074e-08 | 1654 |
| 0.1205 | 0.9671 | 1.1433 | 0.7183 | 8.719464e-08 | 1655 |
| 0.1234 | 0.9647 | 1.1452 | 0.7183 | 8.7180204e-08 | 1656 |
| 0.1212 | 0.9647 | 1.1477 | 0.7183 | 8.716576e-08 | 1657 |
| 0.1243 | 0.9718 | 1.1460 | 0.7183 | 8.715131e-08 | 1658 |
| 0.1169 | 0.9694 | 1.1454 | 0.7183 | 8.713685e-08 | 1659 |
| 0.1128 | 0.9718 | 1.1461 | 0.7183 | 8.712239e-08 | 1660 |
| 0.1165 | 0.9718 | 1.1470 | 0.7183 | 8.710792e-08 | 1661 |
| 0.1372 | 0.9576 | 1.1459 | 0.7183 | 8.709345e-08 | 1662 |
| 0.1095 | 0.9765 | 1.1452 | 0.7254 | 8.7078966e-08 | 1663 |
| 0.1182 | 0.9694 | 1.1475 | 0.7254 | 8.706448e-08 | 1664 |
| 0.1093 | 0.9788 | 1.1476 | 0.7254 | 8.704998e-08 | 1665 |
| 0.1180 | 0.9765 | 1.1477 | 0.7254 | 8.703548e-08 | 1666 |
| 0.1383 | 0.9553 | 1.1497 | 0.7254 | 8.702097e-08 | 1667 |
| 0.1147 | 0.9694 | 1.1503 | 0.7254 | 8.700646e-08 | 1668 |
| 0.1254 | 0.9647 | 1.1498 | 0.7183 | 8.6991946e-08 | 1669 |
| 0.1217 | 0.9624 | 1.1503 | 0.7183 | 8.697742e-08 | 1670 |
| 0.1093 | 0.9694 | 1.1515 | 0.7183 | 8.696289e-08 | 1671 |
| 0.1196 | 0.9671 | 1.1515 | 0.7183 | 8.6948354e-08 | 1672 |
| 0.1185 | 0.9718 | 1.1535 | 0.7183 | 8.693381e-08 | 1673 |
| 0.1162 | 0.9647 | 1.1548 | 0.7183 | 8.691926e-08 | 1674 |
| 0.1096 | 0.9788 | 1.1548 | 0.7183 | 8.69047e-08 | 1675 |
| 0.1241 | 0.9624 | 1.1546 | 0.7183 | 8.689013e-08 | 1676 |
| 0.1371 | 0.9506 | 1.1569 | 0.7183 | 8.6875566e-08 | 1677 |
| 0.1200 | 0.9741 | 1.1535 | 0.7254 | 8.686099e-08 | 1678 |
| 0.1197 | 0.9671 | 1.1534 | 0.7254 | 8.684641e-08 | 1679 |
| 0.1072 | 0.9671 | 1.1534 | 0.7183 | 8.6831825e-08 | 1680 |
| 0.1119 | 0.9694 | 1.1550 | 0.7183 | 8.681723e-08 | 1681 |
| 0.1153 | 0.9671 | 1.1550 | 0.7183 | 8.680263e-08 | 1682 |
| 0.1147 | 0.9671 | 1.1544 | 0.7183 | 8.678802e-08 | 1683 |
| 0.1067 | 0.9741 | 1.1551 | 0.7183 | 8.6773404e-08 | 1684 |
| 0.1204 | 0.9671 | 1.1575 | 0.7183 | 8.675879e-08 | 1685 |
| 0.1113 | 0.9694 | 1.1581 | 0.7183 | 8.6744166e-08 | 1686 |
| 0.1184 | 0.9671 | 1.1563 | 0.7183 | 8.6729536e-08 | 1687 |
| 0.1134 | 0.9718 | 1.1573 | 0.7183 | 8.67149e-08 | 1688 |
| 0.1157 | 0.9765 | 1.1575 | 0.7183 | 8.6700254e-08 | 1689 |
| 0.1277 | 0.9600 | 1.1586 | 0.7183 | 8.66856e-08 | 1690 |
| 0.1144 | 0.9741 | 1.1589 | 0.7183 | 8.6670944e-08 | 1691 |
| 0.1180 | 0.9718 | 1.1618 | 0.7183 | 8.665628e-08 | 1692 |
| 0.1184 | 0.9671 | 1.1631 | 0.7183 | 8.664161e-08 | 1693 |
| 0.1012 | 0.9718 | 1.1629 | 0.7183 | 8.662694e-08 | 1694 |
| 0.1065 | 0.9694 | 1.1624 | 0.7183 | 8.661226e-08 | 1695 |
| 0.0955 | 0.9812 | 1.1622 | 0.7183 | 8.6597574e-08 | 1696 |
| 0.1075 | 0.9718 | 1.1630 | 0.7183 | 8.658288e-08 | 1697 |
| 0.1079 | 0.9765 | 1.1652 | 0.7183 | 8.656818e-08 | 1698 |
| 0.1002 | 0.9788 | 1.1654 | 0.7183 | 8.655347e-08 | 1699 |
| 0.1092 | 0.9718 | 1.1663 | 0.7183 | 8.653876e-08 | 1700 |
| 0.1168 | 0.9624 | 1.1648 | 0.7183 | 8.652405e-08 | 1701 |
| 0.0993 | 0.9765 | 1.1609 | 0.7183 | 8.6509324e-08 | 1702 |
| 0.1193 | 0.9647 | 1.1626 | 0.7254 | 8.6494595e-08 | 1703 |
| 0.1105 | 0.9718 | 1.1644 | 0.7254 | 8.647986e-08 | 1704 |
| 0.1191 | 0.9671 | 1.1664 | 0.7183 | 8.6465114e-08 | 1705 |
| 0.1205 | 0.9671 | 1.1678 | 0.7183 | 8.645036e-08 | 1706 |
| 0.1081 | 0.9718 | 1.1692 | 0.7113 | 8.6435605e-08 | 1707 |
| 0.1091 | 0.9718 | 1.1682 | 0.7183 | 8.642085e-08 | 1708 |
| 0.0995 | 0.9906 | 1.1648 | 0.7254 | 8.640608e-08 | 1709 |
| 0.1073 | 0.9788 | 1.1651 | 0.7254 | 8.639131e-08 | 1710 |
| 0.1133 | 0.9741 | 1.1668 | 0.7183 | 8.637653e-08 | 1711 |
| 0.1127 | 0.9671 | 1.1681 | 0.7183 | 8.6361744e-08 | 1712 |
| 0.1104 | 0.9718 | 1.1657 | 0.7254 | 8.634695e-08 | 1713 |
| 0.1188 | 0.9694 | 1.1656 | 0.7254 | 8.633215e-08 | 1714 |
| 0.1248 | 0.9624 | 1.1665 | 0.7254 | 8.631735e-08 | 1715 |
| 0.1108 | 0.9647 | 1.1716 | 0.7254 | 8.630254e-08 | 1716 |
| 0.1136 | 0.9718 | 1.1730 | 0.7254 | 8.628773e-08 | 1717 |
| 0.1114 | 0.9741 | 1.1722 | 0.7254 | 8.6272905e-08 | 1718 |
| 0.1103 | 0.9694 | 1.1723 | 0.7254 | 8.6258076e-08 | 1719 |
| 0.1132 | 0.9718 | 1.1724 | 0.7254 | 8.624324e-08 | 1720 |
| 0.1183 | 0.9694 | 1.1750 | 0.7254 | 8.62284e-08 | 1721 |
| 0.1138 | 0.9718 | 1.1744 | 0.7254 | 8.6213554e-08 | 1722 |
| 0.1091 | 0.9788 | 1.1716 | 0.7254 | 8.61987e-08 | 1723 |
| 0.1051 | 0.9765 | 1.1718 | 0.7254 | 8.6183846e-08 | 1724 |
| 0.1128 | 0.9671 | 1.1709 | 0.7183 | 8.616898e-08 | 1725 |
| 0.1221 | 0.9624 | 1.1717 | 0.7183 | 8.615411e-08 | 1726 |
| 0.0965 | 0.9812 | 1.1758 | 0.7254 | 8.613923e-08 | 1727 |
| 0.1055 | 0.9788 | 1.1758 | 0.7183 | 8.612435e-08 | 1728 |
| 0.1183 | 0.9671 | 1.1750 | 0.7183 | 8.6109466e-08 | 1729 |
| 0.0998 | 0.9741 | 1.1719 | 0.7254 | 8.609457e-08 | 1730 |
| 0.1215 | 0.9624 | 1.1728 | 0.7183 | 8.607967e-08 | 1731 |
| 0.1011 | 0.9741 | 1.1742 | 0.7254 | 8.6064766e-08 | 1732 |
| 0.1023 | 0.9741 | 1.1732 | 0.7183 | 8.604985e-08 | 1733 |
| 0.1019 | 0.9718 | 1.1748 | 0.7183 | 8.603493e-08 | 1734 |
| 0.0984 | 0.9859 | 1.1740 | 0.7183 | 8.602001e-08 | 1735 |
| 0.1067 | 0.9718 | 1.1731 | 0.7254 | 8.600508e-08 | 1736 |
| 0.1113 | 0.9671 | 1.1741 | 0.7254 | 8.5990145e-08 | 1737 |
| 0.0981 | 0.9812 | 1.1755 | 0.7183 | 8.59752e-08 | 1738 |
| 0.1106 | 0.9694 | 1.1766 | 0.7183 | 8.596025e-08 | 1739 |
| 0.1000 | 0.9859 | 1.1774 | 0.7183 | 8.5945295e-08 | 1740 |
| 0.1190 | 0.9671 | 1.1794 | 0.7183 | 8.593034e-08 | 1741 |
| 0.1181 | 0.9671 | 1.1783 | 0.7183 | 8.5915374e-08 | 1742 |
| 0.1085 | 0.9812 | 1.1777 | 0.7183 | 8.59004e-08 | 1743 |
| 0.0958 | 0.9812 | 1.1776 | 0.7183 | 8.5885425e-08 | 1744 |
| 0.1121 | 0.9624 | 1.1790 | 0.7183 | 8.587044e-08 | 1745 |
| 0.1087 | 0.9671 | 1.1797 | 0.7254 | 8.585545e-08 | 1746 |
| 0.1130 | 0.9647 | 1.1797 | 0.7254 | 8.584045e-08 | 1747 |
| 0.0981 | 0.9765 | 1.1813 | 0.7254 | 8.582545e-08 | 1748 |
| 0.1090 | 0.9741 | 1.1826 | 0.7254 | 8.581044e-08 | 1749 |
| 0.1047 | 0.9718 | 1.1836 | 0.7183 | 8.579543e-08 | 1750 |
| 0.0960 | 0.9812 | 1.1824 | 0.7183 | 8.578041e-08 | 1751 |
| 0.1100 | 0.9694 | 1.1837 | 0.7183 | 8.576538e-08 | 1752 |
| 0.1124 | 0.9694 | 1.1875 | 0.7113 | 8.5750344e-08 | 1753 |
| 0.0986 | 0.9741 | 1.1892 | 0.7113 | 8.573531e-08 | 1754 |
| 0.0981 | 0.9812 | 1.1873 | 0.7113 | 8.5720266e-08 | 1755 |
| 0.0941 | 0.9835 | 1.1854 | 0.7183 | 8.570522e-08 | 1756 |
| 0.1150 | 0.9671 | 1.1839 | 0.7183 | 8.569016e-08 | 1757 |
| 0.1111 | 0.9671 | 1.1851 | 0.7183 | 8.56751e-08 | 1758 |
| 0.1151 | 0.9647 | 1.1849 | 0.7183 | 8.566003e-08 | 1759 |
| 0.0966 | 0.9718 | 1.1892 | 0.7183 | 8.5644956e-08 | 1760 |
| 0.1063 | 0.9741 | 1.1869 | 0.7183 | 8.562988e-08 | 1761 |
| 0.1054 | 0.9765 | 1.1854 | 0.7183 | 8.561479e-08 | 1762 |
| 0.1007 | 0.9718 | 1.1866 | 0.7183 | 8.55997e-08 | 1763 |
| 0.1112 | 0.9741 | 1.1861 | 0.7183 | 8.55846e-08 | 1764 |
| 0.1025 | 0.9694 | 1.1846 | 0.7254 | 8.5569496e-08 | 1765 |
| 0.1048 | 0.9718 | 1.1858 | 0.7183 | 8.555439e-08 | 1766 |
| 0.0897 | 0.9835 | 1.1882 | 0.7183 | 8.553928e-08 | 1767 |
| 0.1030 | 0.9765 | 1.1886 | 0.7254 | 8.552416e-08 | 1768 |
| 0.0918 | 0.9812 | 1.1914 | 0.7254 | 8.550903e-08 | 1769 |
| 0.1144 | 0.9671 | 1.1914 | 0.7254 | 8.5493895e-08 | 1770 |
| 0.1045 | 0.9741 | 1.1873 | 0.7254 | 8.547875e-08 | 1771 |
| 0.1035 | 0.9812 | 1.1865 | 0.7254 | 8.546361e-08 | 1772 |
| 0.1219 | 0.9694 | 1.1878 | 0.7183 | 8.544846e-08 | 1773 |
| 0.1037 | 0.9718 | 1.1900 | 0.7254 | 8.543331e-08 | 1774 |
| 0.0928 | 0.9788 | 1.1913 | 0.7254 | 8.5418144e-08 | 1775 |
| 0.1003 | 0.9788 | 1.1905 | 0.7183 | 8.5402974e-08 | 1776 |
| 0.1115 | 0.9694 | 1.1938 | 0.7183 | 8.53878e-08 | 1777 |
| 0.1067 | 0.9718 | 1.1975 | 0.7183 | 8.5372626e-08 | 1778 |
| 0.0940 | 0.9788 | 1.1979 | 0.7113 | 8.535744e-08 | 1779 |
| 0.1098 | 0.9694 | 1.1959 | 0.7183 | 8.534225e-08 | 1780 |
| 0.1068 | 0.9671 | 1.1955 | 0.7183 | 8.532705e-08 | 1781 |
| 0.1053 | 0.9671 | 1.1960 | 0.7183 | 8.531185e-08 | 1782 |
| 0.0973 | 0.9788 | 1.1968 | 0.7183 | 8.529664e-08 | 1783 |
| 0.1030 | 0.9741 | 1.1955 | 0.7183 | 8.528143e-08 | 1784 |
| 0.1202 | 0.9553 | 1.1940 | 0.7183 | 8.526621e-08 | 1785 |
| 0.0957 | 0.9788 | 1.1942 | 0.7183 | 8.525098e-08 | 1786 |
| 0.1077 | 0.9694 | 1.1944 | 0.7183 | 8.523575e-08 | 1787 |
| 0.0904 | 0.9835 | 1.1951 | 0.7183 | 8.522051e-08 | 1788 |
| 0.0935 | 0.9835 | 1.1948 | 0.7183 | 8.520527e-08 | 1789 |
| 0.0964 | 0.9812 | 1.1955 | 0.7183 | 8.5190024e-08 | 1790 |
| 0.1150 | 0.9647 | 1.1950 | 0.7183 | 8.517477e-08 | 1791 |
| 0.0885 | 0.9812 | 1.1955 | 0.7183 | 8.5159506e-08 | 1792 |
| 0.1001 | 0.9741 | 1.1946 | 0.7183 | 8.514424e-08 | 1793 |
| 0.0932 | 0.9741 | 1.1954 | 0.7254 | 8.512897e-08 | 1794 |
| 0.1023 | 0.9765 | 1.1982 | 0.7254 | 8.511369e-08 | 1795 |
| 0.1076 | 0.9718 | 1.1984 | 0.7254 | 8.509841e-08 | 1796 |
| 0.1005 | 0.9741 | 1.1996 | 0.7254 | 8.5083116e-08 | 1797 |
| 0.1028 | 0.9788 | 1.1999 | 0.7254 | 8.506782e-08 | 1798 |
| 0.1075 | 0.9647 | 1.1995 | 0.7254 | 8.505252e-08 | 1799 |
| 0.1058 | 0.9718 | 1.2006 | 0.7183 | 8.5037215e-08 | 1800 |
| 0.0910 | 0.9741 | 1.2030 | 0.7254 | 8.50219e-08 | 1801 |
| 0.0918 | 0.9882 | 1.2045 | 0.7183 | 8.500658e-08 | 1802 |
| 0.1041 | 0.9671 | 1.2036 | 0.7254 | 8.499126e-08 | 1803 |
| 0.0912 | 0.9812 | 1.2029 | 0.7254 | 8.497593e-08 | 1804 |
| 0.0925 | 0.9835 | 1.2017 | 0.7183 | 8.49606e-08 | 1805 |
| 0.0930 | 0.9788 | 1.2012 | 0.7183 | 8.4945256e-08 | 1806 |
| 0.1033 | 0.9694 | 1.2011 | 0.7183 | 8.492991e-08 | 1807 |
| 0.0992 | 0.9765 | 1.2032 | 0.7183 | 8.4914554e-08 | 1808 |
| 0.0961 | 0.9765 | 1.2036 | 0.7183 | 8.48992e-08 | 1809 |
| 0.0942 | 0.9788 | 1.2033 | 0.7254 | 8.488384e-08 | 1810 |
| 0.1041 | 0.9671 | 1.2038 | 0.7183 | 8.486847e-08 | 1811 |
| 0.1002 | 0.9718 | 1.2040 | 0.7183 | 8.485309e-08 | 1812 |
| 0.0921 | 0.9835 | 1.2031 | 0.7183 | 8.483771e-08 | 1813 |
| 0.1028 | 0.9812 | 1.2046 | 0.7254 | 8.4822325e-08 | 1814 |
| 0.0939 | 0.9741 | 1.2086 | 0.7254 | 8.4806935e-08 | 1815 |
| 0.0991 | 0.9788 | 1.2083 | 0.7254 | 8.479154e-08 | 1816 |
| 0.0981 | 0.9718 | 1.2079 | 0.7254 | 8.477613e-08 | 1817 |
| 0.0953 | 0.9835 | 1.2078 | 0.7183 | 8.476072e-08 | 1818 |
| 0.0890 | 0.9835 | 1.2085 | 0.7183 | 8.474531e-08 | 1819 |
| 0.0923 | 0.9788 | 1.2094 | 0.7183 | 8.472989e-08 | 1820 |
| 0.0927 | 0.9765 | 1.2110 | 0.7254 | 8.4714465e-08 | 1821 |
| 0.0839 | 0.9835 | 1.2129 | 0.7254 | 8.469903e-08 | 1822 |
| 0.0831 | 0.9906 | 1.2110 | 0.7254 | 8.468359e-08 | 1823 |
| 0.0926 | 0.9788 | 1.2087 | 0.7183 | 8.466815e-08 | 1824 |
| 0.0997 | 0.9765 | 1.2077 | 0.7254 | 8.4652704e-08 | 1825 |
| 0.0971 | 0.9741 | 1.2079 | 0.7254 | 8.463725e-08 | 1826 |
| 0.1017 | 0.9765 | 1.2100 | 0.7183 | 8.462179e-08 | 1827 |
| 0.0886 | 0.9882 | 1.2122 | 0.7254 | 8.460632e-08 | 1828 |
| 0.0899 | 0.9859 | 1.2121 | 0.7183 | 8.459085e-08 | 1829 |
| 0.0827 | 0.9812 | 1.2126 | 0.7183 | 8.4575376e-08 | 1830 |
| 0.0977 | 0.9694 | 1.2131 | 0.7183 | 8.455989e-08 | 1831 |
| 0.0988 | 0.9671 | 1.2135 | 0.7254 | 8.45444e-08 | 1832 |
| 0.0905 | 0.9765 | 1.2140 | 0.7254 | 8.452891e-08 | 1833 |
| 0.0929 | 0.9835 | 1.2167 | 0.7254 | 8.451341e-08 | 1834 |
| 0.0998 | 0.9671 | 1.2179 | 0.7254 | 8.4497906e-08 | 1835 |
| 0.0968 | 0.9812 | 1.2156 | 0.7254 | 8.4482394e-08 | 1836 |
| 0.0953 | 0.9812 | 1.2147 | 0.7254 | 8.4466876e-08 | 1837 |
| 0.0848 | 0.9859 | 1.2145 | 0.7183 | 8.445135e-08 | 1838 |
| 0.1127 | 0.9647 | 1.2152 | 0.7183 | 8.4435825e-08 | 1839 |
| 0.0901 | 0.9765 | 1.2183 | 0.7254 | 8.442029e-08 | 1840 |
| 0.0891 | 0.9812 | 1.2220 | 0.7183 | 8.440475e-08 | 1841 |
| 0.0981 | 0.9812 | 1.2195 | 0.7254 | 8.438921e-08 | 1842 |
| 0.0860 | 0.9835 | 1.2187 | 0.7254 | 8.437366e-08 | 1843 |
| 0.0817 | 0.9953 | 1.2200 | 0.7254 | 8.4358106e-08 | 1844 |
| 0.0979 | 0.9812 | 1.2199 | 0.7254 | 8.4342545e-08 | 1845 |
| 0.0927 | 0.9694 | 1.2205 | 0.7183 | 8.432698e-08 | 1846 |
| 0.0883 | 0.9788 | 1.2203 | 0.7183 | 8.43114e-08 | 1847 |
| 0.0852 | 0.9859 | 1.2216 | 0.7183 | 8.429583e-08 | 1848 |
| 0.1044 | 0.9671 | 1.2238 | 0.7254 | 8.4280245e-08 | 1849 |
| 0.0927 | 0.9741 | 1.2242 | 0.7254 | 8.4264656e-08 | 1850 |
| 0.0919 | 0.9859 | 1.2267 | 0.7183 | 8.424906e-08 | 1851 |
| 0.0812 | 0.9859 | 1.2271 | 0.7254 | 8.4233456e-08 | 1852 |
| 0.0993 | 0.9718 | 1.2266 | 0.7254 | 8.421785e-08 | 1853 |
| 0.0876 | 0.9812 | 1.2244 | 0.7254 | 8.420224e-08 | 1854 |
| 0.0826 | 0.9882 | 1.2230 | 0.7183 | 8.4186624e-08 | 1855 |
| 0.0960 | 0.9671 | 1.2238 | 0.7183 | 8.4171e-08 | 1856 |
| 0.0936 | 0.9718 | 1.2229 | 0.7183 | 8.4155374e-08 | 1857 |
| 0.0957 | 0.9741 | 1.2228 | 0.7183 | 8.413974e-08 | 1858 |
| 0.0848 | 0.9835 | 1.2247 | 0.7254 | 8.41241e-08 | 1859 |
| 0.1037 | 0.9671 | 1.2267 | 0.7183 | 8.410846e-08 | 1860 |
| 0.0859 | 0.9859 | 1.2276 | 0.7183 | 8.409281e-08 | 1861 |
| 0.0933 | 0.9765 | 1.2270 | 0.7254 | 8.407716e-08 | 1862 |
| 0.0779 | 0.9906 | 1.2265 | 0.7183 | 8.40615e-08 | 1863 |
| 0.0819 | 0.9835 | 1.2279 | 0.7183 | 8.404583e-08 | 1864 |
| 0.0806 | 0.9859 | 1.2278 | 0.7183 | 8.4030155e-08 | 1865 |
| 0.1020 | 0.9765 | 1.2291 | 0.7183 | 8.401448e-08 | 1866 |
| 0.0780 | 0.9906 | 1.2308 | 0.7183 | 8.39988e-08 | 1867 |
| 0.0890 | 0.9788 | 1.2303 | 0.7183 | 8.398311e-08 | 1868 |
| 0.0889 | 0.9812 | 1.2288 | 0.7183 | 8.3967414e-08 | 1869 |
| 0.0976 | 0.9812 | 1.2302 | 0.7183 | 8.395172e-08 | 1870 |
| 0.0848 | 0.9788 | 1.2323 | 0.7183 | 8.3936015e-08 | 1871 |
| 0.0785 | 0.9906 | 1.2332 | 0.7183 | 8.3920305e-08 | 1872 |
| 0.0878 | 0.9835 | 1.2305 | 0.7183 | 8.390459e-08 | 1873 |
| 0.0847 | 0.9788 | 1.2298 | 0.7183 | 8.3888864e-08 | 1874 |
| 0.0854 | 0.9835 | 1.2308 | 0.7183 | 8.387314e-08 | 1875 |
| 0.0861 | 0.9835 | 1.2319 | 0.7183 | 8.385741e-08 | 1876 |
| 0.0829 | 0.9788 | 1.2333 | 0.7183 | 8.384167e-08 | 1877 |
| 0.0953 | 0.9741 | 1.2326 | 0.7183 | 8.3825924e-08 | 1878 |
| 0.0973 | 0.9788 | 1.2319 | 0.7183 | 8.381018e-08 | 1879 |
| 0.0877 | 0.9835 | 1.2335 | 0.7183 | 8.3794426e-08 | 1880 |
| 0.0945 | 0.9788 | 1.2325 | 0.7183 | 8.3778666e-08 | 1881 |
| 0.0817 | 0.9812 | 1.2318 | 0.7183 | 8.37629e-08 | 1882 |
| 0.0900 | 0.9741 | 1.2334 | 0.7183 | 8.374713e-08 | 1883 |
| 0.0810 | 0.9835 | 1.2341 | 0.7183 | 8.373136e-08 | 1884 |
| 0.0914 | 0.9788 | 1.2348 | 0.7183 | 8.371558e-08 | 1885 |
| 0.0856 | 0.9788 | 1.2351 | 0.7183 | 8.369979e-08 | 1886 |
| 0.0715 | 0.9906 | 1.2362 | 0.7183 | 8.3684e-08 | 1887 |
| 0.0904 | 0.9835 | 1.2371 | 0.7183 | 8.3668205e-08 | 1888 |
| 0.0836 | 0.9835 | 1.2371 | 0.7183 | 8.36524e-08 | 1889 |
| 0.0984 | 0.9671 | 1.2374 | 0.7183 | 8.363659e-08 | 1890 |
| 0.0823 | 0.9859 | 1.2383 | 0.7183 | 8.362078e-08 | 1891 |
| 0.0843 | 0.9882 | 1.2395 | 0.7183 | 8.360497e-08 | 1892 |
| 0.0832 | 0.9859 | 1.2398 | 0.7183 | 8.358914e-08 | 1893 |
| 0.0929 | 0.9694 | 1.2391 | 0.7183 | 8.357331e-08 | 1894 |
| 0.0854 | 0.9788 | 1.2447 | 0.7183 | 8.355748e-08 | 1895 |
| 0.0819 | 0.9906 | 1.2443 | 0.7254 | 8.354164e-08 | 1896 |
| 0.0875 | 0.9788 | 1.2423 | 0.7183 | 8.35258e-08 | 1897 |
| 0.0835 | 0.9882 | 1.2406 | 0.7183 | 8.3509946e-08 | 1898 |
| 0.0815 | 0.9835 | 1.2399 | 0.7183 | 8.349409e-08 | 1899 |
| 0.0791 | 0.9835 | 1.2404 | 0.7183 | 8.3478234e-08 | 1900 |
| 0.0846 | 0.9765 | 1.2402 | 0.7183 | 8.346237e-08 | 1901 |
| 0.0810 | 0.9882 | 1.2416 | 0.7183 | 8.3446494e-08 | 1902 |
| 0.0846 | 0.9812 | 1.2424 | 0.7183 | 8.343062e-08 | 1903 |
| 0.0887 | 0.9671 | 1.2420 | 0.7183 | 8.341474e-08 | 1904 |
| 0.0898 | 0.9741 | 1.2435 | 0.7183 | 8.339885e-08 | 1905 |
| 0.0778 | 0.9859 | 1.2449 | 0.7254 | 8.338296e-08 | 1906 |
| 0.0772 | 0.9812 | 1.2441 | 0.7183 | 8.336706e-08 | 1907 |
| 0.0885 | 0.9788 | 1.2436 | 0.7183 | 8.335116e-08 | 1908 |
| 0.0807 | 0.9835 | 1.2467 | 0.7183 | 8.333525e-08 | 1909 |
| 0.0850 | 0.9788 | 1.2471 | 0.7113 | 8.3319335e-08 | 1910 |
| 0.0760 | 0.9859 | 1.2456 | 0.7183 | 8.330342e-08 | 1911 |
| 0.0865 | 0.9741 | 1.2483 | 0.7183 | 8.3287496e-08 | 1912 |
| 0.0805 | 0.9835 | 1.2490 | 0.7183 | 8.3271566e-08 | 1913 |
| 0.0904 | 0.9788 | 1.2473 | 0.7183 | 8.325563e-08 | 1914 |
| 0.0812 | 0.9812 | 1.2474 | 0.7183 | 8.323969e-08 | 1915 |
| 0.0674 | 0.9882 | 1.2488 | 0.7254 | 8.3223746e-08 | 1916 |
| 0.0879 | 0.9812 | 1.2514 | 0.7183 | 8.3207794e-08 | 1917 |
| 0.0770 | 0.9788 | 1.2515 | 0.7183 | 8.3191836e-08 | 1918 |
| 0.0675 | 0.9906 | 1.2508 | 0.7254 | 8.317588e-08 | 1919 |
| 0.0881 | 0.9718 | 1.2498 | 0.7183 | 8.315991e-08 | 1920 |
| 0.0787 | 0.9906 | 1.2505 | 0.7183 | 8.314394e-08 | 1921 |
| 0.0801 | 0.9859 | 1.2533 | 0.7183 | 8.312796e-08 | 1922 |
| 0.0973 | 0.9671 | 1.2523 | 0.7183 | 8.311198e-08 | 1923 |
| 0.0864 | 0.9812 | 1.2511 | 0.7183 | 8.309599e-08 | 1924 |
| 0.0942 | 0.9765 | 1.2510 | 0.7183 | 8.3079996e-08 | 1925 |
| 0.0778 | 0.9835 | 1.2508 | 0.7183 | 8.3063995e-08 | 1926 |
| 0.0801 | 0.9835 | 1.2497 | 0.7183 | 8.304799e-08 | 1927 |
| 0.0804 | 0.9812 | 1.2504 | 0.7183 | 8.3031985e-08 | 1928 |
| 0.0725 | 0.9929 | 1.2520 | 0.7183 | 8.301597e-08 | 1929 |
| 0.0746 | 0.9835 | 1.2530 | 0.7183 | 8.2999954e-08 | 1930 |
| 0.0825 | 0.9835 | 1.2525 | 0.7183 | 8.298393e-08 | 1931 |
| 0.0810 | 0.9835 | 1.2516 | 0.7183 | 8.29679e-08 | 1932 |
| 0.0805 | 0.9812 | 1.2540 | 0.7113 | 8.2951864e-08 | 1933 |
| 0.0879 | 0.9788 | 1.2549 | 0.7183 | 8.293583e-08 | 1934 |
| 0.0761 | 0.9859 | 1.2549 | 0.7254 | 8.291978e-08 | 1935 |
| 0.0768 | 0.9835 | 1.2558 | 0.7113 | 8.290373e-08 | 1936 |
| 0.0790 | 0.9812 | 1.2540 | 0.7183 | 8.2887674e-08 | 1937 |
| 0.0741 | 0.9835 | 1.2558 | 0.7254 | 8.2871615e-08 | 1938 |
| 0.0691 | 0.9882 | 1.2576 | 0.7183 | 8.285555e-08 | 1939 |
| 0.0770 | 0.9859 | 1.2559 | 0.7183 | 8.283948e-08 | 1940 |
| 0.0875 | 0.9812 | 1.2546 | 0.7254 | 8.2823405e-08 | 1941 |
| 0.0768 | 0.9859 | 1.2556 | 0.7183 | 8.2807325e-08 | 1942 |
| 0.0727 | 0.9812 | 1.2570 | 0.7183 | 8.279124e-08 | 1943 |
| 0.0671 | 0.9929 | 1.2603 | 0.7113 | 8.2775145e-08 | 1944 |
| 0.0686 | 0.9812 | 1.2653 | 0.7113 | 8.275905e-08 | 1945 |
| 0.0852 | 0.9835 | 1.2625 | 0.7113 | 8.274295e-08 | 1946 |
| 0.0645 | 0.9906 | 1.2605 | 0.7113 | 8.272684e-08 | 1947 |
| 0.0769 | 0.9882 | 1.2588 | 0.7183 | 8.271073e-08 | 1948 |
| 0.0807 | 0.9812 | 1.2587 | 0.7183 | 8.269461e-08 | 1949 |
| 0.0788 | 0.9835 | 1.2594 | 0.7183 | 8.267849e-08 | 1950 |
| 0.0785 | 0.9741 | 1.2601 | 0.7183 | 8.266236e-08 | 1951 |
| 0.0764 | 0.9765 | 1.2581 | 0.7183 | 8.264623e-08 | 1952 |
| 0.0792 | 0.9859 | 1.2593 | 0.7183 | 8.2630095e-08 | 1953 |
| 0.0792 | 0.9859 | 1.2619 | 0.7113 | 8.261395e-08 | 1954 |
| 0.0757 | 0.9882 | 1.2619 | 0.7113 | 8.25978e-08 | 1955 |
| 0.0787 | 0.9835 | 1.2616 | 0.7113 | 8.258165e-08 | 1956 |
| 0.0961 | 0.9671 | 1.2641 | 0.7113 | 8.256549e-08 | 1957 |
| 0.0743 | 0.9859 | 1.2646 | 0.7113 | 8.254933e-08 | 1958 |
| 0.0814 | 0.9835 | 1.2670 | 0.7113 | 8.253316e-08 | 1959 |
| 0.0819 | 0.9788 | 1.2684 | 0.7113 | 8.251699e-08 | 1960 |
| 0.0925 | 0.9741 | 1.2658 | 0.7042 | 8.250081e-08 | 1961 |
| 0.0850 | 0.9812 | 1.2643 | 0.7113 | 8.2484625e-08 | 1962 |
| 0.0805 | 0.9835 | 1.2657 | 0.7113 | 8.246844e-08 | 1963 |
| 0.0613 | 0.9906 | 1.2652 | 0.7113 | 8.2452246e-08 | 1964 |
| 0.0787 | 0.9882 | 1.2666 | 0.7113 | 8.2436046e-08 | 1965 |
| 0.0803 | 0.9835 | 1.2675 | 0.7113 | 8.2419845e-08 | 1966 |
| 0.0806 | 0.9859 | 1.2683 | 0.7042 | 8.240364e-08 | 1967 |
| 0.0795 | 0.9882 | 1.2685 | 0.7113 | 8.238742e-08 | 1968 |
| 0.0652 | 0.9906 | 1.2693 | 0.7113 | 8.23712e-08 | 1969 |
| 0.0670 | 0.9906 | 1.2710 | 0.7042 | 8.235498e-08 | 1970 |
| 0.0769 | 0.9859 | 1.2701 | 0.7042 | 8.233875e-08 | 1971 |
| 0.0608 | 0.9929 | 1.2701 | 0.7113 | 8.2322515e-08 | 1972 |
| 0.0761 | 0.9859 | 1.2703 | 0.7113 | 8.230628e-08 | 1973 |
| 0.0731 | 0.9882 | 1.2690 | 0.7254 | 8.2290036e-08 | 1974 |
| 0.0838 | 0.9765 | 1.2682 | 0.7254 | 8.2273786e-08 | 1975 |
| 0.0782 | 0.9812 | 1.2705 | 0.7113 | 8.225753e-08 | 1976 |
| 0.0816 | 0.9859 | 1.2728 | 0.7113 | 8.224127e-08 | 1977 |
| 0.0890 | 0.9741 | 1.2715 | 0.7113 | 8.222501e-08 | 1978 |
| 0.0768 | 0.9882 | 1.2706 | 0.7183 | 8.2208736e-08 | 1979 |
| 0.0807 | 0.9835 | 1.2697 | 0.7183 | 8.2192464e-08 | 1980 |
| 0.0710 | 0.9859 | 1.2710 | 0.7183 | 8.2176186e-08 | 1981 |
| 0.0676 | 0.9859 | 1.2704 | 0.7183 | 8.21599e-08 | 1982 |
| 0.0772 | 0.9812 | 1.2725 | 0.7183 | 8.2143615e-08 | 1983 |
| 0.0657 | 0.9859 | 1.2722 | 0.7183 | 8.212732e-08 | 1984 |
| 0.0799 | 0.9835 | 1.2713 | 0.7183 | 8.211102e-08 | 1985 |
| 0.0771 | 0.9765 | 1.2729 | 0.7183 | 8.2094715e-08 | 1986 |
| 0.0823 | 0.9788 | 1.2759 | 0.7113 | 8.207841e-08 | 1987 |
| 0.0583 | 0.9953 | 1.2759 | 0.7113 | 8.2062094e-08 | 1988 |
| 0.0907 | 0.9741 | 1.2761 | 0.7113 | 8.204577e-08 | 1989 |
| 0.0768 | 0.9859 | 1.2784 | 0.7042 | 8.202945e-08 | 1990 |
| 0.0784 | 0.9835 | 1.2766 | 0.7113 | 8.201312e-08 | 1991 |
| 0.0698 | 0.9906 | 1.2775 | 0.7042 | 8.199679e-08 | 1992 |
| 0.0667 | 0.9929 | 1.2795 | 0.7113 | 8.198045e-08 | 1993 |
| 0.0776 | 0.9812 | 1.2771 | 0.7183 | 8.196411e-08 | 1994 |
| 0.0679 | 0.9882 | 1.2786 | 0.7183 | 8.194776e-08 | 1995 |
| 0.0876 | 0.9812 | 1.2775 | 0.7183 | 8.1931404e-08 | 1996 |
| 0.0700 | 0.9929 | 1.2792 | 0.7042 | 8.191505e-08 | 1997 |
| 0.0844 | 0.9882 | 1.2782 | 0.7183 | 8.189868e-08 | 1998 |
| 0.0633 | 0.9929 | 1.2764 | 0.7183 | 8.188231e-08 | 1999 |
| 0.0684 | 0.9859 | 1.2758 | 0.7183 | 8.186594e-08 | 2000 |
| 0.0805 | 0.9788 | 1.2777 | 0.7183 | 8.1849564e-08 | 2001 |
| 0.0798 | 0.9812 | 1.2814 | 0.7113 | 8.183318e-08 | 2002 |
| 0.0764 | 0.9882 | 1.2825 | 0.7113 | 8.181679e-08 | 2003 |
| 0.0751 | 0.9788 | 1.2831 | 0.7042 | 8.18004e-08 | 2004 |
| 0.0769 | 0.9812 | 1.2842 | 0.7113 | 8.1784e-08 | 2005 |
| 0.0677 | 0.9859 | 1.2839 | 0.7113 | 8.17676e-08 | 2006 |
| 0.0704 | 0.9859 | 1.2794 | 0.7183 | 8.1751196e-08 | 2007 |
| 0.0780 | 0.9812 | 1.2786 | 0.7183 | 8.173478e-08 | 2008 |
| 0.0730 | 0.9812 | 1.2796 | 0.7183 | 8.171836e-08 | 2009 |
| 0.0773 | 0.9859 | 1.2811 | 0.7183 | 8.170194e-08 | 2010 |
| 0.0649 | 0.9882 | 1.2815 | 0.7183 | 8.168551e-08 | 2011 |
| 0.0808 | 0.9765 | 1.2819 | 0.7183 | 8.166908e-08 | 2012 |
| 0.0789 | 0.9788 | 1.2814 | 0.7183 | 8.1652644e-08 | 2013 |
| 0.0715 | 0.9906 | 1.2819 | 0.7183 | 8.16362e-08 | 2014 |
| 0.0733 | 0.9835 | 1.2792 | 0.7183 | 8.161975e-08 | 2015 |
| 0.0769 | 0.9859 | 1.2813 | 0.7183 | 8.1603304e-08 | 2016 |
| 0.0681 | 0.9953 | 1.2835 | 0.7183 | 8.158685e-08 | 2017 |
| 0.0734 | 0.9788 | 1.2861 | 0.7113 | 8.1570384e-08 | 2018 |
| 0.0707 | 0.9859 | 1.2861 | 0.7183 | 8.155392e-08 | 2019 |
| 0.0554 | 0.9953 | 1.2854 | 0.7183 | 8.153745e-08 | 2020 |
| 0.0736 | 0.9859 | 1.2844 | 0.7183 | 8.152097e-08 | 2021 |
| 0.0737 | 0.9882 | 1.2856 | 0.7113 | 8.1504496e-08 | 2022 |
| 0.0881 | 0.9788 | 1.2847 | 0.7183 | 8.148801e-08 | 2023 |
| 0.0658 | 0.9882 | 1.2827 | 0.7254 | 8.147152e-08 | 2024 |
| 0.0681 | 0.9882 | 1.2837 | 0.7183 | 8.145503e-08 | 2025 |
| 0.0870 | 0.9647 | 1.2882 | 0.7042 | 8.143853e-08 | 2026 |
| 0.0755 | 0.9906 | 1.2898 | 0.7113 | 8.142202e-08 | 2027 |
| 0.0725 | 0.9835 | 1.2910 | 0.7113 | 8.140552e-08 | 2028 |
| 0.0681 | 0.9882 | 1.2878 | 0.7113 | 8.1389004e-08 | 2029 |
| 0.0624 | 0.9953 | 1.2879 | 0.7113 | 8.1372484e-08 | 2030 |
| 0.0680 | 0.9812 | 1.2883 | 0.7113 | 8.1355964e-08 | 2031 |
| 0.0769 | 0.9812 | 1.2898 | 0.7113 | 8.133944e-08 | 2032 |
| 0.0693 | 0.9859 | 1.2886 | 0.7113 | 8.13229e-08 | 2033 |
| 0.0643 | 0.9929 | 1.2885 | 0.7113 | 8.130637e-08 | 2034 |
| 0.0774 | 0.9812 | 1.2874 | 0.7183 | 8.1289826e-08 | 2035 |
| 0.0694 | 0.9882 | 1.2884 | 0.7183 | 8.127328e-08 | 2036 |
| 0.0764 | 0.9835 | 1.2885 | 0.7254 | 8.125673e-08 | 2037 |
| 0.0589 | 0.9906 | 1.2907 | 0.7113 | 8.1240174e-08 | 2038 |
| 0.0656 | 0.9859 | 1.2915 | 0.7113 | 8.122361e-08 | 2039 |
| 0.0698 | 0.9882 | 1.2918 | 0.7113 | 8.120705e-08 | 2040 |
| 0.0750 | 0.9788 | 1.2938 | 0.7113 | 8.119048e-08 | 2041 |
| 0.0747 | 0.9835 | 1.2937 | 0.7113 | 8.11739e-08 | 2042 |
| 0.0698 | 0.9906 | 1.2928 | 0.7113 | 8.1157324e-08 | 2043 |
| 0.0725 | 0.9812 | 1.2921 | 0.7113 | 8.114074e-08 | 2044 |
| 0.0624 | 0.9929 | 1.2934 | 0.7042 | 8.112415e-08 | 2045 |
| 0.0746 | 0.9859 | 1.2946 | 0.7042 | 8.110756e-08 | 2046 |
| 0.0788 | 0.9835 | 1.2967 | 0.7042 | 8.109096e-08 | 2047 |
| 0.0611 | 0.9859 | 1.2972 | 0.7042 | 8.1074354e-08 | 2048 |
| 0.0642 | 0.9812 | 1.2972 | 0.7042 | 8.105775e-08 | 2049 |
| 0.0681 | 0.9765 | 1.2955 | 0.7183 | 8.104114e-08 | 2050 |
| 0.0692 | 0.9882 | 1.2943 | 0.7183 | 8.102452e-08 | 2051 |
| 0.0643 | 0.9882 | 1.2965 | 0.7113 | 8.10079e-08 | 2052 |
| 0.0754 | 0.9812 | 1.2960 | 0.7113 | 8.099127e-08 | 2053 |
| 0.0682 | 0.9882 | 1.2980 | 0.7113 | 8.097464e-08 | 2054 |
| 0.0663 | 0.9882 | 1.2971 | 0.7183 | 8.0958e-08 | 2055 |
| 0.0572 | 0.9906 | 1.2984 | 0.7113 | 8.094136e-08 | 2056 |
| 0.0672 | 0.9906 | 1.2991 | 0.7113 | 8.0924714e-08 | 2057 |
| 0.0625 | 0.9859 | 1.2997 | 0.7183 | 8.0908066e-08 | 2058 |
| 0.0870 | 0.9741 | 1.3026 | 0.7042 | 8.089141e-08 | 2059 |
| 0.0721 | 0.9835 | 1.3025 | 0.7042 | 8.087475e-08 | 2060 |
| 0.0618 | 0.9906 | 1.3037 | 0.7042 | 8.085809e-08 | 2061 |
| 0.0636 | 0.9929 | 1.3033 | 0.7042 | 8.084142e-08 | 2062 |
| 0.0699 | 0.9859 | 1.3026 | 0.7042 | 8.082474e-08 | 2063 |
| 0.0624 | 0.9906 | 1.3002 | 0.7183 | 8.0808064e-08 | 2064 |
| 0.0711 | 0.9812 | 1.2998 | 0.7183 | 8.079138e-08 | 2065 |
| 0.0677 | 0.9859 | 1.3018 | 0.7042 | 8.077469e-08 | 2066 |
| 0.0697 | 0.9882 | 1.3029 | 0.7042 | 8.0758e-08 | 2067 |
| 0.0633 | 0.9882 | 1.3034 | 0.7042 | 8.07413e-08 | 2068 |
| 0.0754 | 0.9835 | 1.3045 | 0.7042 | 8.07246e-08 | 2069 |
| 0.0662 | 0.9882 | 1.3070 | 0.7042 | 8.070789e-08 | 2070 |
| 0.0679 | 0.9788 | 1.3067 | 0.7042 | 8.069118e-08 | 2071 |
| 0.0577 | 0.9976 | 1.3043 | 0.7042 | 8.067446e-08 | 2072 |
| 0.0568 | 0.9906 | 1.3047 | 0.7042 | 8.065774e-08 | 2073 |
| 0.0652 | 0.9882 | 1.3017 | 0.7183 | 8.0641016e-08 | 2074 |
| 0.0726 | 0.9812 | 1.3021 | 0.7183 | 8.062429e-08 | 2075 |
| 0.0643 | 0.9882 | 1.3056 | 0.7183 | 8.0607556e-08 | 2076 |
| 0.0670 | 0.9906 | 1.3073 | 0.7113 | 8.0590816e-08 | 2077 |
| 0.0646 | 0.9882 | 1.3067 | 0.7183 | 8.0574075e-08 | 2078 |
| 0.0639 | 0.9859 | 1.3094 | 0.7113 | 8.055733e-08 | 2079 |
| 0.0625 | 0.9882 | 1.3094 | 0.7113 | 8.054057e-08 | 2080 |
| 0.0595 | 0.9859 | 1.3091 | 0.7113 | 8.052382e-08 | 2081 |
| 0.0671 | 0.9812 | 1.3097 | 0.7113 | 8.050706e-08 | 2082 |
| 0.0712 | 0.9835 | 1.3100 | 0.7113 | 8.049029e-08 | 2083 |
| 0.0724 | 0.9882 | 1.3090 | 0.7113 | 8.047352e-08 | 2084 |
| 0.0790 | 0.9718 | 1.3077 | 0.7042 | 8.045674e-08 | 2085 |
| 0.0605 | 0.9953 | 1.3084 | 0.7042 | 8.043996e-08 | 2086 |
| 0.0706 | 0.9882 | 1.3118 | 0.7113 | 8.042318e-08 | 2087 |
| 0.0582 | 0.9906 | 1.3094 | 0.7042 | 8.040639e-08 | 2088 |
| 0.0719 | 0.9859 | 1.3097 | 0.7113 | 8.03896e-08 | 2089 |
| 0.0569 | 1.0 | 1.3099 | 0.7113 | 8.03728e-08 | 2090 |
| 0.0649 | 0.9859 | 1.3102 | 0.7113 | 8.0355996e-08 | 2091 |
| 0.0643 | 0.9859 | 1.3094 | 0.7183 | 8.033919e-08 | 2092 |
| 0.0588 | 0.9882 | 1.3114 | 0.7113 | 8.032238e-08 | 2093 |
| 0.0601 | 0.9906 | 1.3115 | 0.7183 | 8.030556e-08 | 2094 |
| 0.0656 | 0.9859 | 1.3112 | 0.7183 | 8.028874e-08 | 2095 |
| 0.0703 | 0.9882 | 1.3108 | 0.7113 | 8.027192e-08 | 2096 |
| 0.0527 | 0.9929 | 1.3096 | 0.7183 | 8.0255084e-08 | 2097 |
| 0.0795 | 0.9812 | 1.3113 | 0.7113 | 8.023825e-08 | 2098 |
| 0.0713 | 0.9859 | 1.3125 | 0.7113 | 8.022141e-08 | 2099 |
| 0.0682 | 0.9859 | 1.3134 | 0.7183 | 8.020457e-08 | 2100 |
| 0.0623 | 0.9882 | 1.3136 | 0.7113 | 8.0187725e-08 | 2101 |
| 0.0596 | 0.9906 | 1.3140 | 0.7183 | 8.017087e-08 | 2102 |
| 0.0650 | 0.9859 | 1.3144 | 0.7183 | 8.015402e-08 | 2103 |
| 0.0691 | 0.9882 | 1.3157 | 0.7113 | 8.0137156e-08 | 2104 |
| 0.0619 | 0.9906 | 1.3159 | 0.7113 | 8.012029e-08 | 2105 |
| 0.0561 | 0.9953 | 1.3164 | 0.7113 | 8.010342e-08 | 2106 |
| 0.0566 | 0.9929 | 1.3170 | 0.7113 | 8.0086544e-08 | 2107 |
| 0.0585 | 0.9953 | 1.3171 | 0.7113 | 8.006967e-08 | 2108 |
| 0.0632 | 0.9906 | 1.3188 | 0.7113 | 8.0052786e-08 | 2109 |
| 0.0615 | 0.9859 | 1.3182 | 0.7113 | 8.0035896e-08 | 2110 |
| 0.0640 | 0.9859 | 1.3187 | 0.7042 | 8.001901e-08 | 2111 |
| 0.0715 | 0.9859 | 1.3183 | 0.7042 | 8.000211e-08 | 2112 |
| 0.0628 | 0.9882 | 1.3194 | 0.7113 | 7.9985206e-08 | 2113 |
| 0.0549 | 0.9953 | 1.3206 | 0.7042 | 7.99683e-08 | 2114 |
| 0.0640 | 0.9906 | 1.3187 | 0.7113 | 7.995139e-08 | 2115 |
| 0.0592 | 0.9906 | 1.3203 | 0.7042 | 7.993448e-08 | 2116 |
| 0.0750 | 0.9788 | 1.3215 | 0.7042 | 7.991756e-08 | 2117 |
| 0.0636 | 0.9882 | 1.3202 | 0.7042 | 7.990064e-08 | 2118 |
| 0.0608 | 0.9882 | 1.3218 | 0.7113 | 7.988371e-08 | 2119 |
| 0.0583 | 0.9929 | 1.3231 | 0.7113 | 7.986678e-08 | 2120 |
| 0.0693 | 0.9835 | 1.3221 | 0.7113 | 7.984984e-08 | 2121 |
| 0.0671 | 0.9906 | 1.3234 | 0.7113 | 7.98329e-08 | 2122 |
| 0.0618 | 0.9906 | 1.3280 | 0.7113 | 7.9815955e-08 | 2123 |
| 0.0594 | 0.9929 | 1.3257 | 0.7042 | 7.979901e-08 | 2124 |
| 0.0596 | 0.9929 | 1.3248 | 0.7042 | 7.9782055e-08 | 2125 |
| 0.0587 | 0.9882 | 1.3236 | 0.7113 | 7.9765094e-08 | 2126 |
| 0.0664 | 0.9788 | 1.3235 | 0.7113 | 7.974813e-08 | 2127 |
| 0.0581 | 0.9906 | 1.3232 | 0.7042 | 7.9731166e-08 | 2128 |
| 0.0577 | 0.9929 | 1.3241 | 0.7042 | 7.97142e-08 | 2129 |
| 0.0694 | 0.9882 | 1.3255 | 0.7113 | 7.969722e-08 | 2130 |
| 0.0514 | 0.9929 | 1.3261 | 0.7042 | 7.968024e-08 | 2131 |
| 0.0710 | 0.9812 | 1.3289 | 0.7113 | 7.966326e-08 | 2132 |
| 0.0647 | 0.9882 | 1.3307 | 0.7113 | 7.964627e-08 | 2133 |
| 0.0602 | 0.9882 | 1.3305 | 0.7113 | 7.9629274e-08 | 2134 |
| 0.0686 | 0.9859 | 1.3281 | 0.7042 | 7.961228e-08 | 2135 |
| 0.0629 | 0.9835 | 1.3262 | 0.7042 | 7.9595274e-08 | 2136 |
| 0.0672 | 0.9859 | 1.3295 | 0.7042 | 7.957827e-08 | 2137 |
| 0.0675 | 0.9859 | 1.3329 | 0.7113 | 7.956126e-08 | 2138 |
| 0.0629 | 0.9859 | 1.3337 | 0.7113 | 7.954424e-08 | 2139 |
| 0.0546 | 0.9929 | 1.3347 | 0.7113 | 7.9527226e-08 | 2140 |
| 0.0556 | 0.9953 | 1.3341 | 0.7042 | 7.95102e-08 | 2141 |
| 0.0591 | 0.9906 | 1.3350 | 0.7113 | 7.949318e-08 | 2142 |
| 0.0517 | 0.9882 | 1.3349 | 0.7113 | 7.9476145e-08 | 2143 |
| 0.0573 | 0.9929 | 1.3339 | 0.7042 | 7.9459106e-08 | 2144 |
| 0.0563 | 0.9953 | 1.3348 | 0.7042 | 7.944207e-08 | 2145 |
| 0.0553 | 0.9929 | 1.3339 | 0.7042 | 7.942502e-08 | 2146 |
| 0.0676 | 0.9812 | 1.3345 | 0.7042 | 7.9407975e-08 | 2147 |
| 0.0609 | 0.9835 | 1.3360 | 0.7042 | 7.939092e-08 | 2148 |
| 0.0688 | 0.9812 | 1.3366 | 0.7042 | 7.937386e-08 | 2149 |
| 0.0672 | 0.9835 | 1.3385 | 0.7042 | 7.93568e-08 | 2150 |
| 0.0607 | 0.9882 | 1.3368 | 0.7113 | 7.9339735e-08 | 2151 |
| 0.0538 | 0.9953 | 1.3372 | 0.7113 | 7.932267e-08 | 2152 |
| 0.0641 | 0.9882 | 1.3347 | 0.7042 | 7.930559e-08 | 2153 |
| 0.0638 | 0.9835 | 1.3338 | 0.7183 | 7.928851e-08 | 2154 |
| 0.0579 | 0.9906 | 1.3341 | 0.7183 | 7.927143e-08 | 2155 |
| 0.0595 | 0.9882 | 1.3339 | 0.7183 | 7.925434e-08 | 2156 |
| 0.0714 | 0.9812 | 1.3342 | 0.7183 | 7.923725e-08 | 2157 |
| 0.0512 | 0.9929 | 1.3373 | 0.7113 | 7.922016e-08 | 2158 |
| 0.0562 | 0.9906 | 1.3392 | 0.7113 | 7.9203055e-08 | 2159 |
| 0.0662 | 0.9906 | 1.3368 | 0.7113 | 7.918595e-08 | 2160 |
| 0.0462 | 0.9976 | 1.3371 | 0.7113 | 7.916884e-08 | 2161 |
| 0.0641 | 0.9812 | 1.3370 | 0.7042 | 7.915173e-08 | 2162 |
| 0.0705 | 0.9906 | 1.3381 | 0.7042 | 7.9134615e-08 | 2163 |
| 0.0548 | 0.9929 | 1.3397 | 0.7042 | 7.911749e-08 | 2164 |
| 0.0559 | 0.9835 | 1.3404 | 0.7113 | 7.910037e-08 | 2165 |
| 0.0635 | 0.9835 | 1.3411 | 0.7113 | 7.9083236e-08 | 2166 |
| 0.0510 | 0.9906 | 1.3402 | 0.7113 | 7.9066105e-08 | 2167 |
| 0.0629 | 0.9835 | 1.3397 | 0.7113 | 7.904897e-08 | 2168 |
| 0.0580 | 0.9929 | 1.3420 | 0.7113 | 7.903182e-08 | 2169 |
| 0.0529 | 0.9929 | 1.3432 | 0.7042 | 7.9014676e-08 | 2170 |
| 0.0585 | 0.9906 | 1.3456 | 0.7113 | 7.899752e-08 | 2171 |
| 0.0650 | 0.9835 | 1.3463 | 0.7113 | 7.898037e-08 | 2172 |
| 0.0547 | 0.9906 | 1.3444 | 0.7042 | 7.896321e-08 | 2173 |
| 0.0546 | 0.9906 | 1.3416 | 0.7042 | 7.894605e-08 | 2174 |
| 0.0577 | 0.9929 | 1.3406 | 0.7183 | 7.8928885e-08 | 2175 |
| 0.0550 | 0.9906 | 1.3422 | 0.7113 | 7.891171e-08 | 2176 |
| 0.0559 | 0.9953 | 1.3447 | 0.7042 | 7.889454e-08 | 2177 |
| 0.0670 | 0.9835 | 1.3443 | 0.7042 | 7.8877356e-08 | 2178 |
| 0.0601 | 0.9906 | 1.3424 | 0.7113 | 7.8860175e-08 | 2179 |
| 0.0573 | 0.9835 | 1.3436 | 0.7042 | 7.884299e-08 | 2180 |
| 0.0521 | 0.9906 | 1.3461 | 0.7042 | 7.882579e-08 | 2181 |
| 0.0600 | 0.9835 | 1.3468 | 0.7042 | 7.88086e-08 | 2182 |
| 0.0748 | 0.9788 | 1.3462 | 0.7042 | 7.8791395e-08 | 2183 |
| 0.0523 | 0.9976 | 1.3450 | 0.7113 | 7.877419e-08 | 2184 |
| 0.0522 | 0.9882 | 1.3444 | 0.7042 | 7.875698e-08 | 2185 |
| 0.0578 | 0.9882 | 1.3476 | 0.7042 | 7.8739774e-08 | 2186 |
| 0.0579 | 0.9953 | 1.3475 | 0.7042 | 7.872256e-08 | 2187 |
| 0.0511 | 0.9929 | 1.3468 | 0.7042 | 7.8705334e-08 | 2188 |
| 0.0578 | 0.9953 | 1.3475 | 0.7113 | 7.868811e-08 | 2189 |
| 0.0639 | 0.9859 | 1.3472 | 0.7113 | 7.867088e-08 | 2190 |
| 0.0540 | 0.9882 | 1.3463 | 0.7042 | 7.865365e-08 | 2191 |
| 0.0509 | 0.9882 | 1.3478 | 0.7042 | 7.863641e-08 | 2192 |
| 0.0534 | 0.9906 | 1.3484 | 0.7113 | 7.861917e-08 | 2193 |
| 0.0694 | 0.9835 | 1.3481 | 0.7113 | 7.860193e-08 | 2194 |
| 0.0606 | 0.9882 | 1.3523 | 0.7113 | 7.858468e-08 | 2195 |
| 0.0502 | 0.9953 | 1.3529 | 0.7113 | 7.8567425e-08 | 2196 |
| 0.0549 | 0.9835 | 1.3533 | 0.7113 | 7.8550165e-08 | 2197 |
| 0.0476 | 0.9953 | 1.3537 | 0.7113 | 7.8532906e-08 | 2198 |
| 0.0604 | 0.9882 | 1.3544 | 0.7113 | 7.851564e-08 | 2199 |
| 0.0593 | 0.9882 | 1.3533 | 0.7042 | 7.8498374e-08 | 2200 |
| 0.0522 | 0.9953 | 1.3541 | 0.7042 | 7.84811e-08 | 2201 |
| 0.0559 | 0.9882 | 1.3519 | 0.7042 | 7.846382e-08 | 2202 |
| 0.0570 | 0.9906 | 1.3507 | 0.7042 | 7.844654e-08 | 2203 |
| 0.0473 | 1.0 | 1.3498 | 0.7042 | 7.842925e-08 | 2204 |
| 0.0541 | 0.9929 | 1.3494 | 0.7042 | 7.8411965e-08 | 2205 |
| 0.0543 | 0.9953 | 1.3493 | 0.6972 | 7.839467e-08 | 2206 |
| 0.0603 | 0.9882 | 1.3477 | 0.7042 | 7.8377376e-08 | 2207 |
| 0.0464 | 0.9929 | 1.3478 | 0.7113 | 7.8360074e-08 | 2208 |
| 0.0518 | 0.9859 | 1.3502 | 0.7113 | 7.8342765e-08 | 2209 |
| 0.0526 | 0.9882 | 1.3520 | 0.7113 | 7.8325456e-08 | 2210 |
| 0.0518 | 0.9906 | 1.3545 | 0.7042 | 7.830814e-08 | 2211 |
| 0.0495 | 0.9882 | 1.3552 | 0.7042 | 7.8290824e-08 | 2212 |
| 0.0514 | 0.9929 | 1.3561 | 0.7042 | 7.82735e-08 | 2213 |
| 0.0484 | 0.9953 | 1.3546 | 0.7042 | 7.825618e-08 | 2214 |
| 0.0538 | 0.9929 | 1.3544 | 0.7042 | 7.823885e-08 | 2215 |
| 0.0515 | 0.9906 | 1.3560 | 0.7042 | 7.822151e-08 | 2216 |
| 0.0540 | 0.9882 | 1.3571 | 0.7042 | 7.8204174e-08 | 2217 |
| 0.0488 | 0.9953 | 1.3586 | 0.7042 | 7.818683e-08 | 2218 |
| 0.0573 | 0.9859 | 1.3571 | 0.7042 | 7.8169485e-08 | 2219 |
| 0.0529 | 0.9906 | 1.3556 | 0.7042 | 7.8152134e-08 | 2220 |
| 0.0570 | 0.9906 | 1.3568 | 0.7113 | 7.813478e-08 | 2221 |
| 0.0598 | 0.9882 | 1.3590 | 0.7113 | 7.8117424e-08 | 2222 |
| 0.0422 | 0.9929 | 1.3608 | 0.7113 | 7.8100065e-08 | 2223 |
| 0.0513 | 0.9906 | 1.3605 | 0.7113 | 7.80827e-08 | 2224 |
| 0.0484 | 0.9976 | 1.3572 | 0.7042 | 7.806533e-08 | 2225 |
| 0.0623 | 0.9859 | 1.3574 | 0.7042 | 7.8047954e-08 | 2226 |
| 0.0551 | 0.9882 | 1.3580 | 0.7042 | 7.8030574e-08 | 2227 |
| 0.0503 | 0.9976 | 1.3593 | 0.7042 | 7.8013194e-08 | 2228 |
| 0.0529 | 0.9929 | 1.3611 | 0.7042 | 7.799581e-08 | 2229 |
| 0.0467 | 0.9929 | 1.3630 | 0.7113 | 7.797842e-08 | 2230 |
| 0.0593 | 0.9906 | 1.3625 | 0.7113 | 7.7961026e-08 | 2231 |
| 0.0585 | 0.9812 | 1.3612 | 0.7042 | 7.794363e-08 | 2232 |
| 0.0516 | 0.9882 | 1.3612 | 0.7113 | 7.792623e-08 | 2233 |
| 0.0543 | 0.9953 | 1.3637 | 0.7113 | 7.790882e-08 | 2234 |
| 0.0474 | 0.9953 | 1.3675 | 0.7042 | 7.7891414e-08 | 2235 |
| 0.0555 | 0.9929 | 1.3666 | 0.7042 | 7.7874e-08 | 2236 |
| 0.0514 | 0.9906 | 1.3662 | 0.7042 | 7.785658e-08 | 2237 |
| 0.0546 | 0.9882 | 1.3652 | 0.7042 | 7.783916e-08 | 2238 |
| 0.0584 | 0.9929 | 1.3642 | 0.7113 | 7.782174e-08 | 2239 |
| 0.0469 | 0.9929 | 1.3636 | 0.7113 | 7.780431e-08 | 2240 |
| 0.0508 | 0.9906 | 1.3669 | 0.7113 | 7.778688e-08 | 2241 |
| 0.0519 | 0.9929 | 1.3674 | 0.7113 | 7.776944e-08 | 2242 |
| 0.0503 | 0.9929 | 1.3689 | 0.7113 | 7.7752006e-08 | 2243 |
| 0.0483 | 0.9953 | 1.3715 | 0.7113 | 7.773456e-08 | 2244 |
| 0.0473 | 0.9953 | 1.3722 | 0.7113 | 7.771711e-08 | 2245 |
| 0.0540 | 0.9906 | 1.3708 | 0.7042 | 7.769966e-08 | 2246 |
| 0.0540 | 0.9929 | 1.3685 | 0.7042 | 7.76822e-08 | 2247 |
| 0.0494 | 0.9953 | 1.3672 | 0.7042 | 7.7664744e-08 | 2248 |
| 0.0490 | 0.9929 | 1.3681 | 0.7042 | 7.764728e-08 | 2249 |
| 0.0544 | 0.9882 | 1.3669 | 0.7113 | 7.7629814e-08 | 2250 |
| 0.0507 | 0.9929 | 1.3658 | 0.7113 | 7.761234e-08 | 2251 |
| 0.0596 | 0.9859 | 1.3644 | 0.6972 | 7.759487e-08 | 2252 |
| 0.0498 | 0.9929 | 1.3634 | 0.7042 | 7.757739e-08 | 2253 |
| 0.0471 | 0.9953 | 1.3654 | 0.6972 | 7.755991e-08 | 2254 |
| 0.0539 | 0.9906 | 1.3651 | 0.6972 | 7.7542424e-08 | 2255 |
| 0.0513 | 0.9882 | 1.3645 | 0.7113 | 7.752494e-08 | 2256 |
| 0.0582 | 0.9859 | 1.3662 | 0.7113 | 7.7507444e-08 | 2257 |
| 0.0417 | 0.9953 | 1.3686 | 0.7113 | 7.748994e-08 | 2258 |
| 0.0502 | 0.9882 | 1.3675 | 0.7042 | 7.747244e-08 | 2259 |
| 0.0526 | 0.9859 | 1.3690 | 0.6972 | 7.7454935e-08 | 2260 |
| 0.0583 | 0.9835 | 1.3704 | 0.7042 | 7.743743e-08 | 2261 |
| 0.0581 | 0.9929 | 1.3704 | 0.7042 | 7.741991e-08 | 2262 |
| 0.0458 | 0.9929 | 1.3715 | 0.7042 | 7.74024e-08 | 2263 |
| 0.0523 | 0.9859 | 1.3736 | 0.7042 | 7.7384875e-08 | 2264 |
| 0.0538 | 0.9929 | 1.3741 | 0.7042 | 7.736735e-08 | 2265 |
| 0.0633 | 0.9788 | 1.3705 | 0.6972 | 7.7349824e-08 | 2266 |
| 0.0626 | 0.9859 | 1.3691 | 0.6972 | 7.7332295e-08 | 2267 |
| 0.0521 | 0.9929 | 1.3705 | 0.6972 | 7.731476e-08 | 2268 |
| 0.0519 | 0.9882 | 1.3732 | 0.6972 | 7.729722e-08 | 2269 |
| 0.0485 | 0.9953 | 1.3742 | 0.7042 | 7.727968e-08 | 2270 |
| 0.0472 | 0.9929 | 1.3732 | 0.7042 | 7.7262136e-08 | 2271 |
| 0.0476 | 0.9953 | 1.3754 | 0.7042 | 7.7244586e-08 | 2272 |
| 0.0464 | 0.9906 | 1.3760 | 0.7042 | 7.7227035e-08 | 2273 |
| 0.0531 | 0.9906 | 1.3728 | 0.7042 | 7.720948e-08 | 2274 |
| 0.0520 | 0.9906 | 1.3718 | 0.6972 | 7.719191e-08 | 2275 |
| 0.0410 | 1.0 | 1.3713 | 0.6972 | 7.717435e-08 | 2276 |
| 0.0593 | 0.9859 | 1.3729 | 0.6972 | 7.715678e-08 | 2277 |
| 0.0533 | 0.9882 | 1.3760 | 0.7042 | 7.7139205e-08 | 2278 |
| 0.0572 | 0.9906 | 1.3765 | 0.7042 | 7.7121626e-08 | 2279 |
| 0.0490 | 0.9929 | 1.3762 | 0.7042 | 7.710405e-08 | 2280 |
| 0.0628 | 0.9812 | 1.3796 | 0.7042 | 7.708646e-08 | 2281 |
| 0.0528 | 0.9929 | 1.3807 | 0.7042 | 7.7068876e-08 | 2282 |
| 0.0521 | 0.9906 | 1.3820 | 0.7042 | 7.705128e-08 | 2283 |
| 0.0432 | 0.9953 | 1.3823 | 0.7042 | 7.703369e-08 | 2284 |
| 0.0514 | 0.9906 | 1.3827 | 0.7042 | 7.701609e-08 | 2285 |
| 0.0542 | 0.9929 | 1.3880 | 0.7042 | 7.699849e-08 | 2286 |
| 0.0509 | 0.9906 | 1.3876 | 0.7042 | 7.698088e-08 | 2287 |
| 0.0492 | 0.9929 | 1.3850 | 0.7042 | 7.6963275e-08 | 2288 |
| 0.0427 | 0.9953 | 1.3844 | 0.7042 | 7.694566e-08 | 2289 |
| 0.0496 | 0.9906 | 1.3854 | 0.7042 | 7.6928046e-08 | 2290 |
| 0.0478 | 0.9929 | 1.3868 | 0.7113 | 7.6910425e-08 | 2291 |
| 0.0484 | 0.9953 | 1.3886 | 0.7113 | 7.68928e-08 | 2292 |
| 0.0492 | 0.9976 | 1.3871 | 0.7113 | 7.6875175e-08 | 2293 |
| 0.0430 | 0.9929 | 1.3844 | 0.7042 | 7.6857546e-08 | 2294 |
| 0.0466 | 0.9906 | 1.3831 | 0.6972 | 7.683991e-08 | 2295 |
| 0.0431 | 0.9882 | 1.3832 | 0.6972 | 7.6822275e-08 | 2296 |
| 0.0508 | 0.9906 | 1.3828 | 0.6972 | 7.680463e-08 | 2297 |
| 0.0465 | 0.9953 | 1.3844 | 0.6972 | 7.678699e-08 | 2298 |
| 0.0510 | 0.9906 | 1.3852 | 0.6972 | 7.676934e-08 | 2299 |
| 0.0623 | 0.9859 | 1.3868 | 0.7042 | 7.675169e-08 | 2300 |
| 0.0503 | 0.9882 | 1.3860 | 0.7113 | 7.673403e-08 | 2301 |
| 0.0420 | 0.9976 | 1.3875 | 0.7113 | 7.6716375e-08 | 2302 |
| 0.0478 | 0.9953 | 1.3875 | 0.7113 | 7.669871e-08 | 2303 |
| 0.0427 | 0.9976 | 1.3880 | 0.7113 | 7.668105e-08 | 2304 |
| 0.0555 | 0.9906 | 1.3861 | 0.6972 | 7.6663376e-08 | 2305 |
| 0.0446 | 0.9953 | 1.3860 | 0.6972 | 7.6645705e-08 | 2306 |
| 0.0447 | 0.9906 | 1.3864 | 0.6972 | 7.662803e-08 | 2307 |
| 0.0599 | 0.9859 | 1.3861 | 0.6972 | 7.661035e-08 | 2308 |
| 0.0502 | 0.9906 | 1.3878 | 0.6972 | 7.659266e-08 | 2309 |
| 0.0386 | 0.9976 | 1.3887 | 0.7113 | 7.657498e-08 | 2310 |
| 0.0453 | 0.9929 | 1.3881 | 0.7113 | 7.6557285e-08 | 2311 |
| 0.0514 | 0.9906 | 1.3902 | 0.7113 | 7.653959e-08 | 2312 |
| 0.0543 | 0.9859 | 1.3923 | 0.7113 | 7.652189e-08 | 2313 |
| 0.0428 | 0.9906 | 1.3903 | 0.7113 | 7.650419e-08 | 2314 |
| 0.0569 | 0.9859 | 1.3908 | 0.7113 | 7.648649e-08 | 2315 |
| 0.0451 | 0.9929 | 1.3923 | 0.7113 | 7.646878e-08 | 2316 |
| 0.0440 | 0.9929 | 1.3906 | 0.7113 | 7.6451066e-08 | 2317 |
| 0.0505 | 0.9859 | 1.3903 | 0.7042 | 7.643335e-08 | 2318 |
| 0.0413 | 0.9882 | 1.3912 | 0.7113 | 7.641563e-08 | 2319 |
| 0.0554 | 0.9906 | 1.3932 | 0.7113 | 7.639791e-08 | 2320 |
| 0.0488 | 0.9976 | 1.3925 | 0.7113 | 7.638018e-08 | 2321 |
| 0.0461 | 0.9906 | 1.3901 | 0.7042 | 7.6362454e-08 | 2322 |
| 0.0535 | 0.9835 | 1.3919 | 0.7113 | 7.634472e-08 | 2323 |
| 0.0502 | 0.9882 | 1.3934 | 0.7113 | 7.6326984e-08 | 2324 |
| 0.0542 | 0.9812 | 1.3912 | 0.7113 | 7.630924e-08 | 2325 |
| 0.0454 | 0.9929 | 1.3928 | 0.7113 | 7.62915e-08 | 2326 |
| 0.0471 | 0.9882 | 1.3932 | 0.7113 | 7.627375e-08 | 2327 |
| 0.0441 | 0.9906 | 1.3928 | 0.7042 | 7.6256e-08 | 2328 |
| 0.0479 | 0.9929 | 1.3915 | 0.7042 | 7.6238244e-08 | 2329 |
| 0.0496 | 0.9835 | 1.3916 | 0.7042 | 7.622049e-08 | 2330 |
| 0.0548 | 0.9882 | 1.3945 | 0.7113 | 7.6202724e-08 | 2331 |
| 0.0441 | 0.9906 | 1.3987 | 0.7113 | 7.618496e-08 | 2332 |
| 0.0526 | 0.9835 | 1.3970 | 0.7113 | 7.616719e-08 | 2333 |
| 0.0496 | 0.9906 | 1.3923 | 0.7042 | 7.614942e-08 | 2334 |
| 0.0392 | 0.9953 | 1.3918 | 0.6972 | 7.613164e-08 | 2335 |
| 0.0454 | 0.9906 | 1.3929 | 0.7042 | 7.6113864e-08 | 2336 |
| 0.0462 | 0.9882 | 1.3938 | 0.7042 | 7.609608e-08 | 2337 |
| 0.0435 | 0.9953 | 1.3937 | 0.7042 | 7.6078294e-08 | 2338 |
| 0.0497 | 0.9906 | 1.3923 | 0.7042 | 7.60605e-08 | 2339 |
| 0.0402 | 0.9976 | 1.3921 | 0.6972 | 7.604271e-08 | 2340 |
| 0.0446 | 0.9953 | 1.3958 | 0.7113 | 7.602491e-08 | 2341 |
| 0.0548 | 0.9859 | 1.4004 | 0.7113 | 7.600711e-08 | 2342 |
| 0.0439 | 0.9953 | 1.4009 | 0.7113 | 7.5989306e-08 | 2343 |
| 0.0493 | 0.9929 | 1.3986 | 0.7113 | 7.59715e-08 | 2344 |
| 0.0466 | 0.9906 | 1.3981 | 0.7113 | 7.5953686e-08 | 2345 |
| 0.0474 | 0.9976 | 1.3965 | 0.7042 | 7.593587e-08 | 2346 |
| 0.0505 | 0.9859 | 1.3971 | 0.7042 | 7.591805e-08 | 2347 |
| 0.0426 | 0.9953 | 1.3992 | 0.7113 | 7.590023e-08 | 2348 |
| 0.0433 | 0.9953 | 1.4004 | 0.7113 | 7.5882404e-08 | 2349 |
| 0.0464 | 0.9976 | 1.4011 | 0.7113 | 7.586458e-08 | 2350 |
| 0.0420 | 0.9906 | 1.4017 | 0.7113 | 7.584674e-08 | 2351 |
| 0.0397 | 0.9953 | 1.3991 | 0.7042 | 7.582891e-08 | 2352 |
| 0.0425 | 0.9953 | 1.3964 | 0.7113 | 7.5811066e-08 | 2353 |
| 0.0587 | 0.9788 | 1.3970 | 0.7042 | 7.5793224e-08 | 2354 |
| 0.0475 | 0.9929 | 1.3989 | 0.7042 | 7.577538e-08 | 2355 |
| 0.0430 | 0.9929 | 1.3995 | 0.7183 | 7.5757534e-08 | 2356 |
| 0.0495 | 0.9882 | 1.4017 | 0.7042 | 7.5739685e-08 | 2357 |
| 0.0375 | 0.9976 | 1.4049 | 0.7042 | 7.572183e-08 | 2358 |
| 0.0443 | 0.9976 | 1.4070 | 0.7113 | 7.570397e-08 | 2359 |
| 0.0410 | 0.9976 | 1.4074 | 0.7113 | 7.568611e-08 | 2360 |
| 0.0384 | 0.9976 | 1.4064 | 0.6972 | 7.566825e-08 | 2361 |
| 0.0479 | 0.9953 | 1.4059 | 0.7042 | 7.565038e-08 | 2362 |
| 0.0491 | 0.9906 | 1.4063 | 0.7042 | 7.563251e-08 | 2363 |
| 0.0483 | 0.9882 | 1.4074 | 0.7113 | 7.561463e-08 | 2364 |
| 0.0356 | 0.9929 | 1.4076 | 0.7113 | 7.559675e-08 | 2365 |
| 0.0391 | 0.9929 | 1.4090 | 0.7042 | 7.557887e-08 | 2366 |
| 0.0472 | 0.9929 | 1.4105 | 0.7113 | 7.556098e-08 | 2367 |
| 0.0425 | 0.9906 | 1.4104 | 0.7042 | 7.554309e-08 | 2368 |
| 0.0535 | 0.9882 | 1.4095 | 0.7042 | 7.55252e-08 | 2369 |
| 0.0409 | 0.9953 | 1.4091 | 0.6972 | 7.55073e-08 | 2370 |
| 0.0457 | 0.9929 | 1.4100 | 0.7042 | 7.54894e-08 | 2371 |
| 0.0487 | 0.9859 | 1.4112 | 0.7042 | 7.5471505e-08 | 2372 |
| 0.0450 | 0.9929 | 1.4109 | 0.7042 | 7.54536e-08 | 2373 |
| 0.0464 | 0.9906 | 1.4094 | 0.6972 | 7.543569e-08 | 2374 |
| 0.0417 | 0.9929 | 1.4091 | 0.7042 | 7.541778e-08 | 2375 |
| 0.0423 | 0.9976 | 1.4093 | 0.7042 | 7.539987e-08 | 2376 |
| 0.0453 | 0.9906 | 1.4116 | 0.7042 | 7.538195e-08 | 2377 |
| 0.0479 | 0.9882 | 1.4164 | 0.7113 | 7.536403e-08 | 2378 |
| 0.0486 | 0.9906 | 1.4164 | 0.7113 | 7.53461e-08 | 2379 |
| 0.0343 | 0.9976 | 1.4162 | 0.7113 | 7.5328174e-08 | 2380 |
| 0.0511 | 0.9859 | 1.4165 | 0.7113 | 7.531024e-08 | 2381 |
| 0.0361 | 0.9953 | 1.4174 | 0.7113 | 7.5292306e-08 | 2382 |
| 0.0437 | 0.9929 | 1.4181 | 0.7113 | 7.5274365e-08 | 2383 |
| 0.0430 | 0.9953 | 1.4166 | 0.7113 | 7.525642e-08 | 2384 |
| 0.0459 | 0.9953 | 1.4175 | 0.7113 | 7.523848e-08 | 2385 |
| 0.0434 | 0.9953 | 1.4197 | 0.7113 | 7.5220534e-08 | 2386 |
| 0.0373 | 0.9953 | 1.4183 | 0.7113 | 7.5202585e-08 | 2387 |
| 0.0412 | 0.9929 | 1.4172 | 0.7113 | 7.518463e-08 | 2388 |
| 0.0620 | 0.9859 | 1.4162 | 0.7113 | 7.5166675e-08 | 2389 |
| 0.0441 | 0.9929 | 1.4185 | 0.7113 | 7.514871e-08 | 2390 |
| 0.0469 | 0.9929 | 1.4209 | 0.7113 | 7.513075e-08 | 2391 |
| 0.0552 | 0.9882 | 1.4205 | 0.7113 | 7.511278e-08 | 2392 |
| 0.0426 | 0.9929 | 1.4175 | 0.7042 | 7.509481e-08 | 2393 |
| 0.0513 | 0.9859 | 1.4156 | 0.7042 | 7.5076834e-08 | 2394 |
| 0.0468 | 0.9929 | 1.4142 | 0.7042 | 7.505886e-08 | 2395 |
| 0.0472 | 0.9882 | 1.4155 | 0.7042 | 7.504088e-08 | 2396 |
| 0.0465 | 0.9929 | 1.4168 | 0.7042 | 7.5022896e-08 | 2397 |
| 0.0402 | 0.9906 | 1.4161 | 0.7042 | 7.500491e-08 | 2398 |
| 0.0371 | 0.9953 | 1.4141 | 0.6972 | 7.498692e-08 | 2399 |
| 0.0425 | 0.9953 | 1.4168 | 0.6972 | 7.496893e-08 | 2400 |
| 0.0594 | 0.9835 | 1.4179 | 0.7042 | 7.495093e-08 | 2401 |
| 0.0439 | 0.9929 | 1.4180 | 0.7042 | 7.4932935e-08 | 2402 |
| 0.0365 | 0.9976 | 1.4180 | 0.7042 | 7.491493e-08 | 2403 |
| 0.0396 | 0.9953 | 1.4185 | 0.7042 | 7.4896924e-08 | 2404 |
| 0.0361 | 0.9976 | 1.4195 | 0.7042 | 7.487892e-08 | 2405 |
| 0.0421 | 0.9953 | 1.4204 | 0.7042 | 7.486091e-08 | 2406 |
| 0.0418 | 0.9906 | 1.4191 | 0.7042 | 7.4842895e-08 | 2407 |
| 0.0471 | 0.9859 | 1.4186 | 0.7042 | 7.4824875e-08 | 2408 |
| 0.0432 | 0.9906 | 1.4182 | 0.7042 | 7.4806856e-08 | 2409 |
| 0.0382 | 0.9953 | 1.4175 | 0.7042 | 7.478883e-08 | 2410 |
| 0.0433 | 0.9906 | 1.4191 | 0.7042 | 7.47708e-08 | 2411 |
| 0.0427 | 0.9929 | 1.4188 | 0.7042 | 7.475277e-08 | 2412 |
| 0.0438 | 0.9929 | 1.4186 | 0.7042 | 7.4734736e-08 | 2413 |
| 0.0594 | 0.9906 | 1.4207 | 0.7042 | 7.47167e-08 | 2414 |
| 0.0418 | 0.9906 | 1.4235 | 0.7042 | 7.469866e-08 | 2415 |
| 0.0402 | 0.9906 | 1.4261 | 0.7113 | 7.468062e-08 | 2416 |
| 0.0396 | 0.9953 | 1.4255 | 0.7113 | 7.466257e-08 | 2417 |
| 0.0449 | 0.9882 | 1.4254 | 0.7042 | 7.4644525e-08 | 2418 |
| 0.0335 | 0.9976 | 1.4243 | 0.7042 | 7.462647e-08 | 2419 |
| 0.0460 | 0.9929 | 1.4234 | 0.6972 | 7.4608415e-08 | 2420 |
| 0.0477 | 0.9906 | 1.4235 | 0.7042 | 7.459036e-08 | 2421 |
| 0.0428 | 0.9882 | 1.4227 | 0.7042 | 7.45723e-08 | 2422 |
| 0.0429 | 0.9953 | 1.4237 | 0.7042 | 7.455424e-08 | 2423 |
| 0.0304 | 1.0 | 1.4240 | 0.7042 | 7.453617e-08 | 2424 |
| 0.0435 | 0.9906 | 1.4211 | 0.7113 | 7.45181e-08 | 2425 |
| 0.0400 | 0.9953 | 1.4213 | 0.7113 | 7.450002e-08 | 2426 |
| 0.0416 | 0.9929 | 1.4235 | 0.7042 | 7.4481946e-08 | 2427 |
| 0.0426 | 0.9953 | 1.4258 | 0.6972 | 7.446386e-08 | 2428 |
| 0.0434 | 0.9953 | 1.4273 | 0.6972 | 7.444578e-08 | 2429 |
| 0.0360 | 0.9929 | 1.4296 | 0.7113 | 7.4427696e-08 | 2430 |
| 0.0391 | 0.9976 | 1.4308 | 0.7113 | 7.4409606e-08 | 2431 |
| 0.0473 | 0.9882 | 1.4342 | 0.7113 | 7.4391515e-08 | 2432 |
| 0.0430 | 0.9929 | 1.4344 | 0.7113 | 7.437342e-08 | 2433 |
| 0.0416 | 0.9929 | 1.4334 | 0.7113 | 7.435532e-08 | 2434 |
| 0.0454 | 0.9882 | 1.4323 | 0.6972 | 7.4337215e-08 | 2435 |
| 0.0358 | 0.9929 | 1.4311 | 0.6972 | 7.431911e-08 | 2436 |
| 0.0472 | 0.9859 | 1.4336 | 0.7113 | 7.4301006e-08 | 2437 |
| 0.0534 | 0.9812 | 1.4365 | 0.7113 | 7.4282895e-08 | 2438 |
| 0.0400 | 0.9929 | 1.4349 | 0.7113 | 7.426478e-08 | 2439 |
| 0.0381 | 0.9953 | 1.4328 | 0.7042 | 7.4246664e-08 | 2440 |
| 0.0298 | 1.0 | 1.4326 | 0.7042 | 7.4228545e-08 | 2441 |
| 0.0431 | 0.9906 | 1.4333 | 0.7042 | 7.4210426e-08 | 2442 |
| 0.0356 | 0.9906 | 1.4348 | 0.7113 | 7.41923e-08 | 2443 |
| 0.0382 | 0.9953 | 1.4344 | 0.7113 | 7.4174174e-08 | 2444 |
| 0.0381 | 0.9929 | 1.4344 | 0.7113 | 7.415604e-08 | 2445 |
| 0.0442 | 0.9953 | 1.4352 | 0.7113 | 7.413791e-08 | 2446 |
| 0.0410 | 0.9906 | 1.4367 | 0.7113 | 7.411977e-08 | 2447 |
| 0.0297 | 0.9976 | 1.4368 | 0.7113 | 7.410163e-08 | 2448 |
| 0.0443 | 0.9906 | 1.4354 | 0.7113 | 7.408349e-08 | 2449 |
| 0.0428 | 0.9906 | 1.4348 | 0.6972 | 7.406534e-08 | 2450 |
| 0.0313 | 1.0 | 1.4360 | 0.6972 | 7.404719e-08 | 2451 |
| 0.0465 | 0.9929 | 1.4382 | 0.7113 | 7.402904e-08 | 2452 |
| 0.0445 | 0.9929 | 1.4378 | 0.7042 | 7.4010885e-08 | 2453 |
| 0.0416 | 0.9953 | 1.4384 | 0.7113 | 7.399273e-08 | 2454 |
| 0.0511 | 0.9882 | 1.4385 | 0.7113 | 7.397457e-08 | 2455 |
| 0.0502 | 0.9882 | 1.4355 | 0.7042 | 7.395641e-08 | 2456 |
| 0.0410 | 0.9953 | 1.4355 | 0.7042 | 7.393824e-08 | 2457 |
| 0.0355 | 0.9976 | 1.4360 | 0.6972 | 7.392007e-08 | 2458 |
| 0.0512 | 0.9906 | 1.4390 | 0.7042 | 7.3901894e-08 | 2459 |
| 0.0485 | 0.9859 | 1.4450 | 0.7113 | 7.388372e-08 | 2460 |
| 0.0341 | 0.9976 | 1.4449 | 0.7113 | 7.386554e-08 | 2461 |
| 0.0361 | 0.9953 | 1.4436 | 0.7113 | 7.384736e-08 | 2462 |
| 0.0376 | 0.9953 | 1.4422 | 0.7113 | 7.382918e-08 | 2463 |
| 0.0373 | 0.9976 | 1.4394 | 0.7042 | 7.381099e-08 | 2464 |
| 0.0475 | 0.9929 | 1.4379 | 0.7042 | 7.37928e-08 | 2465 |
| 0.0419 | 0.9882 | 1.4371 | 0.7042 | 7.377461e-08 | 2466 |
| 0.0346 | 0.9976 | 1.4375 | 0.7042 | 7.375641e-08 | 2467 |
| 0.0406 | 0.9906 | 1.4381 | 0.7042 | 7.3738214e-08 | 2468 |
| 0.0369 | 0.9929 | 1.4391 | 0.7042 | 7.372001e-08 | 2469 |
| 0.0428 | 0.9882 | 1.4401 | 0.7042 | 7.3701806e-08 | 2470 |
| 0.0453 | 0.9906 | 1.4413 | 0.7042 | 7.36836e-08 | 2471 |
| 0.0351 | 0.9929 | 1.4410 | 0.7042 | 7.366539e-08 | 2472 |
| 0.0317 | 1.0 | 1.4413 | 0.7042 | 7.364718e-08 | 2473 |
| 0.0381 | 0.9953 | 1.4419 | 0.7113 | 7.362896e-08 | 2474 |
| 0.0418 | 0.9906 | 1.4394 | 0.7042 | 7.361074e-08 | 2475 |
| 0.0484 | 0.9859 | 1.4397 | 0.7042 | 7.3592524e-08 | 2476 |
| 0.0379 | 0.9906 | 1.4448 | 0.7113 | 7.35743e-08 | 2477 |
| 0.0395 | 0.9929 | 1.4451 | 0.7113 | 7.355607e-08 | 2478 |
| 0.0403 | 0.9929 | 1.4451 | 0.7042 | 7.353784e-08 | 2479 |
| 0.0482 | 0.9906 | 1.4461 | 0.7042 | 7.351961e-08 | 2480 |
| 0.0329 | 0.9976 | 1.4462 | 0.7113 | 7.3501376e-08 | 2481 |
| 0.0506 | 0.9859 | 1.4456 | 0.7042 | 7.3483136e-08 | 2482 |
| 0.0407 | 0.9929 | 1.4476 | 0.7113 | 7.3464896e-08 | 2483 |
| 0.0396 | 0.9953 | 1.4461 | 0.7042 | 7.344665e-08 | 2484 |
| 0.0426 | 0.9929 | 1.4461 | 0.6972 | 7.34284e-08 | 2485 |
| 0.0345 | 0.9929 | 1.4488 | 0.7113 | 7.3410156e-08 | 2486 |
| 0.0525 | 0.9882 | 1.4476 | 0.7113 | 7.33919e-08 | 2487 |
| 0.0413 | 0.9976 | 1.4451 | 0.7042 | 7.337365e-08 | 2488 |
| 0.0347 | 0.9976 | 1.4443 | 0.7042 | 7.335539e-08 | 2489 |
| 0.0362 | 0.9953 | 1.4443 | 0.7042 | 7.333713e-08 | 2490 |
| 0.0395 | 0.9882 | 1.4452 | 0.7042 | 7.3318866e-08 | 2491 |
| 0.0414 | 0.9906 | 1.4454 | 0.7042 | 7.33006e-08 | 2492 |
| 0.0478 | 0.9906 | 1.4461 | 0.7042 | 7.328233e-08 | 2493 |
| 0.0324 | 0.9976 | 1.4459 | 0.7042 | 7.3264054e-08 | 2494 |
| 0.0363 | 0.9953 | 1.4457 | 0.7042 | 7.324578e-08 | 2495 |
| 0.0360 | 0.9976 | 1.4448 | 0.7042 | 7.3227504e-08 | 2496 |
| 0.0369 | 0.9953 | 1.4454 | 0.7042 | 7.320922e-08 | 2497 |
| 0.0352 | 0.9953 | 1.4465 | 0.6972 | 7.319094e-08 | 2498 |
| 0.0428 | 0.9953 | 1.4479 | 0.7042 | 7.317265e-08 | 2499 |
| 0.0317 | 0.9953 | 1.4485 | 0.7113 | 7.315436e-08 | 2500 |
| 0.0337 | 0.9953 | 1.4488 | 0.7113 | 7.313607e-08 | 2501 |
| 0.0383 | 0.9929 | 1.4485 | 0.7042 | 7.3117775e-08 | 2502 |
| 0.0382 | 0.9953 | 1.4500 | 0.7042 | 7.309948e-08 | 2503 |
| 0.0361 | 0.9953 | 1.4529 | 0.7113 | 7.308118e-08 | 2504 |
| 0.0442 | 0.9859 | 1.4520 | 0.6972 | 7.306288e-08 | 2505 |
| 0.0372 | 0.9929 | 1.4497 | 0.6972 | 7.3044575e-08 | 2506 |
| 0.0463 | 0.9882 | 1.4504 | 0.7113 | 7.3026264e-08 | 2507 |
| 0.0315 | 1.0 | 1.4516 | 0.7113 | 7.300795e-08 | 2508 |
| 0.0412 | 0.9906 | 1.4508 | 0.7113 | 7.298964e-08 | 2509 |
| 0.0355 | 0.9976 | 1.4508 | 0.7042 | 7.2971325e-08 | 2510 |
| 0.0392 | 0.9929 | 1.4525 | 0.7113 | 7.295301e-08 | 2511 |
| 0.0420 | 0.9929 | 1.4547 | 0.7113 | 7.293469e-08 | 2512 |
| 0.0423 | 0.9906 | 1.4553 | 0.7113 | 7.2916365e-08 | 2513 |
| 0.0489 | 0.9882 | 1.4544 | 0.7113 | 7.289804e-08 | 2514 |
| 0.0355 | 0.9953 | 1.4535 | 0.7042 | 7.287971e-08 | 2515 |
| 0.0373 | 0.9929 | 1.4516 | 0.7042 | 7.2861376e-08 | 2516 |
| 0.0388 | 0.9929 | 1.4512 | 0.7042 | 7.2843044e-08 | 2517 |
| 0.0311 | 0.9976 | 1.4511 | 0.7042 | 7.2824704e-08 | 2518 |
| 0.0359 | 0.9953 | 1.4517 | 0.7042 | 7.2806365e-08 | 2519 |
| 0.0374 | 0.9929 | 1.4507 | 0.7042 | 7.2788026e-08 | 2520 |
| 0.0403 | 0.9906 | 1.4497 | 0.7042 | 7.276968e-08 | 2521 |
| 0.0358 | 0.9976 | 1.4507 | 0.7042 | 7.2751334e-08 | 2522 |
| 0.0348 | 0.9976 | 1.4500 | 0.7042 | 7.273298e-08 | 2523 |
| 0.0344 | 0.9953 | 1.4512 | 0.7042 | 7.271463e-08 | 2524 |
| 0.0319 | 1.0 | 1.4524 | 0.7042 | 7.2696274e-08 | 2525 |
| 0.0360 | 0.9953 | 1.4515 | 0.7042 | 7.267791e-08 | 2526 |
| 0.0336 | 0.9929 | 1.4507 | 0.7042 | 7.265955e-08 | 2527 |
| 0.0363 | 0.9953 | 1.4505 | 0.7042 | 7.264119e-08 | 2528 |
| 0.0407 | 0.9953 | 1.4518 | 0.7042 | 7.2622825e-08 | 2529 |
| 0.0310 | 0.9976 | 1.4515 | 0.7042 | 7.260446e-08 | 2530 |
| 0.0541 | 0.9859 | 1.4531 | 0.7042 | 7.258608e-08 | 2531 |
| 0.0403 | 0.9953 | 1.4541 | 0.7042 | 7.256771e-08 | 2532 |
| 0.0460 | 0.9859 | 1.4547 | 0.7042 | 7.2549334e-08 | 2533 |
| 0.0460 | 0.9882 | 1.4545 | 0.6972 | 7.253095e-08 | 2534 |
| 0.0342 | 0.9953 | 1.4545 | 0.7042 | 7.251257e-08 | 2535 |
| 0.0423 | 0.9859 | 1.4538 | 0.6972 | 7.249419e-08 | 2536 |
| 0.0391 | 0.9929 | 1.4551 | 0.7042 | 7.24758e-08 | 2537 |
| 0.0340 | 0.9953 | 1.4572 | 0.7042 | 7.245741e-08 | 2538 |
| 0.0318 | 0.9929 | 1.4587 | 0.7042 | 7.243902e-08 | 2539 |
| 0.0367 | 0.9953 | 1.4596 | 0.7042 | 7.2420626e-08 | 2540 |
| 0.0476 | 0.9812 | 1.4581 | 0.6972 | 7.240223e-08 | 2541 |
| 0.0472 | 0.9906 | 1.4591 | 0.7042 | 7.238383e-08 | 2542 |
| 0.0396 | 0.9929 | 1.4582 | 0.7042 | 7.2365424e-08 | 2543 |
| 0.0445 | 0.9882 | 1.4591 | 0.7042 | 7.234702e-08 | 2544 |
| 0.0363 | 0.9929 | 1.4601 | 0.7042 | 7.232861e-08 | 2545 |
| 0.0339 | 0.9953 | 1.4687 | 0.7113 | 7.23102e-08 | 2546 |
| 0.0410 | 0.9929 | 1.4697 | 0.7113 | 7.229179e-08 | 2547 |
| 0.0365 | 0.9929 | 1.4675 | 0.7113 | 7.227337e-08 | 2548 |
| 0.0404 | 0.9929 | 1.4646 | 0.7113 | 7.2254956e-08 | 2549 |
| 0.0424 | 0.9953 | 1.4630 | 0.7042 | 7.223654e-08 | 2550 |
| 0.0389 | 0.9929 | 1.4618 | 0.7042 | 7.2218114e-08 | 2551 |
| 0.0433 | 0.9929 | 1.4594 | 0.7042 | 7.219969e-08 | 2552 |
| 0.0378 | 0.9953 | 1.4582 | 0.6972 | 7.2181265e-08 | 2553 |
| 0.0397 | 0.9906 | 1.4564 | 0.6972 | 7.2162834e-08 | 2554 |
| 0.0359 | 0.9929 | 1.4583 | 0.6972 | 7.21444e-08 | 2555 |
| 0.0311 | 0.9976 | 1.4586 | 0.7042 | 7.2125964e-08 | 2556 |
| 0.0381 | 0.9906 | 1.4574 | 0.6972 | 7.2107525e-08 | 2557 |
| 0.0436 | 0.9882 | 1.4593 | 0.7042 | 7.208909e-08 | 2558 |
| 0.0293 | 1.0 | 1.4607 | 0.7042 | 7.207064e-08 | 2559 |
| 0.0293 | 1.0 | 1.4615 | 0.7042 | 7.2052195e-08 | 2560 |
| 0.0536 | 0.9812 | 1.4608 | 0.7042 | 7.203375e-08 | 2561 |
| 0.0313 | 0.9953 | 1.4609 | 0.6972 | 7.20153e-08 | 2562 |
| 0.0266 | 0.9976 | 1.4617 | 0.6972 | 7.1996844e-08 | 2563 |
| 0.0474 | 0.9859 | 1.4642 | 0.7042 | 7.197839e-08 | 2564 |
| 0.0354 | 1.0 | 1.4656 | 0.7113 | 7.195993e-08 | 2565 |
| 0.0295 | 1.0 | 1.4657 | 0.7042 | 7.194147e-08 | 2566 |
| 0.0414 | 0.9906 | 1.4650 | 0.7042 | 7.192301e-08 | 2567 |
| 0.0360 | 0.9929 | 1.4622 | 0.7042 | 7.1904545e-08 | 2568 |
| 0.0414 | 0.9906 | 1.4636 | 0.7042 | 7.188608e-08 | 2569 |
| 0.0300 | 0.9976 | 1.4664 | 0.7042 | 7.186761e-08 | 2570 |
| 0.0278 | 0.9976 | 1.4645 | 0.7042 | 7.1849136e-08 | 2571 |
| 0.0364 | 0.9929 | 1.4642 | 0.6972 | 7.183066e-08 | 2572 |
| 0.0347 | 0.9929 | 1.4645 | 0.7042 | 7.181219e-08 | 2573 |
| 0.0345 | 0.9953 | 1.4658 | 0.6972 | 7.179371e-08 | 2574 |
| 0.0318 | 0.9976 | 1.4707 | 0.7042 | 7.1775226e-08 | 2575 |
| 0.0375 | 0.9929 | 1.4720 | 0.7113 | 7.1756745e-08 | 2576 |
| 0.0259 | 1.0 | 1.4720 | 0.7113 | 7.1738256e-08 | 2577 |
| 0.0322 | 0.9953 | 1.4685 | 0.7042 | 7.171977e-08 | 2578 |
| 0.0400 | 0.9882 | 1.4688 | 0.7042 | 7.170128e-08 | 2579 |
| 0.0399 | 0.9953 | 1.4727 | 0.7042 | 7.1682784e-08 | 2580 |
| 0.0293 | 0.9976 | 1.4699 | 0.6972 | 7.166429e-08 | 2581 |
| 0.0326 | 0.9929 | 1.4692 | 0.6972 | 7.164579e-08 | 2582 |
| 0.0304 | 0.9976 | 1.4674 | 0.7042 | 7.162729e-08 | 2583 |
| 0.0430 | 0.9953 | 1.4691 | 0.7042 | 7.160879e-08 | 2584 |
| 0.0316 | 1.0 | 1.4718 | 0.7042 | 7.159028e-08 | 2585 |
| 0.0382 | 0.9929 | 1.4703 | 0.6972 | 7.157177e-08 | 2586 |
| 0.0304 | 0.9953 | 1.4711 | 0.7042 | 7.155326e-08 | 2587 |
| 0.0364 | 0.9882 | 1.4720 | 0.7042 | 7.153474e-08 | 2588 |
| 0.0308 | 0.9953 | 1.4786 | 0.7113 | 7.1516226e-08 | 2589 |
| 0.0314 | 0.9976 | 1.4783 | 0.7113 | 7.149771e-08 | 2590 |
| 0.0476 | 0.9906 | 1.4732 | 0.7042 | 7.147919e-08 | 2591 |
| 0.0371 | 0.9929 | 1.4695 | 0.7042 | 7.146067e-08 | 2592 |
| 0.0394 | 0.9929 | 1.4693 | 0.7042 | 7.1442145e-08 | 2593 |
| 0.0372 | 0.9953 | 1.4712 | 0.7042 | 7.142362e-08 | 2594 |
| 0.0383 | 0.9929 | 1.4698 | 0.7042 | 7.140509e-08 | 2595 |
| 0.0450 | 0.9929 | 1.4699 | 0.7042 | 7.138656e-08 | 2596 |
| 0.0350 | 0.9906 | 1.4731 | 0.7042 | 7.136803e-08 | 2597 |
| 0.0320 | 0.9976 | 1.4712 | 0.6972 | 7.134949e-08 | 2598 |
| 0.0299 | 0.9976 | 1.4713 | 0.6972 | 7.133095e-08 | 2599 |
| 0.0350 | 0.9953 | 1.4717 | 0.6972 | 7.1312414e-08 | 2600 |
| 0.0425 | 0.9882 | 1.4753 | 0.7042 | 7.129387e-08 | 2601 |
| 0.0339 | 0.9976 | 1.4776 | 0.7042 | 7.127532e-08 | 2602 |
| 0.0278 | 0.9953 | 1.4765 | 0.6972 | 7.125678e-08 | 2603 |
| 0.0370 | 0.9882 | 1.4761 | 0.6972 | 7.1238226e-08 | 2604 |
| 0.0361 | 0.9882 | 1.4773 | 0.7042 | 7.1219674e-08 | 2605 |
| 0.0396 | 0.9906 | 1.4777 | 0.7042 | 7.120112e-08 | 2606 |
| 0.0335 | 0.9929 | 1.4774 | 0.7042 | 7.118256e-08 | 2607 |
| 0.0364 | 0.9929 | 1.4762 | 0.6972 | 7.1164e-08 | 2608 |
| 0.0459 | 0.9835 | 1.4738 | 0.7042 | 7.114544e-08 | 2609 |
| 0.0397 | 0.9929 | 1.4733 | 0.7042 | 7.112688e-08 | 2610 |
| 0.0291 | 0.9953 | 1.4749 | 0.7042 | 7.110831e-08 | 2611 |
| 0.0322 | 1.0 | 1.4785 | 0.7042 | 7.1089744e-08 | 2612 |
| 0.0362 | 0.9976 | 1.4791 | 0.7042 | 7.107117e-08 | 2613 |
| 0.0329 | 0.9976 | 1.4802 | 0.7042 | 7.10526e-08 | 2614 |
| 0.0303 | 0.9976 | 1.4789 | 0.7042 | 7.103402e-08 | 2615 |
| 0.0328 | 0.9953 | 1.4781 | 0.7042 | 7.101544e-08 | 2616 |
| 0.0288 | 0.9976 | 1.4794 | 0.7042 | 7.099686e-08 | 2617 |
| 0.0348 | 0.9929 | 1.4791 | 0.7042 | 7.097828e-08 | 2618 |
| 0.0442 | 0.9929 | 1.4779 | 0.7042 | 7.095969e-08 | 2619 |
| 0.0284 | 0.9976 | 1.4784 | 0.7042 | 7.0941105e-08 | 2620 |
| 0.0369 | 0.9929 | 1.4800 | 0.7042 | 7.092252e-08 | 2621 |
| 0.0448 | 0.9882 | 1.4790 | 0.7042 | 7.090393e-08 | 2622 |
| 0.0324 | 0.9953 | 1.4775 | 0.7042 | 7.0885335e-08 | 2623 |
| 0.0295 | 0.9929 | 1.4775 | 0.7042 | 7.086674e-08 | 2624 |
| 0.0381 | 0.9882 | 1.4823 | 0.7042 | 7.0848145e-08 | 2625 |
| 0.0356 | 0.9953 | 1.4820 | 0.7042 | 7.082954e-08 | 2626 |
| 0.0337 | 0.9953 | 1.4814 | 0.7042 | 7.081094e-08 | 2627 |
| 0.0354 | 0.9929 | 1.4804 | 0.7042 | 7.079234e-08 | 2628 |
| 0.0354 | 0.9953 | 1.4823 | 0.7113 | 7.077373e-08 | 2629 |
| 0.0270 | 0.9976 | 1.4818 | 0.7113 | 7.075512e-08 | 2630 |
| 0.0291 | 0.9976 | 1.4817 | 0.7113 | 7.073651e-08 | 2631 |
| 0.0383 | 0.9976 | 1.4806 | 0.7042 | 7.0717896e-08 | 2632 |
| 0.0333 | 0.9976 | 1.4808 | 0.7042 | 7.069928e-08 | 2633 |
| 0.0329 | 0.9929 | 1.4775 | 0.7042 | 7.068066e-08 | 2634 |
| 0.0340 | 0.9953 | 1.4751 | 0.7113 | 7.066205e-08 | 2635 |
| 0.0271 | 0.9976 | 1.4767 | 0.7113 | 7.0643424e-08 | 2636 |
| 0.0291 | 0.9953 | 1.4785 | 0.7042 | 7.06248e-08 | 2637 |
| 0.0375 | 0.9953 | 1.4796 | 0.7042 | 7.060618e-08 | 2638 |
| 0.0337 | 0.9929 | 1.4813 | 0.7113 | 7.058755e-08 | 2639 |
| 0.0297 | 0.9953 | 1.4827 | 0.7042 | 7.0568916e-08 | 2640 |
| 0.0281 | 0.9976 | 1.4850 | 0.7042 | 7.0550286e-08 | 2641 |
| 0.0411 | 0.9882 | 1.4833 | 0.7042 | 7.053165e-08 | 2642 |
| 0.0354 | 0.9929 | 1.4831 | 0.6972 | 7.051301e-08 | 2643 |
| 0.0290 | 1.0 | 1.4810 | 0.7113 | 7.049437e-08 | 2644 |
| 0.0342 | 0.9953 | 1.4806 | 0.7113 | 7.0475735e-08 | 2645 |
| 0.0295 | 0.9953 | 1.4830 | 0.7042 | 7.045709e-08 | 2646 |
| 0.0356 | 0.9906 | 1.4837 | 0.7042 | 7.0438446e-08 | 2647 |
| 0.0342 | 0.9929 | 1.4843 | 0.7042 | 7.04198e-08 | 2648 |
| 0.0348 | 0.9953 | 1.4852 | 0.7042 | 7.040115e-08 | 2649 |
| 0.0288 | 0.9953 | 1.4871 | 0.7042 | 7.03825e-08 | 2650 |
| 0.0298 | 0.9976 | 1.4870 | 0.7042 | 7.0363846e-08 | 2651 |
| 0.0284 | 0.9953 | 1.4881 | 0.7042 | 7.034519e-08 | 2652 |
| 0.0364 | 0.9906 | 1.4890 | 0.7042 | 7.032653e-08 | 2653 |
| 0.0296 | 0.9976 | 1.4880 | 0.7042 | 7.030787e-08 | 2654 |
| 0.0319 | 0.9953 | 1.4859 | 0.7042 | 7.028921e-08 | 2655 |
| 0.0428 | 0.9882 | 1.4869 | 0.7042 | 7.0270545e-08 | 2656 |
| 0.0423 | 0.9859 | 1.4916 | 0.7042 | 7.025188e-08 | 2657 |
| 0.0365 | 0.9882 | 1.4941 | 0.7042 | 7.023321e-08 | 2658 |
| 0.0416 | 0.9835 | 1.4920 | 0.7042 | 7.021454e-08 | 2659 |
| 0.0269 | 1.0 | 1.4917 | 0.7042 | 7.019587e-08 | 2660 |
| 0.0330 | 0.9906 | 1.4945 | 0.7042 | 7.0177194e-08 | 2661 |
| 0.0268 | 1.0 | 1.4948 | 0.7042 | 7.015852e-08 | 2662 |
| 0.0371 | 0.9906 | 1.4948 | 0.7042 | 7.013984e-08 | 2663 |
| 0.0377 | 0.9929 | 1.4947 | 0.7042 | 7.012116e-08 | 2664 |
| 0.0334 | 0.9953 | 1.4939 | 0.7113 | 7.010248e-08 | 2665 |
| 0.0369 | 0.9929 | 1.4914 | 0.7042 | 7.008379e-08 | 2666 |
| 0.0444 | 0.9929 | 1.4921 | 0.7042 | 7.0065106e-08 | 2667 |
| 0.0460 | 0.9859 | 1.4876 | 0.7042 | 7.004642e-08 | 2668 |
| 0.0272 | 0.9953 | 1.4876 | 0.7042 | 7.002773e-08 | 2669 |
| 0.0371 | 0.9929 | 1.4904 | 0.7042 | 7.000904e-08 | 2670 |
| 0.0291 | 1.0 | 1.4919 | 0.7042 | 6.999034e-08 | 2671 |
| 0.0340 | 0.9953 | 1.4948 | 0.7042 | 6.997165e-08 | 2672 |
| 0.0292 | 1.0 | 1.4970 | 0.7042 | 6.995295e-08 | 2673 |
| 0.0383 | 0.9906 | 1.4995 | 0.7113 | 6.9934245e-08 | 2674 |
| 0.0294 | 1.0 | 1.4985 | 0.7042 | 6.9915544e-08 | 2675 |
| 0.0255 | 1.0 | 1.4988 | 0.7042 | 6.989684e-08 | 2676 |
| 0.0286 | 0.9953 | 1.4980 | 0.7042 | 6.987813e-08 | 2677 |
| 0.0345 | 0.9906 | 1.4977 | 0.7042 | 6.9859425e-08 | 2678 |
| 0.0271 | 0.9976 | 1.4986 | 0.7042 | 6.9840716e-08 | 2679 |
| 0.0414 | 0.9882 | 1.4968 | 0.7042 | 6.9822e-08 | 2680 |
| 0.0371 | 0.9929 | 1.4951 | 0.7042 | 6.9803285e-08 | 2681 |
| 0.0371 | 0.9929 | 1.4915 | 0.7042 | 6.978457e-08 | 2682 |
| 0.0312 | 0.9953 | 1.4902 | 0.7042 | 6.976585e-08 | 2683 |
| 0.0289 | 1.0 | 1.4906 | 0.7042 | 6.974713e-08 | 2684 |
| 0.0282 | 0.9953 | 1.4924 | 0.7042 | 6.972841e-08 | 2685 |
| 0.0318 | 0.9953 | 1.4939 | 0.7042 | 6.9709685e-08 | 2686 |
| 0.0222 | 1.0 | 1.4928 | 0.7042 | 6.969096e-08 | 2687 |
| 0.0368 | 0.9859 | 1.4925 | 0.7042 | 6.967223e-08 | 2688 |
| 0.0376 | 0.9882 | 1.4934 | 0.7042 | 6.96535e-08 | 2689 |
| 0.0285 | 0.9976 | 1.4886 | 0.7113 | 6.963477e-08 | 2690 |
| 0.0327 | 0.9906 | 1.4889 | 0.7113 | 6.9616036e-08 | 2691 |
| 0.0262 | 0.9976 | 1.4907 | 0.7113 | 6.95973e-08 | 2692 |
| 0.0298 | 0.9953 | 1.4937 | 0.7113 | 6.957856e-08 | 2693 |
| 0.0406 | 0.9929 | 1.4981 | 0.7042 | 6.9559825e-08 | 2694 |
| 0.0461 | 0.9929 | 1.4967 | 0.7113 | 6.954108e-08 | 2695 |
| 0.0305 | 0.9953 | 1.4969 | 0.7042 | 6.9522336e-08 | 2696 |
| 0.0382 | 0.9953 | 1.4962 | 0.7042 | 6.950359e-08 | 2697 |
| 0.0298 | 0.9929 | 1.4962 | 0.7042 | 6.948485e-08 | 2698 |
| 0.0304 | 0.9976 | 1.4998 | 0.7042 | 6.94661e-08 | 2699 |
| 0.0289 | 0.9976 | 1.4997 | 0.7042 | 6.9447346e-08 | 2700 |
| 0.0356 | 0.9906 | 1.4992 | 0.7042 | 6.9428594e-08 | 2701 |
| 0.0264 | 0.9976 | 1.4993 | 0.7042 | 6.940984e-08 | 2702 |
| 0.0272 | 0.9976 | 1.4992 | 0.7042 | 6.9391085e-08 | 2703 |
| 0.0299 | 0.9953 | 1.4979 | 0.7042 | 6.937233e-08 | 2704 |
| 0.0312 | 0.9976 | 1.4967 | 0.7113 | 6.935357e-08 | 2705 |
| 0.0280 | 0.9953 | 1.4971 | 0.7113 | 6.93348e-08 | 2706 |
| 0.0282 | 0.9953 | 1.4998 | 0.7113 | 6.931604e-08 | 2707 |
| 0.0307 | 0.9953 | 1.4990 | 0.7113 | 6.929727e-08 | 2708 |
| 0.0291 | 0.9929 | 1.5012 | 0.7113 | 6.927851e-08 | 2709 |
| 0.0283 | 0.9976 | 1.5018 | 0.7113 | 6.9259734e-08 | 2710 |
| 0.0400 | 0.9882 | 1.5010 | 0.7113 | 6.924096e-08 | 2711 |
| 0.0298 | 0.9953 | 1.5007 | 0.7042 | 6.922219e-08 | 2712 |
| 0.0246 | 1.0 | 1.5021 | 0.7042 | 6.9203416e-08 | 2713 |
| 0.0317 | 0.9953 | 1.5030 | 0.7042 | 6.918464e-08 | 2714 |
| 0.0337 | 0.9953 | 1.5037 | 0.7042 | 6.916586e-08 | 2715 |
| 0.0373 | 0.9929 | 1.5027 | 0.7042 | 6.914708e-08 | 2716 |
| 0.0273 | 0.9976 | 1.5050 | 0.7042 | 6.91283e-08 | 2717 |
| 0.0372 | 0.9906 | 1.5090 | 0.7042 | 6.910951e-08 | 2718 |
| 0.0292 | 0.9976 | 1.5107 | 0.7042 | 6.9090724e-08 | 2719 |
| 0.0275 | 1.0 | 1.5095 | 0.7042 | 6.907194e-08 | 2720 |
| 0.0238 | 0.9976 | 1.5095 | 0.7042 | 6.905315e-08 | 2721 |
| 0.0225 | 1.0 | 1.5093 | 0.7042 | 6.903436e-08 | 2722 |
| 0.0264 | 0.9953 | 1.5085 | 0.7042 | 6.901556e-08 | 2723 |
| 0.0288 | 0.9953 | 1.5077 | 0.7042 | 6.899677e-08 | 2724 |
| 0.0350 | 0.9929 | 1.5119 | 0.7042 | 6.8977975e-08 | 2725 |
| 0.0354 | 0.9906 | 1.5117 | 0.7042 | 6.8959174e-08 | 2726 |
| 0.0218 | 1.0 | 1.5104 | 0.7042 | 6.894037e-08 | 2727 |
| 0.0285 | 0.9953 | 1.5088 | 0.7042 | 6.892157e-08 | 2728 |
| 0.0286 | 0.9953 | 1.5082 | 0.7042 | 6.890277e-08 | 2729 |
| 0.0323 | 0.9929 | 1.5104 | 0.7042 | 6.888396e-08 | 2730 |
| 0.0259 | 0.9976 | 1.5126 | 0.7042 | 6.8865155e-08 | 2731 |
| 0.0232 | 1.0 | 1.5153 | 0.7042 | 6.884635e-08 | 2732 |
| 0.0253 | 0.9976 | 1.5143 | 0.7042 | 6.882754e-08 | 2733 |
| 0.0278 | 0.9953 | 1.5109 | 0.7042 | 6.8808724e-08 | 2734 |
| 0.0470 | 0.9882 | 1.5076 | 0.7042 | 6.878991e-08 | 2735 |
| 0.0350 | 0.9953 | 1.5092 | 0.7042 | 6.8771094e-08 | 2736 |
| 0.0325 | 0.9953 | 1.5088 | 0.7042 | 6.875228e-08 | 2737 |
| 0.0239 | 1.0 | 1.5068 | 0.7042 | 6.8733456e-08 | 2738 |
| 0.0340 | 0.9929 | 1.5053 | 0.7113 | 6.8714634e-08 | 2739 |
| 0.0266 | 0.9976 | 1.5057 | 0.7113 | 6.869581e-08 | 2740 |
| 0.0287 | 0.9929 | 1.5069 | 0.7042 | 6.867699e-08 | 2741 |
| 0.0351 | 0.9929 | 1.5092 | 0.7042 | 6.865816e-08 | 2742 |
| 0.0309 | 0.9929 | 1.5101 | 0.7113 | 6.863933e-08 | 2743 |
| 0.0284 | 0.9929 | 1.5136 | 0.7042 | 6.86205e-08 | 2744 |
| 0.0222 | 1.0 | 1.5161 | 0.7042 | 6.860167e-08 | 2745 |
| 0.0229 | 0.9976 | 1.5154 | 0.7042 | 6.858284e-08 | 2746 |
| 0.0288 | 0.9976 | 1.5156 | 0.7042 | 6.8564006e-08 | 2747 |
| 0.0388 | 0.9882 | 1.5170 | 0.7042 | 6.854517e-08 | 2748 |
| 0.0320 | 0.9976 | 1.5173 | 0.7042 | 6.852633e-08 | 2749 |
| 0.0332 | 0.9929 | 1.5174 | 0.7042 | 6.85075e-08 | 2750 |
| 0.0387 | 0.9882 | 1.5183 | 0.7042 | 6.848865e-08 | 2751 |
| 0.0342 | 0.9953 | 1.5193 | 0.7042 | 6.846981e-08 | 2752 |
| 0.0465 | 0.9882 | 1.5215 | 0.7042 | 6.8450966e-08 | 2753 |
| 0.0238 | 1.0 | 1.5241 | 0.6972 | 6.843212e-08 | 2754 |
| 0.0328 | 0.9953 | 1.5258 | 0.6972 | 6.841327e-08 | 2755 |
| 0.0316 | 0.9929 | 1.5235 | 0.7042 | 6.839442e-08 | 2756 |
| 0.0315 | 0.9906 | 1.5230 | 0.7042 | 6.837557e-08 | 2757 |
| 0.0267 | 0.9976 | 1.5221 | 0.7042 | 6.835672e-08 | 2758 |
| 0.0330 | 0.9929 | 1.5201 | 0.7042 | 6.833786e-08 | 2759 |
| 0.0232 | 0.9953 | 1.5179 | 0.6972 | 6.8319004e-08 | 2760 |
| 0.0304 | 0.9929 | 1.5186 | 0.6972 | 6.8300146e-08 | 2761 |
| 0.0274 | 0.9953 | 1.5196 | 0.7042 | 6.828129e-08 | 2762 |
| 0.0290 | 0.9976 | 1.5225 | 0.7042 | 6.826243e-08 | 2763 |
| 0.0271 | 0.9953 | 1.5199 | 0.7042 | 6.8243565e-08 | 2764 |
| 0.0227 | 0.9976 | 1.5190 | 0.7042 | 6.82247e-08 | 2765 |
| 0.0297 | 0.9953 | 1.5210 | 0.7042 | 6.8205836e-08 | 2766 |
| 0.0331 | 0.9929 | 1.5224 | 0.7042 | 6.818697e-08 | 2767 |
| 0.0269 | 1.0 | 1.5210 | 0.7042 | 6.81681e-08 | 2768 |
| 0.0247 | 0.9976 | 1.5213 | 0.7042 | 6.814923e-08 | 2769 |
| 0.0222 | 1.0 | 1.5227 | 0.7042 | 6.8130355e-08 | 2770 |
| 0.0219 | 1.0 | 1.5231 | 0.7042 | 6.811148e-08 | 2771 |
| 0.0451 | 0.9882 | 1.5247 | 0.7042 | 6.809261e-08 | 2772 |
| 0.0298 | 0.9929 | 1.5262 | 0.6972 | 6.807373e-08 | 2773 |
| 0.0319 | 0.9906 | 1.5273 | 0.6972 | 6.805485e-08 | 2774 |
| 0.0335 | 0.9953 | 1.5282 | 0.6972 | 6.803597e-08 | 2775 |
| 0.0253 | 0.9976 | 1.5257 | 0.6972 | 6.8017094e-08 | 2776 |
| 0.0318 | 0.9953 | 1.5245 | 0.6972 | 6.799821e-08 | 2777 |
| 0.0220 | 1.0 | 1.5257 | 0.6972 | 6.797932e-08 | 2778 |
| 0.0331 | 0.9953 | 1.5289 | 0.7042 | 6.7960436e-08 | 2779 |
| 0.0316 | 0.9929 | 1.5282 | 0.7042 | 6.794155e-08 | 2780 |
| 0.0287 | 0.9953 | 1.5259 | 0.7042 | 6.792266e-08 | 2781 |
| 0.0301 | 0.9929 | 1.5272 | 0.7042 | 6.790377e-08 | 2782 |
| 0.0239 | 0.9976 | 1.5266 | 0.7042 | 6.7884876e-08 | 2783 |
| 0.0297 | 0.9953 | 1.5273 | 0.6972 | 6.786598e-08 | 2784 |
| 0.0290 | 0.9929 | 1.5283 | 0.6972 | 6.784709e-08 | 2785 |
| 0.0464 | 0.9812 | 1.5314 | 0.7042 | 6.782819e-08 | 2786 |
| 0.0281 | 0.9929 | 1.5319 | 0.7042 | 6.780929e-08 | 2787 |
| 0.0231 | 0.9976 | 1.5294 | 0.7042 | 6.779039e-08 | 2788 |
| 0.0302 | 0.9976 | 1.5278 | 0.7042 | 6.777149e-08 | 2789 |
| 0.0228 | 1.0 | 1.5277 | 0.7042 | 6.775259e-08 | 2790 |
| 0.0233 | 0.9953 | 1.5292 | 0.7042 | 6.773368e-08 | 2791 |
| 0.0305 | 0.9976 | 1.5300 | 0.7042 | 6.771477e-08 | 2792 |
| 0.0199 | 1.0 | 1.5283 | 0.7042 | 6.7695865e-08 | 2793 |
| 0.0301 | 0.9976 | 1.5281 | 0.7042 | 6.767696e-08 | 2794 |
| 0.0224 | 0.9976 | 1.5279 | 0.7042 | 6.765805e-08 | 2795 |
| 0.0271 | 0.9976 | 1.5290 | 0.7042 | 6.7639135e-08 | 2796 |
| 0.0244 | 0.9976 | 1.5322 | 0.7042 | 6.762022e-08 | 2797 |
| 0.0227 | 1.0 | 1.5322 | 0.7042 | 6.7601306e-08 | 2798 |
| 0.0294 | 0.9976 | 1.5291 | 0.7042 | 6.758239e-08 | 2799 |
| 0.0298 | 0.9906 | 1.5262 | 0.7042 | 6.756348e-08 | 2800 |
| 0.0272 | 0.9953 | 1.5263 | 0.7042 | 6.7544555e-08 | 2801 |
| 0.0237 | 0.9976 | 1.5257 | 0.7042 | 6.752563e-08 | 2802 |
| 0.0261 | 0.9976 | 1.5243 | 0.7042 | 6.750671e-08 | 2803 |
| 0.0311 | 0.9976 | 1.5248 | 0.7042 | 6.748779e-08 | 2804 |
| 0.0305 | 0.9953 | 1.5240 | 0.6972 | 6.746887e-08 | 2805 |
| 0.0268 | 0.9976 | 1.5263 | 0.7113 | 6.744994e-08 | 2806 |
| 0.0284 | 0.9929 | 1.5289 | 0.6972 | 6.743101e-08 | 2807 |
| 0.0324 | 0.9929 | 1.5298 | 0.6972 | 6.741208e-08 | 2808 |
| 0.0228 | 1.0 | 1.5306 | 0.6972 | 6.739315e-08 | 2809 |
| 0.0386 | 0.9906 | 1.5334 | 0.7042 | 6.737422e-08 | 2810 |
| 0.0401 | 0.9882 | 1.5346 | 0.7042 | 6.735529e-08 | 2811 |
| 0.0219 | 1.0 | 1.5354 | 0.7042 | 6.733635e-08 | 2812 |
| 0.0318 | 0.9929 | 1.5357 | 0.7042 | 6.7317416e-08 | 2813 |
| 0.0391 | 0.9859 | 1.5395 | 0.7042 | 6.729848e-08 | 2814 |
| 0.0250 | 0.9976 | 1.5395 | 0.7042 | 6.7279544e-08 | 2815 |
| 0.0289 | 0.9929 | 1.5376 | 0.7042 | 6.72606e-08 | 2816 |
| 0.0210 | 1.0 | 1.5386 | 0.7042 | 6.724166e-08 | 2817 |
| 0.0231 | 0.9976 | 1.5392 | 0.7042 | 6.7222715e-08 | 2818 |
| 0.0263 | 0.9929 | 1.5364 | 0.7042 | 6.720377e-08 | 2819 |
| 0.0318 | 0.9976 | 1.5336 | 0.7042 | 6.718483e-08 | 2820 |
| 0.0333 | 0.9953 | 1.5309 | 0.7042 | 6.716588e-08 | 2821 |
| 0.0225 | 1.0 | 1.5313 | 0.7042 | 6.714693e-08 | 2822 |
| 0.0318 | 0.9929 | 1.5315 | 0.7042 | 6.712798e-08 | 2823 |
| 0.0262 | 0.9953 | 1.5291 | 0.7042 | 6.710903e-08 | 2824 |
| 0.0226 | 0.9976 | 1.5294 | 0.7042 | 6.709008e-08 | 2825 |
| 0.0287 | 0.9953 | 1.5344 | 0.7042 | 6.707112e-08 | 2826 |
| 0.0297 | 0.9929 | 1.5354 | 0.7042 | 6.705216e-08 | 2827 |
| 0.0173 | 1.0 | 1.5344 | 0.7042 | 6.7033206e-08 | 2828 |
| 0.0239 | 0.9976 | 1.5343 | 0.7042 | 6.701425e-08 | 2829 |
| 0.0335 | 0.9906 | 1.5365 | 0.7042 | 6.699529e-08 | 2830 |
| 0.0332 | 0.9929 | 1.5391 | 0.7042 | 6.697633e-08 | 2831 |
| 0.0260 | 0.9953 | 1.5386 | 0.7042 | 6.695736e-08 | 2832 |
| 0.0242 | 0.9953 | 1.5355 | 0.7042 | 6.69384e-08 | 2833 |
| 0.0247 | 0.9953 | 1.5344 | 0.7042 | 6.691943e-08 | 2834 |
| 0.0217 | 0.9953 | 1.5335 | 0.7042 | 6.690047e-08 | 2835 |
| 0.0271 | 0.9953 | 1.5339 | 0.7042 | 6.6881505e-08 | 2836 |
| 0.0227 | 0.9976 | 1.5343 | 0.7042 | 6.686253e-08 | 2837 |
| 0.0210 | 1.0 | 1.5352 | 0.7042 | 6.684356e-08 | 2838 |
| 0.0206 | 1.0 | 1.5355 | 0.7042 | 6.682459e-08 | 2839 |
| 0.0260 | 0.9953 | 1.5354 | 0.7042 | 6.680562e-08 | 2840 |
| 0.0359 | 0.9859 | 1.5371 | 0.7042 | 6.678665e-08 | 2841 |
| 0.0285 | 0.9953 | 1.5392 | 0.7042 | 6.676767e-08 | 2842 |
| 0.0225 | 0.9976 | 1.5407 | 0.7042 | 6.674869e-08 | 2843 |
| 0.0271 | 1.0 | 1.5383 | 0.7042 | 6.672971e-08 | 2844 |
| 0.0219 | 1.0 | 1.5361 | 0.7042 | 6.671073e-08 | 2845 |
| 0.0262 | 0.9953 | 1.5358 | 0.7042 | 6.6691754e-08 | 2846 |
| 0.0221 | 1.0 | 1.5353 | 0.7042 | 6.6672776e-08 | 2847 |
| 0.0244 | 0.9976 | 1.5355 | 0.7042 | 6.665379e-08 | 2848 |
| 0.0271 | 0.9929 | 1.5369 | 0.7042 | 6.6634804e-08 | 2849 |
| 0.0255 | 0.9976 | 1.5378 | 0.7042 | 6.661582e-08 | 2850 |
| 0.0260 | 0.9953 | 1.5374 | 0.7042 | 6.659683e-08 | 2851 |
| 0.0225 | 1.0 | 1.5390 | 0.7042 | 6.657785e-08 | 2852 |
| 0.0293 | 0.9929 | 1.5385 | 0.7042 | 6.655886e-08 | 2853 |
| 0.0195 | 1.0 | 1.5399 | 0.6972 | 6.653987e-08 | 2854 |
| 0.0277 | 0.9953 | 1.5421 | 0.6972 | 6.6520876e-08 | 2855 |
| 0.0228 | 0.9976 | 1.5421 | 0.6972 | 6.650188e-08 | 2856 |
| 0.0254 | 0.9976 | 1.5421 | 0.7042 | 6.648289e-08 | 2857 |
| 0.0228 | 0.9976 | 1.5420 | 0.7042 | 6.64639e-08 | 2858 |
| 0.0328 | 0.9906 | 1.5433 | 0.7042 | 6.64449e-08 | 2859 |
| 0.0263 | 0.9953 | 1.5458 | 0.7042 | 6.64259e-08 | 2860 |
| 0.0337 | 0.9953 | 1.5457 | 0.7042 | 6.64069e-08 | 2861 |
| 0.0334 | 0.9929 | 1.5441 | 0.7042 | 6.63879e-08 | 2862 |
| 0.0239 | 1.0 | 1.5414 | 0.7042 | 6.63689e-08 | 2863 |
| 0.0255 | 0.9953 | 1.5408 | 0.7042 | 6.63499e-08 | 2864 |
| 0.0324 | 0.9953 | 1.5414 | 0.7042 | 6.633089e-08 | 2865 |
| 0.0290 | 0.9906 | 1.5408 | 0.7042 | 6.6311884e-08 | 2866 |
| 0.0275 | 0.9906 | 1.5397 | 0.7042 | 6.629288e-08 | 2867 |
| 0.0203 | 1.0 | 1.5384 | 0.7042 | 6.627387e-08 | 2868 |
| 0.0269 | 0.9953 | 1.5389 | 0.7042 | 6.625486e-08 | 2869 |
| 0.0226 | 1.0 | 1.5399 | 0.7042 | 6.6235856e-08 | 2870 |
| 0.0283 | 0.9882 | 1.5416 | 0.7042 | 6.621684e-08 | 2871 |
| 0.0222 | 1.0 | 1.5446 | 0.7042 | 6.619783e-08 | 2872 |
| 0.0285 | 0.9953 | 1.5438 | 0.7042 | 6.617881e-08 | 2873 |
| 0.0297 | 0.9953 | 1.5454 | 0.7042 | 6.61598e-08 | 2874 |
| 0.0216 | 0.9976 | 1.5473 | 0.7042 | 6.6140785e-08 | 2875 |
| 0.0228 | 0.9976 | 1.5481 | 0.7042 | 6.612177e-08 | 2876 |
| 0.0309 | 0.9929 | 1.5479 | 0.7042 | 6.610275e-08 | 2877 |
| 0.0295 | 0.9906 | 1.5439 | 0.7113 | 6.608373e-08 | 2878 |
| 0.0323 | 0.9906 | 1.5386 | 0.7113 | 6.606471e-08 | 2879 |
| 0.0212 | 0.9976 | 1.5400 | 0.7113 | 6.6045686e-08 | 2880 |
| 0.0277 | 0.9953 | 1.5424 | 0.7113 | 6.6026665e-08 | 2881 |
| 0.0291 | 0.9976 | 1.5455 | 0.7042 | 6.6007644e-08 | 2882 |
| 0.0231 | 0.9953 | 1.5454 | 0.7042 | 6.598862e-08 | 2883 |
| 0.0235 | 1.0 | 1.5451 | 0.7042 | 6.5969594e-08 | 2884 |
| 0.0354 | 0.9882 | 1.5456 | 0.7042 | 6.5950566e-08 | 2885 |
| 0.0261 | 0.9953 | 1.5468 | 0.7042 | 6.593154e-08 | 2886 |
| 0.0270 | 0.9976 | 1.5461 | 0.7042 | 6.591251e-08 | 2887 |
| 0.0289 | 0.9906 | 1.5445 | 0.7042 | 6.589348e-08 | 2888 |
| 0.0285 | 0.9929 | 1.5447 | 0.7042 | 6.587445e-08 | 2889 |
| 0.0209 | 0.9976 | 1.5444 | 0.7042 | 6.585542e-08 | 2890 |
| 0.0279 | 0.9929 | 1.5441 | 0.7042 | 6.583638e-08 | 2891 |
| 0.0227 | 1.0 | 1.5459 | 0.7042 | 6.5817346e-08 | 2892 |
| 0.0293 | 0.9976 | 1.5454 | 0.7113 | 6.579831e-08 | 2893 |
| 0.0390 | 0.9929 | 1.5466 | 0.7113 | 6.5779275e-08 | 2894 |
| 0.0247 | 0.9976 | 1.5494 | 0.7042 | 6.576024e-08 | 2895 |
| 0.0245 | 0.9953 | 1.5504 | 0.7042 | 6.5741204e-08 | 2896 |
| 0.0266 | 0.9953 | 1.5526 | 0.7042 | 6.572216e-08 | 2897 |
| 0.0252 | 0.9976 | 1.5532 | 0.7042 | 6.570312e-08 | 2898 |
| 0.0292 | 0.9976 | 1.5518 | 0.7042 | 6.568408e-08 | 2899 |
| 0.0236 | 0.9976 | 1.5521 | 0.7042 | 6.5665034e-08 | 2900 |
| 0.0257 | 0.9929 | 1.5531 | 0.7042 | 6.564599e-08 | 2901 |
| 0.0219 | 1.0 | 1.5523 | 0.7042 | 6.562695e-08 | 2902 |
| 0.0242 | 0.9976 | 1.5499 | 0.7113 | 6.560791e-08 | 2903 |
| 0.0219 | 0.9953 | 1.5490 | 0.7042 | 6.558886e-08 | 2904 |
| 0.0259 | 0.9976 | 1.5521 | 0.7042 | 6.556981e-08 | 2905 |
| 0.0233 | 0.9953 | 1.5514 | 0.7042 | 6.555076e-08 | 2906 |
| 0.0256 | 0.9929 | 1.5529 | 0.7042 | 6.553171e-08 | 2907 |
| 0.0234 | 0.9976 | 1.5540 | 0.7042 | 6.551266e-08 | 2908 |
| 0.0275 | 0.9953 | 1.5549 | 0.7042 | 6.549361e-08 | 2909 |
| 0.0261 | 0.9953 | 1.5542 | 0.7042 | 6.547456e-08 | 2910 |
| 0.0200 | 1.0 | 1.5542 | 0.7042 | 6.54555e-08 | 2911 |
| 0.0309 | 0.9929 | 1.5504 | 0.7042 | 6.5436446e-08 | 2912 |
| 0.0231 | 0.9929 | 1.5485 | 0.7042 | 6.541739e-08 | 2913 |
| 0.0209 | 1.0 | 1.5486 | 0.7042 | 6.539833e-08 | 2914 |
| 0.0193 | 1.0 | 1.5482 | 0.7042 | 6.5379275e-08 | 2915 |
| 0.0204 | 1.0 | 1.5492 | 0.7042 | 6.536022e-08 | 2916 |
| 0.0294 | 0.9929 | 1.5508 | 0.7042 | 6.534116e-08 | 2917 |
| 0.0212 | 0.9976 | 1.5510 | 0.7042 | 6.53221e-08 | 2918 |
| 0.0275 | 0.9929 | 1.5523 | 0.7042 | 6.5303034e-08 | 2919 |
| 0.0255 | 0.9953 | 1.5501 | 0.7042 | 6.528397e-08 | 2920 |
| 0.0262 | 0.9929 | 1.5493 | 0.7042 | 6.5264906e-08 | 2921 |
| 0.0227 | 0.9953 | 1.5474 | 0.7113 | 6.524584e-08 | 2922 |
| 0.0295 | 0.9906 | 1.5479 | 0.7113 | 6.522678e-08 | 2923 |
| 0.0254 | 1.0 | 1.5471 | 0.7183 | 6.5207715e-08 | 2924 |
| 0.0259 | 0.9976 | 1.5492 | 0.7113 | 6.5188644e-08 | 2925 |
| 0.0265 | 0.9953 | 1.5547 | 0.7042 | 6.516957e-08 | 2926 |
| 0.0328 | 0.9929 | 1.5575 | 0.7042 | 6.51505e-08 | 2927 |
| 0.0240 | 0.9953 | 1.5583 | 0.7042 | 6.513143e-08 | 2928 |
| 0.0280 | 0.9953 | 1.5587 | 0.7042 | 6.511236e-08 | 2929 |
| 0.0216 | 0.9976 | 1.5574 | 0.7042 | 6.509329e-08 | 2930 |
| 0.0301 | 0.9953 | 1.5566 | 0.6972 | 6.507422e-08 | 2931 |
| 0.0285 | 0.9906 | 1.5564 | 0.7042 | 6.505515e-08 | 2932 |
| 0.0204 | 1.0 | 1.5551 | 0.7042 | 6.503607e-08 | 2933 |
| 0.0264 | 0.9929 | 1.5549 | 0.7113 | 6.501699e-08 | 2934 |
| 0.0196 | 1.0 | 1.5559 | 0.7042 | 6.499791e-08 | 2935 |
| 0.0238 | 0.9953 | 1.5567 | 0.7042 | 6.4978835e-08 | 2936 |
| 0.0297 | 0.9906 | 1.5578 | 0.7042 | 6.495976e-08 | 2937 |
| 0.0216 | 0.9953 | 1.5577 | 0.7042 | 6.494068e-08 | 2938 |
| 0.0270 | 0.9976 | 1.5653 | 0.7042 | 6.49216e-08 | 2939 |
| 0.0238 | 0.9976 | 1.5679 | 0.7042 | 6.490252e-08 | 2940 |
| 0.0374 | 0.9906 | 1.5689 | 0.7042 | 6.488344e-08 | 2941 |
| 0.0254 | 0.9976 | 1.5661 | 0.7042 | 6.486435e-08 | 2942 |
| 0.0262 | 0.9953 | 1.5643 | 0.7042 | 6.484527e-08 | 2943 |
| 0.0206 | 0.9976 | 1.5643 | 0.7042 | 6.482618e-08 | 2944 |
| 0.0220 | 0.9976 | 1.5654 | 0.7042 | 6.48071e-08 | 2945 |
| 0.0338 | 0.9906 | 1.5634 | 0.7042 | 6.478801e-08 | 2946 |
| 0.0233 | 0.9976 | 1.5618 | 0.7042 | 6.4768926e-08 | 2947 |
| 0.0217 | 1.0 | 1.5624 | 0.7042 | 6.474984e-08 | 2948 |
| 0.0251 | 0.9953 | 1.5674 | 0.7042 | 6.473075e-08 | 2949 |
| 0.0205 | 0.9953 | 1.5705 | 0.7042 | 6.471166e-08 | 2950 |
| 0.0175 | 1.0 | 1.5699 | 0.7042 | 6.4692564e-08 | 2951 |
| 0.0248 | 0.9976 | 1.5694 | 0.7042 | 6.467347e-08 | 2952 |
| 0.0279 | 0.9929 | 1.5654 | 0.7042 | 6.465438e-08 | 2953 |
| 0.0219 | 0.9976 | 1.5651 | 0.7042 | 6.463529e-08 | 2954 |
| 0.0279 | 0.9929 | 1.5667 | 0.7042 | 6.4616195e-08 | 2955 |
| 0.0252 | 0.9953 | 1.5681 | 0.7042 | 6.45971e-08 | 2956 |
| 0.0197 | 1.0 | 1.5678 | 0.7042 | 6.457801e-08 | 2957 |
| 0.0262 | 0.9929 | 1.5657 | 0.7042 | 6.455891e-08 | 2958 |
| 0.0244 | 0.9929 | 1.5637 | 0.7042 | 6.453981e-08 | 2959 |
| 0.0197 | 0.9976 | 1.5661 | 0.7042 | 6.452071e-08 | 2960 |
| 0.0294 | 0.9929 | 1.5672 | 0.7042 | 6.450161e-08 | 2961 |
| 0.0261 | 0.9976 | 1.5690 | 0.7042 | 6.448251e-08 | 2962 |
| 0.0214 | 0.9976 | 1.5684 | 0.7042 | 6.4463414e-08 | 2963 |
| 0.0274 | 0.9976 | 1.5684 | 0.7042 | 6.4444315e-08 | 2964 |
| 0.0302 | 0.9906 | 1.5698 | 0.7042 | 6.4425215e-08 | 2965 |
| 0.0189 | 0.9976 | 1.5691 | 0.7042 | 6.4406116e-08 | 2966 |
| 0.0179 | 1.0 | 1.5683 | 0.7042 | 6.438701e-08 | 2967 |
| 0.0254 | 0.9976 | 1.5666 | 0.7042 | 6.43679e-08 | 2968 |
| 0.0179 | 1.0 | 1.5652 | 0.7042 | 6.43488e-08 | 2969 |
| 0.0202 | 0.9976 | 1.5658 | 0.7042 | 6.432969e-08 | 2970 |
| 0.0228 | 0.9953 | 1.5657 | 0.7042 | 6.431058e-08 | 2971 |
| 0.0242 | 0.9953 | 1.5676 | 0.7042 | 6.429148e-08 | 2972 |
| 0.0219 | 0.9976 | 1.5694 | 0.7042 | 6.427237e-08 | 2973 |
| 0.0208 | 1.0 | 1.5710 | 0.7042 | 6.4253264e-08 | 2974 |
| 0.0244 | 0.9953 | 1.5718 | 0.7042 | 6.423416e-08 | 2975 |
| 0.0201 | 1.0 | 1.5735 | 0.7042 | 6.4215044e-08 | 2976 |
| 0.0258 | 0.9976 | 1.5738 | 0.7042 | 6.419593e-08 | 2977 |
| 0.0170 | 0.9976 | 1.5720 | 0.7042 | 6.417682e-08 | 2978 |
| 0.0177 | 1.0 | 1.5713 | 0.7042 | 6.41577e-08 | 2979 |
| 0.0297 | 0.9953 | 1.5680 | 0.7042 | 6.413859e-08 | 2980 |
| 0.0247 | 0.9953 | 1.5656 | 0.7113 | 6.4119476e-08 | 2981 |
| 0.0256 | 0.9953 | 1.5648 | 0.7042 | 6.410036e-08 | 2982 |
| 0.0220 | 0.9976 | 1.5634 | 0.7042 | 6.408125e-08 | 2983 |
| 0.0187 | 0.9976 | 1.5656 | 0.7042 | 6.4062135e-08 | 2984 |
| 0.0194 | 0.9976 | 1.5669 | 0.7042 | 6.404302e-08 | 2985 |
| 0.0220 | 0.9976 | 1.5656 | 0.7042 | 6.40239e-08 | 2986 |
| 0.0342 | 0.9882 | 1.5654 | 0.7042 | 6.400478e-08 | 2987 |
| 0.0305 | 0.9929 | 1.5653 | 0.7042 | 6.398566e-08 | 2988 |
| 0.0238 | 0.9976 | 1.5650 | 0.7113 | 6.396654e-08 | 2989 |
| 0.0261 | 0.9929 | 1.5661 | 0.7113 | 6.394742e-08 | 2990 |
| 0.0240 | 0.9929 | 1.5657 | 0.7113 | 6.39283e-08 | 2991 |
| 0.0182 | 0.9976 | 1.5654 | 0.7113 | 6.390918e-08 | 2992 |
| 0.0236 | 0.9953 | 1.5683 | 0.7042 | 6.3890056e-08 | 2993 |
| 0.0255 | 0.9953 | 1.5691 | 0.7042 | 6.3870935e-08 | 2994 |
| 0.0221 | 0.9976 | 1.5674 | 0.7042 | 6.3851815e-08 | 2995 |
| 0.0261 | 0.9929 | 1.5680 | 0.7042 | 6.383269e-08 | 2996 |
| 0.0216 | 0.9976 | 1.5703 | 0.7042 | 6.381356e-08 | 2997 |
| 0.0192 | 1.0 | 1.5711 | 0.7042 | 6.379443e-08 | 2998 |
| 0.0220 | 0.9976 | 1.5697 | 0.7042 | 6.37753e-08 | 2999 |
| 0.0152 | 1.0 | 1.5693 | 0.7042 | 6.3756175e-08 | 3000 |
| 0.0292 | 0.9953 | 1.5721 | 0.7042 | 6.373705e-08 | 3001 |
| 0.0169 | 1.0 | 1.5713 | 0.7042 | 6.371792e-08 | 3002 |
| 0.0209 | 0.9976 | 1.5696 | 0.7042 | 6.369879e-08 | 3003 |
| 0.0278 | 0.9906 | 1.5706 | 0.7042 | 6.3679664e-08 | 3004 |
| 0.0218 | 0.9976 | 1.5743 | 0.7042 | 6.3660536e-08 | 3005 |
| 0.0187 | 0.9976 | 1.5770 | 0.7042 | 6.364141e-08 | 3006 |
| 0.0263 | 0.9953 | 1.5793 | 0.7042 | 6.362228e-08 | 3007 |
| 0.0228 | 0.9976 | 1.5813 | 0.7042 | 6.3603146e-08 | 3008 |
| 0.0270 | 0.9976 | 1.5784 | 0.7042 | 6.358401e-08 | 3009 |
| 0.0206 | 1.0 | 1.5749 | 0.7042 | 6.3564876e-08 | 3010 |
| 0.0196 | 1.0 | 1.5756 | 0.7042 | 6.354574e-08 | 3011 |
| 0.0181 | 0.9976 | 1.5768 | 0.7042 | 6.3526606e-08 | 3012 |
| 0.0210 | 1.0 | 1.5753 | 0.7042 | 6.350747e-08 | 3013 |
| 0.0181 | 1.0 | 1.5739 | 0.7042 | 6.3488336e-08 | 3014 |
| 0.0209 | 0.9976 | 1.5761 | 0.7042 | 6.34692e-08 | 3015 |
| 0.0208 | 0.9953 | 1.5771 | 0.7042 | 6.345007e-08 | 3016 |
| 0.0231 | 0.9929 | 1.5767 | 0.7042 | 6.343093e-08 | 3017 |
| 0.0227 | 0.9929 | 1.5784 | 0.7042 | 6.34118e-08 | 3018 |
| 0.0154 | 1.0 | 1.5773 | 0.7042 | 6.339266e-08 | 3019 |
| 0.0202 | 1.0 | 1.5778 | 0.7042 | 6.337352e-08 | 3020 |
| 0.0270 | 0.9906 | 1.5791 | 0.7042 | 6.335438e-08 | 3021 |
| 0.0231 | 0.9976 | 1.5802 | 0.7042 | 6.3335236e-08 | 3022 |
| 0.0226 | 0.9976 | 1.5824 | 0.7042 | 6.3316094e-08 | 3023 |
| 0.0238 | 0.9976 | 1.5832 | 0.7042 | 6.329695e-08 | 3024 |
| 0.0249 | 1.0 | 1.5845 | 0.7042 | 6.327781e-08 | 3025 |
| 0.0250 | 0.9953 | 1.5791 | 0.7042 | 6.325867e-08 | 3026 |
| 0.0279 | 0.9929 | 1.5778 | 0.7042 | 6.3239526e-08 | 3027 |
| 0.0216 | 0.9976 | 1.5812 | 0.7042 | 6.3220384e-08 | 3028 |
| 0.0250 | 0.9953 | 1.5805 | 0.6972 | 6.320124e-08 | 3029 |
| 0.0179 | 1.0 | 1.5804 | 0.6972 | 6.31821e-08 | 3030 |
| 0.0179 | 0.9953 | 1.5803 | 0.7042 | 6.316296e-08 | 3031 |
| 0.0188 | 1.0 | 1.5821 | 0.7042 | 6.3143816e-08 | 3032 |
| 0.0227 | 0.9953 | 1.5826 | 0.7042 | 6.3124666e-08 | 3033 |
| 0.0310 | 0.9906 | 1.5825 | 0.7042 | 6.310552e-08 | 3034 |
| 0.0312 | 0.9929 | 1.5809 | 0.6972 | 6.308637e-08 | 3035 |
| 0.0236 | 0.9976 | 1.5800 | 0.7042 | 6.306722e-08 | 3036 |
| 0.0216 | 1.0 | 1.5792 | 0.7042 | 6.304807e-08 | 3037 |
| 0.0305 | 0.9953 | 1.5807 | 0.7042 | 6.302892e-08 | 3038 |
| 0.0205 | 0.9976 | 1.5825 | 0.7042 | 6.300977e-08 | 3039 |
| 0.0222 | 0.9953 | 1.5833 | 0.7042 | 6.299062e-08 | 3040 |
| 0.0220 | 0.9953 | 1.5839 | 0.7042 | 6.297147e-08 | 3041 |
| 0.0211 | 1.0 | 1.5863 | 0.7042 | 6.2952324e-08 | 3042 |
| 0.0188 | 0.9976 | 1.5858 | 0.7042 | 6.2933175e-08 | 3043 |
| 0.0203 | 0.9976 | 1.5860 | 0.7042 | 6.2914026e-08 | 3044 |
| 0.0200 | 0.9976 | 1.5858 | 0.7042 | 6.289488e-08 | 3045 |
| 0.0260 | 0.9953 | 1.5863 | 0.7042 | 6.287573e-08 | 3046 |
| 0.0188 | 1.0 | 1.5862 | 0.7042 | 6.285658e-08 | 3047 |
| 0.0253 | 0.9953 | 1.5838 | 0.7042 | 6.283742e-08 | 3048 |
| 0.0242 | 0.9953 | 1.5823 | 0.7042 | 6.2818266e-08 | 3049 |
| 0.0222 | 0.9953 | 1.5814 | 0.7042 | 6.279911e-08 | 3050 |
| 0.0266 | 0.9953 | 1.5819 | 0.7042 | 6.2779954e-08 | 3051 |
| 0.0195 | 0.9976 | 1.5831 | 0.7042 | 6.27608e-08 | 3052 |
| 0.0235 | 0.9953 | 1.5840 | 0.7042 | 6.274164e-08 | 3053 |
| 0.0200 | 0.9953 | 1.5828 | 0.7042 | 6.2722485e-08 | 3054 |
| 0.0263 | 0.9953 | 1.5834 | 0.7042 | 6.270333e-08 | 3055 |
| 0.0185 | 0.9976 | 1.5836 | 0.7042 | 6.268417e-08 | 3056 |
| 0.0239 | 0.9953 | 1.5785 | 0.7042 | 6.2665016e-08 | 3057 |
| 0.0174 | 1.0 | 1.5779 | 0.7042 | 6.264586e-08 | 3058 |
| 0.0220 | 0.9953 | 1.5795 | 0.7042 | 6.2626704e-08 | 3059 |
| 0.0203 | 0.9976 | 1.5835 | 0.7042 | 6.260755e-08 | 3060 |
| 0.0180 | 0.9976 | 1.5856 | 0.7042 | 6.258839e-08 | 3061 |
| 0.0231 | 0.9929 | 1.5846 | 0.7042 | 6.2569235e-08 | 3062 |
| 0.0172 | 0.9976 | 1.5834 | 0.7042 | 6.255008e-08 | 3063 |
| 0.0320 | 0.9906 | 1.5802 | 0.7042 | 6.253092e-08 | 3064 |
| 0.0206 | 0.9953 | 1.5824 | 0.7042 | 6.251176e-08 | 3065 |
| 0.0175 | 1.0 | 1.5833 | 0.7042 | 6.2492596e-08 | 3066 |
| 0.0206 | 0.9976 | 1.5819 | 0.7042 | 6.247343e-08 | 3067 |
| 0.0227 | 0.9976 | 1.5810 | 0.7042 | 6.245427e-08 | 3068 |
| 0.0212 | 0.9953 | 1.5808 | 0.7042 | 6.2435106e-08 | 3069 |
| 0.0303 | 0.9929 | 1.5806 | 0.7042 | 6.241594e-08 | 3070 |
| 0.0224 | 0.9976 | 1.5812 | 0.7042 | 6.239678e-08 | 3071 |
| 0.0286 | 0.9906 | 1.5819 | 0.7042 | 6.2377616e-08 | 3072 |
| 0.0262 | 0.9929 | 1.5820 | 0.7042 | 6.235845e-08 | 3073 |
| 0.0258 | 0.9929 | 1.5832 | 0.7042 | 6.233929e-08 | 3074 |
| 0.0322 | 0.9906 | 1.5823 | 0.7042 | 6.2320126e-08 | 3075 |
| 0.0223 | 0.9929 | 1.5809 | 0.7042 | 6.230096e-08 | 3076 |
| 0.0244 | 0.9953 | 1.5805 | 0.7042 | 6.22818e-08 | 3077 |
| 0.0189 | 1.0 | 1.5809 | 0.7042 | 6.2262636e-08 | 3078 |
| 0.0213 | 0.9953 | 1.5810 | 0.7042 | 6.224347e-08 | 3079 |
| 0.0161 | 1.0 | 1.5811 | 0.7042 | 6.222431e-08 | 3080 |
| 0.0238 | 0.9976 | 1.5832 | 0.7042 | 6.2205146e-08 | 3081 |
| 0.0166 | 0.9976 | 1.5837 | 0.7042 | 6.218598e-08 | 3082 |
| 0.0165 | 1.0 | 1.5821 | 0.7042 | 6.216682e-08 | 3083 |
| 0.0192 | 1.0 | 1.5795 | 0.7042 | 6.2147656e-08 | 3084 |
| 0.0202 | 0.9976 | 1.5796 | 0.7042 | 6.212849e-08 | 3085 |
| 0.0193 | 0.9976 | 1.5809 | 0.7042 | 6.210932e-08 | 3086 |
| 0.0157 | 1.0 | 1.5821 | 0.7042 | 6.209015e-08 | 3087 |
| 0.0218 | 0.9929 | 1.5834 | 0.7042 | 6.207098e-08 | 3088 |
| 0.0196 | 0.9976 | 1.5903 | 0.7042 | 6.205181e-08 | 3089 |
| 0.0267 | 0.9976 | 1.5917 | 0.7042 | 6.203264e-08 | 3090 |
| 0.0165 | 0.9976 | 1.5937 | 0.7042 | 6.201347e-08 | 3091 |
| 0.0209 | 1.0 | 1.5921 | 0.7042 | 6.19943e-08 | 3092 |
| 0.0234 | 0.9976 | 1.5901 | 0.7042 | 6.197513e-08 | 3093 |
| 0.0178 | 0.9976 | 1.5892 | 0.7042 | 6.195596e-08 | 3094 |
| 0.0203 | 0.9953 | 1.5885 | 0.7042 | 6.193679e-08 | 3095 |
| 0.0254 | 0.9953 | 1.5869 | 0.7042 | 6.191762e-08 | 3096 |
| 0.0192 | 0.9976 | 1.5868 | 0.7042 | 6.189845e-08 | 3097 |
| 0.0183 | 1.0 | 1.5885 | 0.7042 | 6.187928e-08 | 3098 |
| 0.0249 | 0.9929 | 1.5913 | 0.7042 | 6.1860106e-08 | 3099 |
| 0.0240 | 0.9953 | 1.5962 | 0.7042 | 6.1840936e-08 | 3100 |
| 0.0252 | 0.9976 | 1.5994 | 0.7042 | 6.1821765e-08 | 3101 |
| 0.0342 | 0.9929 | 1.5971 | 0.7042 | 6.1802595e-08 | 3102 |
| 0.0197 | 1.0 | 1.5882 | 0.7042 | 6.1783425e-08 | 3103 |
| 0.0151 | 1.0 | 1.5865 | 0.7113 | 6.1764254e-08 | 3104 |
| 0.0210 | 0.9976 | 1.5883 | 0.7042 | 6.1745084e-08 | 3105 |
| 0.0307 | 0.9929 | 1.5905 | 0.7042 | 6.172591e-08 | 3106 |
| 0.0204 | 0.9953 | 1.5939 | 0.7042 | 6.170674e-08 | 3107 |
| 0.0321 | 0.9953 | 1.5964 | 0.7042 | 6.168757e-08 | 3108 |
| 0.0277 | 0.9953 | 1.5979 | 0.7042 | 6.16684e-08 | 3109 |
| 0.0199 | 0.9953 | 1.5996 | 0.7042 | 6.164923e-08 | 3110 |
| 0.0182 | 0.9976 | 1.5997 | 0.7042 | 6.163006e-08 | 3111 |
| 0.0152 | 1.0 | 1.5990 | 0.7042 | 6.161089e-08 | 3112 |
| 0.0288 | 0.9929 | 1.5964 | 0.7042 | 6.159172e-08 | 3113 |
| 0.0195 | 0.9953 | 1.5957 | 0.7042 | 6.157255e-08 | 3114 |
| 0.0217 | 0.9976 | 1.5977 | 0.7042 | 6.155338e-08 | 3115 |
| 0.0169 | 1.0 | 1.5977 | 0.7042 | 6.15342e-08 | 3116 |
| 0.0194 | 0.9976 | 1.5990 | 0.7042 | 6.1515024e-08 | 3117 |
| 0.0174 | 0.9976 | 1.5982 | 0.7042 | 6.149585e-08 | 3118 |
| 0.0208 | 0.9953 | 1.5984 | 0.7042 | 6.147667e-08 | 3119 |
| 0.0231 | 0.9929 | 1.5987 | 0.7042 | 6.145749e-08 | 3120 |
| 0.0255 | 0.9953 | 1.5984 | 0.7042 | 6.1438314e-08 | 3121 |
| 0.0153 | 1.0 | 1.5981 | 0.7042 | 6.1419136e-08 | 3122 |
| 0.0146 | 1.0 | 1.5975 | 0.7042 | 6.139996e-08 | 3123 |
| 0.0185 | 0.9976 | 1.5973 | 0.7042 | 6.138078e-08 | 3124 |
| 0.0243 | 0.9953 | 1.5971 | 0.7042 | 6.1361604e-08 | 3125 |
| 0.0141 | 1.0 | 1.5967 | 0.7042 | 6.1342426e-08 | 3126 |
| 0.0187 | 0.9976 | 1.5980 | 0.7042 | 6.132325e-08 | 3127 |
| 0.0231 | 0.9953 | 1.5974 | 0.7042 | 6.130407e-08 | 3128 |
| 0.0240 | 0.9929 | 1.5972 | 0.7042 | 6.1284894e-08 | 3129 |
| 0.0227 | 0.9976 | 1.5964 | 0.7042 | 6.1265716e-08 | 3130 |
| 0.0151 | 1.0 | 1.5934 | 0.7042 | 6.124654e-08 | 3131 |
| 0.0163 | 1.0 | 1.5929 | 0.7042 | 6.122736e-08 | 3132 |
| 0.0282 | 0.9953 | 1.5949 | 0.7042 | 6.120818e-08 | 3133 |
| 0.0186 | 1.0 | 1.5959 | 0.7042 | 6.1189006e-08 | 3134 |
| 0.0183 | 1.0 | 1.5969 | 0.7042 | 6.116983e-08 | 3135 |
| 0.0171 | 1.0 | 1.5965 | 0.7042 | 6.115065e-08 | 3136 |
| 0.0155 | 0.9976 | 1.5973 | 0.7042 | 6.113147e-08 | 3137 |
| 0.0177 | 0.9976 | 1.5995 | 0.7042 | 6.1112296e-08 | 3138 |
| 0.0233 | 0.9929 | 1.5984 | 0.7042 | 6.109312e-08 | 3139 |
| 0.0206 | 0.9976 | 1.5999 | 0.7042 | 6.107394e-08 | 3140 |
| 0.0246 | 0.9953 | 1.6000 | 0.7042 | 6.105476e-08 | 3141 |
| 0.0155 | 1.0 | 1.6010 | 0.7042 | 6.1035585e-08 | 3142 |
| 0.0152 | 1.0 | 1.6014 | 0.7042 | 6.101641e-08 | 3143 |
| 0.0212 | 0.9953 | 1.6012 | 0.7042 | 6.099723e-08 | 3144 |
| 0.0228 | 0.9976 | 1.6000 | 0.7042 | 6.097805e-08 | 3145 |
| 0.0193 | 0.9976 | 1.5975 | 0.6972 | 6.0958875e-08 | 3146 |
| 0.0174 | 0.9976 | 1.5964 | 0.6972 | 6.09397e-08 | 3147 |
| 0.0202 | 0.9953 | 1.5985 | 0.7042 | 6.092052e-08 | 3148 |
| 0.0223 | 0.9976 | 1.5987 | 0.7042 | 6.090134e-08 | 3149 |
| 0.0249 | 0.9906 | 1.6020 | 0.7042 | 6.0882165e-08 | 3150 |
| 0.0148 | 1.0 | 1.6035 | 0.7042 | 6.086299e-08 | 3151 |
| 0.0195 | 1.0 | 1.6044 | 0.7042 | 6.084381e-08 | 3152 |
| 0.0175 | 0.9976 | 1.6041 | 0.7042 | 6.082463e-08 | 3153 |
| 0.0171 | 0.9976 | 1.6032 | 0.7042 | 6.0805455e-08 | 3154 |
| 0.0256 | 0.9906 | 1.6012 | 0.7042 | 6.078628e-08 | 3155 |
| 0.0189 | 0.9953 | 1.6011 | 0.7042 | 6.07671e-08 | 3156 |
| 0.0228 | 0.9953 | 1.6034 | 0.7042 | 6.074792e-08 | 3157 |
| 0.0171 | 1.0 | 1.6059 | 0.7042 | 6.0728745e-08 | 3158 |
| 0.0159 | 1.0 | 1.6050 | 0.7042 | 6.070957e-08 | 3159 |
| 0.0228 | 0.9953 | 1.6049 | 0.7042 | 6.069039e-08 | 3160 |
| 0.0228 | 0.9953 | 1.6055 | 0.7042 | 6.067121e-08 | 3161 |
| 0.0153 | 1.0 | 1.6031 | 0.7042 | 6.0652035e-08 | 3162 |
| 0.0224 | 0.9953 | 1.6020 | 0.7042 | 6.063286e-08 | 3163 |
| 0.0190 | 0.9953 | 1.6020 | 0.7042 | 6.061368e-08 | 3164 |
| 0.0172 | 0.9976 | 1.6047 | 0.7042 | 6.05945e-08 | 3165 |
| 0.0285 | 0.9929 | 1.6061 | 0.7042 | 6.0575324e-08 | 3166 |
| 0.0193 | 0.9976 | 1.6061 | 0.7042 | 6.055615e-08 | 3167 |
| 0.0196 | 0.9976 | 1.6072 | 0.7042 | 6.053697e-08 | 3168 |
| 0.0166 | 1.0 | 1.6068 | 0.7042 | 6.051779e-08 | 3169 |
| 0.0270 | 0.9953 | 1.6051 | 0.7042 | 6.0498614e-08 | 3170 |
| 0.0121 | 1.0 | 1.6047 | 0.7042 | 6.047944e-08 | 3171 |
| 0.0140 | 1.0 | 1.6039 | 0.7042 | 6.046026e-08 | 3172 |
| 0.0258 | 0.9953 | 1.6023 | 0.7042 | 6.044108e-08 | 3173 |
| 0.0148 | 1.0 | 1.6021 | 0.7042 | 6.0421904e-08 | 3174 |
| 0.0208 | 0.9929 | 1.6035 | 0.7042 | 6.0402726e-08 | 3175 |
| 0.0152 | 0.9976 | 1.6037 | 0.6972 | 6.038355e-08 | 3176 |
| 0.0131 | 1.0 | 1.6036 | 0.7042 | 6.036437e-08 | 3177 |
| 0.0144 | 1.0 | 1.6053 | 0.7042 | 6.0345194e-08 | 3178 |
| 0.0199 | 0.9953 | 1.6067 | 0.7042 | 6.0326016e-08 | 3179 |
| 0.0162 | 0.9976 | 1.6076 | 0.7042 | 6.030684e-08 | 3180 |
| 0.0212 | 0.9929 | 1.6092 | 0.7042 | 6.028766e-08 | 3181 |
| 0.0171 | 1.0 | 1.6099 | 0.7042 | 6.026848e-08 | 3182 |
| 0.0153 | 1.0 | 1.6085 | 0.7042 | 6.0249306e-08 | 3183 |
| 0.0182 | 0.9953 | 1.6058 | 0.7042 | 6.023013e-08 | 3184 |
| 0.0211 | 0.9976 | 1.6054 | 0.7042 | 6.021095e-08 | 3185 |
| 0.0206 | 0.9953 | 1.6082 | 0.7042 | 6.019177e-08 | 3186 |
| 0.0227 | 0.9976 | 1.6114 | 0.7042 | 6.0172596e-08 | 3187 |
| 0.0177 | 1.0 | 1.6120 | 0.7042 | 6.015342e-08 | 3188 |
| 0.0216 | 0.9953 | 1.6101 | 0.7042 | 6.013424e-08 | 3189 |
| 0.0261 | 0.9929 | 1.6102 | 0.7042 | 6.011506e-08 | 3190 |
| 0.0174 | 1.0 | 1.6115 | 0.7042 | 6.0095886e-08 | 3191 |
| 0.0227 | 0.9906 | 1.6116 | 0.7042 | 6.007671e-08 | 3192 |
| 0.0169 | 1.0 | 1.6111 | 0.7042 | 6.005753e-08 | 3193 |
| 0.0214 | 0.9953 | 1.6103 | 0.7042 | 6.003835e-08 | 3194 |
| 0.0167 | 0.9976 | 1.6090 | 0.7042 | 6.0019175e-08 | 3195 |
| 0.0201 | 0.9953 | 1.6073 | 0.7113 | 6e-08 | 3196 |
| 0.0215 | 0.9953 | 1.6073 | 0.7042 | 5.998082e-08 | 3197 |
| 0.0129 | 1.0 | 1.6066 | 0.7042 | 5.996164e-08 | 3198 |
| 0.0166 | 1.0 | 1.6077 | 0.7042 | 5.9942465e-08 | 3199 |
| 0.0269 | 0.9906 | 1.6103 | 0.7042 | 5.992329e-08 | 3200 |
| 0.0189 | 0.9976 | 1.6106 | 0.7042 | 5.990411e-08 | 3201 |
| 0.0276 | 0.9882 | 1.6134 | 0.7042 | 5.988493e-08 | 3202 |
| 0.0189 | 1.0 | 1.6132 | 0.7042 | 5.9865755e-08 | 3203 |
| 0.0177 | 1.0 | 1.6115 | 0.7042 | 5.984658e-08 | 3204 |
| 0.0222 | 0.9976 | 1.6126 | 0.7042 | 5.98274e-08 | 3205 |
| 0.0159 | 0.9976 | 1.6141 | 0.7042 | 5.980822e-08 | 3206 |
| 0.0247 | 0.9976 | 1.6151 | 0.7042 | 5.9789045e-08 | 3207 |
| 0.0163 | 1.0 | 1.6147 | 0.7042 | 5.976987e-08 | 3208 |
| 0.0239 | 0.9976 | 1.6149 | 0.7042 | 5.975069e-08 | 3209 |
| 0.0212 | 0.9953 | 1.6163 | 0.7042 | 5.973152e-08 | 3210 |
| 0.0213 | 0.9953 | 1.6160 | 0.7042 | 5.971235e-08 | 3211 |
| 0.0252 | 0.9953 | 1.6169 | 0.7042 | 5.969318e-08 | 3212 |
| 0.0275 | 0.9929 | 1.6165 | 0.7042 | 5.967401e-08 | 3213 |
| 0.0344 | 0.9906 | 1.6146 | 0.7042 | 5.965484e-08 | 3214 |
| 0.0161 | 1.0 | 1.6134 | 0.7042 | 5.963567e-08 | 3215 |
| 0.0178 | 0.9953 | 1.6140 | 0.7042 | 5.96165e-08 | 3216 |
| 0.0275 | 0.9953 | 1.6145 | 0.7042 | 5.9597323e-08 | 3217 |
| 0.0176 | 0.9976 | 1.6159 | 0.7042 | 5.957815e-08 | 3218 |
| 0.0243 | 0.9953 | 1.6185 | 0.7042 | 5.9558978e-08 | 3219 |
| 0.0140 | 1.0 | 1.6189 | 0.7042 | 5.9539808e-08 | 3220 |
| 0.0255 | 0.9929 | 1.6199 | 0.7042 | 5.9520637e-08 | 3221 |
| 0.0212 | 0.9953 | 1.6208 | 0.7042 | 5.9501467e-08 | 3222 |
| 0.0169 | 0.9953 | 1.6166 | 0.7042 | 5.9482296e-08 | 3223 |
| 0.0192 | 0.9976 | 1.6130 | 0.7042 | 5.9463126e-08 | 3224 |
| 0.0152 | 0.9976 | 1.6122 | 0.7042 | 5.9443956e-08 | 3225 |
| 0.0156 | 1.0 | 1.6142 | 0.7042 | 5.9424785e-08 | 3226 |
| 0.0206 | 0.9953 | 1.6129 | 0.7042 | 5.9405615e-08 | 3227 |
| 0.0174 | 0.9976 | 1.6129 | 0.7042 | 5.9386444e-08 | 3228 |
| 0.0191 | 0.9976 | 1.6132 | 0.7042 | 5.9367274e-08 | 3229 |
| 0.0170 | 0.9976 | 1.6128 | 0.7042 | 5.9348103e-08 | 3230 |
| 0.0195 | 0.9953 | 1.6134 | 0.7042 | 5.9328933e-08 | 3231 |
| 0.0232 | 0.9953 | 1.6164 | 0.7042 | 5.9309762e-08 | 3232 |
| 0.0136 | 1.0 | 1.6190 | 0.7042 | 5.9290596e-08 | 3233 |
| 0.0175 | 0.9976 | 1.6188 | 0.7042 | 5.927143e-08 | 3234 |
| 0.0269 | 0.9953 | 1.6198 | 0.7042 | 5.925226e-08 | 3235 |
| 0.0171 | 1.0 | 1.6212 | 0.7042 | 5.9233095e-08 | 3236 |
| 0.0170 | 0.9976 | 1.6188 | 0.7042 | 5.9213928e-08 | 3237 |
| 0.0175 | 0.9976 | 1.6155 | 0.7042 | 5.919476e-08 | 3238 |
| 0.0230 | 0.9953 | 1.6146 | 0.7042 | 5.9175594e-08 | 3239 |
| 0.0160 | 0.9976 | 1.6140 | 0.7042 | 5.9156427e-08 | 3240 |
| 0.0300 | 0.9953 | 1.6164 | 0.7042 | 5.913726e-08 | 3241 |
| 0.0124 | 1.0 | 1.6196 | 0.7042 | 5.9118094e-08 | 3242 |
| 0.0193 | 0.9976 | 1.6208 | 0.7042 | 5.9098927e-08 | 3243 |
| 0.0183 | 0.9976 | 1.6180 | 0.7042 | 5.907976e-08 | 3244 |
| 0.0170 | 1.0 | 1.6171 | 0.7042 | 5.9060593e-08 | 3245 |
| 0.0155 | 1.0 | 1.6188 | 0.7042 | 5.904143e-08 | 3246 |
| 0.0183 | 1.0 | 1.6221 | 0.7042 | 5.9022266e-08 | 3247 |
| 0.0240 | 0.9929 | 1.6219 | 0.7042 | 5.9003103e-08 | 3248 |
| 0.0119 | 1.0 | 1.6225 | 0.7042 | 5.898394e-08 | 3249 |
| 0.0195 | 0.9976 | 1.6234 | 0.7042 | 5.8964776e-08 | 3250 |
| 0.0154 | 0.9976 | 1.6232 | 0.7042 | 5.8945613e-08 | 3251 |
| 0.0244 | 0.9953 | 1.6210 | 0.7042 | 5.892645e-08 | 3252 |
| 0.0135 | 1.0 | 1.6218 | 0.7042 | 5.8907286e-08 | 3253 |
| 0.0154 | 1.0 | 1.6221 | 0.7042 | 5.8888123e-08 | 3254 |
| 0.0137 | 0.9976 | 1.6216 | 0.7042 | 5.886896e-08 | 3255 |
| 0.0213 | 0.9976 | 1.6230 | 0.7042 | 5.88498e-08 | 3256 |
| 0.0257 | 0.9929 | 1.6263 | 0.7042 | 5.883064e-08 | 3257 |
| 0.0224 | 0.9976 | 1.6261 | 0.7042 | 5.881148e-08 | 3258 |
| 0.0137 | 1.0 | 1.6205 | 0.7042 | 5.879232e-08 | 3259 |
| 0.0129 | 1.0 | 1.6210 | 0.7042 | 5.877316e-08 | 3260 |
| 0.0137 | 1.0 | 1.6223 | 0.7042 | 5.8754e-08 | 3261 |
| 0.0168 | 0.9976 | 1.6234 | 0.7042 | 5.873484e-08 | 3262 |
| 0.0210 | 0.9976 | 1.6238 | 0.7042 | 5.871568e-08 | 3263 |
| 0.0206 | 0.9953 | 1.6252 | 0.7042 | 5.869652e-08 | 3264 |
| 0.0167 | 1.0 | 1.6263 | 0.7042 | 5.867736e-08 | 3265 |
| 0.0130 | 1.0 | 1.6257 | 0.7042 | 5.8658205e-08 | 3266 |
| 0.0127 | 1.0 | 1.6243 | 0.7042 | 5.863905e-08 | 3267 |
| 0.0163 | 0.9976 | 1.6250 | 0.7042 | 5.8619893e-08 | 3268 |
| 0.0140 | 1.0 | 1.6255 | 0.7042 | 5.8600737e-08 | 3269 |
| 0.0236 | 0.9976 | 1.6237 | 0.7042 | 5.858158e-08 | 3270 |
| 0.0217 | 0.9953 | 1.6246 | 0.7042 | 5.8562424e-08 | 3271 |
| 0.0154 | 0.9976 | 1.6250 | 0.7042 | 5.8543268e-08 | 3272 |
| 0.0170 | 0.9976 | 1.6254 | 0.7042 | 5.8524112e-08 | 3273 |
| 0.0198 | 0.9953 | 1.6252 | 0.7042 | 5.8504956e-08 | 3274 |
| 0.0107 | 1.0 | 1.6283 | 0.7042 | 5.8485803e-08 | 3275 |
| 0.0174 | 0.9976 | 1.6295 | 0.7042 | 5.846665e-08 | 3276 |
| 0.0192 | 0.9976 | 1.6310 | 0.7042 | 5.8447498e-08 | 3277 |
| 0.0176 | 0.9976 | 1.6304 | 0.7042 | 5.8428345e-08 | 3278 |
| 0.0190 | 0.9953 | 1.6302 | 0.7042 | 5.8409192e-08 | 3279 |
| 0.0165 | 1.0 | 1.6307 | 0.7042 | 5.839004e-08 | 3280 |
| 0.0189 | 0.9953 | 1.6311 | 0.7042 | 5.8370887e-08 | 3281 |
| 0.0176 | 1.0 | 1.6288 | 0.7042 | 5.8351734e-08 | 3282 |
| 0.0220 | 0.9976 | 1.6265 | 0.7042 | 5.8332585e-08 | 3283 |
| 0.0229 | 0.9953 | 1.6270 | 0.7042 | 5.8313436e-08 | 3284 |
| 0.0165 | 1.0 | 1.6271 | 0.7042 | 5.8294287e-08 | 3285 |
| 0.0140 | 1.0 | 1.6262 | 0.7042 | 5.8275138e-08 | 3286 |
| 0.0189 | 0.9976 | 1.6284 | 0.7042 | 5.825599e-08 | 3287 |
| 0.0142 | 1.0 | 1.6300 | 0.7042 | 5.823684e-08 | 3288 |
| 0.0159 | 1.0 | 1.6295 | 0.7042 | 5.821769e-08 | 3289 |
| 0.0255 | 0.9953 | 1.6270 | 0.7042 | 5.8198545e-08 | 3290 |
| 0.0195 | 0.9953 | 1.6277 | 0.7042 | 5.81794e-08 | 3291 |
| 0.0210 | 0.9953 | 1.6320 | 0.7042 | 5.8160254e-08 | 3292 |
| 0.0283 | 0.9906 | 1.6296 | 0.7042 | 5.8141108e-08 | 3293 |
| 0.0192 | 0.9953 | 1.6286 | 0.7042 | 5.8121962e-08 | 3294 |
| 0.0172 | 1.0 | 1.6278 | 0.7042 | 5.8102817e-08 | 3295 |
| 0.0136 | 1.0 | 1.6273 | 0.7042 | 5.808367e-08 | 3296 |
| 0.0131 | 1.0 | 1.6273 | 0.7042 | 5.8064526e-08 | 3297 |
| 0.0213 | 0.9953 | 1.6278 | 0.7042 | 5.8045384e-08 | 3298 |
| 0.0266 | 0.9929 | 1.6266 | 0.7113 | 5.802624e-08 | 3299 |
| 0.0145 | 1.0 | 1.6259 | 0.7183 | 5.80071e-08 | 3300 |
| 0.0191 | 0.9953 | 1.6265 | 0.7113 | 5.7987958e-08 | 3301 |
| 0.0210 | 0.9906 | 1.6311 | 0.7042 | 5.7968816e-08 | 3302 |
| 0.0200 | 0.9976 | 1.6336 | 0.7042 | 5.7949674e-08 | 3303 |
| 0.0179 | 1.0 | 1.6341 | 0.7042 | 5.793053e-08 | 3304 |
| 0.0245 | 0.9929 | 1.6349 | 0.7042 | 5.7911393e-08 | 3305 |
| 0.0226 | 0.9929 | 1.6357 | 0.7042 | 5.7892255e-08 | 3306 |
| 0.0131 | 1.0 | 1.6358 | 0.7042 | 5.7873116e-08 | 3307 |
| 0.0197 | 0.9976 | 1.6377 | 0.7042 | 5.7853978e-08 | 3308 |
| 0.0164 | 0.9976 | 1.6405 | 0.7042 | 5.783484e-08 | 3309 |
| 0.0157 | 1.0 | 1.6379 | 0.7042 | 5.78157e-08 | 3310 |
| 0.0184 | 0.9976 | 1.6327 | 0.7042 | 5.7796566e-08 | 3311 |
| 0.0123 | 1.0 | 1.6306 | 0.7042 | 5.777743e-08 | 3312 |
| 0.0155 | 0.9976 | 1.6306 | 0.7042 | 5.7758296e-08 | 3313 |
| 0.0200 | 0.9976 | 1.6305 | 0.7042 | 5.773916e-08 | 3314 |
| 0.0212 | 0.9953 | 1.6325 | 0.7042 | 5.7720026e-08 | 3315 |
| 0.0239 | 0.9953 | 1.6350 | 0.7042 | 5.770089e-08 | 3316 |
| 0.0163 | 0.9976 | 1.6345 | 0.7042 | 5.768176e-08 | 3317 |
| 0.0157 | 0.9976 | 1.6336 | 0.7042 | 5.766263e-08 | 3318 |
| 0.0140 | 1.0 | 1.6332 | 0.7042 | 5.7643497e-08 | 3319 |
| 0.0250 | 0.9953 | 1.6330 | 0.7042 | 5.7624366e-08 | 3320 |
| 0.0148 | 1.0 | 1.6336 | 0.7042 | 5.7605234e-08 | 3321 |
| 0.0181 | 0.9976 | 1.6326 | 0.7042 | 5.7586103e-08 | 3322 |
| 0.0145 | 1.0 | 1.6331 | 0.7042 | 5.7566975e-08 | 3323 |
| 0.0200 | 0.9953 | 1.6335 | 0.7042 | 5.7547847e-08 | 3324 |
| 0.0242 | 0.9929 | 1.6329 | 0.7042 | 5.752872e-08 | 3325 |
| 0.0116 | 1.0 | 1.6328 | 0.7042 | 5.750959e-08 | 3326 |
| 0.0185 | 0.9953 | 1.6336 | 0.7042 | 5.7490464e-08 | 3327 |
| 0.0220 | 0.9976 | 1.6328 | 0.7042 | 5.7471336e-08 | 3328 |
| 0.0164 | 0.9976 | 1.6323 | 0.7042 | 5.7452212e-08 | 3329 |
| 0.0154 | 0.9976 | 1.6316 | 0.7042 | 5.7433088e-08 | 3330 |
| 0.0114 | 1.0 | 1.6302 | 0.7113 | 5.7413963e-08 | 3331 |
| 0.0164 | 1.0 | 1.6320 | 0.7042 | 5.739484e-08 | 3332 |
| 0.0175 | 0.9976 | 1.6311 | 0.7042 | 5.7375715e-08 | 3333 |
| 0.0158 | 1.0 | 1.6308 | 0.7113 | 5.735659e-08 | 3334 |
| 0.0154 | 1.0 | 1.6341 | 0.7042 | 5.733747e-08 | 3335 |
| 0.0180 | 0.9929 | 1.6336 | 0.7042 | 5.731835e-08 | 3336 |
| 0.0167 | 0.9976 | 1.6338 | 0.7042 | 5.729923e-08 | 3337 |
| 0.0265 | 0.9882 | 1.6373 | 0.7042 | 5.7280108e-08 | 3338 |
| 0.0170 | 0.9953 | 1.6407 | 0.7042 | 5.7260987e-08 | 3339 |
| 0.0164 | 1.0 | 1.6418 | 0.7042 | 5.724187e-08 | 3340 |
| 0.0263 | 0.9929 | 1.6417 | 0.7042 | 5.7222753e-08 | 3341 |
| 0.0136 | 1.0 | 1.6414 | 0.7042 | 5.7203636e-08 | 3342 |
| 0.0167 | 0.9976 | 1.6404 | 0.7042 | 5.718452e-08 | 3343 |
| 0.0246 | 0.9953 | 1.6385 | 0.7042 | 5.71654e-08 | 3344 |
| 0.0200 | 0.9976 | 1.6406 | 0.7042 | 5.7146284e-08 | 3345 |
| 0.0192 | 0.9953 | 1.6387 | 0.7042 | 5.712717e-08 | 3346 |
| 0.0130 | 0.9976 | 1.6344 | 0.7042 | 5.7108057e-08 | 3347 |
| 0.0164 | 0.9953 | 1.6324 | 0.7113 | 5.7088943e-08 | 3348 |
| 0.0175 | 0.9976 | 1.6331 | 0.7113 | 5.706983e-08 | 3349 |
| 0.0225 | 0.9906 | 1.6357 | 0.7042 | 5.7050716e-08 | 3350 |
| 0.0127 | 1.0 | 1.6378 | 0.7042 | 5.7031606e-08 | 3351 |
| 0.0216 | 0.9953 | 1.6396 | 0.7042 | 5.7012496e-08 | 3352 |
| 0.0150 | 0.9976 | 1.6428 | 0.7042 | 5.6993386e-08 | 3353 |
| 0.0184 | 0.9976 | 1.6419 | 0.7042 | 5.6974276e-08 | 3354 |
| 0.0151 | 0.9953 | 1.6422 | 0.7042 | 5.6955166e-08 | 3355 |
| 0.0165 | 1.0 | 1.6421 | 0.7042 | 5.693606e-08 | 3356 |
| 0.0133 | 1.0 | 1.6421 | 0.7042 | 5.6916953e-08 | 3357 |
| 0.0154 | 0.9976 | 1.6422 | 0.7042 | 5.6897846e-08 | 3358 |
| 0.0146 | 1.0 | 1.6398 | 0.7042 | 5.687874e-08 | 3359 |
| 0.0181 | 0.9976 | 1.6390 | 0.7042 | 5.6859633e-08 | 3360 |
| 0.0177 | 0.9953 | 1.6372 | 0.7042 | 5.684053e-08 | 3361 |
| 0.0201 | 0.9953 | 1.6350 | 0.7183 | 5.6821428e-08 | 3362 |
| 0.0135 | 1.0 | 1.6351 | 0.7183 | 5.6802325e-08 | 3363 |
| 0.0169 | 0.9953 | 1.6353 | 0.7183 | 5.678322e-08 | 3364 |
| 0.0130 | 1.0 | 1.6341 | 0.7183 | 5.6764122e-08 | 3365 |
| 0.0149 | 1.0 | 1.6339 | 0.7183 | 5.6745023e-08 | 3366 |
| 0.0170 | 0.9953 | 1.6345 | 0.7183 | 5.6725924e-08 | 3367 |
| 0.0166 | 1.0 | 1.6339 | 0.7183 | 5.6706824e-08 | 3368 |
| 0.0250 | 0.9929 | 1.6328 | 0.7183 | 5.6687725e-08 | 3369 |
| 0.0179 | 0.9976 | 1.6330 | 0.7183 | 5.666863e-08 | 3370 |
| 0.0131 | 0.9976 | 1.6351 | 0.7183 | 5.6649533e-08 | 3371 |
| 0.0142 | 1.0 | 1.6363 | 0.7183 | 5.6630437e-08 | 3372 |
| 0.0107 | 1.0 | 1.6371 | 0.7183 | 5.661134e-08 | 3373 |
| 0.0243 | 0.9929 | 1.6372 | 0.7183 | 5.6592246e-08 | 3374 |
| 0.0268 | 0.9906 | 1.6385 | 0.7183 | 5.6573153e-08 | 3375 |
| 0.0170 | 0.9953 | 1.6396 | 0.7183 | 5.655406e-08 | 3376 |
| 0.0145 | 0.9976 | 1.6392 | 0.7183 | 5.653497e-08 | 3377 |
| 0.0191 | 0.9953 | 1.6399 | 0.7183 | 5.6515876e-08 | 3378 |
| 0.0180 | 0.9953 | 1.6404 | 0.7183 | 5.6496788e-08 | 3379 |
| 0.0244 | 0.9953 | 1.6403 | 0.7183 | 5.64777e-08 | 3380 |
| 0.0170 | 0.9976 | 1.6383 | 0.7183 | 5.645861e-08 | 3381 |
| 0.0148 | 1.0 | 1.6394 | 0.7113 | 5.643952e-08 | 3382 |
| 0.0213 | 0.9929 | 1.6427 | 0.7042 | 5.6420433e-08 | 3383 |
| 0.0140 | 0.9976 | 1.6442 | 0.7042 | 5.6401348e-08 | 3384 |
| 0.0198 | 0.9976 | 1.6451 | 0.7042 | 5.6382262e-08 | 3385 |
| 0.0269 | 0.9906 | 1.6443 | 0.7042 | 5.6363177e-08 | 3386 |
| 0.0146 | 1.0 | 1.6433 | 0.7042 | 5.6344092e-08 | 3387 |
| 0.0145 | 1.0 | 1.6426 | 0.7042 | 5.632501e-08 | 3388 |
| 0.0206 | 0.9929 | 1.6413 | 0.7042 | 5.630593e-08 | 3389 |
| 0.0139 | 1.0 | 1.6396 | 0.7042 | 5.6286847e-08 | 3390 |
| 0.0168 | 0.9976 | 1.6373 | 0.7113 | 5.6267766e-08 | 3391 |
| 0.0161 | 0.9976 | 1.6373 | 0.7183 | 5.6248687e-08 | 3392 |
| 0.0183 | 0.9976 | 1.6389 | 0.7113 | 5.622961e-08 | 3393 |
| 0.0183 | 0.9953 | 1.6404 | 0.7042 | 5.621053e-08 | 3394 |
| 0.0157 | 1.0 | 1.6430 | 0.7042 | 5.6191453e-08 | 3395 |
| 0.0159 | 1.0 | 1.6449 | 0.7042 | 5.617238e-08 | 3396 |
| 0.0172 | 0.9976 | 1.6460 | 0.7042 | 5.6153304e-08 | 3397 |
| 0.0138 | 1.0 | 1.6466 | 0.7042 | 5.613423e-08 | 3398 |
| 0.0135 | 0.9976 | 1.6483 | 0.7042 | 5.6115155e-08 | 3399 |
| 0.0195 | 0.9906 | 1.6468 | 0.7042 | 5.6096084e-08 | 3400 |
| 0.0170 | 1.0 | 1.6461 | 0.7042 | 5.6077013e-08 | 3401 |
| 0.0150 | 1.0 | 1.6467 | 0.7042 | 5.6057942e-08 | 3402 |
| 0.0132 | 0.9976 | 1.6472 | 0.7042 | 5.603887e-08 | 3403 |
| 0.0184 | 0.9953 | 1.6452 | 0.7042 | 5.6019804e-08 | 3404 |
| 0.0218 | 0.9976 | 1.6437 | 0.7042 | 5.6000736e-08 | 3405 |
| 0.0143 | 1.0 | 1.6438 | 0.7042 | 5.598167e-08 | 3406 |
| 0.0225 | 0.9953 | 1.6437 | 0.7042 | 5.59626e-08 | 3407 |
| 0.0158 | 0.9953 | 1.6429 | 0.7113 | 5.5943538e-08 | 3408 |
| 0.0143 | 0.9976 | 1.6464 | 0.7042 | 5.5924474e-08 | 3409 |
| 0.0211 | 0.9976 | 1.6472 | 0.7042 | 5.590541e-08 | 3410 |
| 0.0168 | 0.9976 | 1.6472 | 0.7042 | 5.5886346e-08 | 3411 |
| 0.0193 | 0.9976 | 1.6440 | 0.7183 | 5.5867286e-08 | 3412 |
| 0.0182 | 0.9976 | 1.6402 | 0.7183 | 5.5848226e-08 | 3413 |
| 0.0158 | 1.0 | 1.6408 | 0.7183 | 5.5829165e-08 | 3414 |
| 0.0126 | 0.9976 | 1.6412 | 0.7183 | 5.5810105e-08 | 3415 |
| 0.0164 | 0.9976 | 1.6412 | 0.7183 | 5.579105e-08 | 3416 |
| 0.0129 | 1.0 | 1.6404 | 0.7183 | 5.577199e-08 | 3417 |
| 0.0185 | 0.9929 | 1.6414 | 0.7183 | 5.5752935e-08 | 3418 |
| 0.0232 | 0.9953 | 1.6424 | 0.7183 | 5.5733878e-08 | 3419 |
| 0.0105 | 1.0 | 1.6438 | 0.7183 | 5.5714825e-08 | 3420 |
| 0.0222 | 0.9929 | 1.6439 | 0.7183 | 5.569577e-08 | 3421 |
| 0.0141 | 1.0 | 1.6444 | 0.7113 | 5.567672e-08 | 3422 |
| 0.0221 | 0.9929 | 1.6461 | 0.7042 | 5.5657665e-08 | 3423 |
| 0.0173 | 0.9976 | 1.6491 | 0.7042 | 5.5638615e-08 | 3424 |
| 0.0184 | 0.9976 | 1.6493 | 0.7042 | 5.5619566e-08 | 3425 |
| 0.0114 | 1.0 | 1.6493 | 0.7042 | 5.5600516e-08 | 3426 |
| 0.0130 | 1.0 | 1.6490 | 0.7042 | 5.558147e-08 | 3427 |
| 0.0153 | 0.9976 | 1.6496 | 0.7042 | 5.5562424e-08 | 3428 |
| 0.0122 | 1.0 | 1.6516 | 0.7042 | 5.5543378e-08 | 3429 |
| 0.0133 | 0.9976 | 1.6516 | 0.7042 | 5.5524332e-08 | 3430 |
| 0.0134 | 1.0 | 1.6497 | 0.7042 | 5.550529e-08 | 3431 |
| 0.0153 | 0.9953 | 1.6486 | 0.7042 | 5.5486247e-08 | 3432 |
| 0.0189 | 0.9953 | 1.6471 | 0.7042 | 5.5467204e-08 | 3433 |
| 0.0180 | 0.9953 | 1.6465 | 0.7042 | 5.544816e-08 | 3434 |
| 0.0155 | 0.9976 | 1.6452 | 0.7042 | 5.5429123e-08 | 3435 |
| 0.0213 | 0.9953 | 1.6484 | 0.7042 | 5.5410084e-08 | 3436 |
| 0.0169 | 0.9953 | 1.6490 | 0.7042 | 5.5391045e-08 | 3437 |
| 0.0174 | 0.9953 | 1.6486 | 0.7042 | 5.537201e-08 | 3438 |
| 0.0155 | 1.0 | 1.6473 | 0.7042 | 5.5352974e-08 | 3439 |
| 0.0195 | 0.9976 | 1.6452 | 0.7042 | 5.533394e-08 | 3440 |
| 0.0100 | 1.0 | 1.6441 | 0.7113 | 5.5314903e-08 | 3441 |
| 0.0160 | 0.9976 | 1.6442 | 0.7113 | 5.529587e-08 | 3442 |
| 0.0149 | 0.9976 | 1.6463 | 0.7042 | 5.527684e-08 | 3443 |
| 0.0169 | 1.0 | 1.6489 | 0.7042 | 5.5257807e-08 | 3444 |
| 0.0184 | 0.9953 | 1.6499 | 0.7042 | 5.523878e-08 | 3445 |
| 0.0164 | 0.9976 | 1.6491 | 0.7042 | 5.521975e-08 | 3446 |
| 0.0134 | 1.0 | 1.6498 | 0.7042 | 5.5200722e-08 | 3447 |
| 0.0169 | 1.0 | 1.6504 | 0.7042 | 5.5181694e-08 | 3448 |
| 0.0124 | 1.0 | 1.6503 | 0.7042 | 5.516267e-08 | 3449 |
| 0.0144 | 1.0 | 1.6490 | 0.7042 | 5.5143644e-08 | 3450 |
| 0.0169 | 0.9976 | 1.6502 | 0.7042 | 5.512462e-08 | 3451 |
| 0.0152 | 0.9976 | 1.6520 | 0.7042 | 5.51056e-08 | 3452 |
| 0.0151 | 0.9976 | 1.6506 | 0.7042 | 5.5086577e-08 | 3453 |
| 0.0115 | 1.0 | 1.6500 | 0.7042 | 5.5067556e-08 | 3454 |
| 0.0140 | 1.0 | 1.6501 | 0.7042 | 5.5048538e-08 | 3455 |
| 0.0154 | 1.0 | 1.6518 | 0.7042 | 5.502952e-08 | 3456 |
| 0.0267 | 0.9929 | 1.6521 | 0.7042 | 5.5010503e-08 | 3457 |
| 0.0198 | 0.9976 | 1.6541 | 0.7042 | 5.4991485e-08 | 3458 |
| 0.0245 | 0.9929 | 1.6519 | 0.7042 | 5.497247e-08 | 3459 |
| 0.0141 | 0.9976 | 1.6479 | 0.7042 | 5.4953457e-08 | 3460 |
| 0.0220 | 0.9953 | 1.6452 | 0.7113 | 5.4934443e-08 | 3461 |
| 0.0130 | 1.0 | 1.6465 | 0.7113 | 5.4915432e-08 | 3462 |
| 0.0167 | 0.9976 | 1.6475 | 0.7042 | 5.489642e-08 | 3463 |
| 0.0145 | 0.9953 | 1.6484 | 0.7042 | 5.487741e-08 | 3464 |
| 0.0170 | 1.0 | 1.6473 | 0.7042 | 5.4858404e-08 | 3465 |
| 0.0166 | 0.9976 | 1.6476 | 0.7042 | 5.4839397e-08 | 3466 |
| 0.0128 | 1.0 | 1.6513 | 0.7042 | 5.482039e-08 | 3467 |
| 0.0116 | 1.0 | 1.6535 | 0.7042 | 5.4801387e-08 | 3468 |
| 0.0155 | 1.0 | 1.6540 | 0.7042 | 5.4782383e-08 | 3469 |
| 0.0139 | 0.9976 | 1.6533 | 0.7042 | 5.476338e-08 | 3470 |
| 0.0135 | 0.9976 | 1.6546 | 0.7042 | 5.4744376e-08 | 3471 |
| 0.0156 | 0.9976 | 1.6570 | 0.7042 | 5.4725376e-08 | 3472 |
| 0.0193 | 0.9976 | 1.6592 | 0.7042 | 5.4706376e-08 | 3473 |
| 0.0139 | 0.9976 | 1.6579 | 0.7042 | 5.4687376e-08 | 3474 |
| 0.0149 | 1.0 | 1.6554 | 0.7042 | 5.466838e-08 | 3475 |
| 0.0192 | 0.9953 | 1.6534 | 0.7113 | 5.4649384e-08 | 3476 |
| 0.0115 | 1.0 | 1.6532 | 0.7113 | 5.4630387e-08 | 3477 |
| 0.0201 | 0.9976 | 1.6557 | 0.7042 | 5.4611395e-08 | 3478 |
| 0.0179 | 0.9953 | 1.6575 | 0.7042 | 5.45924e-08 | 3479 |
| 0.0122 | 1.0 | 1.6529 | 0.7113 | 5.457341e-08 | 3480 |
| 0.0155 | 0.9976 | 1.6551 | 0.7042 | 5.455442e-08 | 3481 |
| 0.0138 | 1.0 | 1.6578 | 0.7042 | 5.453543e-08 | 3482 |
| 0.0254 | 0.9906 | 1.6597 | 0.7042 | 5.451644e-08 | 3483 |
| 0.0177 | 0.9953 | 1.6583 | 0.7042 | 5.4497455e-08 | 3484 |
| 0.0185 | 0.9976 | 1.6585 | 0.7042 | 5.447847e-08 | 3485 |
| 0.0216 | 0.9976 | 1.6579 | 0.7042 | 5.4459484e-08 | 3486 |
| 0.0153 | 0.9976 | 1.6601 | 0.7042 | 5.4440502e-08 | 3487 |
| 0.0137 | 0.9976 | 1.6600 | 0.7042 | 5.442152e-08 | 3488 |
| 0.0135 | 1.0 | 1.6597 | 0.7042 | 5.4402538e-08 | 3489 |
| 0.0155 | 1.0 | 1.6602 | 0.7042 | 5.438356e-08 | 3490 |
| 0.0180 | 0.9953 | 1.6614 | 0.7042 | 5.436458e-08 | 3491 |
| 0.0183 | 0.9976 | 1.6599 | 0.7042 | 5.4345602e-08 | 3492 |
| 0.0145 | 1.0 | 1.6583 | 0.7113 | 5.4326627e-08 | 3493 |
| 0.0148 | 0.9953 | 1.6562 | 0.7113 | 5.430765e-08 | 3494 |
| 0.0112 | 1.0 | 1.6544 | 0.7113 | 5.4288677e-08 | 3495 |
| 0.0169 | 0.9976 | 1.6554 | 0.7113 | 5.4269705e-08 | 3496 |
| 0.0123 | 0.9976 | 1.6571 | 0.7113 | 5.4250734e-08 | 3497 |
| 0.0178 | 0.9976 | 1.6600 | 0.7042 | 5.4231762e-08 | 3498 |
| 0.0169 | 0.9976 | 1.6622 | 0.7042 | 5.4212794e-08 | 3499 |
| 0.0136 | 1.0 | 1.6631 | 0.7042 | 5.4193826e-08 | 3500 |
| 0.0117 | 1.0 | 1.6622 | 0.7042 | 5.417486e-08 | 3501 |
| 0.0156 | 0.9976 | 1.6602 | 0.7042 | 5.4155894e-08 | 3502 |
| 0.0162 | 0.9976 | 1.6599 | 0.7042 | 5.413693e-08 | 3503 |
| 0.0161 | 0.9976 | 1.6579 | 0.7042 | 5.4117965e-08 | 3504 |
| 0.0161 | 0.9976 | 1.6569 | 0.7042 | 5.4099004e-08 | 3505 |
| 0.0153 | 0.9976 | 1.6572 | 0.7042 | 5.4080044e-08 | 3506 |
| 0.0197 | 0.9976 | 1.6590 | 0.7042 | 5.4061083e-08 | 3507 |
| 0.0152 | 0.9976 | 1.6595 | 0.7042 | 5.4042125e-08 | 3508 |
| 0.0112 | 1.0 | 1.6600 | 0.7042 | 5.4023168e-08 | 3509 |
| 0.0106 | 1.0 | 1.6590 | 0.7042 | 5.4004214e-08 | 3510 |
| 0.0246 | 0.9929 | 1.6607 | 0.7042 | 5.398526e-08 | 3511 |
| 0.0135 | 0.9976 | 1.6623 | 0.7042 | 5.3966307e-08 | 3512 |
| 0.0182 | 0.9953 | 1.6601 | 0.7042 | 5.3947357e-08 | 3513 |
| 0.0116 | 1.0 | 1.6597 | 0.7042 | 5.3928407e-08 | 3514 |
| 0.0231 | 0.9976 | 1.6588 | 0.7042 | 5.3909456e-08 | 3515 |
| 0.0201 | 0.9906 | 1.6624 | 0.7042 | 5.389051e-08 | 3516 |
| 0.0138 | 0.9976 | 1.6626 | 0.7042 | 5.3871563e-08 | 3517 |
| 0.0260 | 0.9929 | 1.6620 | 0.7042 | 5.3852617e-08 | 3518 |
| 0.0301 | 0.9882 | 1.6597 | 0.7042 | 5.3833674e-08 | 3519 |
| 0.0172 | 0.9976 | 1.6587 | 0.7042 | 5.381473e-08 | 3520 |
| 0.0157 | 0.9976 | 1.6597 | 0.7042 | 5.3795787e-08 | 3521 |
| 0.0172 | 0.9976 | 1.6580 | 0.7042 | 5.3776848e-08 | 3522 |
| 0.0086 | 1.0 | 1.6565 | 0.7042 | 5.375791e-08 | 3523 |
| 0.0138 | 0.9976 | 1.6574 | 0.7042 | 5.3738972e-08 | 3524 |
| 0.0171 | 0.9976 | 1.6590 | 0.7042 | 5.3720036e-08 | 3525 |
| 0.0151 | 1.0 | 1.6580 | 0.7042 | 5.37011e-08 | 3526 |
| 0.0126 | 0.9976 | 1.6569 | 0.7113 | 5.3682168e-08 | 3527 |
| 0.0173 | 0.9976 | 1.6550 | 0.7113 | 5.3663236e-08 | 3528 |
| 0.0127 | 1.0 | 1.6540 | 0.7113 | 5.3644303e-08 | 3529 |
| 0.0115 | 1.0 | 1.6545 | 0.7113 | 5.3625374e-08 | 3530 |
| 0.0134 | 1.0 | 1.6543 | 0.7113 | 5.3606446e-08 | 3531 |
| 0.0154 | 0.9976 | 1.6551 | 0.7113 | 5.3587517e-08 | 3532 |
| 0.0185 | 0.9976 | 1.6560 | 0.7113 | 5.356859e-08 | 3533 |
| 0.0125 | 0.9976 | 1.6557 | 0.7113 | 5.3549666e-08 | 3534 |
| 0.0129 | 0.9976 | 1.6555 | 0.7113 | 5.3530744e-08 | 3535 |
| 0.0208 | 0.9976 | 1.6551 | 0.7113 | 5.3511823e-08 | 3536 |
| 0.0190 | 0.9953 | 1.6569 | 0.7113 | 5.34929e-08 | 3537 |
| 0.0165 | 0.9953 | 1.6594 | 0.7113 | 5.3473983e-08 | 3538 |
| 0.0134 | 1.0 | 1.6621 | 0.7042 | 5.3455064e-08 | 3539 |
| 0.0181 | 0.9953 | 1.6631 | 0.7042 | 5.343615e-08 | 3540 |
| 0.0133 | 1.0 | 1.6644 | 0.7042 | 5.3417235e-08 | 3541 |
| 0.0183 | 1.0 | 1.6636 | 0.7042 | 5.339832e-08 | 3542 |
| 0.0143 | 0.9976 | 1.6620 | 0.7042 | 5.337941e-08 | 3543 |
| 0.0143 | 0.9976 | 1.6591 | 0.7113 | 5.33605e-08 | 3544 |
| 0.0137 | 0.9976 | 1.6592 | 0.7183 | 5.3341587e-08 | 3545 |
| 0.0114 | 1.0 | 1.6601 | 0.7113 | 5.332268e-08 | 3546 |
| 0.0152 | 0.9976 | 1.6635 | 0.7113 | 5.3303772e-08 | 3547 |
| 0.0121 | 0.9976 | 1.6666 | 0.7042 | 5.3284868e-08 | 3548 |
| 0.0118 | 1.0 | 1.6666 | 0.7042 | 5.3265964e-08 | 3549 |
| 0.0149 | 1.0 | 1.6661 | 0.7042 | 5.324706e-08 | 3550 |
| 0.0157 | 1.0 | 1.6685 | 0.7042 | 5.322816e-08 | 3551 |
| 0.0183 | 0.9953 | 1.6709 | 0.7042 | 5.320926e-08 | 3552 |
| 0.0132 | 1.0 | 1.6718 | 0.7042 | 5.3190362e-08 | 3553 |
| 0.0094 | 1.0 | 1.6720 | 0.7042 | 5.3171465e-08 | 3554 |
| 0.0170 | 0.9976 | 1.6722 | 0.7042 | 5.315257e-08 | 3555 |
| 0.0118 | 1.0 | 1.6736 | 0.7042 | 5.3133675e-08 | 3556 |
| 0.0133 | 0.9976 | 1.6725 | 0.7042 | 5.3114782e-08 | 3557 |
| 0.0174 | 0.9953 | 1.6713 | 0.7042 | 5.3095892e-08 | 3558 |
| 0.0112 | 1.0 | 1.6713 | 0.7042 | 5.3077002e-08 | 3559 |
| 0.0189 | 0.9953 | 1.6703 | 0.7042 | 5.3058113e-08 | 3560 |
| 0.0116 | 0.9976 | 1.6679 | 0.7042 | 5.3039226e-08 | 3561 |
| 0.0126 | 1.0 | 1.6658 | 0.7042 | 5.302034e-08 | 3562 |
| 0.0188 | 0.9953 | 1.6678 | 0.7042 | 5.3001457e-08 | 3563 |
| 0.0102 | 1.0 | 1.6691 | 0.7042 | 5.2982575e-08 | 3564 |
| 0.0215 | 0.9929 | 1.6683 | 0.7042 | 5.2963692e-08 | 3565 |
| 0.0120 | 1.0 | 1.6663 | 0.7042 | 5.2944813e-08 | 3566 |
| 0.0116 | 1.0 | 1.6656 | 0.7042 | 5.2925934e-08 | 3567 |
| 0.0110 | 1.0 | 1.6641 | 0.7113 | 5.290706e-08 | 3568 |
| 0.0102 | 1.0 | 1.6641 | 0.7113 | 5.2888183e-08 | 3569 |
| 0.0132 | 1.0 | 1.6648 | 0.7113 | 5.2869307e-08 | 3570 |
| 0.0141 | 0.9976 | 1.6661 | 0.7042 | 5.2850435e-08 | 3571 |
| 0.0094 | 1.0 | 1.6642 | 0.7113 | 5.2831563e-08 | 3572 |
| 0.0127 | 0.9976 | 1.6649 | 0.7042 | 5.2812695e-08 | 3573 |
| 0.0181 | 0.9929 | 1.6665 | 0.7042 | 5.2793826e-08 | 3574 |
| 0.0166 | 0.9953 | 1.6635 | 0.7042 | 5.2774958e-08 | 3575 |
| 0.0134 | 0.9976 | 1.6627 | 0.7113 | 5.2756093e-08 | 3576 |
| 0.0138 | 1.0 | 1.6640 | 0.7042 | 5.2737228e-08 | 3577 |
| 0.0123 | 1.0 | 1.6661 | 0.7042 | 5.2718367e-08 | 3578 |
| 0.0119 | 1.0 | 1.6677 | 0.7042 | 5.2699505e-08 | 3579 |
| 0.0123 | 0.9976 | 1.6718 | 0.7042 | 5.2680644e-08 | 3580 |
| 0.0182 | 0.9976 | 1.6720 | 0.7042 | 5.2661786e-08 | 3581 |
| 0.0138 | 1.0 | 1.6707 | 0.7042 | 5.264293e-08 | 3582 |
| 0.0095 | 1.0 | 1.6701 | 0.7042 | 5.2624074e-08 | 3583 |
| 0.0129 | 1.0 | 1.6704 | 0.7113 | 5.260522e-08 | 3584 |
| 0.0169 | 0.9953 | 1.6676 | 0.7113 | 5.258637e-08 | 3585 |
| 0.0130 | 0.9976 | 1.6660 | 0.7113 | 5.256752e-08 | 3586 |
| 0.0173 | 0.9953 | 1.6669 | 0.7113 | 5.2548668e-08 | 3587 |
| 0.0157 | 0.9953 | 1.6691 | 0.7113 | 5.252982e-08 | 3588 |
| 0.0116 | 1.0 | 1.6711 | 0.7113 | 5.2510973e-08 | 3589 |
| 0.0138 | 0.9976 | 1.6725 | 0.7113 | 5.249213e-08 | 3590 |
| 0.0167 | 0.9976 | 1.6757 | 0.7042 | 5.2473286e-08 | 3591 |
| 0.0109 | 1.0 | 1.6777 | 0.7042 | 5.2454446e-08 | 3592 |
| 0.0101 | 1.0 | 1.6771 | 0.7042 | 5.2435606e-08 | 3593 |
| 0.0132 | 1.0 | 1.6758 | 0.7042 | 5.2416766e-08 | 3594 |
| 0.0113 | 1.0 | 1.6753 | 0.7042 | 5.239793e-08 | 3595 |
| 0.0144 | 1.0 | 1.6729 | 0.7042 | 5.2379093e-08 | 3596 |
| 0.0103 | 1.0 | 1.6715 | 0.7042 | 5.236026e-08 | 3597 |
| 0.0184 | 0.9976 | 1.6731 | 0.7042 | 5.2341427e-08 | 3598 |
| 0.0156 | 0.9953 | 1.6703 | 0.7113 | 5.2322598e-08 | 3599 |
| 0.0151 | 0.9976 | 1.6689 | 0.7113 | 5.230377e-08 | 3600 |
| 0.0113 | 0.9976 | 1.6668 | 0.7183 | 5.228494e-08 | 3601 |
| 0.0155 | 0.9976 | 1.6675 | 0.7113 | 5.2266113e-08 | 3602 |
| 0.0195 | 0.9953 | 1.6706 | 0.7113 | 5.2247287e-08 | 3603 |
| 0.0124 | 0.9976 | 1.6713 | 0.7113 | 5.2228465e-08 | 3604 |
| 0.0089 | 1.0 | 1.6717 | 0.7113 | 5.2209643e-08 | 3605 |
| 0.0137 | 0.9976 | 1.6714 | 0.7113 | 5.2190824e-08 | 3606 |
| 0.0131 | 1.0 | 1.6715 | 0.7113 | 5.2172005e-08 | 3607 |
| 0.0106 | 1.0 | 1.6721 | 0.7113 | 5.215319e-08 | 3608 |
| 0.0131 | 0.9976 | 1.6708 | 0.7113 | 5.2134375e-08 | 3609 |
| 0.0119 | 1.0 | 1.6708 | 0.7113 | 5.211556e-08 | 3610 |
| 0.0132 | 0.9976 | 1.6741 | 0.7042 | 5.209675e-08 | 3611 |
| 0.0177 | 0.9929 | 1.6763 | 0.7042 | 5.2077937e-08 | 3612 |
| 0.0096 | 1.0 | 1.6773 | 0.7042 | 5.205913e-08 | 3613 |
| 0.0146 | 0.9976 | 1.6771 | 0.7042 | 5.204032e-08 | 3614 |
| 0.0174 | 0.9976 | 1.6777 | 0.7042 | 5.2021516e-08 | 3615 |
| 0.0176 | 0.9976 | 1.6795 | 0.7042 | 5.200271e-08 | 3616 |
| 0.0117 | 0.9976 | 1.6807 | 0.7042 | 5.198391e-08 | 3617 |
| 0.0160 | 1.0 | 1.6768 | 0.7042 | 5.196511e-08 | 3618 |
| 0.0093 | 1.0 | 1.6757 | 0.7042 | 5.194631e-08 | 3619 |
| 0.0145 | 0.9953 | 1.6758 | 0.7042 | 5.192751e-08 | 3620 |
| 0.0127 | 0.9976 | 1.6761 | 0.7042 | 5.1908714e-08 | 3621 |
| 0.0134 | 0.9976 | 1.6780 | 0.7042 | 5.188992e-08 | 3622 |
| 0.0134 | 1.0 | 1.6789 | 0.7042 | 5.1871126e-08 | 3623 |
| 0.0174 | 0.9953 | 1.6824 | 0.7042 | 5.1852336e-08 | 3624 |
| 0.0236 | 0.9929 | 1.6828 | 0.7042 | 5.1833545e-08 | 3625 |
| 0.0101 | 1.0 | 1.6814 | 0.7042 | 5.181476e-08 | 3626 |
| 0.0210 | 0.9906 | 1.6819 | 0.7042 | 5.1795972e-08 | 3627 |
| 0.0177 | 0.9953 | 1.6830 | 0.7042 | 5.177719e-08 | 3628 |
| 0.0116 | 1.0 | 1.6849 | 0.7042 | 5.1758406e-08 | 3629 |
| 0.0101 | 1.0 | 1.6845 | 0.7042 | 5.1739622e-08 | 3630 |
| 0.0135 | 0.9953 | 1.6857 | 0.7042 | 5.1720843e-08 | 3631 |
| 0.0136 | 0.9976 | 1.6832 | 0.7042 | 5.1702063e-08 | 3632 |
| 0.0166 | 0.9953 | 1.6817 | 0.7042 | 5.1683287e-08 | 3633 |
| 0.0171 | 0.9976 | 1.6827 | 0.7042 | 5.166451e-08 | 3634 |
| 0.0176 | 0.9976 | 1.6815 | 0.7042 | 5.164574e-08 | 3635 |
| 0.0120 | 1.0 | 1.6806 | 0.7113 | 5.1626966e-08 | 3636 |
| 0.0113 | 1.0 | 1.6815 | 0.7042 | 5.1608197e-08 | 3637 |
| 0.0160 | 0.9976 | 1.6821 | 0.7113 | 5.1589428e-08 | 3638 |
| 0.0087 | 1.0 | 1.6823 | 0.7042 | 5.1570662e-08 | 3639 |
| 0.0139 | 1.0 | 1.6835 | 0.7042 | 5.1551897e-08 | 3640 |
| 0.0132 | 0.9976 | 1.6863 | 0.7042 | 5.1533135e-08 | 3641 |
| 0.0112 | 1.0 | 1.6859 | 0.7042 | 5.1514373e-08 | 3642 |
| 0.0142 | 0.9976 | 1.6829 | 0.7042 | 5.1495615e-08 | 3643 |
| 0.0168 | 0.9953 | 1.6841 | 0.7042 | 5.1476857e-08 | 3644 |
| 0.0116 | 0.9976 | 1.6851 | 0.7042 | 5.1458102e-08 | 3645 |
| 0.0130 | 1.0 | 1.6867 | 0.7042 | 5.1439347e-08 | 3646 |
| 0.0116 | 1.0 | 1.6899 | 0.7042 | 5.1420592e-08 | 3647 |
| 0.0092 | 1.0 | 1.6896 | 0.7042 | 5.140184e-08 | 3648 |
| 0.0134 | 1.0 | 1.6873 | 0.7042 | 5.138309e-08 | 3649 |
| 0.0147 | 0.9976 | 1.6886 | 0.7042 | 5.1364342e-08 | 3650 |
| 0.0110 | 1.0 | 1.6879 | 0.7042 | 5.1345594e-08 | 3651 |
| 0.0095 | 1.0 | 1.6881 | 0.7042 | 5.132685e-08 | 3652 |
| 0.0110 | 1.0 | 1.6886 | 0.7042 | 5.1308106e-08 | 3653 |
| 0.0175 | 0.9953 | 1.6850 | 0.7042 | 5.1289366e-08 | 3654 |
| 0.0159 | 0.9976 | 1.6830 | 0.7042 | 5.1270625e-08 | 3655 |
| 0.0176 | 0.9976 | 1.6870 | 0.7042 | 5.1251888e-08 | 3656 |
| 0.0089 | 1.0 | 1.6879 | 0.7042 | 5.123315e-08 | 3657 |
| 0.0110 | 1.0 | 1.6879 | 0.7042 | 5.1214418e-08 | 3658 |
| 0.0133 | 0.9953 | 1.6881 | 0.7042 | 5.1195684e-08 | 3659 |
| 0.0179 | 0.9976 | 1.6866 | 0.7042 | 5.1176954e-08 | 3660 |
| 0.0163 | 0.9953 | 1.6868 | 0.7042 | 5.1158224e-08 | 3661 |
| 0.0203 | 0.9953 | 1.6866 | 0.7042 | 5.1139498e-08 | 3662 |
| 0.0100 | 1.0 | 1.6869 | 0.7042 | 5.112077e-08 | 3663 |
| 0.0234 | 0.9953 | 1.6880 | 0.7042 | 5.110205e-08 | 3664 |
| 0.0103 | 1.0 | 1.6890 | 0.7042 | 5.1083326e-08 | 3665 |
| 0.0145 | 0.9976 | 1.6885 | 0.7042 | 5.1064607e-08 | 3666 |
| 0.0119 | 1.0 | 1.6862 | 0.6972 | 5.1045888e-08 | 3667 |
| 0.0190 | 0.9953 | 1.6864 | 0.7042 | 5.1027172e-08 | 3668 |
| 0.0141 | 0.9976 | 1.6889 | 0.7042 | 5.1008456e-08 | 3669 |
| 0.0141 | 0.9976 | 1.6905 | 0.7042 | 5.0989744e-08 | 3670 |
| 0.0141 | 0.9976 | 1.6915 | 0.7042 | 5.0971032e-08 | 3671 |
| 0.0105 | 1.0 | 1.6893 | 0.7042 | 5.0952323e-08 | 3672 |
| 0.0147 | 1.0 | 1.6901 | 0.7042 | 5.0933615e-08 | 3673 |
| 0.0142 | 0.9976 | 1.6877 | 0.7042 | 5.091491e-08 | 3674 |
| 0.0142 | 0.9976 | 1.6859 | 0.7042 | 5.0896205e-08 | 3675 |
| 0.0103 | 1.0 | 1.6859 | 0.7042 | 5.0877503e-08 | 3676 |
| 0.0121 | 1.0 | 1.6858 | 0.7042 | 5.08588e-08 | 3677 |
| 0.0182 | 0.9976 | 1.6856 | 0.7042 | 5.0840104e-08 | 3678 |
| 0.0252 | 0.9953 | 1.6828 | 0.7042 | 5.0821406e-08 | 3679 |
| 0.0190 | 0.9976 | 1.6802 | 0.7042 | 5.080271e-08 | 3680 |
| 0.0138 | 0.9976 | 1.6790 | 0.7042 | 5.0784017e-08 | 3681 |
| 0.0137 | 0.9976 | 1.6787 | 0.7042 | 5.0765326e-08 | 3682 |
| 0.0172 | 0.9976 | 1.6785 | 0.7042 | 5.0746635e-08 | 3683 |
| 0.0205 | 0.9929 | 1.6797 | 0.7042 | 5.0727948e-08 | 3684 |
| 0.0093 | 1.0 | 1.6815 | 0.7042 | 5.070926e-08 | 3685 |
| 0.0077 | 1.0 | 1.6828 | 0.7042 | 5.0690577e-08 | 3686 |
| 0.0134 | 0.9976 | 1.6823 | 0.7042 | 5.0671893e-08 | 3687 |
| 0.0139 | 0.9976 | 1.6820 | 0.7113 | 5.0653213e-08 | 3688 |
| 0.0124 | 1.0 | 1.6849 | 0.7042 | 5.0634533e-08 | 3689 |
| 0.0217 | 0.9953 | 1.6858 | 0.6972 | 5.0615856e-08 | 3690 |
| 0.0147 | 0.9976 | 1.6867 | 0.7042 | 5.059718e-08 | 3691 |
| 0.0139 | 1.0 | 1.6869 | 0.7042 | 5.0578507e-08 | 3692 |
| 0.0101 | 0.9976 | 1.6887 | 0.7042 | 5.0559834e-08 | 3693 |
| 0.0146 | 1.0 | 1.6893 | 0.7042 | 5.0541164e-08 | 3694 |
| 0.0126 | 0.9976 | 1.6889 | 0.7042 | 5.0522495e-08 | 3695 |
| 0.0151 | 0.9953 | 1.6916 | 0.7042 | 5.050383e-08 | 3696 |
| 0.0122 | 1.0 | 1.6930 | 0.7042 | 5.0485163e-08 | 3697 |
| 0.0117 | 1.0 | 1.6941 | 0.7042 | 5.04665e-08 | 3698 |
| 0.0131 | 1.0 | 1.6933 | 0.6972 | 5.0447838e-08 | 3699 |
| 0.0173 | 1.0 | 1.6943 | 0.7042 | 5.042918e-08 | 3700 |
| 0.0181 | 0.9953 | 1.6932 | 0.6972 | 5.0410524e-08 | 3701 |
| 0.0135 | 0.9976 | 1.6909 | 0.6972 | 5.039187e-08 | 3702 |
| 0.0193 | 0.9976 | 1.6904 | 0.7042 | 5.0373217e-08 | 3703 |
| 0.0099 | 1.0 | 1.6912 | 0.7042 | 5.0354565e-08 | 3704 |
| 0.0140 | 0.9976 | 1.6925 | 0.7042 | 5.0335917e-08 | 3705 |
| 0.0128 | 1.0 | 1.6933 | 0.7042 | 5.031727e-08 | 3706 |
| 0.0120 | 1.0 | 1.6934 | 0.7042 | 5.0298624e-08 | 3707 |
| 0.0166 | 0.9976 | 1.6924 | 0.7042 | 5.027998e-08 | 3708 |
| 0.0138 | 0.9953 | 1.6910 | 0.7042 | 5.0261338e-08 | 3709 |
| 0.0103 | 1.0 | 1.6912 | 0.7042 | 5.0242697e-08 | 3710 |
| 0.0124 | 0.9976 | 1.6915 | 0.7042 | 5.022406e-08 | 3711 |
| 0.0204 | 0.9953 | 1.6911 | 0.7042 | 5.0205422e-08 | 3712 |
| 0.0123 | 1.0 | 1.6921 | 0.6972 | 5.0186788e-08 | 3713 |
| 0.0104 | 1.0 | 1.6923 | 0.6972 | 5.0168154e-08 | 3714 |
| 0.0114 | 0.9976 | 1.6922 | 0.6972 | 5.0149524e-08 | 3715 |
| 0.0149 | 0.9976 | 1.6922 | 0.6972 | 5.0130897e-08 | 3716 |
| 0.0122 | 1.0 | 1.6917 | 0.7042 | 5.011227e-08 | 3717 |
| 0.0091 | 1.0 | 1.6929 | 0.7042 | 5.0093647e-08 | 3718 |
| 0.0123 | 1.0 | 1.6924 | 0.7042 | 5.0075023e-08 | 3719 |
| 0.0082 | 1.0 | 1.6922 | 0.7042 | 5.0056403e-08 | 3720 |
| 0.0186 | 0.9953 | 1.6946 | 0.7042 | 5.0037784e-08 | 3721 |
| 0.0106 | 1.0 | 1.6985 | 0.7042 | 5.0019167e-08 | 3722 |
| 0.0179 | 0.9976 | 1.6977 | 0.7042 | 5.000055e-08 | 3723 |
| 0.0126 | 1.0 | 1.6980 | 0.7042 | 4.998194e-08 | 3724 |
| 0.0173 | 0.9976 | 1.6950 | 0.7042 | 4.9963326e-08 | 3725 |
| 0.0231 | 0.9906 | 1.6966 | 0.7042 | 4.9944717e-08 | 3726 |
| 0.0157 | 1.0 | 1.6952 | 0.7042 | 4.9926108e-08 | 3727 |
| 0.0210 | 0.9953 | 1.6905 | 0.7042 | 4.9907502e-08 | 3728 |
| 0.0135 | 0.9976 | 1.6919 | 0.7042 | 4.98889e-08 | 3729 |
| 0.0100 | 1.0 | 1.6932 | 0.7042 | 4.9870298e-08 | 3730 |
| 0.0190 | 0.9953 | 1.6922 | 0.7042 | 4.98517e-08 | 3731 |
| 0.0105 | 1.0 | 1.6935 | 0.7042 | 4.98331e-08 | 3732 |
| 0.0084 | 1.0 | 1.6941 | 0.7042 | 4.9814506e-08 | 3733 |
| 0.0106 | 1.0 | 1.6923 | 0.7042 | 4.979591e-08 | 3734 |
| 0.0198 | 0.9953 | 1.6937 | 0.7042 | 4.977732e-08 | 3735 |
| 0.0109 | 1.0 | 1.6949 | 0.6972 | 4.975873e-08 | 3736 |
| 0.0129 | 1.0 | 1.6957 | 0.7042 | 4.974014e-08 | 3737 |
| 0.0095 | 1.0 | 1.6956 | 0.7042 | 4.9721557e-08 | 3738 |
| 0.0160 | 0.9976 | 1.6942 | 0.7042 | 4.9702972e-08 | 3739 |
| 0.0135 | 1.0 | 1.6938 | 0.7042 | 4.968439e-08 | 3740 |
| 0.0122 | 1.0 | 1.6941 | 0.7042 | 4.966581e-08 | 3741 |
| 0.0120 | 0.9976 | 1.6945 | 0.7042 | 4.9647234e-08 | 3742 |
| 0.0098 | 1.0 | 1.6946 | 0.7042 | 4.9628657e-08 | 3743 |
| 0.0104 | 0.9976 | 1.6949 | 0.7042 | 4.9610083e-08 | 3744 |
| 0.0271 | 0.9953 | 1.6965 | 0.7042 | 4.959151e-08 | 3745 |
| 0.0131 | 0.9953 | 1.6978 | 0.7042 | 4.957294e-08 | 3746 |
| 0.0148 | 0.9976 | 1.6994 | 0.7042 | 4.9554373e-08 | 3747 |
| 0.0175 | 0.9953 | 1.7007 | 0.7042 | 4.9535807e-08 | 3748 |
| 0.0091 | 1.0 | 1.7011 | 0.7042 | 4.9517244e-08 | 3749 |
| 0.0166 | 0.9953 | 1.7012 | 0.7042 | 4.949868e-08 | 3750 |
| 0.0118 | 0.9976 | 1.6992 | 0.7042 | 4.948012e-08 | 3751 |
| 0.0125 | 0.9976 | 1.6980 | 0.7042 | 4.9461562e-08 | 3752 |
| 0.0111 | 1.0 | 1.6968 | 0.7042 | 4.9443006e-08 | 3753 |
| 0.0114 | 1.0 | 1.6990 | 0.7042 | 4.9424454e-08 | 3754 |
| 0.0096 | 1.0 | 1.7000 | 0.7042 | 4.94059e-08 | 3755 |
| 0.0149 | 0.9953 | 1.7001 | 0.7042 | 4.9387353e-08 | 3756 |
| 0.0112 | 1.0 | 1.6959 | 0.7042 | 4.9368804e-08 | 3757 |
| 0.0096 | 1.0 | 1.6935 | 0.7042 | 4.935026e-08 | 3758 |
| 0.0121 | 1.0 | 1.6943 | 0.7042 | 4.9331714e-08 | 3759 |
| 0.0159 | 0.9976 | 1.6961 | 0.7042 | 4.9313172e-08 | 3760 |
| 0.0157 | 0.9976 | 1.6958 | 0.7042 | 4.9294634e-08 | 3761 |
| 0.0098 | 1.0 | 1.6955 | 0.7042 | 4.9276096e-08 | 3762 |
| 0.0128 | 1.0 | 1.6947 | 0.7042 | 4.925756e-08 | 3763 |
| 0.0155 | 0.9953 | 1.6930 | 0.7113 | 4.9239027e-08 | 3764 |
| 0.0193 | 0.9953 | 1.6946 | 0.7113 | 4.9220496e-08 | 3765 |
| 0.0210 | 0.9906 | 1.6952 | 0.7042 | 4.9201965e-08 | 3766 |
| 0.0151 | 0.9976 | 1.6931 | 0.7042 | 4.9183438e-08 | 3767 |
| 0.0099 | 0.9976 | 1.6921 | 0.7042 | 4.9164914e-08 | 3768 |
| 0.0128 | 0.9976 | 1.6914 | 0.7042 | 4.914639e-08 | 3769 |
| 0.0102 | 1.0 | 1.6911 | 0.7113 | 4.912787e-08 | 3770 |
| 0.0121 | 0.9976 | 1.6921 | 0.7113 | 4.910935e-08 | 3771 |
| 0.0118 | 1.0 | 1.6923 | 0.7113 | 4.9090833e-08 | 3772 |
| 0.0153 | 0.9976 | 1.6926 | 0.7113 | 4.9072316e-08 | 3773 |
| 0.0164 | 0.9953 | 1.6935 | 0.7042 | 4.9053803e-08 | 3774 |
| 0.0140 | 0.9976 | 1.6921 | 0.7113 | 4.9035293e-08 | 3775 |
| 0.0095 | 1.0 | 1.6941 | 0.7042 | 4.9016784e-08 | 3776 |
| 0.0185 | 0.9929 | 1.6982 | 0.7042 | 4.8998277e-08 | 3777 |
| 0.0185 | 0.9953 | 1.6993 | 0.7042 | 4.897977e-08 | 3778 |
| 0.0117 | 0.9976 | 1.6992 | 0.7042 | 4.896127e-08 | 3779 |
| 0.0117 | 0.9976 | 1.7002 | 0.7042 | 4.894277e-08 | 3780 |
| 0.0091 | 1.0 | 1.6998 | 0.7042 | 4.892427e-08 | 3781 |
| 0.0151 | 0.9976 | 1.7027 | 0.7042 | 4.8905775e-08 | 3782 |
| 0.0074 | 1.0 | 1.7034 | 0.7042 | 4.888728e-08 | 3783 |
| 0.0102 | 1.0 | 1.7041 | 0.7042 | 4.8868788e-08 | 3784 |
| 0.0190 | 0.9976 | 1.7050 | 0.7042 | 4.88503e-08 | 3785 |
| 0.0085 | 1.0 | 1.7073 | 0.7042 | 4.883181e-08 | 3786 |
| 0.0120 | 0.9976 | 1.7085 | 0.7042 | 4.8813327e-08 | 3787 |
| 0.0197 | 0.9929 | 1.7082 | 0.7042 | 4.8794842e-08 | 3788 |
| 0.0118 | 0.9976 | 1.7058 | 0.7042 | 4.877636e-08 | 3789 |
| 0.0113 | 1.0 | 1.7032 | 0.7042 | 4.8757883e-08 | 3790 |
| 0.0166 | 0.9953 | 1.7027 | 0.7042 | 4.8739405e-08 | 3791 |
| 0.0083 | 1.0 | 1.7025 | 0.7042 | 4.872093e-08 | 3792 |
| 0.0148 | 0.9976 | 1.7025 | 0.7042 | 4.8702457e-08 | 3793 |
| 0.0099 | 1.0 | 1.7041 | 0.7042 | 4.8683987e-08 | 3794 |
| 0.0108 | 0.9976 | 1.7050 | 0.7042 | 4.866552e-08 | 3795 |
| 0.0113 | 1.0 | 1.7052 | 0.7042 | 4.8647053e-08 | 3796 |
| 0.0119 | 1.0 | 1.7038 | 0.7042 | 4.862859e-08 | 3797 |
| 0.0091 | 1.0 | 1.7034 | 0.7042 | 4.8610126e-08 | 3798 |
| 0.0140 | 0.9976 | 1.7051 | 0.7042 | 4.8591666e-08 | 3799 |
| 0.0100 | 0.9976 | 1.7070 | 0.7042 | 4.857321e-08 | 3800 |
| 0.0125 | 1.0 | 1.7061 | 0.7042 | 4.8554753e-08 | 3801 |
| 0.0095 | 1.0 | 1.7014 | 0.7042 | 4.85363e-08 | 3802 |
| 0.0087 | 1.0 | 1.7005 | 0.7042 | 4.8517848e-08 | 3803 |
| 0.0107 | 0.9976 | 1.7000 | 0.7042 | 4.84994e-08 | 3804 |
| 0.0124 | 0.9976 | 1.6996 | 0.7042 | 4.8480953e-08 | 3805 |
| 0.0115 | 1.0 | 1.6997 | 0.7042 | 4.8462507e-08 | 3806 |
| 0.0122 | 1.0 | 1.7001 | 0.6972 | 4.8444065e-08 | 3807 |
| 0.0150 | 0.9953 | 1.7010 | 0.7042 | 4.8425623e-08 | 3808 |
| 0.0153 | 1.0 | 1.7032 | 0.7042 | 4.8407184e-08 | 3809 |
| 0.0108 | 1.0 | 1.7053 | 0.7042 | 4.838875e-08 | 3810 |
| 0.0121 | 1.0 | 1.7046 | 0.7042 | 4.8370314e-08 | 3811 |
| 0.0107 | 1.0 | 1.7026 | 0.7042 | 4.8351882e-08 | 3812 |
| 0.0096 | 1.0 | 1.7025 | 0.7042 | 4.8333455e-08 | 3813 |
| 0.0121 | 0.9976 | 1.7039 | 0.7042 | 4.8315027e-08 | 3814 |
| 0.0118 | 1.0 | 1.7075 | 0.7042 | 4.8296602e-08 | 3815 |
| 0.0111 | 1.0 | 1.7073 | 0.7042 | 4.8278178e-08 | 3816 |
| 0.0141 | 0.9953 | 1.7071 | 0.7042 | 4.8259757e-08 | 3817 |
| 0.0134 | 0.9953 | 1.7140 | 0.7042 | 4.824134e-08 | 3818 |
| 0.0141 | 0.9976 | 1.7140 | 0.7042 | 4.8222923e-08 | 3819 |
| 0.0096 | 1.0 | 1.7130 | 0.7042 | 4.820451e-08 | 3820 |
| 0.0098 | 1.0 | 1.7106 | 0.7042 | 4.81861e-08 | 3821 |
| 0.0189 | 0.9953 | 1.7082 | 0.7042 | 4.816769e-08 | 3822 |
| 0.0124 | 0.9976 | 1.7077 | 0.7042 | 4.8149282e-08 | 3823 |
| 0.0095 | 1.0 | 1.7083 | 0.7042 | 4.8130875e-08 | 3824 |
| 0.0103 | 0.9976 | 1.7077 | 0.7042 | 4.8112472e-08 | 3825 |
| 0.0182 | 0.9929 | 1.7075 | 0.7042 | 4.8094073e-08 | 3826 |
| 0.0194 | 0.9906 | 1.7100 | 0.7042 | 4.8075673e-08 | 3827 |
| 0.0112 | 1.0 | 1.7105 | 0.7042 | 4.8057277e-08 | 3828 |
| 0.0121 | 0.9976 | 1.7099 | 0.7042 | 4.8038885e-08 | 3829 |
| 0.0156 | 0.9976 | 1.7118 | 0.7042 | 4.8020492e-08 | 3830 |
| 0.0156 | 0.9976 | 1.7094 | 0.7042 | 4.8002104e-08 | 3831 |
| 0.0118 | 0.9976 | 1.7057 | 0.7042 | 4.7983715e-08 | 3832 |
| 0.0104 | 1.0 | 1.7046 | 0.7042 | 4.796533e-08 | 3833 |
| 0.0086 | 1.0 | 1.7042 | 0.7042 | 4.7946948e-08 | 3834 |
| 0.0107 | 1.0 | 1.7037 | 0.7042 | 4.7928566e-08 | 3835 |
| 0.0103 | 0.9976 | 1.7039 | 0.7042 | 4.7910188e-08 | 3836 |
| 0.0125 | 1.0 | 1.7051 | 0.7042 | 4.7891813e-08 | 3837 |
| 0.0168 | 0.9953 | 1.7068 | 0.7042 | 4.787344e-08 | 3838 |
| 0.0089 | 1.0 | 1.7079 | 0.7042 | 4.7855067e-08 | 3839 |
| 0.0155 | 0.9953 | 1.7069 | 0.7042 | 4.78367e-08 | 3840 |
| 0.0140 | 0.9953 | 1.7057 | 0.7042 | 4.7818332e-08 | 3841 |
| 0.0111 | 1.0 | 1.7052 | 0.7042 | 4.779997e-08 | 3842 |
| 0.0101 | 1.0 | 1.7024 | 0.7042 | 4.7781604e-08 | 3843 |
| 0.0119 | 1.0 | 1.6977 | 0.7113 | 4.7763244e-08 | 3844 |
| 0.0146 | 0.9953 | 1.6999 | 0.7113 | 4.7744887e-08 | 3845 |
| 0.0113 | 1.0 | 1.7034 | 0.7113 | 4.772653e-08 | 3846 |
| 0.0088 | 1.0 | 1.7040 | 0.7113 | 4.7708177e-08 | 3847 |
| 0.0146 | 0.9976 | 1.7042 | 0.7042 | 4.7689827e-08 | 3848 |
| 0.0082 | 1.0 | 1.7043 | 0.7042 | 4.7671477e-08 | 3849 |
| 0.0111 | 1.0 | 1.7054 | 0.7042 | 4.765313e-08 | 3850 |
| 0.0115 | 1.0 | 1.7058 | 0.7042 | 4.763479e-08 | 3851 |
| 0.0131 | 0.9976 | 1.7074 | 0.7042 | 4.7616446e-08 | 3852 |
| 0.0130 | 0.9976 | 1.7052 | 0.7042 | 4.7598107e-08 | 3853 |
| 0.0126 | 0.9976 | 1.7041 | 0.7113 | 4.757977e-08 | 3854 |
| 0.0120 | 1.0 | 1.7008 | 0.7113 | 4.7561436e-08 | 3855 |
| 0.0109 | 0.9976 | 1.7010 | 0.7113 | 4.7543104e-08 | 3856 |
| 0.0122 | 1.0 | 1.7029 | 0.7113 | 4.752477e-08 | 3857 |
| 0.0122 | 0.9976 | 1.7039 | 0.7042 | 4.7506443e-08 | 3858 |
| 0.0107 | 0.9976 | 1.7021 | 0.7113 | 4.748812e-08 | 3859 |
| 0.0158 | 0.9976 | 1.7011 | 0.7113 | 4.7469793e-08 | 3860 |
| 0.0085 | 1.0 | 1.7008 | 0.7183 | 4.7451472e-08 | 3861 |
| 0.0106 | 0.9976 | 1.7012 | 0.7113 | 4.7433154e-08 | 3862 |
| 0.0210 | 0.9976 | 1.7011 | 0.7113 | 4.7414837e-08 | 3863 |
| 0.0127 | 0.9976 | 1.7025 | 0.7113 | 4.7396522e-08 | 3864 |
| 0.0110 | 0.9976 | 1.7026 | 0.7113 | 4.737821e-08 | 3865 |
| 0.0105 | 1.0 | 1.7016 | 0.7113 | 4.73599e-08 | 3866 |
| 0.0126 | 0.9976 | 1.7035 | 0.7113 | 4.7341594e-08 | 3867 |
| 0.0083 | 1.0 | 1.7053 | 0.7042 | 4.732329e-08 | 3868 |
| 0.0147 | 0.9953 | 1.7069 | 0.7042 | 4.7304987e-08 | 3869 |
| 0.0220 | 0.9929 | 1.7074 | 0.7042 | 4.7286687e-08 | 3870 |
| 0.0084 | 1.0 | 1.7080 | 0.7042 | 4.726839e-08 | 3871 |
| 0.0170 | 0.9953 | 1.7067 | 0.7113 | 4.7250094e-08 | 3872 |
| 0.0102 | 1.0 | 1.7062 | 0.7113 | 4.72318e-08 | 3873 |
| 0.0121 | 1.0 | 1.7064 | 0.7113 | 4.721351e-08 | 3874 |
| 0.0151 | 0.9953 | 1.7068 | 0.7113 | 4.7195222e-08 | 3875 |
| 0.0112 | 1.0 | 1.7061 | 0.7113 | 4.7176936e-08 | 3876 |
| 0.0125 | 0.9976 | 1.7054 | 0.7113 | 4.7158654e-08 | 3877 |
| 0.0100 | 1.0 | 1.7056 | 0.7113 | 4.714037e-08 | 3878 |
| 0.0122 | 0.9976 | 1.7070 | 0.7113 | 4.7122093e-08 | 3879 |
| 0.0098 | 1.0 | 1.7059 | 0.7183 | 4.7103818e-08 | 3880 |
| 0.0097 | 1.0 | 1.7059 | 0.7183 | 4.7085543e-08 | 3881 |
| 0.0085 | 1.0 | 1.7071 | 0.7113 | 4.706727e-08 | 3882 |
| 0.0159 | 0.9953 | 1.7093 | 0.7113 | 4.7049003e-08 | 3883 |
| 0.0111 | 0.9976 | 1.7092 | 0.7113 | 4.7030735e-08 | 3884 |
| 0.0137 | 0.9976 | 1.7108 | 0.7113 | 4.701247e-08 | 3885 |
| 0.0111 | 0.9976 | 1.7123 | 0.7042 | 4.699421e-08 | 3886 |
| 0.0122 | 1.0 | 1.7122 | 0.7113 | 4.697595e-08 | 3887 |
| 0.0113 | 1.0 | 1.7117 | 0.7113 | 4.695769e-08 | 3888 |
| 0.0098 | 1.0 | 1.7116 | 0.7113 | 4.6939437e-08 | 3889 |
| 0.0101 | 1.0 | 1.7107 | 0.7113 | 4.6921183e-08 | 3890 |
| 0.0183 | 0.9929 | 1.7096 | 0.7113 | 4.6902933e-08 | 3891 |
| 0.0137 | 0.9976 | 1.7079 | 0.7113 | 4.6884686e-08 | 3892 |
| 0.0134 | 0.9953 | 1.7060 | 0.7113 | 4.686644e-08 | 3893 |
| 0.0084 | 1.0 | 1.7054 | 0.7113 | 4.6848196e-08 | 3894 |
| 0.0154 | 0.9953 | 1.7053 | 0.7113 | 4.6829957e-08 | 3895 |
| 0.0107 | 1.0 | 1.7050 | 0.7042 | 4.681172e-08 | 3896 |
| 0.0156 | 0.9976 | 1.7049 | 0.7113 | 4.6793485e-08 | 3897 |
| 0.0087 | 1.0 | 1.7044 | 0.7113 | 4.6775252e-08 | 3898 |
| 0.0134 | 0.9976 | 1.7051 | 0.7113 | 4.6757023e-08 | 3899 |
| 0.0108 | 0.9976 | 1.7078 | 0.7042 | 4.6738794e-08 | 3900 |
| 0.0103 | 0.9976 | 1.7076 | 0.7042 | 4.672057e-08 | 3901 |
| 0.0082 | 1.0 | 1.7081 | 0.7042 | 4.6702347e-08 | 3902 |
| 0.0118 | 0.9953 | 1.7087 | 0.7042 | 4.6684125e-08 | 3903 |
| 0.0249 | 0.9929 | 1.7105 | 0.7042 | 4.6665907e-08 | 3904 |
| 0.0126 | 1.0 | 1.7103 | 0.7042 | 4.6647692e-08 | 3905 |
| 0.0131 | 0.9976 | 1.7082 | 0.7113 | 4.6629477e-08 | 3906 |
| 0.0149 | 0.9953 | 1.7065 | 0.7113 | 4.6611266e-08 | 3907 |
| 0.0119 | 0.9953 | 1.7059 | 0.7113 | 4.659306e-08 | 3908 |
| 0.0159 | 0.9976 | 1.7072 | 0.7113 | 4.657485e-08 | 3909 |
| 0.0112 | 0.9976 | 1.7081 | 0.7113 | 4.6556647e-08 | 3910 |
| 0.0134 | 1.0 | 1.7092 | 0.7113 | 4.6538446e-08 | 3911 |
| 0.0097 | 0.9976 | 1.7090 | 0.7113 | 4.652025e-08 | 3912 |
| 0.0097 | 1.0 | 1.7080 | 0.7042 | 4.6502052e-08 | 3913 |
| 0.0171 | 0.9976 | 1.7108 | 0.7042 | 4.648386e-08 | 3914 |
| 0.0093 | 1.0 | 1.7123 | 0.7113 | 4.646567e-08 | 3915 |
| 0.0150 | 0.9953 | 1.7114 | 0.7113 | 4.644748e-08 | 3916 |
| 0.0164 | 0.9976 | 1.7105 | 0.7042 | 4.6429292e-08 | 3917 |
| 0.0113 | 0.9976 | 1.7104 | 0.7042 | 4.641111e-08 | 3918 |
| 0.0091 | 0.9976 | 1.7111 | 0.7042 | 4.6392927e-08 | 3919 |
| 0.0218 | 0.9953 | 1.7122 | 0.7042 | 4.6374748e-08 | 3920 |
| 0.0149 | 0.9953 | 1.7126 | 0.7042 | 4.6356572e-08 | 3921 |
| 0.0255 | 0.9953 | 1.7136 | 0.7113 | 4.6338396e-08 | 3922 |
| 0.0126 | 0.9976 | 1.7127 | 0.7042 | 4.6320224e-08 | 3923 |
| 0.0112 | 1.0 | 1.7121 | 0.7042 | 4.6302056e-08 | 3924 |
| 0.0104 | 0.9976 | 1.7126 | 0.7113 | 4.628389e-08 | 3925 |
| 0.0108 | 0.9976 | 1.7118 | 0.7113 | 4.6265725e-08 | 3926 |
| 0.0107 | 0.9976 | 1.7109 | 0.7113 | 4.6247564e-08 | 3927 |
| 0.0145 | 0.9976 | 1.7111 | 0.7042 | 4.6229406e-08 | 3928 |
| 0.0099 | 1.0 | 1.7105 | 0.7042 | 4.6211248e-08 | 3929 |
| 0.0096 | 1.0 | 1.7120 | 0.7113 | 4.6193094e-08 | 3930 |
| 0.0133 | 0.9976 | 1.7129 | 0.7042 | 4.6174943e-08 | 3931 |
| 0.0092 | 1.0 | 1.7156 | 0.7042 | 4.6156796e-08 | 3932 |
| 0.0127 | 0.9976 | 1.7163 | 0.7113 | 4.613865e-08 | 3933 |
| 0.0101 | 0.9976 | 1.7138 | 0.7042 | 4.6120505e-08 | 3934 |
| 0.0157 | 0.9953 | 1.7134 | 0.7042 | 4.6102365e-08 | 3935 |
| 0.0177 | 0.9953 | 1.7165 | 0.7042 | 4.6084224e-08 | 3936 |
| 0.0090 | 1.0 | 1.7178 | 0.7042 | 4.6066088e-08 | 3937 |
| 0.0099 | 1.0 | 1.7174 | 0.7042 | 4.6047955e-08 | 3938 |
| 0.0099 | 1.0 | 1.7168 | 0.7042 | 4.6029825e-08 | 3939 |
| 0.0239 | 0.9929 | 1.7137 | 0.7113 | 4.6011696e-08 | 3940 |
| 0.0085 | 1.0 | 1.7111 | 0.7042 | 4.599357e-08 | 3941 |
| 0.0100 | 1.0 | 1.7113 | 0.7042 | 4.5975447e-08 | 3942 |
| 0.0200 | 0.9929 | 1.7136 | 0.7042 | 4.5957325e-08 | 3943 |
| 0.0084 | 1.0 | 1.7144 | 0.7042 | 4.5939206e-08 | 3944 |
| 0.0096 | 1.0 | 1.7143 | 0.7042 | 4.592109e-08 | 3945 |
| 0.0075 | 1.0 | 1.7142 | 0.7042 | 4.590298e-08 | 3946 |
| 0.0137 | 0.9976 | 1.7135 | 0.7042 | 4.5884867e-08 | 3947 |
| 0.0075 | 1.0 | 1.7129 | 0.7042 | 4.586676e-08 | 3948 |
| 0.0217 | 0.9976 | 1.7115 | 0.7042 | 4.5848655e-08 | 3949 |
| 0.0159 | 0.9953 | 1.7112 | 0.7042 | 4.583055e-08 | 3950 |
| 0.0111 | 0.9976 | 1.7118 | 0.7042 | 4.581245e-08 | 3951 |
| 0.0096 | 0.9976 | 1.7121 | 0.7042 | 4.579435e-08 | 3952 |
| 0.0087 | 1.0 | 1.7134 | 0.7042 | 4.5776257e-08 | 3953 |
| 0.0113 | 1.0 | 1.7147 | 0.7042 | 4.5758163e-08 | 3954 |
| 0.0113 | 0.9976 | 1.7147 | 0.7042 | 4.5740073e-08 | 3955 |
| 0.0162 | 0.9953 | 1.7146 | 0.7042 | 4.5721986e-08 | 3956 |
| 0.0130 | 0.9976 | 1.7141 | 0.6972 | 4.5703903e-08 | 3957 |
| 0.0113 | 1.0 | 1.7147 | 0.6972 | 4.568582e-08 | 3958 |
| 0.0134 | 0.9976 | 1.7172 | 0.6972 | 4.566774e-08 | 3959 |
| 0.0094 | 1.0 | 1.7190 | 0.6972 | 4.5649664e-08 | 3960 |
| 0.0077 | 1.0 | 1.7183 | 0.6972 | 4.5631587e-08 | 3961 |
| 0.0100 | 1.0 | 1.7194 | 0.6972 | 4.5613515e-08 | 3962 |
| 0.0137 | 0.9976 | 1.7203 | 0.7042 | 4.5595446e-08 | 3963 |
| 0.0089 | 1.0 | 1.7210 | 0.6972 | 4.557738e-08 | 3964 |
| 0.0182 | 0.9976 | 1.7210 | 0.6972 | 4.5559315e-08 | 3965 |
| 0.0081 | 1.0 | 1.7207 | 0.6972 | 4.5541253e-08 | 3966 |
| 0.0096 | 1.0 | 1.7219 | 0.6972 | 4.5523194e-08 | 3967 |
| 0.0174 | 0.9953 | 1.7230 | 0.7042 | 4.550514e-08 | 3968 |
| 0.0103 | 1.0 | 1.7237 | 0.7042 | 4.5487084e-08 | 3969 |
| 0.0095 | 1.0 | 1.7239 | 0.7042 | 4.5469033e-08 | 3970 |
| 0.0140 | 0.9953 | 1.7220 | 0.7042 | 4.5450985e-08 | 3971 |
| 0.0101 | 1.0 | 1.7221 | 0.7042 | 4.543294e-08 | 3972 |
| 0.0120 | 1.0 | 1.7229 | 0.6972 | 4.5414897e-08 | 3973 |
| 0.0094 | 1.0 | 1.7232 | 0.6972 | 4.5396856e-08 | 3974 |
| 0.0161 | 0.9929 | 1.7230 | 0.7042 | 4.537882e-08 | 3975 |
| 0.0109 | 0.9976 | 1.7220 | 0.7042 | 4.5360782e-08 | 3976 |
| 0.0125 | 0.9976 | 1.7224 | 0.7042 | 4.5342748e-08 | 3977 |
| 0.0091 | 1.0 | 1.7219 | 0.7042 | 4.5324718e-08 | 3978 |
| 0.0113 | 1.0 | 1.7209 | 0.7042 | 4.530669e-08 | 3979 |
| 0.0149 | 0.9976 | 1.7196 | 0.7042 | 4.5288665e-08 | 3980 |
| 0.0193 | 0.9929 | 1.7173 | 0.7042 | 4.5270642e-08 | 3981 |
| 0.0111 | 0.9976 | 1.7190 | 0.7042 | 4.5252623e-08 | 3982 |
| 0.0113 | 1.0 | 1.7191 | 0.6972 | 4.5234607e-08 | 3983 |
| 0.0120 | 0.9976 | 1.7199 | 0.7113 | 4.521659e-08 | 3984 |
| 0.0121 | 1.0 | 1.7198 | 0.7113 | 4.519858e-08 | 3985 |
| 0.0118 | 0.9976 | 1.7202 | 0.7113 | 4.518057e-08 | 3986 |
| 0.0088 | 1.0 | 1.7218 | 0.7042 | 4.5162565e-08 | 3987 |
| 0.0086 | 1.0 | 1.7235 | 0.6972 | 4.514456e-08 | 3988 |
| 0.0128 | 0.9976 | 1.7236 | 0.7042 | 4.512656e-08 | 3989 |
| 0.0115 | 1.0 | 1.7235 | 0.6972 | 4.510856e-08 | 3990 |
| 0.0105 | 1.0 | 1.7234 | 0.6972 | 4.5090566e-08 | 3991 |
| 0.0108 | 0.9976 | 1.7256 | 0.7042 | 4.507257e-08 | 3992 |
| 0.0138 | 0.9976 | 1.7277 | 0.7042 | 4.505458e-08 | 3993 |
| 0.0100 | 1.0 | 1.7270 | 0.7042 | 4.5036593e-08 | 3994 |
| 0.0129 | 0.9953 | 1.7259 | 0.6972 | 4.501861e-08 | 3995 |
| 0.0118 | 1.0 | 1.7244 | 0.6972 | 4.5000625e-08 | 3996 |
| 0.0078 | 1.0 | 1.7237 | 0.6972 | 4.4982645e-08 | 3997 |
| 0.0177 | 0.9953 | 1.7234 | 0.6972 | 4.496467e-08 | 3998 |
| 0.0102 | 1.0 | 1.7238 | 0.6972 | 4.4946695e-08 | 3999 |
### Framework versions
- Transformers 4.29.0.dev0
- TensorFlow 2.9.1
- Datasets 2.8.0
- Tokenizers 0.13.2
| 385,395 | [
[
-0.052947998046875,
-0.034271240234375,
0.027374267578125,
0.008148193359375,
0.0012674331665039062,
0.00437164306640625,
0.004055023193359375,
-0.000926971435546875,
0.055328369140625,
0.01861572265625,
-0.048797607421875,
-0.04443359375,
-0.03765869140625,
... |
wallacenpj/q05_kaggle_distilbert_inverted_weights | 2023-05-08T22:50:21.000Z | [
"transformers",
"pytorch",
"tensorboard",
"distilbert",
"text-classification",
"generated_from_trainer",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | text-classification | wallacenpj | null | null | wallacenpj/q05_kaggle_distilbert_inverted_weights | 0 | 2 | transformers | 2023-05-08T21:50:29 | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
- f1
- recall
- precision
model-index:
- name: q05_kaggle_distilbert_inverted_weights
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# q05_kaggle_distilbert_inverted_weights
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0977
- Accuracy: 0.8581
- F1: 0.3892
- Recall: 0.4038
- Precision: 0.3758
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 0
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
- label_smoothing_factor: 0.1
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Recall | Precision |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|:------:|:---------:|
| 0.2883 | 0.86 | 50 | 0.1095 | 0.8041 | 0.3035 | 0.3072 | 0.3137 |
| 0.124 | 1.72 | 100 | 0.1317 | 0.8497 | 0.3806 | 0.3933 | 0.3687 |
| 0.1062 | 2.59 | 150 | 0.1585 | 0.8361 | 0.3900 | 0.4294 | 0.3674 |
| 0.0888 | 3.45 | 200 | 0.1000 | 0.8328 | 0.3500 | 0.3531 | 0.3505 |
| 0.0789 | 4.31 | 250 | 0.1004 | 0.8395 | 0.3555 | 0.3573 | 0.3587 |
| 0.0649 | 5.17 | 300 | 0.0977 | 0.8581 | 0.3892 | 0.4038 | 0.3758 |
| 0.0526 | 6.03 | 350 | 0.1649 | 0.8615 | 0.3985 | 0.4222 | 0.3794 |
| 0.0384 | 6.9 | 400 | 0.1455 | 0.8733 | 0.4581 | 0.4546 | 0.5005 |
| 0.0351 | 7.76 | 450 | 0.1883 | 0.8767 | 0.5082 | 0.4999 | 0.5376 |
| 0.0344 | 8.62 | 500 | 0.2364 | 0.8733 | 0.5062 | 0.5104 | 0.5179 |
| 0.024 | 9.48 | 550 | 0.1847 | 0.8767 | 0.5483 | 0.5109 | 0.6784 |
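As a sanity check, the F1 in the best eval row above (step 300, loss 0.0977) is consistent, up to rounding, with the harmonic mean of the reported precision and recall. A quick pure-Python check (illustrative only; the exact averaging used by the trainer's metric is not stated in this card):

```python
def f1_from_precision_recall(precision, recall):
    # F1 is the harmonic mean of precision and recall
    return 2 * precision * recall / (precision + recall)

# precision and recall from the step-300 eval row above
f1 = f1_from_precision_recall(0.3758, 0.4038)  # ~0.3893, matching the reported 0.3892
```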
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 2,498 | [
[
-0.035186767578125,
-0.042083740234375,
0.01171875,
0.006557464599609375,
-0.010772705078125,
-0.006641387939453125,
-0.00185394287109375,
-0.00780487060546875,
0.0238037109375,
0.016632080078125,
-0.049224853515625,
-0.0440673828125,
-0.06390380859375,
-0.0... |
501Good/distilbert-base-cased-finetuned-tweeteval | 2023-05-10T08:49:50.000Z | [
"transformers",
"pytorch",
"distilbert",
"text-classification",
"generated_from_trainer",
"en",
"dataset:tweet_eval",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | 501Good | null | null | 501Good/distilbert-base-cased-finetuned-tweeteval | 0 | 2 | transformers | 2023-05-08T21:58:52 | ---
language:
- en
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- tweet_eval
metrics:
- accuracy
model-index:
- name: distilbert-base-cased-finetuned-tweeteval
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: tweet_eval
type: tweet_eval
config: emotion
split: validation
args: emotion
metrics:
- name: Accuracy
type: accuracy
value: 0.7887700534759359
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-cased-finetuned-tweeteval
This model is a fine-tuned version of [distilbert-base-cased](https://huggingface.co/distilbert-base-cased) on the tweet_eval dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7720
- Accuracy: 0.7888
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 204 | 0.6867 | 0.7647 |
| No log | 2.0 | 408 | 0.6318 | 0.7968 |
| 0.6397 | 3.0 | 612 | 0.6931 | 0.7834 |
| 0.6397 | 4.0 | 816 | 0.7631 | 0.7754 |
| 0.2064 | 5.0 | 1020 | 0.7720 | 0.7888 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,957 | [
[
-0.021392822265625,
-0.050140380859375,
0.01352691650390625,
0.0166168212890625,
-0.025390625,
-0.00884246826171875,
-0.0080108642578125,
-0.001972198486328125,
0.007076263427734375,
0.01532745361328125,
-0.051177978515625,
-0.056976318359375,
-0.061309814453125... |
chaninder/trashtacks-model-v3 | 2023-05-09T01:00:27.000Z | [
"keras",
"region:us"
] | null | chaninder | null | null | chaninder/trashtacks-model-v3 | 0 | 2 | keras | 2023-05-09T00:59:53 | ---
library_name: keras
---
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
| Hyperparameters | Value |
| :-- | :-- |
| name | Adam |
| learning_rate | 0.0010000000474974513 |
| decay | 0.0 |
| beta_1 | 0.8999999761581421 |
| beta_2 | 0.9990000128746033 |
| epsilon | 1e-07 |
| amsgrad | False |
| training_precision | float32 |
## Model Plot
<details>
<summary>View Model Plot</summary>

</details> | 658 | [
[
-0.034637451171875,
-0.0401611328125,
0.0255584716796875,
0.00649261474609375,
-0.041046142578125,
-0.0197601318359375,
0.01187896728515625,
-0.0110015869140625,
0.0156707763671875,
0.033538818359375,
-0.035552978515625,
-0.053741455078125,
-0.0428466796875,
... |
xinyixiuxiu/albert-xxlarge-v2-SST2-incremental_pre_training-epoch1-1 | 2023-05-09T03:52:17.000Z | [
"transformers",
"tf",
"albert",
"text-classification",
"generated_from_keras_callback",
"endpoints_compatible",
"region:us"
] | text-classification | xinyixiuxiu | null | null | xinyixiuxiu/albert-xxlarge-v2-SST2-incremental_pre_training-epoch1-1 | 0 | 2 | transformers | 2023-05-09T01:21:58 | ---
tags:
- generated_from_keras_callback
model-index:
- name: xinyixiuxiu/albert-xxlarge-v2-SST2-incremental_pre_training-epoch1-1
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# xinyixiuxiu/albert-xxlarge-v2-SST2-incremental_pre_training-epoch1-1
This model was trained from scratch on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.1984
- Train Accuracy: 0.9228
- Validation Loss: 0.1250
- Validation Accuracy: 0.9541
- Epoch: 0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'Adam', 'learning_rate': 3e-06, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Validation Loss | Validation Accuracy | Epoch |
|:----------:|:--------------:|:---------------:|:-------------------:|:-----:|
| 0.1984 | 0.9228 | 0.1250 | 0.9541 | 0 |
### Framework versions
- Transformers 4.28.1
- TensorFlow 2.7.0
- Datasets 2.10.1
- Tokenizers 0.12.1
| 1,441 | [
[
-0.0311431884765625,
-0.033294677734375,
0.0272979736328125,
0.0107269287109375,
-0.036407470703125,
-0.0284881591796875,
-0.0038814544677734375,
-0.02606201171875,
0.00611114501953125,
0.01519012451171875,
-0.05377197265625,
-0.039337158203125,
-0.0559692382812... |
Arm627/NewsRelevanceFinetunedDistilbertBase | 2023-05-09T02:39:41.000Z | [
"transformers",
"pytorch",
"tensorboard",
"distilbert",
"text-classification",
"generated_from_trainer",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | text-classification | Arm627 | null | null | Arm627/NewsRelevanceFinetunedDistilbertBase | 0 | 2 | transformers | 2023-05-09T02:30:15 | ---
license: apache-2.0
tags:
- generated_from_trainer
model-index:
- name: NewsRelevanceFinetunedDistilbertBase
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# NewsRelevanceFinetunedDistilbertBase
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 15
### Training results
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,082 | [
[
-0.034759521484375,
-0.051849365234375,
0.0139923095703125,
0.017578125,
-0.03106689453125,
-0.0183563232421875,
-0.00821685791015625,
-0.01210784912109375,
0.007381439208984375,
0.0251922607421875,
-0.05401611328125,
-0.04638671875,
-0.056884765625,
-0.0026... |
Arm627/NewsRelevanceFinetunedDistilbertBaseBinary | 2023-05-09T03:44:23.000Z | [
"transformers",
"pytorch",
"tensorboard",
"distilbert",
"text-classification",
"generated_from_trainer",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | text-classification | Arm627 | null | null | Arm627/NewsRelevanceFinetunedDistilbertBaseBinary | 0 | 2 | transformers | 2023-05-09T03:35:27 | ---
license: apache-2.0
tags:
- generated_from_trainer
model-index:
- name: NewsRelevanceFinetunedDistilbertBaseBinary
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# NewsRelevanceFinetunedDistilbertBaseBinary
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 15
### Training results
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,094 | [
[
-0.03265380859375,
-0.052337646484375,
0.0138702392578125,
0.016387939453125,
-0.03173828125,
-0.0174560546875,
-0.007110595703125,
-0.0136871337890625,
0.007198333740234375,
0.0229644775390625,
-0.05291748046875,
-0.045501708984375,
-0.058746337890625,
-0.0... |
xinyixiuxiu/albert-xxlarge-v2-SST2-incremental_pre_training-epoch1-2 | 2023-05-09T06:37:59.000Z | [
"transformers",
"tf",
"albert",
"text-classification",
"generated_from_keras_callback",
"endpoints_compatible",
"region:us"
] | text-classification | xinyixiuxiu | null | null | xinyixiuxiu/albert-xxlarge-v2-SST2-incremental_pre_training-epoch1-2 | 0 | 2 | transformers | 2023-05-09T06:02:09 | ---
tags:
- generated_from_keras_callback
model-index:
- name: xinyixiuxiu/albert-xxlarge-v2-SST2-incremental_pre_training-epoch1-2
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# xinyixiuxiu/albert-xxlarge-v2-SST2-incremental_pre_training-epoch1-2
This model was trained from scratch on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.1049
- Train Accuracy: 0.9641
- Validation Loss: 0.1328
- Validation Accuracy: 0.9564
- Epoch: 0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'Adam', 'learning_rate': 3e-06, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Validation Loss | Validation Accuracy | Epoch |
|:----------:|:--------------:|:---------------:|:-------------------:|:-----:|
| 0.1049 | 0.9641 | 0.1328 | 0.9564 | 0 |
### Framework versions
- Transformers 4.28.1
- TensorFlow 2.7.0
- Datasets 2.10.1
- Tokenizers 0.12.1
| 1,441 | [
[
-0.029693603515625,
-0.03253173828125,
0.027587890625,
0.01169586181640625,
-0.036163330078125,
-0.02960205078125,
-0.004993438720703125,
-0.0269012451171875,
0.00542449951171875,
0.0152587890625,
-0.053070068359375,
-0.039825439453125,
-0.05572509765625,
-0... |
Cynthiaiii4/Text_classification_model_blu | 2023-05-10T05:58:38.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | text-classification | Cynthiaiii4 | null | null | Cynthiaiii4/Text_classification_model_blu | 0 | 2 | transformers | 2023-05-09T06:41:39 | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: Text_classification_model_blu
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Text_classification_model_blu
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4720
- Accuracy: 0.78
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 100 | 0.4871 | 0.7675 |
| No log | 2.0 | 200 | 0.4720 | 0.78 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,410 | [
[
-0.025848388671875,
-0.03521728515625,
0.004253387451171875,
0.016693115234375,
-0.0318603515625,
-0.03143310546875,
-0.0111846923828125,
-0.0321044921875,
0.0020847320556640625,
0.020751953125,
-0.051422119140625,
-0.05059814453125,
-0.04132080078125,
-0.02... |
xinyixiuxiu/albert-xxlarge-v2-SST2-incremental_pre_training-epoch1-3 | 2023-05-09T07:25:13.000Z | [
"transformers",
"tf",
"albert",
"text-classification",
"generated_from_keras_callback",
"endpoints_compatible",
"region:us"
] | text-classification | xinyixiuxiu | null | null | xinyixiuxiu/albert-xxlarge-v2-SST2-incremental_pre_training-epoch1-3 | 0 | 2 | transformers | 2023-05-09T06:49:25 | ---
tags:
- generated_from_keras_callback
model-index:
- name: xinyixiuxiu/albert-xxlarge-v2-SST2-incremental_pre_training-epoch1-3
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# xinyixiuxiu/albert-xxlarge-v2-SST2-incremental_pre_training-epoch1-3
This model was trained from scratch on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0713
- Train Accuracy: 0.9771
- Validation Loss: 0.1705
- Validation Accuracy: 0.9541
- Epoch: 0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'Adam', 'learning_rate': 3e-06, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Validation Loss | Validation Accuracy | Epoch |
|:----------:|:--------------:|:---------------:|:-------------------:|:-----:|
| 0.0713 | 0.9771 | 0.1705 | 0.9541 | 0 |
### Framework versions
- Transformers 4.28.1
- TensorFlow 2.7.0
- Datasets 2.10.1
- Tokenizers 0.12.1
| 1,441 | [
[
-0.030120849609375,
-0.032012939453125,
0.0279541015625,
0.01152801513671875,
-0.036407470703125,
-0.0295867919921875,
-0.0035610198974609375,
-0.02667236328125,
0.00592803955078125,
0.0157318115234375,
-0.053192138671875,
-0.039276123046875,
-0.055450439453125,... |
zxy1231/tm_simcse_zh_model | 2023-05-09T06:59:09.000Z | [
"sentence-transformers",
"pytorch",
"bert",
"feature-extraction",
"sentence-similarity",
"transformers",
"endpoints_compatible",
"region:us"
] | sentence-similarity | zxy1231 | null | null | zxy1231/tm_simcse_zh_model | 0 | 2 | sentence-transformers | 2023-05-09T06:50:51 | ---
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
- transformers
---
# {MODEL_NAME}
This is a [sentence-transformers](https://www.SBERT.net) model: it maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for tasks like clustering or semantic search.
<!--- Describe your model here -->
## Usage (Sentence-Transformers)
Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed:
```
pip install -U sentence-transformers
```
Then you can use the model like this:
```python
from sentence_transformers import SentenceTransformer
sentences = ["This is an example sentence", "Each sentence is converted"]
model = SentenceTransformer('{MODEL_NAME}')
embeddings = model.encode(sentences)
print(embeddings)
```
## Usage (HuggingFace Transformers)
Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: First, you pass your input through the transformer model, then you have to apply the right pooling operation on top of the contextualized word embeddings.
```python
from transformers import AutoTokenizer, AutoModel
import torch
#Mean Pooling - Take attention mask into account for correct averaging
def mean_pooling(model_output, attention_mask):
token_embeddings = model_output[0] #First element of model_output contains all token embeddings
input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)
# Sentences we want sentence embeddings for
sentences = ['This is an example sentence', 'Each sentence is converted']
# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('{MODEL_NAME}')
model = AutoModel.from_pretrained('{MODEL_NAME}')
# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
# Compute token embeddings
with torch.no_grad():
model_output = model(**encoded_input)
# Perform pooling. In this case, mean pooling.
sentence_embeddings = mean_pooling(model_output, encoded_input['attention_mask'])
print("Sentence embeddings:")
print(sentence_embeddings)
```
## Evaluation Results
<!--- Describe how your model was evaluated -->
For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name={MODEL_NAME})
## Training
The model was trained with the parameters:
**DataLoader**:
`torch.utils.data.dataloader.DataLoader` of length 313 with parameters:
```
{'batch_size': 32, 'sampler': 'torch.utils.data.sampler.RandomSampler', 'batch_sampler': 'torch.utils.data.sampler.BatchSampler'}
```
**Loss**:
`sentence_transformers.losses.MultipleNegativesRankingLoss.MultipleNegativesRankingLoss` with parameters:
```
{'scale': 20.0, 'similarity_fct': 'cos_sim'}
```
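`MultipleNegativesRankingLoss` treats the other positives in a batch as negatives: each anchor is scored against every positive with scaled cosine similarity, and cross-entropy pushes the diagonal (true pair) to score highest. A minimal pure-Python sketch of the scoring step (illustrative; not the library's actual implementation):

```python
import math

def cos_sim(a, b):
    # cosine similarity between two vectors
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def mnr_logits(anchors, positives, scale=20.0):
    # row i holds anchor i's scaled similarity to every positive in the
    # batch; the cross-entropy target for row i is column i (its true pair)
    return [[scale * cos_sim(a, p) for p in positives] for a in anchors]
```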
Parameters of the fit()-Method:
```
{
"epochs": 3,
"evaluation_steps": 0,
"evaluator": "NoneType",
"max_grad_norm": 1,
"optimizer_class": "<class 'transformers.optimization.AdamW'>",
"optimizer_params": {
"lr": 2e-05
},
"scheduler": "WarmupLinear",
"steps_per_epoch": null,
"warmup_steps": 500,
"weight_decay": 0.01
}
```
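The `WarmupLinear` schedule above ramps the learning rate up from 0 over the first 500 steps, then decays it linearly back to 0 over the remaining steps. A sketch of the lr multiplier (illustrative; sentence-transformers delegates to the equivalent Hugging Face scheduler, and with 313 batches × 3 epochs ≈ 939 total steps is an assumption from the DataLoader length above):

```python
def warmup_linear_factor(step, warmup_steps, total_steps):
    # lr multiplier: linear warmup from 0, then linear decay to 0
    if step < warmup_steps:
        return step / max(1, warmup_steps)
    return max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))
```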
## Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 64, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False})
)
```
## Citing & Authors
<!--- Describe where people can find more information --> | 3,795 | [
[
-0.0223846435546875,
-0.061248779296875,
0.0202484130859375,
0.02392578125,
-0.0197601318359375,
-0.032470703125,
-0.0195770263671875,
0.0030231475830078125,
0.0161285400390625,
0.0272064208984375,
-0.050628662109375,
-0.04541015625,
-0.051513671875,
-0.0011... |
Cynthiaiii4/Text_classification_model_bbc | 2023-05-09T06:57:06.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | text-classification | Cynthiaiii4 | null | null | Cynthiaiii4/Text_classification_model_bbc | 0 | 2 | transformers | 2023-05-09T06:52:56 | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: Text_classification_model_bbc
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Text_classification_model_bbc
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6851
- Accuracy: 0.78
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 100 | 0.6159 | 0.795 |
| No log | 2.0 | 200 | 0.6851 | 0.78 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,410 | [
[
-0.031707763671875,
-0.0321044921875,
0.006229400634765625,
0.00860595703125,
-0.03192138671875,
-0.032928466796875,
-0.0140838623046875,
-0.0281219482421875,
0.0028781890869140625,
0.0238037109375,
-0.045196533203125,
-0.054656982421875,
-0.051666259765625,
... |
Intel/bert-large-uncased-rte-int8-dynamic | 2023-05-10T09:35:58.000Z | [
"transformers",
"pytorch",
"bert",
"text-classification",
"rte",
"glue",
"torchdistill",
"nlp",
"int8",
"neural-compressor",
"Intel® Neural Compressor",
"text-classfication",
"PostTrainingDynamic",
"en",
"dataset:rte",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | text-classification | Intel | null | null | Intel/bert-large-uncased-rte-int8-dynamic | 0 | 2 | transformers | 2023-05-09T07:57:28 | ---
language: en
tags:
- bert
- rte
- glue
- torchdistill
- nlp
- int8
- neural-compressor
- Intel® Neural Compressor
- text-classfication
- PostTrainingDynamic
license: apache-2.0
datasets:
- rte
metrics:
- f1
---
# INT8 bert-large-uncased-rte-int8-dynamic
## Post-training dynamic quantization
### PyTorch
This is an INT8 PyTorch model quantized with [Intel® Neural Compressor](https://github.com/intel/neural-compressor).
The original fp32 model comes from the fine-tuned model [yoshitomo-matsubara/bert-large-uncased-rte](https://huggingface.co/yoshitomo-matsubara/bert-large-uncased-rte).
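Dynamic quantization stores the weights as int8 alongside a floating-point scale (activations are quantized on the fly at inference), which is where the roughly 2x size reduction in the table below comes from. A rough sketch of a symmetric per-tensor int8 mapping (illustrative only — not Neural Compressor's actual kernels):

```python
def quantize_int8(weights):
    # symmetric per-tensor quantization: choose the scale so the largest
    # absolute weight maps to 127, then round and clamp to the int8 range
    scale = max(abs(w) for w in weights) / 127.0
    q = [max(-128, min(127, round(w / scale))) for w in weights]
    return q, scale

def dequantize(q, scale):
    # recover approximate fp32 values; error is bounded by the scale
    return [v * scale for v in q]

q, scale = quantize_int8([1.0, -0.4, 0.25])  # q == [127, -51, 32]
```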
#### Test result
| |INT8|FP32|
|---|:---:|:---:|
| **Accuracy (eval-f1)** |0.7076|0.7401|
| **Model size (MB)** |766|1349|
#### Load with Intel® Neural Compressor:
```python
from optimum.intel.neural_compressor.quantization import IncQuantizedModelForSequenceClassification
int8_model = IncQuantizedModelForSequenceClassification.from_pretrained(
"Intel/bert-large-uncased-rte-int8-dynamic",
)
```
| 1,011 | [
[
-0.021636962890625,
-0.04327392578125,
0.0122833251953125,
0.0160980224609375,
-0.02044677734375,
0.021209716796875,
-0.0299072265625,
-0.00803375244140625,
-0.0007824897766113281,
0.0032958984375,
-0.0268707275390625,
-0.0160980224609375,
-0.047027587890625,
... |
xinyixiuxiu/albert-xxlarge-v2-SST2-incremental_pre_training-epoch1-4 | 2023-05-09T08:36:38.000Z | [
"transformers",
"tf",
"albert",
"text-classification",
"generated_from_keras_callback",
"endpoints_compatible",
"region:us"
] | text-classification | xinyixiuxiu | null | null | xinyixiuxiu/albert-xxlarge-v2-SST2-incremental_pre_training-epoch1-4 | 0 | 2 | transformers | 2023-05-09T08:00:50 | ---
tags:
- generated_from_keras_callback
model-index:
- name: xinyixiuxiu/albert-xxlarge-v2-SST2-incremental_pre_training-epoch1-4
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# xinyixiuxiu/albert-xxlarge-v2-SST2-incremental_pre_training-epoch1-4
This model was trained from scratch on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0490
- Train Accuracy: 0.9845
- Validation Loss: 0.1365
- Validation Accuracy: 0.9599
- Epoch: 0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'Adam', 'learning_rate': 3e-06, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Validation Loss | Validation Accuracy | Epoch |
|:----------:|:--------------:|:---------------:|:-------------------:|:-----:|
| 0.0490 | 0.9845 | 0.1365 | 0.9599 | 0 |
### Framework versions
- Transformers 4.28.1
- TensorFlow 2.7.0
- Datasets 2.10.1
- Tokenizers 0.12.1
| 1,441 | [
[
-0.03009033203125,
-0.0323486328125,
0.0270538330078125,
0.010498046875,
-0.03619384765625,
-0.0298309326171875,
-0.00446319580078125,
-0.027435302734375,
0.006134033203125,
0.01517486572265625,
-0.053924560546875,
-0.039794921875,
-0.05615234375,
-0.0195007... |
xqchq/test-trainer | 2023-05-11T09:24:21.000Z | [
"transformers",
"pytorch",
"tensorboard",
"onnx",
"bert",
"text-classification",
"generated_from_trainer",
"zh",
"dataset:seamew/THUCNewsText",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | text-classification | xqchq | null | null | xqchq/test-trainer | 0 | 2 | transformers | 2023-05-09T08:30:10 | ---
license: apache-2.0
tags:
- generated_from_trainer
model-index:
- name: test-trainer
results: []
datasets:
- seamew/THUCNewsText
language:
- zh
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# test-trainer1
This model is a fine-tuned version of [hfl/minirbt-h256](https://huggingface.co/hfl/minirbt-h256) on the seamew/THUCNewsText dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
### Training results
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3 | 1,076 | [
[
-0.041595458984375,
-0.049163818359375,
0.004482269287109375,
0.01247406005859375,
-0.034271240234375,
-0.031158447265625,
-0.005352020263671875,
-0.019256591796875,
0.01165008544921875,
0.0146331787109375,
-0.057281494140625,
-0.020477294921875,
-0.038848876953... |
Purus15987/English_Telugu_Translation | 2023-05-10T05:09:15.000Z | [
"transformers",
"pytorch",
"tensorboard",
"t5",
"text2text-generation",
"generated_from_trainer",
"dataset:samanantar",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | text2text-generation | Purus15987 | null | null | Purus15987/English_Telugu_Translation | 0 | 2 | transformers | 2023-05-09T08:48:48 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- samanantar
model-index:
- name: English_Telugu_Translation
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# English_Telugu_Translation
This model is a fine-tuned version of [t5-base](https://huggingface.co/t5-base) on the samanantar dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
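Since this is a `t5-base` fine-tune, inference presumably follows T5's text-to-text task-prefix convention. A minimal sketch of building the model input — the exact prefix used during training is not stated in the card, so treat it as an assumption:

```python
# Hypothetical input builder — the task prefix is an assumption, not from the card.
def make_input(sentence: str) -> str:
    # t5-base conditions generation on a natural-language task prefix
    return f"translate English to Telugu: {sentence}"

make_input("How are you?")
# → "translate English to Telugu: How are you?"
```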
| 1,032 | [
[
-0.01511383056640625,
-0.0305328369140625,
0.01024627685546875,
0.01611328125,
-0.05535888671875,
-0.0196533203125,
-0.00635528564453125,
-0.01445770263671875,
0.0159759521484375,
0.0227508544921875,
-0.050048828125,
-0.049957275390625,
-0.0540771484375,
0.0... |
directtt/wine-reviews-distilbert | 2023-05-09T10:46:50.000Z | [
"transformers",
"tf",
"distilbert",
"text-classification",
"generated_from_keras_callback",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | text-classification | directtt | null | null | directtt/wine-reviews-distilbert | 0 | 2 | transformers | 2023-05-09T09:11:41 | ---
license: apache-2.0
tags:
- generated_from_keras_callback
model-index:
- name: wine-reviews-distilbert
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# wine-reviews-distilbert
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.3834
- Train Acc: 0.8375
- Validation Loss: 0.5538
- Validation Acc: 0.7741
- Epoch: 2
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'Adam', 'weight_decay': None, 'clipnorm': None, 'global_clipnorm': None, 'clipvalue': None, 'use_ema': False, 'ema_momentum': 0.99, 'ema_overwrite_frequency': None, 'jit_compile': True, 'is_legacy_optimizer': False, 'learning_rate': {'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 2e-05, 'decay_steps': 24455, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}
- training_precision: float32
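The `PolynomialDecay` config above (power 1.0, cycle off) is simply a linear ramp from 2e-05 down to 0 over 24455 steps. A small sketch of the schedule it describes:

```python
# Linear decay implied by the PolynomialDecay config above (power=1.0, cycle=False).
def lr_at(step, initial=2e-05, end=0.0, decay_steps=24455, power=1.0):
    step = min(step, decay_steps)  # the schedule holds at end_learning_rate afterwards
    return (initial - end) * (1.0 - step / decay_steps) ** power + end

lr_at(0)      # → 2e-05
lr_at(24455)  # → 0.0
```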
### Training results
| Train Loss | Train Acc | Validation Loss | Validation Acc | Epoch |
|:----------:|:---------:|:---------------:|:--------------:|:-----:|
| 0.6005 | 0.7381 | 0.5342 | 0.7661 | 0 |
| 0.4822 | 0.7915 | 0.5570 | 0.7612 | 1 |
| 0.3834 | 0.8375 | 0.5538 | 0.7741 | 2 |
### Framework versions
- Transformers 4.28.1
- TensorFlow 2.11.0
- Datasets 2.1.0
- Tokenizers 0.13.3
| 1,905 | [
[
-0.041839599609375,
-0.042572021484375,
0.0208282470703125,
0.004589080810546875,
-0.03216552734375,
-0.01812744140625,
-0.00905609130859375,
-0.01268768310546875,
0.0092010498046875,
0.007297515869140625,
-0.05023193359375,
-0.046234130859375,
-0.05828857421875... |
alexandrualexandru/my-final-v1-text-to-sparql-combined-dataset-t5-base-2023-05-09_09-13 | 2023-05-09T12:33:17.000Z | [
"transformers",
"pytorch",
"tensorboard",
"t5",
"text2text-generation",
"generated_from_trainer",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | text2text-generation | alexandrualexandru | null | null | alexandrualexandru/my-final-v1-text-to-sparql-combined-dataset-t5-base-2023-05-09_09-13 | 0 | 2 | transformers | 2023-05-09T09:17:02 | ---
tags:
- generated_from_trainer
model-index:
- name: my-final-v1-text-to-sparql-combined-dataset-t5-base-2023-05-09_09-13
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# my-final-v1-text-to-sparql-combined-dataset-t5-base-2023-05-09_09-13
This model is a fine-tuned version of [t5-base](https://huggingface.co/t5-base) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3456
- Gen Len: 19.0
- Bertscorer-p: 0.5013
- Bertscorer-r: 0.1137
- Bertscorer-f1: 0.3000
- Sacrebleu-score: 6.1003
- Sacrebleu-precisions: [77.97754754552538, 64.74142628270293, 53.3199157675034, 47.63691495511611]
- Bleu-bp: 0.1019
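The reported SacreBLEU score is the brevity penalty times the geometric mean of the four n-gram precisions; recomputing it from the numbers above (using the rounded bp, so the result is only approximate):

```python
import math

precisions = [77.97754754552538, 64.74142628270293, 53.3199157675034, 47.63691495511611]
bp = 0.1019  # rounded brevity penalty reported above

geo_mean = math.exp(sum(math.log(p / 100.0) for p in precisions) / len(precisions))
score = 100.0 * bp * geo_mean
# score ≈ 6.10, matching the reported Sacrebleu-score of 6.1003
```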
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss | Gen Len | Bertscorer-p | Bertscorer-r | Bertscorer-f1 | Sacrebleu-score | Sacrebleu-precisions | Bleu-bp |
|:-------------:|:-----:|:-----:|:---------------:|:-------:|:------------:|:------------:|:-------------:|:---------------:|:----------------------------------------------------------------------------:|:-------:|
| 0.4215 | 1.0 | 7822 | 0.3919 | 19.0 | 0.4997 | 0.1122 | 0.2984 | 5.8699 | [77.35323282257656, 63.16682990532158, 51.41608735111668, 45.63668646835748] | 0.1009 |
| 0.3639 | 2.0 | 15644 | 0.3456 | 19.0 | 0.5013 | 0.1137 | 0.3000 | 6.1003 | [77.97754754552538, 64.74142628270293, 53.3199157675034, 47.63691495511611] | 0.1019 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 2,252 | [
[
-0.046905517578125,
-0.0382080078125,
0.00930023193359375,
0.0146026611328125,
-0.016082763671875,
-0.0255584716796875,
-0.0018062591552734375,
-0.0165863037109375,
0.01910400390625,
0.033111572265625,
-0.046844482421875,
-0.050628662109375,
-0.050628662109375,
... |
Neutralzz/BiLLa-7B-SFT | 2023-05-12T15:18:24.000Z | [
"transformers",
"pytorch",
"llama",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | null | Neutralzz | null | null | Neutralzz/BiLLa-7B-SFT | 65 | 2 | transformers | 2023-05-09T14:26:02 | ---
license: apache-2.0
---
# BiLLa: A Bilingual LLaMA with Enhanced Reasoning Ability
BiLLa is an open-source reasoning-enhanced bilingual LLaMA model. The main features are:
- Greatly improves Chinese language-modeling ability while minimizing damage to LLaMA's original English ability;
- During training, additional task data is mixed in, together with ChatGPT-generated analyses;
- Full-parameter optimization for better performance.
Github: https://github.com/Neutralzz/BiLLa
<b>Note</b>: Due to LLaMA's license, the model weights in this hub cannot be used directly.
The `word embedding` weights released here are the sum of the trained model's weights and the original LLaMA weights,
so that developers with access to the original LLaMA model can convert the weights released in this hub into a usable model.
## Usage
First, you can revert the model weights by [this script](https://github.com/Neutralzz/BiLLa/blob/main/embedding_convert.py):
```shell
python3 embedding_convert.py \
--model_dir /path_to_BiLLa/BiLLa-7B-SFT \
--meta_llama_pth_file /path_to_LLaMA/llama-7b/consolidated.00.pth
```
Then, you can run this model as follows:
```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM
model_path = "/path_to_BiLLa/BiLLa-7B-SFT"
tokenizer = AutoTokenizer.from_pretrained(model_path, use_fast=False)
model = AutoModelForCausalLM.from_pretrained(model_path, low_cpu_mem_usage=True, torch_dtype=torch.float16).cuda()
prompt = "Human: Write a Python function that checks if a given number is even or odd.\nAssistant: "
input_ids = tokenizer([prompt]).input_ids
output_ids = model.generate(
torch.as_tensor(input_ids).cuda(),
do_sample=True,
temperature=0.7,
max_new_tokens=1024
)
output_ids = output_ids[0][len(input_ids[0]):]
outputs = tokenizer.decode(output_ids, skip_special_tokens=True).strip()
print(outputs)
```
### Input Format
Different from [BiLLa-7B-LLM](https://huggingface.co/Neutralzz/BiLLa-7B-LLM), the model input of `BiLLa-7B-SFT` should be formatted as follows:
```
Human: [Your question]
Assistant:
```
Note that <b>a space</b> follows `Assistant:`.
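A tiny helper for the format above — note the trailing space after `Assistant:`, which is easy to drop by accident:

```python
def build_prompt(question: str) -> str:
    # BiLLa-7B-SFT expects "Human: <question>\nAssistant: " with a trailing space.
    return f"Human: {question}\nAssistant: "

build_prompt("What is 2+2?")
# → "Human: What is 2+2?\nAssistant: "
```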
| 2,184 | [
[
-0.01727294921875,
-0.0625,
0.0202178955078125,
0.033233642578125,
-0.0333251953125,
-0.002613067626953125,
0.00970458984375,
-0.037261962890625,
0.0289764404296875,
0.03759765625,
-0.01108551025390625,
-0.024505615234375,
-0.0443115234375,
0.020858764648437... |
xinyixiuxiu/albert-xxlarge-v2-SST2-incremental_pre_training-epoch1-5 | 2023-05-09T16:21:49.000Z | [
"transformers",
"tf",
"albert",
"text-classification",
"generated_from_keras_callback",
"endpoints_compatible",
"region:us"
] | text-classification | xinyixiuxiu | null | null | xinyixiuxiu/albert-xxlarge-v2-SST2-incremental_pre_training-epoch1-5 | 0 | 2 | transformers | 2023-05-09T15:45:59 | ---
tags:
- generated_from_keras_callback
model-index:
- name: xinyixiuxiu/albert-xxlarge-v2-SST2-incremental_pre_training-epoch1-5
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# xinyixiuxiu/albert-xxlarge-v2-SST2-incremental_pre_training-epoch1-5
This model was trained from scratch on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0332
- Train Accuracy: 0.9897
- Validation Loss: 0.1438
- Validation Accuracy: 0.9599
- Epoch: 0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'Adam', 'learning_rate': 3e-06, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Validation Loss | Validation Accuracy | Epoch |
|:----------:|:--------------:|:---------------:|:-------------------:|:-----:|
| 0.0332 | 0.9897 | 0.1438 | 0.9599 | 0 |
### Framework versions
- Transformers 4.28.1
- TensorFlow 2.7.0
- Datasets 2.10.1
- Tokenizers 0.12.1
| 1,441 | [
[
-0.030120849609375,
-0.032440185546875,
0.0276336669921875,
0.01097869873046875,
-0.0357666015625,
-0.0303192138671875,
-0.003963470458984375,
-0.0277099609375,
0.00540924072265625,
0.01505279541015625,
-0.053619384765625,
-0.039764404296875,
-0.055419921875,
... |
hr-elrond/autotrain-p2_finbert_training_100-56875131853 | 2023-05-09T16:23:24.000Z | [
"transformers",
"pytorch",
"bert",
"text-classification",
"autotrain",
"unk",
"dataset:hr-elrond/autotrain-data-p2_finbert_training_100",
"co2_eq_emissions",
"endpoints_compatible",
"region:us"
] | text-classification | hr-elrond | null | null | hr-elrond/autotrain-p2_finbert_training_100-56875131853 | 0 | 2 | transformers | 2023-05-09T16:22:38 | ---
tags:
- autotrain
- text-classification
language:
- unk
widget:
- text: "I love AutoTrain 🤗"
datasets:
- hr-elrond/autotrain-data-p2_finbert_training_100
co2_eq_emissions:
emissions: 0.2967273355715001
---
# Model Trained Using AutoTrain
- Problem type: Binary Classification
- Model ID: 56875131853
- CO2 Emissions (in grams): 0.2967
## Validation Metrics
- Loss: 0.068
- Accuracy: 0.984
- Precision: 0.993
- Recall: 0.983
- AUC: 0.996
- F1: 0.988
## Usage
You can use cURL to access this model:
```shell
$ curl -X POST -H "Authorization: Bearer YOUR_API_KEY" -H "Content-Type: application/json" -d '{"inputs": "I love AutoTrain"}' https://api-inference.huggingface.co/models/hr-elrond/autotrain-p2_finbert_training_100-56875131853
```
Or Python API:
```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer
model = AutoModelForSequenceClassification.from_pretrained("hr-elrond/autotrain-p2_finbert_training_100-56875131853", use_auth_token=True)
tokenizer = AutoTokenizer.from_pretrained("hr-elrond/autotrain-p2_finbert_training_100-56875131853", use_auth_token=True)
inputs = tokenizer("I love AutoTrain", return_tensors="pt")
outputs = model(**inputs)
``` | 1,195 | [
[
-0.03131103515625,
-0.0307159423828125,
0.00875091552734375,
0.002605438232421875,
-0.00441741943359375,
-0.00012195110321044922,
0.004657745361328125,
-0.01290130615234375,
-0.005596160888671875,
0.0157318115234375,
-0.054473876953125,
-0.038238525390625,
-0.05... |
harvinder676/bert-news | 2023-05-09T18:02:03.000Z | [
"transformers",
"pytorch",
"distilbert",
"fill-mask",
"generated_from_trainer",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | fill-mask | harvinder676 | null | null | harvinder676/bert-news | 0 | 2 | transformers | 2023-05-09T17:44:52 | ---
license: apache-2.0
tags:
- generated_from_trainer
model-index:
- name: bert-news
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-news
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 2.5512
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 0
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 2.7548 | 1.0 | 1531 | 2.6146 |
| 2.6217 | 2.0 | 3062 | 2.5512 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0
- Datasets 2.1.0
- Tokenizers 0.13.3
| 1,326 | [
[
-0.03521728515625,
-0.045440673828125,
0.01555633544921875,
0.0178985595703125,
-0.034088134765625,
-0.03094482421875,
-0.016387939453125,
-0.00970458984375,
0.0015649795532226562,
0.017486572265625,
-0.05615234375,
-0.04144287109375,
-0.05029296875,
-0.0177... |
syndi-models/bart-large-cnn | 2023-01-24T16:28:55.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"rust",
"bart",
"text2text-generation",
"summarization",
"en",
"dataset:cnn_dailymail",
"arxiv:1910.13461",
"license:mit",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | summarization | syndi-models | null | null | syndi-models/bart-large-cnn | 0 | 2 | transformers | 2023-05-09T18:54:09 | ---
language:
- en
tags:
- summarization
license: mit
thumbnail: https://huggingface.co/front/thumbnails/facebook.png
datasets:
- cnn_dailymail
model-index:
- name: facebook/bart-large-cnn
results:
- task:
type: summarization
name: Summarization
dataset:
name: cnn_dailymail
type: cnn_dailymail
config: 3.0.0
split: train
metrics:
- name: ROUGE-1
type: rouge
value: 42.9486
verified: true
- name: ROUGE-2
type: rouge
value: 20.8149
verified: true
- name: ROUGE-L
type: rouge
value: 30.6186
verified: true
- name: ROUGE-LSUM
type: rouge
value: 40.0376
verified: true
- name: loss
type: loss
value: 2.529000997543335
verified: true
- name: gen_len
type: gen_len
value: 78.5866
verified: true
---
# BART (large-sized model), fine-tuned on CNN Daily Mail
BART model pre-trained on the English language, and fine-tuned on [CNN Daily Mail](https://huggingface.co/datasets/cnn_dailymail). It was introduced in the paper [BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension](https://arxiv.org/abs/1910.13461) by Lewis et al. and first released in [this repository](https://github.com/pytorch/fairseq/tree/master/examples/bart).
Disclaimer: The team releasing BART did not write a model card for this model so this model card has been written by the Hugging Face team.
## Model description
BART is a transformer encoder-decoder (seq2seq) model with a bidirectional (BERT-like) encoder and an autoregressive (GPT-like) decoder. BART is pre-trained by (1) corrupting text with an arbitrary noising function, and (2) learning a model to reconstruct the original text.
BART is particularly effective when fine-tuned for text generation (e.g. summarization, translation) but also works well for comprehension tasks (e.g. text classification, question answering). This particular checkpoint has been fine-tuned on CNN Daily Mail, a large collection of text-summary pairs.
## Intended uses & limitations
You can use this model for text summarization.
### How to use
Here is how to use this model with the [pipeline API](https://huggingface.co/transformers/main_classes/pipelines.html):
```python
from transformers import pipeline
summarizer = pipeline("summarization", model="facebook/bart-large-cnn")
ARTICLE = """ New York (CNN)When Liana Barrientos was 23 years old, she got married in Westchester County, New York.
A year later, she got married again in Westchester County, but to a different man and without divorcing her first husband.
Only 18 days after that marriage, she got hitched yet again. Then, Barrientos declared "I do" five more times, sometimes only within two weeks of each other.
In 2010, she married once more, this time in the Bronx. In an application for a marriage license, she stated it was her "first and only" marriage.
Barrientos, now 39, is facing two criminal counts of "offering a false instrument for filing in the first degree," referring to her false statements on the
2010 marriage license application, according to court documents.
Prosecutors said the marriages were part of an immigration scam.
On Friday, she pleaded not guilty at State Supreme Court in the Bronx, according to her attorney, Christopher Wright, who declined to comment further.
After leaving court, Barrientos was arrested and charged with theft of service and criminal trespass for allegedly sneaking into the New York subway through an emergency exit, said Detective
Annette Markowski, a police spokeswoman. In total, Barrientos has been married 10 times, with nine of her marriages occurring between 1999 and 2002.
All occurred either in Westchester County, Long Island, New Jersey or the Bronx. She is believed to still be married to four men, and at one time, she was married to eight men at once, prosecutors say.
Prosecutors said the immigration scam involved some of her husbands, who filed for permanent residence status shortly after the marriages.
Any divorces happened only after such filings were approved. It was unclear whether any of the men will be prosecuted.
The case was referred to the Bronx District Attorney\'s Office by Immigration and Customs Enforcement and the Department of Homeland Security\'s
Investigation Division. Seven of the men are from so-called "red-flagged" countries, including Egypt, Turkey, Georgia, Pakistan and Mali.
Her eighth husband, Rashid Rajput, was deported in 2006 to his native Pakistan after an investigation by the Joint Terrorism Task Force.
If convicted, Barrientos faces up to four years in prison. Her next court appearance is scheduled for May 18.
"""
print(summarizer(ARTICLE, max_length=130, min_length=30, do_sample=False))
>>> [{'summary_text': 'Liana Barrientos, 39, is charged with two counts of "offering a false instrument for filing in the first degree" In total, she has been married 10 times, with nine of her marriages occurring between 1999 and 2002. She is believed to still be married to four men.'}]
```
### BibTeX entry and citation info
```bibtex
@article{DBLP:journals/corr/abs-1910-13461,
author = {Mike Lewis and
Yinhan Liu and
Naman Goyal and
Marjan Ghazvininejad and
Abdelrahman Mohamed and
Omer Levy and
Veselin Stoyanov and
Luke Zettlemoyer},
title = {{BART:} Denoising Sequence-to-Sequence Pre-training for Natural Language
Generation, Translation, and Comprehension},
journal = {CoRR},
volume = {abs/1910.13461},
year = {2019},
url = {http://arxiv.org/abs/1910.13461},
eprinttype = {arXiv},
eprint = {1910.13461},
timestamp = {Thu, 31 Oct 2019 14:02:26 +0100},
biburl = {https://dblp.org/rec/journals/corr/abs-1910-13461.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
} | 5,999 | [
[
-0.03515625,
-0.055145263671875,
0.0284881591796875,
0.02752685546875,
-0.038116455078125,
-0.0190887451171875,
0.005397796630859375,
-0.0229339599609375,
0.0300445556640625,
0.046356201171875,
-0.020721435546875,
-0.02984619140625,
-0.042327880859375,
0.032... |
MFrazz/distilbert-base-uncased-finetuned-spam | 2023-06-11T18:38:19.000Z | [
"transformers",
"pytorch",
"distilbert",
"text-classification",
"generated_from_trainer",
"dataset:sms_spam",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | MFrazz | null | null | MFrazz/distilbert-base-uncased-finetuned-spam | 0 | 2 | transformers | 2023-05-09T19:04:24 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- sms_spam
metrics:
- accuracy
- f1
model-index:
- name: distilbert-base-uncased-finetuned-spam
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: sms_spam
type: sms_spam
config: plain_text
split: train
args: plain_text
metrics:
- name: Accuracy
type: accuracy
value: 0.9883408071748879
- name: F1
type: f1
value: 0.9882535196626446
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-spam
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the sms_spam dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0370
- Accuracy: 0.9883
- F1: 0.9883
## Model description
More information needed
### Label Key
- LABEL_1 = SPAM
- LABEL_0 = HAM
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| 0.174 | 1.0 | 70 | 0.0444 | 0.9865 | 0.9866 |
| 0.0303 | 2.0 | 140 | 0.0370 | 0.9883 | 0.9883 |
### Framework versions
- Transformers 4.27.1
- Pytorch 2.0.0
- Datasets 2.10.1
- Tokenizers 0.13.2
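Pipeline outputs will use the generic `LABEL_*` names, so downstream code typically remaps them per the label key above. A minimal sketch (the prediction dict shown is hypothetical):

```python
LABEL_KEY = {"LABEL_1": "SPAM", "LABEL_0": "HAM"}

def readable(prediction: dict) -> dict:
    # prediction is a transformers-pipeline-style dict: {"label": ..., "score": ...}
    return {"label": LABEL_KEY[prediction["label"]], "score": prediction["score"]}

readable({"label": "LABEL_1", "score": 0.99})
# → {"label": "SPAM", "score": 0.99}
```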
| 1,905 | [
[
-0.03564453125,
-0.04974365234375,
0.007694244384765625,
0.02044677734375,
-0.0266265869140625,
-0.021514892578125,
-0.005218505859375,
-0.00577545166015625,
-0.0013456344604492188,
0.030120849609375,
-0.049591064453125,
-0.051513671875,
-0.0650634765625,
-0... |
syndi-models/ms-marco-MiniLM-L-12-v2 | 2021-08-05T08:39:01.000Z | [
"transformers",
"pytorch",
"jax",
"bert",
"text-classification",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | text-classification | syndi-models | null | null | syndi-models/ms-marco-MiniLM-L-12-v2 | 0 | 2 | transformers | 2023-05-09T19:06:32 | ---
license: apache-2.0
---
# Cross-Encoder for MS Marco
This model was trained on the [MS Marco Passage Ranking](https://github.com/microsoft/MSMARCO-Passage-Ranking) task.
The model can be used for Information Retrieval: given a query, encode the query with all candidate passages (e.g. retrieved with ElasticSearch), then sort the passages in decreasing order of score. See [SBERT.net Retrieve & Re-rank](https://www.sbert.net/examples/applications/retrieve_rerank/README.html) for more details. The training code is available here: [SBERT.net Training MS Marco](https://github.com/UKPLab/sentence-transformers/tree/master/examples/training/ms_marco)
## Usage with Transformers
```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch
model = AutoModelForSequenceClassification.from_pretrained('model_name')
tokenizer = AutoTokenizer.from_pretrained('model_name')
features = tokenizer(['How many people live in Berlin?', 'How many people live in Berlin?'], ['Berlin has a population of 3,520,031 registered inhabitants in an area of 891.82 square kilometers.', 'New York City is famous for the Metropolitan Museum of Art.'], padding=True, truncation=True, return_tensors="pt")
model.eval()
with torch.no_grad():
scores = model(**features).logits
print(scores)
```
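The re-ranking step described above — score each (query, passage) pair, then sort passages by score — can be sketched as follows (the passages and scores here are made up):

```python
passages = ["passage A", "passage B", "passage C"]
scores = [0.2, 3.1, -1.4]  # hypothetical cross-encoder logits, one per passage

ranked = [p for _, p in sorted(zip(scores, passages), key=lambda t: t[0], reverse=True)]
# → ["passage B", "passage A", "passage C"]
```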
## Usage with SentenceTransformers
The usage becomes easier when you have [SentenceTransformers](https://www.sbert.net/) installed. Then, you can use the pre-trained models like this:
```python
from sentence_transformers import CrossEncoder
model = CrossEncoder('model_name', max_length=512)
scores = model.predict([('Query', 'Paragraph1'), ('Query', 'Paragraph2') , ('Query', 'Paragraph3')])
```
## Performance
In the following table, we provide various pre-trained Cross-Encoders together with their performance on the [TREC Deep Learning 2019](https://microsoft.github.io/TREC-2019-Deep-Learning/) and the [MS Marco Passage Reranking](https://github.com/microsoft/MSMARCO-Passage-Ranking/) dataset.
| Model-Name | NDCG@10 (TREC DL 19) | MRR@10 (MS Marco Dev) | Docs / Sec |
| ------------- |:-------------| -----| --- |
| **Version 2 models** | | |
| cross-encoder/ms-marco-TinyBERT-L-2-v2 | 69.84 | 32.56 | 9000
| cross-encoder/ms-marco-MiniLM-L-2-v2 | 71.01 | 34.85 | 4100
| cross-encoder/ms-marco-MiniLM-L-4-v2 | 73.04 | 37.70 | 2500
| cross-encoder/ms-marco-MiniLM-L-6-v2 | 74.30 | 39.01 | 1800
| cross-encoder/ms-marco-MiniLM-L-12-v2 | 74.31 | 39.02 | 960
| **Version 1 models** | | |
| cross-encoder/ms-marco-TinyBERT-L-2 | 67.43 | 30.15 | 9000
| cross-encoder/ms-marco-TinyBERT-L-4 | 68.09 | 34.50 | 2900
| cross-encoder/ms-marco-TinyBERT-L-6 | 69.57 | 36.13 | 680
| cross-encoder/ms-marco-electra-base | 71.99 | 36.41 | 340
| **Other models** | | |
| nboost/pt-tinybert-msmarco | 63.63 | 28.80 | 2900
| nboost/pt-bert-base-uncased-msmarco | 70.94 | 34.75 | 340
| nboost/pt-bert-large-msmarco | 73.36 | 36.48 | 100
| Capreolus/electra-base-msmarco | 71.23 | 36.89 | 340
| amberoad/bert-multilingual-passage-reranking-msmarco | 68.40 | 35.54 | 330
| sebastian-hofstaetter/distilbert-cat-margin_mse-T2-msmarco | 72.82 | 37.88 | 720
Note: Runtime was computed on a V100 GPU.
| 3,233 | [
[
-0.03228759765625,
-0.043670654296875,
0.0250396728515625,
0.01168060302734375,
-0.0127105712890625,
0.01073455810546875,
-0.01338958740234375,
-0.038543701171875,
0.025146484375,
0.0255889892578125,
-0.041229248046875,
-0.051055908203125,
-0.058013916015625,
... |
bpben/en_imdb_sent_cnn | 2023-05-09T19:58:32.000Z | [
"spacy",
"text-classification",
"en",
"region:us"
] | text-classification | bpben | null | null | bpben/en_imdb_sent_cnn | 0 | 2 | spacy | 2023-05-09T19:58:25 | ---
tags:
- spacy
- text-classification
language:
- en
model-index:
- name: en_imdb_sent_cnn
results: []
---
| Feature | Description |
| --- | --- |
| **Name** | `en_imdb_sent_cnn` |
| **Version** | `0.0.0` |
| **spaCy** | `>=3.4.4,<3.5.0` |
| **Default Pipeline** | `textcat` |
| **Components** | `textcat` |
| **Vectors** | 0 keys, 0 unique vectors (0 dimensions) |
| **Sources** | n/a |
| **License** | n/a |
| **Author** | [n/a]() |
### Label Scheme
<details>
<summary>View label scheme (2 labels for 1 components)</summary>
| Component | Labels |
| --- | --- |
| **`textcat`** | `pos`, `neg` |
</details>
### Accuracy
| Type | Score |
| --- | --- |
| `CATS_SCORE` | 82.51 |
| `CATS_MICRO_P` | 82.51 |
| `CATS_MICRO_R` | 82.51 |
| `CATS_MICRO_F` | 82.51 |
| `CATS_MACRO_P` | 82.51 |
| `CATS_MACRO_R` | 82.51 |
| `CATS_MACRO_F` | 82.51 |
| `CATS_MACRO_AUC` | 90.17 |
| `CATS_MACRO_AUC_PER_TYPE` | 0.00 |
| `TEXTCAT_LOSS` | 2099.23 | | 944 | [
[
-0.047576904296875,
-0.0271453857421875,
0.004543304443359375,
0.0082550048828125,
-0.05316162109375,
0.02435302734375,
-0.01006317138671875,
0.003086090087890625,
0.0440673828125,
0.04473876953125,
-0.050018310546875,
-0.061065673828125,
-0.059478759765625,
... |
jules654/ppo-Huggy | 2023-05-09T21:32:18.000Z | [
"ml-agents",
"tensorboard",
"onnx",
"Huggy",
"deep-reinforcement-learning",
"reinforcement-learning",
"ML-Agents-Huggy",
"region:us"
] | reinforcement-learning | jules654 | null | null | jules654/ppo-Huggy | 0 | 2 | ml-agents | 2023-05-09T21:32:11 | ---
library_name: ml-agents
tags:
- Huggy
- deep-reinforcement-learning
- reinforcement-learning
- ML-Agents-Huggy
---
# **ppo** Agent playing **Huggy**
This is a trained model of a **ppo** agent playing **Huggy** using the [Unity ML-Agents Library](https://github.com/Unity-Technologies/ml-agents).
## Usage (with ML-Agents)
The Documentation: https://github.com/huggingface/ml-agents#get-started
We wrote a complete tutorial to learn to train your first agent using ML-Agents and publish it to the Hub:
### Resume the training
```
mlagents-learn <your_configuration_file_path.yaml> --run-id=<run_id> --resume
```
### Watch your Agent play
You can watch your agent **playing directly in your browser**:
1. Go to https://huggingface.co/spaces/unity/ML-Agents-Huggy
2. Find your model_id: jules654/ppo-Huggy
3. Select your *.nn /*.onnx file
4. Click on Watch the agent play 👀
| 933 | [
[
-0.030517578125,
-0.0343017578125,
0.0171356201171875,
0.0123748779296875,
-0.0134429931640625,
0.00931549072265625,
0.024627685546875,
-0.01548004150390625,
0.0484619140625,
0.0406494140625,
-0.0460205078125,
-0.0458984375,
-0.037384033203125,
-0.0096893310... |
Abdeldjalil21/djalil-base-sentiment-model-10k-samples | 2023-05-12T18:29:32.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | text-classification | Abdeldjalil21 | null | null | Abdeldjalil21/djalil-base-sentiment-model-10k-samples | 0 | 2 | transformers | 2023-05-09T23:59:57 | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
- f1
model-index:
- name: djalil-base-sentiment-model-10k-samples
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# djalil-base-sentiment-model-10k-samples
This model is a fine-tuned version of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4204
- Accuracy: 0.827
- F1: 0.8171
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
### Framework versions
- Transformers 4.29.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,223 | [
[
-0.0489501953125,
-0.045928955078125,
0.0143890380859375,
0.0280609130859375,
-0.04034423828125,
-0.0228729248046875,
-0.0293121337890625,
-0.005344390869140625,
0.01708984375,
0.018829345703125,
-0.05010986328125,
-0.06439208984375,
-0.04949951171875,
-0.00... |
Consensus/e5-base | 2023-05-10T00:04:56.000Z | [
"sentence-transformers",
"pytorch",
"bert",
"feature-extraction",
"sentence-similarity",
"transformers",
"endpoints_compatible",
"region:us"
] | sentence-similarity | Consensus | null | null | Consensus/e5-base | 0 | 2 | sentence-transformers | 2023-05-10T00:03:04 | ---
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
- transformers
---
# {MODEL_NAME}
This is a [sentence-transformers](https://www.SBERT.net) model: it maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for tasks like clustering or semantic search.
<!--- Describe your model here -->
## Usage (Sentence-Transformers)
Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed:
```
pip install -U sentence-transformers
```
Then you can use the model like this:
```python
from sentence_transformers import SentenceTransformer
sentences = ["This is an example sentence", "Each sentence is converted"]
model = SentenceTransformer('{MODEL_NAME}')
embeddings = model.encode(sentences)
print(embeddings)
```
## Usage (HuggingFace Transformers)
Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: first, pass your input through the transformer model, then apply the right pooling operation on top of the contextualized word embeddings.
```python
from transformers import AutoTokenizer, AutoModel
import torch
#Mean Pooling - Take attention mask into account for correct averaging
def mean_pooling(model_output, attention_mask):
token_embeddings = model_output[0] #First element of model_output contains all token embeddings
input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)
# Sentences we want sentence embeddings for
sentences = ['This is an example sentence', 'Each sentence is converted']
# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('{MODEL_NAME}')
model = AutoModel.from_pretrained('{MODEL_NAME}')
# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
# Compute token embeddings
with torch.no_grad():
model_output = model(**encoded_input)
# Perform pooling. In this case, mean pooling.
sentence_embeddings = mean_pooling(model_output, encoded_input['attention_mask'])
print("Sentence embeddings:")
print(sentence_embeddings)
```
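The pooling step weights by the attention mask because padding tokens would otherwise drag the average toward zero. A dependency-free toy illustration of the same masked mean (plain Python lists standing in for the tensors above; the values are made up for illustration):

```python
# Two "sentences", each with 2 token embeddings of dimension 2.
# The second token of sentence B is padding (mask = 0).
token_embeddings = [[[1.0, 2.0], [3.0, 4.0]],   # sentence A: 2 real tokens
                    [[5.0, 6.0], [0.0, 0.0]]]   # sentence B: 1 real token + pad
attention_mask = [[1, 1],
                  [1, 0]]

def masked_mean(embs, mask):
    pooled = []
    for sent, m in zip(embs, mask):
        n = max(sum(m), 1)  # guard against all-zero masks, mirroring torch.clamp
        dims = len(sent[0])
        pooled.append([sum(tok[d] * w for tok, w in zip(sent, m)) / n
                       for d in range(dims)])
    return pooled

print(masked_mean(token_embeddings, attention_mask))
# [[2.0, 3.0], [5.0, 6.0]] -- sentence B's padding token is excluded
```

Sentence A averages both tokens; sentence B keeps only its real token, which is exactly what the `input_mask_expanded` multiplication achieves on tensors.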
## Evaluation Results
<!--- Describe how your model was evaluated -->
For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name={MODEL_NAME})
## Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False})
)
```
## Citing & Authors
<!--- Describe where people can find more information --> | 2,961 | [
[
-0.0197296142578125,
-0.055938720703125,
0.01861572265625,
0.02886962890625,
-0.02349853515625,
-0.03253173828125,
-0.01629638671875,
0.0014705657958984375,
0.01323699951171875,
0.0303497314453125,
-0.03924560546875,
-0.042633056640625,
-0.053070068359375,
-... |
paulokewunmi/claim_extractor_distilbert | 2023-05-10T03:00:21.000Z | [
"transformers",
"tf",
"distilbert",
"text-classification",
"generated_from_keras_callback",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | text-classification | paulokewunmi | null | null | paulokewunmi/claim_extractor_distilbert | 0 | 2 | transformers | 2023-05-10T00:36:59 | ---
license: apache-2.0
tags:
- generated_from_keras_callback
model-index:
- name: claim_extractor_distilbert
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# claim_extractor_distilbert
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0859
- Train Sparse Categorical Accuracy: 0.9708
- Validation Loss: 0.2284
- Validation Sparse Categorical Accuracy: 0.9244
- Epoch: 2
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'Adam', 'weight_decay': None, 'clipnorm': None, 'global_clipnorm': None, 'clipvalue': None, 'use_ema': False, 'ema_momentum': 0.99, 'ema_overwrite_frequency': None, 'jit_compile': True, 'is_legacy_optimizer': False, 'learning_rate': 1e-05, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}
- training_precision: float32
### Training results
| Train Loss | Train Sparse Categorical Accuracy | Validation Loss | Validation Sparse Categorical Accuracy | Epoch |
|:----------:|:---------------------------------:|:---------------:|:--------------------------------------:|:-----:|
| 0.2731 | 0.8882 | 0.1975 | 0.9200 | 0 |
| 0.1554 | 0.9437 | 0.1929 | 0.9229 | 1 |
| 0.0859 | 0.9708 | 0.2284 | 0.9244 | 2 |
### Framework versions
- Transformers 4.28.1
- TensorFlow 2.12.0
- Datasets 2.12.0
- Tokenizers 0.13.3
| 2,036 | [
[
-0.03924560546875,
-0.046478271484375,
0.0187530517578125,
0.0079803466796875,
-0.031158447265625,
-0.018402099609375,
-0.007720947265625,
-0.01477813720703125,
0.01300048828125,
0.007434844970703125,
-0.050384521484375,
-0.046630859375,
-0.068603515625,
-0.... |
jkefeli/PrimaryGleasonBERT | 2023-05-10T16:16:55.000Z | [
"transformers",
"pytorch",
"bert",
"text-classification",
"endpoints_compatible",
"region:us"
] | text-classification | jkefeli | null | null | jkefeli/PrimaryGleasonBERT | 1 | 2 | transformers | 2023-05-10T01:58:00 | To use the model, add the following from the transformers package:
(1) Imports:
from transformers import AutoTokenizer, BertForSequenceClassification
(2) ClinicalBERT tokenizer:
tokenizer = AutoTokenizer.from_pretrained("emilyalsentzer/Bio_ClinicalBERT")
(3) Model type:
model = BertForSequenceClassification.from_pretrained(checkpoint_directory, num_labels=3)
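Once loaded, a prediction can be post-processed by taking the argmax over the three class logits. A minimal sketch in pure Python (the logit values and the class ordering are hypothetical -- the card does not document the label map):

```python
import math

def softmax(logits):
    # Numerically stable softmax over a list of logits.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

logits = [0.2, 2.1, -0.5]  # illustrative values, e.g. model(**inputs).logits[0].tolist()
probs = softmax(logits)
predicted_class = max(range(3), key=probs.__getitem__)
print(predicted_class)  # 1 for these illustrative logits
```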
| 285 | [
[
-0.0131988525390625,
-0.010223388671875,
0.042266845703125,
0.0167236328125,
-0.0290985107421875,
0.00623321533203125,
0.023468017578125,
-0.0067901611328125,
0.0237274169921875,
0.032958984375,
-0.04522705078125,
-0.048675537109375,
-0.057891845703125,
-0.0... |
xinyixiuxiu/albert-xxlarge-v2-SST2-incremental_pre_training-epoch1-6 | 2023-05-10T02:38:22.000Z | [
"transformers",
"tf",
"albert",
"text-classification",
"generated_from_keras_callback",
"endpoints_compatible",
"region:us"
] | text-classification | xinyixiuxiu | null | null | xinyixiuxiu/albert-xxlarge-v2-SST2-incremental_pre_training-epoch1-6 | 0 | 2 | transformers | 2023-05-10T02:01:47 | ---
tags:
- generated_from_keras_callback
model-index:
- name: xinyixiuxiu/albert-xxlarge-v2-SST2-incremental_pre_training-epoch1-6
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# xinyixiuxiu/albert-xxlarge-v2-SST2-incremental_pre_training-epoch1-6
This model was trained from scratch on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0240
- Train Accuracy: 0.9926
- Validation Loss: 0.1901
- Validation Accuracy: 0.9507
- Epoch: 0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'Adam', 'learning_rate': 3e-06, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Validation Loss | Validation Accuracy | Epoch |
|:----------:|:--------------:|:---------------:|:-------------------:|:-----:|
| 0.0240 | 0.9926 | 0.1901 | 0.9507 | 0 |
### Framework versions
- Transformers 4.28.1
- TensorFlow 2.7.0
- Datasets 2.10.1
- Tokenizers 0.12.1
| 1,441 | [
[
-0.029754638671875,
-0.031097412109375,
0.0273590087890625,
0.01132965087890625,
-0.035736083984375,
-0.0294342041015625,
-0.003261566162109375,
-0.0268096923828125,
0.005733489990234375,
0.01617431640625,
-0.053436279296875,
-0.038818359375,
-0.055572509765625,... |
hermanshid/distilbert-base-uncased-finetuned-sarcasm | 2023-05-11T09:37:45.000Z | [
"transformers",
"pytorch",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | text-classification | hermanshid | null | null | hermanshid/distilbert-base-uncased-finetuned-sarcasm | 0 | 2 | transformers | 2023-05-10T04:09:48 | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- matthews_correlation
model-index:
- name: distilbert-base-uncased-finetuned-sarcasm
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-sarcasm
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 2.3075
- Matthews Correlation: 0.4109
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| No log | 1.0 | 57 | 0.7322 | 0.0 |
| No log | 2.0 | 114 | 0.6734 | 0.1752 |
| No log | 3.0 | 171 | 0.6436 | 0.3228 |
| No log | 4.0 | 228 | 0.7826 | 0.2778 |
| No log | 5.0 | 285 | 1.0203 | 0.2707 |
| No log | 6.0 | 342 | 1.0190 | 0.3356 |
| No log | 7.0 | 399 | 1.1675 | 0.3177 |
| No log | 8.0 | 456 | 1.5206 | 0.2514 |
| 0.3597 | 9.0 | 513 | 1.5789 | 0.4097 |
| 0.3597 | 10.0 | 570 | 1.5752 | 0.3740 |
| 0.3597 | 11.0 | 627 | 1.9003 | 0.3506 |
| 0.3597 | 12.0 | 684 | 1.9354 | 0.3855 |
| 0.3597 | 13.0 | 741 | 1.9770 | 0.3289 |
| 0.3597 | 14.0 | 798 | 1.9802 | 0.3583 |
| 0.3597 | 15.0 | 855 | 2.1322 | 0.3255 |
| 0.3597 | 16.0 | 912 | 2.1541 | 0.2994 |
| 0.3597 | 17.0 | 969 | 2.2047 | 0.2992 |
| 0.0329 | 18.0 | 1026 | 2.0794 | 0.3466 |
| 0.0329 | 19.0 | 1083 | 2.0705 | 0.3012 |
| 0.0329 | 20.0 | 1140 | 2.0158 | 0.3759 |
| 0.0329 | 21.0 | 1197 | 2.3999 | 0.3151 |
| 0.0329 | 22.0 | 1254 | 2.1017 | 0.3917 |
| 0.0329 | 23.0 | 1311 | 2.3275 | 0.3255 |
| 0.0329 | 24.0 | 1368 | 2.2258 | 0.3386 |
| 0.0329 | 25.0 | 1425 | 2.3628 | 0.3406 |
| 0.0329 | 26.0 | 1482 | 2.4197 | 0.3077 |
| 0.0145 | 27.0 | 1539 | 2.2661 | 0.3759 |
| 0.0145 | 28.0 | 1596 | 2.4074 | 0.3077 |
| 0.0145 | 29.0 | 1653 | 2.3326 | 0.3255 |
| 0.0145 | 30.0 | 1710 | 2.2813 | 0.3740 |
| 0.0145 | 31.0 | 1767 | 2.3242 | 0.3181 |
| 0.0145 | 32.0 | 1824 | 2.5039 | 0.2930 |
| 0.0145 | 33.0 | 1881 | 2.6045 | 0.3151 |
| 0.0145 | 34.0 | 1938 | 2.3075 | 0.4109 |
| 0.0145 | 35.0 | 1995 | 2.3572 | 0.3759 |
| 0.0129 | 36.0 | 2052 | 2.3833 | 0.3759 |
| 0.0129 | 37.0 | 2109 | 2.6260 | 0.3009 |
| 0.0129 | 38.0 | 2166 | 2.6132 | 0.3289 |
| 0.0129 | 39.0 | 2223 | 2.4151 | 0.3989 |
| 0.0129 | 40.0 | 2280 | 2.5695 | 0.3360 |
| 0.0129 | 41.0 | 2337 | 2.3902 | 0.3989 |
| 0.0129 | 42.0 | 2394 | 2.4388 | 0.3759 |
| 0.0129 | 43.0 | 2451 | 2.6323 | 0.3289 |
| 0.0065 | 44.0 | 2508 | 2.6131 | 0.3553 |
| 0.0065 | 45.0 | 2565 | 2.4426 | 0.3958 |
| 0.0065 | 46.0 | 2622 | 2.4481 | 0.3958 |
| 0.0065 | 47.0 | 2679 | 2.4440 | 0.3958 |
| 0.0065 | 48.0 | 2736 | 2.4689 | 0.3784 |
| 0.0065 | 49.0 | 2793 | 2.4725 | 0.3784 |
| 0.0065 | 50.0 | 2850 | 2.4718 | 0.3784 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 5,073 | [
[
-0.03936767578125,
-0.0411376953125,
0.0169219970703125,
0.008270263671875,
-0.002422332763671875,
-0.00005638599395751953,
0.004322052001953125,
0.0031833648681640625,
0.0469970703125,
0.0215911865234375,
-0.041046142578125,
-0.049346923828125,
-0.0509338378906... |
xinyixiuxiu/albert-xxlarge-v2-SST2-incremental_pre_training-epoch1-5-1 | 2023-05-11T08:36:29.000Z | [
"transformers",
"tf",
"albert",
"text-classification",
"generated_from_keras_callback",
"endpoints_compatible",
"region:us"
] | text-classification | xinyixiuxiu | null | null | xinyixiuxiu/albert-xxlarge-v2-SST2-incremental_pre_training-epoch1-5-1 | 0 | 2 | transformers | 2023-05-10T04:57:48 | ---
tags:
- generated_from_keras_callback
model-index:
- name: xinyixiuxiu/albert-xxlarge-v2-SST2-incremental_pre_training-epoch1-5-1
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# xinyixiuxiu/albert-xxlarge-v2-SST2-incremental_pre_training-epoch1-5-1
This model was trained from scratch on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0334
- Train Accuracy: 0.9893
- Validation Loss: 0.1265
- Validation Accuracy: 0.9599
- Epoch: 0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'Adam', 'learning_rate': 3e-06, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Validation Loss | Validation Accuracy | Epoch |
|:----------:|:--------------:|:---------------:|:-------------------:|:-----:|
| 0.0334 | 0.9893 | 0.1265 | 0.9599 | 0 |
### Framework versions
- Transformers 4.28.1
- TensorFlow 2.7.0
- Datasets 2.10.1
- Tokenizers 0.12.1
| 1,445 | [
[
-0.030792236328125,
-0.03173828125,
0.0269317626953125,
0.010833740234375,
-0.036285400390625,
-0.0302276611328125,
-0.004070281982421875,
-0.0274200439453125,
0.0058135986328125,
0.0148773193359375,
-0.053375244140625,
-0.0400390625,
-0.056060791015625,
-0.... |
Winnie-Kay/Sentiment-Analysis-Roberta-bases | 2023-05-11T04:00:08.000Z | [
"transformers",
"pytorch",
"roberta",
"text-classification",
"generated_from_trainer",
"license:mit",
"endpoints_compatible",
"has_space",
"region:us"
] | text-classification | Winnie-Kay | null | null | Winnie-Kay/Sentiment-Analysis-Roberta-bases | 0 | 2 | transformers | 2023-05-10T06:12:14 | ---
license: mit
tags:
- generated_from_trainer
model-index:
- name: Finetuned_bert_model
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Finetuned_bert_model
This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5644
- Rmse: 0.6048
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 4
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rmse |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 0.6429 | 4.0 | 500 | 0.5644 | 0.6048 |
### Framework versions
- Transformers 4.29.0
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,420 | [
[
-0.04351806640625,
-0.05810546875,
0.01245880126953125,
0.011993408203125,
-0.0283660888671875,
-0.04168701171875,
-0.0296783447265625,
-0.0156707763671875,
0.00445556640625,
0.0264892578125,
-0.06658935546875,
-0.037841796875,
-0.05413818359375,
-0.01934814... |
IRI2070/dal-sbert-address-v1 | 2023-05-10T08:14:52.000Z | [
"sentence-transformers",
"pytorch",
"bert",
"feature-extraction",
"sentence-similarity",
"transformers",
"endpoints_compatible",
"region:us"
] | sentence-similarity | IRI2070 | null | null | IRI2070/dal-sbert-address-v1 | 0 | 2 | sentence-transformers | 2023-05-10T08:14:30 | ---
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
- transformers
---
# {MODEL_NAME}
This is a [sentence-transformers](https://www.SBERT.net) model: it maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for tasks like clustering or semantic search.
<!--- Describe your model here -->
## Usage (Sentence-Transformers)
Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed:
```
pip install -U sentence-transformers
```
Then you can use the model like this:
```python
from sentence_transformers import SentenceTransformer
sentences = ["This is an example sentence", "Each sentence is converted"]
model = SentenceTransformer('{MODEL_NAME}')
embeddings = model.encode(sentences)
print(embeddings)
```
## Usage (HuggingFace Transformers)
Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: first, pass your input through the transformer model, then apply the right pooling operation on top of the contextualized word embeddings.
```python
from transformers import AutoTokenizer, AutoModel
import torch
#Mean Pooling - Take attention mask into account for correct averaging
def mean_pooling(model_output, attention_mask):
token_embeddings = model_output[0] #First element of model_output contains all token embeddings
input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)
# Sentences we want sentence embeddings for
sentences = ['This is an example sentence', 'Each sentence is converted']
# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('{MODEL_NAME}')
model = AutoModel.from_pretrained('{MODEL_NAME}')
# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
# Compute token embeddings
with torch.no_grad():
model_output = model(**encoded_input)
# Perform pooling. In this case, mean pooling.
sentence_embeddings = mean_pooling(model_output, encoded_input['attention_mask'])
print("Sentence embeddings:")
print(sentence_embeddings)
```
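For semantic search with the embeddings produced above, candidates are typically ranked by cosine similarity. A generic, dependency-free sketch (not part of the original card; the toy 3-d vectors stand in for 768-d sentence embeddings):

```python
import math

def cosine_similarity(a, b):
    # Dot product of a and b divided by the product of their norms.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

query = [1.0, 0.0, 1.0]
doc = [1.0, 1.0, 0.0]
print(round(cosine_similarity(query, doc), 4))  # 0.5
```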
## Evaluation Results
<!--- Describe how your model was evaluated -->
For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name={MODEL_NAME})
## Training
The model was trained with the parameters:
**DataLoader**:
`torch.utils.data.dataloader.DataLoader` of length 7325 with parameters:
```
{'batch_size': 64, 'sampler': 'torch.utils.data.sampler.RandomSampler', 'batch_sampler': 'torch.utils.data.sampler.BatchSampler'}
```
**Loss**:
`sentence_transformers.losses.CosineSimilarityLoss.CosineSimilarityLoss`
Parameters of the fit()-Method:
```
{
"epochs": 4,
"evaluation_steps": 1000,
"evaluator": "sentence_transformers.evaluation.EmbeddingSimilarityEvaluator.EmbeddingSimilarityEvaluator",
"max_grad_norm": 1,
"optimizer_class": "<class 'torch.optim.adamw.AdamW'>",
"optimizer_params": {
"lr": 2e-05
},
"scheduler": "WarmupLinear",
"steps_per_epoch": null,
"warmup_steps": 2930,
"weight_decay": 0.01
}
```
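The `warmup_steps` value above is consistent with the common 10%-of-total-steps convention (an observation, not something the card states explicitly): the DataLoader has 7325 batches and training runs 4 epochs. A quick check:

```python
batches_per_epoch = 7325  # DataLoader length from the card
epochs = 4
total_steps = batches_per_epoch * epochs
warmup_steps = int(total_steps * 0.10)  # 10% warmup convention
print(total_steps, warmup_steps)  # 29300 2930
```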
## Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 258, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False})
)
```
## Citing & Authors
<!--- Describe where people can find more information --> | 3,785 | [
[
-0.018768310546875,
-0.06280517578125,
0.0206146240234375,
0.0229644775390625,
-0.0202789306640625,
-0.031829833984375,
-0.0186614990234375,
0.0020236968994140625,
0.0171966552734375,
0.0268707275390625,
-0.047088623046875,
-0.046661376953125,
-0.052581787109375... |
babs001seye/distilbert-base-uncased-finetuned-cola | 2023-05-10T09:52:44.000Z | [
"transformers",
"pytorch",
"tensorboard",
"distilbert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | babs001seye | null | null | babs001seye/distilbert-base-uncased-finetuned-cola | 0 | 2 | transformers | 2023-05-10T09:21:18 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: distilbert-base-uncased-finetuned-cola
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.5425688103069501
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-cola
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8207
- Matthews Correlation: 0.5426
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.5281 | 1.0 | 535 | 0.5314 | 0.3856 |
| 0.3528 | 2.0 | 1070 | 0.4721 | 0.4975 |
| 0.2407 | 3.0 | 1605 | 0.5518 | 0.5245 |
| 0.1785 | 4.0 | 2140 | 0.7532 | 0.5139 |
| 0.1367 | 5.0 | 2675 | 0.8207 | 0.5426 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 2,042 | [
[
-0.02166748046875,
-0.050018310546875,
0.01096343994140625,
0.0175323486328125,
-0.02288818359375,
-0.00925445556640625,
-0.00650787353515625,
-0.0034198760986328125,
0.022796630859375,
0.01025390625,
-0.046600341796875,
-0.037078857421875,
-0.0625,
-0.00694... |
ctu-aic/xlm-roberta-large-squad2-csfever_v2-f1 | 2023-05-10T22:01:33.000Z | [
"sentence-transformers",
"pytorch",
"xlm-roberta",
"text-classification",
"cs",
"dataset:ctu-aic/csfever_v2",
"license:cc-by-sa-4.0",
"region:us"
] | text-classification | ctu-aic | null | null | ctu-aic/xlm-roberta-large-squad2-csfever_v2-f1 | 0 | 2 | sentence-transformers | 2023-05-10T09:24:30 | ---
license: cc-by-sa-4.0
datasets:
- ctu-aic/csfever_v2
language:
- cs
library_name: sentence-transformers
pipeline_tag: text-classification
---
# Model Card for xlm-roberta-large-squad2-csfever_v2-f1
## Model Details
Model for natural language inference, trained as part of a bachelor thesis.
## Uses
### Transformers
```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer
model = AutoModelForSequenceClassification.from_pretrained("ctu-aic/xlm-roberta-large-squad2-csfever_v2-f1")
tokenizer = AutoTokenizer.from_pretrained("ctu-aic/xlm-roberta-large-squad2-csfever_v2-f1")
```
### Sentence Transformers
```python
from sentence_transformers.cross_encoder import CrossEncoder
model = CrossEncoder('ctu-aic/xlm-roberta-large-squad2-csfever_v2-f1')
scores = model.predict([["My first context.", "My first hypothesis."],
["Second context.", "Hypothesis."]])
``` | 921 | [
[
-0.01971435546875,
-0.051788330078125,
0.0191802978515625,
0.037109375,
-0.003570556640625,
-0.00554656982421875,
-0.005107879638671875,
-0.01416778564453125,
-0.01088714599609375,
0.04052734375,
-0.05145263671875,
-0.039947509765625,
-0.050079345703125,
0.0... |
rishabhjain16/whisper_medium_to_myst_pf_ot50 | 2023-05-15T14:11:21.000Z | [
"transformers",
"pytorch",
"tensorboard",
"whisper",
"automatic-speech-recognition",
"generated_from_trainer",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | automatic-speech-recognition | rishabhjain16 | null | null | rishabhjain16/whisper_medium_to_myst_pf_ot50 | 0 | 2 | transformers | 2023-05-10T09:43:08 | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: openai/whisper-medium
results:
- task:
type: automatic-speech-recognition
name: Automatic Speech Recognition
dataset:
name: rishabhjain16/infer_so_chinese
type: rishabhjain16/infer_so_chinese
config: en
split: test
metrics:
- type: wer
value: 16.02
name: WER
- task:
type: automatic-speech-recognition
name: Automatic Speech Recognition
dataset:
name: rishabhjain16/infer_pf_italian
type: rishabhjain16/infer_pf_italian
config: en
split: test
metrics:
- type: wer
value: 5.1
name: WER
- task:
type: automatic-speech-recognition
name: Automatic Speech Recognition
dataset:
name: rishabhjain16/infer_pf_german
type: rishabhjain16/infer_pf_german
config: en
split: test
metrics:
- type: wer
value: 34.59
name: WER
- task:
type: automatic-speech-recognition
name: Automatic Speech Recognition
dataset:
name: rishabhjain16/infer_pf_swedish
type: rishabhjain16/infer_pf_swedish
config: en
split: test
metrics:
- type: wer
value: 9.12
name: WER
- task:
type: automatic-speech-recognition
name: Automatic Speech Recognition
dataset:
name: rishabhjain16/libritts_dev_clean
type: rishabhjain16/libritts_dev_clean
config: en
split: test
metrics:
- type: wer
value: 5.33
name: WER
- task:
type: automatic-speech-recognition
name: Automatic Speech Recognition
dataset:
name: rishabhjain16/infer_cmu
type: rishabhjain16/infer_cmu
config: en
split: test
metrics:
- type: wer
value: 9.33
name: WER
- task:
type: automatic-speech-recognition
name: Automatic Speech Recognition
dataset:
name: rishabhjain16/infer_pfs
type: rishabhjain16/infer_pfs
config: en
split: test
metrics:
- type: wer
value: 3.15
name: WER
- task:
type: automatic-speech-recognition
name: Automatic Speech Recognition
dataset:
name: rishabhjain16/infer_myst
type: rishabhjain16/infer_myst
config: en
split: test
metrics:
- type: wer
value: 11.73
name: WER
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# openai/whisper-medium
This model is a fine-tuned version of [openai/whisper-medium](https://huggingface.co/openai/whisper-medium) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3853
- Wer: 10.4258
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 4000
- mixed_precision_training: Native AMP
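A linear scheduler with warmup, as configured above, ramps the learning rate up over the first 500 steps and then decays it linearly to zero by step 4000. A sketch of the assumed schedule shape (mirroring `transformers`' `get_linear_schedule_with_warmup`, not code from this repository):

```python
def lr_at(step, base_lr=1e-05, warmup_steps=500, total_steps=4000):
    """Linear warmup, then linear decay to zero (assumed schedule shape)."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))

print(lr_at(250))   # halfway through warmup -> 5e-06
print(lr_at(4000))  # fully decayed -> 0.0
```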
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:-------:|
| 0.2536 | 0.12 | 500 | 0.2608 | 11.8586 |
| 0.3687 | 1.1 | 1000 | 0.2578 | 11.4576 |
| 0.1522 | 2.07 | 1500 | 0.2613 | 12.7949 |
| 0.0387 | 3.05 | 2000 | 0.2952 | 10.9378 |
| 0.014 | 4.02 | 2500 | 0.3271 | 10.6813 |
| 0.0186 | 4.14 | 3000 | 0.3389 | 10.3970 |
| 0.0057 | 5.12 | 3500 | 0.3670 | 10.6380 |
| 0.0108 | 6.09 | 4000 | 0.3853 | 10.4258 |
### Framework versions
- Transformers 4.29.0
- Pytorch 1.14.0a0+44dac51
- Datasets 2.12.0
- Tokenizers 0.13.3
| 4,114 | [
[
-0.035858154296875,
-0.04052734375,
0.0018100738525390625,
0.01287078857421875,
-0.0213775634765625,
-0.04150390625,
-0.0149993896484375,
-0.0182342529296875,
0.0218048095703125,
0.0266265869140625,
-0.053375244140625,
-0.044281005859375,
-0.04498291015625,
... |
rishabhjain16/whisper_medium_en_to_myst_pf_ot100 | 2023-05-12T15:05:57.000Z | [
"transformers",
"pytorch",
"tensorboard",
"whisper",
"automatic-speech-recognition",
"generated_from_trainer",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | automatic-speech-recognition | rishabhjain16 | null | null | rishabhjain16/whisper_medium_en_to_myst_pf_ot100 | 0 | 2 | transformers | 2023-05-10T09:45:26 | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: openai/whisper-medium.en
results:
- task:
type: automatic-speech-recognition
name: Automatic Speech Recognition
dataset:
name: rishabhjain16/infer_myst
type: rishabhjain16/infer_myst
config: en
split: test
metrics:
- type: wer
value: 12.3
name: WER
- task:
type: automatic-speech-recognition
name: Automatic Speech Recognition
dataset:
name: rishabhjain16/infer_pfs
type: rishabhjain16/infer_pfs
config: en
split: test
metrics:
- type: wer
value: 3.28
name: WER
- task:
type: automatic-speech-recognition
name: Automatic Speech Recognition
dataset:
name: rishabhjain16/infer_cmu
type: rishabhjain16/infer_cmu
config: en
split: test
metrics:
- type: wer
value: 9.53
name: WER
- task:
type: automatic-speech-recognition
name: Automatic Speech Recognition
dataset:
name: rishabhjain16/libritts_dev_clean
type: rishabhjain16/libritts_dev_clean
config: en
split: test
metrics:
- type: wer
value: 5.01
name: WER
- task:
type: automatic-speech-recognition
name: Automatic Speech Recognition
dataset:
name: rishabhjain16/infer_pf_swedish
type: rishabhjain16/infer_pf_swedish
config: en
split: test
metrics:
- type: wer
value: 8.94
name: WER
- task:
type: automatic-speech-recognition
name: Automatic Speech Recognition
dataset:
name: rishabhjain16/infer_pf_german
type: rishabhjain16/infer_pf_german
config: en
split: test
metrics:
- type: wer
value: 34.78
name: WER
- task:
type: automatic-speech-recognition
name: Automatic Speech Recognition
dataset:
name: rishabhjain16/infer_pf_italian
type: rishabhjain16/infer_pf_italian
config: en
split: test
metrics:
- type: wer
value: 4.42
name: WER
- task:
type: automatic-speech-recognition
name: Automatic Speech Recognition
dataset:
name: rishabhjain16/infer_so_chinese
type: rishabhjain16/infer_so_chinese
config: en
split: test
metrics:
- type: wer
value: 14.87
name: WER
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# openai/whisper-medium.en
This model is a fine-tuned version of [openai/whisper-medium.en](https://huggingface.co/openai/whisper-medium.en) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4158
- Wer: 10.8712
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 4000
- mixed_precision_training: Native AMP
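The linear scheduler with 500 warmup steps ramps the learning rate from 0 up to the base rate, then decays it linearly to 0 over the remaining training steps. A minimal sketch of that schedule (mirroring what `transformers.get_linear_schedule_with_warmup` computes; the function name here is just illustrative):

```python
def linear_warmup_lr(step: int, base_lr: float = 1e-5,
                     warmup_steps: int = 500, total_steps: int = 4000) -> float:
    """Linear warmup to base_lr over warmup_steps, then linear decay to 0."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))
```

With these hyperparameters, the rate peaks at 1e-05 at step 500 and reaches 0 at step 4000.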
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:-------:|
| 0.6148 | 0.12 | 500 | 0.3107 | 12.7838 |
| 0.1877 | 1.09 | 1000 | 0.2892 | 11.2910 |
| 0.0697 | 2.05 | 1500 | 0.3146 | 10.7857 |
| 0.0748 | 3.02 | 2000 | 0.3162 | 11.5254 |
| 0.0308 | 3.14 | 2500 | 0.3450 | 11.1111 |
| 0.0192 | 4.11 | 3000 | 0.3720 | 10.9101 |
| 0.0046 | 5.07 | 3500 | 0.4155 | 11.2344 |
| 0.0096 | 6.03 | 4000 | 0.4158 | 10.8712 |
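The Wer column above is the word error rate, in percent. As a sanity-check reference, a minimal word-level implementation (standard Levenshtein edit distance over words; real evaluations typically use the `evaluate` or `jiwer` libraries):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance divided by reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # DP table for Levenshtein distance over word sequences
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,      # deletion
                          d[i][j - 1] + 1,      # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[len(ref)][len(hyp)] / len(ref)
```

For example, one substitution in a four-word reference yields a WER of 0.25 (25%).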
### Framework versions
- Transformers 4.29.0
- Pytorch 1.14.0a0+44dac51
- Datasets 2.12.0
- Tokenizers 0.13.3
| 4,126 | [truncated embedding vector] |
ctu-aic/xlm-roberta-large-squad2-csfever_v2-precision | 2023-05-10T22:01:34.000Z | [
"sentence-transformers",
"pytorch",
"xlm-roberta",
"text-classification",
"cs",
"dataset:ctu-aic/csfever_v2",
"license:cc-by-sa-4.0",
"region:us"
] | text-classification | ctu-aic | null | null | ctu-aic/xlm-roberta-large-squad2-csfever_v2-precision | 0 | 2 | sentence-transformers | 2023-05-10T09:54:21 | ---
license: cc-by-sa-4.0
datasets:
- ctu-aic/csfever_v2
language:
- cs
library_name: sentence-transformers
pipeline_tag: text-classification
---
# Model Card for xlm-roberta-large-squad2-csfever_v2-precision
## Model Details
A model for natural language inference, trained as part of a bachelor thesis.
## Uses
### Transformers
```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer
model = AutoModelForSequenceClassification.from_pretrained("ctu-aic/xlm-roberta-large-squad2-csfever_v2-precision")
tokenizer = AutoTokenizer.from_pretrained("ctu-aic/xlm-roberta-large-squad2-csfever_v2-precision")
```
### Sentence Transformers
```python
from sentence_transformers.cross_encoder import CrossEncoder
model = CrossEncoder('ctu-aic/xlm-roberta-large-squad2-csfever_v2-precision')
scores = model.predict([["My first context.", "My first hypothesis."],
["Second context.", "Hypothesis."]])
```
| 949 | [truncated embedding vector] |
ctu-aic/xlm-roberta-large-squad2-csfever_v2-07 | 2023-05-10T22:01:33.000Z | [
"sentence-transformers",
"pytorch",
"xlm-roberta",
"text-classification",
"cs",
"dataset:ctu-aic/csfever_v2",
"license:cc-by-sa-4.0",
"region:us"
] | text-classification | ctu-aic | null | null | ctu-aic/xlm-roberta-large-squad2-csfever_v2-07 | 0 | 2 | sentence-transformers | 2023-05-10T09:55:21 | ---
license: cc-by-sa-4.0
datasets:
- ctu-aic/csfever_v2
language:
- cs
library_name: sentence-transformers
pipeline_tag: text-classification
---
# Model Card for xlm-roberta-large-squad2-csfever_v2-07
## Model Details
A model for natural language inference, trained as part of a bachelor thesis.
## Uses
### Transformers
```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer
model = AutoModelForSequenceClassification.from_pretrained("ctu-aic/xlm-roberta-large-squad2-csfever_v2-07")
tokenizer = AutoTokenizer.from_pretrained("ctu-aic/xlm-roberta-large-squad2-csfever_v2-07")
```
### Sentence Transformers
```python
from sentence_transformers.cross_encoder import CrossEncoder
model = CrossEncoder('ctu-aic/xlm-roberta-large-squad2-csfever_v2-07')
scores = model.predict([["My first context.", "My first hypothesis."],
["Second context.", "Hypothesis."]])
```
| 921 | [truncated embedding vector] |
Cynthiaiii4/Text_classification_model_blu_v1 | 2023-05-10T13:56:04.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | text-classification | Cynthiaiii4 | null | null | Cynthiaiii4/Text_classification_model_blu_v1 | 0 | 2 | transformers | 2023-05-10T11:27:56 | ---
license: apache-2.0
tags:
- generated_from_trainer
model-index:
- name: Text_classification_model_blu_v1
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Text_classification_model_blu_v1
This model is a fine-tuned version of [bert-large-uncased](https://huggingface.co/bert-large-uncased) on an unspecified dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 40
- eval_batch_size: 40
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,061 | [truncated embedding vector] |
guyyanko/split-3-hebrew-trc-alephbert-base-EMP | 2023-05-10T13:43:19.000Z | [
"transformers",
"pytorch",
"TemporalRelationClassification",
"text-classification",
"custom_code",
"he",
"dataset:guyyanko/hebrew-trc-special-markers",
"region:us"
] | text-classification | guyyanko | null | null | guyyanko/split-3-hebrew-trc-alephbert-base-EMP | 0 | 2 | transformers | 2023-05-10T11:30:05 | ---
datasets:
- guyyanko/hebrew-trc-special-markers
language:
- he
---
```python
from transformers import TextClassificationPipeline
from transformers import pipeline, AutoTokenizer, AutoModelForSequenceClassification

# Custom pipeline that skips the model-type check so the remote-code model loads.
class TemporalRelationClassificationPipeline(TextClassificationPipeline):
    def check_model_type(self, supported_models):
        pass

pretrained_checkpoint = "guyyanko/split-3-hebrew-trc-alephbert-base-EMP"
model = AutoModelForSequenceClassification.from_pretrained(pretrained_checkpoint, trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained(pretrained_checkpoint, trust_remote_code=True)
classifier = pipeline(task='text-classification', model=model, tokenizer=tokenizer)

# "Tomorrow I [E1] will work out [/E1] if I [E2] finish [/E2] all my tasks"
txt = "מחר [א1] אתאמן [/א1] אם [א2] אסיים [/א2] את כל המשימות שלי"
print(classifier(txt))

# "After [E1] I finish [/E1] all my tasks, I [E2] will go [/E2] work out at the gym"
txt = "אחרי [א1] שאסיים [/א1] את כל המשימות שלי [א2] אלך [/א2] להתאמן בחדר הכושר"
print(classifier(txt))
```
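The inputs above mark the two event spans with bracketed special tokens. A small illustrative helper for building such inputs (the marker scheme is inferred from the two examples, not documented by the model authors):

```python
def mark_events(tokens, span1, span2):
    """Wrap two inclusive (start, end) token spans with [א1]…[/א1] and [א2]…[/א2] markers."""
    out = []
    for i, tok in enumerate(tokens):
        if i == span1[0]:
            out.append("[א1]")
        if i == span2[0]:
            out.append("[א2]")
        out.append(tok)
        if i == span1[1]:
            out.append("[/א1]")
        if i == span2[1]:
            out.append("[/א2]")
    return " ".join(out)
```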
| 915 | [truncated embedding vector] |
Mizuiro-sakura/deberta-v2-large-japanese-finetuned-ner | 2023-07-21T14:10:02.000Z | [
"transformers",
"pytorch",
"safetensors",
"deberta-v2",
"token-classification",
"deberta",
"named entity recognition",
"named-entity-recognition",
"ner",
"ja",
"dataset:wikipedia",
"dataset:cc100",
"dataset:oscar",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"regio... | token-classification | Mizuiro-sakura | null | null | Mizuiro-sakura/deberta-v2-large-japanese-finetuned-ner | 0 | 2 | transformers | 2023-05-10T13:22:23 | ---
license: mit
language: ja
library_name: transformers
tags:
- pytorch
- deberta
- deberta-v2
- named entity recognition
- named-entity-recognition
- ner
datasets:
- wikipedia
- cc100
- oscar
metrics:
- accuracy
---
# This model is deberta-v2-large-japanese fine-tuned for named entity recognition (NER)
This model was created by fine-tuning deberta-v2-large-japanese on a Japanese NER dataset built from Wikipedia (released by Stockmark Inc., https://github.com/stockmarkteam/ner-wikipedia-dataset ).
You can use this model for NER tasks.
# How to use
Install transformers, pytorch, sentencepiece, and Juman++, then run the following code to extract named entities:
```python
from transformers import AutoTokenizer, pipeline, AutoModelForTokenClassification

tokenizer = AutoTokenizer.from_pretrained('Mizuiro-sakura/deberta-v2-large-japanese-finetuned-ner')
model = AutoModelForTokenClassification.from_pretrained('Mizuiro-sakura/deberta-v2-large-japanese-finetuned-ner')  # load the fine-tuned model
text = '昨日は東京で買い物をした'  # "Yesterday I went shopping in Tokyo"
ner = pipeline('ner', model=model, tokenizer=tokenizer)
result = ner(text)
print(result)
```
# Model accuracy
Overall accuracy: 0.7974729241877256

| label | precision | recall | f1-score | support |
|:--|--:|--:|--:|--:|
| その他の組織名 (other organization) | 0.72 | 0.72 | 0.72 | 238 |
| イベント名 (event) | 0.73 | 0.85 | 0.79 | 215 |
| 人名 (person) | 0.83 | 0.89 | 0.86 | 547 |
| 地名 (location) | 0.79 | 0.80 | 0.80 | 446 |
| 政治的組織名 (political organization) | 0.78 | 0.83 | 0.80 | 263 |
| 施設名 (facility) | 0.74 | 0.84 | 0.79 | 241 |
| 法人名 (corporation) | 0.84 | 0.80 | 0.82 | 487 |
| 製品名 (product) | 0.65 | 0.78 | 0.71 | 252 |
| micro avg | 0.77 | 0.82 | 0.80 | 2689 |
| macro avg | 0.76 | 0.82 | 0.79 | 2689 |
| weighted avg | 0.78 | 0.82 | 0.80 | 2689 |
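The micro average pools true/false positives and false negatives across all entity classes before computing precision, recall, and F1 (the macro average instead averages per-class scores). A minimal sketch of the pooling, with illustrative counts rather than the actual evaluation data:

```python
def micro_prf(stats):
    """Micro-averaged precision/recall/F1 from per-class (tp, fp, fn) counts."""
    tp = sum(s[0] for s in stats)
    fp = sum(s[1] for s in stats)
    fn = sum(s[2] for s in stats)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1
```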
# What is deberta-v2-large-japanese?
A model pre-trained on Japanese Wikipedia (3.2GB), CC-100 (85GB), and OSCAR (54GB), released by the Kurohashi Lab at Kyoto University.
# Model description
This is a Japanese DeBERTa V2 large model pre-trained on Japanese Wikipedia, the Japanese portion of CC-100, and the Japanese portion of OSCAR.
# Acknowledgments
I would like to thank the Kurohashi Lab at Kyoto University for releasing the pre-trained model.
| 2,377 | [truncated embedding vector] |