The full dataset viewer is not available; only a preview of the rows is shown.
Error code: DatasetGenerationCastError
Exception: DatasetGenerationCastError
Message: An error occurred while generating the dataset
All the data files must have the same columns, but at some point there are 1 new columns ({'perturbation'}) and 1 missing columns ({'statistic'}).
This happened while the csv dataset builder was generating data using
hf://datasets/matthewshu/scgpt-replogle-activations/ablations/shuffle_seed42/eval/results.csv (at revision c5f07828f8774251e5fc9ed7a6e72b8a72976506), ['hf://datasets/matthewshu/scgpt-replogle-activations@c5f07828f8774251e5fc9ed7a6e72b8a72976506/ablations/shuffle_seed42/eval/agg_results.csv', 'hf://datasets/matthewshu/scgpt-replogle-activations@c5f07828f8774251e5fc9ed7a6e72b8a72976506/ablations/shuffle_seed42/eval/results.csv']
Please either edit the data files to have matching columns, or separate them into different configurations (see docs at https://hf.co/docs/hub/datasets-manual-configuration#multiple-configurations)
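The second suggested fix (separate configurations) can be sketched as a `configs` block in the dataset's README front matter, per the manual-configuration docs linked above. The config names below are made up for illustration; only the two file paths come from the error message:

```yaml
configs:
- config_name: per_perturbation   # hypothetical name
  data_files: "ablations/shuffle_seed42/eval/results.csv"
- config_name: aggregated         # hypothetical name
  data_files: "ablations/shuffle_seed42/eval/agg_results.csv"
```

With each CSV in its own config, the builder no longer tries to cast both files to one shared schema.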
Traceback: Traceback (most recent call last):
  File "/usr/local/lib/python3.12/site-packages/datasets/builder.py", line 1800, in _prepare_split_single
    writer.write_table(table)
  File "/usr/local/lib/python3.12/site-packages/datasets/arrow_writer.py", line 765, in write_table
    self._write_table(pa_table, writer_batch_size=writer_batch_size)
  File "/usr/local/lib/python3.12/site-packages/datasets/arrow_writer.py", line 773, in _write_table
    pa_table = table_cast(pa_table, self._schema)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/datasets/table.py", line 2321, in table_cast
    return cast_table_to_schema(table, schema)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/datasets/table.py", line 2249, in cast_table_to_schema
    raise CastError(
datasets.table.CastError: Couldn't cast
perturbation: string
overlap_at_N: double
overlap_at_50: double
overlap_at_100: double
overlap_at_200: double
overlap_at_500: double
precision_at_N: double
precision_at_50: double
precision_at_100: double
precision_at_200: double
precision_at_500: double
de_spearman_sig: double
de_direction_match: double
de_spearman_lfc_sig: double
de_sig_genes_recall: double
de_nsig_counts_real: double
de_nsig_counts_pred: double
pr_auc: double
roc_auc: double
pearson_delta: double
mse: double
mae: double
mse_delta: double
mae_delta: double
discrimination_score_l1: double
discrimination_score_l2: double
discrimination_score_cosine: double
pearson_edistance: double
clustering_agreement: double
-- schema metadata --
pandas: '{"index_columns": [{"kind": "range", "name": null, "start": 0, "' + 4040
to
{'statistic': Value('string'), 'overlap_at_N': Value('float64'), 'overlap_at_50': Value('float64'), 'overlap_at_100': Value('float64'), 'overlap_at_200': Value('float64'), 'overlap_at_500': Value('float64'), 'precision_at_N': Value('float64'), 'precision_at_50': Value('float64'), 'precision_at_100': Value('float64'), 'precision_at_200': Value('float64'), 'precision_at_500': Value('float64'), 'de_spearman_sig': Value('float64'), 'de_direction_match': Value('float64'), 'de_spearman_lfc_sig': Value('float64'), 'de_sig_genes_recall': Value('float64'), 'de_nsig_counts_real': Value('float64'), 'de_nsig_counts_pred': Value('float64'), 'pr_auc': Value('float64'), 'roc_auc': Value('float64'), 'pearson_delta': Value('float64'), 'mse': Value('float64'), 'mae': Value('float64'), 'mse_delta': Value('float64'), 'mae_delta': Value('float64'), 'discrimination_score_l1': Value('float64'), 'discrimination_score_l2': Value('float64'), 'discrimination_score_cosine': Value('float64'), 'pearson_edistance': Value('float64'), 'clustering_agreement': Value('float64')}
because column names don't match
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 1342, in compute_config_parquet_and_info_response
    parquet_operations, partial, estimated_dataset_info = stream_convert_to_parquet(
                                                          ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 907, in stream_convert_to_parquet
    builder._prepare_split(split_generator=splits_generators[split], file_format="parquet")
  File "/usr/local/lib/python3.12/site-packages/datasets/builder.py", line 1646, in _prepare_split
    for job_id, done, content in self._prepare_split_single(
                                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/datasets/builder.py", line 1802, in _prepare_split_single
    raise DatasetGenerationCastError.from_cast_error(
datasets.exceptions.DatasetGenerationCastError: An error occurred while generating the dataset
All the data files must have the same columns, but at some point there are 1 new columns ({'perturbation'}) and 1 missing columns ({'statistic'}).
This happened while the csv dataset builder was generating data using
hf://datasets/matthewshu/scgpt-replogle-activations/ablations/shuffle_seed42/eval/results.csv (at revision c5f07828f8774251e5fc9ed7a6e72b8a72976506), ['hf://datasets/matthewshu/scgpt-replogle-activations@c5f07828f8774251e5fc9ed7a6e72b8a72976506/ablations/shuffle_seed42/eval/agg_results.csv', 'hf://datasets/matthewshu/scgpt-replogle-activations@c5f07828f8774251e5fc9ed7a6e72b8a72976506/ablations/shuffle_seed42/eval/results.csv']
Please either edit the data files to have matching columns, or separate them into different configurations (see docs at https://hf.co/docs/hub/datasets-manual-configuration#multiple-configurations)
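The cast failure above boils down to a one-column disagreement between the two CSVs: the builder infers its schema from the first file (`agg_results.csv`, keyed by `statistic`) and then cannot cast the second (`results.csv`, keyed by `perturbation`). A minimal sketch with toy stand-in files (column contents and values are made up; only the two key-column names come from the error):

```python
import pandas as pd
from io import StringIO

# Toy stand-ins for the two CSVs named in the error; the metric column
# and its values are invented for brevity.
agg_csv = StringIO("statistic,pearson_delta\nmean,0.085\nstd,0.021\n")
per_csv = StringIO("perturbation,pearson_delta\nGENE_A,0.06\nGENE_B,0.11\n")

agg = pd.read_csv(agg_csv)
per = pd.read_csv(per_csv)

# One new column and one missing column, exactly as the cast error reports.
print(set(per.columns) - set(agg.columns))  # {'perturbation'}
print(set(agg.columns) - set(per.columns))  # {'statistic'}
```

Running a check like this over all data files before pushing is a cheap way to catch schema drift that the viewer would otherwise surface as a `DatasetGenerationCastError`.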
| statistic | overlap_at_N | overlap_at_50 | overlap_at_100 | overlap_at_200 | overlap_at_500 | precision_at_N | precision_at_50 | precision_at_100 | precision_at_200 | precision_at_500 | de_spearman_sig | de_direction_match | de_spearman_lfc_sig | de_sig_genes_recall | de_nsig_counts_real | de_nsig_counts_pred | pr_auc | roc_auc | pearson_delta | mse | mae | mse_delta | mae_delta | discrimination_score_l1 | discrimination_score_l2 | discrimination_score_cosine | pearson_edistance | clustering_agreement |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
count | 1,047 | 1,047 | 1,047 | 1,047 | 1,047 | 1,047 | 1,047 | 1,047 | 1,047 | 1,047 | 1,047 | 1,047 | 1,047 | 1,047 | 1,047 | 1,047 | 1,047 | 1,047 | 1,047 | 1,047 | 1,047 | 1,047 | 1,047 | 1,047 | 1,047 | 1,047 | 1,047 | 1,047 |
null_count | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
mean | 0.63259 | 0.090583 | 0.087927 | 0.110153 | 0.158151 | 0.588729 | 0.090583 | 0.087927 | 0.110153 | 0.158151 | 0.916601 | 0.469822 | 0.079939 | 0.815634 | 3,966.452722 | 5,500.504298 | 0.555201 | 0.41902 | 0.085208 | 0.104795 | 0.212271 | 0.104795 | 0.212271 | 0.503354 | 0.505236 | 0.530627 | 0.223858 | 0.025786 |
std | 0.078228 | 0.037812 | 0.030393 | 0.027201 | 0.024283 | 0.094403 | 0.037812 | 0.030393 | 0.027201 | 0.024283 | 0 | 0.01658 | 0.042051 | 0.064663 | 582.537636 | 351.451869 | 0.101583 | 0.022276 | 0.020607 | 0.013493 | 0.015189 | 0.013493 | 0.015189 | 0.28978 | 0.290614 | 0.297448 | 0 | 0 |
min | 0.460031 | 0 | 0 | 0.025 | 0.088 | 0.363479 | 0 | 0 | 0.025 | 0.088 | 0.916601 | 0.403384 | -0.031896 | 0.641559 | 2,525 | 4,575 | 0.32681 | 0.346431 | 0.036362 | 0.079841 | 0.182399 | 0.079841 | 0.182399 | 0.00191 | 0.000955 | 0.000955 | 0.223858 | 0.025786 |
25% | 0.576106 | 0.08 | 0.07 | 0.095 | 0.14 | 0.520494 | 0.08 | 0.07 | 0.095 | 0.14 | 0.916601 | 0.459967 | 0.050822 | 0.770672 | 3,550 | 5,266 | 0.476562 | 0.403374 | 0.071658 | 0.093698 | 0.199635 | 0.093698 | 0.199635 | 0.254059 | 0.254059 | 0.267431 | 0.223858 | 0.025786 |
50% | 0.632494 | 0.1 | 0.09 | 0.115 | 0.158 | 0.589559 | 0.1 | 0.09 | 0.115 | 0.158 | 0.916601 | 0.469036 | 0.07764 | 0.82219 | 3,969 | 5,538 | 0.550832 | 0.419929 | 0.082494 | 0.102312 | 0.209962 | 0.102312 | 0.209962 | 0.503343 | 0.503343 | 0.553009 | 0.223858 | 0.025786 |
75% | 0.685449 | 0.12 | 0.1 | 0.13 | 0.176 | 0.654276 | 0.12 | 0.1 | 0.13 | 0.176 | 0.916601 | 0.47973 | 0.108316 | 0.864194 | 4,351 | 5,755 | 0.625073 | 0.434231 | 0.096322 | 0.113642 | 0.222515 | 0.113642 | 0.222515 | 0.757402 | 0.759312 | 0.798472 | 0.223858 | 0.025786 |
max | 0.927512 | 0.18 | 0.21 | 0.185 | 0.238 | 0.926285 | 0.18 | 0.21 | 0.185 | 0.238 | 0.916601 | 0.542978 | 0.235903 | 0.9771 | 6,070 | 6,403 | 0.924476 | 0.516539 | 0.16931 | 0.162015 | 0.272521 | 0.162015 | 0.272521 | 1 | 1 | 1 | 0.223858 | 0.025786 |
# scGPT activation captures on Replogle (base vs ESM)
Per-cell layer-{5,7,10} hidden states from the two scGPT checkpoints in
matthewshu/scgpt-replogle-base-ft
and matthewshu/scgpt-replogle-esm-ft,
captured over the same 72,100-cell balanced sample of the
State-Replogle-Filtered
dataset. The sample is balanced over (cell_line, gene) buckets at 25 cells per
bucket. Intended for crosscoder diffing and other mechanistic-interpretability
analyses comparing the two models on identical inputs.
## Layout

```
matthewshu/scgpt-replogle-activations/
├── README.md
├── base/                    scGPT-base capture (~313 GB)
│   ├── shard-00000.h5 … shard-00003.h5   (4 shards, 50 batches × batch 384)
│   ├── stats.h5             (per-D Welford running mean/M2 per capture)
│   ├── predictions.h5ad     (3.78 GB, .X = pred, .layers["truth"] = real)
│   └── training_stats.json  (epochs, wandb URL, best val pearson_delta)
└── esm/                     scGPT+ESM capture (~312 GB), same layout
```

Total: ~625 GB across 14 data files (7 per side).
| shard | cells | layout |
|---|---|---|
| 0 | 19,200 | full (50 Γ 384) |
| 1 | 19,200 | full |
| 2 | 19,200 | full |
| 3 | 14,500 | partial (last batch trims to 72,100 total) |
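The shard counts above follow directly from the capture parameters (50 batches × batch 384 = 19,200 cells per full shard, last shard trimmed to reach 72,100). A minimal sketch of that arithmetic, using a hypothetical helper not taken from the source repo:

```python
def shard_sizes(total_cells: int, batch_size: int = 384,
                batches_per_shard: int = 50) -> list[int]:
    """Split total_cells into full shards of batches_per_shard * batch_size
    cells, trimming the final shard (mirrors the table above)."""
    full = batches_per_shard * batch_size  # 19,200 cells per full shard
    sizes = []
    remaining = total_cells
    while remaining > 0:
        sizes.append(min(full, remaining))
        remaining -= full
    return sizes

print(shard_sizes(72_100))  # → [19200, 19200, 19200, 14500]
```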
## Capture configuration

Both runs share:

| Setting | Value |
|---|---|
| Source code commit | 7384c03 |
| Runner | `python -m scripts.run scgpt --dataset replogle --split sample` |
| Sample | `--sample-by cell_line,gene --sample-n-per-bucket 25 --seed 42` → 72,100 cells across 2,884 buckets |
| Sample order | `rng.permutation`-shuffled at sample build time, so each shard interleaves cell lines and perturbations |
| Capture layers | `transformer_encoder.layers.{5,7,10}` |
| Capture dtype | fp16 |
| Shard size | 50 batches × batch 384 = 19,200 cells/shard |
| Compression | gzip-4 (lossless) |
| Hardware | NVIDIA H100 PCIe (80 GB) |
The two captures differ only in the model: base loads
`replogle_base_ft/best_model.pt` directly; ESM loads
`replogle_esm_ft/best_model.pt` after constructing the model with the frozen
ESM2-15B per-gene prior (`scgpt_esm_prior.safetensors`, 5120 → 512 linear).
## Sample order: shuffled, not sorted

Earlier versions of this dataset stored cells in obs-frame index order, which left each shard entirely one cell line (K562 in early shards, RPE1 in later shards). Downstream SAE/crosscoder training therefore needed an explicit global shuffle to avoid mid-epoch distribution shift.
The current capture seeds `rng.permutation` over the concatenated bucket
indices (commit 7384c03), so each shard contains a mix close to the global
proportion (empirically ~52/48 K562/RPE1 per shard, since the sample
over-weights shared perts). Sequential `iter_chunks(chunk_rows=256)` plus a
chunk-local shuffle is now sufficient for IID-style training batches; no
global-shuffle dataloader is needed downstream.
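As an illustrative numpy sketch of this scheme (a toy two-bucket table stands in for the real run's 2,884 (cell_line, gene) buckets; the bucket names and index ranges are invented):

```python
import numpy as np

rng = np.random.default_rng(42)  # stand-in for the run's --seed 42

# toy obs table: candidate cell indices per (cell_line, gene) bucket
buckets = {
    "K562:GENE1": np.arange(0, 60),
    "RPE1:GENE1": np.arange(60, 120),
}

# 25 cells per bucket, then one global permutation at sample build time,
# so downstream shards interleave cell lines and perturbations
sampled = np.concatenate(
    [rng.choice(idx, size=25, replace=False) for idx in buckets.values()]
)
sample_indices = sampled[rng.permutation(len(sampled))]

print(sample_indices.shape)  # (50,)
```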
## Determinism and base/esm alignment
Same --seed 42, same dataset preprocessing, same balanced-sample bucket
indices (and the same rng.permutation order from a single seeded RNG),
same per-cell-line basal pairings. Train/val/test pair counts also match
between runs (180,021 / 8,569 / 109,207). The 72,100-row sample is
bit-identical at the cell-id level; only the captured activations differ.
For every shard, `cell_id` arrays compare element-wise equal between
`base/shard-NNNNN.h5` and `esm/shard-NNNNN.h5`, so position N in the base
shard pairs with position N in the matching esm shard. This is the
load-bearing invariant for crosscoder pairing.

`obs.index` of `predictions.h5ad` matches `meta/cell_id` in each shard h5,
so joining cells across files is by string key, not row index. CUDA RNG and
hardware-level non-determinism mean the captured activations themselves are
not bit-reproducible across runs.
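This pairing invariant is cheap to assert before training. A hedged sketch with a hypothetical helper (in practice the `cell_id` arrays would come from the paired shard h5 files):

```python
import numpy as np

def check_pairing(base_ids, esm_ids) -> bool:
    """True iff position N in the base shard holds the same cell as
    position N in the matching esm shard (element-wise, order-sensitive)."""
    base_ids, esm_ids = np.asarray(base_ids), np.asarray(esm_ids)
    return base_ids.shape == esm_ids.shape and bool((base_ids == esm_ids).all())

# aligned shards pass; a mere set match (reordered ids) must fail
assert check_pairing(["c1", "c2", "c3"], ["c1", "c2", "c3"])
assert not check_pairing(["c1", "c2", "c3"], ["c3", "c2", "c1"])
```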
## Files
- `shard-NNNNN.h5`: per-batch h5 layout from `H5ActivationSink` (see `scripts/interp/hook_sinks.py` in the source repo for the schema and reader). Each shard contains:
  - the captured fp16 BTD tensor at `<capture>/<tags>/activation` with shape `(B, T=1536, D=512)`,
  - per-cell labels under `<capture>/<tags>/labels/{cell_id, pert, cell_index, gene_dataset_ids}`,
  - per-shard Welford accumulators at `<capture>/<tags>/running_stats/{count, mean, M2}`.
- `stats.h5`: global per-D Welford accumulators across all shards in this run, flushed at sink close. Shape `(D,)` (token axis collapsed at write time, suitable as the normalizer for crosscoder training without further backfill).
- `predictions.h5ad`: self-contained predictions (`.X` = predicted log-norm expression, `.layers["truth"]` = ground truth). 0 control cells appended: the sample loader already includes balanced controls as their own buckets, unlike train/val/test where bulk controls get appended for cell-eval.
- `training_stats.json`: provenance for the underlying model: wandb run URL, epoch count, best validation `pearson_delta`, total training cells.
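A sketch of how `(count, mean, M2)` Welford accumulators turn into a normalizer; the helper name is ours, and the toy data stands in for what `stats.h5` would provide:

```python
import numpy as np

def welford_to_mean_std(count, mean, M2, eps=1e-8):
    """Turn per-D Welford accumulators into a (mean, std) pair,
    e.g. for standardizing activations before crosscoder training."""
    var = M2 / max(count - 1, 1)          # unbiased sample variance
    return mean, np.sqrt(var + eps)

# toy check: accumulators built directly from data match numpy's std
x = np.random.default_rng(0).normal(size=(1000, 4))
count, mean = x.shape[0], x.mean(axis=0)
M2 = ((x - mean) ** 2).sum(axis=0)        # Welford's M2 = sum of sq. deviations
m, s = welford_to_mean_std(count, mean, M2)
assert np.allclose(s, x.std(axis=0, ddof=1), atol=1e-4)
```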
## Source

mattshu0410/sc-interp: the runner and `HookManager`/`H5ActivationSink` that produced these captures.