hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
3db04ddd83cdd4b14eac2ae6a032b2875265f32d | 53 | py | Python | thaifin/__init__.py | CircleOnCircles/thaifin | 8dfa0dfa9dd94e3a9b76a6830f9e317565212dc5 | [
"0BSD"
] | 7 | 2020-10-22T04:02:01.000Z | 2021-05-26T07:06:12.000Z | thaifin/__init__.py | CircleOnCircles/thaifin | 8dfa0dfa9dd94e3a9b76a6830f9e317565212dc5 | [
"0BSD"
] | 4 | 2020-09-10T02:40:28.000Z | 2022-02-11T10:52:19.000Z | thaifin/__init__.py | CircleOnCircles/thaifin | 8dfa0dfa9dd94e3a9b76a6830f9e317565212dc5 | [
"0BSD"
] | 2 | 2020-11-27T02:10:22.000Z | 2021-08-14T14:26:01.000Z | from thaifin.stock import Stock
__all__ = ["Stock"]
| 13.25 | 31 | 0.735849 | 7 | 53 | 5 | 0.714286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.150943 | 53 | 3 | 32 | 17.666667 | 0.777778 | 0 | 0 | 0 | 0 | 0 | 0.09434 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
3dd68e55d443ebd4767ffcf426747c37e3abbdf5 | 1,222 | py | Python | python/src/iceberg/exceptions.py | xloya/iceberg | 5d6c6ccecc43f9d9d2348fddbde45b747016d643 | [
"Apache-2.0"
] | 502 | 2018-11-20T12:19:42.000Z | 2020-05-27T08:50:04.000Z | python/src/iceberg/exceptions.py | xloya/iceberg | 5d6c6ccecc43f9d9d2348fddbde45b747016d643 | [
"Apache-2.0"
] | 926 | 2018-11-26T17:35:21.000Z | 2020-05-27T20:10:05.000Z | python/src/iceberg/exceptions.py | xloya/iceberg | 5d6c6ccecc43f9d9d2348fddbde45b747016d643 | [
"Apache-2.0"
] | 223 | 2018-11-20T20:29:56.000Z | 2020-05-27T16:57:30.000Z | # Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
class NoSuchTableError(Exception):
"""Raised when a referenced table is not found"""
class NoSuchNamespaceError(Exception):
    """Raised when a referenced namespace is not found"""
class NamespaceNotEmptyError(Exception):
    """Raised when a namespace being dropped is not empty"""
class AlreadyExistsError(Exception):
    """Raised when a table or namespace being created already exists in the catalog"""
| 37.030303 | 87 | 0.753682 | 173 | 1,222 | 5.323699 | 0.531792 | 0.065147 | 0.082519 | 0.086862 | 0.065147 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003984 | 0.178396 | 1,222 | 32 | 88 | 38.1875 | 0.913347 | 0.810147 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 3 |
3de0a568e053eefa8ecaafd6d34cfe311d256b5e | 1,095 | py | Python | regene/regene.py | rinslow/regene | b1aa6cf9dcd436d547e9d4276bb850644dfc53aa | [
"MIT"
] | 2 | 2018-02-23T23:12:44.000Z | 2019-04-28T03:20:17.000Z | regene/regene.py | rinslow/regene | b1aa6cf9dcd436d547e9d4276bb850644dfc53aa | [
"MIT"
] | 2 | 2018-02-27T21:36:24.000Z | 2018-03-09T11:28:44.000Z | regene/regene.py | rinslow/regene | b1aa6cf9dcd436d547e9d4276bb850644dfc53aa | [
"MIT"
] | null | null | null | from regene.compile.compose import Composer
from regene.compile.regular_expression import RegularExpression
class Regene(object):
def __init__(self, expression: str):
self.expression = expression
def random(self) -> str:
"""A random string that would match given expression."""
raise NotImplementedError("No support for randoms yet.")
def minimal(self) -> str:
raise NotImplementedError("No support for minimals yet.")
    def _precompiled_expression(self):
# TODO: Remove spaces from quantifiers
return (self.expression.replace(r"\d", r"[0-9]")
.replace(r"\D", r"[^0-9]")
.replace(r"\s", r"[ \t\n\f\r]")
.replace(r"\S", r"[^ \t\n\f\r]")
.replace(r"\w", r"[a-zA-Z_0-9]")
.replace(r"\W", r"[^a-zA-Z_0-9]"))
def simple(self) -> str:
"""Minimal string that would match a given expression."""
return str(Composer(RegularExpression(self.expression)).enter())
| 39.107143 | 72 | 0.558904 | 131 | 1,095 | 4.603053 | 0.396947 | 0.079602 | 0.044776 | 0.049751 | 0.258706 | 0.139303 | 0.139303 | 0.139303 | 0.099502 | 0.059701 | 0 | 0.010417 | 0.29863 | 1,095 | 27 | 73 | 40.555556 | 0.77474 | 0.127854 | 0 | 0 | 0 | 0 | 0.133475 | 0 | 0 | 0 | 0 | 0.037037 | 0 | 1 | 0.277778 | false | 0 | 0.111111 | 0.055556 | 0.555556 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 3 |
3de0bbbdc01ac9ddc57fac6cdcc8b7fd30a9fcd6 | 505 | py | Python | ai/src/algorithms/dql/__init__.py | ScriptBox99/spiceai | f8aa178fed5cc6d6d9397c123bdc869500c5135b | [
"Apache-2.0"
] | 713 | 2021-09-07T19:57:25.000Z | 2022-03-21T02:31:02.000Z | ai/src/algorithms/dql/__init__.py | ScriptBox99/spiceai | f8aa178fed5cc6d6d9397c123bdc869500c5135b | [
"Apache-2.0"
] | 133 | 2021-09-07T17:34:16.000Z | 2022-02-27T17:34:31.000Z | ai/src/algorithms/dql/__init__.py | ScriptBox99/spiceai | f8aa178fed5cc6d6d9397c123bdc869500c5135b | [
"Apache-2.0"
] | 29 | 2021-09-07T23:46:20.000Z | 2022-02-11T21:11:04.000Z | # Spice.ai implementation of Deep Q Learning (DQL)
#
# Explanation from: https://www.tensorflow.org/agents/tutorials/0_intro_rl
#
# The DQN (Deep Q-Network) algorithm was developed by DeepMind in 2015. It was able to
# solve a wide range of Atari games (some to superhuman level) by combining reinforcement
# learning and deep neural networks at scale. The algorithm was developed by enhancing a
# classic RL algorithm called Q-Learning with deep neural networks and a technique called experience replay.
| 56.111111 | 108 | 0.792079 | 79 | 505 | 5.037975 | 0.696203 | 0.025126 | 0.105528 | 0.115578 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.011655 | 0.150495 | 505 | 8 | 109 | 63.125 | 0.916084 | 0.966337 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
3de6cb202f9749ce24a0802312b3ba58bf5525b1 | 303 | py | Python | zunzun/config/config.py | aprezcuba24/zunzun | cc294d9dfb84695be0ed1425cf946a0f4ea644a9 | [
"MIT"
] | null | null | null | zunzun/config/config.py | aprezcuba24/zunzun | cc294d9dfb84695be0ed1425cf946a0f4ea644a9 | [
"MIT"
] | null | null | null | zunzun/config/config.py | aprezcuba24/zunzun | cc294d9dfb84695be0ed1425cf946a0f4ea644a9 | [
"MIT"
] | null | null | null | import importlib
class Config:
def __init__(self, config, env):
self._config = importlib.import_module(f"{config}.{env}")
def __getattr__(self, name):
return getattr(self._config, name, None)
def get(self, name, default=None):
return getattr(self, name, default)
| 23.307692 | 65 | 0.660066 | 38 | 303 | 4.973684 | 0.421053 | 0.15873 | 0.15873 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.217822 | 303 | 12 | 66 | 25.25 | 0.797468 | 0 | 0 | 0 | 0 | 0 | 0.046205 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.375 | false | 0 | 0.25 | 0.25 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
3de953281595bd1eb2444b36f0953e565227b71a | 1,217 | py | Python | dsrs/migrations/0008_auto_20210309_1922.py | raihanba13/DRF-API-Development | e137c2088d2eed8fd899760ad444bbc093d6e929 | [
"MIT"
] | 1 | 2021-12-12T12:05:25.000Z | 2021-12-12T12:05:25.000Z | dsrs/migrations/0008_auto_20210309_1922.py | raihanba13/DRF-API-Development | e137c2088d2eed8fd899760ad444bbc093d6e929 | [
"MIT"
] | null | null | null | dsrs/migrations/0008_auto_20210309_1922.py | raihanba13/DRF-API-Development | e137c2088d2eed8fd899760ad444bbc093d6e929 | [
"MIT"
] | null | null | null | # Generated by Django 3.1 on 2021-03-09 13:22
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('dsrs', '0007_auto_20210309_1835'),
]
operations = [
migrations.AlterField(
model_name='resource',
name='artists',
field=models.CharField(max_length=150, null=True),
),
migrations.AlterField(
model_name='resource',
name='dsrs',
field=models.IntegerField(default=0, null=True),
),
migrations.AlterField(
model_name='resource',
name='isrc',
field=models.CharField(max_length=20, null=True),
),
migrations.AlterField(
model_name='resource',
name='revenue',
field=models.FloatField(default=0, null=True),
),
migrations.AlterField(
model_name='resource',
name='title',
field=models.CharField(max_length=40, null=True),
),
migrations.AlterField(
model_name='resource',
name='usages',
field=models.IntegerField(default=0, null=True),
),
]
| 27.659091 | 62 | 0.550534 | 115 | 1,217 | 5.721739 | 0.408696 | 0.182371 | 0.227964 | 0.264438 | 0.68541 | 0.553191 | 0.490881 | 0.396657 | 0.173252 | 0.173252 | 0 | 0.04908 | 0.33032 | 1,217 | 43 | 63 | 28.302326 | 0.758282 | 0.035333 | 0 | 0.540541 | 1 | 0 | 0.09215 | 0.019625 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.027027 | 0 | 0.108108 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
9a95bf807486445b3e33a03e915b63dc7feda301 | 603 | py | Python | wannacri/usm/__init__.py | emoose/WannaCRI | 8d1d2d66c19b669c8c937e3ed585d00d422f0d62 | [
"MIT"
] | 21 | 2021-08-08T20:53:56.000Z | 2022-03-27T06:34:58.000Z | wannacri/usm/__init__.py | emoose/WannaCRI | 8d1d2d66c19b669c8c937e3ed585d00d422f0d62 | [
"MIT"
] | 5 | 2021-08-08T20:53:48.000Z | 2022-03-30T13:34:18.000Z | wannacri/usm/__init__.py | emoose/WannaCRI | 8d1d2d66c19b669c8c937e3ed585d00d422f0d62 | [
"MIT"
] | 4 | 2021-08-31T23:00:29.000Z | 2022-03-18T17:22:51.000Z | from .tools import (
chunk_size_and_padding,
generate_keys,
is_valid_chunk,
encrypt_video_packet,
decrypt_video_packet,
encrypt_audio_packet,
decrypt_audio_packet,
get_video_header_end_offset,
is_usm,
)
from .page import UsmPage, get_pages, pack_pages
from .usm import Usm
from .chunk import UsmChunk
from .media import UsmMedia, UsmVideo, UsmAudio, GenericVideo, GenericAudio, Vp9
from .types import OpMode, ElementOccurrence, ElementType, PayloadType, ChunkType
import logging
from logging import NullHandler
logging.getLogger(__name__).addHandler(NullHandler())
| 27.409091 | 81 | 0.791045 | 75 | 603 | 6.026667 | 0.586667 | 0.048673 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.001949 | 0.149254 | 603 | 21 | 82 | 28.714286 | 0.879142 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.421053 | 0 | 0.421053 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
9aac1e2c235852c5a0152ef4459135246541bcfb | 496 | py | Python | tests/test_cli.py | pluce/meddra-toolkit | be360a59be6de9a279a8547c824e2a6c1f2534a9 | [
"MIT"
] | 1 | 2021-12-18T07:03:43.000Z | 2021-12-18T07:03:43.000Z | tests/test_cli.py | pluce/meddra-toolkit | be360a59be6de9a279a8547c824e2a6c1f2534a9 | [
"MIT"
] | 1 | 2022-03-11T20:14:18.000Z | 2022-03-11T20:14:18.000Z | tests/test_cli.py | posos-tech/meddra-toolkit | be360a59be6de9a279a8547c824e2a6c1f2534a9 | [
"MIT"
] | null | null | null | """Sample integration test module using pytest-describe and expecter."""
# pylint: disable=redefined-outer-name,unused-variable,expression-not-assigned
import pytest
from click.testing import CliRunner
from expecter import expect
from meddra_toolkit.cli import main
@pytest.fixture
def runner():
return CliRunner()
def describe_cli():
def describe_conversion():
def when_integer(runner):
result = runner.invoke(main)
# expect(result.exit_code) == 1
| 22.545455 | 78 | 0.725806 | 61 | 496 | 5.819672 | 0.672131 | 0.061972 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.002463 | 0.181452 | 496 | 21 | 79 | 23.619048 | 0.871921 | 0.350806 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.363636 | false | 0 | 0.363636 | 0.090909 | 0.818182 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 3 |
9aefa53294926ef5799c2973eb5aa354306abd68 | 86,489 | py | Python | tests/test_patch3pt.py | llinke1/TreeCorr | 02f4c0547ac1917f77a9e1e3c55d7677fd2ec78f | [
"BSD-2-Clause-FreeBSD"
] | 86 | 2015-02-09T05:46:13.000Z | 2022-01-12T17:00:33.000Z | tests/test_patch3pt.py | llinke1/TreeCorr | 02f4c0547ac1917f77a9e1e3c55d7677fd2ec78f | [
"BSD-2-Clause-FreeBSD"
] | 102 | 2015-02-25T04:41:34.000Z | 2022-03-16T23:41:53.000Z | tests/test_patch3pt.py | llinke1/TreeCorr | 02f4c0547ac1917f77a9e1e3c55d7677fd2ec78f | [
"BSD-2-Clause-FreeBSD"
] | 38 | 2015-07-20T15:14:12.000Z | 2022-03-24T06:37:01.000Z | # Copyright (c) 2003-2019 by Mike Jarvis
#
# TreeCorr is free software: redistribution and use in source and binary forms,
# with or without modification, are permitted provided that the following
# conditions are met:
#
# 1. Redistributions of source code must retain the above copyright notice, this
# list of conditions, and the disclaimer given in the accompanying LICENSE
# file.
# 2. Redistributions in binary form must reproduce the above copyright notice,
# this list of conditions, and the disclaimer given in the documentation
# and/or other materials provided with the distribution.
from __future__ import print_function
import numpy as np
import os
import coord
import time
import fitsio
import treecorr
from test_helper import assert_raises, do_pickle, timer, get_from_wiki, CaptureLog, clear_save
from test_helper import profile
def generate_shear_field(npos, nhalo, rng=None):
# We do something completely different here than we did for 2pt patch tests.
# A straight Gaussian field with a given power spectrum has no significant 3pt power,
# so it's not a great choice for simulating a field for 3pt tests.
# Instead we place N SIS "halos" randomly in the grid.
# Then we translate that to a shear field via FFT.
if rng is None:
rng = np.random.RandomState()
# Generate x,y values for the real-space field
x = rng.uniform(0,1000, size=npos)
y = rng.uniform(0,1000, size=npos)
nh = rng.poisson(nhalo)
# Fill the kappa values with SIS halo profiles.
xc = rng.uniform(0,1000, size=nh)
yc = rng.uniform(0,1000, size=nh)
scale = rng.uniform(20,50, size=nh)
mass = rng.uniform(0.01, 0.05, size=nh)
# Avoid making huge nhalo * nsource arrays. Loop in blocks of 64 halos
nblock = (nh-1) // 64 + 1
kappa = np.zeros_like(x)
gamma = np.zeros_like(x, dtype=complex)
for iblock in range(nblock):
i = iblock*64
j = (iblock+1)*64
dx = x[:,np.newaxis]-xc[np.newaxis,i:j]
dy = y[:,np.newaxis]-yc[np.newaxis,i:j]
dx[dx==0] = 1 # Avoid division by zero.
dy[dy==0] = 1
dx /= scale[i:j]
dy /= scale[i:j]
rsq = dx**2 + dy**2
r = rsq**0.5
        k = mass[i:j] / r  # "Mass" here is really just a dimensionless normalization proportional to mass.
kappa += np.sum(k, axis=1)
# gamma_t = kappa for SIS.
g = -k * (dx + 1j*dy)**2 / rsq
gamma += np.sum(g, axis=1)
return x, y, np.real(gamma), np.imag(gamma), kappa
@timer
def test_kkk_jk():
# Test jackknife and other covariance estimates for kkk correlations.
# Note: This test takes a while!
# The main version I think is a pretty decent test of the code correctness.
# It shows that bootstrap in particular easily gets to within 50% of the right variance.
# Sometimes within 20%, but because of the randomness there, it varies a bit.
# Jackknife isn't much worse. Just a little below 50%. But still pretty good.
# Sample and Marked are not great for this test. I think they will work ok when the
# triangles of interest are mostly within single patches, but that's not the case we
# have here, and it would take a lot more points to get to that regime. So the
# accuracy tests for those two are pretty loose.
if __name__ == '__main__':
# This setup takes about 740 sec to run.
nhalo = 3000
nsource = 5000
npatch = 32
tol_factor = 1
elif False:
# This setup takes about 180 sec to run.
nhalo = 2000
nsource = 2000
npatch = 16
tol_factor = 2
elif False:
# This setup takes about 51 sec to run.
nhalo = 1000
nsource = 1000
npatch = 16
tol_factor = 3
else:
# This setup takes about 20 sec to run.
# So we use this one for regular unit test runs.
# It's pretty terrible in terms of testing the accuracy, but it works for code coverage.
# But whenever actually working on this part of the code, definitely need to switch
# to one of the above setups. Preferably run the name==main version to get a good
# test of the code correctness.
nhalo = 500
nsource = 500
npatch = 16
tol_factor = 4
file_name = 'data/test_kkk_jk_{}.npz'.format(nsource)
print(file_name)
if not os.path.isfile(file_name):
nruns = 1000
all_kkks = []
rng1 = np.random.RandomState()
for run in range(nruns):
x, y, _, _, k = generate_shear_field(nsource, nhalo, rng1)
print(run,': ',np.mean(k),np.std(k))
cat = treecorr.Catalog(x=x, y=y, k=k)
kkk = treecorr.KKKCorrelation(nbins=3, min_sep=30., max_sep=100.,
min_u=0.9, max_u=1.0, nubins=1,
min_v=0.0, max_v=0.1, nvbins=1)
kkk.process(cat)
print(kkk.ntri.ravel().tolist())
print(kkk.zeta.ravel().tolist())
all_kkks.append(kkk)
mean_kkk = np.mean([kkk.zeta.ravel() for kkk in all_kkks], axis=0)
var_kkk = np.var([kkk.zeta.ravel() for kkk in all_kkks], axis=0)
np.savez(file_name, all_kkk=np.array([kkk.zeta.ravel() for kkk in all_kkks]),
mean_kkk=mean_kkk, var_kkk=var_kkk)
data = np.load(file_name)
mean_kkk = data['mean_kkk']
var_kkk = data['var_kkk']
print('mean = ',mean_kkk)
print('var = ',var_kkk)
rng = np.random.RandomState(12345)
x, y, _, _, k = generate_shear_field(nsource, nhalo, rng)
cat = treecorr.Catalog(x=x, y=y, k=k)
kkk = treecorr.KKKCorrelation(nbins=3, min_sep=30., max_sep=100.,
min_u=0.9, max_u=1.0, nubins=1,
min_v=0.0, max_v=0.1, nvbins=1, rng=rng)
kkk.process(cat)
print(kkk.ntri.ravel())
print(kkk.zeta.ravel())
print(kkk.varzeta.ravel())
kkkp = kkk.copy()
catp = treecorr.Catalog(x=x, y=y, k=k, npatch=npatch)
# Do the same thing with patches.
kkkp.process(catp)
print('with patches:')
print(kkkp.ntri.ravel())
print(kkkp.zeta.ravel())
print(kkkp.varzeta.ravel())
np.testing.assert_allclose(kkkp.ntri, kkk.ntri, rtol=0.05 * tol_factor)
np.testing.assert_allclose(kkkp.zeta, kkk.zeta, rtol=0.1 * tol_factor, atol=1e-3 * tol_factor)
np.testing.assert_allclose(kkkp.varzeta, kkk.varzeta, rtol=0.05 * tol_factor, atol=3.e-6)
print('jackknife:')
cov = kkkp.estimate_cov('jackknife')
print(np.diagonal(cov))
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_kkk))))
np.testing.assert_allclose(np.diagonal(cov), var_kkk, rtol=0.6 * tol_factor)
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_kkk), atol=0.5*tol_factor)
print('sample:')
cov = kkkp.estimate_cov('sample')
print(np.diagonal(cov))
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_kkk))))
np.testing.assert_allclose(np.diagonal(cov), var_kkk, rtol=0.7 * tol_factor)
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_kkk), atol=0.7*tol_factor)
print('marked:')
cov = kkkp.estimate_cov('marked_bootstrap')
print(np.diagonal(cov))
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_kkk))))
np.testing.assert_allclose(np.diagonal(cov), var_kkk, rtol=0.7 * tol_factor)
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_kkk), atol=0.7*tol_factor)
print('bootstrap:')
cov = kkkp.estimate_cov('bootstrap')
print(np.diagonal(cov))
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_kkk))))
np.testing.assert_allclose(np.diagonal(cov), var_kkk, rtol=0.5 * tol_factor)
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_kkk), atol=0.3*tol_factor)
# Now as a cross correlation with all 3 using the same patch catalog.
print('with 3 patched catalogs:')
kkkp.process(catp, catp, catp)
print(kkkp.zeta.ravel())
np.testing.assert_allclose(kkkp.zeta, kkk.zeta, rtol=0.1 * tol_factor, atol=1e-3 * tol_factor)
print('jackknife:')
cov = kkkp.estimate_cov('jackknife')
print(np.diagonal(cov))
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_kkk))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_kkk), atol=0.5*tol_factor)
print('sample:')
cov = kkkp.estimate_cov('sample')
print(np.diagonal(cov))
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_kkk))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_kkk), atol=0.8*tol_factor)
print('marked:')
cov = kkkp.estimate_cov('marked_bootstrap')
print(np.diagonal(cov))
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_kkk))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_kkk), atol=0.8*tol_factor)
print('bootstrap:')
cov = kkkp.estimate_cov('bootstrap')
print(np.diagonal(cov))
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_kkk))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_kkk), atol=0.3*tol_factor)
# Repeat this test with different combinations of patch with non-patch catalogs:
# All the methods work best when the patches are used for all 3 catalogs. But there
# are probably cases where this kind of cross correlation with only some catalogs having
# patches could be desired. So this mostly just checks that the code runs properly.
# Patch on 1 only:
print('with patches on 1 only:')
kkkp.process(catp, cat)
print(kkkp.zeta.ravel())
np.testing.assert_allclose(kkkp.zeta, kkk.zeta, rtol=0.1 * tol_factor, atol=1e-3 * tol_factor)
print('jackknife:')
cov = kkkp.estimate_cov('jackknife')
print(np.diagonal(cov))
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_kkk))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_kkk), atol=0.8*tol_factor)
print('sample:')
cov = kkkp.estimate_cov('sample')
print(np.diagonal(cov))
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_kkk))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_kkk), atol=0.8*tol_factor)
print('marked:')
cov = kkkp.estimate_cov('marked_bootstrap')
print(np.diagonal(cov))
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_kkk))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_kkk), atol=0.8*tol_factor)
print('bootstrap:')
cov = kkkp.estimate_cov('bootstrap')
print(np.diagonal(cov))
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_kkk))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_kkk), atol=0.8*tol_factor)
# Patch on 2 only:
print('with patches on 2 only:')
kkkp.process(cat, catp, cat)
print(kkkp.zeta.ravel())
np.testing.assert_allclose(kkkp.zeta, kkk.zeta, rtol=0.1 * tol_factor, atol=1e-3 * tol_factor)
print('jackknife:')
cov = kkkp.estimate_cov('jackknife')
print(np.diagonal(cov))
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_kkk))))
np.testing.assert_allclose(np.diagonal(cov), var_kkk, rtol=0.9 * tol_factor)
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_kkk), atol=0.8*tol_factor)
print('sample:')
cov = kkkp.estimate_cov('sample')
print(np.diagonal(cov))
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_kkk))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_kkk), atol=0.8*tol_factor)
print('marked:')
cov = kkkp.estimate_cov('marked_bootstrap')
print(np.diagonal(cov))
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_kkk))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_kkk), atol=0.8*tol_factor)
print('bootstrap:')
cov = kkkp.estimate_cov('bootstrap')
print(np.diagonal(cov))
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_kkk))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_kkk), atol=0.8*tol_factor)
# Patch on 3 only:
print('with patches on 3 only:')
kkkp.process(cat, cat, catp)
print(kkkp.zeta.ravel())
np.testing.assert_allclose(kkkp.zeta, kkk.zeta, rtol=0.1 * tol_factor, atol=1e-3 * tol_factor)
print('jackknife:')
cov = kkkp.estimate_cov('jackknife')
print(np.diagonal(cov))
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_kkk))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_kkk), atol=0.8*tol_factor)
print('sample:')
cov = kkkp.estimate_cov('sample')
print(np.diagonal(cov))
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_kkk))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_kkk), atol=0.8*tol_factor)
print('marked:')
cov = kkkp.estimate_cov('marked_bootstrap')
print(np.diagonal(cov))
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_kkk))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_kkk), atol=0.8*tol_factor)
print('bootstrap:')
cov = kkkp.estimate_cov('bootstrap')
print(np.diagonal(cov))
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_kkk))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_kkk), atol=0.8*tol_factor)
# Patch on 1,2
print('with patches on 1,2:')
kkkp.process(catp, catp, cat)
print(kkkp.zeta.ravel())
np.testing.assert_allclose(kkkp.zeta, kkk.zeta, rtol=0.1 * tol_factor, atol=1e-3 * tol_factor)
print('jackknife:')
cov = kkkp.estimate_cov('jackknife')
print(np.diagonal(cov))
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_kkk))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_kkk), atol=0.3*tol_factor)
print('sample:')
cov = kkkp.estimate_cov('sample')
print(np.diagonal(cov))
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_kkk))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_kkk), atol=0.8*tol_factor)
print('marked:')
cov = kkkp.estimate_cov('marked_bootstrap')
print(np.diagonal(cov))
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_kkk))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_kkk), atol=0.8*tol_factor)
print('bootstrap:')
cov = kkkp.estimate_cov('bootstrap')
print(np.diagonal(cov))
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_kkk))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_kkk), atol=0.4*tol_factor)
# Patch on 2,3
print('with patches on 2,3:')
kkkp.process(cat, catp)
print(kkkp.zeta.ravel())
np.testing.assert_allclose(kkkp.zeta, kkk.zeta, rtol=0.1 * tol_factor, atol=1e-3 * tol_factor)
print('jackknife:')
cov = kkkp.estimate_cov('jackknife')
print(np.diagonal(cov))
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_kkk))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_kkk), atol=0.3*tol_factor)
print('sample:')
cov = kkkp.estimate_cov('sample')
print(np.diagonal(cov))
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_kkk))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_kkk), atol=0.7*tol_factor)
print('marked:')
cov = kkkp.estimate_cov('marked_bootstrap')
print(np.diagonal(cov))
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_kkk))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_kkk), atol=0.8*tol_factor)
print('bootstrap:')
cov = kkkp.estimate_cov('bootstrap')
print(np.diagonal(cov))
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_kkk))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_kkk), atol=0.3*tol_factor)
# Patch on 1,3
print('with patches on 1,3:')
kkkp.process(catp, cat, catp)
print(kkkp.zeta.ravel())
np.testing.assert_allclose(kkkp.zeta, kkk.zeta, rtol=0.1 * tol_factor, atol=1e-3 * tol_factor)
print('jackknife:')
cov = kkkp.estimate_cov('jackknife')
print(np.diagonal(cov))
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_kkk))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_kkk), atol=0.3*tol_factor)
print('sample:')
cov = kkkp.estimate_cov('sample')
print(np.diagonal(cov))
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_kkk))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_kkk), atol=0.8*tol_factor)
print('marked:')
cov = kkkp.estimate_cov('marked_bootstrap')
print(np.diagonal(cov))
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_kkk))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_kkk), atol=0.8*tol_factor)
print('bootstrap:')
cov = kkkp.estimate_cov('bootstrap')
print(np.diagonal(cov))
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_kkk))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_kkk), atol=0.3*tol_factor)
# Finally, do a set (with all patches) using the KKKCrossCorrelation class.
kkkc = treecorr.KKKCrossCorrelation(nbins=3, min_sep=30., max_sep=100.,
min_u=0.9, max_u=1.0, nubins=1,
min_v=0.0, max_v=0.1, nvbins=1, rng=rng)
print('CrossCorrelation:')
kkkc.process(catp, catp, catp)
for k1 in kkkc._all:
print(k1.ntri.ravel())
print(k1.zeta.ravel())
print(k1.varzeta.ravel())
np.testing.assert_allclose(k1.ntri, kkk.ntri, rtol=0.05 * tol_factor)
np.testing.assert_allclose(k1.zeta, kkk.zeta, rtol=0.1 * tol_factor, atol=1e-3 * tol_factor)
np.testing.assert_allclose(k1.varzeta, kkk.varzeta, rtol=0.05 * tol_factor, atol=3.e-6)
print('jackknife:')
cov = kkkc.estimate_cov('jackknife')
print(np.diagonal(cov))
for i in range(6):
v = np.diagonal(cov)[i*6:(i+1)*6]
print('max log(ratio) = ',np.max(np.abs(np.log(v)-np.log(var_kkk))))
np.testing.assert_allclose(np.log(v), np.log(var_kkk), atol=0.5*tol_factor)
print('sample:')
cov = kkkc.estimate_cov('sample')
print(np.diagonal(cov))
for i in range(6):
v = np.diagonal(cov)[i*6:(i+1)*6]
print('max log(ratio) = ',np.max(np.abs(np.log(v)-np.log(var_kkk))))
np.testing.assert_allclose(np.log(v), np.log(var_kkk), atol=0.8*tol_factor)
print('marked:')
cov = kkkc.estimate_cov('marked_bootstrap')
print(np.diagonal(cov))
for i in range(6):
v = np.diagonal(cov)[i*6:(i+1)*6]
print('max log(ratio) = ',np.max(np.abs(np.log(v)-np.log(var_kkk))))
np.testing.assert_allclose(np.log(v), np.log(var_kkk), atol=0.8*tol_factor)
print('bootstrap:')
cov = kkkc.estimate_cov('bootstrap')
print(np.diagonal(cov))
for i in range(6):
v = np.diagonal(cov)[i*6:(i+1)*6]
print('max log(ratio) = ',np.max(np.abs(np.log(v)-np.log(var_kkk))))
np.testing.assert_allclose(np.log(v), np.log(var_kkk), atol=0.5*tol_factor)
# All catalogs need to have the same number of patches
catq = treecorr.Catalog(x=x, y=y, k=k, npatch=2*npatch)
with assert_raises(RuntimeError):
kkkp.process(catp, catq)
with assert_raises(RuntimeError):
kkkp.process(catp, catq, catq)
with assert_raises(RuntimeError):
kkkp.process(catq, catp, catq)
with assert_raises(RuntimeError):
kkkp.process(catq, catq, catp)
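The covariance checks above all rely on the delete-one-patch jackknife. As a minimal standalone sketch of that estimator (the per-patch statistic vectors here are made-up placeholders, not treecorr output; treecorr's estimate_cov('jackknife') applies the same construction to its per-patch results):

```python
import numpy as np

# Hypothetical per-patch statistic vectors standing in for per-patch
# correlation measurements.
rng = np.random.RandomState(0)
npatch = 8
nbin = 3
patch_vals = rng.normal(size=(npatch, nbin))

# Delete-one estimates: the mean over all patches except patch i.
jk = np.array([np.mean(np.delete(patch_vals, i, axis=0), axis=0)
               for i in range(npatch)])
jk_mean = np.mean(jk, axis=0)

# Jackknife covariance carries the (npatch-1)/npatch prefactor.
cov = (npatch - 1) / npatch * (jk - jk_mean).T.dot(jk - jk_mean)
print(cov.shape)  # (3, 3)
```

The prefactor compensates for the strong correlation between delete-one samples, which otherwise understates the variance by roughly a factor of (npatch-1)**2.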
@timer
def test_ggg_jk():
# Test jackknife and other covariance estimates for ggg correlations.
if __name__ == '__main__':
# This setup takes about 590 sec to run.
nhalo = 5000
nsource = 5000
npatch = 32
tol_factor = 1
elif False:
# This setup takes about 160 sec to run.
nhalo = 2000
nsource = 2000
npatch = 16
tol_factor = 2
elif False:
# This setup takes about 50 sec to run.
nhalo = 1000
nsource = 1000
npatch = 16
tol_factor = 3
else:
# This setup takes about 13 sec to run.
nhalo = 500
nsource = 500
npatch = 8
tol_factor = 3
# I couldn't figure out a way to get reasonable S/N in the shear field. I thought doing
# discrete halos would give some significant 3pt shear pattern, at least for equilateral
# triangles, but the signal here is still consistent with zero. :(
# The point of this test is the variance, which is still calculated correctly, but I would
# rather have had something with S/N > 0.
# For these tests, I set up the binning to just accumulate all roughly equilateral triangles
# in a small separation range. The binning always uses two v bins, one for +v and one for -v,
# so this function averages those two values to produce a single value for each gamma.
f = lambda g: np.array([np.mean(g.gam0), np.mean(g.gam1), np.mean(g.gam2), np.mean(g.gam3)])
file_name = 'data/test_ggg_jk_{}.npz'.format(nsource)
print(file_name)
if not os.path.isfile(file_name):
nruns = 1000
all_gggs = []
rng1 = np.random.RandomState()
for run in range(nruns):
x, y, g1, g2, _ = generate_shear_field(nsource, nhalo, rng1)
# For some reason std(g2) is coming out about 1.5x larger than std(g1).
# Probably a sign of some error in the generate function, but I don't see it.
# For this purpose I think it doesn't really matter, but it's a bit odd.
print(run,': ',np.mean(g1),np.std(g1),np.mean(g2),np.std(g2))
cat = treecorr.Catalog(x=x, y=y, g1=g1, g2=g2)
ggg = treecorr.GGGCorrelation(nbins=1, min_sep=20., max_sep=40.,
min_u=0.6, max_u=1.0, nubins=1,
min_v=0.0, max_v=0.6, nvbins=1)
ggg.process(cat)
print(ggg.ntri.ravel())
print(f(ggg))
all_gggs.append(ggg)
all_ggg = np.array([f(ggg) for ggg in all_gggs])
mean_ggg = np.mean(all_ggg, axis=0)
var_ggg = np.var(all_ggg, axis=0)
np.savez(file_name, mean_ggg=mean_ggg, var_ggg=var_ggg)
data = np.load(file_name)
mean_ggg = data['mean_ggg']
var_ggg = data['var_ggg']
print('mean = ',mean_ggg)
print('var = ',var_ggg)
rng = np.random.RandomState(12345)
x, y, g1, g2, _ = generate_shear_field(nsource, nhalo, rng)
cat = treecorr.Catalog(x=x, y=y, g1=g1, g2=g2)
ggg = treecorr.GGGCorrelation(nbins=1, min_sep=20., max_sep=40.,
min_u=0.6, max_u=1.0, nubins=1,
min_v=0.0, max_v=0.6, nvbins=1, rng=rng)
ggg.process(cat)
print(ggg.ntri.ravel())
print(ggg.gam0.ravel())
print(ggg.gam1.ravel())
print(ggg.gam2.ravel())
print(ggg.gam3.ravel())
gggp = ggg.copy()
catp = treecorr.Catalog(x=x, y=y, g1=g1, g2=g2, npatch=npatch)
# Do the same thing with patches.
gggp.process(catp)
print('with patches:')
print(gggp.ntri.ravel())
print(gggp.vargam0.ravel())
print(gggp.vargam1.ravel())
print(gggp.vargam2.ravel())
print(gggp.vargam3.ravel())
print(gggp.gam0.ravel())
print(gggp.gam1.ravel())
print(gggp.gam2.ravel())
print(gggp.gam3.ravel())
np.testing.assert_allclose(gggp.ntri, ggg.ntri, rtol=0.05 * tol_factor)
np.testing.assert_allclose(gggp.gam0, ggg.gam0, rtol=0.3 * tol_factor, atol=0.3 * tol_factor)
np.testing.assert_allclose(gggp.gam1, ggg.gam1, rtol=0.3 * tol_factor, atol=0.3 * tol_factor)
np.testing.assert_allclose(gggp.gam2, ggg.gam2, rtol=0.3 * tol_factor, atol=0.3 * tol_factor)
np.testing.assert_allclose(gggp.gam3, ggg.gam3, rtol=0.3 * tol_factor, atol=0.3 * tol_factor)
np.testing.assert_allclose(gggp.vargam0, ggg.vargam0, rtol=0.1 * tol_factor)
np.testing.assert_allclose(gggp.vargam1, ggg.vargam1, rtol=0.1 * tol_factor)
np.testing.assert_allclose(gggp.vargam2, ggg.vargam2, rtol=0.1 * tol_factor)
np.testing.assert_allclose(gggp.vargam3, ggg.vargam3, rtol=0.1 * tol_factor)
print('jackknife:')
cov = gggp.estimate_cov('jackknife', func=f)
print(np.diagonal(cov).real)
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_ggg))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_ggg), atol=0.4*tol_factor)
print('sample:')
cov = gggp.estimate_cov('sample', func=f)
print(np.diagonal(cov).real)
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_ggg))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_ggg), atol=0.8*tol_factor)
print('marked:')
cov = gggp.estimate_cov('marked_bootstrap', func=f)
print(np.diagonal(cov).real)
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_ggg))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_ggg), atol=0.9*tol_factor)
print('bootstrap:')
cov = gggp.estimate_cov('bootstrap', func=f)
print(np.diagonal(cov).real)
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_ggg))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_ggg), atol=0.3*tol_factor)
# Now as a cross correlation with all 3 using the same patch catalog.
print('with 3 patched catalogs:')
gggp.process(catp, catp, catp)
print(gggp.gam0.ravel())
np.testing.assert_allclose(gggp.gam0, ggg.gam0, rtol=0.3 * tol_factor, atol=0.3 * tol_factor)
np.testing.assert_allclose(gggp.gam1, ggg.gam1, rtol=0.3 * tol_factor, atol=0.3 * tol_factor)
np.testing.assert_allclose(gggp.gam2, ggg.gam2, rtol=0.3 * tol_factor, atol=0.3 * tol_factor)
np.testing.assert_allclose(gggp.gam3, ggg.gam3, rtol=0.3 * tol_factor, atol=0.3 * tol_factor)
print('jackknife:')
cov = gggp.estimate_cov('jackknife', func=f)
print(np.diagonal(cov).real)
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_ggg))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_ggg), atol=0.4*tol_factor)
print('sample:')
cov = gggp.estimate_cov('sample', func=f)
print(np.diagonal(cov).real)
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_ggg))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_ggg), atol=0.6*tol_factor)
print('marked:')
cov = gggp.estimate_cov('marked_bootstrap', func=f)
print(np.diagonal(cov).real)
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_ggg))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_ggg), atol=0.8*tol_factor)
print('bootstrap:')
cov = gggp.estimate_cov('bootstrap', func=f)
print(np.diagonal(cov).real)
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_ggg))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_ggg), atol=0.4*tol_factor)
# The separate patch/non-patch combinations aren't that interesting, so skip them
# for GGG unless running from main.
if __name__ == '__main__':
# Patch on 1 only:
print('with patches on 1 only:')
gggp.process(catp, cat)
print('jackknife:')
cov = gggp.estimate_cov('jackknife', func=f)
print(np.diagonal(cov).real)
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_ggg))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_ggg), atol=0.8*tol_factor)
print('sample:')
cov = gggp.estimate_cov('sample', func=f)
print(np.diagonal(cov).real)
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_ggg))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_ggg), atol=0.7*tol_factor)
print('marked:')
cov = gggp.estimate_cov('marked_bootstrap', func=f)
print(np.diagonal(cov).real)
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_ggg))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_ggg), atol=0.8*tol_factor)
print('bootstrap:')
cov = gggp.estimate_cov('bootstrap', func=f)
print(np.diagonal(cov).real)
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_ggg))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_ggg), atol=0.8*tol_factor)
# Patch on 2 only:
print('with patches on 2 only:')
gggp.process(cat, catp, cat)
print('jackknife:')
cov = gggp.estimate_cov('jackknife', func=f)
print(np.diagonal(cov).real)
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_ggg))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_ggg), atol=0.8*tol_factor)
print('sample:')
cov = gggp.estimate_cov('sample', func=f)
print(np.diagonal(cov).real)
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_ggg))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_ggg), atol=0.7*tol_factor)
print('marked:')
cov = gggp.estimate_cov('marked_bootstrap', func=f)
print(np.diagonal(cov).real)
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_ggg))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_ggg), atol=0.8*tol_factor)
print('bootstrap:')
cov = gggp.estimate_cov('bootstrap', func=f)
print(np.diagonal(cov).real)
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_ggg))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_ggg), atol=0.8*tol_factor)
# Patch on 3 only:
print('with patches on 3 only:')
gggp.process(cat, cat, catp)
print('jackknife:')
cov = gggp.estimate_cov('jackknife', func=f)
print(np.diagonal(cov).real)
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_ggg))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_ggg), atol=0.8*tol_factor)
print('sample:')
cov = gggp.estimate_cov('sample', func=f)
print(np.diagonal(cov).real)
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_ggg))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_ggg), atol=0.7*tol_factor)
print('marked:')
cov = gggp.estimate_cov('marked_bootstrap', func=f)
print(np.diagonal(cov).real)
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_ggg))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_ggg), atol=0.8*tol_factor)
print('bootstrap:')
cov = gggp.estimate_cov('bootstrap', func=f)
print(np.diagonal(cov).real)
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_ggg))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_ggg), atol=0.9*tol_factor)
# Patch on 1,2
print('with patches on 1,2:')
gggp.process(catp, catp, cat)
print('jackknife:')
cov = gggp.estimate_cov('jackknife', func=f)
print(np.diagonal(cov).real)
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_ggg))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_ggg), atol=0.6*tol_factor)
print('sample:')
cov = gggp.estimate_cov('sample', func=f)
print(np.diagonal(cov).real)
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_ggg))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_ggg), atol=0.6*tol_factor)
print('marked:')
cov = gggp.estimate_cov('marked_bootstrap', func=f)
print(np.diagonal(cov).real)
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_ggg))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_ggg), atol=0.8*tol_factor)
print('bootstrap:')
cov = gggp.estimate_cov('bootstrap', func=f)
print(np.diagonal(cov).real)
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_ggg))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_ggg), atol=0.5*tol_factor)
# Patch on 2,3
print('with patches on 2,3:')
gggp.process(cat, catp)
print('jackknife:')
cov = gggp.estimate_cov('jackknife', func=f)
print(np.diagonal(cov).real)
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_ggg))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_ggg), atol=0.6*tol_factor)
print('sample:')
cov = gggp.estimate_cov('sample', func=f)
print(np.diagonal(cov).real)
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_ggg))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_ggg), atol=0.8*tol_factor)
print('marked:')
cov = gggp.estimate_cov('marked_bootstrap', func=f)
print(np.diagonal(cov).real)
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_ggg))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_ggg), atol=1.0*tol_factor)
print('bootstrap:')
cov = gggp.estimate_cov('bootstrap', func=f)
print(np.diagonal(cov).real)
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_ggg))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_ggg), atol=0.3*tol_factor)
# Patch on 1,3
print('with patches on 1,3:')
gggp.process(catp, cat, catp)
print('jackknife:')
cov = gggp.estimate_cov('jackknife', func=f)
print(np.diagonal(cov).real)
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_ggg))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_ggg), atol=0.6*tol_factor)
print('sample:')
cov = gggp.estimate_cov('sample', func=f)
print(np.diagonal(cov).real)
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_ggg))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_ggg), atol=0.6*tol_factor)
print('marked:')
cov = gggp.estimate_cov('marked_bootstrap', func=f)
print(np.diagonal(cov).real)
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_ggg))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_ggg), atol=0.7*tol_factor)
print('bootstrap:')
cov = gggp.estimate_cov('bootstrap', func=f)
print(np.diagonal(cov).real)
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_ggg))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_ggg), atol=0.5*tol_factor)
# Finally, do a set (with all patches) using the GGGCrossCorrelation class.
gggc = treecorr.GGGCrossCorrelation(nbins=1, min_sep=20., max_sep=40.,
min_u=0.6, max_u=1.0, nubins=1,
min_v=0.0, max_v=0.6, nvbins=1, rng=rng)
print('CrossCorrelation:')
gggc.process(catp, catp, catp)
for g in gggc._all:
print(g.ntri.ravel())
print(g.gam0.ravel())
print(g.vargam0.ravel())
np.testing.assert_allclose(g.ntri, ggg.ntri, rtol=0.05 * tol_factor)
np.testing.assert_allclose(g.gam0, ggg.gam0, rtol=0.3 * tol_factor, atol=0.3 * tol_factor)
np.testing.assert_allclose(g.vargam0, ggg.vargam0, rtol=0.05 * tol_factor)
np.testing.assert_allclose(g.gam1, ggg.gam1, rtol=0.3 * tol_factor, atol=0.3 * tol_factor)
np.testing.assert_allclose(g.vargam1, ggg.vargam1, rtol=0.05 * tol_factor)
np.testing.assert_allclose(g.gam2, ggg.gam2, rtol=0.3 * tol_factor, atol=0.3 * tol_factor)
np.testing.assert_allclose(g.vargam2, ggg.vargam2, rtol=0.05 * tol_factor)
np.testing.assert_allclose(g.gam3, ggg.gam3, rtol=0.3 * tol_factor, atol=0.3 * tol_factor)
np.testing.assert_allclose(g.vargam3, ggg.vargam3, rtol=0.05 * tol_factor)
fc = lambda gggc: np.concatenate([
[np.mean(g.gam0), np.mean(g.gam1), np.mean(g.gam2), np.mean(g.gam3)]
for g in gggc._all])
print('jackknife:')
cov = gggc.estimate_cov('jackknife', func=fc)
print(np.diagonal(cov).real)
for i in range(6):
v = np.diagonal(cov)[i*4:(i+1)*4]
print('max log(ratio) = ',np.max(np.abs(np.log(v)-np.log(var_ggg))))
np.testing.assert_allclose(np.log(v), np.log(var_ggg), atol=0.4*tol_factor)
print('sample:')
cov = gggc.estimate_cov('sample', func=fc)
print(np.diagonal(cov).real)
for i in range(6):
v = np.diagonal(cov)[i*4:(i+1)*4]
print('max log(ratio) = ',np.max(np.abs(np.log(v)-np.log(var_ggg))))
np.testing.assert_allclose(np.log(v), np.log(var_ggg), atol=0.6*tol_factor)
print('marked:')
cov = gggc.estimate_cov('marked_bootstrap', func=fc)
print(np.diagonal(cov).real)
for i in range(6):
v = np.diagonal(cov)[i*4:(i+1)*4]
print('max log(ratio) = ',np.max(np.abs(np.log(v)-np.log(var_ggg))))
np.testing.assert_allclose(np.log(v), np.log(var_ggg), atol=0.8*tol_factor)
print('bootstrap:')
cov = gggc.estimate_cov('bootstrap', func=fc)
print(np.diagonal(cov).real)
for i in range(6):
v = np.diagonal(cov)[i*4:(i+1)*4]
print('max log(ratio) = ',np.max(np.abs(np.log(v)-np.log(var_ggg))))
np.testing.assert_allclose(np.log(v), np.log(var_ggg), atol=0.3*tol_factor)
# Without func, don't check the accuracy, but make sure it returns something of the right shape.
cov = gggc.estimate_cov('jackknife')
assert cov.shape == (48, 48)
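The 'bootstrap' method tested above resamples whole patches with replacement and recomputes the statistic for each resampling. A small numpy-only sketch of that idea (the per-patch values are hypothetical placeholders, e.g. four gamma components per patch, not treecorr output):

```python
import numpy as np

# Hypothetical per-patch statistic vectors.
rng = np.random.RandomState(12345)
npatch = 8
patch_vals = rng.normal(size=(npatch, 4))

# Draw npatch patches with replacement, recompute the statistic, repeat.
nboot = 500
boot = np.empty((nboot, 4))
for i in range(nboot):
    idx = rng.randint(0, npatch, size=npatch)
    boot[i] = np.mean(patch_vals[idx], axis=0)

# The covariance of the bootstrap realizations estimates the covariance
# of the statistic.
cov = np.cov(boot, rowvar=False)
print(cov.shape)  # (4, 4)
```

Resampling whole patches (rather than individual objects) is what preserves the spatial correlations within a patch, which is why patch-based bootstrap works for correlation functions.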
@timer
def test_nnn_jk():
# Test jackknife and other covariance estimates for nnn correlations.
if __name__ == '__main__':
# This setup takes about 1200 sec to run.
nhalo = 300
nsource = 2000
npatch = 16
source_factor = 50
rand_factor = 3
tol_factor = 1
elif False:
# This setup takes about 250 sec to run.
nhalo = 200
nsource = 1000
npatch = 16
source_factor = 50
rand_factor = 2
tol_factor = 2
else:
# This setup takes about 44 sec to run.
nhalo = 100
nsource = 500
npatch = 8
source_factor = 30
rand_factor = 1
tol_factor = 3
file_name = 'data/test_nnn_jk_{}.npz'.format(nsource)
print(file_name)
if not os.path.isfile(file_name):
rng = np.random.RandomState()
nruns = 1000
all_nnns = []
all_nnnc = []
t0 = time.time()
for run in range(nruns):
t2 = time.time()
x, y, _, _, k = generate_shear_field(nsource * source_factor, nhalo, rng)
p = k**3
p /= np.sum(p)
ns = rng.poisson(nsource)
select = rng.choice(range(len(x)), size=ns, replace=False, p=p)
print(run,': ',np.mean(k),np.std(k),np.min(k),np.max(k))
cat = treecorr.Catalog(x=x[select], y=y[select])
ddd = treecorr.NNNCorrelation(nbins=3, min_sep=50., max_sep=100., bin_slop=0.2,
min_u=0.8, max_u=1.0, nubins=1,
min_v=0.0, max_v=0.2, nvbins=1)
rx = rng.uniform(0,1000, rand_factor*nsource)
ry = rng.uniform(0,1000, rand_factor*nsource)
rand_cat = treecorr.Catalog(x=rx, y=ry)
rrr = treecorr.NNNCorrelation(nbins=3, min_sep=50., max_sep=100., bin_slop=0.2,
min_u=0.8, max_u=1.0, nubins=1,
min_v=0.0, max_v=0.2, nvbins=1)
rrr.process(rand_cat)
rdd = ddd.copy()
drr = ddd.copy()
ddd.process(cat)
rdd.process(rand_cat, cat)
drr.process(cat, rand_cat)
zeta_s, _ = ddd.calculateZeta(rrr)
zeta_c, _ = ddd.calculateZeta(rrr, drr, rdd)
print('simple: ',zeta_s.ravel())
print('compensated: ',zeta_c.ravel())
all_nnns.append(zeta_s.ravel())
all_nnnc.append(zeta_c.ravel())
t3 = time.time()
print('time: ',round(t3-t2),round((t3-t0)/60),round((t3-t0)*(nruns/(run+1)-1)/60))
mean_nnns = np.mean(all_nnns, axis=0)
var_nnns = np.var(all_nnns, axis=0)
mean_nnnc = np.mean(all_nnnc, axis=0)
var_nnnc = np.var(all_nnnc, axis=0)
np.savez(file_name, mean_nnns=mean_nnns, var_nnns=var_nnns,
mean_nnnc=mean_nnnc, var_nnnc=var_nnnc)
data = np.load(file_name)
mean_nnns = data['mean_nnns']
var_nnns = data['var_nnns']
mean_nnnc = data['mean_nnnc']
var_nnnc = data['var_nnnc']
print('mean simple = ',mean_nnns)
print('var simple = ',var_nnns)
print('mean compensated = ',mean_nnnc)
print('var compensated = ',var_nnnc)
# Make a random catalog with rand_factor times as many sources, uniformly distributed.
rng = np.random.RandomState(1234)
rx = rng.uniform(0,1000, rand_factor*nsource)
ry = rng.uniform(0,1000, rand_factor*nsource)
rand_cat = treecorr.Catalog(x=rx, y=ry)
rrr = treecorr.NNNCorrelation(nbins=3, min_sep=50., max_sep=100., bin_slop=0.2,
min_u=0.8, max_u=1.0, nubins=1,
min_v=0.0, max_v=0.2, nvbins=1)
t0 = time.time()
rrr.process(rand_cat)
t1 = time.time()
print('Time to process rand cat = ',t1-t0)
print('RRR:',rrr.tot)
print(rrr.ntri.ravel())
# Make the data catalog
x, y, _, _, k = generate_shear_field(nsource * source_factor, nhalo, rng=rng)
print('mean k = ',np.mean(k))
print('min,max = ',np.min(k),np.max(k))
p = k**3
p /= np.sum(p)
select = rng.choice(range(len(x)), size=nsource, replace=False, p=p)
cat = treecorr.Catalog(x=x[select], y=y[select])
ddd = treecorr.NNNCorrelation(nbins=3, min_sep=50., max_sep=100., bin_slop=0.2,
min_u=0.8, max_u=1.0, nubins=1,
min_v=0.0, max_v=0.2, nvbins=1, rng=rng)
rdd = ddd.copy()
drr = ddd.copy()
ddd.process(cat)
rdd.process(rand_cat, cat)
drr.process(cat, rand_cat)
zeta_s1, var_zeta_s1 = ddd.calculateZeta(rrr)
zeta_c1, var_zeta_c1 = ddd.calculateZeta(rrr, drr, rdd)
print('DDD:',ddd.tot)
print(ddd.ntri.ravel())
print('simple: ')
print(zeta_s1.ravel())
print(var_zeta_s1.ravel())
print('DRR:',drr.tot)
print(drr.ntri.ravel())
print('RDD:',rdd.tot)
print(rdd.ntri.ravel())
print('compensated: ')
print(zeta_c1.ravel())
print(var_zeta_c1.ravel())
# Make the patches from a large random catalog to make sure the patches have uniform area.
big_rx = rng.uniform(0,1000, 100*nsource)
big_ry = rng.uniform(0,1000, 100*nsource)
big_catp = treecorr.Catalog(x=big_rx, y=big_ry, npatch=npatch, rng=rng)
patch_centers = big_catp.patch_centers
# Do the same thing with patches on D, but not yet on R.
dddp = treecorr.NNNCorrelation(nbins=3, min_sep=50., max_sep=100., bin_slop=0.2,
min_u=0.8, max_u=1.0, nubins=1,
min_v=0.0, max_v=0.2, nvbins=1, rng=rng)
rddp = dddp.copy()
drrp = dddp.copy()
catp = treecorr.Catalog(x=x[select], y=y[select], patch_centers=patch_centers)
print('Patch\tNtot')
for p in catp.patches:
print(p.patch,'\t',p.ntot,'\t',patch_centers[p.patch])
print('with patches on D:')
dddp.process(catp)
rddp.process(rand_cat, catp)
drrp.process(catp, rand_cat)
# Need to run calculateZeta to get patch-based covariance
with assert_raises(RuntimeError):
dddp.estimate_cov('jackknife')
zeta_s2, var_zeta_s2 = dddp.calculateZeta(rrr)
print('DDD:',dddp.tot)
print(dddp.ntri.ravel())
print('simple: ')
print(zeta_s2.ravel())
print(var_zeta_s2.ravel())
np.testing.assert_allclose(zeta_s2, zeta_s1, rtol=0.05 * tol_factor)
np.testing.assert_allclose(var_zeta_s2, var_zeta_s1, rtol=0.05 * tol_factor)
# Check the _calculate_xi_from_pairs function. Using all pairs, should get total xi.
ddd1 = dddp.copy()
ddd1._calculate_xi_from_pairs(dddp.results.keys())
np.testing.assert_allclose(ddd1.zeta, dddp.zeta)
# None of these are very good without the random catalog also using patches.
# I think this is basically just that the approximations used for estimating the area_frac
# to figure out the appropriate altered RRR counts aren't accurate enough when the total
# counts are as low as this. I think (hope) that it should be semi-ok when N is much larger,
# but this is probably saying that for 3pt, using patches for R is even more important than
# for 2pt.
# Of course, it could also be that this is telling me I still have a bug somewhere that I
# haven't managed to find... :(
print('jackknife:')
cov = dddp.estimate_cov('jackknife')
print(np.diagonal(cov))
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_nnns))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_nnns), atol=2.3*tol_factor)
print('sample:')
cov = dddp.estimate_cov('sample')
print(np.diagonal(cov))
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_nnns))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_nnns), atol=1.2*tol_factor)
print('marked:')
cov = dddp.estimate_cov('marked_bootstrap')
print(np.diagonal(cov))
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_nnns))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_nnns), atol=1.3*tol_factor)
print('bootstrap:')
cov = dddp.estimate_cov('bootstrap')
print(np.diagonal(cov))
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_nnns))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_nnns), atol=2.2*tol_factor)
zeta_c2, var_zeta_c2 = dddp.calculateZeta(rrr, drrp, rddp)
print('compensated: ')
print('DRR:',drrp.tot)
print(drrp.ntri.ravel())
print('RDD:',rddp.tot)
print(rddp.ntri.ravel())
print(zeta_c2.ravel())
print(var_zeta_c2.ravel())
np.testing.assert_allclose(zeta_c2, zeta_c1, rtol=0.05 * tol_factor, atol=1.e-3 * tol_factor)
np.testing.assert_allclose(var_zeta_c2, var_zeta_c1, rtol=0.05 * tol_factor)
print('jackknife:')
cov = dddp.estimate_cov('jackknife')
print(np.diagonal(cov))
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_nnnc))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_nnnc), atol=2.6*tol_factor)
print('sample:')
cov = dddp.estimate_cov('sample')
print(np.diagonal(cov))
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_nnnc))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_nnnc), atol=3.8*tol_factor)
print('marked:')
cov = dddp.estimate_cov('marked_bootstrap')
print(np.diagonal(cov))
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_nnnc))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_nnnc), atol=2.3*tol_factor)
print('bootstrap:')
cov = dddp.estimate_cov('bootstrap')
print(np.diagonal(cov))
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_nnnc))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_nnnc), atol=2.6*tol_factor)
# Now with the random also using patches
# These are a lot better than the above tests. But still not nearly as good as we were able
# to get in 2pt. I'm pretty sure this is just due to the fact that we need to have much
# smaller catalogs to make it feasible to run this in a reasonable amount of time. I don't
# think this is a sign of any bug in the code.
print('with patched random catalog:')
rand_catp = treecorr.Catalog(x=rx, y=ry, patch_centers=patch_centers)
rrrp = rrr.copy()
rrrp.process(rand_catp)
drrp.process(catp, rand_catp)
rddp.process(rand_catp, catp)
print('simple: ')
zeta_s2, var_zeta_s2 = dddp.calculateZeta(rrrp)
print('DDD:',dddp.tot)
print(dddp.ntri.ravel())
print(zeta_s2.ravel())
print(var_zeta_s2.ravel())
np.testing.assert_allclose(zeta_s2, zeta_s1, rtol=0.05 * tol_factor)
np.testing.assert_allclose(var_zeta_s2, var_zeta_s1, rtol=0.05 * tol_factor)
ddd1 = dddp.copy()
ddd1._calculate_xi_from_pairs(dddp.results.keys())
np.testing.assert_allclose(ddd1.zeta, dddp.zeta)
t0 = time.time()
print('jackknife:')
cov = dddp.estimate_cov('jackknife')
print(np.diagonal(cov))
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_nnns))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_nnns), atol=0.9*tol_factor)
t1 = time.time()
print('t = ',t1-t0)
t0 = time.time()
print('sample:')
cov = dddp.estimate_cov('sample')
print(np.diagonal(cov))
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_nnns))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_nnns), atol=0.7*tol_factor)
t1 = time.time()
print('t = ',t1-t0)
t0 = time.time()
print('marked:')
cov = dddp.estimate_cov('marked_bootstrap')
print(np.diagonal(cov))
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_nnns))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_nnns), atol=0.8*tol_factor)
t1 = time.time()
print('t = ',t1-t0)
t0 = time.time()
print('bootstrap:')
cov = dddp.estimate_cov('bootstrap')
print(np.diagonal(cov))
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_nnns))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_nnns), atol=1.0*tol_factor)
t1 = time.time()
print('t = ',t1-t0)
t0 = time.time()
print('compensated: ')
zeta_c2, var_zeta_c2 = dddp.calculateZeta(rrrp, drrp, rddp)
print('DRR:',drrp.tot)
print(drrp.ntri.ravel())
print('RDD:',rddp.tot)
print(rddp.ntri.ravel())
print(zeta_c2.ravel())
print(var_zeta_c2.ravel())
np.testing.assert_allclose(zeta_c2, zeta_c1, rtol=0.05 * tol_factor, atol=1.e-3 * tol_factor)
np.testing.assert_allclose(var_zeta_c2, var_zeta_c1, rtol=0.05 * tol_factor)
t0 = time.time()
print('jackknife:')
cov = dddp.estimate_cov('jackknife')
print(np.diagonal(cov))
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_nnnc))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_nnnc), atol=0.8*tol_factor)
t1 = time.time()
print('t = ',t1-t0)
t0 = time.time()
print('sample:')
cov = dddp.estimate_cov('sample')
print(np.diagonal(cov))
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_nnnc))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_nnnc), atol=0.8*tol_factor)
t1 = time.time()
print('t = ',t1-t0)
t0 = time.time()
print('marked:')
cov = dddp.estimate_cov('marked_bootstrap')
print(np.diagonal(cov))
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_nnnc))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_nnnc), atol=0.8*tol_factor)
t1 = time.time()
print('t = ',t1-t0)
t0 = time.time()
print('bootstrap:')
cov = dddp.estimate_cov('bootstrap')
print(np.diagonal(cov))
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_nnnc))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_nnnc), atol=0.8*tol_factor)
t1 = time.time()
print('t = ',t1-t0)
t0 = time.time()
# I haven't implemented calculateZeta for the NNNCrossCorrelation class, because I'm not
# actually sure what the right thing to do here is for calculating a single zeta vector.
# Do we do a different one for each of the 6 permutations? Or one overall one?
# So rather than just do something, I'll wait until someone has a coherent use case where
# they want this and can explain exactly what the right thing to compute is.
# So to just exercise the machinery with NNNCrossCorrelation, I'm using a func parameter
# to compute something equivalent to the simple zeta calculation.
dddc = treecorr.NNNCrossCorrelation(nbins=3, min_sep=50., max_sep=100., bin_slop=0.2,
min_u=0.8, max_u=1.0, nubins=1,
min_v=0.0, max_v=0.2, nvbins=1, rng=rng)
rrrc = treecorr.NNNCrossCorrelation(nbins=3, min_sep=50., max_sep=100., bin_slop=0.2,
min_u=0.8, max_u=1.0, nubins=1,
min_v=0.0, max_v=0.2, nvbins=1)
print('CrossCorrelation:')
dddc.process(catp, catp, catp)
rrrc.process(rand_catp, rand_catp, rand_catp)
def cc_zeta(corrs):
    d, r = corrs
    d1 = d.n1n2n3.copy()
    d1._sum(d._all)
    r1 = r.n1n2n3.copy()
    r1._sum(r._all)
    zeta, _ = d1.calculateZeta(r1)
    return zeta.ravel()
print('simple: ')
zeta_s3 = cc_zeta([dddc, rrrc])
print(zeta_s3)
np.testing.assert_allclose(zeta_s3, zeta_s1.ravel(), rtol=0.05 * tol_factor)
print('jackknife:')
cov = treecorr.estimate_multi_cov([dddc,rrrc], 'jackknife', cc_zeta)
print(np.diagonal(cov))
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_nnns))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_nnns), atol=0.9*tol_factor)
print('sample:')
cov = treecorr.estimate_multi_cov([dddc,rrrc], 'sample', cc_zeta)
print(np.diagonal(cov))
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_nnns))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_nnns), atol=1.2*tol_factor)
print('marked:')
cov = treecorr.estimate_multi_cov([dddc,rrrc], 'marked_bootstrap', cc_zeta)
print(np.diagonal(cov))
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_nnns))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_nnns), atol=1.5*tol_factor)
print('bootstrap:')
cov = treecorr.estimate_multi_cov([dddc,rrrc], 'bootstrap', cc_zeta)
print(np.diagonal(cov))
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_nnns))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_nnns), atol=0.6*tol_factor)
# Repeat with a 1-2 cross-correlation
print('CrossCorrelation 1-2:')
dddc.process(catp, catp)
rrrc.process(rand_catp, rand_catp)
print('simple: ')
zeta_s3 = cc_zeta([dddc, rrrc])
print(zeta_s3)
np.testing.assert_allclose(zeta_s3, zeta_s1.ravel(), rtol=0.05 * tol_factor)
print('jackknife:')
cov = treecorr.estimate_multi_cov([dddc,rrrc], 'jackknife', cc_zeta)
print(np.diagonal(cov))
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_nnns))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_nnns), atol=0.9*tol_factor)
print('sample:')
cov = treecorr.estimate_multi_cov([dddc,rrrc], 'sample', cc_zeta)
print(np.diagonal(cov))
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_nnns))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_nnns), atol=1.1*tol_factor)
print('marked:')
cov = treecorr.estimate_multi_cov([dddc,rrrc], 'marked_bootstrap', cc_zeta)
print(np.diagonal(cov))
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_nnns))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_nnns), atol=1.5*tol_factor)
print('bootstrap:')
cov = treecorr.estimate_multi_cov([dddc,rrrc], 'bootstrap', cc_zeta)
print(np.diagonal(cov))
print('max log(ratio) = ',np.max(np.abs(np.log(np.diagonal(cov))-np.log(var_nnns))))
np.testing.assert_allclose(np.log(np.diagonal(cov)), np.log(var_nnns), atol=0.6*tol_factor)
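# Illustrative aside (self-contained sketch, not part of the test run): the
# tolerance checks above compare variances in log space, i.e. they bound the
# *ratio* of the estimated and reference variances rather than their
# difference, which is the appropriate scale-free comparison for quantities
# that can span many decades.

```python
import numpy as np

# Hypothetical variance estimates and reference values for illustration:
var_est = np.array([1.1e-8, 0.9e-6, 1.2e-4])
var_ref = np.array([1.0e-8, 1.0e-6, 1.0e-4])

# max |log(var_est/var_ref)|, the quantity printed throughout the test:
max_log_ratio = np.max(np.abs(np.log(var_est) - np.log(var_ref)))

# An atol on the logs bounds the ratio: atol=0.5 allows a factor of e^0.5.
np.testing.assert_allclose(np.log(var_est), np.log(var_ref), atol=0.5)
assert max_log_ratio < 0.5
```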
@timer
def test_brute_jk():
# With bin_slop = 0, the jackknife calculation from patches should match a
# brute force calculation where we literally remove one patch at a time to make
# the vectors.
if __name__ == '__main__':
    nhalo = 100
    ngal = 500
    npatch = 16
    rand_factor = 5
else:
    nhalo = 100
    ngal = 30
    npatch = 16
    rand_factor = 2
rng = np.random.RandomState(8675309)
x, y, g1, g2, k = generate_shear_field(ngal, nhalo, rng)
rx = rng.uniform(0,1000, rand_factor*ngal)
ry = rng.uniform(0,1000, rand_factor*ngal)
rand_cat_nopatch = treecorr.Catalog(x=rx, y=ry)
rand_cat = treecorr.Catalog(x=rx, y=ry, npatch=npatch, rng=rng)
patch_centers = rand_cat.patch_centers
cat_nopatch = treecorr.Catalog(x=x, y=y, g1=g1, g2=g2, k=k)
cat = treecorr.Catalog(x=x, y=y, g1=g1, g2=g2, k=k, patch_centers=patch_centers)
print('cat patches = ',np.unique(cat.patch))
print('len = ',cat.nobj, cat.ntot)
assert cat.nobj == ngal
print('Patch\tNtot')
for p in cat.patches:
    print(p.patch,'\t',p.ntot,'\t',patch_centers[p.patch])
# Start with KKK, since relatively simple.
kkk1 = treecorr.KKKCorrelation(nbins=3, min_sep=100., max_sep=300., brute=True,
min_u=0., max_u=1.0, nubins=1,
min_v=0., max_v=1.0, nvbins=1)
kkk1.process(cat_nopatch)
kkk = treecorr.KKKCorrelation(nbins=3, min_sep=100., max_sep=300., brute=True,
min_u=0., max_u=1.0, nubins=1,
min_v=0., max_v=1.0, nvbins=1,
var_method='jackknife')
kkk.process(cat)
np.testing.assert_allclose(kkk.zeta, kkk1.zeta)
kkk_zeta_list = []
for i in range(npatch):
    cat1 = treecorr.Catalog(x=cat.x[cat.patch != i],
                            y=cat.y[cat.patch != i],
                            k=cat.k[cat.patch != i],
                            g1=cat.g1[cat.patch != i],
                            g2=cat.g2[cat.patch != i])
    kkk1 = treecorr.KKKCorrelation(nbins=3, min_sep=100., max_sep=300., brute=True,
                                   min_u=0., max_u=1.0, nubins=1,
                                   min_v=0., max_v=1.0, nvbins=1)
    kkk1.process(cat1)
    print('zeta = ',kkk1.zeta.ravel())
    kkk_zeta_list.append(kkk1.zeta.ravel())
kkk_zeta_list = np.array(kkk_zeta_list)
cov = np.cov(kkk_zeta_list.T, bias=True) * (len(kkk_zeta_list)-1)
varzeta = np.diagonal(np.cov(kkk_zeta_list.T, bias=True)) * (len(kkk_zeta_list)-1)
print('KKK: treecorr jackknife varzeta = ',kkk.varzeta.ravel())
print('KKK: direct jackknife varzeta = ',varzeta)
np.testing.assert_allclose(kkk.varzeta.ravel(), varzeta)
# Now GGG
ggg1 = treecorr.GGGCorrelation(nbins=3, min_sep=100., max_sep=300., brute=True,
min_u=0., max_u=1.0, nubins=1,
min_v=0., max_v=1.0, nvbins=1)
ggg1.process(cat_nopatch)
ggg = treecorr.GGGCorrelation(nbins=3, min_sep=100., max_sep=300., brute=True,
min_u=0., max_u=1.0, nubins=1,
min_v=0., max_v=1.0, nvbins=1,
var_method='jackknife')
ggg.process(cat)
np.testing.assert_allclose(ggg.gam0, ggg1.gam0)
np.testing.assert_allclose(ggg.gam1, ggg1.gam1)
np.testing.assert_allclose(ggg.gam2, ggg1.gam2)
np.testing.assert_allclose(ggg.gam3, ggg1.gam3)
ggg_gam0_list = []
ggg_gam1_list = []
ggg_gam2_list = []
ggg_gam3_list = []
ggg_map3_list = []
for i in range(npatch):
    cat1 = treecorr.Catalog(x=cat.x[cat.patch != i],
                            y=cat.y[cat.patch != i],
                            k=cat.k[cat.patch != i],
                            g1=cat.g1[cat.patch != i],
                            g2=cat.g2[cat.patch != i])
    ggg1 = treecorr.GGGCorrelation(nbins=3, min_sep=100., max_sep=300., brute=True,
                                   min_u=0., max_u=1.0, nubins=1,
                                   min_v=0., max_v=1.0, nvbins=1)
    ggg1.process(cat1)
    ggg_gam0_list.append(ggg1.gam0.ravel())
    ggg_gam1_list.append(ggg1.gam1.ravel())
    ggg_gam2_list.append(ggg1.gam2.ravel())
    ggg_gam3_list.append(ggg1.gam3.ravel())
    ggg_map3_list.append(ggg1.calculateMap3()[0])
ggg_gam0_list = np.array(ggg_gam0_list)
vargam0 = np.diagonal(np.cov(ggg_gam0_list.T, bias=True)) * (len(ggg_gam0_list)-1)
print('GGG: treecorr jackknife vargam0 = ',ggg.vargam0.ravel())
print('GGG: direct jackknife vargam0 = ',vargam0)
np.testing.assert_allclose(ggg.vargam0.ravel(), vargam0)
ggg_gam1_list = np.array(ggg_gam1_list)
vargam1 = np.diagonal(np.cov(ggg_gam1_list.T, bias=True)) * (len(ggg_gam1_list)-1)
print('GGG: treecorr jackknife vargam1 = ',ggg.vargam1.ravel())
print('GGG: direct jackknife vargam1 = ',vargam1)
np.testing.assert_allclose(ggg.vargam1.ravel(), vargam1)
ggg_gam2_list = np.array(ggg_gam2_list)
vargam2 = np.diagonal(np.cov(ggg_gam2_list.T, bias=True)) * (len(ggg_gam2_list)-1)
print('GGG: treecorr jackknife vargam2 = ',ggg.vargam2.ravel())
print('GGG: direct jackknife vargam2 = ',vargam2)
np.testing.assert_allclose(ggg.vargam2.ravel(), vargam2)
ggg_gam3_list = np.array(ggg_gam3_list)
vargam3 = np.diagonal(np.cov(ggg_gam3_list.T, bias=True)) * (len(ggg_gam3_list)-1)
print('GGG: treecorr jackknife vargam3 = ',ggg.vargam3.ravel())
print('GGG: direct jackknife vargam3 = ',vargam3)
np.testing.assert_allclose(ggg.vargam3.ravel(), vargam3)
ggg_map3_list = np.array(ggg_map3_list)
varmap3 = np.diagonal(np.cov(ggg_map3_list.T, bias=True)) * (len(ggg_map3_list)-1)
covmap3 = treecorr.estimate_multi_cov([ggg], 'jackknife',
lambda corrs: corrs[0].calculateMap3()[0])
print('GGG: treecorr jackknife varmap3 = ',np.diagonal(covmap3))
print('GGG: direct jackknife varmap3 = ',varmap3)
np.testing.assert_allclose(np.diagonal(covmap3), varmap3)
# Finally NNN, where we need to use randoms. Both simple and compensated.
ddd = treecorr.NNNCorrelation(nbins=3, min_sep=100., max_sep=300., bin_slop=0,
min_u=0., max_u=1.0, nubins=1,
min_v=0., max_v=1.0, nvbins=1,
var_method='jackknife')
drr = ddd.copy()
rdd = ddd.copy()
rrr = ddd.copy()
ddd.process(cat)
drr.process(cat, rand_cat)
rdd.process(rand_cat, cat)
rrr.process(rand_cat)
zeta1_list = []
zeta2_list = []
for i in range(npatch):
    cat1 = treecorr.Catalog(x=cat.x[cat.patch != i],
                            y=cat.y[cat.patch != i],
                            k=cat.k[cat.patch != i],
                            g1=cat.g1[cat.patch != i],
                            g2=cat.g2[cat.patch != i])
    rand_cat1 = treecorr.Catalog(x=rand_cat.x[rand_cat.patch != i],
                                 y=rand_cat.y[rand_cat.patch != i])
    ddd1 = treecorr.NNNCorrelation(nbins=3, min_sep=100., max_sep=300., bin_slop=0,
                                   min_u=0., max_u=1.0, nubins=1,
                                   min_v=0., max_v=1.0, nvbins=1)
    drr1 = ddd1.copy()
    rdd1 = ddd1.copy()
    rrr1 = ddd1.copy()
    ddd1.process(cat1)
    drr1.process(cat1, rand_cat1)
    rdd1.process(rand_cat1, cat1)
    rrr1.process(rand_cat1)
    zeta1_list.append(ddd1.calculateZeta(rrr1)[0].ravel())
    zeta2_list.append(ddd1.calculateZeta(rrr1, drr1, rdd1)[0].ravel())
print('simple')
zeta1_list = np.array(zeta1_list)
zeta2, varzeta2 = ddd.calculateZeta(rrr)
varzeta1 = np.diagonal(np.cov(zeta1_list.T, bias=True)) * (len(zeta1_list)-1)
print('NNN: treecorr jackknife varzeta = ',ddd.varzeta.ravel())
print('NNN: direct jackknife varzeta = ',varzeta1)
np.testing.assert_allclose(ddd.varzeta.ravel(), varzeta1)
print('compensated')
print(zeta2_list)
zeta2_list = np.array(zeta2_list)
zeta2, varzeta2 = ddd.calculateZeta(rrr, drr=drr, rdd=rdd)
varzeta2 = np.diagonal(np.cov(zeta2_list.T, bias=True)) * (len(zeta2_list)-1)
print('NNN: treecorr jackknife varzeta = ',ddd.varzeta.ravel())
print('NNN: direct jackknife varzeta = ',varzeta2)
np.testing.assert_allclose(ddd.varzeta.ravel(), varzeta2)
# Can't do patch calculation with different numbers of patches in rrr, drr, rdd.
rand_cat3 = treecorr.Catalog(x=rx, y=ry, npatch=3)
cat3 = treecorr.Catalog(x=x, y=y, patch_centers=rand_cat3.patch_centers)
rrr3 = rrr.copy()
drr3 = drr.copy()
rdd3 = rdd.copy()
rrr3.process(rand_cat3)
drr3.process(cat3, rand_cat3)
rdd3.process(rand_cat3, cat3)
with assert_raises(RuntimeError):
    ddd.calculateZeta(rrr3)
with assert_raises(RuntimeError):
    ddd.calculateZeta(rrr3, drr, rdd)
with assert_raises(RuntimeError):
    ddd.calculateZeta(rrr, drr3, rdd3)
with assert_raises(RuntimeError):
    ddd.calculateZeta(rrr, drr, rdd3)
with assert_raises(RuntimeError):
    ddd.calculateZeta(rrr, drr3, rdd)
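# Illustrative aside (self-contained sketch, not part of the test suite): the
# "direct jackknife" covariance built in the loops above is
#     C = (N-1)/N * sum_i (v_i - vbar)(v_i - vbar)^T
# over the N leave-one-patch-out vectors v_i, which is exactly what
# np.cov(vectors.T, bias=True) * (N-1) computes, since bias=True divides by N.

```python
import numpy as np

rng = np.random.RandomState(1234)
v = rng.normal(size=(16, 3))   # stand-in for 16 leave-one-patch-out zeta vectors

# np.cov form, as used in the test above:
cov1 = np.cov(v.T, bias=True) * (len(v) - 1)

# Explicit jackknife formula:
vbar = np.mean(v, axis=0)
d = v - vbar
cov2 = (len(v) - 1) / len(v) * d.T.dot(d)

np.testing.assert_allclose(cov1, cov2)
```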
@timer
def test_finalize_false():
nsource = 80
nhalo = 100
npatch = 16
# Make three independent data sets
rng = np.random.RandomState(8675309)
x_1, y_1, g1_1, g2_1, k_1 = generate_shear_field(nsource, nhalo, rng)
x_2, y_2, g1_2, g2_2, k_2 = generate_shear_field(nsource, nhalo, rng)
x_3, y_3, g1_3, g2_3, k_3 = generate_shear_field(nsource, nhalo, rng)
# Make a single catalog with all three together
cat = treecorr.Catalog(x=np.concatenate([x_1, x_2, x_3]),
y=np.concatenate([y_1, y_2, y_3]),
g1=np.concatenate([g1_1, g1_2, g1_3]),
g2=np.concatenate([g2_1, g2_2, g2_3]),
k=np.concatenate([k_1, k_2, k_3]),
npatch=npatch)
# Now the three separately, using the same patch centers
cat1 = treecorr.Catalog(x=x_1, y=y_1, g1=g1_1, g2=g2_1, k=k_1, patch_centers=cat.patch_centers)
cat2 = treecorr.Catalog(x=x_2, y=y_2, g1=g1_2, g2=g2_2, k=k_2, patch_centers=cat.patch_centers)
cat3 = treecorr.Catalog(x=x_3, y=y_3, g1=g1_3, g2=g2_3, k=k_3, patch_centers=cat.patch_centers)
np.testing.assert_array_equal(cat1.patch, cat.patch[0:nsource])
np.testing.assert_array_equal(cat2.patch, cat.patch[nsource:2*nsource])
np.testing.assert_array_equal(cat3.patch, cat.patch[2*nsource:3*nsource])
# KKK auto
kkk1 = treecorr.KKKCorrelation(nbins=3, min_sep=30., max_sep=100., brute=True,
min_u=0.8, max_u=1.0, nubins=1,
min_v=0., max_v=0.2, nvbins=1)
kkk1.process(cat)
kkk2 = treecorr.KKKCorrelation(nbins=3, min_sep=30., max_sep=100., brute=True,
min_u=0.8, max_u=1.0, nubins=1,
min_v=0., max_v=0.2, nvbins=1)
kkk2.process(cat1, initialize=True, finalize=False)
kkk2.process(cat2, initialize=False, finalize=False)
kkk2.process(cat3, initialize=False, finalize=False)
kkk2.process(cat1, cat2, initialize=False, finalize=False)
kkk2.process(cat1, cat3, initialize=False, finalize=False)
kkk2.process(cat2, cat1, initialize=False, finalize=False)
kkk2.process(cat2, cat3, initialize=False, finalize=False)
kkk2.process(cat3, cat1, initialize=False, finalize=False)
kkk2.process(cat3, cat2, initialize=False, finalize=False)
kkk2.process(cat1, cat2, cat3, initialize=False, finalize=True)
np.testing.assert_allclose(kkk1.ntri, kkk2.ntri)
np.testing.assert_allclose(kkk1.weight, kkk2.weight)
np.testing.assert_allclose(kkk1.meand1, kkk2.meand1)
np.testing.assert_allclose(kkk1.meand2, kkk2.meand2)
np.testing.assert_allclose(kkk1.meand3, kkk2.meand3)
np.testing.assert_allclose(kkk1.zeta, kkk2.zeta)
# KKK cross12
cat23 = treecorr.Catalog(x=np.concatenate([x_2, x_3]),
y=np.concatenate([y_2, y_3]),
g1=np.concatenate([g1_2, g1_3]),
g2=np.concatenate([g2_2, g2_3]),
k=np.concatenate([k_2, k_3]),
patch_centers=cat.patch_centers)
np.testing.assert_array_equal(cat23.patch, cat.patch[nsource:3*nsource])
kkk1.process(cat1, cat23)
kkk2.process(cat1, cat2, initialize=True, finalize=False)
kkk2.process(cat1, cat3, initialize=False, finalize=False)
kkk2.process(cat1, cat2, cat3, initialize=False, finalize=True)
np.testing.assert_allclose(kkk1.ntri, kkk2.ntri)
np.testing.assert_allclose(kkk1.weight, kkk2.weight)
np.testing.assert_allclose(kkk1.meand1, kkk2.meand1)
np.testing.assert_allclose(kkk1.meand2, kkk2.meand2)
np.testing.assert_allclose(kkk1.meand3, kkk2.meand3)
np.testing.assert_allclose(kkk1.zeta, kkk2.zeta)
# KKKCross cross12
kkkc1 = treecorr.KKKCrossCorrelation(nbins=3, min_sep=30., max_sep=100., brute=True,
min_u=0.8, max_u=1.0, nubins=1,
min_v=0., max_v=0.2, nvbins=1)
kkkc1.process(cat1, cat23)
kkkc2 = treecorr.KKKCrossCorrelation(nbins=3, min_sep=30., max_sep=100., brute=True,
min_u=0.8, max_u=1.0, nubins=1,
min_v=0., max_v=0.2, nvbins=1)
kkkc2.process(cat1, cat2, initialize=True, finalize=False)
kkkc2.process(cat1, cat3, initialize=False, finalize=False)
kkkc2.process(cat1, cat2, cat3, initialize=False, finalize=True)
for perm in ['k1k2k3', 'k1k3k2', 'k2k1k3', 'k2k3k1', 'k3k1k2', 'k3k2k1']:
    kkk1 = getattr(kkkc1, perm)
    kkk2 = getattr(kkkc2, perm)
    np.testing.assert_allclose(kkk1.ntri, kkk2.ntri)
    np.testing.assert_allclose(kkk1.weight, kkk2.weight)
    np.testing.assert_allclose(kkk1.meand1, kkk2.meand1)
    np.testing.assert_allclose(kkk1.meand2, kkk2.meand2)
    np.testing.assert_allclose(kkk1.meand3, kkk2.meand3)
    np.testing.assert_allclose(kkk1.zeta, kkk2.zeta)
# KKK cross
kkk1.process(cat, cat2, cat3)
kkk2.process(cat1, cat2, cat3, initialize=True, finalize=False)
kkk2.process(cat2, cat2, cat3, initialize=False, finalize=False)
kkk2.process(cat3, cat2, cat3, initialize=False, finalize=True)
np.testing.assert_allclose(kkk1.ntri, kkk2.ntri)
np.testing.assert_allclose(kkk1.weight, kkk2.weight)
np.testing.assert_allclose(kkk1.meand1, kkk2.meand1)
np.testing.assert_allclose(kkk1.meand2, kkk2.meand2)
np.testing.assert_allclose(kkk1.meand3, kkk2.meand3)
np.testing.assert_allclose(kkk1.zeta, kkk2.zeta)
# KKKCross cross
kkkc1.process(cat, cat2, cat3)
kkkc2.process(cat1, cat2, cat3, initialize=True, finalize=False)
kkkc2.process(cat2, cat2, cat3, initialize=False, finalize=False)
kkkc2.process(cat3, cat2, cat3, initialize=False, finalize=True)
for perm in ['k1k2k3', 'k1k3k2', 'k2k1k3', 'k2k3k1', 'k3k1k2', 'k3k2k1']:
    kkk1 = getattr(kkkc1, perm)
    kkk2 = getattr(kkkc2, perm)
    np.testing.assert_allclose(kkk1.ntri, kkk2.ntri)
    np.testing.assert_allclose(kkk1.weight, kkk2.weight)
    np.testing.assert_allclose(kkk1.meand1, kkk2.meand1)
    np.testing.assert_allclose(kkk1.meand2, kkk2.meand2)
    np.testing.assert_allclose(kkk1.meand3, kkk2.meand3)
    np.testing.assert_allclose(kkk1.zeta, kkk2.zeta)
# GGG auto
ggg1 = treecorr.GGGCorrelation(nbins=3, min_sep=30., max_sep=100., brute=True,
min_u=0.8, max_u=1.0, nubins=1,
min_v=0., max_v=0.2, nvbins=1)
ggg1.process(cat)
ggg2 = treecorr.GGGCorrelation(nbins=3, min_sep=30., max_sep=100., brute=True,
min_u=0.8, max_u=1.0, nubins=1,
min_v=0., max_v=0.2, nvbins=1)
ggg2.process(cat1, initialize=True, finalize=False)
ggg2.process(cat2, initialize=False, finalize=False)
ggg2.process(cat3, initialize=False, finalize=False)
ggg2.process(cat1, cat2, initialize=False, finalize=False)
ggg2.process(cat1, cat3, initialize=False, finalize=False)
ggg2.process(cat2, cat1, initialize=False, finalize=False)
ggg2.process(cat2, cat3, initialize=False, finalize=False)
ggg2.process(cat3, cat1, initialize=False, finalize=False)
ggg2.process(cat3, cat2, initialize=False, finalize=False)
ggg2.process(cat1, cat2, cat3, initialize=False, finalize=True)
np.testing.assert_allclose(ggg1.ntri, ggg2.ntri)
np.testing.assert_allclose(ggg1.weight, ggg2.weight)
np.testing.assert_allclose(ggg1.meand1, ggg2.meand1)
np.testing.assert_allclose(ggg1.meand2, ggg2.meand2)
np.testing.assert_allclose(ggg1.meand3, ggg2.meand3)
np.testing.assert_allclose(ggg1.gam0, ggg2.gam0)
np.testing.assert_allclose(ggg1.gam1, ggg2.gam1)
np.testing.assert_allclose(ggg1.gam2, ggg2.gam2)
np.testing.assert_allclose(ggg1.gam3, ggg2.gam3)
# GGG cross12
ggg1.process(cat1, cat23)
ggg2.process(cat1, cat2, initialize=True, finalize=False)
ggg2.process(cat1, cat3, initialize=False, finalize=False)
ggg2.process(cat1, cat2, cat3, initialize=False, finalize=True)
np.testing.assert_allclose(ggg1.ntri, ggg2.ntri)
np.testing.assert_allclose(ggg1.weight, ggg2.weight)
np.testing.assert_allclose(ggg1.meand1, ggg2.meand1)
np.testing.assert_allclose(ggg1.meand2, ggg2.meand2)
np.testing.assert_allclose(ggg1.meand3, ggg2.meand3)
np.testing.assert_allclose(ggg1.gam0, ggg2.gam0)
np.testing.assert_allclose(ggg1.gam1, ggg2.gam1)
np.testing.assert_allclose(ggg1.gam2, ggg2.gam2)
np.testing.assert_allclose(ggg1.gam3, ggg2.gam3)
# GGGCross cross12
gggc1 = treecorr.GGGCrossCorrelation(nbins=3, min_sep=30., max_sep=100., brute=True,
min_u=0.8, max_u=1.0, nubins=1,
min_v=0., max_v=0.2, nvbins=1)
gggc1.process(cat1, cat23)
gggc2 = treecorr.GGGCrossCorrelation(nbins=3, min_sep=30., max_sep=100., brute=True,
min_u=0.8, max_u=1.0, nubins=1,
min_v=0., max_v=0.2, nvbins=1)
gggc2.process(cat1, cat2, initialize=True, finalize=False)
gggc2.process(cat1, cat3, initialize=False, finalize=False)
gggc2.process(cat1, cat2, cat3, initialize=False, finalize=True)
for perm in ['g1g2g3', 'g1g3g2', 'g2g1g3', 'g2g3g1', 'g3g1g2', 'g3g2g1']:
    ggg1 = getattr(gggc1, perm)
    ggg2 = getattr(gggc2, perm)
    np.testing.assert_allclose(ggg1.ntri, ggg2.ntri)
    np.testing.assert_allclose(ggg1.weight, ggg2.weight)
    np.testing.assert_allclose(ggg1.meand1, ggg2.meand1)
    np.testing.assert_allclose(ggg1.meand2, ggg2.meand2)
    np.testing.assert_allclose(ggg1.meand3, ggg2.meand3)
    np.testing.assert_allclose(ggg1.gam0, ggg2.gam0)
    np.testing.assert_allclose(ggg1.gam1, ggg2.gam1)
    np.testing.assert_allclose(ggg1.gam2, ggg2.gam2)
    np.testing.assert_allclose(ggg1.gam3, ggg2.gam3)
# GGG cross
ggg1.process(cat, cat2, cat3)
ggg2.process(cat1, cat2, cat3, initialize=True, finalize=False)
ggg2.process(cat2, cat2, cat3, initialize=False, finalize=False)
ggg2.process(cat3, cat2, cat3, initialize=False, finalize=True)
np.testing.assert_allclose(ggg1.ntri, ggg2.ntri)
np.testing.assert_allclose(ggg1.weight, ggg2.weight)
np.testing.assert_allclose(ggg1.meand1, ggg2.meand1)
np.testing.assert_allclose(ggg1.meand2, ggg2.meand2)
np.testing.assert_allclose(ggg1.meand3, ggg2.meand3)
np.testing.assert_allclose(ggg1.gam0, ggg2.gam0)
np.testing.assert_allclose(ggg1.gam1, ggg2.gam1)
np.testing.assert_allclose(ggg1.gam2, ggg2.gam2)
np.testing.assert_allclose(ggg1.gam3, ggg2.gam3)
# GGGCross cross
gggc1.process(cat, cat2, cat3)
gggc2.process(cat1, cat2, cat3, initialize=True, finalize=False)
gggc2.process(cat2, cat2, cat3, initialize=False, finalize=False)
gggc2.process(cat3, cat2, cat3, initialize=False, finalize=True)
for perm in ['g1g2g3', 'g1g3g2', 'g2g1g3', 'g2g3g1', 'g3g1g2', 'g3g2g1']:
    ggg1 = getattr(gggc1, perm)
    ggg2 = getattr(gggc2, perm)
    np.testing.assert_allclose(ggg1.ntri, ggg2.ntri)
    np.testing.assert_allclose(ggg1.weight, ggg2.weight)
    np.testing.assert_allclose(ggg1.meand1, ggg2.meand1)
    np.testing.assert_allclose(ggg1.meand2, ggg2.meand2)
    np.testing.assert_allclose(ggg1.meand3, ggg2.meand3)
    np.testing.assert_allclose(ggg1.gam0, ggg2.gam0)
    np.testing.assert_allclose(ggg1.gam1, ggg2.gam1)
    np.testing.assert_allclose(ggg1.gam2, ggg2.gam2)
    np.testing.assert_allclose(ggg1.gam3, ggg2.gam3)
# NNN auto
nnn1 = treecorr.NNNCorrelation(nbins=3, min_sep=10., max_sep=200., bin_slop=0,
min_u=0.8, max_u=1.0, nubins=1,
min_v=0., max_v=0.2, nvbins=1)
nnn1.process(cat)
nnn2 = treecorr.NNNCorrelation(nbins=3, min_sep=10., max_sep=200., bin_slop=0,
min_u=0.8, max_u=1.0, nubins=1,
min_v=0., max_v=0.2, nvbins=1)
nnn2.process(cat1, initialize=True, finalize=False)
nnn2.process(cat2, initialize=False, finalize=False)
nnn2.process(cat3, initialize=False, finalize=False)
nnn2.process(cat1, cat2, initialize=False, finalize=False)
nnn2.process(cat1, cat3, initialize=False, finalize=False)
nnn2.process(cat2, cat1, initialize=False, finalize=False)
nnn2.process(cat2, cat3, initialize=False, finalize=False)
nnn2.process(cat3, cat1, initialize=False, finalize=False)
nnn2.process(cat3, cat2, initialize=False, finalize=False)
nnn2.process(cat1, cat2, cat3, initialize=False, finalize=True)
np.testing.assert_allclose(nnn1.ntri, nnn2.ntri)
np.testing.assert_allclose(nnn1.weight, nnn2.weight)
np.testing.assert_allclose(nnn1.meand1, nnn2.meand1)
np.testing.assert_allclose(nnn1.meand2, nnn2.meand2)
np.testing.assert_allclose(nnn1.meand3, nnn2.meand3)
# NNN cross12
nnn1.process(cat1, cat23)
nnn2.process(cat1, cat2, initialize=True, finalize=False)
nnn2.process(cat1, cat3, initialize=False, finalize=False)
nnn2.process(cat1, cat2, cat3, initialize=False, finalize=True)
np.testing.assert_allclose(nnn1.ntri, nnn2.ntri)
np.testing.assert_allclose(nnn1.weight, nnn2.weight)
np.testing.assert_allclose(nnn1.meand1, nnn2.meand1)
np.testing.assert_allclose(nnn1.meand2, nnn2.meand2)
np.testing.assert_allclose(nnn1.meand3, nnn2.meand3)
# NNNCross cross12
nnnc1 = treecorr.NNNCrossCorrelation(nbins=3, min_sep=10., max_sep=200., bin_slop=0,
min_u=0.8, max_u=1.0, nubins=1,
min_v=0., max_v=0.2, nvbins=1)
nnnc1.process(cat1, cat23)
nnnc2 = treecorr.NNNCrossCorrelation(nbins=3, min_sep=10., max_sep=200., bin_slop=0,
min_u=0.8, max_u=1.0, nubins=1,
min_v=0., max_v=0.2, nvbins=1)
nnnc2.process(cat1, cat2, initialize=True, finalize=False)
nnnc2.process(cat1, cat3, initialize=False, finalize=False)
nnnc2.process(cat1, cat2, cat3, initialize=False, finalize=True)
for perm in ['n1n2n3', 'n1n3n2', 'n2n1n3', 'n2n3n1', 'n3n1n2', 'n3n2n1']:
    nnn1 = getattr(nnnc1, perm)
    nnn2 = getattr(nnnc2, perm)
    np.testing.assert_allclose(nnn1.ntri, nnn2.ntri)
    np.testing.assert_allclose(nnn1.weight, nnn2.weight)
    np.testing.assert_allclose(nnn1.meand1, nnn2.meand1)
    np.testing.assert_allclose(nnn1.meand2, nnn2.meand2)
    np.testing.assert_allclose(nnn1.meand3, nnn2.meand3)
# NNN cross
nnn1.process(cat, cat2, cat3)
nnn2.process(cat1, cat2, cat3, initialize=True, finalize=False)
nnn2.process(cat2, cat2, cat3, initialize=False, finalize=False)
nnn2.process(cat3, cat2, cat3, initialize=False, finalize=True)
np.testing.assert_allclose(nnn1.ntri, nnn2.ntri)
np.testing.assert_allclose(nnn1.weight, nnn2.weight)
np.testing.assert_allclose(nnn1.meand1, nnn2.meand1)
np.testing.assert_allclose(nnn1.meand2, nnn2.meand2)
np.testing.assert_allclose(nnn1.meand3, nnn2.meand3)
# NNNCross cross
nnnc1.process(cat, cat2, cat3)
nnnc2.process(cat1, cat2, cat3, initialize=True, finalize=False)
nnnc2.process(cat2, cat2, cat3, initialize=False, finalize=False)
nnnc2.process(cat3, cat2, cat3, initialize=False, finalize=True)
for perm in ['n1n2n3', 'n1n3n2', 'n2n1n3', 'n2n3n1', 'n3n1n2', 'n3n2n1']:
    nnn1 = getattr(nnnc1, perm)
    nnn2 = getattr(nnnc2, perm)
    np.testing.assert_allclose(nnn1.ntri, nnn2.ntri)
    np.testing.assert_allclose(nnn1.weight, nnn2.weight)
    np.testing.assert_allclose(nnn1.meand1, nnn2.meand1)
    np.testing.assert_allclose(nnn1.meand2, nnn2.meand2)
    np.testing.assert_allclose(nnn1.meand3, nnn2.meand3)
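# Illustrative aside (hypothetical toy class, not the treecorr API): the
# initialize/finalize keywords exercised above follow the usual
# "reset, accumulate over chunks, normalize once" pattern, sketched here with
# a trivial mean accumulator.

```python
class MeanAccumulator:
    """Toy analogue of process(..., initialize=..., finalize=...)."""
    def process(self, data, initialize=True, finalize=True):
        if initialize:          # reset the running sums
            self._sum = 0.0
            self._n = 0
        self._sum += sum(data)  # accumulate this chunk
        self._n += len(data)
        if finalize:            # normalize into the final statistic
            self.mean = self._sum / self._n

one_shot = MeanAccumulator()
one_shot.process([1., 2., 3., 4.])

chunked = MeanAccumulator()
chunked.process([1., 2.], initialize=True, finalize=False)
chunked.process([3., 4.], initialize=False, finalize=True)

assert one_shot.mean == chunked.mean == 2.5
```

As in the tests above, accumulating the same data in pieces gives results
identical to processing it all at once, as long as the reset happens only on
the first call and the normalization only on the last.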
@timer
def test_lowmem():
# Test using patches to keep the memory usage lower.
if __name__ == '__main__':
    nsource = 10000
    nhalo = 100
    npatch = 4
    himem = 7.e5
    lomem = 8.e4
else:
    nsource = 1000
    nhalo = 100
    npatch = 4
    himem = 1.3e5
    lomem = 8.e4
rng = np.random.RandomState(8675309)
x, y, g1, g2, k = generate_shear_field(nsource, nhalo, rng)
file_name = os.path.join('output','test_lowmem_3pt.fits')
orig_cat = treecorr.Catalog(x=x, y=y, g1=g1, g2=g2, k=k, npatch=npatch)
patch_centers = orig_cat.patch_centers
orig_cat.write(file_name)
del orig_cat
try:
    import guppy
    hp = guppy.hpy()
    hp.setrelheap()
except Exception:
    hp = None
full_cat = treecorr.Catalog(file_name,
x_col='x', y_col='y', g1_col='g1', g2_col='g2', k_col='k',
patch_centers=patch_centers)
kkk = treecorr.KKKCorrelation(nbins=1, min_sep=280., max_sep=300.,
min_u=0.95, max_u=1.0, nubins=1,
min_v=0., max_v=0.05, nvbins=1)
t0 = time.time()
s0 = hp.heap().size if hp else 0
kkk.process(full_cat)
t1 = time.time()
s1 = hp.heap().size if hp else 2*himem
print('regular: ',s1, t1-t0, s1-s0)
assert s1-s0 > himem # This version uses a lot of memory.
ntri1 = kkk.ntri
zeta1 = kkk.zeta
full_cat.unload()
kkk.clear()
# Remake with save_patch_dir.
clear_save('test_lowmem_3pt_%03d.fits', npatch)
save_cat = treecorr.Catalog(file_name,
x_col='x', y_col='y', g1_col='g1', g2_col='g2', k_col='k',
patch_centers=patch_centers, save_patch_dir='output')
t0 = time.time()
s0 = hp.heap().size if hp else 0
kkk.process(save_cat, low_mem=True, finalize=False)
t1 = time.time()
s1 = hp.heap().size if hp else 0
print('lomem 1: ',s1, t1-t0, s1-s0)
assert s1-s0 < lomem # This version uses a lot less memory
ntri2 = kkk.ntri
zeta2 = kkk.zeta
print('ntri1 = ',ntri1)
print('zeta1 = ',zeta1)
np.testing.assert_array_equal(ntri2, ntri1)
np.testing.assert_array_equal(zeta2, zeta1)
# Check running as a cross-correlation
save_cat.unload()
t0 = time.time()
s0 = hp.heap().size if hp else 0
kkk.process(save_cat, save_cat, low_mem=True)
t1 = time.time()
s1 = hp.heap().size if hp else 0
print('lomem 2: ',s1, t1-t0, s1-s0)
assert s1-s0 < lomem
ntri3 = kkk.ntri
zeta3 = kkk.zeta
np.testing.assert_array_equal(ntri3, ntri1)
np.testing.assert_array_equal(zeta3, zeta1)
# Check running as a cross-correlation
save_cat.unload()
t0 = time.time()
s0 = hp.heap().size if hp else 0
kkk.process(save_cat, save_cat, save_cat, low_mem=True)
t1 = time.time()
s1 = hp.heap().size if hp else 0
print('lomem 3: ',s1, t1-t0, s1-s0)
assert s1-s0 < lomem
ntri4 = kkk.ntri
zeta4 = kkk.zeta
np.testing.assert_array_equal(ntri4, ntri1)
np.testing.assert_array_equal(zeta4, zeta1)
if __name__ == '__main__':
    test_kkk_jk()
    test_ggg_jk()
    test_nnn_jk()
    test_brute_jk()
    test_finalize_false()
    test_lowmem()
| 44.307889 | 100 | 0.636312 | 13,265 | 86,489 | 4.026159 | 0.05473 | 0.03595 | 0.069373 | 0.112401 | 0.784972 | 0.756493 | 0.727845 | 0.693486 | 0.660007 | 0.646638 | 0 | 0.040244 | 0.212224 | 86,489 | 1,951 | 101 | 44.3306 | 0.743608 | 0.086809 | 0 | 0.645513 | 0 | 0 | 0.068016 | 0.001192 | 0 | 0 | 0 | 0 | 0.184615 | 1 | 0.005128 | false | 0 | 0.00641 | 0 | 0.012821 | 0.289103 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
9af3fe22fc370eeece65a97b160dbe2d429e6850 | 109 | py | Python | venv/Lib/site-packages/zmq/eventloop/__init__.py | ajayiagbebaku/NFL-Model | afcc67a85ca7138c58c3334d45988ada2da158ed | [
"MIT"
] | 603 | 2020-12-23T13:49:32.000Z | 2022-03-31T23:38:03.000Z | venv/Lib/site-packages/zmq/eventloop/__init__.py | ajayiagbebaku/NFL-Model | afcc67a85ca7138c58c3334d45988ada2da158ed | [
"MIT"
] | 387 | 2020-12-15T14:54:04.000Z | 2022-03-31T07:00:21.000Z | venv/Lib/site-packages/zmq/eventloop/__init__.py | ajayiagbebaku/NFL-Model | afcc67a85ca7138c58c3334d45988ada2da158ed | [
"MIT"
] | 64 | 2018-04-25T08:51:57.000Z | 2022-01-29T14:13:57.000Z | """Tornado eventloop integration for pyzmq"""
from zmq.eventloop.ioloop import IOLoop
__all__ = ['IOLoop']
| 18.166667 | 45 | 0.752294 | 13 | 109 | 6 | 0.769231 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.12844 | 109 | 5 | 46 | 21.8 | 0.821053 | 0.357798 | 0 | 0 | 0 | 0 | 0.09375 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
9af40678e1fd11b11387453b24c8927a44b02287 | 1,766 | py | Python | src/the_tale/the_tale/clans/meta_relations.py | al-arz/the-tale | 542770257eb6ebd56a5ac44ea1ef93ff4ab19eb5 | [
"BSD-3-Clause"
] | 85 | 2017-11-21T12:22:02.000Z | 2022-03-27T23:07:17.000Z | src/the_tale/the_tale/clans/meta_relations.py | al-arz/the-tale | 542770257eb6ebd56a5ac44ea1ef93ff4ab19eb5 | [
"BSD-3-Clause"
] | 545 | 2017-11-04T14:15:04.000Z | 2022-03-27T14:19:27.000Z | src/the_tale/the_tale/clans/meta_relations.py | al-arz/the-tale | 542770257eb6ebd56a5ac44ea1ef93ff4ab19eb5 | [
"BSD-3-Clause"
] | 45 | 2017-11-11T12:36:30.000Z | 2022-02-25T06:10:44.000Z |
import smart_imports

smart_imports.all()


class Clan(meta_relations_objects.MetaType):
    __slots__ = ('caption', )

    TYPE = 8
    TYPE_CAPTION = 'Гильдия'  # 'Guild'

    def __init__(self, caption, **kwargs):
        super().__init__(**kwargs)
        self.caption = caption

    @property
    def url(self):
        return utils_urls.url('clans:show', self.id)

    @classmethod
    def create_from_object(cls, clan):
        return cls(id=clan.id, caption=clan.name)

    @classmethod
    def create_removed(cls):
        return cls(id=None, caption='неизвестная гильдия')  # 'unknown guild'

    @classmethod
    def create_from_id(cls, id):
        clan = logic.load_clan(clan_id=id)

        if clan is None:
            return cls.create_removed()

        return cls.create_from_object(clan)

    @classmethod
    def create_from_ids(cls, ids):
        records = models.Clan.objects.filter(id__in=ids)

        if len(ids) != len(records):
            raise meta_relations_exceptions.ObjectsNotFound(type=cls.TYPE, ids=ids)

        return [cls.create_from_object(logic.load_clan(clan_model=record)) for record in records]


class Event(meta_relations_objects.MetaType):
    __slots__ = ('caption', )

    TYPE = 13
    TYPE_CAPTION = 'Событие гильдии'  # 'Guild event'

    def __init__(self, caption, **kwargs):
        super().__init__(**kwargs)
        self.caption = caption

    @property
    def url(self):
        return None

    @classmethod
    def create_from_object(cls, record):
        return cls(id=record.value, caption=record.text)

    @classmethod
    def create_from_id(cls, id):
        from . import relations

        return cls.create_from_object(relations.EVENT(id))

    @classmethod
    def create_from_ids(cls, ids):
        return [cls.create_from_id(id) for id in ids]
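The "removed object" fallback that `Clan.create_from_id` uses above can be sketched standalone. `ClanRef` and the dict-backed `storage` below are illustrative stand-ins for the_tale's `Clan` and `logic.load_clan`, not its actual API:

```python
# Standalone sketch of the fallback pattern in Clan.create_from_id:
# a missed lookup returns a placeholder object instead of raising.
class ClanRef:
    def __init__(self, id, caption):
        self.id = id
        self.caption = caption

    @classmethod
    def create_removed(cls):
        # mirrors Clan.create_removed ('unknown guild')
        return cls(id=None, caption='unknown guild')

    @classmethod
    def create_from_id(cls, id, storage):
        clan = storage.get(id)  # stands in for logic.load_clan
        if clan is None:
            return cls.create_removed()
        return cls(id=id, caption=clan)


storage = {1: 'Iron Pact'}
print(ClanRef.create_from_id(1, storage).caption)   # Iron Pact
print(ClanRef.create_from_id(99, storage).caption)  # unknown guild
```

Callers never need to branch on a missing clan; they always get an object with `id` and `caption`.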
# pysnmp_mibs/TCP-ESTATS-MIB.py
# from jackjack821/pysnmp-mibs (BSD-2-Clause)
#
# PySNMP MIB module TCP-ESTATS-MIB (http://pysnmp.sf.net)
# ASN.1 source http://mibs.snmplabs.com:80/asn1/TCP-ESTATS-MIB
# Produced by pysmi-0.0.7 at Sun Feb 14 00:31:06 2016
# On host bldfarm platform Linux version 4.1.13-100.fc21.x86_64 by user goose
# Using Python version 3.5.0 (default, Jan 5 2016, 17:11:52)
#
( Integer, ObjectIdentifier, OctetString, ) = mibBuilder.importSymbols("ASN1", "Integer", "ObjectIdentifier", "OctetString")
( NamedValues, ) = mibBuilder.importSymbols("ASN1-ENUMERATION", "NamedValues")
( ValueRangeConstraint, ValueSizeConstraint, ConstraintsUnion, ConstraintsIntersection, SingleValueConstraint, ) = mibBuilder.importSymbols("ASN1-REFINEMENT", "ValueRangeConstraint", "ValueSizeConstraint", "ConstraintsUnion", "ConstraintsIntersection", "SingleValueConstraint")
( ZeroBasedCounter64, ) = mibBuilder.importSymbols("HCNUM-TC", "ZeroBasedCounter64")
( ZeroBasedCounter32, ) = mibBuilder.importSymbols("RMON2-MIB", "ZeroBasedCounter32")
( ModuleCompliance, ObjectGroup, NotificationGroup, ) = mibBuilder.importSymbols("SNMPv2-CONF", "ModuleCompliance", "ObjectGroup", "NotificationGroup")
( MibScalar, MibTable, MibTableRow, MibTableColumn, MibIdentifier, mib_2, Integer32, ModuleIdentity, IpAddress, Bits, ObjectIdentity, iso, NotificationType, Gauge32, Counter64, Counter32, Unsigned32, TimeTicks, ) = mibBuilder.importSymbols("SNMPv2-SMI", "MibScalar", "MibTable", "MibTableRow", "MibTableColumn", "MibIdentifier", "mib-2", "Integer32", "ModuleIdentity", "IpAddress", "Bits", "ObjectIdentity", "iso", "NotificationType", "Gauge32", "Counter64", "Counter32", "Unsigned32", "TimeTicks")
( DateAndTime, TextualConvention, TimeStamp, DisplayString, TruthValue, ) = mibBuilder.importSymbols("SNMPv2-TC", "DateAndTime", "TextualConvention", "TimeStamp", "DisplayString", "TruthValue")
( tcpListenerEntry, tcpConnectionEntry, ) = mibBuilder.importSymbols("TCP-MIB", "tcpListenerEntry", "tcpConnectionEntry")
tcpEStatsMIB = ModuleIdentity((1, 3, 6, 1, 2, 1, 156)).setRevisions(("2007-05-18 00:00",))
if mibBuilder.loadTexts: tcpEStatsMIB.setLastUpdated('200705180000Z')
if mibBuilder.loadTexts: tcpEStatsMIB.setOrganization('IETF TSV Working Group')
if mibBuilder.loadTexts: tcpEStatsMIB.setContactInfo('Matt Mathis\n John Heffner\n Web100 Project\n Pittsburgh Supercomputing Center\n 300 S. Craig St.\n Pittsburgh, PA 15213\n Email: mathis@psc.edu, jheffner@psc.edu\n\n Rajiv Raghunarayan\n Cisco Systems Inc.\n San Jose, CA 95134\n Phone: 408 853 9612\n Email: raraghun@cisco.com\n\n Jon Saperia\n 84 Kettell Plain Road\n Stow, MA 01775\n Phone: 617-201-2655\n Email: saperia@jdscons.com ')
if mibBuilder.loadTexts: tcpEStatsMIB.setDescription('Documentation of TCP Extended Performance Instrumentation\n variables from the Web100 project. [Web100]\n\n All of the objects in this MIB MUST have the same\n persistence properties as the underlying TCP implementation.\n On a reboot, all zero-based counters MUST be cleared, all\n dynamically created table rows MUST be deleted, and all\n read-write objects MUST be restored to their default values.\n\n It is assumed that all TCP implementation have some\n initialization code (if nothing else to set IP addresses)\n that has the opportunity to adjust tcpEStatsConnTableLatency\n and other read-write scalars controlling the creation of the\n various tables, before establishing the first TCP\n connection. Implementations MAY also choose to make these\n control scalars persist across reboots.\n\n Copyright (C) The IETF Trust (2007). This version\n of this MIB module is a part of RFC 4898; see the RFC\n itself for full legal notices.')
tcpEStatsNotifications = MibIdentifier((1, 3, 6, 1, 2, 1, 156, 0))
tcpEStatsMIBObjects = MibIdentifier((1, 3, 6, 1, 2, 1, 156, 1))
tcpEStatsConformance = MibIdentifier((1, 3, 6, 1, 2, 1, 156, 2))
tcpEStats = MibIdentifier((1, 3, 6, 1, 2, 1, 156, 1, 1))
tcpEStatsControl = MibIdentifier((1, 3, 6, 1, 2, 1, 156, 1, 2))
tcpEStatsScalar = MibIdentifier((1, 3, 6, 1, 2, 1, 156, 1, 3))
class TcpEStatsNegotiated(Integer32, TextualConvention):
subtypeSpec = Integer32.subtypeSpec+ConstraintsUnion(SingleValueConstraint(1, 2, 3,))
namedValues = NamedValues(("enabled", 1), ("selfDisabled", 2), ("peerDisabled", 3),)
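The value restriction that `TcpEStatsNegotiated` expresses (`SingleValueConstraint(1, 2, 3)` paired with `NamedValues`) maps naturally onto a standard-library `IntEnum`; this sketch needs no pysnmp at all:

```python
from enum import IntEnum

# Stdlib sketch of the TcpEStatsNegotiated textual convention: only
# the three named integer values are accepted; anything else raises
# ValueError, just as the SMI constraint rejects it.
class TcpEStatsNegotiated(IntEnum):
    enabled = 1
    selfDisabled = 2
    peerDisabled = 3

print(TcpEStatsNegotiated(2).name)  # selfDisabled
```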
tcpEStatsListenerTableLastChange = MibScalar((1, 3, 6, 1, 2, 1, 156, 1, 3, 3), TimeStamp()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsListenerTableLastChange.setDescription('The value of sysUpTime at the time of the last\n creation or deletion of an entry in the tcpListenerTable.\n If the number of entries has been unchanged since the\n last re-initialization of the local network management\n subsystem, then this object contains a zero value.')
tcpEStatsControlPath = MibScalar((1, 3, 6, 1, 2, 1, 156, 1, 2, 1), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: tcpEStatsControlPath.setDescription("Controls the activation of the TCP Path Statistics\n table.\n\n A value 'true' indicates that the TCP Path Statistics\n table is active, while 'false' indicates that the\n table is inactive.")
tcpEStatsControlStack = MibScalar((1, 3, 6, 1, 2, 1, 156, 1, 2, 2), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: tcpEStatsControlStack.setDescription("Controls the activation of the TCP Stack Statistics\n table.\n\n A value 'true' indicates that the TCP Stack Statistics\n table is active, while 'false' indicates that the\n table is inactive.")
tcpEStatsControlApp = MibScalar((1, 3, 6, 1, 2, 1, 156, 1, 2, 3), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: tcpEStatsControlApp.setDescription("Controls the activation of the TCP Application\n Statistics table.\n\n A value 'true' indicates that the TCP Application\n Statistics table is active, while 'false' indicates\n that the table is inactive.")
tcpEStatsControlTune = MibScalar((1, 3, 6, 1, 2, 1, 156, 1, 2, 4), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: tcpEStatsControlTune.setDescription("Controls the activation of the TCP Tuning table.\n\n A value 'true' indicates that the TCP Tuning\n table is active, while 'false' indicates that the\n table is inactive.")
tcpEStatsControlNotify = MibScalar((1, 3, 6, 1, 2, 1, 156, 1, 2, 5), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: tcpEStatsControlNotify.setDescription("Controls the generation of all notifications defined in\n this MIB.\n\n A value 'true' indicates that the notifications\n are active, while 'false' indicates that the\n notifications are inactive.")
tcpEStatsConnTableLatency = MibScalar((1, 3, 6, 1, 2, 1, 156, 1, 2, 6), Unsigned32()).setUnits('seconds').setMaxAccess("readwrite")
if mibBuilder.loadTexts: tcpEStatsConnTableLatency.setDescription('Specifies the number of seconds that the entity will\n retain entries in the TCP connection tables, after the\n connection first enters the closed state. The entity\n SHOULD provide a configuration option to enable\n\n\n\n customization of this value. A value of 0\n results in entries being removed from the tables as soon as\n the connection enters the closed state. The value of\n this object pertains to the following tables:\n tcpEStatsConnectIdTable\n tcpEStatsPerfTable\n tcpEStatsPathTable\n tcpEStatsStackTable\n tcpEStatsAppTable\n tcpEStatsTuneTable')
tcpEStatsListenerTable = MibTable((1, 3, 6, 1, 2, 1, 156, 1, 1, 1), )
if mibBuilder.loadTexts: tcpEStatsListenerTable.setDescription('This table contains information about TCP Listeners,\n in addition to the information maintained by the\n tcpListenerTable RFC 4022.')
tcpEStatsListenerEntry = MibTableRow((1, 3, 6, 1, 2, 1, 156, 1, 1, 1, 1), )
tcpListenerEntry.registerAugmentions(("TCP-ESTATS-MIB", "tcpEStatsListenerEntry"))
tcpEStatsListenerEntry.setIndexNames(*tcpListenerEntry.getIndexNames())
if mibBuilder.loadTexts: tcpEStatsListenerEntry.setDescription('Each entry in the table contains information about\n a specific TCP Listener.')
tcpEStatsListenerStartTime = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 1, 1, 1), TimeStamp()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsListenerStartTime.setDescription('The value of sysUpTime at the time this listener was\n established. If the current state was entered prior to\n the last re-initialization of the local network management\n subsystem, then this object contains a zero value.')
tcpEStatsListenerSynRcvd = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 1, 1, 2), ZeroBasedCounter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsListenerSynRcvd.setDescription('The number of SYNs which have been received for this\n listener. The total number of failed connections for\n all reasons can be estimated to be tcpEStatsListenerSynRcvd\n minus tcpEStatsListenerAccepted and\n tcpEStatsListenerCurBacklog.')
tcpEStatsListenerInitial = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 1, 1, 3), ZeroBasedCounter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsListenerInitial.setDescription('The total number of connections for which the Listener\n has allocated initial state and placed the\n connection in the backlog. This may happen in the\n SYN-RCVD or ESTABLISHED states, depending on the\n implementation.')
tcpEStatsListenerEstablished = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 1, 1, 4), ZeroBasedCounter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsListenerEstablished.setDescription('The number of connections that have been established to\n this endpoint (e.g., the number of first ACKs that have\n been received for this listener).')
tcpEStatsListenerAccepted = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 1, 1, 5), ZeroBasedCounter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsListenerAccepted.setDescription('The total number of connections for which the Listener\n has successfully issued an accept, removing the connection\n from the backlog.')
tcpEStatsListenerExceedBacklog = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 1, 1, 6), ZeroBasedCounter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsListenerExceedBacklog.setDescription('The total number of connections dropped from the\n backlog by this listener due to all reasons. This\n includes all connections that are allocated initial\n resources, but are not accepted for some reason.')
tcpEStatsListenerHCSynRcvd = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 1, 1, 7), ZeroBasedCounter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsListenerHCSynRcvd.setDescription('The number of SYNs that have been received for this\n listener on systems that can process (or reject) more\n than 1 million connections per second. See\n tcpEStatsListenerSynRcvd.')
tcpEStatsListenerHCInitial = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 1, 1, 8), ZeroBasedCounter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsListenerHCInitial.setDescription('The total number of connections for which the Listener\n has allocated initial state and placed the connection\n in the backlog on systems that can process (or reject)\n more than 1 million connections per second. See\n tcpEStatsListenerInitial.')
tcpEStatsListenerHCEstablished = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 1, 1, 9), ZeroBasedCounter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsListenerHCEstablished.setDescription('The number of connections that have been established to\n this endpoint on systems that can process (or reject) more\n than 1 million connections per second. See\n tcpEStatsListenerEstablished.')
tcpEStatsListenerHCAccepted = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 1, 1, 10), ZeroBasedCounter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsListenerHCAccepted.setDescription('The total number of connections for which the Listener\n has successfully issued an accept, removing the connection\n from the backlog on systems that can process (or reject)\n more than 1 million connections per second. See\n tcpEStatsListenerAccepted.')
tcpEStatsListenerHCExceedBacklog = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 1, 1, 11), ZeroBasedCounter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsListenerHCExceedBacklog.setDescription('The total number of connections dropped from the\n backlog by this listener due to all reasons on\n systems that can process (or reject) more than\n 1 million connections per second. See\n tcpEStatsListenerExceedBacklog.')
tcpEStatsListenerCurConns = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 1, 1, 12), Gauge32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsListenerCurConns.setDescription('The current number of connections in the ESTABLISHED\n state, which have also been accepted. It excludes\n connections that have been established but not accepted\n because they are still subject to being discarded to\n shed load without explicit action by either endpoint.')
tcpEStatsListenerMaxBacklog = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 1, 1, 13), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsListenerMaxBacklog.setDescription('The maximum number of connections allowed in the\n backlog at one time.')
tcpEStatsListenerCurBacklog = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 1, 1, 14), Gauge32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsListenerCurBacklog.setDescription('The current number of connections that are in the backlog.\n This gauge includes connections in ESTABLISHED or\n SYN-RECEIVED states for which the Listener has not yet\n issued an accept.\n\n If this listener is using some technique to implicitly\n represent the SYN-RECEIVED states (e.g., by\n cryptographically encoding the state information in the\n initial sequence number, ISS), it MAY elect to exclude\n connections in the SYN-RECEIVED state from the backlog.')
tcpEStatsListenerCurEstabBacklog = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 1, 1, 15), Gauge32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsListenerCurEstabBacklog.setDescription('The current number of connections in the backlog that are\n in the ESTABLISHED state, but for which the Listener has\n not yet issued an accept.')
tcpEStatsConnectIdTable = MibTable((1, 3, 6, 1, 2, 1, 156, 1, 1, 2), )
if mibBuilder.loadTexts: tcpEStatsConnectIdTable.setDescription('This table maps information that uniquely identifies\n each active TCP connection to the connection ID used by\n\n\n\n other tables in this MIB Module. It is an extension of\n tcpConnectionTable in RFC 4022.\n\n Entries are retained in this table for the number of\n seconds indicated by the tcpEStatsConnTableLatency\n object, after the TCP connection first enters the closed\n state.')
tcpEStatsConnectIdEntry = MibTableRow((1, 3, 6, 1, 2, 1, 156, 1, 1, 2, 1), )
tcpConnectionEntry.registerAugmentions(("TCP-ESTATS-MIB", "tcpEStatsConnectIdEntry"))
tcpEStatsConnectIdEntry.setIndexNames(*tcpConnectionEntry.getIndexNames())
if mibBuilder.loadTexts: tcpEStatsConnectIdEntry.setDescription('Each entry in this table maps a TCP connection\n 4-tuple to a connection index.')
tcpEStatsConnectIndex = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 2, 1, 1), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(1,4294967295))).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsConnectIndex.setDescription('A unique integer value assigned to each TCP Connection\n entry.\n\n The RECOMMENDED algorithm is to begin at 1 and increase to\n some implementation-specific maximum value and then start\n again at 1 skipping values already in use.')
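The RECOMMENDED allocation algorithm from the `tcpEStatsConnectIndex` description (start at 1, count up to an implementation-specific maximum, wrap back to 1, skip values in use) can be sketched as follows; the function name is ours, not from the MIB:

```python
def next_connect_index(in_use, last, maximum):
    # Begin after the last index handed out, wrap at the maximum, and
    # skip indexes already in use. Returns None when the whole index
    # space is exhausted.
    idx = last
    for _ in range(maximum):
        idx = idx + 1 if idx < maximum else 1
        if idx not in in_use:
            return idx
    return None

print(next_connect_index({1, 2}, last=2, maximum=5))  # 3
print(next_connect_index({1, 3}, last=5, maximum=5))  # 2 (wrapped past 1)
```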
tcpEStatsPerfTable = MibTable((1, 3, 6, 1, 2, 1, 156, 1, 1, 3), )
if mibBuilder.loadTexts: tcpEStatsPerfTable.setDescription('This table contains objects that are useful for\n\n\n\n measuring TCP performance and first line problem\n diagnosis. Most objects in this table directly expose\n some TCP state variable or are easily implemented as\n simple functions (e.g., the maximum value) of TCP\n state variables.\n\n Entries are retained in this table for the number of\n seconds indicated by the tcpEStatsConnTableLatency\n object, after the TCP connection first enters the closed\n state.')
tcpEStatsPerfEntry = MibTableRow((1, 3, 6, 1, 2, 1, 156, 1, 1, 3, 1), ).setIndexNames((0, "TCP-ESTATS-MIB", "tcpEStatsConnectIndex"))
if mibBuilder.loadTexts: tcpEStatsPerfEntry.setDescription('Each entry in this table has information about the\n characteristics of each active and recently closed TCP\n connection.')
tcpEStatsPerfSegsOut = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 3, 1, 1), ZeroBasedCounter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPerfSegsOut.setDescription('The total number of segments sent.')
tcpEStatsPerfDataSegsOut = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 3, 1, 2), ZeroBasedCounter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPerfDataSegsOut.setDescription('The number of segments sent containing a positive length\n data segment.')
tcpEStatsPerfDataOctetsOut = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 3, 1, 3), ZeroBasedCounter32()).setUnits('octets').setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPerfDataOctetsOut.setDescription('The number of octets of data contained in transmitted\n segments, including retransmitted data. Note that this does\n not include TCP headers.')
tcpEStatsPerfHCDataOctetsOut = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 3, 1, 4), ZeroBasedCounter64()).setUnits('octets').setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPerfHCDataOctetsOut.setDescription('The number of octets of data contained in transmitted\n segments, including retransmitted data, on systems that can\n transmit more than 10 million bits per second. Note that\n this does not include TCP headers.')
tcpEStatsPerfSegsRetrans = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 3, 1, 5), ZeroBasedCounter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPerfSegsRetrans.setDescription('The number of segments transmitted containing at least some\n retransmitted data.')
tcpEStatsPerfOctetsRetrans = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 3, 1, 6), ZeroBasedCounter32()).setUnits('octets').setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPerfOctetsRetrans.setDescription('The number of octets retransmitted.')
tcpEStatsPerfSegsIn = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 3, 1, 7), ZeroBasedCounter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPerfSegsIn.setDescription('The total number of segments received.')
tcpEStatsPerfDataSegsIn = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 3, 1, 8), ZeroBasedCounter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPerfDataSegsIn.setDescription('The number of segments received containing a positive\n\n\n\n length data segment.')
tcpEStatsPerfDataOctetsIn = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 3, 1, 9), ZeroBasedCounter32()).setUnits('octets').setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPerfDataOctetsIn.setDescription('The number of octets contained in received data segments,\n including retransmitted data. Note that this does not\n include TCP headers.')
tcpEStatsPerfHCDataOctetsIn = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 3, 1, 10), ZeroBasedCounter64()).setUnits('octets').setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPerfHCDataOctetsIn.setDescription('The number of octets contained in received data segments,\n including retransmitted data, on systems that can receive\n more than 10 million bits per second. Note that this does\n not include TCP headers.')
tcpEStatsPerfElapsedSecs = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 3, 1, 11), ZeroBasedCounter32()).setUnits('seconds').setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPerfElapsedSecs.setDescription('The seconds part of the time elapsed between\n tcpEStatsPerfStartTimeStamp and the most recent protocol\n event (segment sent or received).')
tcpEStatsPerfElapsedMicroSecs = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 3, 1, 12), ZeroBasedCounter32()).setUnits('microseconds').setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPerfElapsedMicroSecs.setDescription('The micro-second part of the time elapsed between\n tcpEStatsPerfStartTimeStamp and the most recent protocol\n event (segment sent or received). This may be updated in\n whatever time granularity the system supports.')
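`tcpEStatsPerfElapsedSecs` and `tcpEStatsPerfElapsedMicroSecs` split one duration into a seconds part and a microseconds part; recombining them is straightforward (the helper name and sample values are ours):

```python
def elapsed_seconds(elapsed_secs, elapsed_micro_secs):
    # Recombine the two elapsed-time columns into one duration in
    # seconds, as a float.
    return elapsed_secs + elapsed_micro_secs / 1_000_000

print(elapsed_seconds(12, 250_000))  # 12.25
```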
tcpEStatsPerfStartTimeStamp = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 3, 1, 13), DateAndTime()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPerfStartTimeStamp.setDescription('Time at which this row was created and all\n ZeroBasedCounters in the row were initialized to zero.')
tcpEStatsPerfCurMSS = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 3, 1, 14), Gauge32()).setUnits('octets').setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPerfCurMSS.setDescription('The current maximum segment size (MSS), in octets.')
tcpEStatsPerfPipeSize = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 3, 1, 15), Gauge32()).setUnits('octets').setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPerfPipeSize.setDescription("The TCP sender's current estimate of the number of\n unacknowledged data octets in the network.\n\n While not in recovery (e.g., while the receiver is not\n reporting missing data to the sender), this is precisely the\n same as 'Flight size' as defined in RFC 2581, which can be\n computed as SND.NXT minus SND.UNA. [RFC793]\n\n During recovery, the TCP sender has incomplete information\n about the state of the network (e.g., which segments are\n lost vs reordered, especially if the return path is also\n dropping TCP acknowledgments). Current TCP standards do not\n mandate any specific algorithm for estimating the number of\n unacknowledged data octets in the network.\n\n RFC 3517 describes a conservative algorithm to use SACK\n information to estimate the number of unacknowledged data\n octets in the network. tcpEStatsPerfPipeSize object SHOULD\n be the same as 'pipe' as defined in RFC 3517 if it is\n implemented. (Note that while not in recovery the pipe\n algorithm yields the same values as flight size).\n\n If RFC 3517 is not implemented, the data octets in flight\n SHOULD be estimated as SND.NXT minus SND.UNA adjusted by\n some measure of the data that has left the network and\n retransmitted data. For example, with Reno or NewReno style\n TCP, the number of duplicate acknowledgments is used to\n count the number of segments that have left the network.\n That is,\n PipeSize=SND.NXT-SND.UNA+(retransmits-dupacks)*CurMSS")
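The Reno/NewReno fallback formula at the end of the `tcpEStatsPerfPipeSize` description is easy to check numerically; the sample sequence numbers and counters below are made up for illustration:

```python
def reno_pipe_size(snd_nxt, snd_una, retransmits, dupacks, cur_mss):
    # PipeSize = SND.NXT - SND.UNA + (retransmits - dupacks) * CurMSS
    # Each dupack signals a segment that left the network; each
    # retransmit puts one back in.
    return snd_nxt - snd_una + (retransmits - dupacks) * cur_mss

# 20000 octets outstanding, 2 retransmits, 3 dupacks, MSS 1460:
print(reno_pipe_size(30000, 10000, retransmits=2, dupacks=3, cur_mss=1460))  # 18540
```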
tcpEStatsPerfMaxPipeSize = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 3, 1, 16), Gauge32()).setUnits('octets').setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPerfMaxPipeSize.setDescription('The maximum value of tcpEStatsPerfPipeSize, for this\n connection.')
tcpEStatsPerfSmoothedRTT = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 3, 1, 17), Gauge32()).setUnits('milliseconds').setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPerfSmoothedRTT.setDescription('The smoothed round trip time used in calculation of the\n RTO. See SRTT in [RFC2988].')
tcpEStatsPerfCurRTO = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 3, 1, 18), Gauge32()).setUnits('milliseconds').setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPerfCurRTO.setDescription('The current value of the retransmit timer RTO.')
tcpEStatsPerfCongSignals = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 3, 1, 19), ZeroBasedCounter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPerfCongSignals.setDescription('The number of multiplicative downward congestion window\n adjustments due to all forms of congestion signals,\n including Fast Retransmit, Explicit Congestion Notification\n (ECN), and timeouts. This object summarizes all events that\n invoke the MD portion of Additive Increase Multiplicative\n Decrease (AIMD) congestion control, and as such is the best\n indicator of how a cwnd is being affected by congestion.\n\n Note that retransmission timeouts multiplicatively reduce\n the window implicitly by setting ssthresh, and SHOULD be\n included in tcpEStatsPerfCongSignals. In order to minimize\n spurious congestion indications due to out-of-order\n segments, tcpEStatsPerfCongSignals SHOULD be incremented in\n association with the Fast Retransmit algorithm.')
tcpEStatsPerfCurCwnd = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 3, 1, 20), Gauge32()).setUnits('octets').setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPerfCurCwnd.setDescription('The current congestion window, in octets.')
tcpEStatsPerfCurSsthresh = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 3, 1, 21), Gauge32()).setUnits('octets').setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPerfCurSsthresh.setDescription('The current slow start threshold in octets.')
tcpEStatsPerfTimeouts = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 3, 1, 22), ZeroBasedCounter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPerfTimeouts.setDescription('The number of times the retransmit timeout has expired when\n the RTO backoff multiplier is equal to one.')
tcpEStatsPerfCurRwinSent = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 3, 1, 23), Gauge32()).setUnits('octets').setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPerfCurRwinSent.setDescription('The most recent window advertisement sent, in octets.')
tcpEStatsPerfMaxRwinSent = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 3, 1, 24), Gauge32()).setUnits('octets').setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPerfMaxRwinSent.setDescription('The maximum window advertisement sent, in octets.')
tcpEStatsPerfZeroRwinSent = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 3, 1, 25), ZeroBasedCounter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPerfZeroRwinSent.setDescription('The number of acknowledgments sent announcing a zero\n\n\n\n receive window, when the previously announced window was\n not zero.')
tcpEStatsPerfCurRwinRcvd = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 3, 1, 26), Gauge32()).setUnits('octets').setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPerfCurRwinRcvd.setDescription('The most recent window advertisement received, in octets.')
tcpEStatsPerfMaxRwinRcvd = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 3, 1, 27), Gauge32()).setUnits('octets').setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPerfMaxRwinRcvd.setDescription('The maximum window advertisement received, in octets.')
tcpEStatsPerfZeroRwinRcvd = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 3, 1, 28), ZeroBasedCounter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPerfZeroRwinRcvd.setDescription('The number of acknowledgments received announcing a zero\n receive window, when the previously announced window was\n not zero.')
tcpEStatsPerfSndLimTransRwin = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 3, 1, 31), ZeroBasedCounter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPerfSndLimTransRwin.setDescription("The number of transitions into the 'Receiver Limited' state\n from either the 'Congestion Limited' or 'Sender Limited'\n states. This state is entered whenever TCP transmission\n stops because the sender has filled the announced receiver\n window, i.e., when SND.NXT has advanced to SND.UNA +\n SND.WND - 1 as described in RFC 793.")
tcpEStatsPerfSndLimTransCwnd = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 3, 1, 32), ZeroBasedCounter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPerfSndLimTransCwnd.setDescription("The number of transitions into the 'Congestion Limited'\n state from either the 'Receiver Limited' or 'Sender\n Limited' states. This state is entered whenever TCP\n transmission stops because the sender has reached some\n limit defined by congestion control (e.g., cwnd) or other\n algorithms (retransmission timeouts) designed to control\n network traffic. See the definition of 'CONGESTION WINDOW'\n\n\n\n in RFC 2581.")
tcpEStatsPerfSndLimTransSnd = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 3, 1, 33), ZeroBasedCounter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPerfSndLimTransSnd.setDescription("The number of transitions into the 'Sender Limited' state\n from either the 'Receiver Limited' or 'Congestion Limited'\n states. This state is entered whenever TCP transmission\n stops due to some sender limit such as running out of\n application data or other resources and the Karn algorithm.\n When TCP stops sending data for any reason, which cannot be\n classified as Receiver Limited or Congestion Limited, it\n MUST be treated as Sender Limited.")
tcpEStatsPerfSndLimTimeRwin = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 3, 1, 34), ZeroBasedCounter32()).setUnits('milliseconds').setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPerfSndLimTimeRwin.setDescription("The cumulative time spent in the 'Receiver Limited' state.\n See tcpEStatsPerfSndLimTransRwin.")
tcpEStatsPerfSndLimTimeCwnd = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 3, 1, 35), ZeroBasedCounter32()).setUnits('milliseconds').setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPerfSndLimTimeCwnd.setDescription("The cumulative time spent in the 'Congestion Limited'\n state. See tcpEStatsPerfSndLimTransCwnd. When there is a\n retransmission timeout, it SHOULD be counted in\n tcpEStatsPerfSndLimTimeCwnd (and not the cumulative time\n for some other state.)")
tcpEStatsPerfSndLimTimeSnd = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 3, 1, 36), ZeroBasedCounter32()).setUnits('milliseconds').setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPerfSndLimTimeSnd.setDescription("The cumulative time spent in the 'Sender Limited' state.\n See tcpEStatsPerfSndLimTransSnd.")
tcpEStatsPathTable = MibTable((1, 3, 6, 1, 2, 1, 156, 1, 1, 4), )
if mibBuilder.loadTexts: tcpEStatsPathTable.setDescription('This table contains objects that can be used to infer\n detailed behavior of the Internet path, such as the\n extent that there is reordering, ECN bits, and if\n RTT fluctuations are correlated to losses.\n\n Entries are retained in this table for the number of\n seconds indicated by the tcpEStatsConnTableLatency\n object, after the TCP connection first enters the closed\n state.')
tcpEStatsPathEntry = MibTableRow((1, 3, 6, 1, 2, 1, 156, 1, 1, 4, 1), ).setIndexNames((0, "TCP-ESTATS-MIB", "tcpEStatsConnectIndex"))
if mibBuilder.loadTexts: tcpEStatsPathEntry.setDescription('Each entry in this table has information about the\n characteristics of each active and recently closed TCP\n connection.')
tcpEStatsPathRetranThresh = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 4, 1, 1), Gauge32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPathRetranThresh.setDescription('The number of duplicate acknowledgments required to trigger\n Fast Retransmit. Note that although this is constant in\n traditional Reno TCP implementations, it is adaptive in\n many newer TCPs.')
tcpEStatsPathNonRecovDAEpisodes = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 4, 1, 2), ZeroBasedCounter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPathNonRecovDAEpisodes.setDescription("The number of duplicate acknowledgment episodes that did\n not trigger a Fast Retransmit because ACK advanced prior to\n the number of duplicate acknowledgments reaching\n RetranThresh.\n\n In many implementations this is the number of times the\n 'dupacks' counter is set to zero when it is non-zero but\n less than RetranThresh.\n\n Note that the change in tcpEStatsPathNonRecovDAEpisodes\n divided by the change in tcpEStatsPerfDataSegsOut is an\n estimate of the frequency of data reordering on the forward\n path over some interval.")
tcpEStatsPathSumOctetsReordered = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 4, 1, 3), ZeroBasedCounter32()).setUnits('octets').setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPathSumOctetsReordered.setDescription('The sum of the amounts SND.UNA advances on the\n acknowledgment which ends a dup-ack episode without a\n retransmission.\n\n Note the change in tcpEStatsPathSumOctetsReordered divided\n by the change in tcpEStatsPathNonRecovDAEpisodes is an\n estimate of the average reordering distance, over some\n interval.')
tcpEStatsPathNonRecovDA = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 4, 1, 4), ZeroBasedCounter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPathNonRecovDA.setDescription("Duplicate acks (or SACKS) that did not trigger a Fast\n Retransmit because ACK advanced prior to the number of\n duplicate acknowledgments reaching RetranThresh.\n\n In many implementations, this is the sum of the 'dupacks'\n counter, just before it is set to zero because ACK advanced\n without a Fast Retransmit.\n\n Note that the change in tcpEStatsPathNonRecovDA divided by\n the change in tcpEStatsPathNonRecovDAEpisodes is an\n estimate of the average reordering distance in segments\n over some interval.")
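# The two ratios described in the objects above (reordering frequency and
# average reordering distance) are derived metrics that a management
# application would compute from successive polls of these counters. A
# minimal sketch, assuming hypothetical snapshot dicts keyed by short
# column names (the helper is illustrative and not part of this generated
# module):

```python
def reordering_estimates(prev, curr):
    """Estimate forward-path reordering over one polling interval.

    prev/curr: counter snapshots with keys 'NonRecovDAEpisodes',
    'NonRecovDA', and 'DataSegsOut' (from tcpEStatsPerfDataSegsOut).
    Returns (frequency, distance_in_segments); None where undefined.
    """
    d_episodes = curr['NonRecovDAEpisodes'] - prev['NonRecovDAEpisodes']
    d_dupacks = curr['NonRecovDA'] - prev['NonRecovDA']
    d_segs_out = curr['DataSegsOut'] - prev['DataSegsOut']
    # delta(NonRecovDAEpisodes) / delta(DataSegsOut): reordering frequency
    frequency = d_episodes / d_segs_out if d_segs_out > 0 else None
    # delta(NonRecovDA) / delta(NonRecovDAEpisodes): average distance
    distance = d_dupacks / d_episodes if d_episodes > 0 else None
    return frequency, distance
```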
tcpEStatsPathSampleRTT = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 4, 1, 11), Gauge32()).setUnits('milliseconds').setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPathSampleRTT.setDescription('The most recent raw round trip time measurement used in\n calculation of the RTO.')
tcpEStatsPathRTTVar = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 4, 1, 12), Gauge32()).setUnits('milliseconds').setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPathRTTVar.setDescription('The round trip time variation used in calculation of the\n RTO. See RTTVAR in [RFC2988].')
tcpEStatsPathMaxRTT = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 4, 1, 13), Gauge32()).setUnits('milliseconds').setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPathMaxRTT.setDescription('The maximum sampled round trip time.')
tcpEStatsPathMinRTT = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 4, 1, 14), Gauge32()).setUnits('milliseconds').setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPathMinRTT.setDescription('The minimum sampled round trip time.')
tcpEStatsPathSumRTT = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 4, 1, 15), ZeroBasedCounter32()).setUnits('milliseconds').setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPathSumRTT.setDescription('The sum of all sampled round trip times.\n\n Note that the change in tcpEStatsPathSumRTT divided by the\n change in tcpEStatsPathCountRTT is the mean RTT, uniformly\n averaged over an entire interval.')
tcpEStatsPathHCSumRTT = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 4, 1, 16), ZeroBasedCounter64()).setUnits('milliseconds').setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPathHCSumRTT.setDescription('The sum of all sampled round trip times, on all systems\n that implement multiple concurrent RTT measurements.\n\n Note that the change in tcpEStatsPathHCSumRTT divided by\n the change in tcpEStatsPathCountRTT is the mean RTT,\n uniformly averaged over an entire interval.')
tcpEStatsPathCountRTT = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 4, 1, 17), ZeroBasedCounter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPathCountRTT.setDescription('The number of round trip time samples included in\n tcpEStatsPathSumRTT and tcpEStatsPathHCSumRTT.')
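# The mean-RTT ratio that the tcpEStatsPathSumRTT and
# tcpEStatsPathCountRTT descriptions define can be computed from two
# successive polls of those counters. A minimal sketch (an illustrative
# helper, not part of this generated module):

```python
def mean_rtt_ms(prev_sum, prev_count, curr_sum, curr_count):
    """Mean RTT in milliseconds over the interval between two polls of
    tcpEStatsPathSumRTT (or tcpEStatsPathHCSumRTT) and
    tcpEStatsPathCountRTT; returns None if no new samples arrived."""
    d_count = curr_count - prev_count
    if d_count <= 0:
        return None
    # delta(SumRTT) / delta(CountRTT): uniformly averaged mean RTT
    return (curr_sum - prev_sum) / d_count
```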
tcpEStatsPathMaxRTO = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 4, 1, 18), Gauge32()).setUnits('milliseconds').setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPathMaxRTO.setDescription('The maximum value of the retransmit timer RTO.')
tcpEStatsPathMinRTO = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 4, 1, 19), Gauge32()).setUnits('milliseconds').setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPathMinRTO.setDescription('The minimum value of the retransmit timer RTO.')
tcpEStatsPathIpTtl = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 4, 1, 20), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPathIpTtl.setDescription('The value of the TTL field carried in the most recently\n received IP header. This is sometimes useful to detect\n changing or unstable routes.')
tcpEStatsPathIpTosIn = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 4, 1, 21), OctetString().subtype(subtypeSpec=ValueSizeConstraint(1,1)).setFixedLength(1)).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPathIpTosIn.setDescription('The value of the IPv4 Type of Service octet, or the IPv6\n traffic class octet, carried in the most recently received\n IP header.\n\n This is useful to diagnose interactions between TCP and any\n IP layer packet scheduling and delivery policy, which might\n be in effect to implement Diffserv.')
tcpEStatsPathIpTosOut = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 4, 1, 22), OctetString().subtype(subtypeSpec=ValueSizeConstraint(1,1)).setFixedLength(1)).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPathIpTosOut.setDescription('The value of the IPv4 Type Of Service octet, or the IPv6\n traffic class octet, carried in the most recently\n transmitted IP header.\n\n This is useful to diagnose interactions between TCP and any\n IP layer packet scheduling and delivery policy, which might\n be in effect to implement Diffserv.')
tcpEStatsPathPreCongSumCwnd = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 4, 1, 23), ZeroBasedCounter32()).setUnits('octets').setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPathPreCongSumCwnd.setDescription('The sum of the values of the congestion window, in octets,\n captured each time a congestion signal is received. This\n MUST be updated each time tcpEStatsPerfCongSignals is\n incremented, such that the change in\n tcpEStatsPathPreCongSumCwnd divided by the change in\n tcpEStatsPerfCongSignals is the average window (over some\n interval) just prior to a congestion signal.')
tcpEStatsPathPreCongSumRTT = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 4, 1, 24), ZeroBasedCounter32()).setUnits('milliseconds').setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPathPreCongSumRTT.setDescription('Sum of the last sample of the RTT (tcpEStatsPathSampleRTT)\n prior to the received congestion signals. This MUST be\n updated each time tcpEStatsPerfCongSignals is incremented,\n such that the change in tcpEStatsPathPreCongSumRTT divided by\n the change in tcpEStatsPerfCongSignals is the average RTT\n (over some interval) just prior to a congestion signal.')
tcpEStatsPathPostCongSumRTT = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 4, 1, 25), ZeroBasedCounter32()).setUnits('milliseconds').setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPathPostCongSumRTT.setDescription('Sum of the first sample of the RTT (tcpEStatsPathSampleRTT)\n following each congestion signal, such that the change in\n tcpEStatsPathPostCongSumRTT divided by the change in\n tcpEStatsPathPostCongCountRTT is the average RTT (over some\n interval) just after a congestion signal.')
tcpEStatsPathPostCongCountRTT = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 4, 1, 26), ZeroBasedCounter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPathPostCongCountRTT.setDescription('The number of RTT samples included in\n tcpEStatsPathPostCongSumRTT such that the change in\n tcpEStatsPathPostCongSumRTT divided by the change in\n tcpEStatsPathPostCongCountRTT is the average RTT (over some\n interval) just after a congestion signal.')
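# The pre/post-congestion averages defined by tcpEStatsPathPreCongSumRTT,
# tcpEStatsPathPostCongSumRTT, tcpEStatsPerfCongSignals, and
# tcpEStatsPathPostCongCountRTT compare RTT just before and just after
# congestion signals. A minimal sketch of the computation over one
# polling interval (illustrative helper with hypothetical snapshot keys,
# not part of this generated module):

```python
def congestion_rtt_averages(prev, curr):
    """Average RTT (ms) just before and just after congestion signals
    over one polling interval. Returns (pre, post); None if no signals
    or no post-congestion samples in the interval."""
    d_signals = curr['CongSignals'] - prev['CongSignals']
    d_count = curr['PostCongCountRTT'] - prev['PostCongCountRTT']
    # delta(PreCongSumRTT) / delta(CongSignals): avg RTT before signals
    pre = ((curr['PreCongSumRTT'] - prev['PreCongSumRTT']) / d_signals
           if d_signals > 0 else None)
    # delta(PostCongSumRTT) / delta(PostCongCountRTT): avg RTT after
    post = ((curr['PostCongSumRTT'] - prev['PostCongSumRTT']) / d_count
            if d_count > 0 else None)
    return pre, post
```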
tcpEStatsPathECNsignals = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 4, 1, 27), ZeroBasedCounter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPathECNsignals.setDescription('The number of congestion signals delivered to the TCP\n sender via explicit congestion notification (ECN). This is\n typically the number of segments bearing Echo Congestion\n Experienced (ECE) bits, but should also include segments\n failing the ECN nonce check or other explicit congestion\n signals.')
tcpEStatsPathDupAckEpisodes = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 4, 1, 28), ZeroBasedCounter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPathDupAckEpisodes.setDescription('The number of Duplicate Acks Sent when prior Ack was not\n duplicate. This is the number of times that a contiguous\n series of duplicate acknowledgments have been sent.\n\n This is an indication of the number of data segments lost\n or reordered on the path from the remote TCP endpoint to\n the near TCP endpoint.')
tcpEStatsPathRcvRTT = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 4, 1, 29), Gauge32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPathRcvRTT.setDescription("The receiver's estimate of the Path RTT.\n\n Adaptive receiver window algorithms depend on the receiver\n having a good estimate of the path RTT.")
tcpEStatsPathDupAcksOut = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 4, 1, 30), ZeroBasedCounter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPathDupAcksOut.setDescription('The number of duplicate ACKs sent. The ratio of the change\n in tcpEStatsPathDupAcksOut to the change in\n tcpEStatsPathDupAckEpisodes is an indication of reorder or\n recovery distance over some interval.')
tcpEStatsPathCERcvd = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 4, 1, 31), ZeroBasedCounter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPathCERcvd.setDescription('The number of segments received with IP headers bearing\n Congestion Experienced (CE) markings.')
tcpEStatsPathECESent = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 4, 1, 32), ZeroBasedCounter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsPathECESent.setDescription('Number of times the Echo Congestion Experienced (ECE) bit\n in the TCP header has been set (transitioned from 0 to 1),\n due to a Congestion Experienced (CE) marking on an IP\n header. Note that ECE can be set and reset only once per\n RTT, while CE can be set on many segments per RTT.')
tcpEStatsStackTable = MibTable((1, 3, 6, 1, 2, 1, 156, 1, 1, 5), )
if mibBuilder.loadTexts: tcpEStatsStackTable.setDescription('This table contains objects that are most useful for\n determining how well some of the TCP control\n algorithms are coping with this particular path.\n\n Entries are retained in this table for the number of\n seconds indicated by the tcpEStatsConnTableLatency\n object, after the TCP connection first enters the closed\n state.')
tcpEStatsStackEntry = MibTableRow((1, 3, 6, 1, 2, 1, 156, 1, 1, 5, 1), ).setIndexNames((0, "TCP-ESTATS-MIB", "tcpEStatsConnectIndex"))
if mibBuilder.loadTexts: tcpEStatsStackEntry.setDescription('Each entry in this table has information about the\n characteristics of each active and recently closed TCP\n connection.')
tcpEStatsStackActiveOpen = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 5, 1, 1), TruthValue()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsStackActiveOpen.setDescription('True(1) if the local connection traversed the SYN-SENT\n state, else false(2).')
tcpEStatsStackMSSSent = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 5, 1, 2), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsStackMSSSent.setDescription('The value sent in an MSS option, or zero if none.')
tcpEStatsStackMSSRcvd = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 5, 1, 3), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsStackMSSRcvd.setDescription('The value received in an MSS option, or zero if none.')
tcpEStatsStackWinScaleSent = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 5, 1, 4), Integer32().subtype(subtypeSpec=ValueRangeConstraint(-1,14))).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsStackWinScaleSent.setDescription('The value of the transmitted window scale option if one was\n sent; otherwise, a value of -1.\n\n Note that if both tcpEStatsStackWinScaleSent and\n tcpEStatsStackWinScaleRcvd are not -1, then Rcv.Wind.Scale\n will be the same as this value and used to scale receiver\n window announcements from the local host to the remote\n host.')
tcpEStatsStackWinScaleRcvd = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 5, 1, 5), Integer32().subtype(subtypeSpec=ValueRangeConstraint(-1,14))).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsStackWinScaleRcvd.setDescription('The value of the received window scale option if one was\n received; otherwise, a value of -1.\n\n Note that if both tcpEStatsStackWinScaleSent and\n tcpEStatsStackWinScaleRcvd are not -1, then Snd.Wind.Scale\n will be the same as this value and used to scale receiver\n window announcements from the remote host to the local\n host.')
tcpEStatsStackTimeStamps = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 5, 1, 6), TcpEStatsNegotiated()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsStackTimeStamps.setDescription('Enabled(1) if TCP timestamps have been negotiated on,\n selfDisabled(2) if they are disabled or not implemented on\n the local host, or peerDisabled(3) if not negotiated by the\n remote host.')
tcpEStatsStackECN = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 5, 1, 7), TcpEStatsNegotiated()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsStackECN.setDescription('Enabled(1) if Explicit Congestion Notification (ECN) has\n been negotiated on, selfDisabled(2) if it is disabled or\n not implemented on the local host, or peerDisabled(3) if\n not negotiated by the remote host.')
tcpEStatsStackWillSendSACK = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 5, 1, 8), TcpEStatsNegotiated()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsStackWillSendSACK.setDescription('Enabled(1) if the local host will send SACK options,\n selfDisabled(2) if SACK is disabled or not implemented on\n the local host, or peerDisabled(3) if the remote host did\n not send the SACK-permitted option.\n\n Note that SACK negotiation is not symmetrical. SACK can be\n enabled on one side of the connection and not the other.')
tcpEStatsStackWillUseSACK = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 5, 1, 9), TcpEStatsNegotiated()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsStackWillUseSACK.setDescription('Enabled(1) if the local host will process SACK options,\n selfDisabled(2) if SACK is disabled or not implemented on\n the local host, or peerDisabled(3) if the remote host sends\n duplicate ACKs without SACK options, or the local host\n otherwise decides not to process received SACK options.\n\n Unlike other TCP options, the remote data receiver cannot\n explicitly indicate if it is able to generate SACK options.\n When sending data, the local host has to deduce if the\n remote receiver is sending SACK options. This object can\n transition from Enabled(1) to peerDisabled(3) after the SYN\n exchange.\n\n Note that SACK negotiation is not symmetrical. SACK can be\n enabled on one side of the connection and not the other.')
tcpEStatsStackState = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 5, 1, 10), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12,))).clone(namedValues=NamedValues(("tcpESStateClosed", 1), ("tcpESStateListen", 2), ("tcpESStateSynSent", 3), ("tcpESStateSynReceived", 4), ("tcpESStateEstablished", 5), ("tcpESStateFinWait1", 6), ("tcpESStateFinWait2", 7), ("tcpESStateCloseWait", 8), ("tcpESStateLastAck", 9), ("tcpESStateClosing", 10), ("tcpESStateTimeWait", 11), ("tcpESStateDeleteTcb", 12),))).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsStackState.setDescription('An integer value representing the connection state from the\n TCP State Transition Diagram.\n\n The value listen(2) is included only for parallelism to the\n old tcpConnTable, and SHOULD NOT be used because the listen\n state is managed by the tcpListenerTable.\n\n The value DeleteTcb(12) is included only for parallelism to\n the tcpConnTable mechanism for terminating connections,\n although this table does not permit writing.')
tcpEStatsStackNagle = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 5, 1, 11), TruthValue()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsStackNagle.setDescription('True(1) if the Nagle algorithm is being used, else\n false(2).')
tcpEStatsStackMaxSsCwnd = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 5, 1, 12), Gauge32()).setUnits('octets').setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsStackMaxSsCwnd.setDescription('The maximum congestion window used during Slow Start, in\n octets.')
tcpEStatsStackMaxCaCwnd = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 5, 1, 13), Gauge32()).setUnits('octets').setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsStackMaxCaCwnd.setDescription('The maximum congestion window used during Congestion\n Avoidance, in octets.')
tcpEStatsStackMaxSsthresh = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 5, 1, 14), Gauge32()).setUnits('octets').setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsStackMaxSsthresh.setDescription('The maximum slow start threshold, excluding the initial\n value.')
tcpEStatsStackMinSsthresh = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 5, 1, 15), Gauge32()).setUnits('octets').setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsStackMinSsthresh.setDescription('The minimum slow start threshold.')
tcpEStatsStackInRecovery = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 5, 1, 16), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3,))).clone(namedValues=NamedValues(("tcpESDataContiguous", 1), ("tcpESDataUnordered", 2), ("tcpESDataRecovery", 3),))).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsStackInRecovery.setDescription('An integer value representing the state of the loss\n recovery for this connection.\n\n tcpESDataContiguous(1) indicates that the remote receiver\n is reporting contiguous data (no duplicate acknowledgments\n or SACK options) and that there are no unacknowledged\n retransmissions.\n\n tcpESDataUnordered(2) indicates that the remote receiver is\n reporting missing or out-of-order data (e.g., sending\n duplicate acknowledgments or SACK options) and that there\n are no unacknowledged retransmissions (because the missing\n data has not yet been retransmitted).\n\n tcpESDataRecovery(3) indicates that the sender has\n outstanding retransmitted data that is still\n unacknowledged.')
tcpEStatsStackDupAcksIn = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 5, 1, 17), ZeroBasedCounter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsStackDupAcksIn.setDescription('The number of duplicate ACKs received.')
tcpEStatsStackSpuriousFrDetected = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 5, 1, 18), ZeroBasedCounter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsStackSpuriousFrDetected.setDescription('The number of acknowledgments reporting out-of-order\n segments after the Fast Retransmit algorithm has already\n retransmitted the segments (for example, as detected by the\n Eifel algorithm).')
tcpEStatsStackSpuriousRtoDetected = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 5, 1, 19), ZeroBasedCounter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsStackSpuriousRtoDetected.setDescription('The number of acknowledgments reporting segments that have\n already been retransmitted due to a Retransmission Timeout.')
tcpEStatsStackSoftErrors = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 5, 1, 21), ZeroBasedCounter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsStackSoftErrors.setDescription('The number of segments that fail various consistency tests\n during TCP input processing. Soft errors might cause the\n segment to be discarded but some do not. Some of these soft\n errors cause the generation of a TCP acknowledgment, while\n others are silently discarded.')
tcpEStatsStackSoftErrorReason = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 5, 1, 22), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4, 5, 6, 7, 8,))).clone(namedValues=NamedValues(("belowDataWindow", 1), ("aboveDataWindow", 2), ("belowAckWindow", 3), ("aboveAckWindow", 4), ("belowTSWindow", 5), ("aboveTSWindow", 6), ("dataCheckSum", 7), ("otherSoftError", 8),))).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsStackSoftErrorReason.setDescription('This object identifies which consistency test most recently\n failed during TCP input processing. This object SHOULD be\n set every time tcpEStatsStackSoftErrors is incremented. The\n codes are as follows:\n\n belowDataWindow(1) - All data in the segment is below\n SND.UNA. (Normal for keep-alives and zero window probes).\n\n aboveDataWindow(2) - Some data in the segment is above\n SND.WND. (Indicates an implementation bug or possible\n attack).\n\n belowAckWindow(3) - ACK below SND.UNA. (Indicates that the\n return path is reordering ACKs)\n\n aboveAckWindow(4) - An ACK for data that we have not sent.\n (Indicates an implementation bug or possible attack).\n\n belowTSWindow(5) - TSecr on the segment is older than the\n current TS.Recent (Normal for the rare case where PAWS\n detects data reordered by the network).\n\n aboveTSWindow(6) - TSecr on the segment is newer than the\n current TS.Recent. (Indicates an implementation bug or\n possible attack).\n\n dataCheckSum(7) - Incorrect checksum. Note that this value\n is intrinsically fragile, because the header fields used to\n identify the connection may have been corrupted.\n\n otherSoftError(8) - All other soft errors not listed\n above.')
tcpEStatsStackSlowStart = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 5, 1, 23), ZeroBasedCounter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsStackSlowStart.setDescription('The number of times the congestion window has been\n increased by the Slow Start algorithm.')
tcpEStatsStackCongAvoid = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 5, 1, 24), ZeroBasedCounter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsStackCongAvoid.setDescription('The number of times the congestion window has been\n increased by the Congestion Avoidance algorithm.')
tcpEStatsStackOtherReductions = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 5, 1, 25), ZeroBasedCounter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsStackOtherReductions.setDescription('The number of congestion window reductions made as a result\n of anything other than AIMD congestion control algorithms.\n Examples of non-multiplicative window reductions include\n Congestion Window Validation [RFC2861] and experimental\n algorithms such as Vegas [Bra94].\n\n All window reductions MUST be counted as either\n tcpEStatsPerfCongSignals or tcpEStatsStackOtherReductions.')
tcpEStatsStackCongOverCount = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 5, 1, 26), ZeroBasedCounter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsStackCongOverCount.setDescription("The number of congestion events that were 'backed out' of\n the congestion control state machine such that the\n congestion window was restored to a prior value. This can\n happen due to the Eifel algorithm [RFC3522] or other\n algorithms that can be used to detect and cancel spurious\n invocations of the Fast Retransmit Algorithm.\n\n Although it may be feasible to undo the effects of spurious\n invocations of the Fast Retransmit algorithm, congestion\n events cannot easily be backed out of\n tcpEStatsPerfCongSignals and tcpEStatsPathPreCongSumCwnd,\n etc.")
tcpEStatsStackFastRetran = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 5, 1, 27), ZeroBasedCounter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsStackFastRetran.setDescription('The number of invocations of the Fast Retransmit algorithm.')
tcpEStatsStackSubsequentTimeouts = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 5, 1, 28), ZeroBasedCounter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsStackSubsequentTimeouts.setDescription('The number of times the retransmit timeout has expired after\n the RTO has been doubled. See Section 5.5 of RFC 2988.')
tcpEStatsStackCurTimeoutCount = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 5, 1, 29), Gauge32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsStackCurTimeoutCount.setDescription('The current number of times the retransmit timeout has\n expired without receiving an acknowledgment for new data.\n tcpEStatsStackCurTimeoutCount is reset to zero when new\n data is acknowledged and incremented for each invocation of\n Section 5.5 of RFC 2988.')
tcpEStatsStackAbruptTimeouts = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 5, 1, 30), ZeroBasedCounter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsStackAbruptTimeouts.setDescription('The number of timeouts that occurred without any\n immediately preceding duplicate acknowledgments or other\n indications of congestion. Abrupt Timeouts indicate that\n the path lost an entire window of data or acknowledgments.\n\n Timeouts that are preceded by duplicate acknowledgments or\n other congestion signals (e.g., ECN) are not counted as\n abrupt, and might have been avoided by a more sophisticated\n Fast Retransmit algorithm.')
tcpEStatsStackSACKsRcvd = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 5, 1, 31), ZeroBasedCounter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsStackSACKsRcvd.setDescription('The number of SACK options received.')
tcpEStatsStackSACKBlocksRcvd = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 5, 1, 32), ZeroBasedCounter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsStackSACKBlocksRcvd.setDescription('The number of SACK blocks received (within SACK options).')
tcpEStatsStackSendStall = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 5, 1, 33), ZeroBasedCounter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsStackSendStall.setDescription('The number of interface stalls or other sender local\n resource limitations that are treated as congestion\n signals.')
tcpEStatsStackDSACKDups = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 5, 1, 34), ZeroBasedCounter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsStackDSACKDups.setDescription('The number of duplicate segments reported to the local host\n by D-SACK blocks.')
tcpEStatsStackMaxMSS = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 5, 1, 35), Gauge32()).setUnits('octets').setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsStackMaxMSS.setDescription('The maximum MSS, in octets.')
tcpEStatsStackMinMSS = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 5, 1, 36), Gauge32()).setUnits('octets').setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsStackMinMSS.setDescription('The minimum MSS, in octets.')
tcpEStatsStackSndInitial = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 5, 1, 37), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsStackSndInitial.setDescription('Initial send sequence number. Note that by definition\n tcpEStatsStackSndInitial never changes for a given\n connection.')
tcpEStatsStackRecInitial = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 5, 1, 38), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsStackRecInitial.setDescription('Initial receive sequence number. Note that by definition\n tcpEStatsStackRecInitial never changes for a given\n connection.')
tcpEStatsStackCurRetxQueue = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 5, 1, 39), Gauge32()).setUnits('octets').setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsStackCurRetxQueue.setDescription('The current number of octets of data occupying the\n retransmit queue.')
tcpEStatsStackMaxRetxQueue = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 5, 1, 40), Gauge32()).setUnits('octets').setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsStackMaxRetxQueue.setDescription('The maximum number of octets of data occupying the\n retransmit queue.')
tcpEStatsStackCurReasmQueue = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 5, 1, 41), Gauge32()).setUnits('octets').setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsStackCurReasmQueue.setDescription('The current number of octets of sequence space spanned by\n the reassembly queue. This is generally the difference\n between rcv.nxt and the sequence number of the right most\n edge of the reassembly queue.')
tcpEStatsStackMaxReasmQueue = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 5, 1, 42), Gauge32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsStackMaxReasmQueue.setDescription('The maximum value of tcpEStatsStackCurReasmQueue.')
tcpEStatsAppTable = MibTable((1, 3, 6, 1, 2, 1, 156, 1, 1, 6), )
if mibBuilder.loadTexts: tcpEStatsAppTable.setDescription('This table contains objects that are useful for\n determining if the application using TCP is\n limiting TCP performance.\n\n Entries are retained in this table for the number of\n seconds indicated by the tcpEStatsConnTableLatency\n object, after the TCP connection first enters the closed\n state.')
tcpEStatsAppEntry = MibTableRow((1, 3, 6, 1, 2, 1, 156, 1, 1, 6, 1), ).setIndexNames((0, "TCP-ESTATS-MIB", "tcpEStatsConnectIndex"))
if mibBuilder.loadTexts: tcpEStatsAppEntry.setDescription('Each entry in this table has information about the\n characteristics of each active and recently closed TCP\n connection.')
tcpEStatsAppSndUna = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 6, 1, 1), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsAppSndUna.setDescription('The value of SND.UNA, the oldest unacknowledged sequence\n number.\n\n Note that SND.UNA is a TCP state variable that is congruent\n to Counter32 semantics.')
tcpEStatsAppSndNxt = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 6, 1, 2), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsAppSndNxt.setDescription('The value of SND.NXT, the next sequence number to be sent.\n Note that tcpEStatsAppSndNxt is not monotonic (and thus not\n a counter) because TCP sometimes retransmits lost data by\n pulling tcpEStatsAppSndNxt back to the missing data.')
tcpEStatsAppSndMax = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 6, 1, 3), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsAppSndMax.setDescription('The farthest forward (right most or largest) SND.NXT value.\n Note that this will be equal to tcpEStatsAppSndNxt except\n when tcpEStatsAppSndNxt is pulled back during recovery.')
tcpEStatsAppThruOctetsAcked = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 6, 1, 4), ZeroBasedCounter32()).setUnits('octets').setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsAppThruOctetsAcked.setDescription('The number of octets for which cumulative acknowledgments\n have been received. Note that this will be the sum of\n changes to tcpEStatsAppSndUna.')
tcpEStatsAppHCThruOctetsAcked = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 6, 1, 5), ZeroBasedCounter64()).setUnits('octets').setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsAppHCThruOctetsAcked.setDescription('The number of octets for which cumulative acknowledgments\n have been received, on systems that can receive more than\n 10 million bits per second. Note that this will be the sum\n of changes in tcpEStatsAppSndUna.')
tcpEStatsAppRcvNxt = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 6, 1, 6), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsAppRcvNxt.setDescription('The value of RCV.NXT. The next sequence number expected on\n an incoming segment, and the left or lower edge of the\n receive window.\n\n Note that RCV.NXT is a TCP state variable that is congruent\n to Counter32 semantics.')
tcpEStatsAppThruOctetsReceived = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 6, 1, 7), ZeroBasedCounter32()).setUnits('octets').setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsAppThruOctetsReceived.setDescription('The number of octets for which cumulative acknowledgments\n have been sent. Note that this will be the sum of changes\n to tcpEStatsAppRcvNxt.')
tcpEStatsAppHCThruOctetsReceived = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 6, 1, 8), ZeroBasedCounter64()).setUnits('octets').setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsAppHCThruOctetsReceived.setDescription('The number of octets for which cumulative acknowledgments\n have been sent, on systems that can transmit more than 10\n million bits per second. Note that this will be the sum of\n changes in tcpEStatsAppRcvNxt.')
tcpEStatsAppCurAppWQueue = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 6, 1, 11), Gauge32()).setUnits('octets').setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsAppCurAppWQueue.setDescription('The current number of octets of application data buffered\n by TCP, pending first transmission, i.e., to the left of\n SND.NXT or SndMax. This data will generally be transmitted\n (and SND.NXT advanced to the left) as soon as there is an\n available congestion window (cwnd) or receiver window\n (rwin). This is the amount of data readily available for\n transmission, without scheduling the application. TCP\n performance may suffer if there is insufficient queued\n write data.')
tcpEStatsAppMaxAppWQueue = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 6, 1, 12), Gauge32()).setUnits('octets').setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsAppMaxAppWQueue.setDescription('The maximum number of octets of application data buffered\n by TCP, pending first transmission. This is the maximum\n value of tcpEStatsAppCurAppWQueue. This pair of objects can\n be used to determine if insufficient queued data is steady\n state (suggesting insufficient queue space) or transient\n (suggesting insufficient application performance or\n excessive CPU load or scheduler latency).')
tcpEStatsAppCurAppRQueue = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 6, 1, 13), Gauge32()).setUnits('octets').setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsAppCurAppRQueue.setDescription('The current number of octets of application data that has\n been acknowledged by TCP but not yet delivered to the\n application.')
tcpEStatsAppMaxAppRQueue = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 6, 1, 14), Gauge32()).setUnits('octets').setMaxAccess("readonly")
if mibBuilder.loadTexts: tcpEStatsAppMaxAppRQueue.setDescription('The maximum number of octets of application data that has\n been acknowledged by TCP but not yet delivered to the\n application.')
tcpEStatsTuneTable = MibTable((1, 3, 6, 1, 2, 1, 156, 1, 1, 7), )
if mibBuilder.loadTexts: tcpEStatsTuneTable.setDescription('This table contains per-connection controls that can\n be used to work around a number of common problems that\n plague TCP over some paths. All can be characterized as\n limiting the growth of the congestion window so as to\n prevent TCP from overwhelming some component in the\n path.\n\n Entries are retained in this table for the number of\n seconds indicated by the tcpEStatsConnTableLatency\n object, after the TCP connection first enters the closed\n state.')
tcpEStatsTuneEntry = MibTableRow((1, 3, 6, 1, 2, 1, 156, 1, 1, 7, 1), ).setIndexNames((0, "TCP-ESTATS-MIB", "tcpEStatsConnectIndex"))
if mibBuilder.loadTexts: tcpEStatsTuneEntry.setDescription('Each entry in this table is a control that can be used to\n place limits on each active TCP connection.')
tcpEStatsTuneLimCwnd = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 7, 1, 1), Unsigned32()).setUnits('octets').setMaxAccess("readwrite")
if mibBuilder.loadTexts: tcpEStatsTuneLimCwnd.setDescription('A control to set the maximum congestion window that may be\n used, in octets.')
tcpEStatsTuneLimSsthresh = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 7, 1, 2), Unsigned32()).setUnits('octets').setMaxAccess("readwrite")
if mibBuilder.loadTexts: tcpEStatsTuneLimSsthresh.setDescription('A control to limit the maximum queue space (in octets) that\n this TCP connection is likely to occupy during slowstart.\n\n It can be implemented with the algorithm described in\n RFC 3742 by setting the max_ssthresh parameter to twice\n tcpEStatsTuneLimSsthresh.\n\n This algorithm can be used to overcome some TCP performance\n problems over network paths that do not have sufficient\n buffering to withstand the bursts normally present during\n slowstart.')
tcpEStatsTuneLimRwin = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 7, 1, 3), Unsigned32()).setUnits('octets').setMaxAccess("readwrite")
if mibBuilder.loadTexts: tcpEStatsTuneLimRwin.setDescription('A control to set the maximum window advertisement that may\n be sent, in octets.')
tcpEStatsTuneLimMSS = MibTableColumn((1, 3, 6, 1, 2, 1, 156, 1, 1, 7, 1, 4), Unsigned32()).setUnits('octets').setMaxAccess("readwrite")
if mibBuilder.loadTexts: tcpEStatsTuneLimMSS.setDescription('A control to limit the maximum segment size in octets, that\n this TCP connection can use.')
tcpEStatsEstablishNotification = NotificationType((1, 3, 6, 1, 2, 1, 156, 0, 1)).setObjects(*(("TCP-ESTATS-MIB", "tcpEStatsConnectIndex"),))
if mibBuilder.loadTexts: tcpEStatsEstablishNotification.setDescription('The indicated connection has been accepted\n (or alternatively entered the established state).')
tcpEStatsCloseNotification = NotificationType((1, 3, 6, 1, 2, 1, 156, 0, 2)).setObjects(*(("TCP-ESTATS-MIB", "tcpEStatsConnectIndex"),))
if mibBuilder.loadTexts: tcpEStatsCloseNotification.setDescription('The indicated connection has left the\n established state')
tcpEStatsCompliances = MibIdentifier((1, 3, 6, 1, 2, 1, 156, 2, 1))
tcpEStatsGroups = MibIdentifier((1, 3, 6, 1, 2, 1, 156, 2, 2))
tcpEStatsCompliance = ModuleCompliance((1, 3, 6, 1, 2, 1, 156, 2, 1, 1)).setObjects(*(("TCP-ESTATS-MIB", "tcpEStatsListenerGroup"), ("TCP-ESTATS-MIB", "tcpEStatsConnectIdGroup"), ("TCP-ESTATS-MIB", "tcpEStatsPerfGroup"), ("TCP-ESTATS-MIB", "tcpEStatsPathGroup"), ("TCP-ESTATS-MIB", "tcpEStatsStackGroup"), ("TCP-ESTATS-MIB", "tcpEStatsAppGroup"), ("TCP-ESTATS-MIB", "tcpEStatsListenerHCGroup"), ("TCP-ESTATS-MIB", "tcpEStatsPerfOptionalGroup"), ("TCP-ESTATS-MIB", "tcpEStatsPerfHCGroup"), ("TCP-ESTATS-MIB", "tcpEStatsPathOptionalGroup"), ("TCP-ESTATS-MIB", "tcpEStatsPathHCGroup"), ("TCP-ESTATS-MIB", "tcpEStatsStackOptionalGroup"), ("TCP-ESTATS-MIB", "tcpEStatsAppHCGroup"), ("TCP-ESTATS-MIB", "tcpEStatsAppOptionalGroup"), ("TCP-ESTATS-MIB", "tcpEStatsTuneOptionalGroup"), ("TCP-ESTATS-MIB", "tcpEStatsNotificationsGroup"), ("TCP-ESTATS-MIB", "tcpEStatsNotificationsCtlGroup"),))
if mibBuilder.loadTexts: tcpEStatsCompliance.setDescription('Compliance statement for all systems that implement TCP\n extended statistics.')
tcpEStatsListenerGroup = ObjectGroup((1, 3, 6, 1, 2, 1, 156, 2, 2, 1)).setObjects(*(("TCP-ESTATS-MIB", "tcpEStatsListenerTableLastChange"), ("TCP-ESTATS-MIB", "tcpEStatsListenerStartTime"), ("TCP-ESTATS-MIB", "tcpEStatsListenerSynRcvd"), ("TCP-ESTATS-MIB", "tcpEStatsListenerInitial"), ("TCP-ESTATS-MIB", "tcpEStatsListenerEstablished"), ("TCP-ESTATS-MIB", "tcpEStatsListenerAccepted"), ("TCP-ESTATS-MIB", "tcpEStatsListenerExceedBacklog"), ("TCP-ESTATS-MIB", "tcpEStatsListenerCurConns"), ("TCP-ESTATS-MIB", "tcpEStatsListenerMaxBacklog"), ("TCP-ESTATS-MIB", "tcpEStatsListenerCurBacklog"), ("TCP-ESTATS-MIB", "tcpEStatsListenerCurEstabBacklog"),))
if mibBuilder.loadTexts: tcpEStatsListenerGroup.setDescription('The tcpEStatsListener group includes objects that\n provide valuable statistics and debugging\n information for TCP Listeners.')
tcpEStatsListenerHCGroup = ObjectGroup((1, 3, 6, 1, 2, 1, 156, 2, 2, 2)).setObjects(*(("TCP-ESTATS-MIB", "tcpEStatsListenerHCSynRcvd"), ("TCP-ESTATS-MIB", "tcpEStatsListenerHCInitial"), ("TCP-ESTATS-MIB", "tcpEStatsListenerHCEstablished"), ("TCP-ESTATS-MIB", "tcpEStatsListenerHCAccepted"), ("TCP-ESTATS-MIB", "tcpEStatsListenerHCExceedBacklog"),))
if mibBuilder.loadTexts: tcpEStatsListenerHCGroup.setDescription('The tcpEStatsListenerHC group includes 64-bit\n counters in tcpEStatsListenerTable.')
tcpEStatsConnectIdGroup = ObjectGroup((1, 3, 6, 1, 2, 1, 156, 2, 2, 3)).setObjects(*(("TCP-ESTATS-MIB", "tcpEStatsConnTableLatency"), ("TCP-ESTATS-MIB", "tcpEStatsConnectIndex"),))
if mibBuilder.loadTexts: tcpEStatsConnectIdGroup.setDescription('The tcpEStatsConnectId group includes objects that\n identify TCP connections and control how long TCP\n connection entries are retained in the tables.')
tcpEStatsPerfGroup = ObjectGroup((1, 3, 6, 1, 2, 1, 156, 2, 2, 4)).setObjects(*(("TCP-ESTATS-MIB", "tcpEStatsPerfSegsOut"), ("TCP-ESTATS-MIB", "tcpEStatsPerfDataSegsOut"), ("TCP-ESTATS-MIB", "tcpEStatsPerfDataOctetsOut"), ("TCP-ESTATS-MIB", "tcpEStatsPerfSegsRetrans"), ("TCP-ESTATS-MIB", "tcpEStatsPerfOctetsRetrans"), ("TCP-ESTATS-MIB", "tcpEStatsPerfSegsIn"), ("TCP-ESTATS-MIB", "tcpEStatsPerfDataSegsIn"), ("TCP-ESTATS-MIB", "tcpEStatsPerfDataOctetsIn"), ("TCP-ESTATS-MIB", "tcpEStatsPerfElapsedSecs"), ("TCP-ESTATS-MIB", "tcpEStatsPerfElapsedMicroSecs"), ("TCP-ESTATS-MIB", "tcpEStatsPerfStartTimeStamp"), ("TCP-ESTATS-MIB", "tcpEStatsPerfCurMSS"), ("TCP-ESTATS-MIB", "tcpEStatsPerfPipeSize"), ("TCP-ESTATS-MIB", "tcpEStatsPerfMaxPipeSize"), ("TCP-ESTATS-MIB", "tcpEStatsPerfSmoothedRTT"), ("TCP-ESTATS-MIB", "tcpEStatsPerfCurRTO"), ("TCP-ESTATS-MIB", "tcpEStatsPerfCongSignals"), ("TCP-ESTATS-MIB", "tcpEStatsPerfCurCwnd"), ("TCP-ESTATS-MIB", "tcpEStatsPerfCurSsthresh"), ("TCP-ESTATS-MIB", "tcpEStatsPerfTimeouts"), ("TCP-ESTATS-MIB", "tcpEStatsPerfCurRwinSent"), ("TCP-ESTATS-MIB", "tcpEStatsPerfMaxRwinSent"), ("TCP-ESTATS-MIB", "tcpEStatsPerfZeroRwinSent"), ("TCP-ESTATS-MIB", "tcpEStatsPerfCurRwinRcvd"), ("TCP-ESTATS-MIB", "tcpEStatsPerfMaxRwinRcvd"), ("TCP-ESTATS-MIB", "tcpEStatsPerfZeroRwinRcvd"),))
if mibBuilder.loadTexts: tcpEStatsPerfGroup.setDescription('The tcpEStatsPerf group includes those objects that\n provide basic performance data for a TCP connection.')
tcpEStatsPerfOptionalGroup = ObjectGroup((1, 3, 6, 1, 2, 1, 156, 2, 2, 5)).setObjects(*(("TCP-ESTATS-MIB", "tcpEStatsPerfSndLimTransRwin"), ("TCP-ESTATS-MIB", "tcpEStatsPerfSndLimTransCwnd"), ("TCP-ESTATS-MIB", "tcpEStatsPerfSndLimTransSnd"), ("TCP-ESTATS-MIB", "tcpEStatsPerfSndLimTimeRwin"), ("TCP-ESTATS-MIB", "tcpEStatsPerfSndLimTimeCwnd"), ("TCP-ESTATS-MIB", "tcpEStatsPerfSndLimTimeSnd"),))
if mibBuilder.loadTexts: tcpEStatsPerfOptionalGroup.setDescription('The tcpEStatsPerf group includes those objects that\n provide basic performance data for a TCP connection.')
tcpEStatsPerfHCGroup = ObjectGroup((1, 3, 6, 1, 2, 1, 156, 2, 2, 6)).setObjects(*(("TCP-ESTATS-MIB", "tcpEStatsPerfHCDataOctetsOut"), ("TCP-ESTATS-MIB", "tcpEStatsPerfHCDataOctetsIn"),))
if mibBuilder.loadTexts: tcpEStatsPerfHCGroup.setDescription('The tcpEStatsPerfHC group includes 64-bit\n counters in the tcpEStatsPerfTable.')
tcpEStatsPathGroup = ObjectGroup((1, 3, 6, 1, 2, 1, 156, 2, 2, 7)).setObjects(*(("TCP-ESTATS-MIB", "tcpEStatsControlPath"), ("TCP-ESTATS-MIB", "tcpEStatsPathRetranThresh"), ("TCP-ESTATS-MIB", "tcpEStatsPathNonRecovDAEpisodes"), ("TCP-ESTATS-MIB", "tcpEStatsPathSumOctetsReordered"), ("TCP-ESTATS-MIB", "tcpEStatsPathNonRecovDA"),))
if mibBuilder.loadTexts: tcpEStatsPathGroup.setDescription('The tcpEStatsPath group includes objects that\n control the creation of the tcpEStatsPathTable,\n and provide information about the path\n for each TCP connection.')
tcpEStatsPathOptionalGroup = ObjectGroup((1, 3, 6, 1, 2, 1, 156, 2, 2, 8)).setObjects(*(("TCP-ESTATS-MIB", "tcpEStatsPathSampleRTT"), ("TCP-ESTATS-MIB", "tcpEStatsPathRTTVar"), ("TCP-ESTATS-MIB", "tcpEStatsPathMaxRTT"), ("TCP-ESTATS-MIB", "tcpEStatsPathMinRTT"), ("TCP-ESTATS-MIB", "tcpEStatsPathSumRTT"), ("TCP-ESTATS-MIB", "tcpEStatsPathCountRTT"), ("TCP-ESTATS-MIB", "tcpEStatsPathMaxRTO"), ("TCP-ESTATS-MIB", "tcpEStatsPathMinRTO"), ("TCP-ESTATS-MIB", "tcpEStatsPathIpTtl"), ("TCP-ESTATS-MIB", "tcpEStatsPathIpTosIn"), ("TCP-ESTATS-MIB", "tcpEStatsPathIpTosOut"), ("TCP-ESTATS-MIB", "tcpEStatsPathPreCongSumCwnd"), ("TCP-ESTATS-MIB", "tcpEStatsPathPreCongSumRTT"), ("TCP-ESTATS-MIB", "tcpEStatsPathPostCongSumRTT"), ("TCP-ESTATS-MIB", "tcpEStatsPathPostCongCountRTT"), ("TCP-ESTATS-MIB", "tcpEStatsPathECNsignals"), ("TCP-ESTATS-MIB", "tcpEStatsPathDupAckEpisodes"), ("TCP-ESTATS-MIB", "tcpEStatsPathRcvRTT"), ("TCP-ESTATS-MIB", "tcpEStatsPathDupAcksOut"), ("TCP-ESTATS-MIB", "tcpEStatsPathCERcvd"), ("TCP-ESTATS-MIB", "tcpEStatsPathECESent"),))
if mibBuilder.loadTexts: tcpEStatsPathOptionalGroup.setDescription('The tcpEStatsPath group includes objects that\n provide additional information about the path\n for each TCP connection.')
tcpEStatsPathHCGroup = ObjectGroup((1, 3, 6, 1, 2, 1, 156, 2, 2, 9)).setObjects(*(("TCP-ESTATS-MIB", "tcpEStatsPathHCSumRTT"),))
if mibBuilder.loadTexts: tcpEStatsPathHCGroup.setDescription('The tcpEStatsPathHC group includes 64-bit\n counters in the tcpEStatsPathTable.')
tcpEStatsStackGroup = ObjectGroup((1, 3, 6, 1, 2, 1, 156, 2, 2, 10)).setObjects(*(("TCP-ESTATS-MIB", "tcpEStatsControlStack"), ("TCP-ESTATS-MIB", "tcpEStatsStackActiveOpen"), ("TCP-ESTATS-MIB", "tcpEStatsStackMSSSent"), ("TCP-ESTATS-MIB", "tcpEStatsStackMSSRcvd"), ("TCP-ESTATS-MIB", "tcpEStatsStackWinScaleSent"), ("TCP-ESTATS-MIB", "tcpEStatsStackWinScaleRcvd"), ("TCP-ESTATS-MIB", "tcpEStatsStackTimeStamps"), ("TCP-ESTATS-MIB", "tcpEStatsStackECN"), ("TCP-ESTATS-MIB", "tcpEStatsStackWillSendSACK"), ("TCP-ESTATS-MIB", "tcpEStatsStackWillUseSACK"), ("TCP-ESTATS-MIB", "tcpEStatsStackState"), ("TCP-ESTATS-MIB", "tcpEStatsStackNagle"), ("TCP-ESTATS-MIB", "tcpEStatsStackMaxSsCwnd"), ("TCP-ESTATS-MIB", "tcpEStatsStackMaxCaCwnd"), ("TCP-ESTATS-MIB", "tcpEStatsStackMaxSsthresh"), ("TCP-ESTATS-MIB", "tcpEStatsStackMinSsthresh"), ("TCP-ESTATS-MIB", "tcpEStatsStackInRecovery"), ("TCP-ESTATS-MIB", "tcpEStatsStackDupAcksIn"), ("TCP-ESTATS-MIB", "tcpEStatsStackSpuriousFrDetected"), ("TCP-ESTATS-MIB", "tcpEStatsStackSpuriousRtoDetected"),))
if mibBuilder.loadTexts: tcpEStatsStackGroup.setDescription('The tcpEStatsConnState group includes objects that\n control the creation of the tcpEStatsStackTable,\n and provide information about the operation of\n algorithms used within TCP.')
tcpEStatsStackOptionalGroup = ObjectGroup((1, 3, 6, 1, 2, 1, 156, 2, 2, 11)).setObjects(*(("TCP-ESTATS-MIB", "tcpEStatsStackSoftErrors"), ("TCP-ESTATS-MIB", "tcpEStatsStackSoftErrorReason"), ("TCP-ESTATS-MIB", "tcpEStatsStackSlowStart"), ("TCP-ESTATS-MIB", "tcpEStatsStackCongAvoid"), ("TCP-ESTATS-MIB", "tcpEStatsStackOtherReductions"), ("TCP-ESTATS-MIB", "tcpEStatsStackCongOverCount"), ("TCP-ESTATS-MIB", "tcpEStatsStackFastRetran"), ("TCP-ESTATS-MIB", "tcpEStatsStackSubsequentTimeouts"), ("TCP-ESTATS-MIB", "tcpEStatsStackCurTimeoutCount"), ("TCP-ESTATS-MIB", "tcpEStatsStackAbruptTimeouts"), ("TCP-ESTATS-MIB", "tcpEStatsStackSACKsRcvd"), ("TCP-ESTATS-MIB", "tcpEStatsStackSACKBlocksRcvd"), ("TCP-ESTATS-MIB", "tcpEStatsStackSendStall"), ("TCP-ESTATS-MIB", "tcpEStatsStackDSACKDups"), ("TCP-ESTATS-MIB", "tcpEStatsStackMaxMSS"), ("TCP-ESTATS-MIB", "tcpEStatsStackMinMSS"), ("TCP-ESTATS-MIB", "tcpEStatsStackSndInitial"), ("TCP-ESTATS-MIB", "tcpEStatsStackRecInitial"), ("TCP-ESTATS-MIB", "tcpEStatsStackCurRetxQueue"), ("TCP-ESTATS-MIB", "tcpEStatsStackMaxRetxQueue"), ("TCP-ESTATS-MIB", "tcpEStatsStackCurReasmQueue"), ("TCP-ESTATS-MIB", "tcpEStatsStackMaxReasmQueue"),))
if mibBuilder.loadTexts: tcpEStatsStackOptionalGroup.setDescription('The tcpEStatsConnState group includes objects that\n provide additional information about the operation of\n algorithms used within TCP.')
tcpEStatsAppGroup = ObjectGroup((1, 3, 6, 1, 2, 1, 156, 2, 2, 12)).setObjects(*(("TCP-ESTATS-MIB", "tcpEStatsControlApp"), ("TCP-ESTATS-MIB", "tcpEStatsAppSndUna"), ("TCP-ESTATS-MIB", "tcpEStatsAppSndNxt"), ("TCP-ESTATS-MIB", "tcpEStatsAppSndMax"), ("TCP-ESTATS-MIB", "tcpEStatsAppThruOctetsAcked"), ("TCP-ESTATS-MIB", "tcpEStatsAppRcvNxt"), ("TCP-ESTATS-MIB", "tcpEStatsAppThruOctetsReceived"),))
if mibBuilder.loadTexts: tcpEStatsAppGroup.setDescription('The tcpEStatsConnState group includes objects that\n control the creation of the tcpEStatsAppTable,\n and provide information about the operation of\n algorithms used within TCP.')
tcpEStatsAppHCGroup = ObjectGroup((1, 3, 6, 1, 2, 1, 156, 2, 2, 13)).setObjects(*(("TCP-ESTATS-MIB", "tcpEStatsAppHCThruOctetsAcked"), ("TCP-ESTATS-MIB", "tcpEStatsAppHCThruOctetsReceived"),))
if mibBuilder.loadTexts: tcpEStatsAppHCGroup.setDescription('The tcpEStatsStackHC group includes 64-bit\n counters in the tcpEStatsStackTable.')
tcpEStatsAppOptionalGroup = ObjectGroup((1, 3, 6, 1, 2, 1, 156, 2, 2, 14)).setObjects(*(("TCP-ESTATS-MIB", "tcpEStatsAppCurAppWQueue"), ("TCP-ESTATS-MIB", "tcpEStatsAppMaxAppWQueue"), ("TCP-ESTATS-MIB", "tcpEStatsAppCurAppRQueue"), ("TCP-ESTATS-MIB", "tcpEStatsAppMaxAppRQueue"),))
if mibBuilder.loadTexts: tcpEStatsAppOptionalGroup.setDescription('The tcpEStatsConnState group includes objects that\n provide additional information about how applications\n are interacting with each TCP connection.')
tcpEStatsTuneOptionalGroup = ObjectGroup((1, 3, 6, 1, 2, 1, 156, 2, 2, 15)).setObjects(*(("TCP-ESTATS-MIB", "tcpEStatsControlTune"), ("TCP-ESTATS-MIB", "tcpEStatsTuneLimCwnd"), ("TCP-ESTATS-MIB", "tcpEStatsTuneLimSsthresh"), ("TCP-ESTATS-MIB", "tcpEStatsTuneLimRwin"), ("TCP-ESTATS-MIB", "tcpEStatsTuneLimMSS"),))
if mibBuilder.loadTexts: tcpEStatsTuneOptionalGroup.setDescription('The tcpEStatsConnState group includes objects that\n control the creation of the tcpEStatsConnectionTable,\n which can be used to set tuning parameters\n for each TCP connection.')
tcpEStatsNotificationsGroup = NotificationGroup((1, 3, 6, 1, 2, 1, 156, 2, 2, 16)).setObjects(*(("TCP-ESTATS-MIB", "tcpEStatsEstablishNotification"), ("TCP-ESTATS-MIB", "tcpEStatsCloseNotification"),))
if mibBuilder.loadTexts: tcpEStatsNotificationsGroup.setDescription('Notifications sent by a TCP extended statistics agent.')
tcpEStatsNotificationsCtlGroup = ObjectGroup((1, 3, 6, 1, 2, 1, 156, 2, 2, 17)).setObjects(*(("TCP-ESTATS-MIB", "tcpEStatsControlNotify"),))
if mibBuilder.loadTexts: tcpEStatsNotificationsCtlGroup.setDescription('The tcpEStatsNotificationsCtl group includes the\n object that controls the creation of the events\n in the tcpEStatsNotificationsGroup.')
mibBuilder.exportSymbols("TCP-ESTATS-MIB", tcpEStatsPerfSegsIn=tcpEStatsPerfSegsIn, tcpEStatsAppHCThruOctetsAcked=tcpEStatsAppHCThruOctetsAcked, tcpEStatsStackMSSSent=tcpEStatsStackMSSSent, tcpEStatsTuneLimRwin=tcpEStatsTuneLimRwin, tcpEStatsStackTimeStamps=tcpEStatsStackTimeStamps, tcpEStatsStackState=tcpEStatsStackState, tcpEStatsPerfZeroRwinRcvd=tcpEStatsPerfZeroRwinRcvd, tcpEStatsStackSpuriousFrDetected=tcpEStatsStackSpuriousFrDetected, tcpEStatsStackMaxMSS=tcpEStatsStackMaxMSS, tcpEStatsPerfDataOctetsIn=tcpEStatsPerfDataOctetsIn, tcpEStatsStackSACKsRcvd=tcpEStatsStackSACKsRcvd, tcpEStatsTuneTable=tcpEStatsTuneTable, TcpEStatsNegotiated=TcpEStatsNegotiated, tcpEStatsPathCERcvd=tcpEStatsPathCERcvd, tcpEStatsPerfEntry=tcpEStatsPerfEntry, tcpEStatsConnectIndex=tcpEStatsConnectIndex, tcpEStatsPerfSndLimTransSnd=tcpEStatsPerfSndLimTransSnd, tcpEStatsPerfZeroRwinSent=tcpEStatsPerfZeroRwinSent, tcpEStatsStackSACKBlocksRcvd=tcpEStatsStackSACKBlocksRcvd, tcpEStatsPerfSndLimTimeRwin=tcpEStatsPerfSndLimTimeRwin, tcpEStatsPerfTable=tcpEStatsPerfTable, tcpEStatsPathSampleRTT=tcpEStatsPathSampleRTT, tcpEStatsEstablishNotification=tcpEStatsEstablishNotification, tcpEStatsPerfMaxRwinRcvd=tcpEStatsPerfMaxRwinRcvd, tcpEStatsAppMaxAppRQueue=tcpEStatsAppMaxAppRQueue, tcpEStatsPerfCurSsthresh=tcpEStatsPerfCurSsthresh, tcpEStatsStackDSACKDups=tcpEStatsStackDSACKDups, tcpEStatsCloseNotification=tcpEStatsCloseNotification, tcpEStatsAppEntry=tcpEStatsAppEntry, tcpEStatsControlApp=tcpEStatsControlApp, tcpEStatsStackRecInitial=tcpEStatsStackRecInitial, tcpEStatsStackMaxReasmQueue=tcpEStatsStackMaxReasmQueue, tcpEStatsStackWillSendSACK=tcpEStatsStackWillSendSACK, tcpEStatsAppRcvNxt=tcpEStatsAppRcvNxt, tcpEStatsPerfHCGroup=tcpEStatsPerfHCGroup, tcpEStatsPerfSndLimTimeCwnd=tcpEStatsPerfSndLimTimeCwnd, tcpEStatsPerfStartTimeStamp=tcpEStatsPerfStartTimeStamp, tcpEStatsConnectIdTable=tcpEStatsConnectIdTable, tcpEStatsControlStack=tcpEStatsControlStack, 
tcpEStatsStackDupAcksIn=tcpEStatsStackDupAcksIn, tcpEStatsListenerGroup=tcpEStatsListenerGroup, tcpEStatsControlPath=tcpEStatsControlPath, tcpEStatsPathIpTosIn=tcpEStatsPathIpTosIn, tcpEStatsStackOtherReductions=tcpEStatsStackOtherReductions, tcpEStatsStackCurRetxQueue=tcpEStatsStackCurRetxQueue, tcpEStatsTuneEntry=tcpEStatsTuneEntry, tcpEStatsPerfHCDataOctetsIn=tcpEStatsPerfHCDataOctetsIn, tcpEStatsStackMaxSsCwnd=tcpEStatsStackMaxSsCwnd, tcpEStatsPathNonRecovDA=tcpEStatsPathNonRecovDA, tcpEStatsStackSoftErrorReason=tcpEStatsStackSoftErrorReason, tcpEStatsStackTable=tcpEStatsStackTable, tcpEStatsPathECESent=tcpEStatsPathECESent, tcpEStatsPerfPipeSize=tcpEStatsPerfPipeSize, tcpEStatsStackSlowStart=tcpEStatsStackSlowStart, tcpEStatsStackMSSRcvd=tcpEStatsStackMSSRcvd, tcpEStatsListenerAccepted=tcpEStatsListenerAccepted, tcpEStatsAppGroup=tcpEStatsAppGroup, tcpEStatsStackAbruptTimeouts=tcpEStatsStackAbruptTimeouts, tcpEStatsPathPostCongCountRTT=tcpEStatsPathPostCongCountRTT, tcpEStatsPathSumRTT=tcpEStatsPathSumRTT, tcpEStatsPathEntry=tcpEStatsPathEntry, tcpEStatsPathHCGroup=tcpEStatsPathHCGroup, tcpEStatsListenerSynRcvd=tcpEStatsListenerSynRcvd, tcpEStatsStackMinMSS=tcpEStatsStackMinMSS, tcpEStatsPathSumOctetsReordered=tcpEStatsPathSumOctetsReordered, tcpEStatsAppSndUna=tcpEStatsAppSndUna, tcpEStatsPerfTimeouts=tcpEStatsPerfTimeouts, tcpEStatsListenerExceedBacklog=tcpEStatsListenerExceedBacklog, tcpEStatsPathMinRTO=tcpEStatsPathMinRTO, tcpEStatsPerfOctetsRetrans=tcpEStatsPerfOctetsRetrans, tcpEStatsStackMaxSsthresh=tcpEStatsStackMaxSsthresh, tcpEStatsAppOptionalGroup=tcpEStatsAppOptionalGroup, tcpEStatsPathPreCongSumCwnd=tcpEStatsPathPreCongSumCwnd, tcpEStatsListenerMaxBacklog=tcpEStatsListenerMaxBacklog, tcpEStatsPerfCongSignals=tcpEStatsPerfCongSignals, tcpEStatsStackFastRetran=tcpEStatsStackFastRetran, tcpEStatsTuneOptionalGroup=tcpEStatsTuneOptionalGroup, tcpEStatsCompliance=tcpEStatsCompliance, tcpEStatsListenerCurBacklog=tcpEStatsListenerCurBacklog, 
tcpEStatsStackMaxCaCwnd=tcpEStatsStackMaxCaCwnd, tcpEStatsPathIpTosOut=tcpEStatsPathIpTosOut, tcpEStatsControlNotify=tcpEStatsControlNotify, tcpEStatsNotificationsCtlGroup=tcpEStatsNotificationsCtlGroup, tcpEStatsAppTable=tcpEStatsAppTable, tcpEStatsPerfSndLimTimeSnd=tcpEStatsPerfSndLimTimeSnd, tcpEStatsPathRcvRTT=tcpEStatsPathRcvRTT, tcpEStatsStackEntry=tcpEStatsStackEntry, tcpEStatsStackWillUseSACK=tcpEStatsStackWillUseSACK, tcpEStatsPerfSmoothedRTT=tcpEStatsPerfSmoothedRTT, tcpEStatsControl=tcpEStatsControl, tcpEStatsPathMaxRTO=tcpEStatsPathMaxRTO, tcpEStatsAppHCThruOctetsReceived=tcpEStatsAppHCThruOctetsReceived, tcpEStatsAppCurAppWQueue=tcpEStatsAppCurAppWQueue, tcpEStatsGroups=tcpEStatsGroups, tcpEStatsMIBObjects=tcpEStatsMIBObjects, tcpEStatsListenerEstablished=tcpEStatsListenerEstablished, tcpEStatsPerfCurMSS=tcpEStatsPerfCurMSS, tcpEStatsListenerHCEstablished=tcpEStatsListenerHCEstablished, tcpEStatsPathECNsignals=tcpEStatsPathECNsignals, tcpEStatsPerfCurCwnd=tcpEStatsPerfCurCwnd, tcpEStatsNotifications=tcpEStatsNotifications, tcpEStatsListenerHCExceedBacklog=tcpEStatsListenerHCExceedBacklog, tcpEStatsPerfSegsRetrans=tcpEStatsPerfSegsRetrans, tcpEStatsPerfMaxRwinSent=tcpEStatsPerfMaxRwinSent, tcpEStatsPathCountRTT=tcpEStatsPathCountRTT, tcpEStatsPerfSegsOut=tcpEStatsPerfSegsOut, tcpEStatsAppSndNxt=tcpEStatsAppSndNxt, tcpEStatsPerfDataSegsIn=tcpEStatsPerfDataSegsIn, tcpEStatsControlTune=tcpEStatsControlTune, tcpEStatsTuneLimMSS=tcpEStatsTuneLimMSS, tcpEStatsStackSpuriousRtoDetected=tcpEStatsStackSpuriousRtoDetected, tcpEStatsStackSendStall=tcpEStatsStackSendStall, tcpEStatsListenerTable=tcpEStatsListenerTable, tcpEStatsStackInRecovery=tcpEStatsStackInRecovery, tcpEStatsAppThruOctetsAcked=tcpEStatsAppThruOctetsAcked, tcpEStatsStackGroup=tcpEStatsStackGroup, tcpEStatsPathRTTVar=tcpEStatsPathRTTVar, tcpEStatsConnectIdEntry=tcpEStatsConnectIdEntry, tcpEStatsPathHCSumRTT=tcpEStatsPathHCSumRTT, tcpEStatsListenerHCInitial=tcpEStatsListenerHCInitial, 
tcpEStatsAppMaxAppWQueue=tcpEStatsAppMaxAppWQueue, tcpEStatsListenerCurEstabBacklog=tcpEStatsListenerCurEstabBacklog, tcpEStatsListenerHCSynRcvd=tcpEStatsListenerHCSynRcvd, tcpEStatsStackWinScaleRcvd=tcpEStatsStackWinScaleRcvd, tcpEStatsPerfOptionalGroup=tcpEStatsPerfOptionalGroup, tcpEStatsConformance=tcpEStatsConformance, tcpEStatsPerfHCDataOctetsOut=tcpEStatsPerfHCDataOctetsOut, tcpEStatsStackCurTimeoutCount=tcpEStatsStackCurTimeoutCount, tcpEStatsListenerInitial=tcpEStatsListenerInitial, tcpEStatsStackNagle=tcpEStatsStackNagle, tcpEStatsAppCurAppRQueue=tcpEStatsAppCurAppRQueue, tcpEStatsPerfElapsedMicroSecs=tcpEStatsPerfElapsedMicroSecs, tcpEStatsStackCurReasmQueue=tcpEStatsStackCurReasmQueue, tcpEStatsStackSubsequentTimeouts=tcpEStatsStackSubsequentTimeouts, tcpEStatsStackECN=tcpEStatsStackECN, tcpEStatsAppHCGroup=tcpEStatsAppHCGroup, tcpEStatsConnTableLatency=tcpEStatsConnTableLatency, tcpEStatsPathDupAckEpisodes=tcpEStatsPathDupAckEpisodes, tcpEStatsStackMinSsthresh=tcpEStatsStackMinSsthresh, tcpEStatsPathMaxRTT=tcpEStatsPathMaxRTT, tcpEStatsMIB=tcpEStatsMIB, tcpEStatsPathRetranThresh=tcpEStatsPathRetranThresh, tcpEStatsConnectIdGroup=tcpEStatsConnectIdGroup, tcpEStatsTuneLimSsthresh=tcpEStatsTuneLimSsthresh, tcpEStatsPerfSndLimTransCwnd=tcpEStatsPerfSndLimTransCwnd, tcpEStatsPerfCurRTO=tcpEStatsPerfCurRTO, tcpEStatsPathTable=tcpEStatsPathTable, PYSNMP_MODULE_ID=tcpEStatsMIB, tcpEStatsAppSndMax=tcpEStatsAppSndMax, tcpEStatsListenerHCGroup=tcpEStatsListenerHCGroup, tcpEStatsPathIpTtl=tcpEStatsPathIpTtl, tcpEStatsStackCongAvoid=tcpEStatsStackCongAvoid, tcpEStatsPathGroup=tcpEStatsPathGroup, tcpEStatsStackSndInitial=tcpEStatsStackSndInitial, tcpEStatsPathPostCongSumRTT=tcpEStatsPathPostCongSumRTT, tcpEStatsPathMinRTT=tcpEStatsPathMinRTT, tcpEStats=tcpEStats, tcpEStatsPathPreCongSumRTT=tcpEStatsPathPreCongSumRTT, tcpEStatsPathDupAcksOut=tcpEStatsPathDupAcksOut, tcpEStatsStackCongOverCount=tcpEStatsStackCongOverCount, 
tcpEStatsPathOptionalGroup=tcpEStatsPathOptionalGroup, tcpEStatsNotificationsGroup=tcpEStatsNotificationsGroup, tcpEStatsPerfMaxPipeSize=tcpEStatsPerfMaxPipeSize, tcpEStatsListenerEntry=tcpEStatsListenerEntry, tcpEStatsPerfSndLimTransRwin=tcpEStatsPerfSndLimTransRwin, tcpEStatsPerfGroup=tcpEStatsPerfGroup, tcpEStatsListenerHCAccepted=tcpEStatsListenerHCAccepted, tcpEStatsTuneLimCwnd=tcpEStatsTuneLimCwnd, tcpEStatsPerfElapsedSecs=tcpEStatsPerfElapsedSecs, tcpEStatsListenerStartTime=tcpEStatsListenerStartTime, tcpEStatsPerfCurRwinSent=tcpEStatsPerfCurRwinSent, tcpEStatsPathNonRecovDAEpisodes=tcpEStatsPathNonRecovDAEpisodes, tcpEStatsStackMaxRetxQueue=tcpEStatsStackMaxRetxQueue, tcpEStatsStackSoftErrors=tcpEStatsStackSoftErrors, tcpEStatsStackWinScaleSent=tcpEStatsStackWinScaleSent, tcpEStatsListenerTableLastChange=tcpEStatsListenerTableLastChange, tcpEStatsPerfDataSegsOut=tcpEStatsPerfDataSegsOut, tcpEStatsCompliances=tcpEStatsCompliances, tcpEStatsStackActiveOpen=tcpEStatsStackActiveOpen, tcpEStatsPerfCurRwinRcvd=tcpEStatsPerfCurRwinRcvd, tcpEStatsAppThruOctetsReceived=tcpEStatsAppThruOctetsReceived, tcpEStatsPerfDataOctetsOut=tcpEStatsPerfDataOctetsOut, tcpEStatsListenerCurConns=tcpEStatsListenerCurConns, tcpEStatsScalar=tcpEStatsScalar, tcpEStatsStackOptionalGroup=tcpEStatsStackOptionalGroup)
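The registrations above encode every object identifier as a Python tuple of OID arcs. A small standalone sketch (a plain helper, not part of the pysnmp API) of how such a tuple renders as the dotted-decimal OID string shown by MIB browsers:

```python
def oid_to_str(oid):
    """Render a pysnmp-style OID arc tuple as a dotted-decimal string."""
    return ".".join(str(arc) for arc in oid)

# tcpEStatsTuneLimCwnd is registered above at (1, 3, 6, 1, 2, 1, 156, 1, 1, 7, 1, 1).
print(oid_to_str((1, 3, 6, 1, 2, 1, 156, 1, 1, 7, 1, 1)))  # → 1.3.6.1.2.1.156.1.1.7.1.1
```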
# prebuilder/tools/dpkgSig.py (repo: prebuilder/prebuilder.py, license: Unlicense)
import sh
# Pre-bind dpkg-sig to sign as the "builder" key (-s builder) and run in the foreground.
dpkgSig = sh.Command("dpkg-sig").bake(s="builder", _fg=True)
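`sh.Command(...).bake(...)` pre-binds arguments, so a later call like `dpkgSig("pkg.deb")` runs `dpkg-sig -s builder pkg.deb` attached to the terminal. A dependency-free sketch (a hypothetical helper, not the `sh` library itself) of the argv that this baking produces:

```python
def make_dpkg_sig_argv(deb_path, key_id="builder"):
    """Build the command line equivalent to dpkgSig(deb_path): dpkg-sig -s <key> <deb>."""
    return ["dpkg-sig", "-s", key_id, deb_path]

print(make_dpkg_sig_argv("example.deb"))  # → ['dpkg-sig', '-s', 'builder', 'example.deb']
```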
# src/main2/main/BottleAPI/test.py (repo: scianand/Clustering_BinPacking, license: MIT)
import pandas as pd
from TestJson import main
df = pd.read_json("C:\\Users\\1716293.RGU.000\\Clustering-Bin-Packing\\src\\main\\resources\\loads_temp.json")
# clusters = int(input("how many clusters?\n"))
bins = main(df)
print(bins)
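`pd.read_json` above parses the `loads_temp.json` document into a DataFrame before it is handed to the bin-packing `main`. A dependency-free sketch of the same parse step using only the standard library (the record fields shown are hypothetical stand-ins for the real file):

```python
import json

# Stand-in for the contents of loads_temp.json (hypothetical fields).
doc = '[{"load": 3.5, "site": "A"}, {"load": 1.2, "site": "B"}]'
records = json.loads(doc)
print(len(records), records[0]["load"])  # → 2 3.5
```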
# Deep Learning in Python/Chapter 4 - Fine-tuning keras models.py (repo: nabeelsana/DataCamp-courses, license: MIT)
#------------------------------------------------------------------------------------------------------------------------
#------------------------------------------------------------------------------------------------------------------------$
#------------------------------------------------------------------------------------------------------------------------$
#Chapter 4 - Fine-tuning keras models
#------------------------------------------------------------------------------------------------------------------------$
#Changing optimization parameters
# Import the SGD optimizer
from keras.optimizers import SGD
# Create list of learning rates: lr_to_test
lr_to_test = [.000001, 0.01, 1]
# Loop over learning rates
for lr in lr_to_test:
print('\n\nTesting model with learning rate: %f\n'%lr )
# Build new model to test, unaffected by previous models
model = get_new_model()
# Create SGD optimizer with specified learning rate: my_optimizer
my_optimizer = SGD(lr=lr)
# Compile the model
model.compile(optimizer=my_optimizer, loss='categorical_crossentropy')
# Fit the model
model.fit(predictors, target)
#------------------------------------------------------------------------------------------------------------------------$
#Evaluating model accuracy on validation dataset
# Save the number of columns in predictors: n_cols
n_cols = predictors.shape[1]
input_shape = (n_cols,)
# Specify the model
model = Sequential()
model.add(Dense(100, activation='relu', input_shape = input_shape))
model.add(Dense(100, activation='relu'))
model.add(Dense(2, activation='softmax'))
# Compile the model
model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])
# Fit the model
hist = model.fit(predictors, target, validation_split=0.3)
#------------------------------------------------------------------------------------------------------------------------$
#Early stopping: Optimizing the optimization
# Import EarlyStopping
from keras.callbacks import EarlyStopping
# Save the number of columns in predictors: n_cols
n_cols = predictors.shape[1]
input_shape = (n_cols,)
# Specify the model
model = Sequential()
model.add(Dense(100, activation='relu', input_shape = input_shape))
model.add(Dense(100, activation='relu'))
model.add(Dense(2, activation='softmax'))
# Compile the model
model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])
# Define early_stopping_monitor
early_stopping_monitor = EarlyStopping(patience=2)
# Fit the model
model.fit(predictors, target, epochs=30, validation_split=0.3, callbacks=[early_stopping_monitor])
#------------------------------------------------------------------------------------------------------------------------$
#Experimenting with wider networks
# Define early_stopping_monitor
early_stopping_monitor = EarlyStopping(patience=2)
# Create the new model: model_2
model_2 = Sequential()
# Add the first and second layers
model_2.add(Dense(100, activation='relu', input_shape=input_shape))
model_2.add(Dense(100, activation='relu'))
# Add the output layer
model_2.add(Dense(2, activation='softmax'))
# Compile model_2
model_2.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])
# Fit model_1
model_1_training = model_1.fit(predictors, target, epochs=15, validation_split=0.2, callbacks=[early_stopping_monitor], verbose=False)
# Fit model_2
model_2_training = model_2.fit(predictors, target, epochs=15, validation_split=0.2, callbacks=[early_stopping_monitor], verbose=False)
# Create the plot
plt.plot(model_1_training.history['val_loss'], 'r', model_2_training.history['val_loss'], 'b')
plt.xlabel('Epochs')
plt.ylabel('Validation score')
plt.show()
#------------------------------------------------------------------------------------------------------------------------$
#Adding layers to a network
# The input shape to use in the first hidden layer
input_shape = (n_cols,)
# Create the new model: model_2
model_2 = Sequential()
# Add the first, second, and third hidden layers
model_2.add(Dense(50, activation='relu', input_shape=input_shape))
model_2.add(Dense(50, activation='relu'))
model_2.add(Dense(50, activation='relu'))
# Add the output layer
model_2.add(Dense(2, activation='softmax'))
# Compile model_2
model_2.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])
# Fit model 1
model_1_training = model_1.fit(predictors, target, epochs=20, validation_split=0.4, callbacks=[early_stopping_monitor], verbose=False)
# Fit model 2
model_2_training = model_2.fit(predictors, target, epochs=20, validation_split=0.4, callbacks=[early_stopping_monitor], verbose=False)
# Create the plot
plt.plot(model_1_training.history['val_loss'], 'r', model_2_training.history['val_loss'], 'b')
plt.xlabel('Epochs')
plt.ylabel('Validation score')
plt.show()
#------------------------------------------------------------------------------------------------------------------------$
#Building your own digit recognition model
# Create the model: model
model = Sequential()
# Add the first hidden layer
model.add(Dense(50, activation='relu', input_shape=(784,)))
# Add the second hidden layer
model.add(Dense(50, activation='relu'))
# Add the output layer
model.add(Dense(10, activation='softmax'))
# Compile the model
model.compile(optimizer='adam',
loss='categorical_crossentropy',
metrics=['accuracy'])
# Fit the model
model.fit(X, y, validation_split=0.3)
#------------------------------------------------------------------------------------------------------------------------$
#------------------------------------------------------------------------------------------------------------------------$
#------------------------------------------------------------------------------------------------------------------------$
#------------------------------------------------------------------------------------------------------------------------$
#------------------------------------------------------------------------------------------------------------------------$
#------------------------------------------------------------------------------------------------------------------------$
#------------------------------------------------------------------------------------------------------------------------$ | 33.821053 | 134 | 0.521164 | 643 | 6,426 | 5.049767 | 0.199067 | 0.042501 | 0.040037 | 0.030182 | 0.722821 | 0.716662 | 0.705574 | 0.666461 | 0.649215 | 0.649215 | 0 | 0.018331 | 0.091659 | 6,426 | 190 | 135 | 33.821053 | 0.537948 | 0.496421 | 0 | 0.576271 | 0 | 0 | 0.127559 | 0.045354 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.033898 | 0 | 0.033898 | 0.016949 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
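The effect of the three learning rates swept in the chapter above can be reproduced on a toy quadratic loss without Keras at all; `grad` and `sweep` below are illustrative names, not part of the exercise code.

```python
# Toy illustration of the learning-rate sweep: gradient descent on
# f(w) = (w - 3)^2 instead of a Keras model. A tiny rate barely moves,
# a moderate rate approaches the minimum, and lr = 1 oscillates forever.

def grad(w):
    return 2.0 * (w - 3.0)  # derivative of (w - 3)^2

def sweep(lr_to_test, steps=50, w0=0.0):
    results = {}
    for lr in lr_to_test:
        w = w0
        for _ in range(steps):
            w -= lr * grad(w)
        results[lr] = w
    return results

final = sweep([0.000001, 0.01, 1.0])
```

The qualitative outcome mirrors the course exercise: only the moderate learning rate makes useful progress toward the minimum at w = 3.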
b1413d170e79c1a0471825839941bf4682865429 | 138 | py | Python | google_url/api/urls.py | OSAMAMOHAMED1234/google_api_django | f644d871658354bb92bceabbfb2546b4f25b7c9b | [
"MIT"
] | 3 | 2018-05-02T20:37:11.000Z | 2020-10-15T17:19:26.000Z | google_url/api/urls.py | OSAMAMOHAMED1234/google_api_django | f644d871658354bb92bceabbfb2546b4f25b7c9b | [
"MIT"
] | 1 | 2019-06-10T21:35:13.000Z | 2019-06-10T21:35:13.000Z | google_url/api/urls.py | OSAMAMOHAMED1234/google_api_django | f644d871658354bb92bceabbfb2546b4f25b7c9b | [
"MIT"
] | null | null | null | from django.conf.urls import url
from .views import UrlAPIView
urlpatterns = [
url(r'^$', UrlAPIView.as_view(), name='home_api'),
]
| 17.25 | 54 | 0.695652 | 19 | 138 | 4.947368 | 0.789474 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.152174 | 138 | 7 | 55 | 19.714286 | 0.803419 | 0 | 0 | 0 | 0 | 0 | 0.072464 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.4 | 0 | 0.4 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
b1429175db53f7f31bea6dd47441944524fbe484 | 545 | py | Python | Algo_Ds_Notes-master/Algo_Ds_Notes-master/Centered_Decagonal_Number/Centered_Decagonal_Number.py | rajatenzyme/Coding-Journey- | 65a0570153b7e3393d78352e78fb2111223049f3 | [
"MIT"
] | null | null | null | Algo_Ds_Notes-master/Algo_Ds_Notes-master/Centered_Decagonal_Number/Centered_Decagonal_Number.py | rajatenzyme/Coding-Journey- | 65a0570153b7e3393d78352e78fb2111223049f3 | [
"MIT"
] | null | null | null | Algo_Ds_Notes-master/Algo_Ds_Notes-master/Centered_Decagonal_Number/Centered_Decagonal_Number.py | rajatenzyme/Coding-Journey- | 65a0570153b7e3393d78352e78fb2111223049f3 | [
"MIT"
] | null | null | null | '''
A centered decagonal number is a centered figurate number that represents
a decagon with a dot in the center and all other dots surrounding the center
dot in successive decagonal layers.
The centered decagonal number for n is given by the formula
5n^2+5n+1
'''
def centeredDecagonal(num):
# Using formula
return 5 * num * num + 5 * num + 1
# Driver code
num = int(input())
print(num, "centered decagonal number :", centeredDecagonal(num))
'''
Input:
6
output:
6 centered decagonal number : 211
'''
| 22.708333 | 80 | 0.686239 | 78 | 545 | 4.794872 | 0.551282 | 0.181818 | 0.245989 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.028777 | 0.234862 | 545 | 23 | 81 | 23.695652 | 0.868106 | 0.522936 | 0 | 0 | 0 | 0 | 0.160714 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0 | 0.25 | 0.5 | 0.25 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 3 |
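The closed form in the file above can be cross-checked against a direct layer count (the center dot plus 10k dots in the k-th decagonal ring); `centered_decagonal_by_layers` is an illustrative helper, not part of the original file.

```python
def centered_decagonal(n):
    # Closed form from the file above: 5n^2 + 5n + 1
    return 5 * n * n + 5 * n + 1

def centered_decagonal_by_layers(n):
    # Center dot plus 10k dots in the k-th decagonal ring:
    # 1 + 10 * n(n+1)/2 = 5n^2 + 5n + 1
    return 1 + sum(10 * k for k in range(1, n + 1))

print(centered_decagonal(6))  # 211, matching the sample output above
```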
b152bc0527f60a24e1da938472daf21c9bd4e09b | 495 | py | Python | Functions/RE-HASHTAG.py | niteappantest/test-follower-insta | bad4a0c2f4ee1e1d70acdc9a9bccbac722152d9f | [
"Apache-2.0"
] | 1 | 2021-05-25T14:49:15.000Z | 2021-05-25T14:49:15.000Z | Functions/RE-HASHTAG.py | niteappantest/test-follower-insta | bad4a0c2f4ee1e1d70acdc9a9bccbac722152d9f | [
"Apache-2.0"
] | null | null | null | Functions/RE-HASHTAG.py | niteappantest/test-follower-insta | bad4a0c2f4ee1e1d70acdc9a9bccbac722152d9f | [
"Apache-2.0"
] | null | null | null | [[cyan2]] ____ __ __ __ __
/ __ \___ / / / /___ ______/ /_ / /_____ _____ _
/ /_/ / _ \______/ /_/ / __ `/ ___/ __ \/ __/ __ `/ __ `/
/ _, _/ __/_____/ __ / /_/ (__ ) / / / /_/ /_/ / /_/ /
/_/ |_|\___/ /_/ /_/\__,_/____/_/ /_/\__/\__,_/\__, /
/____/ [[reset]]
[ Note:- [[yellow]]FIRST COMMENT ON THE POST THEN STRAT THE BOT[[reset]] ]
[ -USE AT YOUR OWN RISK- ]
| 45 | 74 | 0.365657 | 19 | 495 | 4.052632 | 0.894737 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003546 | 0.430303 | 495 | 10 | 75 | 49.5 | 0.269504 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
b1564ebbf6b309db5b72320a885652449efef125 | 133 | py | Python | src/dayN.py | chipturner/advent-of-code-2021 | 52d8f84eb9243fa076c9f7c2a2e3836e138ab127 | [
"Apache-2.0"
] | null | null | null | src/dayN.py | chipturner/advent-of-code-2021 | 52d8f84eb9243fa076c9f7c2a2e3836e138ab127 | [
"Apache-2.0"
] | null | null | null | src/dayN.py | chipturner/advent-of-code-2021 | 52d8f84eb9243fa076c9f7c2a2e3836e138ab127 | [
"Apache-2.0"
] | null | null | null | import helpers
import itertools
import collections
def main() -> None:
lines = helpers.read_input()
print(lines)
main()
| 10.230769 | 32 | 0.691729 | 16 | 133 | 5.6875 | 0.6875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.210526 | 133 | 12 | 33 | 11.083333 | 0.866667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0.428571 | 0 | 0.571429 | 0.142857 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 3 |
b16a7743444391dae69b2456825772d7d82f5e2a | 67 | py | Python | Python/game.py | sdjmchattie/PygamePython | f902d064419d00755fd05f0d373ef4440fdee549 | [
"Apache-2.0"
] | null | null | null | Python/game.py | sdjmchattie/PygamePython | f902d064419d00755fd05f0d373ef4440fdee549 | [
"Apache-2.0"
] | null | null | null | Python/game.py | sdjmchattie/PygamePython | f902d064419d00755fd05f0d373ef4440fdee549 | [
"Apache-2.0"
] | null | null | null | import pygame as pg
class Game:
TILE_COLS = 28
TILE_ROWS = 22
| 11.166667 | 19 | 0.701493 | 12 | 67 | 3.75 | 0.916667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.08 | 0.253731 | 67 | 5 | 20 | 13.4 | 0.82 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.25 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 3 |
b17a90328cfdb0508e820bf24f51ef06deba47d7 | 23 | py | Python | tests/__init__.py | MobileCloudNetworking/cdnaas | a4b990bd2af6fcf368678cdcc2eb6b37acc19b4b | [
"Apache-2.0"
] | null | null | null | tests/__init__.py | MobileCloudNetworking/cdnaas | a4b990bd2af6fcf368678cdcc2eb6b37acc19b4b | [
"Apache-2.0"
] | null | null | null | tests/__init__.py | MobileCloudNetworking/cdnaas | a4b990bd2af6fcf368678cdcc2eb6b37acc19b4b | [
"Apache-2.0"
] | null | null | null | __author__ = 'florian'
| 11.5 | 22 | 0.73913 | 2 | 23 | 6.5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.130435 | 23 | 1 | 23 | 23 | 0.65 | 0 | 0 | 0 | 0 | 0 | 0.304348 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
b199667fe398380aaa78f53bff7c7f23c597ff7c | 16,388 | py | Python | src/structlib/standard.py | ModernMAK/Object-Struct-Library | 650b1c141d13ba02e30f071ee395e9d052816d3f | [
"MIT"
] | null | null | null | src/structlib/standard.py | ModernMAK/Object-Struct-Library | 650b1c141d13ba02e30f071ee395e9d052816d3f | [
"MIT"
] | null | null | null | src/structlib/standard.py | ModernMAK/Object-Struct-Library | 650b1c141d13ba02e30f071ee395e9d052816d3f | [
"MIT"
] | null | null | null | import re
from struct import Struct
from typing import Tuple, Iterable, Optional, Union
from .core import StructObj, ByteLayoutFlag
from .types import UnpackResult, UnpackLenResult, BufferStream
from .util import hybridmethod, pack_into, pack_stream, unpack_stream, unpack, unpack_stream_with_len, unpack_with_len, unpack_from, unpack_from_with_len, iter_unpack
STANDARD_BOSA_MARKS = r"@=<>!" # Byte Order, Size, Alignment
STANDARD_FMT_MARKS = r"xcbB?hHiIlLqQnNefdspP"
# Functions for common functionality on builtin structs
__struct_regex = re.compile(rf"([0-9]*)([{STANDARD_FMT_MARKS}])") # 'x' is excluded because it is padding
def _count_args(fmt: str) -> int:
count = 0
pos = 0
while pos < len(fmt):
match = __struct_regex.search(fmt, pos)
if match is None:
break
else:
repeat = match.group(1)
code = match.group(2)
            if code == "x":
                pass  # padding consumes no arguments
            elif code == "s":
                count += 1
            else:
                count += int(repeat) if repeat else 1
pos = match.span()[1]
return count
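The counting logic can be exercised standalone; the sketch below duplicates the regex and treats "x" padding as consuming no arguments, as the comment above intends (all names here are illustrative).

```python
import re

FMT_MARKS = "xcbB?hHiIlLqQnNefdspP"
pattern = re.compile(rf"([0-9]*)([{FMT_MARKS}])")

def count_args(fmt: str) -> int:
    count = 0
    pos = 0
    while pos < len(fmt):
        match = pattern.search(fmt, pos)
        if match is None:
            break
        repeat, code = match.group(1), match.group(2)
        if code == "x":
            pass  # padding consumes no arguments
        elif code == "s":
            count += 1  # a length-prefixed byte string is one argument
        else:
            count += int(repeat) if repeat else 1
        pos = match.span()[1]
    return count

print(count_args("3i4sf"))  # 3 ints + one 4-byte string + one float = 5
```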
class StandardStruct(StructObj):
"""
A representation of a standard struct.Struct
"""
__DEF_FLAG = ByteLayoutFlag.NativeSize | ByteLayoutFlag.NativeEndian | ByteLayoutFlag.NativeAlignment
__BLM_FLAG_MAP = {
"@": ByteLayoutFlag.NativeSize | ByteLayoutFlag.NativeEndian | ByteLayoutFlag.NativeAlignment,
"=": ByteLayoutFlag.StandardSize | ByteLayoutFlag.NativeEndian | ByteLayoutFlag.NoAlignment,
"<": ByteLayoutFlag.StandardSize | ByteLayoutFlag.LittleEndian | ByteLayoutFlag.NoAlignment,
">": ByteLayoutFlag.StandardSize | ByteLayoutFlag.BigEndian | ByteLayoutFlag.NoAlignment,
"!": ByteLayoutFlag.StandardSize | ByteLayoutFlag.NetworkEndian | ByteLayoutFlag.NoAlignment,
}
def __init__(self, repeat: int, code: str, repeat_size: int = None, byte_layout_mark: str = None):
self.__repeat = repeat
fmt_str = f"{repeat if repeat > 1 else ''}{code}"
if repeat_size: # used for special case: string, where repeat is used for string length
fmt_str = " ".join(fmt_str for _ in range(repeat_size))
if byte_layout_mark:
fmt_str = byte_layout_mark + fmt_str
self.__layout = Struct(fmt_str)
self.__flags = self.__BLM_FLAG_MAP[byte_layout_mark] if byte_layout_mark else None
@hybridmethod
@property
def byte_flags(self) -> None:
return None
@byte_flags.instancemethod
@property
def byte_flags(self) -> Optional[ByteLayoutFlag]:
"""
The flags this structure was created with, if None, no flags were specified.
:return: Flags if the struct specified them, None otherwise.
"""
return self.__flags
# noinspection PyPropertyDefinition
@classmethod
@property
def DEFAULT_LAYOUT(cls) -> Struct:
raise NotImplementedError
# noinspection PyPropertyDefinition
@classmethod
@property
def DEFAULT_ARGS(cls) -> int:
return 1
@hybridmethod
@property
def format(self) -> str:
return self.DEFAULT_LAYOUT.format
@format.instancemethod
@property
def format(self) -> str:
return self.__layout.format
@hybridmethod
@property
def fixed_size(self) -> int:
return self.DEFAULT_LAYOUT.size
@fixed_size.instancemethod
@property
def fixed_size(self) -> int:
return self.__layout.size
@hybridmethod
@property
def is_var_size(self) -> bool:
return False
@is_var_size.instancemethod
@property
def is_var_size(self) -> bool:
return False
@hybridmethod
@property
def args(self) -> int:
return self.DEFAULT_ARGS
@args.instancemethod
@property
def args(self) -> int:
return self.__repeat
@hybridmethod
def pack(self, *args) -> bytes:
return self.DEFAULT_LAYOUT.pack(*args)
@pack.instancemethod
def pack(self, *args) -> bytes:
return self.__layout.pack(*args)
@hybridmethod
def pack_into(cls, buffer, *args, offset: int = None) -> int:
return pack_into(cls.DEFAULT_LAYOUT, buffer, *args, offset=offset)
@pack_into.instancemethod
def pack_into(self, buffer, *args, offset: int = None) -> int:
return pack_into(self.__layout, buffer, *args, offset=offset)
@hybridmethod
def pack_stream(self, buffer: BufferStream, *args) -> int:
return pack_stream(self.DEFAULT_LAYOUT, buffer, *args)
@pack_stream.instancemethod
def pack_stream(self, buffer: BufferStream, *args) -> int:
return pack_stream(self.__layout, buffer, *args)
@hybridmethod
def unpack(self, buffer) -> UnpackResult:
return unpack(self.DEFAULT_LAYOUT, buffer)
@unpack.instancemethod
def unpack(self, buffer) -> UnpackResult:
return unpack(self.__layout, buffer)
@hybridmethod
def unpack_with_len(self, buffer) -> UnpackLenResult:
return unpack_with_len(self.DEFAULT_LAYOUT, buffer)
@unpack_with_len.instancemethod
def unpack_with_len(self, buffer) -> UnpackLenResult:
return unpack_with_len(self.__layout, buffer)
@hybridmethod
def unpack_from(self, buffer, offset: int = 0) -> UnpackResult:
return unpack_from(self.DEFAULT_LAYOUT, buffer, offset)
@unpack_from.instancemethod
def unpack_from(self, buffer, offset: int = 0) -> UnpackResult:
return unpack_from(self.__layout, buffer, offset)
@hybridmethod
def unpack_from_with_len(self, buffer, offset: int = 0) -> UnpackLenResult:
return unpack_from_with_len(self.DEFAULT_LAYOUT, buffer, offset)
@unpack_from_with_len.instancemethod
def unpack_from_with_len(self, buffer, offset: int = 0) -> UnpackLenResult:
return unpack_from_with_len(self.__layout, buffer, offset)
@hybridmethod
    def unpack_stream(cls, buffer) -> UnpackResult:
        return unpack_stream(cls.DEFAULT_LAYOUT, buffer)
    @unpack_stream.instancemethod
    def unpack_stream(self, buffer) -> UnpackResult:
        return unpack_stream(self.__layout, buffer)
@hybridmethod
def iter_unpack(self, buffer) -> Iterable[Tuple]:
return iter_unpack(self.DEFAULT_LAYOUT, buffer)
@iter_unpack.instancemethod
def iter_unpack(self, buffer) -> Iterable[Tuple]:
return iter_unpack(self.__layout, buffer)
@hybridmethod
def unpack_stream_with_len(self, buffer) -> UnpackLenResult:
return unpack_stream_with_len(self.DEFAULT_LAYOUT, buffer)
@unpack_stream_with_len.instancemethod
def unpack_stream_with_len(self, buffer) -> UnpackLenResult:
return unpack_stream_with_len(self.__layout, buffer)
class StructWrapper(StructObj):
def __init__(self, s: Union[str, Struct]):
if isinstance(s, str):
s = Struct(s)
self.__layout = s
self.__args = _count_args(s.format)
@property
def fixed_size(self) -> int:
return self.__layout.size
@property
    def args(self) -> int:
        return self.__args
@property
def is_var_size(self) -> bool:
return False
def pack(self, *args) -> bytes:
return self.__layout.pack(*args)
def pack_into(self, buffer, *args, offset: int = 0) -> int:
return pack_into(self.__layout, buffer, *args, offset=offset)
def pack_stream(self, buffer: BufferStream, *args) -> int:
return pack_stream(self.__layout, buffer, *args)
def unpack(self, buffer) -> UnpackResult:
return unpack(self.__layout, buffer)
def unpack_with_len(self, buffer) -> UnpackLenResult:
return unpack_with_len(self.__layout, buffer)
def unpack_from(self, buffer, offset: int = 0) -> UnpackResult:
return unpack_from(self.__layout, buffer, offset)
def unpack_from_with_len(self, buffer, offset: int = 0) -> UnpackLenResult:
return unpack_from_with_len(self.__layout, buffer, offset)
def iter_unpack(self, buffer) -> Iterable[Tuple]:
return iter_unpack(self.__layout, buffer)
def unpack_stream(self, buffer: BufferStream) -> UnpackResult:
return unpack_stream(self.__layout, buffer)
def unpack_stream_with_len(self, buffer) -> UnpackLenResult:
return unpack_stream_with_len(self.__layout, buffer)
@property
def format(self) -> str:
return self.__layout.format
class Padding(StandardStruct):
"""
Padding Byte(s)
For convenience, the class inherently represents a single pad byte.
"""
DEFAULT_CODE = "x"
__DEFAULT_LAYOUT = Struct("x")
# noinspection PyPropertyDefinition
@classmethod
@property
def DEFAULT_LAYOUT(cls) -> Struct:
return cls.__DEFAULT_LAYOUT
def __init__(self, repeat: int = 1, byte_layout_mark: str = None):
        super().__init__(repeat, "x", byte_layout_mark=byte_layout_mark)
@hybridmethod
@property
def args(self) -> int:
return 0
@args.instancemethod
@property
def args(self) -> int:
return 0
class Char(StandardStruct):
DEFAULT_CODE = "c"
__DEFAULT_LAYOUT = Struct("c")
# noinspection PyPropertyDefinition
@classmethod
@property
def DEFAULT_LAYOUT(cls) -> Struct:
return cls.__DEFAULT_LAYOUT
def __init__(self, repeat: int = 1, byte_layout_mark: str = None):
        super().__init__(repeat, "c", byte_layout_mark=byte_layout_mark)
class Int8(StandardStruct):
DEFAULT_CODE = "b"
__DEFAULT_LAYOUT = Struct("b")
# noinspection PyPropertyDefinition
@classmethod
@property
def DEFAULT_LAYOUT(cls) -> Struct:
return cls.__DEFAULT_LAYOUT
def __init__(self, repeat: int = 1, byte_layout_mark: str = None):
        super().__init__(repeat, "b", byte_layout_mark=byte_layout_mark)
class UInt8(StandardStruct):
DEFAULT_CODE = "B"
__DEFAULT_LAYOUT = Struct("B")
# noinspection PyPropertyDefinition
@classmethod
@property
def DEFAULT_LAYOUT(cls) -> Struct:
return cls.__DEFAULT_LAYOUT
def __init__(self, repeat: int = 1, byte_layout_mark: str = None):
        super().__init__(repeat, "B", byte_layout_mark=byte_layout_mark)
class Boolean(StandardStruct):
DEFAULT_CODE = "?"
__DEFAULT_LAYOUT = Struct("?")
# noinspection PyPropertyDefinition
@classmethod
@property
def DEFAULT_LAYOUT(cls) -> Struct:
return cls.__DEFAULT_LAYOUT
def __init__(self, repeat: int = 1, byte_layout_mark: str = None):
        super().__init__(repeat, "?", byte_layout_mark=byte_layout_mark)
class Int16(StandardStruct):
DEFAULT_CODE = "h"
__DEFAULT_LAYOUT = Struct("h")
# noinspection PyPropertyDefinition
@classmethod
@property
def DEFAULT_LAYOUT(cls) -> Struct:
return cls.__DEFAULT_LAYOUT
def __init__(self, repeat: int = 1, byte_layout_mark: str = None):
        super().__init__(repeat, "h", byte_layout_mark=byte_layout_mark)
class UInt16(StandardStruct):
DEFAULT_CODE = "H"
__DEFAULT_LAYOUT = Struct("H")
# noinspection PyPropertyDefinition
@classmethod
@property
def DEFAULT_LAYOUT(cls) -> Struct:
return cls.__DEFAULT_LAYOUT
def __init__(self, repeat: int = 1, byte_layout_mark: str = None):
        super().__init__(repeat, "H", byte_layout_mark=byte_layout_mark)
class Int32(StandardStruct):
DEFAULT_CODE = "i" # l
__DEFAULT_LAYOUT = Struct("i")
# noinspection PyPropertyDefinition
@classmethod
@property
def DEFAULT_LAYOUT(cls) -> Struct:
return cls.__DEFAULT_LAYOUT
def __init__(self, repeat: int = 1, byte_layout_mark: str = None):
        super().__init__(repeat, "i", byte_layout_mark=byte_layout_mark)
class UInt32(StandardStruct):
DEFAULT_CODE = "I" # L
__DEFAULT_LAYOUT = Struct("I")
# noinspection PyPropertyDefinition
@classmethod
@property
def DEFAULT_LAYOUT(cls) -> Struct:
return cls.__DEFAULT_LAYOUT
def __init__(self, repeat: int = 1, byte_layout_mark: str = None):
        super().__init__(repeat, "I", byte_layout_mark=byte_layout_mark)
class Int64(StandardStruct):
DEFAULT_CODE = "q" # l
__DEFAULT_LAYOUT = Struct("q")
# noinspection PyPropertyDefinition
@classmethod
@property
def DEFAULT_LAYOUT(cls) -> Struct:
return cls.__DEFAULT_LAYOUT
def __init__(self, repeat: int = 1, byte_layout_mark: str = None):
        super().__init__(repeat, "q", byte_layout_mark=byte_layout_mark)
class UInt64(StandardStruct):
DEFAULT_CODE = "Q" # L
__DEFAULT_LAYOUT = Struct("Q")
# noinspection PyPropertyDefinition
@classmethod
@property
def DEFAULT_LAYOUT(cls) -> Struct:
return cls.__DEFAULT_LAYOUT
def __init__(self, repeat: int = 1, byte_layout_mark: str = None):
        super().__init__(repeat, "Q", byte_layout_mark=byte_layout_mark)
# C Size-Type
class SSizeT(StandardStruct):
DEFAULT_CODE = "n"
__DEFAULT_LAYOUT = Struct("n")
# noinspection PyPropertyDefinition
@classmethod
@property
def DEFAULT_LAYOUT(cls) -> Struct:
return cls.__DEFAULT_LAYOUT
def __init__(self, repeat: int = 1, byte_layout_mark: str = None):
        super().__init__(repeat, "n", byte_layout_mark=byte_layout_mark)
class SizeT(StandardStruct):
DEFAULT_CODE = "N"
__DEFAULT_LAYOUT = Struct("N")
# noinspection PyPropertyDefinition
@classmethod
@property
def DEFAULT_LAYOUT(cls) -> Struct:
return cls.__DEFAULT_LAYOUT
def __init__(self, repeat: int = 1, byte_layout_mark: str = None):
        super().__init__(repeat, "N", byte_layout_mark=byte_layout_mark)
class Float16(StandardStruct):
DEFAULT_CODE = "e"
__DEFAULT_LAYOUT = Struct("e")
# noinspection PyPropertyDefinition
@classmethod
@property
def DEFAULT_LAYOUT(cls) -> Struct:
return cls.__DEFAULT_LAYOUT
def __init__(self, repeat: int = 1, byte_layout_mark: str = None):
        super().__init__(repeat, "e", byte_layout_mark=byte_layout_mark)
class Float32(StandardStruct):
DEFAULT_CODE = "f"
__DEFAULT_LAYOUT = Struct("f")
# noinspection PyPropertyDefinition
@classmethod
@property
def DEFAULT_LAYOUT(cls) -> Struct:
return cls.__DEFAULT_LAYOUT
def __init__(self, repeat: int = 1, byte_layout_mark: str = None):
        super().__init__(repeat, "f", byte_layout_mark=byte_layout_mark)
class Float64(StandardStruct):
DEFAULT_CODE = "d"
__DEFAULT_LAYOUT = Struct("d")
# noinspection PyPropertyDefinition
@classmethod
@property
def DEFAULT_LAYOUT(cls) -> Struct:
return cls.__DEFAULT_LAYOUT
def __init__(self, repeat: int = 1, byte_layout_mark: str = None):
        super().__init__(repeat, "d", byte_layout_mark=byte_layout_mark)
class Bytes(StandardStruct):
DEFAULT_CODE = "s"
__DEFAULT_LAYOUT = Struct("s")
# noinspection PyPropertyDefinition
@classmethod
@property
def DEFAULT_LAYOUT(cls) -> Struct:
return cls.__DEFAULT_LAYOUT
def __init__(self, repeat: int = 1, size: int = None):
super().__init__(repeat, "s", size)
class FixedPascalString(StandardStruct):
DEFAULT_CODE = "p"
__DEFAULT_LAYOUT = Struct("p")
# noinspection PyPropertyDefinition
@classmethod
@property
def DEFAULT_LAYOUT(cls) -> Struct:
return cls.__DEFAULT_LAYOUT
def __init__(self, repeat: int = 1, size: int = None, byte_layout_mark: str = None):
super().__init__(repeat, "p", size, byte_layout_mark=byte_layout_mark)
class CPointer(StandardStruct):
DEFAULT_CODE = "P"
__DEFAULT_LAYOUT = Struct("P")
# noinspection PyPropertyDefinition
@classmethod
@property
def DEFAULT_LAYOUT(cls) -> Struct:
return cls.__DEFAULT_LAYOUT
def __init__(self, repeat: int = 1, byte_layout_mark: str = None):
super().__init__(repeat, "P", byte_layout_mark=byte_layout_mark)
struct_code2class = {}
for c in [Padding, Char, Int8, UInt8, Bytes, Boolean, Int16, UInt16, Int32, UInt32, Int64, UInt64, SSizeT, SizeT, Float16, Float32, Float64, FixedPascalString, CPointer]:
struct_code2class[c.DEFAULT_CODE] = c
# Struct allows l/L to substitute for Int32
struct_code2class["l"] = Int32
struct_code2class["L"] = UInt32
# ALIASES
# Int / Long can also be known as Long / LongLong; I'm going by C# keywords, but if there is any ambiguity, the underlying types are still available
Byte, SByte, Short, UShort, Int, UInt, Long, ULong, Half, Float, Double = UInt8, Int8, Int16, UInt16, Int32, UInt32, Int64, UInt64, Float16, Float32, Float64
| 28.550523 | 170 | 0.677447 | 1,899 | 16,388 | 5.510269 | 0.105845 | 0.086965 | 0.057531 | 0.102351 | 0.711391 | 0.656919 | 0.620222 | 0.584002 | 0.55581 | 0.518922 | 0 | 0.00856 | 0.223029 | 16,388 | 573 | 171 | 28.600349 | 0.813241 | 0.084818 | 0 | 0.533333 | 0 | 0 | 0.010743 | 0.003558 | 0 | 0 | 0 | 0 | 0 | 1 | 0.237333 | false | 0 | 0.016 | 0.173333 | 0.594667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
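For reference, the one-character codes these classes wrap are the standard `struct` format codes; a quick sketch of packing at standard sizes with the "<" byte-layout mark from the flag map above:

```python
import struct

# '<' requests little-endian standard sizes, matching the '<' entry
# in StandardStruct's byte-layout flag map.
packed = struct.pack("<hIf", -2, 300, 1.5)
assert len(packed) == 2 + 4 + 4  # h = 2 bytes, I = 4, f = 4

h, i, f = struct.unpack("<hIf", packed)
print(h, i, f)  # -2 300 1.5 (1.5 is exactly representable in float32)
```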
b19b346b731f41def05624b6f3d6a0f17f76c37c | 36,252 | py | Python | src/ivac/linear.py | chatipat/ivac | c4673a8b5e425bc1841415763190996794e48a1e | [
"MIT"
] | 1 | 2021-02-05T16:22:16.000Z | 2021-02-05T16:22:16.000Z | src/ivac/linear.py | chatipat/ivac | c4673a8b5e425bc1841415763190996794e48a1e | [
"MIT"
] | null | null | null | src/ivac/linear.py | chatipat/ivac | c4673a8b5e425bc1841415763190996794e48a1e | [
"MIT"
] | null | null | null | import numba as nb
import numpy as np
import warnings
from scipy import optimize
from .utils import (
preprocess_trajs,
get_nfeatures,
trajs_matmul,
symeig,
solve_stationary,
compute_ic,
compute_c0,
batch_compute_ic,
batch_compute_c0,
is_cutlag,
)
# -----------------------------------------------------------------------------
# linear VAC and IVAC
class LinearVAC:
r"""Solve linear VAC at a given lag time.
Linear VAC solves the equation
.. math::
C(\tau) v_i = \lambda_i C(0) v_i
for eigenvalues :math:`\lambda_i`
and eigenvector coefficients :math:`v_i`.
The correlation matrices are given by
.. math::
C_{ij}(\tau) = E[\phi_i(x_t) \phi_j(x_{t+\tau})]
C_{ij}(0) = E[\phi_i(x_t) \phi_j(x_t)]
where :math:`\phi_i` are the input features
and :math:`\tau` is the lag time parameter.
This implementation assumes that the constant feature can be
represented by a linear combination of the other features.
If this is not the case, addones=True will augment the input
features with the constant feature.
Parameters
----------
lag : int
Lag time, in units of frames.
nevecs : int, optional
Number of eigenvectors (including the trivial eigenvector)
to compute.
If None, use the maximum possible number of eigenvectors
(n_features).
addones : bool, optional
If True, add a feature of ones before solving VAC.
This increases n_features by 1.
This should only be set to True if the constant feature
is not contained within the span of the input features.
reweight : bool, optional
If True, reweight trajectories to equilibrium.
adjust : bool, optional
If True, adjust :math:`C(0)` to ensure that the trivial
eigenvector is exactly solved.
Attributes
----------
lag : int
VAC lag time in units of frames.
evals : (n_evecs,) ndarray
VAC eigenvalues in decreasing order.
This includes the trivial eigenvalue.
its : (n_evecs,) ndarray
Implied timescales corresponding to the eigenvalues,
in units of frames.
evecs : (n_features, n_evecs) ndarray
Coefficients of the VAC eigenvectors
corresponding to the eigenvalues.
cov : (n_features, n_features) ndarray
Covariance matrix of the fitted data.
trajs : list of (n_frames[i], n_features) ndarray
List of featurized trajectories used to solve VAC.
weights : list of (n_frames[i],) ndarray
Equilibrium weight of trajectories starting at each configuration.
"""
def __init__(
self,
lag,
nevecs=None,
addones=False,
reweight=False,
adjust=True,
):
self.lag = lag
self.nevecs = nevecs
self.addones = addones
self.reweight = reweight
self.adjust = adjust
self._isfit = False
def fit(self, trajs, weights=None):
"""Compute VAC results from input trajectories.
Calculate and store VAC eigenvalues, eigenvector coefficients,
and implied timescales from the input trajectories.
Parameters
----------
trajs : list of (n_frames[i], n_features) ndarray
List of featurized trajectories.
weights : int or list of (n_frames[i],) ndarray, optional
If int, the number of frames to drop from the end of each
trajectory, which must be greater than or equal to the VAC
lag time. This is equivalent to passing a list of uniform
weights but with the last int frames having zero weight.
If a list of ndarray, the weight of the trajectory starting
at each configuration. Note that the last frames of each
trajectory must have zero weight. This number of ending
frames with zero weight must be at least the VAC lag time.
"""
trajs = preprocess_trajs(trajs, addones=self.addones)
if self.reweight:
if weights is None:
weights = _ivac_weights(trajs, self.lag)
else:
if weights is not None:
raise ValueError("weights provided but not reweighting")
c0, evals, evecs = _solve_ivac(
trajs,
self.lag,
weights=weights,
adjust=self.adjust,
)
its = _vac_its(evals, self.lag)
self._set_fit_data(c0, evals, evecs, its, trajs, weights)
def transform(self, trajs):
"""Compute VAC eigenvectors on the input trajectories.
Use the fitted VAC eigenvector coefficients to calculate
the values of the VAC eigenvectors on the input trajectories.
Parameters
----------
trajs : list of (traj_len[i], n_features) ndarray
List of featurized trajectories.
Returns
-------
list of (traj_len[i], n_evecs) ndarray
VAC eigenvectors at each frame of the input trajectories.
"""
trajs = preprocess_trajs(trajs, addones=self.addones)
return trajs_matmul(trajs, self.evecs[:, : self.nevecs])
def _set_fit_data(self, cov, evals, evecs, its, trajs, weights):
"""Set fields computed by the fit method."""
self._isfit = True
self._cov = cov
self._evals = evals
self._evecs = evecs
self._its = its
self._trajs = trajs
self._weights = weights
@property
def cov(self):
if self._isfit:
return self._cov
raise ValueError("object has not been fit to data")
@property
def evals(self):
if self._isfit:
return self._evals
raise ValueError("object has not been fit to data")
@property
def evecs(self):
if self._isfit:
return self._evecs
raise ValueError("object has not been fit to data")
@property
def its(self):
if self._isfit:
return self._its
raise ValueError("object has not been fit to data")
@property
def trajs(self):
if self._isfit:
return self._trajs
raise ValueError("object has not been fit to data")
@property
def weights(self):
if self._isfit:
return self._weights
raise ValueError("object has not been fit to data")
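# The generalized eigenproblem in the docstring, C(tau) v = lambda_i C(0) v,
# can be sketched with plain NumPy/SciPy on a synthetic two-feature
# trajectory. This is an illustrative sketch, not the class's internal code
# path: the AR(1) process, the feature choice, and the centering step are
# assumptions made only for this example.

```python
import numpy as np
from scipy.linalg import eigh

# synthetic one-dimensional AR(1) trajectory (assumed for illustration)
rng = np.random.default_rng(0)
n, lag = 5000, 10
x = np.empty(n)
x[0] = rng.normal()
for t in range(1, n):
    x[t] = 0.95 * x[t - 1] + np.sqrt(1 - 0.95**2) * rng.normal()

phi = np.column_stack([x, x**2])  # features phi_i(x_t)
phi -= phi.mean(axis=0)           # center, so the constant mode drops out

c0 = phi.T @ phi / n                       # C(0)
ct = phi[:-lag].T @ phi[lag:] / (n - lag)  # C(tau)
ct = 0.5 * (ct + ct.T)                     # symmetrize the estimate

evals, _ = eigh(ct, c0)   # solves C(tau) v = lambda C(0) v
evals = evals[::-1]       # decreasing order, as in the `evals` attribute
its = -lag / np.log(evals[(evals > 0) & (evals < 1)])  # implied timescales
```

# With the slow AR(1) mode above, the leading nontrivial eigenvalue lands
# near 0.95**lag and its implied timescale near -lag / log(0.95**lag).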
class LinearIVAC:
r"""Solve linear IVAC for a given range of lag times.
Linear IVAC solves the equation
.. math::
\sum_\tau C(\tau) v_i = \lambda_i C(0) v_i
for eigenvalues :math:`\lambda_i`
and eigenvector coefficients :math:`v_i`.
The covariance matrices are given by
.. math::
C_{ij}(\tau) = E[\phi_i(x_t) \phi_j(x_{t+\tau})]
C_{ij}(0) = E[\phi_i(x_t) \phi_j(x_t)]
where :math:`\phi_i` are the input features
and :math:`\tau` is the lag time parameter.
This implementation assumes that the constant feature can be
represented by a linear combination of the other features.
If this is not the case, addones=True will augment the input
features with the constant feature.
Parameters
----------
minlag : int
Minimum lag time in units of frames.
maxlag : int
Maximum lag time (inclusive) in units of frames.
If minlag == maxlag, this is equivalent to VAC.
lagstep : int, optional
Number of frames between each lag time.
This must evenly divide maxlag - minlag.
The integrated covariance matrix is computed using lag times
(minlag, minlag + lagstep, ..., maxlag)
nevecs : int, optional
Number of eigenvectors (including the trivial eigenvector)
to compute.
If None, use the maximum possible number of eigenvectors
(n_features).
addones : bool, optional
If True, add a feature of ones before solving VAC.
This increases n_features by 1.
reweight : bool, optional
If True, reweight trajectories to equilibrium.
adjust : bool, optional
If True, adjust :math:`C(0)` to ensure that the trivial
eigenvector is exactly solved.
method : str, optional
Method to compute the integrated covariance matrix.
Currently, 'direct' and 'fft' are supported.
Both 'direct' and 'fft' integrate features over lag times before
computing the correlation matrix.
Method 'direct' does so by summing the time-lagged features.
Its runtime increases linearly with the number of lag times.
Method 'fft' does so by performing an FFT convolution.
It takes around the same amount of time to run regardless
of the number of lag times, and is faster than 'direct' when
there are more than around 100 lag times.
Attributes
----------
minlag : int
Minimum IVAC lag time in units of frames.
maxlag : int
Maximum IVAC lag time in units of frames.
lagstep : int
Interval between IVAC lag times, in units of frames.
evals : (n_evecs,) ndarray
IVAC eigenvalues in decreasing order.
This includes the trivial eigenvalue.
its : (n_evecs,) ndarray
Implied timescales corresponding to the eigenvalues,
in units of frames.
evecs : (n_features, n_evecs) ndarray
Coefficients of the IVAC eigenvectors
corresponding to the eigenvalues.
cov : (n_features, n_features) ndarray
Covariance matrix of the fitted data.
trajs : list of (n_frames[i], n_features) ndarray
List of featurized trajectories used to solve IVAC.
weights : list of (n_frames[i],) ndarray
Equilibrium weight of trajectories starting at each configuration.
"""
def __init__(
self,
minlag,
maxlag,
lagstep=1,
nevecs=None,
addones=False,
reweight=False,
adjust=True,
method="fft",
):
if minlag > maxlag:
raise ValueError("minlag must be less than or equal to maxlag")
if (maxlag - minlag) % lagstep != 0:
raise ValueError("lag time interval must be a multiple of lagstep")
if method not in ["direct", "fft"]:
raise ValueError("method must be 'direct' or 'fft'")
self.minlag = minlag
self.maxlag = maxlag
self.lagstep = lagstep
self.lags = np.arange(self.minlag, self.maxlag + 1, self.lagstep)
self.nevecs = nevecs
self.addones = addones
self.reweight = reweight
self.adjust = adjust
self.method = method
self._isfit = False
def fit(self, trajs, weights=None):
"""Compute IVAC results from input trajectories.
Calculate and store IVAC eigenvalues, eigenvector coefficients,
and implied timescales from the input trajectories.
Parameters
----------
trajs : list of (n_frames[i], n_features) ndarray
List of featurized trajectories.
weights : int or list of (n_frames[i],) ndarray, optional
If int, the number of frames to drop from the end of each
trajectory, which must be greater than or equal to the
maximum IVAC lag time. This is equivalent to passing a list
of uniform weights but with the last int frames having zero
weight.
If a list of ndarray, the weight of the trajectory starting
at each configuration. Note that the last frames of each
trajectory must have zero weight. This number of ending
frames with zero weight must be at least the maximum IVAC
lag time.
"""
trajs = preprocess_trajs(trajs, addones=self.addones)
if self.reweight:
if weights is None:
weights = _ivac_weights(trajs, self.lags, method=self.method)
else:
if weights is not None:
raise ValueError("weights provided but not reweighting")
c0, evals, evecs = _solve_ivac(
trajs,
self.lags,
weights=weights,
adjust=self.adjust,
method=self.method,
)
its = _ivac_its(evals, self.minlag, self.maxlag, self.lagstep)
self._set_fit_data(c0, evals, evecs, its, trajs, weights)
def transform(self, trajs):
"""Compute IVAC eigenvectors on the input trajectories.
Use the fitted IVAC eigenvector coefficients to calculate
the values of the IVAC eigenvectors on the input trajectories.
Parameters
----------
trajs : list of (traj_len[i], n_features) ndarray
List of featurized trajectories.
Returns
-------
list of (traj_len[i], n_evecs) ndarray
IVAC eigenvectors at each frame of the input trajectories.
"""
trajs = preprocess_trajs(trajs, addones=self.addones)
return trajs_matmul(trajs, self.evecs[:, : self.nevecs])
def _set_fit_data(self, cov, evals, evecs, its, trajs, weights):
"""Set fields computed by the fit method."""
self._isfit = True
self._cov = cov
self._evals = evals
self._evecs = evecs
self._its = its
self._trajs = trajs
self._weights = weights
@property
def cov(self):
if self._isfit:
return self._cov
raise ValueError("object has not been fit to data")
@property
def evals(self):
if self._isfit:
return self._evals
raise ValueError("object has not been fit to data")
@property
def evecs(self):
if self._isfit:
return self._evecs
raise ValueError("object has not been fit to data")
@property
def its(self):
if self._isfit:
return self._its
raise ValueError("object has not been fit to data")
@property
def trajs(self):
if self._isfit:
return self._trajs
raise ValueError("object has not been fit to data")
@property
def weights(self):
if self._isfit:
return self._weights
raise ValueError("object has not been fit to data")
def _solve_ivac(
trajs,
lags,
*,
weights=None,
adjust=True,
method="fft",
):
"""Solve IVAC with the given parameters.
Parameters
----------
trajs : list of (n_frames[i], n_features) ndarray
List of featurized trajectories.
lags : int or 1d array-like of int
VAC lag time or IVAC lag times, in units of frames.
For IVAC, this should be a list of lag times that will be used,
not the 2 or 3 values specifying the range.
weights : int or list of (n_frames[i],) ndarray, optional
If int, the number of frames to drop from the end of each
trajectory, which must be greater than or equal to the maximum
IVAC lag time. This is equivalent to passing a list of uniform
weights but with the last int frames having zero weight.
If a list of ndarray, the weight of the trajectory starting at
each configuration. Note that the last frames of each trajectory
must have zero weight. This number of ending frames with zero
weight must be at least the maximum IVAC lag time.
adjust : bool, optional
If True, adjust :math:`C(0)` to ensure that the trivial
eigenvector is exactly solved.
method : str, optional
Method to compute the integrated covariance matrix.
Currently, 'direct' and 'fft' are supported.
Both 'direct' and 'fft' integrate features over lag times before
computing the correlation matrix.
Method 'direct' does so by summing the time-lagged features.
Its runtime increases linearly with the number of lag times.
Method 'fft' does so by performing an FFT convolution.
It takes around the same amount of time to run regardless
of the number of lag times, and is faster than 'direct' when
there are more than around 100 lag times.
"""
ic = compute_ic(trajs, lags, weights=weights, method=method)
if adjust:
c0 = compute_c0(trajs, lags=lags, weights=weights, method=method)
else:
c0 = compute_c0(trajs, weights=weights, method=method)
evals, evecs = symeig(ic, c0)
return c0, evals, evecs
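# On a system where the correlation matrices are known exactly, the
# integrated eigenproblem solved above can be checked in closed form. The
# sketch below is a toy with assumed parameters, not this module's code
# path: a reversible two-state Markov chain with indicator features, for
# which C(tau) = diag(pi) P**tau and the slow IVAC eigenvalue is the sum of
# the chain's second transition eigenvalue over the lag window.

```python
import numpy as np
from scipy.linalg import eigh

# reversible two-state Markov chain (transition rates assumed)
a, b = 0.1, 0.2
P = np.array([[1 - a, a], [b, 1 - b]])
pi = np.array([b, a]) / (a + b)   # stationary distribution
lags = np.arange(5, 16)           # IVAC lag window: 5, 6, ..., 15

D = np.diag(pi)
ic = sum(D @ np.linalg.matrix_power(P, int(k)) for k in lags)  # sum_tau C(tau)
c0 = D                            # C(0) for indicator features

evals, _ = eigh(ic, c0)           # sum_tau C(tau) v = lambda C(0) v
lam2 = 1 - a - b                  # second eigenvalue of P
expected = sum(lam2 ** int(k) for k in lags)
```

# The trivial eigenvector picks up sum_tau 1 = len(lags), and the slow
# eigenvector picks up the geometric sum of lam2 over the window.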
# -----------------------------------------------------------------------------
# linear VAC and IVAC scans
class LinearVACScan:
"""Solve linear VAC at each given lag time.
This class provides a more optimized way of solving linear VAC at a
set of lag times with the same input trajectories. The code
.. code-block:: python
scan = LinearVACScan(lags)
vac = scan[lags[i]]
is equivalent to
.. code-block:: python
vac = LinearVAC(lags[i])
Parameters
----------
lags : 1d array-like of int
Lag times, in units of frames.
nevecs : int, optional
Number of eigenvectors (including the trivial eigenvector)
to compute.
If None, use the maximum possible number of eigenvectors
(n_features).
addones : bool, optional
If True, add a feature of ones before solving VAC.
This increases n_features by 1.
This should only be set to True if the constant feature
is not contained within the span of the input features.
reweight : bool, optional
If True, reweight trajectories to equilibrium.
adjust : bool, optional
If True, adjust :math:`C(0)` to ensure that the trivial
eigenvector is exactly solved.
method : str, optional
Method used to compute the time lagged covariance matrices.
Currently supported methods are 'direct',
which computes each time lagged covariance matrix separately,
and 'fft-all', which computes all time-lagged correlation
matrices at once by convolving each pair of features.
The runtime of 'fft-all' is almost independent of the number
of lag times, and is faster than 'direct' when scanning a
large number of lag times.
Attributes
----------
lags : 1d array-like of int
VAC lag times, in units of frames.
cov : (n_features, n_features) ndarray
Covariance matrix of the fitted data.
"""
def __init__(
self,
lags,
nevecs=None,
addones=False,
reweight=False,
adjust=True,
method="direct",
):
maxlag = np.max(lags)
if method not in ["direct", "fft-all"]:
raise ValueError("method must be 'direct' or 'fft-all'")
self.lags = lags
self.nevecs = nevecs
self.addones = addones
self.reweight = reweight
self.adjust = adjust
self.method = method
def fit(self, trajs, weights=None):
"""Compute VAC results from input trajectories.
Calculate and store VAC eigenvalues, eigenvector coefficients,
and implied timescales from the input trajectories.
Parameters
----------
trajs : list of (n_frames[i], n_features) ndarray
List of featurized trajectories.
weights : int or list of (n_frames[i],) ndarray, optional
If int, the number of frames to drop from the end of each
trajectory, which must be greater than or equal to the VAC
lag time. This is equivalent to passing a list of uniform
weights but with the last int frames having zero weight.
If a list of ndarray, the weight of the trajectory starting
at each configuration. Note that the last frames of each
trajectory must have zero weight. This number of ending
frames with zero weight must be at least the VAC lag time.
"""
trajs = preprocess_trajs(trajs, addones=self.addones)
nfeatures = get_nfeatures(trajs)
nlags = len(self.lags)
nevecs = self.nevecs
if nevecs is None:
nevecs = nfeatures
cts = batch_compute_ic(
trajs,
self.lags,
weights=weights,
method=self.method,
)
if self.adjust:
c0s = batch_compute_c0(
trajs,
lags=self.lags,
weights=weights,
method=self.method,
)
else:
c0s = batch_compute_c0(
trajs,
weights=weights,
method=self.method,
)
self.evals = np.empty((nlags, nevecs))
self.evecs = np.empty((nlags, nfeatures, nevecs))
self.its = np.empty((nlags, nevecs))
for n, (ct, c0, lag) in enumerate(zip(cts, c0s, self.lags)):
evals, evecs = symeig(ct, c0, nevecs)
self.evals[n] = evals
self.evecs[n] = evecs
self.its[n] = _vac_its(evals, lag)
if self.adjust:
self.cov = None
else:
self.cov = c0
self.trajs = trajs
self.weights = weights
def __getitem__(self, lag):
"""Get a fitted LinearVAC with the specified lag time.
Parameters
----------
lag : int
Lag time, in units of frames.
Returns
-------
LinearVAC
Fitted LinearVAC instance.
"""
i = np.argwhere(self.lags == lag)[0, 0]
vac = LinearVAC(lag, nevecs=self.nevecs, addones=self.addones)
vac._set_fit_data(
self.cov,
self.evals[i],
self.evecs[i],
self.its[i],
self.trajs,
self.weights,
)
return vac
class LinearIVACScan:
"""Solve linear IVAC for each pair of lag times.
This class provides a more optimized way of solving linear IVAC
with the same input trajectories
for all intervals within a set of lag times.
The code
.. code-block:: python
scan = LinearIVACScan(lags)
ivac = scan[lags[i], lags[j]]
is equivalent to
.. code-block:: python
ivac = LinearIVAC(lags[i], lags[j])
Parameters
----------
lags : 1d array-like of int
Lag times, in units of frames.
lagstep : int, optional
Number of frames between each lag time.
This must evenly divide maxlag - minlag.
The integrated covariance matrix is computed using lag times
(minlag, minlag + lagstep, ..., maxlag)
nevecs : int, optional
Number of eigenvectors (including the trivial eigenvector)
to compute.
If None, use the maximum possible number of eigenvectors
(n_features).
addones : bool, optional
If True, add a feature of ones before solving VAC.
This increases n_features by 1.
reweight : bool, optional
If True, reweight trajectories to equilibrium.
adjust : bool, optional
If True, adjust :math:`C(0)` to ensure that the trivial
eigenvector is exactly solved.
method : str, optional
Method to compute the integrated covariance matrix.
Currently, 'direct', 'fft', and 'fft-all' are supported.
Both 'direct' and 'fft' integrate features over lag times before
computing the correlation matrix. They scale linearly with
the number of parameter sets.
Method 'direct' does so by summing the time-lagged features.
Its runtime increases linearly with the number of lag times.
Method 'fft' does so by performing an FFT convolution.
It takes around the same amount of time to run regardless
of the number of lag times, and is faster than 'direct' when
there is more than around 100 lag times.
Method 'fft-all' computes all time-lagged correlation matrices
at once by convolving each pair of features, before summing
up those correlation matrices to obtain integrated correlation
matrices. It is the slowest of these methods for calculating
a few sets of parameters, but is almost independent of the
number of lag times or parameter sets.
Attributes
----------
lags : 1d array-like of int
IVAC lag times, in units of frames.
cov : (n_features, n_features) ndarray
Covariance matrix of the fitted data.
"""
def __init__(
self,
lags,
lagstep=1,
nevecs=None,
addones=False,
reweight=False,
adjust=True,
method="fft",
):
if np.any(lags[1:] < lags[:-1]):
raise ValueError("lags must be nondecreasing")
if np.any((lags[1:] - lags[:-1]) % lagstep != 0):
raise ValueError(
"lag time intervals must be multiples of lagstep"
)
maxlag = np.max(lags)
if method not in ["direct", "fft", "fft-all"]:
raise ValueError("method must be 'direct', 'fft', or 'fft-all'")
self.lags = lags
self.lagstep = lagstep
self.nevecs = nevecs
self.addones = addones
self.reweight = reweight
self.adjust = adjust
self.method = method
def fit(self, trajs, weights=None):
"""Compute IVAC results from input trajectories.
Calculate and store IVAC eigenvalues, eigenvector coefficients,
and implied timescales from the input trajectories.
Parameters
----------
trajs : list of (n_frames[i], n_features) ndarray
List of featurized trajectories.
weights : int or list of (n_frames[i],) ndarray, optional
If int, the number of frames to drop from the end of each
trajectory, which must be greater than or equal to the
maximum IVAC lag time. This is equivalent to passing a list
of uniform weights but with the last int frames having zero
weight.
If a list of ndarray, the weight of the trajectory starting
at each configuration. Note that the last frames of each
trajectory must have zero weight. This number of ending
frames with zero weight must be at least the maximum IVAC
lag time.
"""
trajs = preprocess_trajs(trajs, addones=self.addones)
nfeatures = get_nfeatures(trajs)
nlags = len(self.lags)
nevecs = self.nevecs
if nevecs is None:
nevecs = nfeatures
params = [
np.arange(start + self.lagstep, end + 1, self.lagstep)
for start, end in zip(self.lags[:-1], self.lags[1:])
]
ics = list(
batch_compute_ic(
trajs,
params,
weights=weights,
method=self.method,
)
)
if self.adjust:
c0s = list(
batch_compute_c0(
trajs,
params,
weights=weights,
method=self.method,
)
)
else:
c0 = compute_c0(trajs, weights=weights, method=self.method)
denom = 1
self.evals = np.full((nlags, nlags, nevecs), np.nan)
self.evecs = np.full((nlags, nlags, nfeatures, nevecs), np.nan)
self.its = np.full((nlags, nlags, nevecs), np.nan)
for i in range(nlags):
ic = compute_ic(
trajs,
self.lags[i],
weights=weights,
method=self.method,
)
if self.adjust:
c0 = compute_c0(
trajs,
lags=self.lags[i],
weights=weights,
method=self.method,
)
denom = 1
evals, evecs = symeig(ic, c0, nevecs)
if self.lags[i] > 0:
self.evals[i, i] = evals
self.evecs[i, i] = evecs
self.its[i, i] = _ivac_its(
evals, self.lags[i], self.lags[i], self.lagstep
)
for j in range(i + 1, nlags):
ic += ics[j - 1]
if self.adjust:
count = (self.lags[j] - self.lags[j - 1]) // self.lagstep
c0 += c0s[j - 1] * count
denom += count
evals, evecs = symeig(ic, c0 / denom, nevecs)
self.evals[i, j] = evals
self.evecs[i, j] = evecs
self.its[i, j] = _ivac_its(
evals, self.lags[i], self.lags[j], self.lagstep
)
if self.adjust:
self.cov = c0
else:
self.cov = None
self.trajs = trajs
self.weights = weights
def __getitem__(self, lags):
"""Get a fitted LinearIVAC with the specified lag times.
Parameters
----------
lags : Tuple[int, int]
Minimum and maximum lag times, in units of frames.
Returns
-------
LinearIVAC
Fitted LinearIVAC instance.
"""
minlag, maxlag = lags
i = np.argwhere(self.lags == minlag)[0, 0]
j = np.argwhere(self.lags == maxlag)[0, 0]
ivac = LinearIVAC(
minlag,
maxlag,
lagstep=self.lagstep,
nevecs=self.nevecs,
addones=self.addones,
)
ivac._set_fit_data(
self.cov,
self.evals[i, j],
self.evecs[i, j],
self.its[i, j],
self.trajs,
self.weights,
)
return ivac
# -----------------------------------------------------------------------------
# reweighting
def _ivac_weights(trajs, lags, weights=None, method="fft"):
"""Estimate weights for IVAC.
Parameters
----------
trajs : list of (n_frames[i], n_features) ndarray
List of featurized trajectories.
The features must be able to represent constant features.
lags : array-like of int
Lag times at which to evaluate IVAC, in units of frames.
weights : int or list of (n_frames[i],) ndarray, optional
If int, the number of frames to drop from the end of each
trajectory, which must be greater than or equal to the maximum
IVAC lag time. This is equivalent to passing a list of uniform
weights but with the last int frames having zero weight.
If a list of ndarray, the weight of the trajectory starting at
each configuration. Note that the last frames of each trajectory
must have zero weight. This number of ending frames with zero
weight must be at least the maximum IVAC lag time.
method : string, optional
Method to use for calculating the integrated correlation matrix.
Currently, 'direct' and 'fft' are supported. Method 'direct' is
usually faster for smaller numbers of lag times. The speed of
method 'fft' is mostly independent of the number of lag times
used.
Returns
-------
list of (n_frames[i],) ndarray
Weight of trajectory starting at each configuration.
"""
lags = np.atleast_1d(lags)
assert lags.ndim == 1
if weights is None:
weights = np.max(lags)
elif is_cutlag(weights):
assert weights >= np.max(lags)
ic = compute_ic(trajs, lags, weights=weights, method=method)
c0 = compute_c0(trajs, weights=weights)
w = solve_stationary(ic / len(lags), c0)
return _build_weights(trajs, w, weights)
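# `solve_stationary` (defined elsewhere in this module) finds the expansion
# coefficients of the stationary reweighting vector. The underlying idea --
# equilibrium weights form a left eigenvector of the transition operator
# with eigenvalue 1 -- can be sketched on a small transition matrix; the
# matrix values here are assumptions for illustration only.

```python
import numpy as np

# stationary distribution of a toy transition matrix (values assumed)
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
evals, evecs = np.linalg.eig(P.T)   # left eigenvectors of P
w = np.real(evecs[:, np.argmax(np.real(evals))])
w = w / w.sum()                     # normalize weights to sum to 1
```

# The recovered weights satisfy w @ P == w, mirroring the normalization
# performed by `_build_weights` below.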
def _build_weights(trajs, coeffs, old_weights):
"""Build weights from reweighting coefficients.
Parameters
----------
trajs : list of (n_frames[i], n_features) ndarray
List of featurized trajectories.
coeffs : (n_features,) ndarray
Expansion coefficients of the new weights.
old_weights : list of (n_frames[i],) ndarray
Initial weight of trajectory starting at each configuration,
which was used to estimate the expansion coefficients.
Returns
-------
list of (n_frames[i],) ndarray
Weight of trajectory starting at each configuration.
"""
weights = []
total = 0.0
if is_cutlag(old_weights):
for traj in trajs:
weight = traj @ coeffs
weight[len(traj) - old_weights :] = 0.0
total += np.sum(weight)
weights.append(weight)
else:
for traj, old_weight in zip(trajs, old_weights):
weight = traj @ coeffs
weight *= old_weight
total += np.sum(weight)
weights.append(weight)
# normalize weights so that their sum is 1
for weight in weights:
weight /= total
return weights
# -----------------------------------------------------------------------------
# implied timescales
def _vac_its(evals, lag):
"""Calculate implied timescales from VAC eigenvalues.
Parameters
----------
evals : (n_evecs,) array-like
VAC eigenvalues.
lag : int
VAC lag time in units of frames.
Returns
-------
(n_evecs,) ndarray
Estimated implied timescales.
This is NaN when the VAC eigenvalues are not positive.
"""
its = np.full(len(evals), np.nan)
its[evals >= 1.0] = np.inf
mask = np.logical_and(0.0 < evals, evals < 1.0)
its[mask] = -lag / np.log(evals[mask])
return its
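# The implied-timescale formula above inverts lambda = exp(-lag / t). A
# quick self-contained round-trip check (the timescales are assumed values,
# not results from this module):

```python
import numpy as np

lag = 10
true_timescales = np.array([100.0, 25.0, 5.0])  # assumed for the check
evals = np.exp(-lag / true_timescales)          # eigenvalues a VAC solve would yield
its = -lag / np.log(evals)                      # the _vac_its formula
```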
def _ivac_its(evals, minlag, maxlag, lagstep=1):
"""Calculate implied timescales from IVAC eigenvalues.
Parameters
----------
evals : (n_evecs,) array-like
IVAC eigenvalues.
minlag, maxlag : int
Minimum and maximum lag times (inclusive) in units of frames.
lagstep : int, optional
Number of frames between adjacent lag times.
Lag times are given by minlag, minlag + lagstep, ..., maxlag.
Returns
-------
(n_evecs,) ndarray
Estimated implied timescales.
This is NaN when the IVAC eigenvalues are not positive
or when the calculation did not converge.
"""
its = np.full(len(evals), np.nan)
if minlag == 0:
# remove component corresponding to zero lag time
evals = evals - 1.0
minlag = lagstep
for i, val in enumerate(evals):
dlag = maxlag - minlag + lagstep
nlags = dlag / lagstep
assert nlags > 0
avg = val / nlags
if avg >= 1.0:
its[i] = np.inf
elif avg > 0.0:
# eigenvalues are bounded by
# exp(-sigma * tmin) <= eval
# and
# nlags * exp(-sigma * tmax) <= eval <= nlags * exp(-sigma * tmin)
lower = max(
0.0,
-np.log(val) / minlag,
-np.log(avg) / maxlag,
)
upper = -np.log(avg) / minlag
# make sure solution is inside bracket
lower *= 0.999
upper *= 1.001
sol = optimize.root_scalar(
_ivac_its_f,
args=(val, minlag, dlag, lagstep),
method="brentq",
bracket=[lower, upper],
)
if sol.converged and sol.root > 0.0:
its[i] = 1.0 / sol.root
else:
warnings.warn("implied timescale calculation did not converge")
return its
@nb.njit
def _ivac_its_f(sigma, val, minlag, dlag, lagstep=1):
"""Objective function for IVAC implied timescale calculation.
Parameters
----------
sigma : float
Inverse implied timescale.
val : float
IVAC eigenvalue.
minlag : int
Minimum lag time in units of frames.
dlag : int
Number of frames in the interval from the minimum lag time
to the maximum lag time (inclusive).
lagstep : int, optional
Number of frames between adjacent lag times.
Lag times are given by minlag, minlag + lagstep, ..., maxlag.
Returns
-------
float
Difference between given and predicted IVAC eigenvalue.
"""
return (
np.exp(-sigma * minlag)
* np.expm1(-sigma * dlag)
/ np.expm1(-sigma * lagstep)
) - val
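# The closed form returned by `_ivac_its_f` is the geometric sum of
# exp(-sigma * tau) over the lag times minlag, minlag + lagstep, ..., maxlag,
# written with expm1 for numerical stability. A sketch verifying the
# identity against the direct sum (parameter values are assumptions):

```python
import numpy as np

sigma, minlag, maxlag, lagstep = 0.05, 10, 100, 5
lags = np.arange(minlag, maxlag + 1, lagstep)

direct = np.sum(np.exp(-sigma * lags))  # brute-force sum over lag times
dlag = maxlag - minlag + lagstep
closed = (
    np.exp(-sigma * minlag)
    * np.expm1(-sigma * dlag)
    / np.expm1(-sigma * lagstep)
)                                       # the expm1 form used above
```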
# -----------------------------------------------------------------------------
# examples/keystone/v3/01_auth_session.py
#!/usr/bin/env python
# -*- coding: utf-8 -*-
"""Authenticate using username/password, or using token, with or without session.
Note that auth without session is deprecated.
"""
import time
import keystoneclient
import keystoneclient.auth.identity.v3
import keystoneclient.session
import keystoneclient.v3.client
import local_settings
def auth_user(username, password, project_name):
"""Authenticate using username/password"""
try:
keystone = keystoneclient.v3.client.Client(username=username,
password=password,
project_name=project_name,
auth_url=local_settings.auth_url_v3)
except keystoneclient.openstack.common.apiclient.exceptions.Unauthorized:
return None
return keystone
def auth_user_with_session(username, password, project_name):
"""Authenticate using username/password
This method doesn't verify username/password.
Instead, client will authenticate on first request,
and re-authenticate automatically when the token expires.
"""
auth = keystoneclient.auth.identity.v3.Password(auth_url=local_settings.auth_url_v3,
username=username,
password=password,
user_domain_name='Default',
project_domain_name='Default',
project_name=project_name)
session = keystoneclient.session.Session(auth=auth)
keystone = keystoneclient.v3.client.Client(session=session)
return keystone
def auth_token(token):
"""Authenticate using token"""
try:
keystone = keystoneclient.v3.client.Client(token=token,
auth_url=local_settings.auth_url_v3)
except keystoneclient.openstack.common.apiclient.exceptions.Unauthorized:
return None
return keystone
def auth_token_with_session(token):
"""Authenticate using token."""
# TODO: this fails and needs investigating.
auth = keystoneclient.auth.identity.v3.Token(auth_url=local_settings.auth_url_v3,
token=token)
session = keystoneclient.session.Session(auth=auth)
keystone = keystoneclient.v3.client.Client(session=session)
return keystone
keystone = auth_user_with_session(local_settings.username,
local_settings.password,
local_settings.tenant_name)
try:
result = keystone.domains.list()
print(result)
# set token expiration to a very short period, and wait for expiration here
for i in range(1, 40):
time.sleep(1)
result = keystone.projects.list()
print(result)
except keystoneclient.openstack.common.apiclient.exceptions.Unauthorized:
print('authentication failed')
# -----------------------------------------------------------------------------
# webapp/user/models.py
import jwt
from time import time
from flask import current_app
from flask_login import UserMixin
from werkzeug.security import generate_password_hash, check_password_hash
from webapp.db import db
"""User models for the database, plus the model storing
the answers that users give on tests."""
users_to_courses = db.Table('users_to_courses',
db.Column('course_id', db.Integer, db.ForeignKey('Course.id')),
db.Column('user_id', db.Integer, db.ForeignKey('User.id')))
class User(db.Model, UserMixin):
"""User model."""
__tablename__ = 'User'
id = db.Column(db.Integer, primary_key=True)
username = db.Column(db.String(64), index=True, unique=True)
fio = db.Column(db.String(128))
password = db.Column(db.String(128))
company = db.Column(db.String(128))
position = db.Column(db.String(128))
date_of_birth = db.Column(db.String(50))
phone_number = db.Column(db.String(50))
role = db.Column(db.String(10), index=True)
courses = db.relationship("Course", secondary=users_to_courses)
confirmed = db.Column(db.Boolean)
confirmed_on = db.Column(db.DateTime)
def get_reset_password_token(self, expires_in=600):
"""Generate a token for resetting the user's password."""
return jwt.encode(
{'reset_password': self.id, 'exp': time() + expires_in},
current_app.config['SECRET_KEY'], algorithm='HS256')
@staticmethod
def verify_reset_password_token(token):
"""Метод, проверяющий сгенерированный токен для сброса пароля"""
try:
id = jwt.decode(token, current_app.config['SECRET_KEY'],
algorithms=['HS256'])['reset_password']
        except Exception:
return
return User.query.get(id)
@property
def is_admin(self):
"""Метод, проверяющий пользвоателя на принадлежность к классу администраторов"""
return self.role == 'admin'
def set_password(self, password):
"""Метод, хеширующий пароль для хранения в БД"""
self.password = generate_password_hash(password)
def check_password(self, password):
"""Метод, проверяющий пароль на соответствие при логине"""
return check_password_hash(self.password, password)
def __repr__(self):
return '<User {}>'.format(self.username)
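The reset-token methods above delegate HS256 signing to PyJWT; a stdlib-only sketch of what `jwt.encode` produces under the hood (hypothetical helper names, `example-secret` standing in for the app's `SECRET_KEY`) might look like:

```python
import base64
import hashlib
import hmac
import json


def _b64url(data: bytes) -> str:
    # JWT uses URL-safe base64 with the padding stripped
    return base64.urlsafe_b64encode(data).rstrip(b'=').decode()


def encode_hs256(payload: dict, key: bytes) -> str:
    # Compact JWT serialization: header.payload.signature
    header = _b64url(json.dumps({'alg': 'HS256', 'typ': 'JWT'}).encode())
    body = _b64url(json.dumps(payload).encode())
    signing_input = f'{header}.{body}'.encode()
    signature = _b64url(hmac.new(key, signing_input, hashlib.sha256).digest())
    return f'{header}.{body}.{signature}'


def verify_hs256(token: str, key: bytes) -> bool:
    # Recompute the signature over header.payload and compare in constant time
    header, body, signature = token.split('.')
    expected = _b64url(hmac.new(key, f'{header}.{body}'.encode(), hashlib.sha256).digest())
    return hmac.compare_digest(signature, expected)
```

This is an illustrative sketch only; in the model code itself, PyJWT additionally checks the `exp` claim, which is why the token embeds `time() + expires_in`.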
class User_answer(db.Model):
"""Модель хранения ответов, данных пользователем"""
__tablename__ = 'User_answer'
id = db.Column(db.Integer, primary_key=True)
user_id = db.Column(db.Integer, db.ForeignKey('User.id',
ondelete='CASCADE'),
index=True)
users = db.relationship('User', backref='user_answers')
question_id = db.Column(db.Integer, db.ForeignKey('Question.id',
ondelete='CASCADE'),
index=True)
questions = db.relationship('Question', backref='user_answers')
lesson_id = db.Column(db.Integer)
lesson_name = db.Column(db.String(128))
user_answer = db.Column(db.String(50))
answer_status = db.Column(db.String(50))
course_id = db.Column(db.Integer)
def __repr__(self):
        return f'User {self.user_id}, question {self.question_id}, answer {self.user_answer}'
| 38.494253 | 98 | 0.642878 | 408 | 3,349 | 5.102941 | 0.321078 | 0.080692 | 0.091258 | 0.084534 | 0.239673 | 0.081172 | 0.061479 | 0.0317 | 0 | 0 | 0 | 0.014079 | 0.236489 | 3,349 | 86 | 99 | 38.94186 | 0.800156 | 0.104807 | 0 | 0.129032 | 1 | 0 | 0.101565 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.112903 | false | 0.16129 | 0.096774 | 0.032258 | 0.725806 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 3 |
493e16c3599d684483551d58a65182d14c23e588 | 1,569 | py | Python | mlsnippet/datafs/errors.py | haowen-xu/mlsnippet | 94f0b419340e763747a008b8c93feca06140adc5 | [
"MIT"
] | 1 | 2018-05-25T07:57:13.000Z | 2018-05-25T07:57:13.000Z | mlsnippet/datafs/errors.py | haowen-xu/mlsnippet | 94f0b419340e763747a008b8c93feca06140adc5 | [
"MIT"
] | 2 | 2018-06-02T04:03:36.000Z | 2018-07-18T03:57:46.000Z | mlsnippet/datafs/errors.py | haowen-xu/mltoolkit | 94f0b419340e763747a008b8c93feca06140adc5 | [
"MIT"
] | null | null | null | __all__ = [
'DataFSError', 'UnsupportedOperation', 'InvalidOpenMode',
'DataFileNotExist', 'MetaKeyNotExist',
]
class DataFSError(Exception):
"""Base class for all :class:`DataFS` errors."""
class UnsupportedOperation(DataFSError):
"""
Class to indicate that a requested operation is not supported by the
specific :class:`DataFS` subclass.
"""
class InvalidOpenMode(UnsupportedOperation):
"""
Class to indicate that the specified open mode is not supported.
"""
def __init__(self, mode):
super(InvalidOpenMode, self).__init__(mode)
@property
def mode(self):
return self.args[0]
def __str__(self):
return 'Invalid open mode: {!r}'.format(self.mode)
class DataFileNotExist(DataFSError):
"""Class to indicate a requested data file does not exist."""
def __init__(self, filename):
super(DataFileNotExist, self).__init__(filename)
@property
def filename(self):
return self.args[0]
def __str__(self):
        return 'Data file does not exist: {!r}'.format(self.filename)
class MetaKeyNotExist(DataFSError):
"""Class to indicate a requested meta key does not exist."""
def __init__(self, filename, meta_key):
super(MetaKeyNotExist, self).__init__(filename, meta_key)
@property
def filename(self):
return self.args[0]
@property
def meta_key(self):
return self.args[1]
def __str__(self):
        return 'In file {!r}: meta key does not exist: {!r}'. \
            format(self.filename, self.meta_key)
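Forwarding the constructor arguments to the base `__init__` (as every class above does) keeps the errors picklable, because the default exception reduce re-invokes the class with `self.args`. A hypothetical demo class showing the same pattern:

```python
import pickle


class DemoFileNotExist(Exception):
    """Same pattern as above: forward ctor args to Exception.__init__."""

    def __init__(self, filename):
        super().__init__(filename)

    @property
    def filename(self):
        # args-backed property, so state survives pickling
        return self.args[0]

    def __str__(self):
        return 'Data file does not exist: {!r}'.format(self.filename)


# Round-trip through pickle: the filename is preserved
roundtripped = pickle.loads(pickle.dumps(DemoFileNotExist('data.bin')))
```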
| 24.138462 | 72 | 0.656469 | 180 | 1,569 | 5.477778 | 0.272222 | 0.070994 | 0.060852 | 0.073022 | 0.319473 | 0.319473 | 0.191684 | 0.128803 | 0.070994 | 0 | 0 | 0.0033 | 0.227533 | 1,569 | 64 | 73 | 24.515625 | 0.810231 | 0.205226 | 0 | 0.352941 | 0 | 0 | 0.136174 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.294118 | false | 0 | 0 | 0.205882 | 0.647059 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
494cf076f2cb5127ef453b84cb7ad1404b7abc24 | 1,555 | py | Python | src/cm/models/__init__.py | cc1-cloud/cc1 | 8113673fa13b6fe195cea99dedab9616aeca3ae8 | [
"Apache-2.0"
] | 11 | 2015-05-06T14:16:54.000Z | 2022-02-08T23:21:31.000Z | src/cm/models/__init__.py | fortress-shell/cc1 | 8113673fa13b6fe195cea99dedab9616aeca3ae8 | [
"Apache-2.0"
] | 1 | 2015-10-30T21:08:11.000Z | 2015-10-30T21:08:11.000Z | src/cm/models/__init__.py | fortress-shell/cc1 | 8113673fa13b6fe195cea99dedab9616aeca3ae8 | [
"Apache-2.0"
] | 5 | 2016-02-12T22:01:38.000Z | 2021-12-06T16:56:54.000Z | # -*- coding: utf-8 -*-
# @COPYRIGHT_begin
#
# Copyright [2010-2014] Institute of Nuclear Physics PAN, Krakow, Poland
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
# @COPYRIGHT_end
"""@package src.cm.models
@author Gaetano
@author Maciej Nabozny
@date Mar 6, 2013
Each new entity model should be imported by this file. Each model
should also define a nested Meta class with app_label='cm':
class Meta:
app_label = 'cm'
"""
from cm.models.admin import Admin
from cm.models.available_network import AvailableNetwork
from cm.models.command import Command
from cm.models.iso_image import IsoImage
from cm.models.lease import Lease
from cm.models.node import Node
from cm.models.public_ip import PublicIP
from cm.models.storage import Storage
from cm.models.storage_image import StorageImage
from cm.models.system_image import SystemImage
from cm.models.system_image_group import SystemImageGroup
from cm.models.template import Template
from cm.models.user import User
from cm.models.user_network import UserNetwork
from cm.models.vm import VM
| 33.085106 | 77 | 0.780064 | 240 | 1,555 | 5.004167 | 0.516667 | 0.106578 | 0.149875 | 0.026644 | 0.038301 | 0 | 0 | 0 | 0 | 0 | 0 | 0.013616 | 0.149839 | 1,555 | 46 | 78 | 33.804348 | 0.894856 | 0.57492 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 3 |
4975955d816908942a22b981098b362af473bfb4 | 275 | py | Python | casrs.py | fireballpoint1/fortranTOpy | 55843a62c6f0a2f8e2a777ef70193940d3d2d141 | [
"Apache-2.0"
] | 1 | 2018-08-26T05:10:56.000Z | 2018-08-26T05:10:56.000Z | casrs.py | fireballpoint1/fortranTOpy | 55843a62c6f0a2f8e2a777ef70193940d3d2d141 | [
"Apache-2.0"
] | null | null | null | casrs.py | fireballpoint1/fortranTOpy | 55843a62c6f0a2f8e2a777ef70193940d3d2d141 | [
"Apache-2.0"
] | 1 | 2018-06-26T18:06:44.000Z | 2018-06-26T18:06:44.000Z | import numpy
E=numpy.zeros((400+1))
X=numpy.zeros((400+1))
Y=numpy.zeros((400+1))
Z=numpy.zeros((400+1))
DRX=numpy.zeros((400+1))
DRY=numpy.zeros((400+1))
DRZ=numpy.zeros((400+1))
T=numpy.zeros((400+1))
NFLGF=numpy.zeros((400+1))
NFLGPP=numpy.zeros((400+1))
IEVENT=float(0.0) | 22.916667 | 27 | 0.687273 | 56 | 275 | 3.375 | 0.321429 | 0.529101 | 0.687831 | 0.740741 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.159696 | 0.043636 | 275 | 12 | 28 | 22.916667 | 0.558935 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.083333 | 0 | 0.083333 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
4984cafcac4cbef2cbe7a0a3f911389729dfc07c | 152 | py | Python | drones/graphql/enums.py | miguelzetina/Django-RESTful-Web-Services-Hillar-Gastn | 75143af8948c445a03042168cb09f344234a4a04 | [
"MIT"
] | null | null | null | drones/graphql/enums.py | miguelzetina/Django-RESTful-Web-Services-Hillar-Gastn | 75143af8948c445a03042168cb09f344234a4a04 | [
"MIT"
] | null | null | null | drones/graphql/enums.py | miguelzetina/Django-RESTful-Web-Services-Hillar-Gastn | 75143af8948c445a03042168cb09f344234a4a04 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
from __future__ import unicode_literals
from graphene import Enum
class GenderChoices(Enum):
MALE = 'M'
FEMALE = 'F'
| 15.2 | 39 | 0.671053 | 19 | 152 | 5.105263 | 0.842105 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.008333 | 0.210526 | 152 | 9 | 40 | 16.888889 | 0.8 | 0.138158 | 0 | 0 | 0 | 0 | 0.015504 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.4 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
498c75b959c12f987d0dce5f148abb5b090fe2f5 | 186 | py | Python | frontend_files/ci_project/ci_account/apps.py | CS3250-Ctrl-Intelligence/Team-Project_Python | 6b8fef2b09628666a198f4c86f5c8226eabed406 | [
"CC0-1.0"
] | 3 | 2022-03-01T10:23:12.000Z | 2022-03-05T01:38:01.000Z | frontend_files/ci_project/ci_account/apps.py | CS3250-Ctrl-Intelligence/Team-Project_Python | 6b8fef2b09628666a198f4c86f5c8226eabed406 | [
"CC0-1.0"
] | 18 | 2022-02-25T17:53:49.000Z | 2022-03-19T03:31:11.000Z | frontend_files/ci_project/ci_account/apps.py | CS3250-Ctrl-Intelligence/Team-Project_Python | 6b8fef2b09628666a198f4c86f5c8226eabed406 | [
"CC0-1.0"
] | null | null | null | from django.apps import AppConfig
class CiAccountConfig(AppConfig):
default_auto_field = 'django.db.models.BigAutoField'
name = 'ci_account'
    verbose_name = "Account"
| 23.25 | 57 | 0.725806 | 21 | 186 | 6.238095 | 0.809524 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.188172 | 186 | 7 | 58 | 26.571429 | 0.86755 | 0 | 0 | 0 | 0 | 0 | 0.256983 | 0.162011 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.2 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 3 |
b8dccfab004a731bf7ed7e55dc3e236aba9f2541 | 162 | py | Python | static/Pygeostat/plotting-4.py | MHadavand/MHadavand.github.io | b830d45ac64e393905a612e3e0bef793689d2848 | [
"MIT"
] | null | null | null | static/Pygeostat/plotting-4.py | MHadavand/MHadavand.github.io | b830d45ac64e393905a612e3e0bef793689d2848 | [
"MIT"
] | 1 | 2021-05-11T06:18:32.000Z | 2021-05-11T06:18:32.000Z | static/Pygeostat/plotting-4.py | MHadavand/MHadavand.github.io | b830d45ac64e393905a612e3e0bef793689d2848 | [
"MIT"
] | null | null | null | import pygeostat as gs
# load data
data_file = gs.ExampleData('point3d_ind_mv')
gs.location_plot(data_file, var='Phi', orient='yz', aspect=5, plot_collar=True)
b8dec1c065ec0c47e20e9f4308be1f2acd3a97d7 | 395 | py | Python | tests/functional/memoryview_usage.py | tonybaloney/perflint | ecb3077eb2dabdaa9d3b937710a896d8670d2095 | [
"MIT"
] | 195 | 2022-01-07T06:36:04.000Z | 2022-03-31T17:56:58.000Z | tests/functional/memoryview_usage.py | tonybaloney/perflint | ecb3077eb2dabdaa9d3b937710a896d8670d2095 | [
"MIT"
] | 10 | 2022-01-20T23:43:34.000Z | 2022-03-30T22:04:10.000Z | tests/functional/memoryview_usage.py | tonybaloney/perflint | ecb3077eb2dabdaa9d3b937710a896d8670d2095 | [
"MIT"
] | 1 | 2022-01-20T07:15:08.000Z | 2022-01-20T07:15:08.000Z | def example_bytes_slice():
word = b'the lazy brown dog jumped'
for i in range(10):
# Memoryview slicing is 10x faster than bytes slicing
        if word[0:i] == b'the':
return True
def example_bytes_slice_as_arg(word: bytes):
for i in range(10):
# Memoryview slicing is 10x faster than bytes slicing
        if word[0:i] == b'the':
return True
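The comments above name memoryview slicing as the faster alternative; a hypothetical sketch of that variant (not part of the original fixture file) could be:

```python
def example_memoryview_slice(word: bytes) -> bool:
    view = memoryview(word)  # zero-copy view over the bytes
    for i in range(10):
        # Slicing a memoryview yields another view instead of copying bytes
        if view[0:i] == b'the':
            return True
    return False
```

Memoryview slices compare by value against `bytes`, so the check behaves the same while avoiding a copy per iteration.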
| 30.384615 | 61 | 0.612658 | 59 | 395 | 4 | 0.474576 | 0.084746 | 0.127119 | 0.169492 | 0.661017 | 0.661017 | 0.661017 | 0.661017 | 0.661017 | 0.661017 | 0 | 0.036232 | 0.301266 | 395 | 12 | 62 | 32.916667 | 0.818841 | 0.260759 | 0 | 0.666667 | 0 | 0 | 0.107266 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.222222 | false | 0 | 0 | 0 | 0.444444 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
b8eea82177658850adddb9d5c289272b8ee6ea12 | 1,197 | py | Python | Python3/1648-Sell-Diminishing-Valued-Colored-Balls/soln.py | zhangyaqi1989/LeetCode-Solutions | 2655a1ffc8678ad1de6c24295071308a18c5dc6e | [
"MIT"
] | 5 | 2020-07-24T17:48:59.000Z | 2020-12-21T05:56:00.000Z | Python3/1648-Sell-Diminishing-Valued-Colored-Balls/soln.py | zhangyaqi1989/LeetCode-Solutions | 2655a1ffc8678ad1de6c24295071308a18c5dc6e | [
"MIT"
] | null | null | null | Python3/1648-Sell-Diminishing-Valued-Colored-Balls/soln.py | zhangyaqi1989/LeetCode-Solutions | 2655a1ffc8678ad1de6c24295071308a18c5dc6e | [
"MIT"
] | 2 | 2020-07-24T17:49:01.000Z | 2020-08-31T19:57:35.000Z | class Solution:
def maxProfit(self, inventory: List[int], orders: int) -> int:
# [5, 5, 2]
inventory.sort(reverse=True)
inventory.append(0)
ans = 0
idx = 0
n = len(inventory)
while orders:
            # find the first index lo such that inventory[lo] < inventory[idx]
lo, hi = idx + 1, n - 1
while lo < hi:
mid = (lo + hi) // 2
if inventory[mid] == inventory[idx]:
lo = mid + 1
else:
hi = mid
if lo >= n:
break
mult = lo
if mult * (inventory[idx] - inventory[lo]) >= orders:
# from inventory[idx] to inventory[lo]
q, r = divmod(orders, mult)
ans += mult * (inventory[idx] + inventory[idx] - q + 1) * q // 2
ans += r * (inventory[idx] - q)
orders = 0
else:
orders -= mult * (inventory[idx] - inventory[lo])
ans += mult * (inventory[idx] + inventory[lo] + 1) * (inventory[idx] - inventory[lo]) // 2
idx = lo
ans %= 1_000_000_007
return ans
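The `mult * (inventory[idx] + inventory[lo] + 1) * (inventory[idx] - inventory[lo]) // 2` step is an arithmetic series: each of `mult` colors is sold once at every price `hi, hi-1, ..., lo+1`. A small sanity check of that formula (helper name is illustrative):

```python
def series_revenue(mult, hi, lo):
    # Revenue from selling each of `mult` colors at prices hi, hi-1, ..., lo+1
    return mult * (hi + lo + 1) * (hi - lo) // 2
```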
| 36.272727 | 106 | 0.427736 | 133 | 1,197 | 3.827068 | 0.345865 | 0.212181 | 0.206287 | 0.196464 | 0.220039 | 0 | 0 | 0 | 0 | 0 | 0 | 0.03811 | 0.451963 | 1,197 | 32 | 107 | 37.40625 | 0.737805 | 0.077694 | 0 | 0.068966 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.034483 | false | 0 | 0 | 0 | 0.103448 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
b8eee9bb66c885f911bf1b2706469ebc2a0a0906 | 517 | py | Python | while_loop.py | HuuHoangNguyen/Python_learning | c33940ca95866cefa6381cdef901062be755052d | [
"MIT"
] | null | null | null | while_loop.py | HuuHoangNguyen/Python_learning | c33940ca95866cefa6381cdef901062be755052d | [
"MIT"
] | null | null | null | while_loop.py | HuuHoangNguyen/Python_learning | c33940ca95866cefa6381cdef901062be755052d | [
"MIT"
] | null | null | null | #!/usr/bin/python
print "============================================================="
count = 0
while ( count < 9):
print "The count is: ", count;
count = count + 1
print "============================================================="
var = 1
while var == 1: # THis is construct an infinite loop
num = raw_input("Enter a number : ")
print "You entered: ", num
var_num = int(num)
if var_num == 0: var = 0
print "============================================================="
print "END"
| 24.619048 | 69 | 0.375242 | 52 | 517 | 3.673077 | 0.538462 | 0.104712 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.016509 | 0.179884 | 517 | 20 | 70 | 25.85 | 0.433962 | 0.098646 | 0 | 0.214286 | 0 | 0 | 0.49569 | 0.394397 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0.428571 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 3 |
b8f15f0f5922ed27448b2fbf5e93457d4e117d26 | 907 | py | Python | app/config/Config.py | pvanassen/sensors | 459eed4409e54f0537b1610d5eace9f71cb51fc2 | [
"Apache-2.0"
] | null | null | null | app/config/Config.py | pvanassen/sensors | 459eed4409e54f0537b1610d5eace9f71cb51fc2 | [
"Apache-2.0"
] | null | null | null | app/config/Config.py | pvanassen/sensors | 459eed4409e54f0537b1610d5eace9f71cb51fc2 | [
"Apache-2.0"
] | null | null | null | from six.moves import configparser
import six
if six.PY2:
ConfigParser = configparser.SafeConfigParser
else:
ConfigParser = configparser.ConfigParser
class Config:
def __init__(self):
self._config = ConfigParser()
self._config.read('config/default.ini')
self._config.read('config/user.ini')
def get_bus(self):
return int(self._config.get('BH1750', 'bus'), 0)
def get_device(self):
return int(self._config.get('BH1750', 'device'), 0)
def get_mode(self):
return int(self._config.get('BH1750', 'mode'), 0)
def get_pin(self):
return int(self._config.get('DHT11', 'pin'), 0)
def get_hostname(self):
return self._config.get('STATSD', 'hostname')
def get_port(self):
return int(self._config.get('STATSD', 'port'), 0)
def get_prefix(self):
return self._config.get('STATSD', 'prefix')
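The `int(..., 0)` calls above parse the config strings with base auto-detection, so a value like `bus` may be written as decimal, hex, octal, or binary in the `.ini` file. For illustration:

```python
# int(text, 0) detects the base from the literal's prefix
assert int('16', 0) == 16
assert int('0x10', 0) == 16
assert int('0o20', 0) == 16
assert int('0b10000', 0) == 16
```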
| 24.513514 | 59 | 0.640573 | 117 | 907 | 4.786325 | 0.273504 | 0.178571 | 0.1625 | 0.151786 | 0.367857 | 0.367857 | 0.171429 | 0 | 0 | 0 | 0 | 0.02809 | 0.214994 | 907 | 36 | 60 | 25.194444 | 0.758427 | 0 | 0 | 0 | 0 | 0 | 0.119074 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.32 | false | 0 | 0.08 | 0.28 | 0.72 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
770a2466492c7486515280abaa551c0cd4b4fab1 | 392 | py | Python | multi_tools/math/errors.py | Jerem2360/multitools | cd2c5aee72e5c2c8b60bbedd458303051b104c29 | [
"Unlicense"
] | null | null | null | multi_tools/math/errors.py | Jerem2360/multitools | cd2c5aee72e5c2c8b60bbedd458303051b104c29 | [
"Unlicense"
] | null | null | null | multi_tools/math/errors.py | Jerem2360/multitools | cd2c5aee72e5c2c8b60bbedd458303051b104c29 | [
"Unlicense"
] | null | null | null | from multi_tools.errors import exceptions
class InfiniteDivisionError(exceptions.ErrorImitation):
def __init__(self, text="Division by infinity", immediateRaise=True):
"""
An error that occurs when something is divided by an infinite().
"""
exceptions.ErrorImitation.__init__(self, name="InfiniteDivisionError", text=text, immediateRaise=immediateRaise)
| 39.2 | 120 | 0.739796 | 40 | 392 | 7.025 | 0.7 | 0.170819 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.170918 | 392 | 9 | 121 | 43.555556 | 0.864615 | 0.163265 | 0 | 0 | 0 | 0 | 0.134868 | 0.069079 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.25 | 0 | 0.75 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 3 |
7711ecdc055d93618bc12fe11266be05b282cf5f | 40,964 | bzl | Python | starlark/src/syntax/testcases/workspace.bzl | levels3d/starlark-rust | 83afc4604f3e2ce510b20d9c22538a7ea079c4ab | [
"Apache-2.0"
] | null | null | null | starlark/src/syntax/testcases/workspace.bzl | levels3d/starlark-rust | 83afc4604f3e2ce510b20d9c22538a7ea079c4ab | [
"Apache-2.0"
] | null | null | null | starlark/src/syntax/testcases/workspace.bzl | levels3d/starlark-rust | 83afc4604f3e2ce510b20d9c22538a7ea079c4ab | [
"Apache-2.0"
] | null | null | null | def maven_dependencies(callback):
callback({"artifact": "antlr:antlr:2.7.6", "lang": "java", "sha1": "cf4f67dae5df4f9932ae7810f4548ef3e14dd35e", "repository": "https://repo.maven.apache.org/maven2/", "name": "antlr_antlr", "actual": "@antlr_antlr//jar", "bind": "jar/antlr/antlr"})
callback({"artifact": "aopalliance:aopalliance:1.0", "lang": "java", "sha1": "0235ba8b489512805ac13a8f9ea77a1ca5ebe3e8", "repository": "https://repo.maven.apache.org/maven2/", "name": "aopalliance_aopalliance", "actual": "@aopalliance_aopalliance//jar", "bind": "jar/aopalliance/aopalliance"})
callback({"artifact": "args4j:args4j:2.0.31", "lang": "java", "sha1": "6b870d81551ce93c5c776c3046299db8ad6c39d2", "repository": "https://repo.maven.apache.org/maven2/", "name": "args4j_args4j", "actual": "@args4j_args4j//jar", "bind": "jar/args4j/args4j"})
callback({"artifact": "com.cloudbees:groovy-cps:1.12", "lang": "java", "sha1": "d766273a59e0b954c016e805779106bca22764b9", "repository": "https://repo.maven.apache.org/maven2/", "name": "com_cloudbees_groovy_cps", "actual": "@com_cloudbees_groovy_cps//jar", "bind": "jar/com/cloudbees/groovy_cps"})
callback({"artifact": "com.github.jnr:jffi:1.2.15", "lang": "java", "sha1": "f480f0234dd8f053da2421e60574cfbd9d85e1f5", "repository": "https://repo.maven.apache.org/maven2/", "name": "com_github_jnr_jffi", "actual": "@com_github_jnr_jffi//jar", "bind": "jar/com/github/jnr/jffi"})
callback({"artifact": "com.github.jnr:jnr-constants:0.9.8", "lang": "java", "sha1": "478036404879bd582be79e9a7939f3a161601c8b", "repository": "https://repo.maven.apache.org/maven2/", "name": "com_github_jnr_jnr_constants", "actual": "@com_github_jnr_jnr_constants//jar", "bind": "jar/com/github/jnr/jnr_constants"})
callback({"artifact": "com.github.jnr:jnr-ffi:2.1.4", "lang": "java", "sha1": "0a63bbd4af5cee55d820ef40dc5347d45765b788", "repository": "https://repo.maven.apache.org/maven2/", "name": "com_github_jnr_jnr_ffi", "actual": "@com_github_jnr_jnr_ffi//jar", "bind": "jar/com/github/jnr/jnr_ffi"})
callback({"artifact": "com.github.jnr:jnr-posix:3.0.41", "lang": "java", "sha1": "36eff018149e53ed814a340ddb7de73ceb66bf96", "repository": "https://repo.maven.apache.org/maven2/", "name": "com_github_jnr_jnr_posix", "actual": "@com_github_jnr_jnr_posix//jar", "bind": "jar/com/github/jnr/jnr_posix"})
callback({"artifact": "com.github.jnr:jnr-x86asm:1.0.2", "lang": "java", "sha1": "006936bbd6c5b235665d87bd450f5e13b52d4b48", "repository": "https://repo.maven.apache.org/maven2/", "name": "com_github_jnr_jnr_x86asm", "actual": "@com_github_jnr_jnr_x86asm//jar", "bind": "jar/com/github/jnr/jnr_x86asm"})
callback({"artifact": "com.google.code.findbugs:jsr305:1.3.9", "lang": "java", "sha1": "40719ea6961c0cb6afaeb6a921eaa1f6afd4cfdf", "repository": "https://repo.maven.apache.org/maven2/", "name": "com_google_code_findbugs_jsr305", "actual": "@com_google_code_findbugs_jsr305//jar", "bind": "jar/com/google/code/findbugs/jsr305"})
callback({"artifact": "com.google.guava:guava:11.0.1", "lang": "java", "sha1": "57b40a943725d43610c898ac0169adf1b2d55742", "repository": "https://repo.maven.apache.org/maven2/", "name": "com_google_guava_guava", "actual": "@com_google_guava_guava//jar", "bind": "jar/com/google/guava/guava"})
callback({"artifact": "com.google.inject:guice:4.0", "lang": "java", "sha1": "0f990a43d3725781b6db7cd0acf0a8b62dfd1649", "repository": "https://repo.maven.apache.org/maven2/", "name": "com_google_inject_guice", "actual": "@com_google_inject_guice//jar", "bind": "jar/com/google/inject/guice"})
callback({"artifact": "com.infradna.tool:bridge-method-annotation:1.13", "lang": "java", "sha1": "18cdce50cde6f54ee5390d0907384f72183ff0fe", "repository": "https://repo.maven.apache.org/maven2/", "name": "com_infradna_tool_bridge_method_annotation", "actual": "@com_infradna_tool_bridge_method_annotation//jar", "bind": "jar/com/infradna/tool/bridge_method_annotation"})
callback({"artifact": "com.jcraft:jzlib:1.1.3-kohsuke-1", "lang": "java", "sha1": "af5d27e1de29df05db95da5d76b546d075bc1bc5", "repository": "http://repo.jenkins-ci.org/public/", "name": "com_jcraft_jzlib", "actual": "@com_jcraft_jzlib//jar", "bind": "jar/com/jcraft/jzlib"})
callback({"artifact": "com.lesfurets:jenkins-pipeline-unit:1.0", "lang": "java", "sha1": "3aa90c606c541e88c268df3cc9e87306af69b29f", "repository": "https://repo.maven.apache.org/maven2/", "name": "com_lesfurets_jenkins_pipeline_unit", "actual": "@com_lesfurets_jenkins_pipeline_unit//jar", "bind": "jar/com/lesfurets/jenkins_pipeline_unit"})
callback({"artifact": "com.sun.solaris:embedded_su4j:1.1", "lang": "java", "sha1": "9404130cc4e60670429f1ab8dbf94d669012725d", "repository": "https://repo.maven.apache.org/maven2/", "name": "com_sun_solaris_embedded_su4j", "actual": "@com_sun_solaris_embedded_su4j//jar", "bind": "jar/com/sun/solaris/embedded_su4j"})
callback({"artifact": "com.sun.xml.txw2:txw2:20110809", "lang": "java", "sha1": "46afa3f3c468680875adb8f2a26086a126c89902", "repository": "https://repo.maven.apache.org/maven2/", "name": "com_sun_xml_txw2_txw2", "actual": "@com_sun_xml_txw2_txw2//jar", "bind": "jar/com/sun/xml/txw2/txw2"})
callback({"artifact": "commons-beanutils:commons-beanutils:1.8.3", "lang": "java", "sha1": "686ef3410bcf4ab8ce7fd0b899e832aaba5facf7", "repository": "https://repo.maven.apache.org/maven2/", "name": "commons_beanutils_commons_beanutils", "actual": "@commons_beanutils_commons_beanutils//jar", "bind": "jar/commons_beanutils/commons_beanutils"})
callback({"artifact": "commons-codec:commons-codec:1.8", "lang": "java", "sha1": "af3be3f74d25fc5163b54f56a0d394b462dafafd", "repository": "https://repo.maven.apache.org/maven2/", "name": "commons_codec_commons_codec", "actual": "@commons_codec_commons_codec//jar", "bind": "jar/commons_codec/commons_codec"})
callback({"artifact": "commons-collections:commons-collections:3.2.2", "lang": "java", "sha1": "8ad72fe39fa8c91eaaf12aadb21e0c3661fe26d5", "repository": "https://repo.maven.apache.org/maven2/", "name": "commons_collections_commons_collections", "actual": "@commons_collections_commons_collections//jar", "bind": "jar/commons_collections/commons_collections"})
callback({"artifact": "commons-digester:commons-digester:2.1", "lang": "java", "sha1": "73a8001e7a54a255eef0f03521ec1805dc738ca0", "repository": "https://repo.maven.apache.org/maven2/", "name": "commons_digester_commons_digester", "actual": "@commons_digester_commons_digester//jar", "bind": "jar/commons_digester/commons_digester"})
callback({"artifact": "commons-discovery:commons-discovery:0.4", "lang": "java", "sha1": "9e3417d3866d9f71e83b959b229b35dc723c7bea", "repository": "https://repo.maven.apache.org/maven2/", "name": "commons_discovery_commons_discovery", "actual": "@commons_discovery_commons_discovery//jar", "bind": "jar/commons_discovery/commons_discovery"})
callback({"artifact": "commons-fileupload:commons-fileupload:1.3.1-jenkins-1", "lang": "java", "sha1": "5d0270b78ad9d5344ce4a8e35482ad8802526aca", "repository": "http://repo.jenkins-ci.org/public/", "name": "commons_fileupload_commons_fileupload", "actual": "@commons_fileupload_commons_fileupload//jar", "bind": "jar/commons_fileupload/commons_fileupload"})
callback({"artifact": "commons-httpclient:commons-httpclient:3.1", "lang": "java", "sha1": "964cd74171f427720480efdec40a7c7f6e58426a", "repository": "https://repo.maven.apache.org/maven2/", "name": "commons_httpclient_commons_httpclient", "actual": "@commons_httpclient_commons_httpclient//jar", "bind": "jar/commons_httpclient/commons_httpclient"})
# duplicates in commons-io:commons-io promoted to 2.5. Versions: 2.4 2.5
callback({"artifact": "commons-io:commons-io:2.5", "lang": "java", "sha1": "2852e6e05fbb95076fc091f6d1780f1f8fe35e0f", "repository": "https://repo.maven.apache.org/maven2/", "name": "commons_io_commons_io", "actual": "@commons_io_commons_io//jar", "bind": "jar/commons_io/commons_io"})
callback({"artifact": "commons-jelly:commons-jelly-tags-fmt:1.0", "lang": "java", "sha1": "2107da38fdd287ab78a4fa65c1300b5ad9999274", "repository": "https://repo.maven.apache.org/maven2/", "name": "commons_jelly_commons_jelly_tags_fmt", "actual": "@commons_jelly_commons_jelly_tags_fmt//jar", "bind": "jar/commons_jelly/commons_jelly_tags_fmt"})
callback({"artifact": "commons-jelly:commons-jelly-tags-xml:1.1", "lang": "java", "sha1": "cc0efc2ae0ff81ef7737afc786a0ce16a8540efc", "repository": "https://repo.maven.apache.org/maven2/", "name": "commons_jelly_commons_jelly_tags_xml", "actual": "@commons_jelly_commons_jelly_tags_xml//jar", "bind": "jar/commons_jelly/commons_jelly_tags_xml"})
callback({"artifact": "commons-lang:commons-lang:2.6", "lang": "java", "sha1": "0ce1edb914c94ebc388f086c6827e8bdeec71ac2", "repository": "https://repo.maven.apache.org/maven2/", "name": "commons_lang_commons_lang", "actual": "@commons_lang_commons_lang//jar", "bind": "jar/commons_lang/commons_lang"})
callback({"artifact": "javax.annotation:javax.annotation-api:1.2", "lang": "java", "sha1": "479c1e06db31c432330183f5cae684163f186146", "repository": "https://repo.maven.apache.org/maven2/", "name": "javax_annotation_javax_annotation_api", "actual": "@javax_annotation_javax_annotation_api//jar", "bind": "jar/javax/annotation/javax_annotation_api"})
callback({"artifact": "javax.inject:javax.inject:1", "lang": "java", "sha1": "6975da39a7040257bd51d21a231b76c915872d38", "repository": "https://repo.maven.apache.org/maven2/", "name": "javax_inject_javax_inject", "actual": "@javax_inject_javax_inject//jar", "bind": "jar/javax/inject/javax_inject"})
callback({"artifact": "javax.mail:mail:1.4.4", "lang": "java", "sha1": "b907ef0a02ff6e809392b1e7149198497fcc8e49", "repository": "https://repo.maven.apache.org/maven2/", "name": "javax_mail_mail", "actual": "@javax_mail_mail//jar", "bind": "jar/javax/mail/mail"})
callback({"artifact": "javax.servlet:jstl:1.1.0", "lang": "java", "sha1": "bca201e52333629c59e459e874e5ecd8f9899e15", "repository": "https://repo.maven.apache.org/maven2/", "name": "javax_servlet_jstl", "actual": "@javax_servlet_jstl//jar", "bind": "jar/javax/servlet/jstl"})
callback({"artifact": "javax.xml.stream:stax-api:1.0-2", "lang": "java", "sha1": "d6337b0de8b25e53e81b922352fbea9f9f57ba0b", "repository": "https://repo.maven.apache.org/maven2/", "name": "javax_xml_stream_stax_api", "actual": "@javax_xml_stream_stax_api//jar", "bind": "jar/javax/xml/stream/stax_api"})
callback({"artifact": "jaxen:jaxen:1.1-beta-11", "lang": "java", "sha1": "81e32b8bafcc778e5deea4e784670299f1c26b96", "repository": "https://repo.maven.apache.org/maven2/", "name": "jaxen_jaxen", "actual": "@jaxen_jaxen//jar", "bind": "jar/jaxen/jaxen"})
callback({"artifact": "jfree:jcommon:1.0.12", "lang": "java", "sha1": "737f02607d2f45bb1a589a85c63b4cd907e5e634", "repository": "https://repo.maven.apache.org/maven2/", "name": "jfree_jcommon", "actual": "@jfree_jcommon//jar", "bind": "jar/jfree/jcommon"})
callback({"artifact": "jfree:jfreechart:1.0.9", "lang": "java", "sha1": "6e522aa603bf7ac69da59edcf519b335490e93a6", "repository": "https://repo.maven.apache.org/maven2/", "name": "jfree_jfreechart", "actual": "@jfree_jfreechart//jar", "bind": "jar/jfree/jfreechart"})
callback({"artifact": "jline:jline:2.12", "lang": "java", "sha1": "ce9062c6a125e0f9ad766032573c041ae8ecc986", "repository": "https://repo.maven.apache.org/maven2/", "name": "jline_jline", "actual": "@jline_jline//jar", "bind": "jar/jline/jline"})
callback({"artifact": "junit:junit:4.12", "lang": "java", "sha1": "2973d150c0dc1fefe998f834810d68f278ea58ec", "repository": "https://repo.maven.apache.org/maven2/", "name": "junit_junit", "actual": "@junit_junit//jar", "bind": "jar/junit/junit"})
callback({"artifact": "net.i2p.crypto:eddsa:0.2.0", "lang": "java", "sha1": "0856a92559c4daf744cb27c93cd8b7eb1f8c4780", "repository": "https://repo.maven.apache.org/maven2/", "name": "net_i2p_crypto_eddsa", "actual": "@net_i2p_crypto_eddsa//jar", "bind": "jar/net/i2p/crypto/eddsa"})
callback({"artifact": "net.java.dev.jna:jna:4.2.1", "lang": "java", "sha1": "fcc5b10cb812c41b00708e7b57baccc3aee5567c", "repository": "https://repo.maven.apache.org/maven2/", "name": "net_java_dev_jna_jna", "actual": "@net_java_dev_jna_jna//jar", "bind": "jar/net/java/dev/jna/jna"})
callback({"artifact": "net.java.sezpoz:sezpoz:1.12", "lang": "java", "sha1": "01f7e4a04e06fdbc91d66ddf80c443c3f7c6503c", "repository": "https://repo.maven.apache.org/maven2/", "name": "net_java_sezpoz_sezpoz", "actual": "@net_java_sezpoz_sezpoz//jar", "bind": "jar/net/java/sezpoz/sezpoz"})
callback({"artifact": "net.sf.ezmorph:ezmorph:1.0.6", "lang": "java", "sha1": "01e55d2a0253ea37745d33062852fd2c90027432", "repository": "https://repo.maven.apache.org/maven2/", "name": "net_sf_ezmorph_ezmorph", "actual": "@net_sf_ezmorph_ezmorph//jar", "bind": "jar/net/sf/ezmorph/ezmorph"})
callback({"artifact": "org.acegisecurity:acegi-security:1.0.7", "lang": "java", "sha1": "72901120d299e0c6ed2f6a23dd37f9186eeb8cc3", "repository": "https://repo.maven.apache.org/maven2/", "name": "org_acegisecurity_acegi_security", "actual": "@org_acegisecurity_acegi_security//jar", "bind": "jar/org/acegisecurity/acegi_security"})
callback({"artifact": "org.apache.ant:ant-launcher:1.8.4", "lang": "java", "sha1": "22f1e0c32a2bfc8edd45520db176bac98cebbbfe", "repository": "https://repo.maven.apache.org/maven2/", "name": "org_apache_ant_ant_launcher", "actual": "@org_apache_ant_ant_launcher//jar", "bind": "jar/org/apache/ant/ant_launcher"})
callback({"artifact": "org.apache.ant:ant:1.8.4", "lang": "java", "sha1": "8acff3fb57e74bc062d4675d9dcfaffa0d524972", "repository": "https://repo.maven.apache.org/maven2/", "name": "org_apache_ant_ant", "actual": "@org_apache_ant_ant//jar", "bind": "jar/org/apache/ant/ant"})
callback({"artifact": "org.apache.commons:commons-compress:1.10", "lang": "java", "sha1": "5eeb27c57eece1faf2d837868aeccc94d84dcc9a", "repository": "https://repo.maven.apache.org/maven2/", "name": "org_apache_commons_commons_compress", "actual": "@org_apache_commons_commons_compress//jar", "bind": "jar/org/apache/commons/commons_compress"})
callback({"artifact": "org.apache.ivy:ivy:2.4.0", "lang": "java", "sha1": "5abe4c24bbe992a9ac07ca563d5bd3e8d569e9ed", "repository": "https://repo.maven.apache.org/maven2/", "name": "org_apache_ivy_ivy", "actual": "@org_apache_ivy_ivy//jar", "bind": "jar/org/apache/ivy/ivy"})
# duplicates in org.codehaus.groovy:groovy-all fixed to 2.4.6. Versions: 2.4.6 2.4.11
callback({"artifact": "org.codehaus.groovy:groovy-all:2.4.6", "lang": "java", "sha1": "478feadca929a946b2f1fb962bb2179264759821", "repository": "https://repo.maven.apache.org/maven2/", "name": "org_codehaus_groovy_groovy_all", "actual": "@org_codehaus_groovy_groovy_all//jar", "bind": "jar/org/codehaus/groovy/groovy_all"})
callback({"artifact": "org.codehaus.woodstox:wstx-asl:3.2.9", "lang": "java", "sha1": "c82b6e8f225bb799540e558b10ee24d268035597", "repository": "https://repo.maven.apache.org/maven2/", "name": "org_codehaus_woodstox_wstx_asl", "actual": "@org_codehaus_woodstox_wstx_asl//jar", "bind": "jar/org/codehaus/woodstox/wstx_asl"})
callback({"artifact": "org.connectbot.jbcrypt:jbcrypt:1.0.0", "lang": "java", "sha1": "f37bba2b8b78fcc8111bb932318b621dcc6c5194", "repository": "http://repo.jenkins-ci.org/public/", "name": "org_connectbot_jbcrypt_jbcrypt", "actual": "@org_connectbot_jbcrypt_jbcrypt//jar", "bind": "jar/org/connectbot/jbcrypt/jbcrypt"})
callback({"artifact": "org.fusesource.jansi:jansi:1.11", "lang": "java", "sha1": "655c643309c2f45a56a747fda70e3fadf57e9f11", "repository": "https://repo.maven.apache.org/maven2/", "name": "org_fusesource_jansi_jansi", "actual": "@org_fusesource_jansi_jansi//jar", "bind": "jar/org/fusesource/jansi/jansi"})
callback({"artifact": "org.hamcrest:hamcrest-all:1.3", "lang": "java", "sha1": "63a21ebc981131004ad02e0434e799fd7f3a8d5a", "repository": "https://repo.maven.apache.org/maven2/", "name": "org_hamcrest_hamcrest_all", "actual": "@org_hamcrest_hamcrest_all//jar", "bind": "jar/org/hamcrest/hamcrest_all"})
callback({"artifact": "org.hamcrest:hamcrest-core:1.3", "lang": "java", "sha1": "42a25dc3219429f0e5d060061f71acb49bf010a0", "repository": "https://repo.maven.apache.org/maven2/", "name": "org_hamcrest_hamcrest_core", "actual": "@org_hamcrest_hamcrest_core//jar", "bind": "jar/org/hamcrest/hamcrest_core"})
callback({"artifact": "org.jboss.marshalling:jboss-marshalling-river:1.4.9.Final", "lang": "java", "sha1": "d41e3e1ed9cf4afd97d19df8ecc7f2120effeeb4", "repository": "https://repo.maven.apache.org/maven2/", "name": "org_jboss_marshalling_jboss_marshalling_river", "actual": "@org_jboss_marshalling_jboss_marshalling_river//jar", "bind": "jar/org/jboss/marshalling/jboss_marshalling_river"})
callback({"artifact": "org.jboss.marshalling:jboss-marshalling:1.4.9.Final", "lang": "java", "sha1": "8fd342ee3dde0448c7600275a936ea1b17deb494", "repository": "https://repo.maven.apache.org/maven2/", "name": "org_jboss_marshalling_jboss_marshalling", "actual": "@org_jboss_marshalling_jboss_marshalling//jar", "bind": "jar/org/jboss/marshalling/jboss_marshalling"})
callback({"artifact": "org.jenkins-ci.dom4j:dom4j:1.6.1-jenkins-4", "lang": "java", "sha1": "9a370b2010b5a1223c7a43dae6c05226918e17b1", "repository": "http://repo.jenkins-ci.org/public/", "name": "org_jenkins_ci_dom4j_dom4j", "actual": "@org_jenkins_ci_dom4j_dom4j//jar", "bind": "jar/org/jenkins_ci/dom4j/dom4j"})
callback({"artifact": "org.jenkins-ci.main:cli:2.73.1", "lang": "java", "sha1": "03ae1decd36ee069108e66e70cd6ffcdd4320aec", "repository": "http://repo.jenkins-ci.org/public/", "name": "org_jenkins_ci_main_cli", "actual": "@org_jenkins_ci_main_cli//jar", "bind": "jar/org/jenkins_ci/main/cli"})
callback({"artifact": "org.jenkins-ci.main:jenkins-core:2.73.1", "lang": "java", "sha1": "30c9e7029d46fd18a8720f9a491bf41ab8f2bdb2", "repository": "http://repo.jenkins-ci.org/public/", "name": "org_jenkins_ci_main_jenkins_core", "actual": "@org_jenkins_ci_main_jenkins_core//jar", "bind": "jar/org/jenkins_ci/main/jenkins_core"})
callback({"artifact": "org.jenkins-ci.main:remoting:3.10", "lang": "java", "sha1": "19905fa1550ab34a33bb92a5e27e2a86733c9d15", "repository": "http://repo.jenkins-ci.org/public/", "name": "org_jenkins_ci_main_remoting", "actual": "@org_jenkins_ci_main_remoting//jar", "bind": "jar/org/jenkins_ci/main/remoting"})
callback({"artifact": "org.jenkins-ci.plugins.icon-shim:icon-set:1.0.5", "lang": "java", "sha1": "dedc76ac61797dafc66f31e8507d65b98c9e57df", "repository": "http://repo.jenkins-ci.org/public/", "name": "org_jenkins_ci_plugins_icon_shim_icon_set", "actual": "@org_jenkins_ci_plugins_icon_shim_icon_set//jar", "bind": "jar/org/jenkins_ci/plugins/icon_shim/icon_set"})
callback({"artifact": "org.jenkins-ci.plugins.workflow:workflow-api:2.11", "lang": "java", "sha1": "3a8a6e221a8b32fd9faabb33939c28f79fd961d7", "repository": "http://repo.jenkins-ci.org/public/", "name": "org_jenkins_ci_plugins_workflow_workflow_api", "actual": "@org_jenkins_ci_plugins_workflow_workflow_api//jar", "bind": "jar/org/jenkins_ci/plugins/workflow/workflow_api"})
callback({"artifact": "org.jenkins-ci.plugins.workflow:workflow-step-api:2.9", "lang": "java", "sha1": "7d1ad140c092cf4a68a7763db9eac459b5ed86ff", "repository": "http://repo.jenkins-ci.org/public/", "name": "org_jenkins_ci_plugins_workflow_workflow_step_api", "actual": "@org_jenkins_ci_plugins_workflow_workflow_step_api//jar", "bind": "jar/org/jenkins_ci/plugins/workflow/workflow_step_api"})
callback({"artifact": "org.jenkins-ci.plugins.workflow:workflow-support:2.14", "lang": "java", "sha1": "cd5f68c533ddd46fea3332ce788dffc80707ddb5", "repository": "http://repo.jenkins-ci.org/public/", "name": "org_jenkins_ci_plugins_workflow_workflow_support", "actual": "@org_jenkins_ci_plugins_workflow_workflow_support//jar", "bind": "jar/org/jenkins_ci/plugins/workflow/workflow_support"})
callback({"artifact": "org.jenkins-ci.plugins:script-security:1.26", "lang": "java", "sha1": "44aacd104c0d5c8fe5d0f93e4a4001cae0e48c2b", "repository": "http://repo.jenkins-ci.org/public/", "name": "org_jenkins_ci_plugins_script_security", "actual": "@org_jenkins_ci_plugins_script_security//jar", "bind": "jar/org/jenkins_ci/plugins/script_security"})
callback({"artifact": "org.jenkins-ci.plugins:structs:1.5", "lang": "java", "sha1": "72d429f749151f1c983c1fadcb348895cc6da20e", "repository": "http://repo.jenkins-ci.org/public/", "name": "org_jenkins_ci_plugins_structs", "actual": "@org_jenkins_ci_plugins_structs//jar", "bind": "jar/org/jenkins_ci/plugins/structs"})
# duplicates in org.jenkins-ci:annotation-indexer promoted to 1.12. Versions: 1.9 1.12
callback({"artifact": "org.jenkins-ci:annotation-indexer:1.12", "lang": "java", "sha1": "8f6ee0cd64c305dcca29e2f5b46631d50890208f", "repository": "http://repo.jenkins-ci.org/public/", "name": "org_jenkins_ci_annotation_indexer", "actual": "@org_jenkins_ci_annotation_indexer//jar", "bind": "jar/org/jenkins_ci/annotation_indexer"})
callback({"artifact": "org.jenkins-ci:bytecode-compatibility-transformer:1.8", "lang": "java", "sha1": "aded88ffe12f1904758397f96f16957e97b88e6e", "repository": "http://repo.jenkins-ci.org/public/", "name": "org_jenkins_ci_bytecode_compatibility_transformer", "actual": "@org_jenkins_ci_bytecode_compatibility_transformer//jar", "bind": "jar/org/jenkins_ci/bytecode_compatibility_transformer"})
callback({"artifact": "org.jenkins-ci:commons-jelly:1.1-jenkins-20120928", "lang": "java", "sha1": "2720a0d54b7f32479b08970d7738041362e1f410", "repository": "http://repo.jenkins-ci.org/public/", "name": "org_jenkins_ci_commons_jelly", "actual": "@org_jenkins_ci_commons_jelly//jar", "bind": "jar/org/jenkins_ci/commons_jelly"})
callback({"artifact": "org.jenkins-ci:commons-jexl:1.1-jenkins-20111212", "lang": "java", "sha1": "0a990a77bea8c5a400d58a6f5d98122236300f7d", "repository": "http://repo.jenkins-ci.org/public/", "name": "org_jenkins_ci_commons_jexl", "actual": "@org_jenkins_ci_commons_jexl//jar", "bind": "jar/org/jenkins_ci/commons_jexl"})
callback({"artifact": "org.jenkins-ci:constant-pool-scanner:1.2", "lang": "java", "sha1": "e5e0b7c7fcb67767dbd195e0ca1f0ee9406dd423", "repository": "https://repo.maven.apache.org/maven2/", "name": "org_jenkins_ci_constant_pool_scanner", "actual": "@org_jenkins_ci_constant_pool_scanner//jar", "bind": "jar/org/jenkins_ci/constant_pool_scanner"})
callback({"artifact": "org.jenkins-ci:crypto-util:1.1", "lang": "java", "sha1": "3a199a4c3748012b9dbbf3080097dc9f302493d8", "repository": "http://repo.jenkins-ci.org/public/", "name": "org_jenkins_ci_crypto_util", "actual": "@org_jenkins_ci_crypto_util//jar", "bind": "jar/org/jenkins_ci/crypto_util"})
callback({"artifact": "org.jenkins-ci:jmdns:3.4.0-jenkins-3", "lang": "java", "sha1": "264d0c402b48c365f34d072b864ed57f25e92e63", "repository": "http://repo.jenkins-ci.org/public/", "name": "org_jenkins_ci_jmdns", "actual": "@org_jenkins_ci_jmdns//jar", "bind": "jar/org/jenkins_ci/jmdns"})
callback({"artifact": "org.jenkins-ci:memory-monitor:1.9", "lang": "java", "sha1": "1935bfb46474e3043ee2310a9bb790d42dde2ed7", "repository": "http://repo.jenkins-ci.org/public/", "name": "org_jenkins_ci_memory_monitor", "actual": "@org_jenkins_ci_memory_monitor//jar", "bind": "jar/org/jenkins_ci/memory_monitor"})
# duplicates in org.jenkins-ci:symbol-annotation promoted to 1.5. Versions: 1.1 1.5
callback({"artifact": "org.jenkins-ci:symbol-annotation:1.5", "lang": "java", "sha1": "17694feb24cb69793914d0c1c11ff479ee4c1b38", "repository": "http://repo.jenkins-ci.org/public/", "name": "org_jenkins_ci_symbol_annotation", "actual": "@org_jenkins_ci_symbol_annotation//jar", "bind": "jar/org/jenkins_ci/symbol_annotation"})
callback({"artifact": "org.jenkins-ci:task-reactor:1.4", "lang": "java", "sha1": "b89e501a3bc64fe9f28cb91efe75ed8745974ef8", "repository": "http://repo.jenkins-ci.org/public/", "name": "org_jenkins_ci_task_reactor", "actual": "@org_jenkins_ci_task_reactor//jar", "bind": "jar/org/jenkins_ci/task_reactor"})
callback({"artifact": "org.jenkins-ci:trilead-ssh2:build-217-jenkins-11", "lang": "java", "sha1": "f10f4dd4121cc233cac229c51adb4775960fee0a", "repository": "http://repo.jenkins-ci.org/public/", "name": "org_jenkins_ci_trilead_ssh2", "actual": "@org_jenkins_ci_trilead_ssh2//jar", "bind": "jar/org/jenkins_ci/trilead_ssh2"})
callback({"artifact": "org.jenkins-ci:version-number:1.4", "lang": "java", "sha1": "5d0f2ea16514c0ec8de86c102ce61a7837e45eb8", "repository": "http://repo.jenkins-ci.org/public/", "name": "org_jenkins_ci_version_number", "actual": "@org_jenkins_ci_version_number//jar", "bind": "jar/org/jenkins_ci/version_number"})
callback({"artifact": "org.jruby.ext.posix:jna-posix:1.0.3-jenkins-1", "lang": "java", "sha1": "fb1148cc8192614ec1418d414f7b6026cc0ec71b", "repository": "http://repo.jenkins-ci.org/public/", "name": "org_jruby_ext_posix_jna_posix", "actual": "@org_jruby_ext_posix_jna_posix//jar", "bind": "jar/org/jruby/ext/posix/jna_posix"})
callback({"artifact": "org.jvnet.hudson:activation:1.1.1-hudson-1", "lang": "java", "sha1": "7957d80444223277f84676aabd5b0421b65888c4", "repository": "https://repo.maven.apache.org/maven2/", "name": "org_jvnet_hudson_activation", "actual": "@org_jvnet_hudson_activation//jar", "bind": "jar/org/jvnet/hudson/activation"})
callback({"artifact": "org.jvnet.hudson:commons-jelly-tags-define:1.0.1-hudson-20071021", "lang": "java", "sha1": "8b952d0e504ee505d234853119e5648441894234", "repository": "https://repo.maven.apache.org/maven2/", "name": "org_jvnet_hudson_commons_jelly_tags_define", "actual": "@org_jvnet_hudson_commons_jelly_tags_define//jar", "bind": "jar/org/jvnet/hudson/commons_jelly_tags_define"})
callback({"artifact": "org.jvnet.hudson:jtidy:4aug2000r7-dev-hudson-1", "lang": "java", "sha1": "ad8553d0acfa6e741d21d5b2c2beb737972ab7c7", "repository": "https://repo.maven.apache.org/maven2/", "name": "org_jvnet_hudson_jtidy", "actual": "@org_jvnet_hudson_jtidy//jar", "bind": "jar/org/jvnet/hudson/jtidy"})
callback({"artifact": "org.jvnet.hudson:xstream:1.4.7-jenkins-1", "lang": "java", "sha1": "161ed1603117c2d37b864f81a0d62f36cf7e958a", "repository": "http://repo.jenkins-ci.org/public/", "name": "org_jvnet_hudson_xstream", "actual": "@org_jvnet_hudson_xstream//jar", "bind": "jar/org/jvnet/hudson/xstream"})
callback({"artifact": "org.jvnet.localizer:localizer:1.24", "lang": "java", "sha1": "e20e7668dbf36e8d354dab922b89adb6273b703f", "repository": "http://repo.jenkins-ci.org/public/", "name": "org_jvnet_localizer_localizer", "actual": "@org_jvnet_localizer_localizer//jar", "bind": "jar/org/jvnet/localizer/localizer"})
callback({"artifact": "org.jvnet.robust-http-client:robust-http-client:1.2", "lang": "java", "sha1": "dee9fda92ad39a94a77ec6cf88300d4dd6db8a4d", "repository": "http://repo.jenkins-ci.org/public/", "name": "org_jvnet_robust_http_client_robust_http_client", "actual": "@org_jvnet_robust_http_client_robust_http_client//jar", "bind": "jar/org/jvnet/robust_http_client/robust_http_client"})
callback({"artifact": "org.jvnet.winp:winp:1.25", "lang": "java", "sha1": "1c88889f80c0e03a7fb62c26b706d68813f8e657", "repository": "https://repo.maven.apache.org/maven2/", "name": "org_jvnet_winp_winp", "actual": "@org_jvnet_winp_winp//jar", "bind": "jar/org/jvnet/winp/winp"})
callback({"artifact": "org.jvnet:tiger-types:2.2", "lang": "java", "sha1": "7ddc6bbc8ca59be8879d3a943bf77517ec190f39", "repository": "http://repo.jenkins-ci.org/public/", "name": "org_jvnet_tiger_types", "actual": "@org_jvnet_tiger_types//jar", "bind": "jar/org/jvnet/tiger_types"})
callback({"artifact": "org.kohsuke.jinterop:j-interop:2.0.6-kohsuke-1", "lang": "java", "sha1": "b2e243227608c1424ab0084564dc71659d273007", "repository": "https://repo.maven.apache.org/maven2/", "name": "org_kohsuke_jinterop_j_interop", "actual": "@org_kohsuke_jinterop_j_interop//jar", "bind": "jar/org/kohsuke/jinterop/j_interop"})
callback({"artifact": "org.kohsuke.jinterop:j-interopdeps:2.0.6-kohsuke-1", "lang": "java", "sha1": "778400517a3419ce8c361498c194036534851736", "repository": "https://repo.maven.apache.org/maven2/", "name": "org_kohsuke_jinterop_j_interopdeps", "actual": "@org_kohsuke_jinterop_j_interopdeps//jar", "bind": "jar/org/kohsuke/jinterop/j_interopdeps"})
callback({"artifact": "org.kohsuke.stapler:json-lib:2.4-jenkins-2", "lang": "java", "sha1": "7f4f9016d8c8b316ecbe68afe7c26df06d301366", "repository": "http://repo.jenkins-ci.org/public/", "name": "org_kohsuke_stapler_json_lib", "actual": "@org_kohsuke_stapler_json_lib//jar", "bind": "jar/org/kohsuke/stapler/json_lib"})
callback({"artifact": "org.kohsuke.stapler:stapler-adjunct-codemirror:1.3", "lang": "java", "sha1": "fd1d45544400d2a4da6dfee9e60edd4ec3368806", "repository": "http://repo.jenkins-ci.org/public/", "name": "org_kohsuke_stapler_stapler_adjunct_codemirror", "actual": "@org_kohsuke_stapler_stapler_adjunct_codemirror//jar", "bind": "jar/org/kohsuke/stapler/stapler_adjunct_codemirror"})
callback({"artifact": "org.kohsuke.stapler:stapler-adjunct-timeline:1.5", "lang": "java", "sha1": "3fa806cbb94679ceab9c1ecaaf5fea8207390cb7", "repository": "https://repo.maven.apache.org/maven2/", "name": "org_kohsuke_stapler_stapler_adjunct_timeline", "actual": "@org_kohsuke_stapler_stapler_adjunct_timeline//jar", "bind": "jar/org/kohsuke/stapler/stapler_adjunct_timeline"})
callback({"artifact": "org.kohsuke.stapler:stapler-adjunct-zeroclipboard:1.3.5-1", "lang": "java", "sha1": "20184ea79888b55b6629e4479615b52f88b55173", "repository": "https://repo.maven.apache.org/maven2/", "name": "org_kohsuke_stapler_stapler_adjunct_zeroclipboard", "actual": "@org_kohsuke_stapler_stapler_adjunct_zeroclipboard//jar", "bind": "jar/org/kohsuke/stapler/stapler_adjunct_zeroclipboard"})
callback({"artifact": "org.kohsuke.stapler:stapler-groovy:1.250", "lang": "java", "sha1": "a8b910923b8eef79dd99c8aa6418d8ada0de4c86", "repository": "https://repo.maven.apache.org/maven2/", "name": "org_kohsuke_stapler_stapler_groovy", "actual": "@org_kohsuke_stapler_stapler_groovy//jar", "bind": "jar/org/kohsuke/stapler/stapler_groovy"})
callback({"artifact": "org.kohsuke.stapler:stapler-jelly:1.250", "lang": "java", "sha1": "6ac2202bf40e48a63623803697cd1801ee716273", "repository": "https://repo.maven.apache.org/maven2/", "name": "org_kohsuke_stapler_stapler_jelly", "actual": "@org_kohsuke_stapler_stapler_jelly//jar", "bind": "jar/org/kohsuke/stapler/stapler_jelly"})
callback({"artifact": "org.kohsuke.stapler:stapler-jrebel:1.250", "lang": "java", "sha1": "b6f10cb14cf3462f5a51d03a7a00337052355c8c", "repository": "https://repo.maven.apache.org/maven2/", "name": "org_kohsuke_stapler_stapler_jrebel", "actual": "@org_kohsuke_stapler_stapler_jrebel//jar", "bind": "jar/org/kohsuke/stapler/stapler_jrebel"})
callback({"artifact": "org.kohsuke.stapler:stapler:1.250", "lang": "java", "sha1": "d5afb2c46a2919d22e5bc3adccf5f09fbb0fb4e3", "repository": "https://repo.maven.apache.org/maven2/", "name": "org_kohsuke_stapler_stapler", "actual": "@org_kohsuke_stapler_stapler//jar", "bind": "jar/org/kohsuke/stapler/stapler"})
callback({"artifact": "org.kohsuke:access-modifier-annotation:1.11", "lang": "java", "sha1": "d1ca3a10d8be91d1525f51dbc6a3c7644e0fc6ea", "repository": "https://repo.maven.apache.org/maven2/", "name": "org_kohsuke_access_modifier_annotation", "actual": "@org_kohsuke_access_modifier_annotation//jar", "bind": "jar/org/kohsuke/access_modifier_annotation"})
callback({"artifact": "org.kohsuke:akuma:1.10", "lang": "java", "sha1": "0e2c6a1f79f17e3fab13332ab8e9b9016eeab0b6", "repository": "https://repo.maven.apache.org/maven2/", "name": "org_kohsuke_akuma", "actual": "@org_kohsuke_akuma//jar", "bind": "jar/org/kohsuke/akuma"})
callback({"artifact": "org.kohsuke:asm5:5.0.1", "lang": "java", "sha1": "71ab0620a41ed37f626b96d80c2a7c58165550df", "repository": "https://repo.maven.apache.org/maven2/", "name": "org_kohsuke_asm5", "actual": "@org_kohsuke_asm5//jar", "bind": "jar/org/kohsuke/asm5"})
callback({"artifact": "org.kohsuke:groovy-sandbox:1.10", "lang": "java", "sha1": "f4f33a2122cca74ce8beaaf6a3c5ab9c8644d977", "repository": "https://repo.maven.apache.org/maven2/", "name": "org_kohsuke_groovy_sandbox", "actual": "@org_kohsuke_groovy_sandbox//jar", "bind": "jar/org/kohsuke/groovy_sandbox"})
callback({"artifact": "org.kohsuke:libpam4j:1.8", "lang": "java", "sha1": "548d4a1177adad8242fe03a6930c335669d669ad", "repository": "https://repo.maven.apache.org/maven2/", "name": "org_kohsuke_libpam4j", "actual": "@org_kohsuke_libpam4j//jar", "bind": "jar/org/kohsuke/libpam4j"})
callback({"artifact": "org.kohsuke:libzfs:0.8", "lang": "java", "sha1": "5bb311276283921f7e1082c348c0253b17922dcc", "repository": "https://repo.maven.apache.org/maven2/", "name": "org_kohsuke_libzfs", "actual": "@org_kohsuke_libzfs//jar", "bind": "jar/org/kohsuke/libzfs"})
callback({"artifact": "org.kohsuke:trilead-putty-extension:1.2", "lang": "java", "sha1": "0f2f41517e1f73be8e319da27a69e0dc0c524bf6", "repository": "https://repo.maven.apache.org/maven2/", "name": "org_kohsuke_trilead_putty_extension", "actual": "@org_kohsuke_trilead_putty_extension//jar", "bind": "jar/org/kohsuke/trilead_putty_extension"})
callback({"artifact": "org.kohsuke:windows-package-checker:1.2", "lang": "java", "sha1": "86b5d2f9023633808d65dbcfdfd50dc5ad3ca31f", "repository": "https://repo.maven.apache.org/maven2/", "name": "org_kohsuke_windows_package_checker", "actual": "@org_kohsuke_windows_package_checker//jar", "bind": "jar/org/kohsuke/windows_package_checker"})
callback({"artifact": "org.mindrot:jbcrypt:0.4", "lang": "java", "sha1": "af7e61017f73abb18ac4e036954f9f28c6366c07", "repository": "https://repo.maven.apache.org/maven2/", "name": "org_mindrot_jbcrypt", "actual": "@org_mindrot_jbcrypt//jar", "bind": "jar/org/mindrot/jbcrypt"})
callback({"artifact": "org.ow2.asm:asm-analysis:5.0.3", "lang": "java", "sha1": "c7126aded0e8e13fed5f913559a0dd7b770a10f3", "repository": "https://repo.maven.apache.org/maven2/", "name": "org_ow2_asm_asm_analysis", "actual": "@org_ow2_asm_asm_analysis//jar", "bind": "jar/org/ow2/asm/asm_analysis"})
callback({"artifact": "org.ow2.asm:asm-commons:5.0.3", "lang": "java", "sha1": "a7111830132c7f87d08fe48cb0ca07630f8cb91c", "repository": "https://repo.maven.apache.org/maven2/", "name": "org_ow2_asm_asm_commons", "actual": "@org_ow2_asm_asm_commons//jar", "bind": "jar/org/ow2/asm/asm_commons"})
callback({"artifact": "org.ow2.asm:asm-tree:5.0.3", "lang": "java", "sha1": "287749b48ba7162fb67c93a026d690b29f410bed", "repository": "https://repo.maven.apache.org/maven2/", "name": "org_ow2_asm_asm_tree", "actual": "@org_ow2_asm_asm_tree//jar", "bind": "jar/org/ow2/asm/asm_tree"})
callback({"artifact": "org.ow2.asm:asm-util:5.0.3", "lang": "java", "sha1": "1512e5571325854b05fb1efce1db75fcced54389", "repository": "https://repo.maven.apache.org/maven2/", "name": "org_ow2_asm_asm_util", "actual": "@org_ow2_asm_asm_util//jar", "bind": "jar/org/ow2/asm/asm_util"})
callback({"artifact": "org.ow2.asm:asm:5.0.3", "lang": "java", "sha1": "dcc2193db20e19e1feca8b1240dbbc4e190824fa", "repository": "https://repo.maven.apache.org/maven2/", "name": "org_ow2_asm_asm", "actual": "@org_ow2_asm_asm//jar", "bind": "jar/org/ow2/asm/asm"})
callback({"artifact": "org.samba.jcifs:jcifs:1.3.17-kohsuke-1", "lang": "java", "sha1": "6c9114dc4075277d829ea09e15d6ffab52f2d0c0", "repository": "http://repo.jenkins-ci.org/public/", "name": "org_samba_jcifs_jcifs", "actual": "@org_samba_jcifs_jcifs//jar", "bind": "jar/org/samba/jcifs/jcifs"})
callback({"artifact": "org.slf4j:jcl-over-slf4j:1.7.7", "lang": "java", "sha1": "56003dcd0a31deea6391b9e2ef2f2dc90b205a92", "repository": "https://repo.maven.apache.org/maven2/", "name": "org_slf4j_jcl_over_slf4j", "actual": "@org_slf4j_jcl_over_slf4j//jar", "bind": "jar/org/slf4j/jcl_over_slf4j"})
callback({"artifact": "org.slf4j:log4j-over-slf4j:1.7.7", "lang": "java", "sha1": "d521cb26a9c4407caafcec302e7804b048b07cea", "repository": "https://repo.maven.apache.org/maven2/", "name": "org_slf4j_log4j_over_slf4j", "actual": "@org_slf4j_log4j_over_slf4j//jar", "bind": "jar/org/slf4j/log4j_over_slf4j"})
callback({"artifact": "org.slf4j:slf4j-api:1.7.7", "lang": "java", "sha1": "2b8019b6249bb05d81d3a3094e468753e2b21311", "repository": "https://repo.maven.apache.org/maven2/", "name": "org_slf4j_slf4j_api", "actual": "@org_slf4j_slf4j_api//jar", "bind": "jar/org/slf4j/slf4j_api"})
callback({"artifact": "org.springframework:spring-aop:2.5.6.SEC03", "lang": "java", "sha1": "6468695557500723a18630b712ce112ec58827c1", "repository": "https://repo.maven.apache.org/maven2/", "name": "org_springframework_spring_aop", "actual": "@org_springframework_spring_aop//jar", "bind": "jar/org/springframework/spring_aop"})
callback({"artifact": "org.springframework:spring-beans:2.5.6.SEC03", "lang": "java", "sha1": "79b2c86ff12c21b2420b4c46dca51f0e58762aae", "repository": "https://repo.maven.apache.org/maven2/", "name": "org_springframework_spring_beans", "actual": "@org_springframework_spring_beans//jar", "bind": "jar/org/springframework/spring_beans"})
callback({"artifact": "org.springframework:spring-context-support:2.5.6.SEC03", "lang": "java", "sha1": "edf496f4ce066edc6b212e0e5521cb11ff97d55e", "repository": "https://repo.maven.apache.org/maven2/", "name": "org_springframework_spring_context_support", "actual": "@org_springframework_spring_context_support//jar", "bind": "jar/org/springframework/spring_context_support"})
callback({"artifact": "org.springframework:spring-context:2.5.6.SEC03", "lang": "java", "sha1": "5f1c24b26308afedc48a90a1fe2ed334a6475921", "repository": "https://repo.maven.apache.org/maven2/", "name": "org_springframework_spring_context", "actual": "@org_springframework_spring_context//jar", "bind": "jar/org/springframework/spring_context"})
callback({"artifact": "org.springframework:spring-core:2.5.6.SEC03", "lang": "java", "sha1": "644a23805a7ea29903bde0ccc1cd1a8b5f0432d6", "repository": "https://repo.maven.apache.org/maven2/", "name": "org_springframework_spring_core", "actual": "@org_springframework_spring_core//jar", "bind": "jar/org/springframework/spring_core"})
callback({"artifact": "org.springframework:spring-dao:1.2.9", "lang": "java", "sha1": "6f90baf86fc833cac3c677a8f35d3333ed86baea", "repository": "https://repo.maven.apache.org/maven2/", "name": "org_springframework_spring_dao", "actual": "@org_springframework_spring_dao//jar", "bind": "jar/org/springframework/spring_dao"})
callback({"artifact": "org.springframework:spring-jdbc:1.2.9", "lang": "java", "sha1": "8a81d42995e61e2deac49c2bc75cfacbb28e7218", "repository": "https://repo.maven.apache.org/maven2/", "name": "org_springframework_spring_jdbc", "actual": "@org_springframework_spring_jdbc//jar", "bind": "jar/org/springframework/spring_jdbc"})
callback({"artifact": "org.springframework:spring-web:2.5.6.SEC03", "lang": "java", "sha1": "699f171339f20126f1d09dde2dd17d6db2943fce", "repository": "https://repo.maven.apache.org/maven2/", "name": "org_springframework_spring_web", "actual": "@org_springframework_spring_web//jar", "bind": "jar/org/springframework/spring_web"})
callback({"artifact": "org.springframework:spring-webmvc:2.5.6.SEC03", "lang": "java", "sha1": "275c5ac6ade12819f49e984c8e06b114a4e23458", "repository": "https://repo.maven.apache.org/maven2/", "name": "org_springframework_spring_webmvc", "actual": "@org_springframework_spring_webmvc//jar", "bind": "jar/org/springframework/spring_webmvc"})
callback({"artifact": "oro:oro:2.0.8", "lang": "java", "sha1": "5592374f834645c4ae250f4c9fbb314c9369d698", "repository": "https://repo.maven.apache.org/maven2/", "name": "oro_oro", "actual": "@oro_oro//jar", "bind": "jar/oro/oro"})
callback({"artifact": "relaxngDatatype:relaxngDatatype:20020414", "lang": "java", "sha1": "de7952cecd05b65e0e4370cc93fc03035175eef5", "repository": "https://repo.maven.apache.org/maven2/", "name": "relaxngDatatype_relaxngDatatype", "actual": "@relaxngDatatype_relaxngDatatype//jar", "bind": "jar/relaxngDatatype/relaxngDatatype"})
callback({"artifact": "stax:stax-api:1.0.1", "lang": "java", "sha1": "49c100caf72d658aca8e58bd74a4ba90fa2b0d70", "repository": "https://repo.maven.apache.org/maven2/", "name": "stax_stax_api", "actual": "@stax_stax_api//jar", "bind": "jar/stax/stax_api"})
callback({"artifact": "xpp3:xpp3:1.1.4c", "lang": "java", "sha1": "9b988ea84b9e4e9f1874e390ce099b8ac12cfff5", "repository": "https://repo.maven.apache.org/maven2/", "name": "xpp3_xpp3", "actual": "@xpp3_xpp3//jar", "bind": "jar/xpp3/xpp3"})
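The entries above read as calls to a `callback` supplied by whatever macro loads this generated dependency list (the format resembles bazel-deps output). A minimal sketch of how such a consumer might collect the entries — all names below are illustrative assumptions, not the real Bazel loading machinery:

```python
# Hypothetical consumer for the generated entries above; the actual
# macro that bazel-deps pairs with this file differs in detail.
dependencies = {}

def callback(entry):
    # Index each dependency by its workspace name so that one
    # external-repository rule can later be emitted per entry.
    dependencies[entry["name"]] = entry

callback({"artifact": "junit:junit:4.12", "lang": "java",
          "sha1": "2973d150c0dc1fefe998f834810d68f278ea58ec",
          "repository": "https://repo.maven.apache.org/maven2/",
          "name": "junit_junit", "actual": "@junit_junit//jar",
          "bind": "jar/junit/junit"})

# A Maven coordinate splits into group:artifact:version.
group, artifact, version = dependencies["junit_junit"]["artifact"].split(":")
```

The `bind`/`actual` pair is consistent with Bazel's `bind` rule, which maps a label like `//external:jar/junit/junit` onto the concrete target `@junit_junit//jar`.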
# gserver.py (repo: skupriienko/mini2, license: MIT)
from gevent.pywsgi import WSGIServer
from webapp import create_app
app = create_app('webapp.config.ProdConfig')
server = WSGIServer(('', 80), app)
server.serve_forever()
# python/py_refresh/lambda_functions.py (repo: star-junk/references, license: MIT)
def double(num):
return num * 2
# way 1
multiply = lambda x,y: x*y
print(multiply(5,10))
# way 2
print((lambda x,y: x+y)(6, 82))
numbers = [23, 73, 62, 3]
added = [ x*2 for x in numbers]
print(added)
added = [double(x) for x in numbers]
print(added)
added = [ (lambda x: x*2)(x) for x in numbers]
print(added)
added = map(double, numbers)
print(added)
print(list(added))
added = list(map(lambda x:x*2, numbers))
print(added)
# docs/examples/model_config_smart_union_on.py (repo: fictorial/pydantic, license: MIT)
from typing import Union
from pydantic import BaseModel
class Foo(BaseModel):
pass
class Bar(BaseModel):
pass
class Model(BaseModel):
x: Union[str, int]
y: Union[Foo, Bar]
class Config:
smart_union = True
print(Model(x=1, y=Bar()))
| 11.73913 | 30 | 0.648148 | 38 | 270 | 4.578947 | 0.526316 | 0.149425 | 0.206897 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004926 | 0.248148 | 270 | 22 | 31 | 12.272727 | 0.852217 | 0 | 0 | 0.166667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.166667 | 0.166667 | 0 | 0.666667 | 0.083333 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 3 |
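The `smart_union` example above is easier to appreciate with a rough pure-Python sketch of the difference it makes. The helper names below are hypothetical, and this is not pydantic's actual implementation — only an illustration of the validation-order problem `smart_union` addresses:

```python
def coerce_left_to_right(value, types):
    """Default (non-smart) union validation: try each member in order,
    coercing the value to the first type that accepts it."""
    for t in types:
        try:
            return t(value)
        except (TypeError, ValueError):
            continue
    raise ValueError(f"{value!r} fits none of {types}")

def smart_match(value, types):
    """Smart union: prefer the member whose type matches exactly,
    falling back to left-to-right coercion otherwise."""
    for t in types:
        if type(value) is t:
            return value
    return coerce_left_to_right(value, types)

# With x: Union[str, int], plain left-to-right validation coerces 1 to "1",
# while the smart variant keeps it as an int.
print(coerce_left_to_right(1, (str, int)))  # '1'
print(smart_match(1, (str, int)))           # 1
```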
6218e54fff8faeb133835ea2bec246b407fcb7a0 | 8,247 | py | Python | generator/article_devops_cloud.py | david-salac/itblog.github.io | 2f8be9f7ab058ae196bc46d0cde4d75d936c7e86 | [
"MIT"
] | 1 | 2020-11-23T09:36:25.000Z | 2020-11-23T09:36:25.000Z | generator/article_devops_cloud.py | david-salac/itblog.github.io | 2f8be9f7ab058ae196bc46d0cde4d75d936c7e86 | [
"MIT"
] | null | null | null | generator/article_devops_cloud.py | david-salac/itblog.github.io | 2f8be9f7ab058ae196bc46d0cde4d75d936c7e86 | [
"MIT"
] | null | null | null | # More about some useful concepts in Python language
import datetime
import crinita as cr
lead = """There are many common challenges related to systems that process large data sets (like big netCDF or GRIB files). One of the most important decisions is whether to deploy infrastructure on some cloud service or whether an on-premises solution is better. If the cloud appeals to you, other challenges arise: which provider would be the best one, and how to deploy - using infrastructure as code. """
content = """There are many common challenges related to systems that process large data sets (like big netCDF or GRIB files). One of the most important decisions is whether to deploy infrastructure on some cloud service or whether an on-premises solution is better. If the cloud appeals to you, other challenges arise: which provider would be the best one, how to deploy - using infrastructure as code, or just some simpler solution, and so on.
<h2>What systems do we analyse?</h2>
<p>This article is mainly focused on systems that operate with satellite images (and similar very big data sets). The common feature of these systems is that a single data file can take many gigabytes (or tens of gigabytes), and many such files are needed to operate. A typical example might be a system for predicting the energy production of renewable resources (like solar photovoltaic panels, wind turbines or wave devices). These generators require long data series of various variables (like wind speed, irradiance, temperature) for correct prediction at a fine temporal resolution (like ten years at ten-minute resolution), and the spatial resolution should also be very high. A similar challenge is predicting doses of irradiance for a system designed to protect from sunburn. There are also many other applications.</p>
<h2>What is the common challenge?</h2>
<p>We need to have data quickly available, store them as cheaply as possible and, optimally, on some distributed system. As you can guess, it is technically not possible to have all these properties simultaneously. You can always have just two of these three options. A cloud-based solution offers you a quickly available distributed interface - but it is certainly not cheap. An on-premises solution is cheap and quick (but not distributed, so not as available).</p>
<figure>
<img src="images/cloud_dilema.png" alt="Figure 1: The challenge of the system">
<figcaption>Figure 1: The challenge of the system - always only two of these three requirements are available</figcaption>
</figure>
<h2>Local and production stack</h2>
<p>One of the challenges when deploying your code is the difference between the local and production (or staging) stack. Some people tend to use different technologies on the local stack and the production one - that is not a good idea. The database technology, especially, should be the same on both, and ideally all your containers should be the same in production as they are on your local machine. Using Kubernetes for your deployment stack and Docker Swarm on the local one can also make a difference (although in this case a tolerable one).</p>
<h2>Cloud-based solution</h2>
<p>Let's start with the selection of a cloud services provider. Without any impudence, I dare say that it almost does not matter - the difference is very subtle. Both the price and the quality of services will be similar no matter whether you choose Amazon, G-Cloud or Azure (having experience with all of them). It is, though, reasonable to spend some time studying the possibilities of each provider - as it is practically impossible to stay cloud-agnostic for a long time.</p>
<p>The best practical approach is infrastructure as code - deploying with Terraform, or Kubernetes with Docker. It will save you a lot of time in the future. Optimally, use environment variables to determine the difference between the local and production stack.</p>
<p>When it comes to the price calculation of your cloud-based solution - be aware of a few things. Among them is the fact that the most expensive parts of your stack can be off for most of the time (that saves money - as the price of services is lower when they are not running). Also, a surprisingly expensive part of your system will be the storage space (as you will need many terabytes of data). It can be helpful to study the optimal way to store data for each provider (for example, MS Azure provides different tiers for storing data depending on how quickly you need to access them).</p>
<p>From practical experience with a system designed for renewable energy prediction, with a few active users and about 1TB of data - expect a price of around £1000 per month (at 2020 price levels). The same price level applied to a system designed for processing satellite data for healthcare purposes.</p>
<h2>On-premises solution</h2>
<p>An on-premises solution (meaning servers that you really own) is cheaper. The disadvantages compared to the cloud-based solution are quite clear: you need to spend a lot of time maintaining devices. On the other hand - do not believe that a cloud-based solution works without any intervention. You will probably have one dedicated DevOps person for your systems one way or another (so all the savings go nowhere).</p>
<p>A local solution is perfect from a data science perspective - for testing and pre-processing of data, mainly filtering input images and classifying them (or other numerically difficult computations). You can save a lot of money this way. Similarly, the on-premises solution does not suffer from expensive storage space - as disks are quite cheap.</p>
<h2>Dropbox (or similar technologies) for storage</h2>
<p>There are some surprising ways to save money and stay on the cloud. One of them is to use external storage services for your big data sets. One such solution is Dropbox. There is a Python API for accessing it and it is quite cheap (compared to cloud-based storage). There are of course many other similar technologies. Using them, you can save many hundreds of pounds every month without losing any advantage of a cloud-based environment.</p>
<h2>Dockerizing your entities</h2>
<p>The structure of your system will probably be the same no matter what your application specifically does. It is, however, good to be aware that while dockerizing most of the containers is a trivial task, the remaining few per cent will cause you a massive headache. For example, GeoServer, an application for serving map tiles, is a total disaster - an old-fashioned big (stateful) server. The traditional Pareto principle applies here (you spend 80 per cent of your time on something that provides less than 20 per cent of the outcomes). One more thing is important to bear in mind if you decide to deploy in the cloud - try to use cloud services for things like databases as much as you can (it can save you a lot of time and also money).</p>
<h2>Summary</h2>
<p>There are many other things regarding operational tasks for systems processing environmental data. When it comes to the decision of where your infrastructure runs - there is no simple pattern to follow. The on-premises solution has many pros and cons, as does a cloud-based solution. Generally, it makes sense to have the production part in the cloud and the data (pre)processing part running locally. However, if you decide to run everything on your own machines, that is a fully legitimate approach as well.</p>
"""
ENTITY = cr.Article(
title="DevOps challenges in system processing satellite environmental data",
url_alias='devops-challenges-in-system-processing-satellite-environmental-data',
large_image_path="images/devops_big.jpg",
small_image_path="images/devops_small.jpg",
date=datetime.datetime(2020, 3, 7),
tags=[cr.Tag('Python', 'python'),
cr.Tag('Design', 'design'),
cr.Tag('DevOps', 'devops'),
cr.Tag('Geospatial', 'geospatial'),
cr.Tag('Web application', 'web-application')],
content=content,
lead=lead,
    description="There are many common challenges related to systems that process large data sets. The most important decision is whether to deploy on a cloud service or locally."  # noqa: E501
)
| 142.189655 | 808 | 0.781618 | 1,374 | 8,247 | 4.686317 | 0.316594 | 0.005591 | 0.019568 | 0.012424 | 0.159031 | 0.154993 | 0.13791 | 0.128591 | 0.109955 | 0.109955 | 0 | 0.005834 | 0.168667 | 8,247 | 57 | 809 | 144.684211 | 0.933197 | 0.007397 | 0 | 0 | 0 | 0.386364 | 0.950507 | 0.019797 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.136364 | 0 | 0.136364 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
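The article's advice about using environment variables to separate the local and production stack can be sketched as follows; the `STACK` and `DATABASE_URL` variable names are hypothetical, not from the article:

```python
import os

# Hypothetical helper: derive stack settings from environment variables,
# so the local and the production stack differ only in their environment.
def stack_config(env=os.environ):
    stack = env.get("STACK", "local")
    default_db = "postgresql://localhost/dev" if stack == "local" else None
    return {
        "stack": stack,
        "database_url": env.get("DATABASE_URL", default_db),
    }

# With no variables set, the local defaults apply
print(stack_config({}))  # {'stack': 'local', 'database_url': 'postgresql://localhost/dev'}
```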
622c73ea5e40f2a0d268749bd75972c15ca07d60 | 127 | py | Python | core/fields.py | dmoney/djangopackages | 746cd47f8171229da3276b81d3c8454bdd887928 | [
"MIT"
] | 383 | 2015-05-06T03:51:51.000Z | 2022-03-26T07:56:44.000Z | core/fields.py | dmoney/djangopackages | 746cd47f8171229da3276b81d3c8454bdd887928 | [
"MIT"
] | 257 | 2017-04-17T08:31:16.000Z | 2022-03-27T02:30:49.000Z | core/fields.py | dmoney/djangopackages | 746cd47f8171229da3276b81d3c8454bdd887928 | [
"MIT"
] | 105 | 2017-04-17T06:21:26.000Z | 2022-03-30T05:24:19.000Z | from django_extensions.db.fields import ( # unimport:skip
CreationDateTimeField,
ModificationDateTimeField,
) # noqa
| 25.4 | 58 | 0.76378 | 11 | 127 | 8.727273 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.165354 | 127 | 4 | 59 | 31.75 | 0.90566 | 0.141732 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.25 | 0 | 0.25 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
6232b29228a4e4241335361d917736d820e336ca | 634 | py | Python | Scripts/002_python_challenge/q013.py | OrangePeelFX/Python-Tutorial | 0d47f194553666304765f5bbc928374b7aec8a48 | [
"MIT"
] | null | null | null | Scripts/002_python_challenge/q013.py | OrangePeelFX/Python-Tutorial | 0d47f194553666304765f5bbc928374b7aec8a48 | [
"MIT"
] | 1 | 2021-06-02T00:28:17.000Z | 2021-06-02T00:28:17.000Z | Scripts/002_python_challenge/q013.py | florianwns/python-scripts | 0d47f194553666304765f5bbc928374b7aec8a48 | [
"MIT"
] | 1 | 2020-01-13T11:08:18.000Z | 2020-01-13T11:08:18.000Z | #!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""Question 013
Call him
phone that evil
Source: http://www.pythonchallenge.com/pc/return/disproportional.html
http://www.pythonchallenge.com/pc/phonebook.php
'Bert' is the devil
"""
import xmlrpc.client
with xmlrpc.client.ServerProxy("http://www.pythonchallenge.com/pc/phonebook.php") as proxy:
# help(proxy)
try:
print(proxy.phone("Bert")) # 555-ITALY
except xmlrpc.client.Fault as err:
print("A fault occurred")
print("Fault code: %d" % err.faultCode)
print("Fault string: %s" % err.faultString)
# the good answer is 'italy'
| 23.481481 | 91 | 0.667192 | 86 | 634 | 4.918605 | 0.627907 | 0.049645 | 0.156028 | 0.177305 | 0.248227 | 0.184397 | 0.184397 | 0 | 0 | 0 | 0 | 0.015444 | 0.182965 | 634 | 26 | 92 | 24.384615 | 0.801158 | 0.440063 | 0 | 0 | 0 | 0 | 0.281977 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.125 | 0 | 0.125 | 0.5 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 3 |
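The `Fault`-handling pattern above can be exercised without the network, since `xmlrpc.client.Fault` is an ordinary exception; a small offline sketch (`describe_fault` is a hypothetical helper, not part of the original script):

```python
import xmlrpc.client

# Build a Fault by hand to show the fields the handler above reads;
# a real server would raise this when a remote call fails.
def describe_fault(err):
    return "Fault code: %d, string: %s" % (err.faultCode, err.faultString)

try:
    raise xmlrpc.client.Fault(404, "no such method")
except xmlrpc.client.Fault as err:
    print(describe_fault(err))  # Fault code: 404, string: no such method
```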
623ededce03df67722d23a2189404a68abe313db | 6,903 | py | Python | tests/autoscaling/test_ec2_fitness.py | jackchi/paasta | 0899adcef43cb07c247a36f5af82f09bb6f8db12 | [
"Apache-2.0"
] | 2 | 2020-04-09T06:58:46.000Z | 2021-05-03T21:56:03.000Z | tests/autoscaling/test_ec2_fitness.py | jackchi/paasta | 0899adcef43cb07c247a36f5af82f09bb6f8db12 | [
"Apache-2.0"
] | 4 | 2021-02-08T21:00:33.000Z | 2021-06-02T03:29:31.000Z | tests/autoscaling/test_ec2_fitness.py | jackchi/paasta | 0899adcef43cb07c247a36f5af82f09bb6f8db12 | [
"Apache-2.0"
] | 1 | 2020-09-29T03:23:02.000Z | 2020-09-29T03:23:02.000Z | from datetime import datetime
import mock
from mock import Mock
from paasta_tools.autoscaling import ec2_fitness
from paasta_tools.mesos_tools import SlaveTaskCount
def test_sort_by_total_tasks():
mock_slave_1 = Mock(task_counts=SlaveTaskCount(count=3, slave=Mock(), chronos_count=0))
mock_slave_2 = Mock(task_counts=SlaveTaskCount(count=2, slave=Mock(), chronos_count=1))
mock_slave_3 = Mock(task_counts=SlaveTaskCount(count=5, slave=Mock(), chronos_count=0))
ret = ec2_fitness.sort_by_total_tasks([mock_slave_1, mock_slave_2, mock_slave_3])
assert ret == [mock_slave_3, mock_slave_1, mock_slave_2]
def test_sort_by_running_batch_count():
mock_slave_1 = Mock(task_counts=SlaveTaskCount(count=3, slave=Mock(), chronos_count=1))
mock_slave_2 = Mock(task_counts=SlaveTaskCount(count=2, slave=Mock(), chronos_count=2))
mock_slave_3 = Mock(task_counts=SlaveTaskCount(count=5, slave=Mock(), chronos_count=3))
ret = ec2_fitness.sort_by_running_batch_count([mock_slave_1, mock_slave_2, mock_slave_3])
assert ret == [mock_slave_3, mock_slave_2, mock_slave_1]
def test_sort_by_health_system_instance_health_system_status_failed():
mock_slave_1 = Mock(name='slave1')
mock_slave_1.task_counts = SlaveTaskCount(
count=3,
slave=Mock(),
chronos_count=1,
)
mock_slave_1.instance_status = {
'Events': [
{
'Code': 'instance-reboot',
'Description': 'string',
'NotBefore': datetime(2015, 1, 1),
'NotAfter': datetime(2015, 1, 1),
},
],
'SystemStatus': {
'Status': 'impaired',
},
'InstanceStatus': {
'Status': 'ok',
},
}
mock_slave_2 = Mock(name='slave2')
mock_slave_2.task_counts = SlaveTaskCount(
count=3,
slave=Mock(),
chronos_count=1,
    )
mock_slave_2.instance_status = {
'Events': [
{
'Code': 'instance-reboot',
'Description': 'string',
'NotBefore': datetime(2015, 1, 1),
'NotAfter': datetime(2015, 1, 1),
},
],
'SystemStatus': {
'Status': 'ok',
},
'InstanceStatus': {
'Status': 'ok',
},
}
ret = ec2_fitness.sort_by_system_instance_health([mock_slave_1, mock_slave_2])
assert ret == [mock_slave_2, mock_slave_1]
def test_sort_by_upcoming_events():
mock_slave_1 = Mock()
mock_slave_1.task_counts = SlaveTaskCount(
count=3,
slave=Mock(),
chronos_count=1,
)
mock_slave_1.instance_status = {
'Events': [],
'SystemStatus': {
'Status': 'ok',
},
'InstanceStatus': {
'Status': 'ok',
},
}
mock_slave_2 = Mock()
    mock_slave_2.task_counts = SlaveTaskCount(
count=3,
slave=Mock(),
chronos_count=1,
)
mock_slave_2.instance_status = {
'Events': [
{
'Code': 'instance-reboot',
'Description': 'string',
'NotBefore': datetime(2015, 1, 1),
'NotAfter': datetime(2015, 1, 1),
},
],
'SystemStatus': {
'Status': 'ok',
},
'InstanceStatus': {
'Status': 'ok',
},
}
ret = ec2_fitness.sort_by_upcoming_events([mock_slave_1, mock_slave_2])
assert ret == [mock_slave_1, mock_slave_2]
def test_sort_by_fitness_calls_all_sorting_funcs():
with mock.patch(
'paasta_tools.autoscaling.ec2_fitness.sort_by_system_instance_health',
autospec=True,
) as mock_sort_by_system_instance_health, mock.patch(
'paasta_tools.autoscaling.ec2_fitness.sort_by_upcoming_events',
autospec=True,
) as mock_sort_by_upcoming_events, mock.patch(
'paasta_tools.autoscaling.ec2_fitness.sort_by_running_batch_count',
autospec=True,
) as mock_sort_by_running_batch_count, mock.patch(
'paasta_tools.autoscaling.ec2_fitness.sort_by_total_tasks',
autospec=True,
) as mock_sort_by_total_tasks:
instances = []
ec2_fitness.sort_by_ec2_fitness(instances)
assert mock_sort_by_total_tasks.called
assert mock_sort_by_running_batch_count.called
assert mock_sort_by_upcoming_events.called
assert mock_sort_by_system_instance_health.called
def test_sort_by_fitness():
mock_slave_1 = Mock(name='slave1')
mock_slave_1.task_counts = SlaveTaskCount(
count=3,
slave=Mock(),
chronos_count=1,
)
mock_slave_1.instance_status = {
'Events': [],
'SystemStatus': {'Status': 'impaired', },
'InstanceStatus': {'Status': 'ok', },
}
mock_slave_2 = Mock(name='slave2')
mock_slave_2.task_counts = SlaveTaskCount(
count=3,
slave=Mock(),
chronos_count=1,
)
mock_slave_2.instance_status = {
'Events': [
{
'Code': 'instance-reboot',
'Description': 'foo',
'NotBefore': datetime(2015, 1, 1),
'NotAfter': datetime(2015, 1, 1),
},
],
'SystemStatus': {'Status': 'ok', },
'InstanceStatus': {'Status': 'ok', },
}
mock_slave_3 = Mock(name='slave3')
mock_slave_3.task_counts = SlaveTaskCount(
count=2,
slave=Mock(),
chronos_count=3,
)
mock_slave_3.instance_status = {
'Events': [],
'SystemStatus': {'Status': 'ok', },
'InstanceStatus': {'Status': 'ok', },
}
mock_slave_4 = Mock(name='slave4')
mock_slave_4.task_counts = SlaveTaskCount(
count=3,
slave=Mock(),
chronos_count=1,
)
mock_slave_4.instance_status = {
'Events': [],
'SystemStatus': {'Status': 'ok', },
'InstanceStatus': {'Status': 'ok', },
}
mock_slave_5 = Mock(name='slave5')
mock_slave_5.task_counts = SlaveTaskCount(
count=1,
slave=Mock(),
chronos_count=1,
)
mock_slave_5.instance_status = {
'Events': [],
'SystemStatus': {'Status': 'ok', },
'InstanceStatus': {'Status': 'ok', },
}
ret = ec2_fitness.sort_by_ec2_fitness([mock_slave_1, mock_slave_2, mock_slave_3, mock_slave_4, mock_slave_5])
# we expect this order for the following reason:
# mock_slave_1 is impaired and so should be killed asap
# mock_slave_2 has an upcoming event
# mock_slave_5 and mock_slave_4 have the fewest chronos tasks, and so should be killed before
    # mock_slave_3 (we can't drain chronos tasks, so try and save them)
# mock_slave_5 has fewer tasks than mock_slave_4, and so is a better candidate for killing
assert ret == [mock_slave_3, mock_slave_4, mock_slave_5, mock_slave_2, mock_slave_1]
| 32.71564 | 113 | 0.605389 | 818 | 6,903 | 4.743276 | 0.122249 | 0.162371 | 0.056701 | 0.112113 | 0.837887 | 0.791495 | 0.71933 | 0.690206 | 0.668814 | 0.571134 | 0 | 0.033546 | 0.274518 | 6,903 | 210 | 114 | 32.871429 | 0.741214 | 0.055193 | 0 | 0.507937 | 0 | 0 | 0.144589 | 0.037913 | 0 | 0 | 0 | 0 | 0.047619 | 1 | 0.031746 | false | 0 | 0.026455 | 0 | 0.058201 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
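`sort_by_ec2_fitness` works by chaining several sorts, which relies on Python's sort being stable; a minimal standalone sketch of that multi-key sorting technique (the dict fields below are made up for illustration):

```python
# Applying stable sorts from the least to the most significant criterion
# yields a multi-key ordering, the same trick the fitness chain uses.
slaves = [
    {"name": "a", "tasks": 3, "impaired": False},
    {"name": "b", "tasks": 5, "impaired": True},
    {"name": "c", "tasks": 5, "impaired": False},
]

# least significant criterion first: most tasks first ...
slaves.sort(key=lambda s: s["tasks"], reverse=True)
# ... then the most significant one: healthy (not impaired) first
slaves.sort(key=lambda s: s["impaired"])

order = [s["name"] for s in slaves]
print(order)  # ['c', 'a', 'b']
```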
62538d0da3a4c56140848b19292c7fe10686f115 | 647 | py | Python | conans/test/utils/runner.py | wahlm/conan | 1afadb5cca9e1c688c7b37c69a0ff3c6a6dbe257 | [
"MIT"
] | 3 | 2016-11-11T01:09:44.000Z | 2017-07-19T13:30:17.000Z | conans/test/utils/runner.py | wahlm/conan | 1afadb5cca9e1c688c7b37c69a0ff3c6a6dbe257 | [
"MIT"
] | 6 | 2017-06-14T11:40:15.000Z | 2020-05-23T01:43:28.000Z | conans/test/utils/runner.py | wahlm/conan | 1afadb5cca9e1c688c7b37c69a0ff3c6a6dbe257 | [
"MIT"
] | 2 | 2017-11-29T14:05:22.000Z | 2018-09-19T12:43:33.000Z | from conans.client.runner import ConanRunner
class TestRunner(object):
    """Wraps the Conan runner and redirects all the output to a StringIO
    passed in the __init__ method"""
def __init__(self, output, runner=None):
self._output = output
self.runner = runner or ConanRunner(print_commands_to_output=True,
generate_run_log_file=True,
log_run_to_output=True)
def __call__(self, command, output=None, log_filepath=None, cwd=None):
return self.runner(command, output=self._output, log_filepath=log_filepath, cwd=cwd)
| 40.4375 | 92 | 0.646059 | 80 | 647 | 4.9 | 0.5125 | 0.076531 | 0.061224 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.279753 | 647 | 15 | 93 | 43.133333 | 0.841202 | 0.153014 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.222222 | false | 0 | 0.111111 | 0.111111 | 0.555556 | 0.111111 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
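The `TestRunner` above forwards output to a stream supplied at construction time; a minimal self-contained sketch of the same capture-into-StringIO idea, without the Conan dependency (`CapturingRunner` is a hypothetical name):

```python
import io
import subprocess
import sys

# Hypothetical runner: executes a command and writes its stdout into a
# StringIO supplied in __init__, mirroring TestRunner's redirection.
class CapturingRunner:
    def __init__(self, output):
        self._output = output

    def __call__(self, command):
        result = subprocess.run(command, capture_output=True, text=True)
        self._output.write(result.stdout)
        return result.returncode

buf = io.StringIO()
runner = CapturingRunner(buf)
code = runner([sys.executable, "-c", "print('hello')"])
print(code, buf.getvalue().strip())  # 0 hello
```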
625fc0b3b0005a0b7f1e7120b4bcd72cc6ac2f3b | 354 | py | Python | mcdc_tnt/numba_kernels/kernels.py | jpmorgan98/MCDC-TNT | a7772b169eb431c54e729feff4128545a735c7c2 | [
"BSD-3-Clause"
] | 1 | 2022-02-26T02:12:12.000Z | 2022-02-26T02:12:12.000Z | mcdc_tnt/numba_kernels/kernels.py | jpmorgan98/MCDC-TNT | a7772b169eb431c54e729feff4128545a735c7c2 | [
"BSD-3-Clause"
] | null | null | null | mcdc_tnt/numba_kernels/kernels.py | jpmorgan98/MCDC-TNT | a7772b169eb431c54e729feff4128545a735c7c2 | [
"BSD-3-Clause"
] | 1 | 2022-02-22T20:31:25.000Z | 2022-02-22T20:31:25.000Z | """
Created on Thu Nov 18 11:46:11 2021
@author: jack
"""
from mcdc.SourceParticles import SourceParticles
import Advance.Advance as Advance
import SampleEvent.SampleEvent as SampleEvent
import FissionsAdd.FissionsAdd as FissionsAdd
import CleanUp.CleanUp as BringOutYourDead
import Scatter.Scatter as Scatter
import SourceParticles.StillIn as StillIn
| 25.285714 | 48 | 0.836158 | 46 | 354 | 6.434783 | 0.478261 | 0.141892 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.038339 | 0.115819 | 354 | 13 | 49 | 27.230769 | 0.907348 | 0.141243 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 3 |
626311efcb447279b67aea6e4634d86342b5fe34 | 3,338 | py | Python | congruence_rings.py | HubertHolin/Cayley-Dickson | d85d4837aa5e6ffa1b7d9a977a7fdce885dec40f | [
"BSL-1.0"
] | null | null | null | congruence_rings.py | HubertHolin/Cayley-Dickson | d85d4837aa5e6ffa1b7d9a977a7fdce885dec40f | [
"BSL-1.0"
] | null | null | null | congruence_rings.py | HubertHolin/Cayley-Dickson | d85d4837aa5e6ffa1b7d9a977a7fdce885dec40f | [
"BSL-1.0"
] | null | null | null | #!/usr/bin/env python3
"""
congruence_rings
This script provides congruence ring objects. Inspired by (but different from) Jeremy Kun's
implementation (https://github.com/j2kun/elliptic-curve-signature,
https://jeremykun.com/2014/02/08/introducing-elliptic-curves/).
(C) Copyright Hubert Holin 2018.
Distributed under the Boost Software License, Version 1.0. (See
accompanying file LICENSE_1_0.txt or copy at
http://www.boost.org/LICENSE_1_0.txt)
"""
from arithmetic import PGCD, Bézout
from compatibility_check import *
def congruence_ring(cardinal):
class congruence_ring: # Yes, the class and the enclosing function have the same name
base_ring = ("congruence_ring", cardinal)
current_algebra = base_ring
def dump(self):
return (self.__n,)
def flatten(self):
return (self.__n,)
def upcast(factor):
return __class__(factor)
def neutral_element_for_multiplication():
return __class__(1)
def is_invertible(self):
return PGCD(congruence_ring.base_ring[1], self.__n) == 1
def is_unimodular(self):
return (self.__n*self.__n) % congruence_ring.base_ring[1] == 1
def __init__(self, n = 0): # We will need a default constructor
self.__n = n % congruence_ring.base_ring[1]
@compatibility_check
def __add__(self, other):
return congruence_ring(self.__n + other.__n)
@compatibility_check
def __sub__(self, other):
return congruence_ring(self.__n - other.__n)
def __neg__(self):
return congruence_ring(-self.__n)
@compatibility_check
def __mul__(self, other):
return congruence_ring(self.__n * other.__n)
def __matmul__(self, other):
# We hijack this operator to represent the external product
if self.current_algebra == other.current_algebra:
return self*other
else:
return NotImplemented
def inverse(self):
if not __class__.is_invertible(self):
raise ZeroDivisionError("{0:s} is not invertible!".format(self))
u, v, pgcd = Bézout(self.__n, congruence_ring.base_ring[1])
return congruence_ring(u)
@compatibility_check
def __truediv__(self, other):
return self * other.inverse()
def __pow__(self, a_power):
return congruence_ring(pow(self.__n, a_power, cardinal))
def conjugate(self):
return congruence_ring(self.__n)
def __eq__(self, other):
if not hasattr(other, 'current_algebra'):
return False
elif self.current_algebra != other.current_algebra:
return False
else:
return self.__n == other.__n
def __ne__(self, other):
if not hasattr(other, 'current_algebra'):
return True
elif self.current_algebra != other.current_algebra:
return True
else:
return self.__n != other.__n
def __str__(self):
return str(self.__n)
def __repr__(self):
return "{0:d} [{1:d}]".format(self.__n, congruence_ring.base_ring[1])
def __format__(self, spec):
return "{0:d} [{1:d}]".format(self.__n, congruence_ring.base_ring[1])
if cardinal <= 1:
raise ValueError("For our intended use, we must have 'cardinal' > 1 "+
			"but 'cardinal' == {0:d}!".format(cardinal))
return congruence_ring
| 18.752809 | 88 | 0.662073 | 433 | 3,338 | 4.729792 | 0.318707 | 0.046387 | 0.078125 | 0.064453 | 0.344727 | 0.307129 | 0.26709 | 0.195313 | 0.149414 | 0.084961 | 0 | 0.014844 | 0.233074 | 3,338 | 177 | 89 | 18.858757 | 0.785156 | 0.177951 | 0 | 0.263889 | 0 | 0 | 0.061516 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.305556 | false | 0 | 0.027778 | 0.222222 | 0.736111 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
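The `inverse()` method above computes a modular inverse via Bézout's identity; since Python 3.8 the built-in `pow()` can do the same directly, which makes a handy cross-check:

```python
# If gcd(n, m) == 1, Bézout gives u, v with u*n + v*m == 1, so u is the
# inverse of n modulo m. pow(n, -1, m) computes that inverse directly.
m = 7
inverses = {n: pow(n, -1, m) for n in range(1, m)}
assert all((n * inv) % m == 1 for n, inv in inverses.items())
print(inverses[3])  # 5, because 3 * 5 == 15 == 1 (mod 7)
```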
62668e9d58e233d7e83a33a0150e5826f797b11e | 150 | py | Python | config.debug.py | M-Mueller/health-tracker | e8497deca2eaaf18ed60ce621c399dd8ec541f89 | [
"MIT"
] | null | null | null | config.debug.py | M-Mueller/health-tracker | e8497deca2eaaf18ed60ce621c399dd8ec541f89 | [
"MIT"
] | null | null | null | config.debug.py | M-Mueller/health-tracker | e8497deca2eaaf18ed60ce621c399dd8ec541f89 | [
"MIT"
] | null | null | null | SQLALCHEMY_DATABASE_URI = 'sqlite:////tmp/test.db'
SQLALCHEMY_TRACK_MODIFICATIONS = False
SECRET_KEY = b'E\xbdv\xf0\xbd7\x0b\xf9\xce.\x94\xcerx5\xcd'
| 37.5 | 59 | 0.78 | 24 | 150 | 4.666667 | 0.958333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.049645 | 0.06 | 150 | 3 | 60 | 50 | 0.744681 | 0 | 0 | 0 | 0 | 0 | 0.433333 | 0.433333 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
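The hard-coded `SECRET_KEY` above is acceptable only because this is a debug config; for a real deployment a key would typically be generated, e.g. with the standard `secrets` module (a sketch, not part of the config file):

```python
import secrets

# Generate a random secret key instead of hard-coding one.
key = secrets.token_bytes(16)
print(len(key), isinstance(key, bytes))  # 16 True
```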
627d04cd0e605f630b12815374983fc26f7c52cf | 462 | py | Python | vizdoomgym/envs/__init__.py | ArnaudFickinger/vizdoomgym | de001b4158d49d9eb1ae516346f05ad28b163961 | [
"MIT"
] | null | null | null | vizdoomgym/envs/__init__.py | ArnaudFickinger/vizdoomgym | de001b4158d49d9eb1ae516346f05ad28b163961 | [
"MIT"
] | null | null | null | vizdoomgym/envs/__init__.py | ArnaudFickinger/vizdoomgym | de001b4158d49d9eb1ae516346f05ad28b163961 | [
"MIT"
] | null | null | null | from vizdoomgym.envs.vizdoomenv import VizdoomEnv
from vizdoomgym.envs.vizdoom_env_definitions import (
VizdoomBasic,
VizdoomCorridor5,
VizdoomCorridor1,
VizdoomCorridor3,
VizdoomCorridor7,
VizdoomCorridorSparse5,
VizdoomCorridorSparse1,
VizdoomDefendCenter,
VizdoomDefendLine,
VizdoomHealthGathering,
VizdoomMyWayHome,
VizdoomPredictPosition,
VizdoomTakeCover,
VizdoomDeathmatch,
VizdoomHealthGatheringSupreme,
)
| 24.315789 | 53 | 0.798701 | 28 | 462 | 13.107143 | 0.821429 | 0.076294 | 0.098093 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.015385 | 0.155844 | 462 | 18 | 54 | 25.666667 | 0.925641 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.111111 | 0 | 0.111111 | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
628806ab4f308736cb8bf8f9f3162c1f362ca092 | 164 | py | Python | jspider/http/__init__.py | goodking-bq/Jspider | 2ae484fe8ec59824ba40df5aa3d3c434486da2d8 | [
"Apache-2.0"
] | null | null | null | jspider/http/__init__.py | goodking-bq/Jspider | 2ae484fe8ec59824ba40df5aa3d3c434486da2d8 | [
"Apache-2.0"
] | null | null | null | jspider/http/__init__.py | goodking-bq/Jspider | 2ae484fe8ec59824ba40df5aa3d3c434486da2d8 | [
"Apache-2.0"
] | null | null | null | #!/usr/bin/env python
# -*- coding:utf-8 -*-
from .request import Request
from .response import Response
__author__ = 'golden'
__create_date__ = '2018/5/26 22:22'
| 20.5 | 35 | 0.713415 | 24 | 164 | 4.5 | 0.791667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.084507 | 0.134146 | 164 | 7 | 36 | 23.428571 | 0.676056 | 0.25 | 0 | 0 | 0 | 0 | 0.173554 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
656603ce7f279a63c4069c659275c5569d4f49cf | 307 | py | Python | datek_jaipur/application/state_machine/fsm.py | DAtek/datek-jaipur | e49e4b391f2e23ed5a333477cc479ccbc1c90dee | [
"MIT"
] | null | null | null | datek_jaipur/application/state_machine/fsm.py | DAtek/datek-jaipur | e49e4b391f2e23ed5a333477cc479ccbc1c90dee | [
"MIT"
] | 1 | 2022-03-26T11:05:28.000Z | 2022-03-26T11:05:28.000Z | datek_jaipur/application/state_machine/fsm.py | DAtek/datek-jaipur | e49e4b391f2e23ed5a333477cc479ccbc1c90dee | [
"MIT"
] | null | null | null | from typing import AsyncGenerator
from datek_async_fsm.fsm import BaseFSM
from datek_jaipur.application.state_machine.scope import Scope
class FSM(BaseFSM):
scope: Scope
async def _input_generator(self) -> AsyncGenerator[dict, None]:
while True:
yield {"scope": self.scope}
| 21.928571 | 67 | 0.729642 | 39 | 307 | 5.589744 | 0.589744 | 0.082569 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.19544 | 307 | 13 | 68 | 23.615385 | 0.882591 | 0 | 0 | 0 | 0 | 0 | 0.016287 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.375 | 0 | 0.625 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 3 |
65b7629ab018143fd3fd4b9981c30cd57cfea71e | 516 | py | Python | d2l-enhancements/d2l/environment_enhancements.py | Brightspace/circleci-elastalert-docker-image | 872077811253182d33316c2710e9b2eb4f16f687 | [
"MIT"
] | null | null | null | d2l-enhancements/d2l/environment_enhancements.py | Brightspace/circleci-elastalert-docker-image | 872077811253182d33316c2710e9b2eb4f16f687 | [
"MIT"
] | 3 | 2021-10-30T05:23:02.000Z | 2021-11-02T18:45:04.000Z | d2l-enhancements/d2l/environment_enhancements.py | Brightspace/circleci-elastalert-docker-image | 872077811253182d33316c2710e9b2eb4f16f687 | [
"MIT"
] | null | null | null | from elastalert.enhancements import BaseEnhancement
from elastalert.enhancements import DropMatchException
from os import environ
class AppendDataCenter(BaseEnhancement):
def process(self, match):
if 'D2L_DATA_CENTER' in environ:
match['d2l_data_center'] = environ['D2L_DATA_CENTER']
else:
match['d2l_data_center'] = 'unknown'
class ExcludeDevelopmentEnvironments(BaseEnhancement):
def process(self, match):
pass
class OnlyDevelopmentEnvironments(BaseEnhancement):
def process(self, match):
pass
| 27.157895 | 56 | 0.800388 | 57 | 516 | 7.105263 | 0.421053 | 0.069136 | 0.128395 | 0.214815 | 0.271605 | 0.187654 | 0 | 0 | 0 | 0 | 0 | 0.008772 | 0.116279 | 516 | 18 | 57 | 28.666667 | 0.879386 | 0 | 0 | 0.333333 | 0 | 0 | 0.129845 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0.133333 | 0.2 | 0 | 0.6 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 3 |
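The if/else in `AppendDataCenter.process` can equivalently be written with `environ.get` and a default; a standalone sketch of the same logic (the `'us04'` value is made up for illustration):

```python
from os import environ

# Same behaviour as AppendDataCenter.process, using environ.get;
# the env parameter makes it testable without touching os.environ.
def append_data_center(match, env=environ):
    match['d2l_data_center'] = env.get('D2L_DATA_CENTER', 'unknown')
    return match

print(append_data_center({}, env={}))  # {'d2l_data_center': 'unknown'}
print(append_data_center({}, env={'D2L_DATA_CENTER': 'us04'}))  # {'d2l_data_center': 'us04'}
```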
65bab45b181d5abe781b9eab8a78f18aa1c38590 | 4,332 | py | Python | aliyun-python-sdk-ecs/aliyunsdkecs/request/v20140526/DescribeDemandsRequest.py | jia-jerry/aliyun-openapi-python-sdk | e90f3683a250cfec5b681b5f1d73a68f0dc9970d | [
"Apache-2.0"
] | null | null | null | aliyun-python-sdk-ecs/aliyunsdkecs/request/v20140526/DescribeDemandsRequest.py | jia-jerry/aliyun-openapi-python-sdk | e90f3683a250cfec5b681b5f1d73a68f0dc9970d | [
"Apache-2.0"
] | 1 | 2020-05-31T14:51:47.000Z | 2020-05-31T14:51:47.000Z | aliyun-python-sdk-ecs/aliyunsdkecs/request/v20140526/DescribeDemandsRequest.py | jia-jerry/aliyun-openapi-python-sdk | e90f3683a250cfec5b681b5f1d73a68f0dc9970d | [
"Apache-2.0"
] | null | null | null | # Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
#
# http://www.apache.org/licenses/LICENSE-2.0
#
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
from aliyunsdkcore.request import RpcRequest
from aliyunsdkecs.endpoint import endpoint_data
class DescribeDemandsRequest(RpcRequest):

    def __init__(self):
        RpcRequest.__init__(self, 'Ecs', '2014-05-26', 'DescribeDemands', 'ecs')
        self.set_method('POST')
        if hasattr(self, "endpoint_map"):
            setattr(self, "endpoint_map", endpoint_data.getEndpointMap())
        if hasattr(self, "endpoint_regional"):
            setattr(self, "endpoint_regional", endpoint_data.getEndpointRegional())

    def get_ResourceOwnerId(self):
        return self.get_query_params().get('ResourceOwnerId')

    def set_ResourceOwnerId(self, ResourceOwnerId):
        self.add_query_param('ResourceOwnerId', ResourceOwnerId)

    def get_PageNumber(self):
        return self.get_query_params().get('PageNumber')

    def set_PageNumber(self, PageNumber):
        self.add_query_param('PageNumber', PageNumber)

    def get_PageSize(self):
        return self.get_query_params().get('PageSize')

    def set_PageSize(self, PageSize):
        self.add_query_param('PageSize', PageSize)

    def get_InstanceType(self):
        return self.get_query_params().get('InstanceType')

    def set_InstanceType(self, InstanceType):
        self.add_query_param('InstanceType', InstanceType)

    def get_Tags(self):
        return self.get_query_params().get('Tags')

    def set_Tags(self, Tags):
        for depth1 in range(len(Tags)):
            if Tags[depth1].get('Key') is not None:
                self.add_query_param('Tag.' + str(depth1 + 1) + '.Key', Tags[depth1].get('Key'))
            if Tags[depth1].get('Value') is not None:
                self.add_query_param('Tag.' + str(depth1 + 1) + '.Value', Tags[depth1].get('Value'))

    def get_InstanceChargeType(self):
        return self.get_query_params().get('InstanceChargeType')

    def set_InstanceChargeType(self, InstanceChargeType):
        self.add_query_param('InstanceChargeType', InstanceChargeType)

    def get_DryRun(self):
        return self.get_query_params().get('DryRun')

    def set_DryRun(self, DryRun):
        self.add_query_param('DryRun', DryRun)

    def get_ResourceOwnerAccount(self):
        return self.get_query_params().get('ResourceOwnerAccount')

    def set_ResourceOwnerAccount(self, ResourceOwnerAccount):
        self.add_query_param('ResourceOwnerAccount', ResourceOwnerAccount)

    def get_OwnerAccount(self):
        return self.get_query_params().get('OwnerAccount')

    def set_OwnerAccount(self, OwnerAccount):
        self.add_query_param('OwnerAccount', OwnerAccount)

    def get_InstanceTypeFamily(self):
        return self.get_query_params().get('InstanceTypeFamily')

    def set_InstanceTypeFamily(self, InstanceTypeFamily):
        self.add_query_param('InstanceTypeFamily', InstanceTypeFamily)

    def get_OwnerId(self):
        return self.get_query_params().get('OwnerId')

    def set_OwnerId(self, OwnerId):
        self.add_query_param('OwnerId', OwnerId)

    def get_DemandStatuss(self):
        return self.get_query_params().get('DemandStatuss')

    def set_DemandStatuss(self, DemandStatuss):
        for depth1 in range(len(DemandStatuss)):
            if DemandStatuss[depth1] is not None:
                self.add_query_param('DemandStatus.' + str(depth1 + 1), DemandStatuss[depth1])

    def get_DemandId(self):
        return self.get_query_params().get('DemandId')

    def set_DemandId(self, DemandId):
        self.add_query_param('DemandId', DemandId)

    def get_ZoneId(self):
        return self.get_query_params().get('ZoneId')

    def set_ZoneId(self, ZoneId):
        self.add_query_param('ZoneId', ZoneId)

    def get_DemandType(self):
        return self.get_query_params().get('DemandType')

    def set_DemandType(self, DemandType):
        self.add_query_param('DemandType', DemandType) | 33.84375 | 89 | 0.753232 | 562 | 4,332 | 5.613879 | 0.233096 | 0.035499 | 0.060856 | 0.086212 | 0.192393 | 0.180349 | 0.180349 | 0.024723 | 0.024723 | 0.024723 | 0 | 0.006919 | 0.132502 | 4,332 | 128 | 90 | 33.84375 | 0.832624 | 0.174054 | 0 | 0 | 0 | 0 | 0.131967 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.407895 | false | 0 | 0.026316 | 0.197368 | 0.644737 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
65bfdefb09a9ecddf707662343e79f134e25226d | 359 | py | Python | app/errors.py | dunkmann00/DVCTracker | 9d415062cd0c766d6b213112d2f8f5ebcf5d93ea | [
"MIT"
] | 1 | 2019-02-25T03:23:51.000Z | 2019-02-25T03:23:51.000Z | app/errors.py | dunkmann00/DVCTracker | 9d415062cd0c766d6b213112d2f8f5ebcf5d93ea | [
"MIT"
] | 4 | 2021-01-14T21:50:48.000Z | 2021-07-14T21:00:56.000Z | app/errors.py | dunkmann00/DVCTracker | 9d415062cd0c766d6b213112d2f8f5ebcf5d93ea | [
"MIT"
] | null | null | null | class SpecialError(Exception):
    """
    Exception raised when there is an error while parsing a special.
    """

    def __init__(self, attribute, content=None):
        self.attribute = attribute
        self.content = content

    def __str__(self):
        return (f"Unable to parse '{self.attribute}'\n"
                f"Content: '{self.content}'\n")
| 29.916667 | 68 | 0.618384 | 42 | 359 | 5.095238 | 0.619048 | 0.182243 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.264624 | 359 | 11 | 69 | 32.636364 | 0.810606 | 0.178273 | 0 | 0 | 0 | 0 | 0.225806 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.285714 | false | 0 | 0 | 0.142857 | 0.571429 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
65c8fb2f5b3aaf35de3c900d5c729cae91ce9f31 | 366 | py | Python | Chapter08/manage.py | jayakumardhananjayan/pythonwebtut | a7547473fec5b90a91aea5395131e6eff245b495 | [
"MIT"
] | 135 | 2018-10-31T11:52:35.000Z | 2022-03-23T12:23:04.000Z | Chapter08/manage.py | jayakumardhananjayan/pythonwebtut | a7547473fec5b90a91aea5395131e6eff245b495 | [
"MIT"
] | 6 | 2019-03-21T02:04:43.000Z | 2022-03-22T11:07:25.000Z | Chapter08/manage.py | jayakumardhananjayan/pythonwebtut | a7547473fec5b90a91aea5395131e6eff245b495 | [
"MIT"
] | 109 | 2018-10-30T22:26:23.000Z | 2022-03-24T14:53:13.000Z | import os
from webapp import db, migrate, create_app
from webapp.auth.models import User
from webapp.blog.models import Post, Tag
env = os.environ.get('WEBAPP_ENV', 'dev')
app = create_app('config.%sConfig' % env.capitalize())
@app.shell_context_processor
def make_shell_context():
    return dict(app=app, db=db, User=User, Post=Post, Tag=Tag, migrate=migrate)
| 26.142857 | 79 | 0.754098 | 58 | 366 | 4.637931 | 0.482759 | 0.111524 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.120219 | 366 | 13 | 80 | 28.153846 | 0.835404 | 0 | 0 | 0 | 0 | 0 | 0.076503 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.111111 | false | 0 | 0.444444 | 0.111111 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 3 |
65d1e2002b2ba0cd51ca2edde720ce6c1f034ee1 | 1,706 | py | Python | gmn/src/d1_gmn/app/management/commands/diag-proxy-set-url.py | DataONEorg/d1_python | dfab267c3adea913ab0e0073ed9dc1ee50b5b8eb | [
"Apache-2.0"
] | 15 | 2016-10-28T13:56:52.000Z | 2022-01-31T19:07:49.000Z | gmn/src/d1_gmn/app/management/commands/diag-proxy-set-url.py | DataONEorg/d1_python | dfab267c3adea913ab0e0073ed9dc1ee50b5b8eb | [
"Apache-2.0"
] | 56 | 2017-03-16T03:52:32.000Z | 2022-03-12T01:05:28.000Z | gmn/src/d1_gmn/app/management/commands/diag-proxy-set-url.py | DataONEorg/d1_python | dfab267c3adea913ab0e0073ed9dc1ee50b5b8eb | [
"Apache-2.0"
] | 11 | 2016-05-31T16:22:02.000Z | 2020-10-05T14:37:10.000Z | # This work was created by participants in the DataONE project, and is
# jointly copyrighted by participating institutions in DataONE. For
# more information on DataONE, see our web site at http://dataone.org.
#
# Copyright 2009-2019 DataONE
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Update the URL reference for a proxy object.
A single URL can be modified by passing the PID for the object to update and the new URL
on the command line. A bulk update can be performed by passing in a JSON or CSV file.
By default, this command verifies proxy objects by fully downloading the object bytes,
recalculating the checksum and comparing it with the checksum that was originally
supplied by the client that created the object.
See `audit-proxy-sciobj`_ for more information about proxy object URL references.
"""
import d1_gmn.app.did
import d1_gmn.app.mgmt_base
import d1_gmn.app.models
class Command(d1_gmn.app.mgmt_base.GMNCommandBase):
    def __init__(self, *args, **kwargs):
        super().__init__(__doc__, __name__, *args, **kwargs)

    def add_arguments(self, parser):
        # self.add_arg_force(parser)
        pass

    def handle_serial(self):
        # TODO.
        pass
| 35.541667 | 88 | 0.752638 | 266 | 1,706 | 4.725564 | 0.533835 | 0.047733 | 0.025457 | 0.033413 | 0.025457 | 0 | 0 | 0 | 0 | 0 | 0 | 0.012178 | 0.181712 | 1,706 | 47 | 89 | 36.297872 | 0.888252 | 0.773154 | 0 | 0.2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.021277 | 0 | 1 | 0.3 | false | 0.2 | 0.3 | 0 | 0.7 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 3 |
65d80fe1f85e0bfe5c7e1fa347b2a9b05658dda1 | 149 | py | Python | utils/mongo.py | GeneralWolf/EasyGif | 3fb4618ba9d0508c6dbfc8b1216cd18b5792bae0 | [
"MIT"
] | 6 | 2020-05-27T21:25:51.000Z | 2021-09-18T04:19:20.000Z | utils/mongo.py | GeneralWolf/EasyGif | 3fb4618ba9d0508c6dbfc8b1216cd18b5792bae0 | [
"MIT"
] | 1 | 2020-05-27T22:29:40.000Z | 2020-05-27T22:29:40.000Z | utils/mongo.py | GeneralWolf/EasyGif | 3fb4618ba9d0508c6dbfc8b1216cd18b5792bae0 | [
"MIT"
] | 7 | 2020-06-09T10:38:20.000Z | 2022-02-19T17:00:24.000Z | from pymongo import MongoClient
from utils.log import log
log("Connecting to MongoDB")
mongo = MongoClient()["easygif"]
mongo_users = mongo["users"] | 24.833333 | 32 | 0.771812 | 20 | 149 | 5.7 | 0.6 | 0.175439 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.114094 | 149 | 6 | 33 | 24.833333 | 0.863636 | 0 | 0 | 0 | 0 | 0 | 0.22 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.4 | 0 | 0.4 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
65e13e92afcc7320a892fbbfeee94e1243e89a44 | 21 | py | Python | sensors.py | aggronerd/pi_robot | 1944d02aa03afc839508915de3804960e3d1fc82 | [
"CC-BY-4.0"
] | null | null | null | sensors.py | aggronerd/pi_robot | 1944d02aa03afc839508915de3804960e3d1fc82 | [
"CC-BY-4.0"
] | null | null | null | sensors.py | aggronerd/pi_robot | 1944d02aa03afc839508915de3804960e3d1fc82 | [
"CC-BY-4.0"
] | null | null | null | __author__ = 'greg'
| 7 | 19 | 0.666667 | 2 | 21 | 5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.190476 | 21 | 2 | 20 | 10.5 | 0.588235 | 0 | 0 | 0 | 0 | 0 | 0.2 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
65e7393b852febdc43a8463618aa6b05297af1d2 | 4,544 | py | Python | dbcArchives/2021/000_0-sds-3-x-projects/student-project-20_group-Generalization/01_Background.py | r-e-x-a-g-o-n/scalable-data-science | a97451a768cf12eec9a20fbe5552bbcaf215d662 | [
"Unlicense"
] | 138 | 2017-07-25T06:48:28.000Z | 2022-03-31T12:23:36.000Z | dbcArchives/2021/000_0-sds-3-x-projects/student-project-20_group-Generalization/01_Background.py | r-e-x-a-g-o-n/scalable-data-science | a97451a768cf12eec9a20fbe5552bbcaf215d662 | [
"Unlicense"
] | 11 | 2017-08-17T13:45:54.000Z | 2021-06-04T09:06:53.000Z | dbcArchives/2021/000_0-sds-3-x-projects/student-project-20_group-Generalization/01_Background.py | r-e-x-a-g-o-n/scalable-data-science | a97451a768cf12eec9a20fbe5552bbcaf215d662 | [
"Unlicense"
] | 74 | 2017-08-18T17:04:46.000Z | 2022-03-21T14:30:51.000Z | # Databricks notebook source
# MAGIC %md
# MAGIC ScaDaMaLe Course [site](https://lamastex.github.io/scalable-data-science/sds/3/x/) and [book](https://lamastex.github.io/ScaDaMaLe/index.html)
# COMMAND ----------
# MAGIC %md
# MAGIC # MixUp and Generalization
# MAGIC
# MAGIC Group Project Authors:
# MAGIC
# MAGIC - Olof Zetterqvist
# MAGIC
# MAGIC - Jimmy Aronsson
# MAGIC
# MAGIC - Fredrik Hellström
# MAGIC
# MAGIC Video: https://chalmersuniversity.box.com/s/ubij9bjekg6lcov13kw16kjhk01uzsmy
# COMMAND ----------
# MAGIC %md
# MAGIC
# MAGIC ## Introduction
# MAGIC
# MAGIC The goal of supervised machine learning is to predict labels given examples. Specifically, we want to choose some mapping *f*, referred to as a hypothesis, from a space of examples *X* to a space of labels *Y*. As a concrete example, *X* can be the set of pictures of cats and dogs of a given size, *Y* can be the set *{cat, dog}*, and *f* can be a neural network. To choose *f*, we rely on a set of labelled data. However, our true goal is to perform well on unseen data, i.e., test data. If an algorithm performs similarly well on unseen data as on the training data we used, we say that it *generalizes*.
# MAGIC
# MAGIC A pertinent question, then, is to explain why a model generalizes and using the answer to improve learning algorithms. For overparameterized deep learning methods, this question has yet to be answered conclusively. Recently, a training procedure called MixUp was proposed to improve the generalization capabilities of neural networks [[1]]. The basic idea is that instead of feeding the raw training data to our supervised learning algorithm, we instead use convex combinations of two randomly selected data points. The benefit of this is two-fold. First, it plays the role of data augmentation: the network will never see two completely identical training samples, since we constantly produce new random combinations. Second, the network is encouraged to behave nicely in-between training samples, which has the potential to reduce overfitting. A connection between performance on MixUp data and generalization abilities of networks trained without the MixUp procedure was also studied in [[2]].
# MAGIC
# MAGIC
# MAGIC [1]: https://arxiv.org/abs/1710.09412
# MAGIC [2]: https://arxiv.org/abs/2012.02775
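The convex-combination step described above can be sketched in a few lines. This is our own minimal illustration, not code from the original notebook; the function name `mixup_batch` and the NumPy-based formulation are assumptions made for the sketch:

```python
import numpy as np

def mixup_batch(x, y, alpha=0.2, rng=None):
    """Mix a batch of examples x and one-hot labels y, MixUp-style.

    A single mixing coefficient lam is drawn from Beta(alpha, alpha);
    each sample is combined with a randomly chosen partner from the
    same batch: x_mix = lam * x + (1 - lam) * x[perm].
    """
    if rng is None:
        rng = np.random.default_rng()
    lam = rng.beta(alpha, alpha)      # mixing coefficient in [0, 1]
    perm = rng.permutation(len(x))    # random partner for each sample
    x_mix = lam * x + (1.0 - lam) * x[perm]
    y_mix = lam * y + (1.0 - lam) * y[perm]
    return x_mix, y_mix

# Tiny demo: two examples with one-hot labels for two classes.
x = np.array([[0.0, 0.0], [1.0, 1.0]])
y = np.array([[1.0, 0.0], [0.0, 1.0]])
x_mix, y_mix = mixup_batch(x, y, alpha=0.2, rng=np.random.default_rng(0))
```

Because `lam` lies in [0, 1], each mixed label row remains a valid probability vector, which is what allows the network to be trained directly on the combined targets.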
# COMMAND ----------
# MAGIC %md
# MAGIC **Project description**
# MAGIC
# MAGIC In this project, we will investigate the connection between MixUp and generalization at a large scale by performing a distributed hyperparameter search. We will look at both Random Forests and convolutional neural networks. First, we will train the algorithms without MixUp, and study the connection between MixUp performance and test error. Then, we will train the networks on MixUp data, and see whether directly optimizing MixUp performance will yield more beneficial test errors.
# MAGIC
# MAGIC To make the hyperparameter search distributed and scalable, we will use the Ray Tune package [[3]]. We also planned to use Horovod to enable the individual networks to handle data in a distributed fashion [[4]]. Scalability would then have entered our project in both the scope of the hyperparameter search and the size of the data set. However, we had unexpected GPU problems and were ultimately forced to skip Horovod due to lack of time.
# MAGIC
# MAGIC [3]: https://docs.ray.io/en/master/tune/
# MAGIC [4]: https://github.com/horovod/horovod
# COMMAND ----------
# MAGIC %md
# MAGIC **Summary of findings**
# MAGIC
# MAGIC Our findings were as follows. For Random Forests, we did not find any significant improvement when using MixUp. This may be because Random Forests, since they are not trained iteratively, cannot efficiently utilize MixUp. Furthermore, since Decision Trees are piecewise constant, it is unclear what it would mean to force them to behave nicely in-between training samples. When training a CNN to classify MNIST images, we found practically no difference between training on MixUp data and normal, untouched data. This may be due to MNIST being "too easy". However, for a CNN trained on CIFAR-10, the benefits of MixUp became noticeable. First, training for the same number of epochs on MixUp data as on the normal training data gave a higher accuracy on the validation set. Second, while the network started to overfit on normal data, this did not occur to a significant degree when using MixUp data. This indicates that MixUp can be beneficial when the algorithm and data are sufficiently complex. | 89.098039 | 1,020 | 0.772667 | 708 | 4,544 | 4.95904 | 0.425141 | 0.039875 | 0.017089 | 0.021646 | 0.029621 | 0.021646 | 0.021646 | 0 | 0 | 0 | 0 | 0.009716 | 0.161972 | 4,544 | 51 | 1,020 | 89.098039 | 0.91229 | 0.978653 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
028c6ffa86abe9765ff52b0475c52f17f9ca45d9 | 187 | py | Python | src/main.py | rmulumba/nairobi_ambulance_location | e5be6f36f33a60ce25abba421390e660546f09d8 | [
"MIT"
] | null | null | null | src/main.py | rmulumba/nairobi_ambulance_location | e5be6f36f33a60ce25abba421390e660546f09d8 | [
"MIT"
] | null | null | null | src/main.py | rmulumba/nairobi_ambulance_location | e5be6f36f33a60ce25abba421390e660546f09d8 | [
"MIT"
] | null | null | null | import train_kmeans
import predictions
"""
Training the ML model and making predictions.
"""
if __name__ == "__main__":
    train_kmeans.train()
    predictions.create_prediction_file() | 18.7 | 45 | 0.754011 | 22 | 187 | 5.863636 | 0.727273 | 0.170543 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.149733 | 187 | 10 | 46 | 18.7 | 0.811321 | 0 | 0 | 0 | 0 | 0 | 0.059259 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.4 | 0 | 0.4 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
02920a9ba99661d9abc3fc11132643b4e8d6ca34 | 2,747 | py | Python | Msgrpc/serializers.py | evi1hack/viperpython | 04bf8e31e21385edb58ea9d25296df062197df39 | [
"BSD-3-Clause"
] | 42 | 2021-01-20T15:30:33.000Z | 2022-03-31T07:51:11.000Z | Msgrpc/serializers.py | evi1hack/viperpython | 04bf8e31e21385edb58ea9d25296df062197df39 | [
"BSD-3-Clause"
] | 2 | 2021-08-17T00:16:33.000Z | 2022-02-21T11:37:45.000Z | Msgrpc/serializers.py | evi1hack/viperpython | 04bf8e31e21385edb58ea9d25296df062197df39 | [
"BSD-3-Clause"
] | 28 | 2021-01-22T05:06:39.000Z | 2022-03-31T03:27:42.000Z | # -*- coding: utf-8 -*-
# @File : serializers.py
# @Date : 2018/11/15
# @Desc :
from rest_framework.serializers import *
class SessionLibSerializer(Serializer):
    sessionid = IntegerField()
    # Permissions
    user = CharField(max_length=100)
    is_system = BooleanField()
    is_admin = BooleanField()
    is_in_admin_group = BooleanField()
    is_in_domain = BooleanField()
    is_uac_enable = BooleanField()
    uac_level = IntegerField()
    integrity = CharField(max_length=100)
    # Process information
    pid = IntegerField()
    pname = CharField(max_length=100)
    ppath = CharField(max_length=100)
    puser = CharField(max_length=100)
    parch = CharField(max_length=100)
    processes = ListField()
    load_powershell = BooleanField()
    load_python = BooleanField()
    # Domain information
    domain = CharField(max_length=100)
    # Basic session information
    session_host = CharField(max_length=100)
    type = CharField(max_length=100)
    computer = CharField(max_length=100)
    arch = CharField(max_length=100)
    platform = CharField(max_length=100)
    last_checkin = IntegerField()
    fromnow = IntegerField()
    tunnel_local = CharField(max_length=100)
    tunnel_peer = CharField(max_length=100)
    tunnel_peer_ip = CharField(max_length=100)
    tunnel_peer_locate_zh = CharField(max_length=100)
    tunnel_peer_locate_en = CharField(max_length=100)
    via_exploit = CharField(max_length=100)
    via_payload = CharField(max_length=100)
    os = CharField(max_length=100)
    os_short = CharField(max_length=100)
    logged_on_users = IntegerField()


class PostModuleSerializer(Serializer):
    NAME_ZH = CharField(max_length=100)
    NAME_EN = CharField(max_length=100)
    DESC_ZH = CharField(max_length=100)
    DESC_EN = CharField(max_length=100)
    REQUIRE_SESSION = BooleanField()
    MODULETYPE = CharField(max_length=100)  # module type
    AUTHOR = ListField()  # module author
    PLATFORM = ListField()  # supported platforms
    PERMISSIONS = ListField()
    README = ListField()
    ATTCK = ListField()
    REFERENCES = ListField()
    _custom_param = DictField()  # parameters passed in from the frontend
    _sessionid = IntegerField()  # session id passed in from the frontend
    _ipaddress = CharField(max_length=100)  # IP address passed in from the frontend


class BotModuleSerializer(Serializer):
    NAME_ZH = CharField(max_length=100)
    NAME_EN = CharField(max_length=100)
    DESC_ZH = CharField(max_length=100)
    DESC_EN = CharField(max_length=100)
    MODULETYPE = CharField(max_length=100)  # module type
    AUTHOR = ListField()  # module author
    REFERENCES = ListField()
    README = ListField()
    SEARCH = CharField(max_length=200)
    _custom_param = DictField()  # parameters passed in from the frontend
    _ip = CharField(max_length=100)  # IP address passed in from the frontend
    _port = IntegerField()  # port passed in from the frontend
    _protocol = CharField(max_length=100)  # protocol type passed in from the frontend
| 30.186813 | 62 | 0.702585 | 312 | 2,747 | 5.919872 | 0.323718 | 0.227396 | 0.341094 | 0.386573 | 0.326475 | 0.259881 | 0.226313 | 0.186248 | 0.186248 | 0.127775 | 0 | 0.051771 | 0.198398 | 2,747 | 90 | 63 | 30.522222 | 0.787012 | 0.074627 | 0 | 0.268657 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.014925 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
029bc5b0e983f02b1280e763667efeb4c61846d8 | 352 | py | Python | tubbs/util/string.py | tek/tubbs | cd4c174c31b6c58a6935ca8a5f0f141377a9a04c | [
"MIT"
] | null | null | null | tubbs/util/string.py | tek/tubbs | cd4c174c31b6c58a6935ca8a5f0f141377a9a04c | [
"MIT"
] | null | null | null | tubbs/util/string.py | tek/tubbs | cd4c174c31b6c58a6935ca8a5f0f141377a9a04c | [
"MIT"
] | null | null | null | from typing import Callable, Any
from hues import huestr
from amino import _
def simple_col(msg: Any, col: Callable[[huestr], huestr]) -> str:
    return col(huestr(str(msg))).colorized


def yellow(msg: Any) -> str:
    return simple_col(msg, _.yellow)


def blue(msg: Any) -> str:
    return simple_col(msg, _.blue)
__all__ = ('yellow', 'blue')
| 17.6 | 65 | 0.678977 | 51 | 352 | 4.490196 | 0.352941 | 0.117904 | 0.157205 | 0.131004 | 0.235808 | 0.235808 | 0.235808 | 0 | 0 | 0 | 0 | 0 | 0.181818 | 352 | 19 | 66 | 18.526316 | 0.795139 | 0 | 0 | 0 | 0 | 0 | 0.028409 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.3 | false | 0 | 0.3 | 0.3 | 0.9 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
02aec4f5cc73b348f97e7044d3df436d4855504f | 150 | py | Python | your_projects/noornee_restApi/seriesapi/apps.py | kdj309/H4ckT0b3rF3st-2k21 | 5395c0bfb442a64ad7efc7d83e12e1d08cdb7438 | [
"MIT"
] | 23 | 2021-09-21T15:48:16.000Z | 2022-01-10T10:54:49.000Z | your_projects/noornee_restApi/seriesapi/apps.py | kdj309/H4ckT0b3rF3st-2k21 | 5395c0bfb442a64ad7efc7d83e12e1d08cdb7438 | [
"MIT"
] | 14 | 2021-10-05T07:10:31.000Z | 2021-10-17T04:55:29.000Z | your_projects/noornee_restApi/seriesapi/apps.py | kdj309/H4ckT0b3rF3st-2k21 | 5395c0bfb442a64ad7efc7d83e12e1d08cdb7438 | [
"MIT"
] | 30 | 2021-09-25T19:45:22.000Z | 2021-10-31T19:16:43.000Z | from django.apps import AppConfig
class SeriesapiConfig(AppConfig):
    default_auto_field = 'django.db.models.BigAutoField'
    name = 'seriesapi'
| 21.428571 | 56 | 0.766667 | 17 | 150 | 6.647059 | 0.882353 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.146667 | 150 | 6 | 57 | 25 | 0.882813 | 0 | 0 | 0 | 0 | 0 | 0.253333 | 0.193333 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.25 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 3 |
02b47567ef828e320860bb0213d0f3e57d782088 | 159 | py | Python | scripts/sys/list_jupyter.py | dclong/docker-jupyterlab | a2f62e28fa9473fa2ad844d2be488bb171021e00 | [
"MIT"
] | 18 | 2017-09-28T13:26:41.000Z | 2021-12-16T04:07:53.000Z | scripts/sys/list_jupyter.py | dclong/docker-jupyterlab | a2f62e28fa9473fa2ad844d2be488bb171021e00 | [
"MIT"
] | 5 | 2017-10-19T20:02:21.000Z | 2022-03-19T16:30:26.000Z | scripts/sys/list_jupyter.py | dclong/docker-jupyterlab | a2f62e28fa9473fa2ad844d2be488bb171021e00 | [
"MIT"
] | 14 | 2017-06-19T12:36:00.000Z | 2021-10-02T15:39:42.000Z | #!/usr/bin/env python3
import json
from jupyter_server import serverapp
servers = list(serverapp.list_running_servers())
print(json.dumps(servers, indent=4))
| 22.714286 | 48 | 0.798742 | 23 | 159 | 5.391304 | 0.73913 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.013793 | 0.08805 | 159 | 6 | 49 | 26.5 | 0.841379 | 0.132075 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0.25 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
02cbb2bfd220bee0df951df95d885c1e2e9be41e | 268 | py | Python | src/UnitTests/Infrastructure_UnitTests/flask_run.py | pieterbork/SUPERFREQ | c5aff0a28a299e1146612f60b02d8cefd5fe74a5 | [
"MIT"
] | 3 | 2018-09-14T15:13:33.000Z | 2019-07-16T04:27:45.000Z | src/UnitTests/Infrastructure_UnitTests/flask_run.py | pieterbork/SUPERFREQ | c5aff0a28a299e1146612f60b02d8cefd5fe74a5 | [
"MIT"
] | null | null | null | src/UnitTests/Infrastructure_UnitTests/flask_run.py | pieterbork/SUPERFREQ | c5aff0a28a299e1146612f60b02d8cefd5fe74a5 | [
"MIT"
] | 2 | 2018-01-22T03:11:51.000Z | 2018-02-24T01:28:27.000Z | #!/usr/bin/env python2
#author : Kade Cooper kaco0964@colorado.edu
#name : flask_run.py
#purpose : Test request from flask server for testing the libraries
#date : 2018.03.24
#version: 1.0.10
#version notes (latest): Compatible w/ python2
print "Flask Test Here!"
| 22.333333 | 68 | 0.738806 | 42 | 268 | 4.690476 | 0.880952 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.078947 | 0.149254 | 268 | 11 | 69 | 24.363636 | 0.785088 | 0.843284 | 0 | 0 | 0 | 0 | 0.457143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 3 |
02cbfa18b160c209e7c05a62f6a1bc7ff1626a3d | 259 | py | Python | test.py | osfunapps/os-crypto-py | 0a91f868b1fde3973ce41ca9c4519d5d2edeff46 | [
"MIT"
] | null | null | null | test.py | osfunapps/os-crypto-py | 0a91f868b1fde3973ce41ca9c4519d5d2edeff46 | [
"MIT"
] | null | null | null | test.py | osfunapps/os-crypto-py | 0a91f868b1fde3973ce41ca9c4519d5d2edeff46 | [
"MIT"
] | null | null | null |
#
# files = fh.search_files('/Users/home/Programming/android/coroutine/rwdc-coroutines-materials/starter/app/src/main/res', by_extension='.xml')
# content = fh.get_dir_content("/Users/home/Desktop/apps", False, True, False)
# print(files)
# print(content)
| 28.777778 | 142 | 0.745174 | 36 | 259 | 5.25 | 0.75 | 0.095238 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.07722 | 259 | 8 | 143 | 32.375 | 0.790795 | 0.945946 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
02ec290cdfcb98ded703012c13c31765f17dd4b3 | 608 | py | Python | src/py_dss_tools/model/general/__init__.py | eniovianna/py_dss_tools | 3057fb0b74facd05a362e4e4a588f79f70aa9dd7 | [
"MIT"
] | 3 | 2021-05-29T00:40:10.000Z | 2021-09-30T17:56:14.000Z | src/py_dss_tools/model/general/__init__.py | eniovianna/py_dss_tools | 3057fb0b74facd05a362e4e4a588f79f70aa9dd7 | [
"MIT"
] | null | null | null | src/py_dss_tools/model/general/__init__.py | eniovianna/py_dss_tools | 3057fb0b74facd05a362e4e4a588f79f70aa9dd7 | [
"MIT"
] | 3 | 2021-05-29T00:40:46.000Z | 2022-01-13T22:04:49.000Z | # -*- encoding: utf-8 -*-
"""
Created by Ênio Viana at 22/09/2021 at 23:07:30
Project: py_dss_tools [set, 2021]
"""
from .GeneralElement import GeneralElement
from .CNData import CNData
from .GrowthShape import GrowthShape
from .LineCode import LineCode
from .LineGeometry import LineGeometry
from .LineSpacing import LineSpacing
from .LoadShape import LoadShape
from .PriceShape import PriceShape
from .Spectrum import Spectrum
from .TCCCurve import TCCCurve
from .TSData import TSData
from .TShape import TShape
from .WireData import WireData
from .XFMRCode import XFMRCode
from .XYCurve import XYCurve
| 27.636364 | 48 | 0.802632 | 81 | 608 | 6 | 0.444444 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.036122 | 0.134868 | 608 | 21 | 49 | 28.952381 | 0.887833 | 0.174342 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 3 |
02f7d8f80addc6b46ffb7dfb5252769704a0da75 | 72 | py | Python | content/_build/jupyter_execute/notebooks/solar_resource_data_overview.py | AssessingSolar/Solar-Resource-Assessment-in-Python | 230558004b0cabd17d52198fd1901fe36663a036 | [
"BSD-3-Clause"
] | 3 | 2021-03-17T15:21:07.000Z | 2021-08-25T07:27:24.000Z | content/_build/jupyter_execute/notebooks/solar_resource_data_overview.py | AssessingSolar/Solar-Resource-Assessment-in-Python | 230558004b0cabd17d52198fd1901fe36663a036 | [
"BSD-3-Clause"
] | 3 | 2021-07-23T17:55:22.000Z | 2021-09-03T15:23:37.000Z | content/_build/jupyter_execute/notebooks/solar_resource_data_overview.py | AssessingSolar/Solar-Resource-Assessment-in-Python | 230558004b0cabd17d52198fd1901fe36663a036 | [
"BSD-3-Clause"
] | null | null | null | # Overview
This section is still under construction. Come back soon!
| 12 | 57 | 0.763889 | 10 | 72 | 5.5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.194444 | 72 | 5 | 58 | 14.4 | 0.948276 | 0.111111 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
02f8e5c2b4635c02b05d61ca1fe1392be417a995 | 578 | py | Python | joplin_web/api/permissions.py | kuyper/joplin-web | 7a13b75cbb55741ddfb58767af34c7ad164fec11 | [
"BSD-3-Clause"
] | null | null | null | joplin_web/api/permissions.py | kuyper/joplin-web | 7a13b75cbb55741ddfb58767af34c7ad164fec11 | [
"BSD-3-Clause"
] | null | null | null | joplin_web/api/permissions.py | kuyper/joplin-web | 7a13b75cbb55741ddfb58767af34c7ad164fec11 | [
"BSD-3-Clause"
] | 1 | 2019-12-13T15:18:58.000Z | 2019-12-13T15:18:58.000Z | from rest_framework import permissions
class DjangoModelPermissions(permissions.BasePermission):
perms_map = {
'GET': [],
'OPTIONS': [],
'HEAD': [],
'POST': ['joplin_web.add_folders', 'joplin_web.add_notes', 'joplin_web.add_tags'],
'PUT': ['joplin_web.change_folders', 'joplin_web.change_notes', 'joplin_web.change_tags'],
'PATCH': ['joplin_web.change_folders', 'joplin_web.change_notes', 'joplin_web.change_tags'],
'DELETE': ['joplin_web.delete_folders', 'joplin_web.delete_notes', 'joplin_web.delete_tags'],
} | 41.285714 | 101 | 0.66955 | 66 | 578 | 5.469697 | 0.363636 | 0.299169 | 0.249307 | 0.121884 | 0.33795 | 0.33795 | 0.33795 | 0.33795 | 0.33795 | 0.33795 | 0 | 0 | 0.16955 | 578 | 14 | 102 | 41.285714 | 0.752083 | 0 | 0 | 0 | 0 | 0 | 0.523316 | 0.400691 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.090909 | 0 | 0.272727 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
f308795af04006be9fdaac76a7b49acecf8b3892 | 94 | py | Python | tests/__init__.py | amertkara/bittorent-parser | 68c1cc555ba95c481ebed8f14808f7e588523aa5 | [
"Apache-2.0"
] | null | null | null | tests/__init__.py | amertkara/bittorent-parser | 68c1cc555ba95c481ebed8f14808f7e588523aa5 | [
"Apache-2.0"
] | null | null | null | tests/__init__.py | amertkara/bittorent-parser | 68c1cc555ba95c481ebed8f14808f7e588523aa5 | [
"Apache-2.0"
] | null | null | null | # -*- coding: utf-8 -*-
from tests.test_btparser import TestBtparser
__all__ = [TestBtparser] | 23.5 | 44 | 0.734043 | 11 | 94 | 5.818182 | 0.909091 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.012195 | 0.12766 | 94 | 4 | 45 | 23.5 | 0.768293 | 0.223404 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
f30ac680b22c7531b0dfca6a102d94654b064e54 | 461 | py | Python | chapter_8/CTCI_8_11.py | ztaylor2/cracking-the-coding-interview | 0587d233d76f99481667a96806acd6dd007aa5e6 | [
"MIT"
] | null | null | null | chapter_8/CTCI_8_11.py | ztaylor2/cracking-the-coding-interview | 0587d233d76f99481667a96806acd6dd007aa5e6 | [
"MIT"
] | null | null | null | chapter_8/CTCI_8_11.py | ztaylor2/cracking-the-coding-interview | 0587d233d76f99481667a96806acd6dd007aa5e6 | [
"MIT"
] | null | null | null | """Coins.
Given an infinite supply of quarters, dimes, nickels, and pennies,
write code to count the number of ways to represent n cents.
"""
def calc_num_coins(n):
"""Calc the num of coins."""
def add_coin(total=0):
"""."""
if total > n:
return 0
if total == n:
return 1
return add_coin(total + 1) + add_coin(total + 5) + add_coin(total + 10) + add_coin(total + 25)
return add_coin()
| 23.05 | 102 | 0.585683 | 68 | 461 | 3.852941 | 0.485294 | 0.160305 | 0.229008 | 0.068702 | 0.114504 | 0 | 0 | 0 | 0 | 0 | 0 | 0.027778 | 0.29718 | 461 | 19 | 103 | 24.263158 | 0.780864 | 0.353579 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0 | 0 | 0.75 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 3 |
f3107baed490cce37d00586355bcbd6a0355c412 | 256 | py | Python | hackerrank/python/list-comprehensions.py | Ashindustry007/competitive-programming | 2eabd3975c029d235abb7854569593d334acae2f | [
"WTFPL"
] | 506 | 2018-08-22T10:30:38.000Z | 2022-03-31T10:01:49.000Z | hackerrank/python/list-comprehensions.py | Ashindustry007/competitive-programming | 2eabd3975c029d235abb7854569593d334acae2f | [
"WTFPL"
] | 13 | 2019-08-07T18:31:18.000Z | 2020-12-15T21:54:41.000Z | hackerrank/python/list-comprehensions.py | Ashindustry007/competitive-programming | 2eabd3975c029d235abb7854569593d334acae2f | [
"WTFPL"
] | 234 | 2018-08-06T17:11:41.000Z | 2022-03-26T10:56:42.000Z | #!/usr/bin/env python2
# https://www.hackerrank.com/challenges/list-comprehensions
from sys import stdin
X, Y, Z, N = [int(stdin.readline()) for i in range(4)]
print [[x, y, z] for x in range(X+1) for y in range(Y+1) for z in range(Z+1) if x + y + z != N]
| 42.666667 | 95 | 0.65625 | 54 | 256 | 3.111111 | 0.537037 | 0.166667 | 0.053571 | 0.047619 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.023364 | 0.164063 | 256 | 5 | 96 | 51.2 | 0.761682 | 0.308594 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.333333 | null | null | 0.333333 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
b823352d799c90682c7995fc1d4bdd717097a69c | 10,270 | py | Python | tests/test_cli.py | aldanor/conda | f5e6cfb05c80f83b1f84ab8ed2ec42f3e167900d | [
"BSD-3-Clause"
] | null | null | null | tests/test_cli.py | aldanor/conda | f5e6cfb05c80f83b1f84ab8ed2ec42f3e167900d | [
"BSD-3-Clause"
] | null | null | null | tests/test_cli.py | aldanor/conda | f5e6cfb05c80f83b1f84ab8ed2ec42f3e167900d | [
"BSD-3-Clause"
] | null | null | null | import unittest
from conda.cli.common import arg2spec, spec_from_line
from conda.compat import text_type
from tests.helpers import capture_with_argv, capture_json_with_argv
class TestArg2Spec(unittest.TestCase):
def test_simple(self):
self.assertEqual(arg2spec('python'), 'python')
self.assertEqual(arg2spec('python=2.6'), 'python 2.6*')
self.assertEqual(arg2spec('ipython=0.13.2'), 'ipython 0.13.2*')
self.assertEqual(arg2spec('ipython=0.13.0'), 'ipython 0.13|0.13.0*')
self.assertEqual(arg2spec('foo=1.3.0=3'), 'foo 1.3.0 3')
def test_pip_style(self):
self.assertEqual(arg2spec('foo>=1.3'), 'foo >=1.3')
self.assertEqual(arg2spec('zope.int>=1.3,<3.0'), 'zope.int >=1.3,<3.0')
self.assertEqual(arg2spec('numpy >=1.9'), 'numpy >=1.9')
def test_invalid(self):
self.assertRaises(SystemExit, arg2spec, '!xyz 1.3')
class TestSpecFromLine(unittest.TestCase):
def test_invalid(self):
self.assertEqual(spec_from_line('='), None)
self.assertEqual(spec_from_line('foo 1.0'), None)
def test_conda_style(self):
self.assertEqual(spec_from_line('foo'), 'foo')
self.assertEqual(spec_from_line('foo=1.0'), 'foo 1.0')
self.assertEqual(spec_from_line('foo=1.0*'), 'foo 1.0*')
self.assertEqual(spec_from_line('foo=1.0|1.2'), 'foo 1.0|1.2')
self.assertEqual(spec_from_line('foo=1.0=2'), 'foo 1.0 2')
def test_pip_style(self):
self.assertEqual(spec_from_line('foo>=1.0'), 'foo >=1.0')
self.assertEqual(spec_from_line('foo >=1.0'), 'foo >=1.0')
self.assertEqual(spec_from_line('FOO-Bar >=1.0'), 'foo-bar >=1.0')
self.assertEqual(spec_from_line('foo >= 1.0'), 'foo >=1.0')
self.assertEqual(spec_from_line('foo > 1.0'), 'foo >1.0')
self.assertEqual(spec_from_line('foo != 1.0'), 'foo !=1.0')
self.assertEqual(spec_from_line('foo <1.0'), 'foo <1.0')
self.assertEqual(spec_from_line('foo >=1.0 , < 2.0'), 'foo >=1.0,<2.0')
class TestJson(unittest.TestCase):
def assertJsonSuccess(self, res):
self.assertIsInstance(res, dict)
self.assertTrue('success' in res)
def assertJsonError(self, res):
self.assertIsInstance(res, dict)
self.assertTrue('error' in res)
def test_clean(self):
res = capture_json_with_argv('conda', 'clean', '--index-cache', '--lock',
'--packages', '--tarballs', '--json')
self.assertJsonSuccess(res)
def test_config(self):
res = capture_json_with_argv('conda', 'config', '--get', '--json')
self.assertJsonSuccess(res)
res = capture_json_with_argv('conda', 'config', '--get', 'channels',
'--json')
self.assertJsonSuccess(res)
res = capture_json_with_argv('conda', 'config', '--get', 'channels',
'--system', '--json')
self.assertJsonSuccess(res)
res = capture_json_with_argv('conda', 'config', '--get', 'channels',
'--file', 'tmpfile.rc', '--json')
self.assertJsonSuccess(res)
res = capture_json_with_argv('conda', 'config', '--add', 'channels',
'binstar', '--json')
self.assertIsInstance(res, dict)
res = capture_json_with_argv('conda', 'config', '--add', 'channels',
'binstar', '--force', '--json')
self.assertJsonSuccess(res)
res = capture_json_with_argv('conda', 'config', '--remove', 'channels',
'binstar', '--json')
self.assertJsonError(res)
res = capture_json_with_argv('conda', 'config', '--remove', 'channels',
'binstar', '--force', '--json')
self.assertJsonSuccess(res)
res = capture_json_with_argv('conda', 'config', '--remove', 'channels',
'nonexistent', '--force', '--json')
self.assertJsonError(res)
res = capture_json_with_argv('conda', 'config', '--remove', 'envs_dirs',
'binstar', '--json')
self.assertJsonError(res)
res = capture_json_with_argv('conda', 'config', '--set', 'use_pip',
'yes', '--json')
self.assertJsonSuccess(res)
res = capture_json_with_argv('conda', 'config', '--get', 'use_pip',
'--json')
self.assertJsonSuccess(res)
self.assertTrue(res['get']['use_pip'])
res = capture_json_with_argv('conda', 'config', '--remove-key', 'use_pip',
'--json')
self.assertJsonError(res)
res = capture_json_with_argv('conda', 'config', '--remove-key', 'use_pip',
'--force', '--json')
self.assertJsonSuccess(res)
res = capture_json_with_argv('conda', 'config', '--remove-key', 'use_pip',
'--force', '--json')
self.assertJsonError(res)
def test_info(self):
res = capture_json_with_argv('conda', 'info', '--json')
keys = ('channels', 'conda_version', 'default_prefix', 'envs',
'envs_dirs', 'is_foreign', 'pkgs_dirs', 'platform',
'python_version', 'rc_path', 'root_prefix', 'root_writable')
self.assertTrue(all(key in res for key in keys))
res = capture_json_with_argv('conda', 'info', 'conda', '--json')
self.assertIsInstance(res, dict)
self.assertTrue('conda' in res)
self.assertIsInstance(res['conda'], list)
def test_install(self):
res = capture_json_with_argv('conda', 'install', 'pip', '--json', '--quiet')
self.assertJsonSuccess(res)
res = capture_json_with_argv('conda', 'update', 'pip', '--json', '--quiet')
self.assertJsonSuccess(res)
res = capture_json_with_argv('conda', 'remove', 'pip', '--json', '--quiet')
self.assertJsonSuccess(res)
res = capture_json_with_argv('conda', 'remove', 'pip', '--json', '--quiet')
self.assertJsonError(res)
res = capture_json_with_argv('conda', 'update', 'pip', '--json', '--quiet')
self.assertJsonError(res)
res = capture_json_with_argv('conda', 'install', 'pip=1.5.5', '--json', '--quiet')
self.assertJsonSuccess(res)
res = capture_json_with_argv('conda', 'install', '=', '--json', '--quiet')
self.assertJsonError(res)
res = capture_json_with_argv('conda', 'remove', '-n', 'testing',
'--all', '--json', '--quiet')
self.assertJsonSuccess(res)
res = capture_json_with_argv('conda', 'remove', '-n', 'testing',
'--all', '--json', '--quiet')
self.assertJsonSuccess(res)
res = capture_json_with_argv('conda', 'remove', '-n', 'testing2',
'--all', '--json', '--quiet')
self.assertJsonSuccess(res)
res = capture_json_with_argv('conda', 'create', '-n', 'testing',
'python', '--json', '--quiet')
self.assertJsonSuccess(res)
res = capture_json_with_argv('conda', 'install', '-n', 'testing',
'python', '--json', '--quiet')
self.assertJsonSuccess(res)
res = capture_json_with_argv('conda', 'install', '--dry-run',
'python', '--json', '--quiet')
self.assertJsonSuccess(res)
res = capture_json_with_argv('conda', 'create', '--clone', 'testing',
'-n', 'testing2', '--json', '--quiet')
self.assertJsonSuccess(res)
def test_run(self):
res = capture_json_with_argv('conda', 'run', 'not_installed', '--json')
self.assertJsonError(res)
res = capture_json_with_argv('conda', 'run', 'not_installed-0.1-py27_0.tar.bz2', '--json')
self.assertJsonError(res)
def test_list(self):
res = capture_json_with_argv('conda', 'list', '--json')
self.assertIsInstance(res, list)
res = capture_json_with_argv('conda', 'list', '-r', '--json')
self.assertTrue(isinstance(res, list) or
(isinstance(res, dict) and 'error' in res))
res = capture_json_with_argv('conda', 'list', 'ipython', '--json')
self.assertIsInstance(res, list)
res = capture_json_with_argv('conda', 'list', '--name', 'nonexistent', '--json')
self.assertJsonError(res)
res = capture_json_with_argv('conda', 'list', '--name', 'nonexistent', '-r', '--json')
self.assertJsonError(res)
def test_search(self):
res = capture_json_with_argv('conda', 'search', '--json')
self.assertIsInstance(res, dict)
self.assertIsInstance(res['_license'], list)
self.assertIsInstance(res['_license'][0], dict)
keys = ('build', 'channel', 'extracted', 'features', 'fn',
'installed', 'version')
self.assertTrue(all(key in res['_license'][0] for key in keys))
for res in (capture_json_with_argv('conda', 'search', 'ipython', '--json'),
capture_json_with_argv('conda', 'search', '--unknown', '--json'),
capture_json_with_argv('conda', 'search', '--use-index-cache', '--json'),
capture_json_with_argv('conda', 'search', '--outdated', '--json'),
capture_json_with_argv('conda', 'search', '-c', 'https://conda.binstar.org/asmeurer', '--json'),
capture_json_with_argv('conda', 'search', '-c', 'https://conda.binstar.org/asmeurer', '--override-channels', '--json'),
capture_json_with_argv('conda', 'search', '--platform', 'win-32', '--json'),):
self.assertIsInstance(res, dict)
res = capture_json_with_argv('conda', 'search', '*', '--json')
self.assertJsonError(res)
res = capture_json_with_argv('conda', 'search', '--canonical', '--json')
self.assertIsInstance(res, list)
self.assertIsInstance(res[0], text_type)
if __name__ == '__main__':
unittest.main()
| 42.970711 | 131 | 0.559786 | 1,145 | 10,270 | 4.818341 | 0.119651 | 0.073953 | 0.135943 | 0.172195 | 0.758746 | 0.728113 | 0.655066 | 0.594526 | 0.541961 | 0.510785 | 0 | 0.017911 | 0.260662 | 10,270 | 238 | 132 | 43.151261 | 0.708679 | 0 | 0 | 0.463687 | 0 | 0 | 0.229503 | 0.003116 | 0 | 0 | 0 | 0 | 0.446927 | 1 | 0.083799 | false | 0 | 0.022346 | 0 | 0.122905 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
b82a6e5d5496424e222bd1f2a599b83fccb8e782 | 1,144 | py | Python | code/api/schemas.py | CiscoSecurity/tr-05-serverless-rsa-netwitness | f9d2fe554efceede3c06bcee40062405b1f971d5 | [
"MIT"
] | null | null | null | code/api/schemas.py | CiscoSecurity/tr-05-serverless-rsa-netwitness | f9d2fe554efceede3c06bcee40062405b1f971d5 | [
"MIT"
] | null | null | null | code/api/schemas.py | CiscoSecurity/tr-05-serverless-rsa-netwitness | f9d2fe554efceede3c06bcee40062405b1f971d5 | [
"MIT"
] | null | null | null | from marshmallow import ValidationError, Schema, fields
def validate_string(value):
if value == '':
raise ValidationError('Field may not be blank.')
class ObservableSchema(Schema):
type = fields.String(
validate=validate_string,
required=True,
)
value = fields.String(
validate=validate_string,
required=True,
)
class NetwitnessSchema(Schema):
sessionid = fields.Str(required=True)
time = fields.DateTime(required=True)
eth_src = fields.Str(required=False, data_key='eth.src')
eth_dst = fields.Str(required=False, data_key='eth.dst')
ip_src = fields.Str(required=False, data_key='ip.src')
ip_dst = fields.Str(required=False, data_key='ip.dst')
proto = fields.Str(required=False, data_key='ip.proto')
service = fields.Str(required=False)
netname = fields.Str(required=False)
direction = fields.Str(required=False)
filename = fields.Str(required=False)
username = fields.Str(required=False)
packets = fields.Str(required=False)
did = fields.Str(required=False)
domain = fields.Str(required=False, data_key='alias.host')
| 31.777778 | 62 | 0.691434 | 144 | 1,144 | 5.402778 | 0.3125 | 0.161954 | 0.305913 | 0.367609 | 0.372751 | 0.372751 | 0.335476 | 0 | 0 | 0 | 0 | 0 | 0.184441 | 1,144 | 35 | 63 | 32.685714 | 0.833869 | 0 | 0 | 0.137931 | 0 | 0 | 0.058566 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.034483 | false | 0 | 0.034483 | 0 | 0.724138 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 3 |
b834189505926cee42121bea448f9fcce646536c | 72 | py | Python | django_boost/test/__init__.py | toshiki-tosshi/django-boost | 2431b743af2d976571d491ae232a5cb03c760b7e | [
"MIT"
] | 25 | 2019-05-23T11:19:18.000Z | 2022-02-19T15:28:09.000Z | django_boost/test/__init__.py | toshiki-tosshi/django-boost | 2431b743af2d976571d491ae232a5cb03c760b7e | [
"MIT"
] | 49 | 2019-09-17T08:40:22.000Z | 2022-03-02T14:08:27.000Z | django_boost/test/__init__.py | toshiki-tosshi/django-boost | 2431b743af2d976571d491ae232a5cb03c760b7e | [
"MIT"
] | 4 | 2019-09-17T08:16:55.000Z | 2020-08-24T09:33:16.000Z | from django_boost.test.testcase import TestCase
__all__ = ["TestCase"]
| 18 | 47 | 0.791667 | 9 | 72 | 5.777778 | 0.777778 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.111111 | 72 | 3 | 48 | 24 | 0.8125 | 0 | 0 | 0 | 0 | 0 | 0.111111 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
b83e946d974761664394e9a435f8ac025421b60d | 869 | py | Python | src/api/error_handlers.py | CallumHoughton18/Mushroom-Classification | 10e376a925147f82dd73c69f117fb0d95cc7725f | [
"MIT"
] | 1 | 2021-01-17T19:44:13.000Z | 2021-01-17T19:44:13.000Z | src/api/error_handlers.py | mbeacom/Mushroom-Classification | ab8a08498c93aa0d307dbcfb37b00f27a63055df | [
"MIT"
] | null | null | null | src/api/error_handlers.py | mbeacom/Mushroom-Classification | ab8a08498c93aa0d307dbcfb37b00f27a63055df | [
"MIT"
] | 1 | 2021-01-17T19:44:30.000Z | 2021-01-17T19:44:30.000Z | """Contains error handler functions which can be registered to the flask application"""
from werkzeug.exceptions import HTTPException
from api.custom_logger import get_custom_logger, LoggerType
from api.helpers import create_error_response
def unhandled_exception_handler(error: Exception):
"""
Handles all uncaught exceptions, should be registered to the app
upon initialization
"""
get_custom_logger(LoggerType.FAILURE).critical('Unhandled Exception: %s', error, exc_info=True)
return create_error_response("Unhandled Internal Server Error...", 500)
def http_error_as_json(error: HTTPException):
"""
Turns the given error into a json response containing the error message
"""
get_custom_logger(LoggerType.BASIC).error('(%s)-%s', error.code, error.description)
return create_error_response(error.description, error.code)
| 41.380952 | 99 | 0.773303 | 112 | 869 | 5.830357 | 0.508929 | 0.073507 | 0.068913 | 0.114855 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004021 | 0.141542 | 869 | 20 | 100 | 43.45 | 0.871314 | 0.273878 | 0 | 0 | 0 | 0 | 0.108291 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.222222 | false | 0 | 0.333333 | 0 | 0.777778 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 3 |
b858a3053e1d08d60031959a2025aa546fc2c484 | 864 | py | Python | tests/functional/test_home_page.py | vyahello/fake-cars-api | 13c7325a7d8779d4b2e5ce60d5664b843c891cb6 | [
"MIT"
] | null | null | null | tests/functional/test_home_page.py | vyahello/fake-cars-api | 13c7325a7d8779d4b2e5ce60d5664b843c891cb6 | [
"MIT"
] | 3 | 2019-11-22T20:56:17.000Z | 2021-09-15T08:18:30.000Z | tests/functional/test_home_page.py | vyahello/fake-vehicles-api | 13c7325a7d8779d4b2e5ce60d5664b843c891cb6 | [
"MIT"
] | null | null | null | import pytest
import requests
from apistar import TestClient
from api.web.support import Status
@pytest.fixture(scope="module")
def response_home(client: TestClient) -> requests.Response:
return client.get("/")
@pytest.fixture(scope="module")
def response_index(client: TestClient) -> requests.Response:
return client.get("/index.html")
def test_home_status_code(response_home: requests.Response) -> None:
assert response_home.status_code == Status.SUCCESS.code
def test_home_status_content(response_home: requests.Response) -> None:
assert "Fake vehicles" in response_home.text
def test_index_status_code(response_index: requests.Response) -> None:
assert response_index.status_code == Status.SUCCESS.code
def test_index_status_content(response_index: requests.Response) -> None:
assert "Fake vehicles" in response_index.text
| 27 | 73 | 0.780093 | 114 | 864 | 5.701754 | 0.27193 | 0.147692 | 0.123077 | 0.16 | 0.686154 | 0.661538 | 0.396923 | 0.147692 | 0 | 0 | 0 | 0 | 0.119213 | 864 | 31 | 74 | 27.870968 | 0.854139 | 0 | 0 | 0.111111 | 0 | 0 | 0.05787 | 0 | 0 | 0 | 0 | 0 | 0.222222 | 1 | 0.333333 | false | 0 | 0.222222 | 0.111111 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
b8702c2b949ad3b93ab50e6b8737174992359fa5 | 2,119 | py | Python | Guides/IP_Finder_and_Validator.py | lanceyvang/blue_team | 2c6056e842df22f889804d6c63d64f1f208f563f | [
"MIT"
] | 1 | 2022-01-26T01:40:01.000Z | 2022-01-26T01:40:01.000Z | Workshops/bt_6 snort and logs/files/IP_Finder_and_Validator.py | lanceyvang/blue_team | 2c6056e842df22f889804d6c63d64f1f208f563f | [
"MIT"
] | null | null | null | Workshops/bt_6 snort and logs/files/IP_Finder_and_Validator.py | lanceyvang/blue_team | 2c6056e842df22f889804d6c63d64f1f208f563f | [
"MIT"
] | 2 | 2020-09-28T20:16:13.000Z | 2021-03-21T01:02:45.000Z | #!/usr/bin/env python
import sys
import re
def create_content():
file = open(sys.argv[1], 'r')
content = file.read()
file.close()
return content
def create_dict(li):
ip_dict = {}
for ip_address in li:
if ip_address in ip_dict: ip_dict[ip_address] += 1
else: ip_dict[ip_address] = 1
return ip_dict
def sort_ip_li(dict):
def compare_ip(ip):
return int(ip.split('.')[0])
def compare_amount(key):
return dict[key]
first_sort = sorted(dict, key = compare_ip)
return sorted(first_sort, key = compare_amount)
def format_amount(n):
n_str = str(n)
if len(n_str) == 1: n_str = '00' + n_str
elif len(n_str) == 2: n_str = '0' + n_str
return '('+ n_str + ')'
def print_ip_lines(li, dict):
for ip in li:
validate = re.search(r"^(([0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5])\.){3}([0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5])$", ip)
print(format_amount(dict[ip]) + ' ' + str(bool(validate)) + ': ' + ip + ' *' )
def main():
content = create_content()
    all_ips = re.findall(r"[0-9]+\.[0-9]+\.[0-9]+\.[0-9]+", content)
ip_dict = create_dict(all_ips)
sorted_ips = sort_ip_li(ip_dict)
print_ip_lines(sorted_ips, ip_dict)
main()
# OUTPUT
# (001) False: 1.1234.1.1 *
# (001) False: 7.888.8.8 *
# (001) True: 11.11.11.105 *
# (001) True: 11.11.11.95 *
# (001) True: 24.17.237.70 *
# (001) True: 141.101.98.63 *
# (001) True: 141.101.98.43 *
# (001) True: 141.101.97.63 *
# (001) True: 141.101.198.63 *
# (001) True: 141.101.98.53 *
# (001) False: 444.2.2.2 *
# (001) False: 555.1.1.1 *
# (001) False: 777.777.7777.777 *
# (001) False: 888.8888.888.888 *
# (001) False: 999.999.999 *
# (002) True: 2.2.2.2 *
# (002) False: 09.01.02.03 *
# (002) True: 141.102.98.63 *
# (003) True: 11.11.11.89 *
# (003) True: 141.101.98.61 *
# (004) True: 11.11.11.70 *
# (004) True: 192.150.249.87 *
# (004) True: 211.168.230.94 *
# (045) True: 127.0.0.1 *
# (049) True: 211.190.205.93 *
# (050) True: 61.73.94.162 * | 29.430556 | 142 | 0.548372 | 378 | 2,119 | 2.957672 | 0.296296 | 0.021467 | 0.053667 | 0.05814 | 0.150268 | 0.071556 | 0.033989 | 0.033989 | 0.033989 | 0.033989 | 0 | 0.225846 | 0.233129 | 2,119 | 72 | 143 | 29.430556 | 0.462154 | 0.388863 | 0 | 0 | 0 | 0.023256 | 0.115142 | 0.105678 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.046512 | null | null | 0.069767 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
b8712fee4ba5429f3d5950cead07be5d843a8959 | 229 | py | Python | homeassistant/components/acmeda/const.py | domwillcode/home-assistant | f170c80bea70c939c098b5c88320a1c789858958 | [
"Apache-2.0"
] | 30,023 | 2016-04-13T10:17:53.000Z | 2020-03-02T12:56:31.000Z | homeassistant/components/acmeda/const.py | jagadeeshvenkatesh/core | 1bd982668449815fee2105478569f8e4b5670add | [
"Apache-2.0"
] | 31,101 | 2020-03-02T13:00:16.000Z | 2022-03-31T23:57:36.000Z | homeassistant/components/acmeda/const.py | jagadeeshvenkatesh/core | 1bd982668449815fee2105478569f8e4b5670add | [
"Apache-2.0"
] | 11,956 | 2016-04-13T18:42:31.000Z | 2020-03-02T09:32:12.000Z | """Constants for the Rollease Acmeda Automate integration."""
import logging
LOGGER = logging.getLogger(__package__)
DOMAIN = "acmeda"
ACMEDA_HUB_UPDATE = "acmeda_hub_update_{}"
ACMEDA_ENTITY_REMOVE = "acmeda_entity_remove_{}"
| 25.444444 | 61 | 0.79476 | 27 | 229 | 6.222222 | 0.62963 | 0.107143 | 0.178571 | 0.25 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.100437 | 229 | 8 | 62 | 28.625 | 0.815534 | 0.240175 | 0 | 0 | 0 | 0 | 0.291667 | 0.136905 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.2 | 0 | 0.2 | 0 | 1 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
b8b00aa5ab19032d377a00f4d61fd7c150577e89 | 186 | py | Python | Misc/ReinforcementLearning/Main.py | ViRu-ThE-ViRuS/TF_Projects | b009f814177a4efc7972f42ddc6a2fa35f340a53 | [
"MIT"
] | null | null | null | Misc/ReinforcementLearning/Main.py | ViRu-ThE-ViRuS/TF_Projects | b009f814177a4efc7972f42ddc6a2fa35f340a53 | [
"MIT"
] | null | null | null | Misc/ReinforcementLearning/Main.py | ViRu-ThE-ViRuS/TF_Projects | b009f814177a4efc7972f42ddc6a2fa35f340a53 | [
"MIT"
] | null | null | null | from BipedalWalker import *
if __name__ == '__main__':
agent = Agent()
# agent.load()
for _ in range(10):
agent.train(10)
agent.save()
agent.play(1)
| 18.6 | 27 | 0.55914 | 22 | 186 | 4.318182 | 0.727273 | 0.210526 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.03876 | 0.306452 | 186 | 9 | 28 | 20.666667 | 0.697674 | 0.064516 | 0 | 0 | 0 | 0 | 0.046512 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.142857 | 0 | 0.142857 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
b8bae700637322fdfea4aa4c82cd6c8af8065199 | 618 | py | Python | carla_utils/ros/wrapper.py | IamWangYunKai/DG-TrajGen | 0a8aab7e1c05111a5afe43d53801c55942e9ff56 | [
"MIT"
] | 31 | 2021-09-15T00:43:43.000Z | 2022-03-27T22:57:21.000Z | carla_utils/ros/wrapper.py | zhangdongkun98/carla-utils | a370db53589841c8cffe95c8df43dfc036176431 | [
"MIT"
] | 1 | 2021-12-09T03:08:13.000Z | 2021-12-15T07:08:31.000Z | carla_utils/ros/wrapper.py | zhangdongkun98/carla-utils | a370db53589841c8cffe95c8df43dfc036176431 | [
"MIT"
] | 2 | 2021-11-26T05:45:18.000Z | 2022-01-19T12:46:41.000Z |
import rospy
from ..world_map import Core
from .pub_sub import ROSPublish
class PublishWrapper(object):
pub_dict = dict()
def __init__(self, config, node_name='carla_env'):
core: Core = config.core
rospy.init_node('{}_{}_{}'.format(node_name, core.host.replace('.', '_'), str(core.port)), disable_signals=True)
self.global_frame_id = 'map'
self.ros_pubish = ROSPublish(self.pub_dict)
def run_once(self, *args, **kwargs):
return
def run_step(self, *args, **kwargs):
return
def kill(self):
rospy.signal_shutdown('[ROS] kill myself!')
| 22.888889 | 120 | 0.640777 | 80 | 618 | 4.675 | 0.55 | 0.037433 | 0.074866 | 0.106952 | 0.122995 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.218447 | 618 | 26 | 121 | 23.769231 | 0.774327 | 0 | 0 | 0.125 | 0 | 0 | 0.06483 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.1875 | 0.125 | 0.6875 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
b8bc966c2e6a330da0f02fbb0a666891a44a6120 | 156 | py | Python | Curso_Python/if_else.py | FranciscoCabrita1/Cabrita | af9dfb12dbc64cf6181d4e906156170c5449877e | [
"MIT"
] | 5 | 2020-08-24T23:29:58.000Z | 2022-02-07T19:58:07.000Z | Curso_Python/if_else.py | lulavalenca/Curso-Completo-de-Python-no-Youtube | af9dfb12dbc64cf6181d4e906156170c5449877e | [
"MIT"
] | null | null | null | Curso_Python/if_else.py | lulavalenca/Curso-Completo-de-Python-no-Youtube | af9dfb12dbc64cf6181d4e906156170c5449877e | [
"MIT"
] | 2 | 2020-08-24T23:30:06.000Z | 2021-12-23T18:23:38.000Z | carros = ["audi", "bmw", "ferrari","honda"]
for carro in carros:
if carro == "bmw":
print(carro.upper())
else:
print(carro.title()) | 22.285714 | 43 | 0.544872 | 19 | 156 | 4.473684 | 0.684211 | 0.235294 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.25641 | 156 | 7 | 44 | 22.285714 | 0.732759 | 0 | 0 | 0 | 0 | 0 | 0.140127 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
b8c4ec5af5a44250d0fc4273206f3ea2dc7b0c78 | 454 | py | Python | tests/encoders/audio/tfhub/test_trilll.py | boba-and-beer/vectorhub | fc536a59c77755f4051af37338839e24e0add5c4 | [
"Apache-2.0"
] | 385 | 2020-10-26T13:12:11.000Z | 2021-10-07T15:14:48.000Z | tests/encoders/audio/tfhub/test_trilll.py | boba-and-beer/vectorhub | fc536a59c77755f4051af37338839e24e0add5c4 | [
"Apache-2.0"
] | 24 | 2020-10-29T13:16:31.000Z | 2021-08-31T06:47:33.000Z | tests/encoders/audio/tfhub/test_trilll.py | boba-and-beer/vectorhub | fc536a59c77755f4051af37338839e24e0add5c4 | [
"Apache-2.0"
] | 45 | 2020-10-29T15:25:19.000Z | 2021-09-05T21:50:57.000Z | import numpy as np
from vectorhub.encoders.audio.tfhub import Trill2Vec, TrillDistilled2Vec
from ....test_utils import assert_encoder_works
def test_trill_works():
"""
Testing for speech embedding initialization
"""
enc = Trill2Vec()
assert_encoder_works(enc, vector_length=512, data_type='audio')
def test_trill_distilled_works():
enc = TrillDistilled2Vec()
assert_encoder_works(enc, vector_length=2048, data_type='audio')
| 30.266667 | 72 | 0.762115 | 57 | 454 | 5.789474 | 0.54386 | 0.118182 | 0.163636 | 0.127273 | 0.2 | 0.2 | 0 | 0 | 0 | 0 | 0 | 0.028351 | 0.145374 | 454 | 14 | 73 | 32.428571 | 0.822165 | 0.094714 | 0 | 0 | 0 | 0 | 0.025316 | 0 | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0.222222 | false | 0 | 0.333333 | 0 | 0.555556 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 3 |
b8c4fbf8f0903f754f3fa13aa450dc0d10031d17 | 850 | py | Python | Dataset/Leetcode/train/12/361.py | kkcookies99/UAST | fff81885aa07901786141a71e5600a08d7cb4868 | [
"MIT"
] | null | null | null | Dataset/Leetcode/train/12/361.py | kkcookies99/UAST | fff81885aa07901786141a71e5600a08d7cb4868 | [
"MIT"
] | null | null | null | Dataset/Leetcode/train/12/361.py | kkcookies99/UAST | fff81885aa07901786141a71e5600a08d7cb4868 | [
"MIT"
] | null | null | null | class Solution(object):
def XXX(self, num):
"""
:type num: int
:rtype: str
"""
dic = {0:'', 1:'I', 5:'V', 10:'X', 50:'L', 100:'C', 500:'D', 1000:'M'}
rate = 1
number = num
result = ''
while number !=0:
cur_num = number%10
if cur_num < 4:
result += dic[rate]*cur_num
elif cur_num == 4:
result += dic[rate*5]
result += dic[rate]
elif cur_num == 5:
result += dic[rate*5]
elif cur_num<9:
result += dic[rate]*(cur_num-5)
result += dic[rate*5]
else:
result += dic[rate*10]
result += dic[rate]
number = number//10
rate *= 10
return result[::-1]
| 26.5625 | 78 | 0.389412 | 98 | 850 | 3.306122 | 0.387755 | 0.222222 | 0.320988 | 0.12963 | 0.311728 | 0.253086 | 0.12963 | 0 | 0 | 0 | 0 | 0.079295 | 0.465882 | 850 | 31 | 79 | 27.419355 | 0.634361 | 0 | 0 | 0.208333 | 0 | 0 | 0.00885 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
b23327f62a8b9f45d77cba686e3cf09f1104ef36 | 388 | py | Python | gerenciador_tarefas/gerenciador.py | Engcompaulo/gerenciadortarefas | 83df9e4530c25f468e17cdfe88df4be2826443e1 | [
"MIT"
] | null | null | null | gerenciador_tarefas/gerenciador.py | Engcompaulo/gerenciadortarefas | 83df9e4530c25f468e17cdfe88df4be2826443e1 | [
"MIT"
] | null | null | null | gerenciador_tarefas/gerenciador.py | Engcompaulo/gerenciadortarefas | 83df9e4530c25f468e17cdfe88df4be2826443e1 | [
"MIT"
] | null | null | null | from fastapi import FastAPI
app = FastAPI()
"""
TAREFAS = [
{
"id": 1,
"titulo": "titulo",
"descricao": "descricao",
"estado": "Finalizado"
},
{
"id": 2,
"titulo": "titulo",
"descricao": "descricao",
"estado": "Finalizado"
}
]
"""
TAREFAS = []
@app.get("/tarefas")
def listar_tarefas():
return TAREFAS | 15.52 | 33 | 0.487113 | 32 | 388 | 5.875 | 0.5 | 0.12766 | 0.223404 | 0.319149 | 0.489362 | 0.489362 | 0 | 0 | 0 | 0 | 0 | 0.007663 | 0.32732 | 388 | 25 | 34 | 15.52 | 0.712644 | 0 | 0 | 0 | 0 | 0 | 0.065574 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0.166667 | 0.166667 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 3 |
b2556b535a018a00b033cdd9c9b58f9c9f46dd30 | 209 | py | Python | Lesson07/checklist.py | xperthunter/pybioinformatics | d99b71d4c69d2e8a08d0b322551df478f2e85708 | [
"MIT"
] | null | null | null | Lesson07/checklist.py | xperthunter/pybioinformatics | d99b71d4c69d2e8a08d0b322551df478f2e85708 | [
"MIT"
] | null | null | null | Lesson07/checklist.py | xperthunter/pybioinformatics | d99b71d4c69d2e8a08d0b322551df478f2e85708 | [
"MIT"
] | null | null | null | #!/usr/bin/env python3
# Write a program that compares two files of names to find:
# Names unique to file 1
# Names unique to file 2
# Names shared in both files
"""
python3 checklist.py --file1 --file2
"""
| 19 | 59 | 0.708134 | 35 | 209 | 4.228571 | 0.742857 | 0.148649 | 0.175676 | 0.22973 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.035503 | 0.191388 | 209 | 10 | 60 | 20.9 | 0.840237 | 0.904306 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
b275cd1d8a60d39e858466965ae556defe0584f4 | 934 | py | Python | LTIME96C Competition/HOOPS.py | 8Bit1Byte/Codechef-Solutions | a79d64042da04e007c5101d3c784a843df01f852 | [
"MIT"
] | 2 | 2021-05-24T11:20:46.000Z | 2021-06-18T12:21:43.000Z | LTIME96C Competition/HOOPS.py | 8Bit1Byte/CodechefSolutions | a79d64042da04e007c5101d3c784a843df01f852 | [
"MIT"
] | null | null | null | LTIME96C Competition/HOOPS.py | 8Bit1Byte/CodechefSolutions | a79d64042da04e007c5101d3c784a843df01f852 | [
"MIT"
] | null | null | null | '''
Problem Name: Hoop Jump
Problem Code: HOOPS
Problem Link: https://www.codechef.com/problems/HOOPS
Solution Link: https://www.codechef.com/viewsolution/47135989
'''
import os.path
from math import gcd, floor, ceil
from collections import *
import sys
mod = 1000000007
INF = float('inf')
def st(): return list(sys.stdin.readline().strip())
def li(): return list(map(int, sys.stdin.readline().split()))
def ls(): return list(sys.stdin.readline().split())
def mp(): return map(int, sys.stdin.readline().split())
def inp(): return int(sys.stdin.readline())
def pr(n): return sys.stdout.write(str(n)+"\n")
def prl(n): return sys.stdout.write(str(n)+" ")
# for standard i/o
if os.path.exists('input.txt'):
sys.stdin = open('input.txt', 'r')
sys.stdout = open('output.txt', 'w')
def solve(n):
print(n//2+1)
if __name__ == '__main__':
t = inp()
for _ in range(t):
n = inp()
solve(n) | 26.685714 | 65 | 0.648822 | 145 | 934 | 4.117241 | 0.475862 | 0.080402 | 0.134003 | 0.095477 | 0.361809 | 0.184255 | 0.184255 | 0 | 0 | 0 | 0 | 0.025641 | 0.164882 | 934 | 35 | 66 | 26.685714 | 0.739744 | 0.189507 | 0 | 0 | 0 | 0 | 0.059946 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.347826 | false | 0 | 0.173913 | 0.304348 | 0.521739 | 0.043478 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
b276a25a1f8fabf7526303890531c97f7b057241 | 683 | py | Python | crawler/crawler/items.py | krispingal/improved-happiness | 1113e19065fee08a2530bf1fb1d4b2f888155f77 | [
"BSD-3-Clause"
] | null | null | null | crawler/crawler/items.py | krispingal/improved-happiness | 1113e19065fee08a2530bf1fb1d4b2f888155f77 | [
"BSD-3-Clause"
] | null | null | null | crawler/crawler/items.py | krispingal/improved-happiness | 1113e19065fee08a2530bf1fb1d4b2f888155f77 | [
"BSD-3-Clause"
] | null | null | null | # -*- coding: utf-8 -*-
import scrapy
from scrapy.loader.processors import TakeFirst, Compose, MapCompose
def clean_prep_step(prep_step: str):
return prep_step.strip().replace('\n', '')
def extract_num_servings(servings: str):
return servings.split(' ')[0]
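The two processor helpers are plain functions and can be checked in isolation; standalone copies (no scrapy needed):

```python
def clean_prep_step(prep_step: str):
    # strip outer whitespace, then remove embedded newlines
    return prep_step.strip().replace('\n', '')

def extract_num_servings(servings: str):
    # keep only the leading token, e.g. "4 servings" -> "4"
    return servings.split(' ')[0]

print(clean_prep_step("  Mix the\nflour  "))  # -> Mix theflour
print(extract_num_servings("4 servings"))     # -> 4
```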
class Recipe(scrapy.Item):
name = scrapy.Field(input_processor=TakeFirst())
servings = scrapy.Field(input_processor=Compose(TakeFirst(), extract_num_servings))
ingredients = scrapy.Field()
preparation_steps = scrapy.Field(input_processor=MapCompose(clean_prep_step))
rating = scrapy.Field(input_processor=Compose(TakeFirst(), TakeFirst()))
tags = scrapy.Field(input_processor=TakeFirst())
| 34.15 | 87 | 0.743777 | 83 | 683 | 5.927711 | 0.445783 | 0.134146 | 0.162602 | 0.254065 | 0.304878 | 0.166667 | 0 | 0 | 0 | 0 | 0 | 0.003339 | 0.122987 | 683 | 19 | 88 | 35.947368 | 0.81803 | 0.030747 | 0 | 0 | 0 | 0 | 0.004545 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.153846 | false | 0 | 0.153846 | 0.153846 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
b2a464bcdc9772d11a2d32d10d924b785cbd1d9c | 15 | py | Python | easyml/__init__.py | richiefrost/easy-ml | c42293f8f916b11e370e565bb6b5d2f3a330c38a | [
"MIT"
] | null | null | null | easyml/__init__.py | richiefrost/easy-ml | c42293f8f916b11e370e565bb6b5d2f3a330c38a | [
"MIT"
] | null | null | null | easyml/__init__.py | richiefrost/easy-ml | c42293f8f916b11e370e565bb6b5d2f3a330c38a | [
"MIT"
] | null | null | null | name = "easyml" | 15 | 15 | 0.666667 | 2 | 15 | 5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.133333 | 15 | 1 | 15 | 15 | 0.769231 | 0 | 0 | 0 | 0 | 0 | 0.375 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
b2aada57e112be95915aa828d920e94daf6d254c | 2,388 | py | Python | test/vanilla/legacy/Expected/AcceptanceTests/BodyComplexPythonThreeOnly/bodycomplexpython3only/models/__init__.py | cfculhane/autorest.python | 8cbca95faee88d933a58bbbd17b76834faa8d387 | [
"MIT"
] | 35 | 2018-04-03T12:15:53.000Z | 2022-03-11T14:03:34.000Z | test/vanilla/legacy/Expected/AcceptanceTests/BodyComplexPythonThreeOnly/bodycomplexpython3only/models/__init__.py | cfculhane/autorest.python | 8cbca95faee88d933a58bbbd17b76834faa8d387 | [
"MIT"
] | 652 | 2017-08-28T22:44:41.000Z | 2022-03-31T21:20:31.000Z | test/vanilla/legacy/Expected/AcceptanceTests/BodyComplexPythonThreeOnly/bodycomplexpython3only/models/__init__.py | cfculhane/autorest.python | 8cbca95faee88d933a58bbbd17b76834faa8d387 | [
"MIT"
] | 29 | 2017-08-28T20:57:01.000Z | 2022-03-11T14:03:38.000Z | # coding=utf-8
# --------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for license information.
# Code generated by Microsoft (R) AutoRest Code Generator.
# Changes may cause incorrect behavior and will be lost if the code is regenerated.
# --------------------------------------------------------------------------
from ._models_py3 import ArrayWrapper
from ._models_py3 import Basic
from ._models_py3 import BooleanWrapper
from ._models_py3 import ByteWrapper
from ._models_py3 import Cat
from ._models_py3 import Cookiecuttershark
from ._models_py3 import DateWrapper
from ._models_py3 import DatetimeWrapper
from ._models_py3 import Datetimerfc1123Wrapper
from ._models_py3 import DictionaryWrapper
from ._models_py3 import Dog
from ._models_py3 import DotFish
from ._models_py3 import DotFishMarket
from ._models_py3 import DotSalmon
from ._models_py3 import DoubleWrapper
from ._models_py3 import DurationWrapper
from ._models_py3 import Error
from ._models_py3 import Fish
from ._models_py3 import FloatWrapper
from ._models_py3 import Goblinshark
from ._models_py3 import IntWrapper
from ._models_py3 import LongWrapper
from ._models_py3 import MyBaseType
from ._models_py3 import MyDerivedType
from ._models_py3 import Pet
from ._models_py3 import ReadonlyObj
from ._models_py3 import Salmon
from ._models_py3 import Sawshark
from ._models_py3 import Shark
from ._models_py3 import Siamese
from ._models_py3 import SmartSalmon
from ._models_py3 import StringWrapper
from ._auto_rest_complex_test_service_enums import (
CMYKColors,
GoblinSharkColor,
MyKind,
)
__all__ = [
"ArrayWrapper",
"Basic",
"BooleanWrapper",
"ByteWrapper",
"Cat",
"Cookiecuttershark",
"DateWrapper",
"DatetimeWrapper",
"Datetimerfc1123Wrapper",
"DictionaryWrapper",
"Dog",
"DotFish",
"DotFishMarket",
"DotSalmon",
"DoubleWrapper",
"DurationWrapper",
"Error",
"Fish",
"FloatWrapper",
"Goblinshark",
"IntWrapper",
"LongWrapper",
"MyBaseType",
"MyDerivedType",
"Pet",
"ReadonlyObj",
"Salmon",
"Sawshark",
"Shark",
"Siamese",
"SmartSalmon",
"StringWrapper",
"CMYKColors",
"GoblinSharkColor",
"MyKind",
]
| 27.767442 | 94 | 0.707705 | 254 | 2,388 | 6.362205 | 0.334646 | 0.19802 | 0.257426 | 0.376238 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.020459 | 0.160804 | 2,388 | 85 | 95 | 28.094118 | 0.785928 | 0.18928 | 0 | 0 | 0 | 0 | 0.186203 | 0.011411 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.445946 | 0 | 0.445946 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
a234e7db2f4a21cc97dacc89103ec2365850ac11 | 1,825 | py | Python | pyaz/mysql/server/configuration/__init__.py | py-az-cli/py-az-cli | 9a7dc44e360c096a5a2f15595353e9dad88a9792 | [
"MIT"
] | null | null | null | pyaz/mysql/server/configuration/__init__.py | py-az-cli/py-az-cli | 9a7dc44e360c096a5a2f15595353e9dad88a9792 | [
"MIT"
] | null | null | null | pyaz/mysql/server/configuration/__init__.py | py-az-cli/py-az-cli | 9a7dc44e360c096a5a2f15595353e9dad88a9792 | [
"MIT"
] | 1 | 2022-02-03T09:12:01.000Z | 2022-02-03T09:12:01.000Z | '''
Manage configuration values for a server.
'''
from .... pyaz_utils import _call_az
def set(name, resource_group, server_name, value=None):
'''
Update the configuration of a server.
Required Parameters:
- name -- The name of the configuration
- resource_group -- Name of resource group. You can configure the default group using `az configure --defaults group=<name>`
- server_name -- Name of the server. The name can contain only lowercase letters, numbers, and the hyphen (-) character. Minimum 3 characters and maximum 63 characters.
Optional Parameters:
- value -- Value of the configuration. If not provided, configuration value will be set to default.
'''
return _call_az("az mysql server configuration set", locals())
def show(name, resource_group, server_name):
'''
    Get the configuration for a server.
Required Parameters:
- name -- The name of the server configuration.
- resource_group -- Name of resource group. You can configure the default group using `az configure --defaults group=<name>`
- server_name -- Name of the server. The name can contain only lowercase letters, numbers, and the hyphen (-) character. Minimum 3 characters and maximum 63 characters.
'''
return _call_az("az mysql server configuration show", locals())
def list(resource_group, server_name):
'''
List the configuration values for a server.
Required Parameters:
- resource_group -- Name of resource group. You can configure the default group using `az configure --defaults group=<name>`
- server_name -- Name of the server. The name can contain only lowercase letters, numbers, and the hyphen (-) character. Minimum 3 characters and maximum 63 characters.
'''
return _call_az("az mysql server configuration list", locals())
| 42.44186 | 172 | 0.72 | 245 | 1,825 | 5.277551 | 0.228571 | 0.090487 | 0.034803 | 0.046404 | 0.796597 | 0.693736 | 0.693736 | 0.664346 | 0.664346 | 0.600928 | 0 | 0.006135 | 0.196164 | 1,825 | 42 | 173 | 43.452381 | 0.875256 | 0.722192 | 0 | 0 | 0 | 0 | 0.25187 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.428571 | false | 0 | 0.142857 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 3 |
a2487cf5f03166a4425a860c46988fa22e980c83 | 345 | py | Python | ifs.py | hodlar/curso_python | d19d4bdc8011a5ef47b787d448d5feb15a190f2e | [
"CC0-1.0"
] | null | null | null | ifs.py | hodlar/curso_python | d19d4bdc8011a5ef47b787d448d5feb15a190f2e | [
"CC0-1.0"
] | null | null | null | ifs.py | hodlar/curso_python | d19d4bdc8011a5ef47b787d448d5feb15a190f2e | [
"CC0-1.0"
] | null | null | null | Alonso_Position=1
if (Alonso_Position==1):
print("Espectacular Alonso, se ha hecho justicia a pesar del coche")
print("Ya queda menos para ganar el mundal")
elif (Alonso_Position>1):
print("Gran carrera de Alonso, lástima que el coche no esté a la altura")
else:
print("No ha podido terminar la carrera por una avería mecánica")
| 38.333333 | 77 | 0.736232 | 56 | 345 | 4.482143 | 0.678571 | 0.167331 | 0.179283 | 0.159363 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.010601 | 0.17971 | 345 | 8 | 78 | 43.125 | 0.876325 | 0 | 0 | 0 | 0 | 0 | 0.62029 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.5 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 3 |
a25168ece7092cfdb81063ceebddefa33ec5fe51 | 213 | py | Python | tests/models.py | steffann/django-peeringdb | 7151c7807927dfb31f3a6d3b4dd6d8adc7d23363 | [
"Apache-2.0"
] | null | null | null | tests/models.py | steffann/django-peeringdb | 7151c7807927dfb31f3a6d3b4dd6d8adc7d23363 | [
"Apache-2.0"
] | null | null | null | tests/models.py | steffann/django-peeringdb | 7151c7807927dfb31f3a6d3b4dd6d8adc7d23363 | [
"Apache-2.0"
] | null | null | null |
from django.db import models
from django_peeringdb.models import URLField
class FieldModel(models.Model):
url = URLField(null=True, blank=True)
class Meta:
app_label = 'django_peeringdb.tests'
| 19.363636 | 44 | 0.737089 | 28 | 213 | 5.5 | 0.642857 | 0.12987 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.183099 | 213 | 10 | 45 | 21.3 | 0.885057 | 0 | 0 | 0 | 0 | 0 | 0.103774 | 0.103774 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.833333 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 3 |
a25d76b2ce3bbf0fe4fad6359eb816b25f08c709 | 671 | py | Python | app/__init__.py | samzhangjy/Guangdu | c8ac5830d4a615be2f26314c41dbdb96ebbb79f0 | [
"MIT"
] | null | null | null | app/__init__.py | samzhangjy/Guangdu | c8ac5830d4a615be2f26314c41dbdb96ebbb79f0 | [
"MIT"
] | null | null | null | app/__init__.py | samzhangjy/Guangdu | c8ac5830d4a615be2f26314c41dbdb96ebbb79f0 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
# @Author: Sam Zhang
# @Date: 2020-04-10 20:05:32
# @Last Modified by: Sam Zhang
# @Last Modified time: 2020-04-14 11:17:08
from flask import Flask
from .extensions import *
from uuid import uuid4
def create_app():
app = Flask(__name__)
app.config['SECRET_KEY'] = str(uuid4())
bootstrap.init_app(app)
from .main import main as main_bp
app.register_blueprint(main_bp)
from .baidu import baidu as baidu_bp
app.register_blueprint(baidu_bp)
from .google import google as google_bp
app.register_blueprint(google_bp)
from .api import api as api_bp
app.register_blueprint(api_bp)
return app
| 20.96875 | 43 | 0.694486 | 104 | 671 | 4.298077 | 0.442308 | 0.044743 | 0.116331 | 0.196868 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.058271 | 0.207154 | 671 | 31 | 44 | 21.645161 | 0.781955 | 0.210134 | 0 | 0 | 0 | 0 | 0.019084 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.0625 | false | 0 | 0.4375 | 0 | 0.5625 | 0.25 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 3 |
a287103220b6eca63c9989b640133b8ee7281e71 | 434 | py | Python | zepid/causal/doublyrobust/utils.py | joannadiong/zEpid | 7377ed06156d074aa2b571be520e8e004a564353 | [
"MIT"
] | 101 | 2018-12-17T20:32:20.000Z | 2022-03-29T08:51:46.000Z | zepid/causal/doublyrobust/utils.py | joannadiong/zEpid | 7377ed06156d074aa2b571be520e8e004a564353 | [
"MIT"
] | 124 | 2018-12-13T22:30:41.000Z | 2022-02-10T00:24:25.000Z | zepid/causal/doublyrobust/utils.py | joannadiong/zEpid | 7377ed06156d074aa2b571be520e8e004a564353 | [
"MIT"
] | 26 | 2019-02-07T17:45:15.000Z | 2022-01-03T00:39:34.000Z | import numpy as np
# Utilities only meant for the doubly-robust branch
def tmle_unit_bounds(y, mini, maxi, bound):
# bounding for continuous outcomes
v = (y - mini) / (maxi - mini)
v = np.where(np.less(v, bound), bound, v)
v = np.where(np.greater(v, 1-bound), 1-bound, v)
return v
def tmle_unit_unbound(ystar, mini, maxi):
# unbounding of bounded continuous outcomes
return ystar*(maxi - mini) + mini
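Standalone copies of both helpers make the round-trip behavior concrete: interior values map to the unit interval and back exactly, while values at the edges are clipped into `[bound, 1 - bound]`:

```python
import numpy as np

def tmle_unit_bounds(y, mini, maxi, bound):
    # map y into [0, 1], then clip to [bound, 1 - bound]
    v = (y - mini) / (maxi - mini)
    v = np.where(np.less(v, bound), bound, v)
    v = np.where(np.greater(v, 1 - bound), 1 - bound, v)
    return v

def tmle_unit_unbound(ystar, mini, maxi):
    # invert the scaling; exact only for values that were not clipped
    return ystar * (maxi - mini) + mini

y = np.array([0.0, 5.0, 10.0])
v = tmle_unit_bounds(y, 0, 10, 0.005)   # endpoints clipped to 0.005 and 0.995
print(tmle_unit_unbound(v[1], 0, 10))   # interior value recovered exactly
```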
| 25.529412 | 52 | 0.668203 | 68 | 434 | 4.205882 | 0.5 | 0.083916 | 0.076923 | 0.06993 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005882 | 0.21659 | 434 | 16 | 53 | 27.125 | 0.835294 | 0.285714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.125 | 0.125 | 0.625 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
a2c9ec87bb93bef359311c13fc4e1c644963127b | 52 | py | Python | uri/python/1759.py | el-cardu/challenges | 836453415e08b04e08d4e10d2f69257052551fa6 | [
"Unlicense"
] | null | null | null | uri/python/1759.py | el-cardu/challenges | 836453415e08b04e08d4e10d2f69257052551fa6 | [
"Unlicense"
] | null | null | null | uri/python/1759.py | el-cardu/challenges | 836453415e08b04e08d4e10d2f69257052551fa6 | [
"Unlicense"
] | null | null | null | N = int(input())
print(('Ho ' * N).rstrip() + '!')
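The one-liner builds `N` copies of `'Ho '`, trims the trailing space, and appends `'!'`; a worked example with a fixed `N` in place of stdin (`N = 3` is a hypothetical input):

```python
N = 3
answer = ('Ho ' * N).rstrip() + '!'  # 'Ho Ho Ho ' -> 'Ho Ho Ho' -> 'Ho Ho Ho!'
print(answer)  # -> Ho Ho Ho!
```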
| 13 | 33 | 0.442308 | 7 | 52 | 3.285714 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.192308 | 52 | 3 | 34 | 17.333333 | 0.547619 | 0 | 0 | 0 | 0 | 0 | 0.078431 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.5 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 3 |
a2da7f1bce33f1e99a0c92de3b09eda57f84dd83 | 1,197 | py | Python | Python Snippets with Documentation/08 Modules/04 Packages.py | AhmedRaja1/Python-Beginner-s-Starter-Kit | 285cfbeb7207e6531954f21cae3a062f977ee5a0 | [
"MIT"
] | 1 | 2021-09-27T16:47:25.000Z | 2021-09-27T16:47:25.000Z | Python Snippets with Documentation/08 Modules/04 Packages.py | AhmedRaja1/Python-Beginner-s-Starter-Kit | 285cfbeb7207e6531954f21cae3a062f977ee5a0 | [
"MIT"
] | null | null | null | Python Snippets with Documentation/08 Modules/04 Packages.py | AhmedRaja1/Python-Beginner-s-Starter-Kit | 285cfbeb7207e6531954f21cae3a062f977ee5a0 | [
"MIT"
] | 1 | 2021-09-27T16:47:33.000Z | 2021-09-27T16:47:33.000Z | # 04 Packages
# As our application grows we are going to organize it into folders, separating the modules for better organization.
# Here we created a folder "ecommerce" and put the "esales.py" module there.
# We have to add a "__init__.py" file to the "ecommerce" folder.
# When we do that Python treats that folder as a Package.
# A package is a container for one or more modules.
# In file-system terms a package is mapped to a directory and a module is mapped to a file.
import ecommerce.esales # To import the "esales.py" module we have to prefix it with the name of the package.
ecommerce.esales.calc_tax() # To use any of the functions in the "esales.py" module we have to prefix it with the name of the package.
from ecommerce.esales import calc_tax, calc_shipping # This way is better because we don't need to prefix the name of the package every time we want to use a function.
calc_shipping()
calc_tax()
from ecommerce import esales # If we have to import a lot of functions, the above method becomes noisy. We can import the module itself like this instead.
esales.calc_shipping() # And just use "esales" with the "." operator to access the functions in that module.
esales.calc_tax() | 54.409091 | 165 | 0.765246 | 217 | 1,197 | 4.170507 | 0.396313 | 0.026519 | 0.035359 | 0.056354 | 0.127072 | 0.088398 | 0.088398 | 0.088398 | 0.088398 | 0.088398 | 0 | 0.002041 | 0.181287 | 1,197 | 22 | 166 | 54.409091 | 0.921429 | 0.805347 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.375 | 0 | 0.375 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
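The layout the tutorial above describes can be reproduced on the fly; this sketch builds the package in a temp directory (the stub bodies of `calc_tax`/`calc_shipping` are assumptions, not the original module):

```python
import os
import sys
import tempfile

# Build the layout described above on disk:
#   <tmp>/ecommerce/__init__.py   <- marks the folder as a package
#   <tmp>/ecommerce/esales.py     <- defines calc_tax() and calc_shipping()
root = tempfile.mkdtemp()
pkg = os.path.join(root, "ecommerce")
os.makedirs(pkg)
open(os.path.join(pkg, "__init__.py"), "w").close()
with open(os.path.join(pkg, "esales.py"), "w") as f:
    f.write("def calc_tax():\n    return 'tax'\n")
    f.write("def calc_shipping():\n    return 'shipping'\n")

# With the parent directory on sys.path, the package imports as shown above.
sys.path.insert(0, root)
from ecommerce import esales
print(esales.calc_tax())       # -> tax
print(esales.calc_shipping())  # -> shipping
```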
a2e1a85acd463202554711084de757bfee4e7cf7 | 309 | py | Python | plugins/diffusion/__init__.py | bsavelev/medipy | f0da3750a6979750d5f4c96aedc89ad5ae74545f | [
"CECILL-B"
] | null | null | null | plugins/diffusion/__init__.py | bsavelev/medipy | f0da3750a6979750d5f4c96aedc89ad5ae74545f | [
"CECILL-B"
] | null | null | null | plugins/diffusion/__init__.py | bsavelev/medipy | f0da3750a6979750d5f4c96aedc89ad5ae74545f | [
"CECILL-B"
] | 1 | 2022-03-04T05:47:08.000Z | 2022-03-04T05:47:08.000Z | import os.path
import medipy.itk
medipy.itk.load_wrapitk_module(os.path.dirname(__file__), "MediPyDiffusion")
import estimation
import fiber_statistics
import gui
import io
import registration
from spectral_analysis import spectral_analysis
import scalars
import statistics
import tractography
import utils
| 19.3125 | 76 | 0.860841 | 41 | 309 | 6.268293 | 0.560976 | 0.046693 | 0.171206 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.097087 | 309 | 15 | 77 | 20.6 | 0.921147 | 0 | 0 | 0 | 0 | 0 | 0.048544 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.923077 | 0 | 0.923077 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 3 |
a2ec6478304eb0abf0962ea55c86c119f5811fde | 261 | py | Python | draw_a_line.py | ryanbenedetti/turtle-graphics-for-kids | 82868266fa8184ff50d9a99bdcb89f05479d9f59 | [
"MIT"
] | 9 | 2017-08-12T09:35:42.000Z | 2021-07-12T17:23:19.000Z | draw_a_line.py | ryanbenedetti/turtle-graphics-for-kids | 82868266fa8184ff50d9a99bdcb89f05479d9f59 | [
"MIT"
] | null | null | null | draw_a_line.py | ryanbenedetti/turtle-graphics-for-kids | 82868266fa8184ff50d9a99bdcb89f05479d9f59 | [
"MIT"
] | 6 | 2016-12-22T18:01:33.000Z | 2021-07-12T17:23:21.000Z | import turtle
turtle.bgcolor("black")
t = turtle.Pen()
t.pencolor("red")
t.forward(50)
t.pencolor("orange")
t.forward(50)
t.pencolor("yellow")
t.forward(50)
t.pencolor("blue")
t.forward(50)
t.pencolor("indigo")
t.forward(50)
t.pencolor("violet")
t.forward(50)
| 15.352941 | 23 | 0.708812 | 44 | 261 | 4.204545 | 0.340909 | 0.291892 | 0.324324 | 0.297297 | 0.513514 | 0 | 0 | 0 | 0 | 0 | 0 | 0.049587 | 0.072797 | 261 | 16 | 24 | 16.3125 | 0.714876 | 0 | 0 | 0.4 | 0 | 0 | 0.138462 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.066667 | 0 | 0.066667 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
a2ef8a8b0d8d7292d2343f6a68e34bb712d7e975 | 117 | py | Python | kns/test_empty.py | Daiiqi/horikun_toulove | e506e399ea48816921c9ef9a8eea3538fec44bee | [
"Apache-2.0"
] | null | null | null | kns/test_empty.py | Daiiqi/horikun_toulove | e506e399ea48816921c9ef9a8eea3538fec44bee | [
"Apache-2.0"
] | null | null | null | kns/test_empty.py | Daiiqi/horikun_toulove | e506e399ea48816921c9ef9a8eea3538fec44bee | [
"Apache-2.0"
] | null | null | null | # 这是一段空代码,仅创建一个循环并输出log
log("接下来将输出3次”Hello Love!“")
for k in range (3):
log("Hello Love!")
wait(800)
| 16.714286 | 29 | 0.606838 | 16 | 117 | 4.4375 | 0.8125 | 0.253521 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.055556 | 0.230769 | 117 | 6 | 30 | 19.5 | 0.733333 | 0.179487 | 0 | 0 | 0 | 0 | 0.367816 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |