hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
7d53c17daf3f8110688e42fdc0054aec20a9eb6f | 209 | py | Python | remove_duplicates_v2.py | rhthomas/Python-Interview-Problems-for-Practice | cb713c13f6d70851dbde6337944a77940dfabff2 | [
"MIT"
] | null | null | null | remove_duplicates_v2.py | rhthomas/Python-Interview-Problems-for-Practice | cb713c13f6d70851dbde6337944a77940dfabff2 | [
"MIT"
] | null | null | null | remove_duplicates_v2.py | rhthomas/Python-Interview-Problems-for-Practice | cb713c13f6d70851dbde6337944a77940dfabff2 | [
"MIT"
] | 1 | 2019-07-29T12:40:52.000Z | 2019-07-29T12:40:52.000Z | def remove_duplicates_v2(arr):
dedupe_arr = []
for i in arr:
if i not in dedupe_arr:
dedupe_arr.append(i)
return dedupe_arr
result = remove_duplicates_v2([0,0,0,1,1,2,2,3,4,5])
print(result)
| 17.416667 | 49 | 0.669856 | 39 | 209 | 3.410256 | 0.538462 | 0.270677 | 0.180451 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.065868 | 0.200957 | 209 | 11 | 50 | 19 | 0.730539 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0 | 0 | 0 | 0.25 | 0.125 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
7d7c585a7490f7289c0d942c78350af428e79cee | 910 | py | Python | pybioc/s4vectors.py | jlumpe/py-bioconductor | a0e1e1e85d639dfd271bb202a25d41edcf2338dd | [
"MIT"
] | 1 | 2019-08-20T15:22:23.000Z | 2019-08-20T15:22:23.000Z | pybioc/s4vectors.py | jlumpe/py-bioconductor | a0e1e1e85d639dfd271bb202a25d41edcf2338dd | [
"MIT"
] | null | null | null | pybioc/s4vectors.py | jlumpe/py-bioconductor | a0e1e1e85d639dfd271bb202a25d41edcf2338dd | [
"MIT"
] | null | null | null | """Convert S4Vectors objects to and from their Python representations.
Bioconductor package: `S4Vectors <http://bioconductor.org/packages/S4Vectors/>`_
::
Pagès H, Lawrence M and Aboyoun P (2017). S4Vectors: S4 implementation of
vectors and lists. R package version 0.14.6.
"""
import numpy as np
from rpy2.robjects.packages import importr
def dataframe_to_pandas(dataframe):
"""Convert an S4Vector DataFrame to a Pandas DataFrame.
    Requires the pandas package to be installed.
    :param dataframe: rpy2 S4 object corresponding to an S4Vectors DataFrame.
:type dataframe: rpy2.robjects.methods.RS4
:rtype: pandas.DataFrame
"""
import pandas as pd
rbase = importr('base')
colnames = list(rbase.colnames(dataframe))
data = list(map(np.array, dataframe.do_slot('listData')))
    # DataFrame.from_items was deprecated and later removed from pandas;
    # build the frame from a dict and pass columns to preserve column order.
    df = pd.DataFrame(dict(zip(colnames, data)), columns=colnames)
df.index = dataframe.do_slot('rownames')
return df
| 26 | 80 | 0.758242 | 125 | 910 | 5.472 | 0.584 | 0.065789 | 0.04386 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.025478 | 0.137363 | 910 | 34 | 81 | 26.764706 | 0.84586 | 0.592308 | 0 | 0 | 0 | 0 | 0.054496 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.1 | false | 0 | 0.4 | 0 | 0.6 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
7d7c806e009f99193a3c259aa976b15850aca979 | 20,436 | py | Python | sdk/python/pulumi_aws_native/cloudformation/_inputs.py | pulumi/pulumi-aws-native | 1ae4a4d9c2256b2a79ca536f8d8497b28d10e4c3 | [
"Apache-2.0"
] | 29 | 2021-09-30T19:32:07.000Z | 2022-03-22T21:06:08.000Z | sdk/python/pulumi_aws_native/cloudformation/_inputs.py | pulumi/pulumi-aws-native | 1ae4a4d9c2256b2a79ca536f8d8497b28d10e4c3 | [
"Apache-2.0"
] | 232 | 2021-09-30T19:26:26.000Z | 2022-03-31T23:22:06.000Z | sdk/python/pulumi_aws_native/cloudformation/_inputs.py | pulumi/pulumi-aws-native | 1ae4a4d9c2256b2a79ca536f8d8497b28d10e4c3 | [
"Apache-2.0"
] | 4 | 2021-11-10T19:42:01.000Z | 2022-02-05T10:15:49.000Z | # coding=utf-8
# *** WARNING: this file was generated by the Pulumi SDK Generator. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from .. import _utilities
from ._enums import *
__all__ = [
'HookVersionLoggingConfigArgs',
'ManagedExecutionPropertiesArgs',
'ResourceVersionLoggingConfigArgs',
'StackSetAutoDeploymentArgs',
'StackSetDeploymentTargetsArgs',
'StackSetOperationPreferencesArgs',
'StackSetParameterArgs',
'StackSetStackInstancesArgs',
'StackSetTagArgs',
'StackTagArgs',
'TypeActivationLoggingConfigArgs',
]
@pulumi.input_type
class HookVersionLoggingConfigArgs:
def __init__(__self__, *,
log_group_name: Optional[pulumi.Input[str]] = None,
log_role_arn: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] log_group_name: The Amazon CloudWatch log group to which CloudFormation sends error logging information when invoking the type's handlers.
:param pulumi.Input[str] log_role_arn: The ARN of the role that CloudFormation should assume when sending log entries to CloudWatch logs.
"""
if log_group_name is not None:
pulumi.set(__self__, "log_group_name", log_group_name)
if log_role_arn is not None:
pulumi.set(__self__, "log_role_arn", log_role_arn)
@property
@pulumi.getter(name="logGroupName")
def log_group_name(self) -> Optional[pulumi.Input[str]]:
"""
The Amazon CloudWatch log group to which CloudFormation sends error logging information when invoking the type's handlers.
"""
return pulumi.get(self, "log_group_name")
@log_group_name.setter
def log_group_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "log_group_name", value)
@property
@pulumi.getter(name="logRoleArn")
def log_role_arn(self) -> Optional[pulumi.Input[str]]:
"""
The ARN of the role that CloudFormation should assume when sending log entries to CloudWatch logs.
"""
return pulumi.get(self, "log_role_arn")
@log_role_arn.setter
def log_role_arn(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "log_role_arn", value)
@pulumi.input_type
class ManagedExecutionPropertiesArgs:
def __init__(__self__, *,
active: Optional[pulumi.Input[bool]] = None):
"""
Describes whether StackSets performs non-conflicting operations concurrently and queues conflicting operations.
"""
if active is not None:
pulumi.set(__self__, "active", active)
@property
@pulumi.getter
def active(self) -> Optional[pulumi.Input[bool]]:
return pulumi.get(self, "active")
@active.setter
def active(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "active", value)
@pulumi.input_type
class ResourceVersionLoggingConfigArgs:
def __init__(__self__, *,
log_group_name: Optional[pulumi.Input[str]] = None,
log_role_arn: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] log_group_name: The Amazon CloudWatch log group to which CloudFormation sends error logging information when invoking the type's handlers.
:param pulumi.Input[str] log_role_arn: The ARN of the role that CloudFormation should assume when sending log entries to CloudWatch logs.
"""
if log_group_name is not None:
pulumi.set(__self__, "log_group_name", log_group_name)
if log_role_arn is not None:
pulumi.set(__self__, "log_role_arn", log_role_arn)
@property
@pulumi.getter(name="logGroupName")
def log_group_name(self) -> Optional[pulumi.Input[str]]:
"""
The Amazon CloudWatch log group to which CloudFormation sends error logging information when invoking the type's handlers.
"""
return pulumi.get(self, "log_group_name")
@log_group_name.setter
def log_group_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "log_group_name", value)
@property
@pulumi.getter(name="logRoleArn")
def log_role_arn(self) -> Optional[pulumi.Input[str]]:
"""
The ARN of the role that CloudFormation should assume when sending log entries to CloudWatch logs.
"""
return pulumi.get(self, "log_role_arn")
@log_role_arn.setter
def log_role_arn(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "log_role_arn", value)
@pulumi.input_type
class StackSetAutoDeploymentArgs:
def __init__(__self__, *,
enabled: Optional[pulumi.Input[bool]] = None,
retain_stacks_on_account_removal: Optional[pulumi.Input[bool]] = None):
"""
:param pulumi.Input[bool] enabled: If set to true, StackSets automatically deploys additional stack instances to AWS Organizations accounts that are added to a target organization or organizational unit (OU) in the specified Regions. If an account is removed from a target organization or OU, StackSets deletes stack instances from the account in the specified Regions.
:param pulumi.Input[bool] retain_stacks_on_account_removal: If set to true, stack resources are retained when an account is removed from a target organization or OU. If set to false, stack resources are deleted. Specify only if Enabled is set to True.
"""
if enabled is not None:
pulumi.set(__self__, "enabled", enabled)
if retain_stacks_on_account_removal is not None:
pulumi.set(__self__, "retain_stacks_on_account_removal", retain_stacks_on_account_removal)
@property
@pulumi.getter
def enabled(self) -> Optional[pulumi.Input[bool]]:
"""
If set to true, StackSets automatically deploys additional stack instances to AWS Organizations accounts that are added to a target organization or organizational unit (OU) in the specified Regions. If an account is removed from a target organization or OU, StackSets deletes stack instances from the account in the specified Regions.
"""
return pulumi.get(self, "enabled")
@enabled.setter
def enabled(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "enabled", value)
@property
@pulumi.getter(name="retainStacksOnAccountRemoval")
def retain_stacks_on_account_removal(self) -> Optional[pulumi.Input[bool]]:
"""
If set to true, stack resources are retained when an account is removed from a target organization or OU. If set to false, stack resources are deleted. Specify only if Enabled is set to True.
"""
return pulumi.get(self, "retain_stacks_on_account_removal")
@retain_stacks_on_account_removal.setter
def retain_stacks_on_account_removal(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "retain_stacks_on_account_removal", value)
@pulumi.input_type
class StackSetDeploymentTargetsArgs:
def __init__(__self__, *,
accounts: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
organizational_unit_ids: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None):
"""
The AWS OrganizationalUnitIds or Accounts for which to create stack instances in the specified Regions.
:param pulumi.Input[Sequence[pulumi.Input[str]]] accounts: AWS accounts that you want to create stack instances in the specified Region(s) for.
:param pulumi.Input[Sequence[pulumi.Input[str]]] organizational_unit_ids: The organization root ID or organizational unit (OU) IDs to which StackSets deploys.
"""
if accounts is not None:
pulumi.set(__self__, "accounts", accounts)
if organizational_unit_ids is not None:
pulumi.set(__self__, "organizational_unit_ids", organizational_unit_ids)
@property
@pulumi.getter
def accounts(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
AWS accounts that you want to create stack instances in the specified Region(s) for.
"""
return pulumi.get(self, "accounts")
@accounts.setter
def accounts(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "accounts", value)
@property
@pulumi.getter(name="organizationalUnitIds")
def organizational_unit_ids(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
The organization root ID or organizational unit (OU) IDs to which StackSets deploys.
"""
return pulumi.get(self, "organizational_unit_ids")
@organizational_unit_ids.setter
def organizational_unit_ids(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "organizational_unit_ids", value)
@pulumi.input_type
class StackSetOperationPreferencesArgs:
def __init__(__self__, *,
failure_tolerance_count: Optional[pulumi.Input[int]] = None,
failure_tolerance_percentage: Optional[pulumi.Input[int]] = None,
max_concurrent_count: Optional[pulumi.Input[int]] = None,
max_concurrent_percentage: Optional[pulumi.Input[int]] = None,
region_concurrency_type: Optional[pulumi.Input['StackSetRegionConcurrencyType']] = None,
region_order: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None):
"""
The user-specified preferences for how AWS CloudFormation performs a stack set operation.
"""
if failure_tolerance_count is not None:
pulumi.set(__self__, "failure_tolerance_count", failure_tolerance_count)
if failure_tolerance_percentage is not None:
pulumi.set(__self__, "failure_tolerance_percentage", failure_tolerance_percentage)
if max_concurrent_count is not None:
pulumi.set(__self__, "max_concurrent_count", max_concurrent_count)
if max_concurrent_percentage is not None:
pulumi.set(__self__, "max_concurrent_percentage", max_concurrent_percentage)
if region_concurrency_type is not None:
pulumi.set(__self__, "region_concurrency_type", region_concurrency_type)
if region_order is not None:
pulumi.set(__self__, "region_order", region_order)
@property
@pulumi.getter(name="failureToleranceCount")
def failure_tolerance_count(self) -> Optional[pulumi.Input[int]]:
return pulumi.get(self, "failure_tolerance_count")
@failure_tolerance_count.setter
def failure_tolerance_count(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "failure_tolerance_count", value)
@property
@pulumi.getter(name="failureTolerancePercentage")
def failure_tolerance_percentage(self) -> Optional[pulumi.Input[int]]:
return pulumi.get(self, "failure_tolerance_percentage")
@failure_tolerance_percentage.setter
def failure_tolerance_percentage(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "failure_tolerance_percentage", value)
@property
@pulumi.getter(name="maxConcurrentCount")
def max_concurrent_count(self) -> Optional[pulumi.Input[int]]:
return pulumi.get(self, "max_concurrent_count")
@max_concurrent_count.setter
def max_concurrent_count(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "max_concurrent_count", value)
@property
@pulumi.getter(name="maxConcurrentPercentage")
def max_concurrent_percentage(self) -> Optional[pulumi.Input[int]]:
return pulumi.get(self, "max_concurrent_percentage")
@max_concurrent_percentage.setter
def max_concurrent_percentage(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "max_concurrent_percentage", value)
@property
@pulumi.getter(name="regionConcurrencyType")
def region_concurrency_type(self) -> Optional[pulumi.Input['StackSetRegionConcurrencyType']]:
return pulumi.get(self, "region_concurrency_type")
@region_concurrency_type.setter
def region_concurrency_type(self, value: Optional[pulumi.Input['StackSetRegionConcurrencyType']]):
pulumi.set(self, "region_concurrency_type", value)
@property
@pulumi.getter(name="regionOrder")
def region_order(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
return pulumi.get(self, "region_order")
@region_order.setter
def region_order(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "region_order", value)
@pulumi.input_type
class StackSetParameterArgs:
def __init__(__self__, *,
parameter_key: pulumi.Input[str],
parameter_value: pulumi.Input[str]):
"""
:param pulumi.Input[str] parameter_key: The key associated with the parameter. If you don't specify a key and value for a particular parameter, AWS CloudFormation uses the default value that is specified in your template.
:param pulumi.Input[str] parameter_value: The input value associated with the parameter.
"""
pulumi.set(__self__, "parameter_key", parameter_key)
pulumi.set(__self__, "parameter_value", parameter_value)
@property
@pulumi.getter(name="parameterKey")
def parameter_key(self) -> pulumi.Input[str]:
"""
The key associated with the parameter. If you don't specify a key and value for a particular parameter, AWS CloudFormation uses the default value that is specified in your template.
"""
return pulumi.get(self, "parameter_key")
@parameter_key.setter
def parameter_key(self, value: pulumi.Input[str]):
pulumi.set(self, "parameter_key", value)
@property
@pulumi.getter(name="parameterValue")
def parameter_value(self) -> pulumi.Input[str]:
"""
The input value associated with the parameter.
"""
return pulumi.get(self, "parameter_value")
@parameter_value.setter
def parameter_value(self, value: pulumi.Input[str]):
pulumi.set(self, "parameter_value", value)
@pulumi.input_type
class StackSetStackInstancesArgs:
def __init__(__self__, *,
deployment_targets: pulumi.Input['StackSetDeploymentTargetsArgs'],
regions: pulumi.Input[Sequence[pulumi.Input[str]]],
parameter_overrides: Optional[pulumi.Input[Sequence[pulumi.Input['StackSetParameterArgs']]]] = None):
"""
Stack instances in some specific accounts and Regions.
:param pulumi.Input[Sequence[pulumi.Input[str]]] regions: The names of one or more Regions where you want to create stack instances using the specified AWS account(s).
:param pulumi.Input[Sequence[pulumi.Input['StackSetParameterArgs']]] parameter_overrides: A list of stack set parameters whose values you want to override in the selected stack instances.
"""
pulumi.set(__self__, "deployment_targets", deployment_targets)
pulumi.set(__self__, "regions", regions)
if parameter_overrides is not None:
pulumi.set(__self__, "parameter_overrides", parameter_overrides)
@property
@pulumi.getter(name="deploymentTargets")
def deployment_targets(self) -> pulumi.Input['StackSetDeploymentTargetsArgs']:
return pulumi.get(self, "deployment_targets")
@deployment_targets.setter
def deployment_targets(self, value: pulumi.Input['StackSetDeploymentTargetsArgs']):
pulumi.set(self, "deployment_targets", value)
@property
@pulumi.getter
def regions(self) -> pulumi.Input[Sequence[pulumi.Input[str]]]:
"""
The names of one or more Regions where you want to create stack instances using the specified AWS account(s).
"""
return pulumi.get(self, "regions")
@regions.setter
def regions(self, value: pulumi.Input[Sequence[pulumi.Input[str]]]):
pulumi.set(self, "regions", value)
@property
@pulumi.getter(name="parameterOverrides")
def parameter_overrides(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['StackSetParameterArgs']]]]:
"""
A list of stack set parameters whose values you want to override in the selected stack instances.
"""
return pulumi.get(self, "parameter_overrides")
@parameter_overrides.setter
def parameter_overrides(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['StackSetParameterArgs']]]]):
pulumi.set(self, "parameter_overrides", value)
@pulumi.input_type
class StackSetTagArgs:
def __init__(__self__, *,
key: pulumi.Input[str],
value: pulumi.Input[str]):
"""
Tag type enables you to specify a key-value pair that can be used to store information about an AWS CloudFormation StackSet.
:param pulumi.Input[str] key: A string used to identify this tag. You can specify a maximum of 127 characters for a tag key.
:param pulumi.Input[str] value: A string containing the value for this tag. You can specify a maximum of 256 characters for a tag value.
"""
pulumi.set(__self__, "key", key)
pulumi.set(__self__, "value", value)
@property
@pulumi.getter
def key(self) -> pulumi.Input[str]:
"""
A string used to identify this tag. You can specify a maximum of 127 characters for a tag key.
"""
return pulumi.get(self, "key")
@key.setter
def key(self, value: pulumi.Input[str]):
pulumi.set(self, "key", value)
@property
@pulumi.getter
def value(self) -> pulumi.Input[str]:
"""
A string containing the value for this tag. You can specify a maximum of 256 characters for a tag value.
"""
return pulumi.get(self, "value")
@value.setter
def value(self, value: pulumi.Input[str]):
pulumi.set(self, "value", value)
@pulumi.input_type
class StackTagArgs:
def __init__(__self__, *,
key: pulumi.Input[str],
value: pulumi.Input[str]):
pulumi.set(__self__, "key", key)
pulumi.set(__self__, "value", value)
@property
@pulumi.getter
def key(self) -> pulumi.Input[str]:
return pulumi.get(self, "key")
@key.setter
def key(self, value: pulumi.Input[str]):
pulumi.set(self, "key", value)
@property
@pulumi.getter
def value(self) -> pulumi.Input[str]:
return pulumi.get(self, "value")
@value.setter
def value(self, value: pulumi.Input[str]):
pulumi.set(self, "value", value)
@pulumi.input_type
class TypeActivationLoggingConfigArgs:
def __init__(__self__, *,
log_group_name: Optional[pulumi.Input[str]] = None,
log_role_arn: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] log_group_name: The Amazon CloudWatch log group to which CloudFormation sends error logging information when invoking the type's handlers.
:param pulumi.Input[str] log_role_arn: The ARN of the role that CloudFormation should assume when sending log entries to CloudWatch logs.
"""
if log_group_name is not None:
pulumi.set(__self__, "log_group_name", log_group_name)
if log_role_arn is not None:
pulumi.set(__self__, "log_role_arn", log_role_arn)
@property
@pulumi.getter(name="logGroupName")
def log_group_name(self) -> Optional[pulumi.Input[str]]:
"""
The Amazon CloudWatch log group to which CloudFormation sends error logging information when invoking the type's handlers.
"""
return pulumi.get(self, "log_group_name")
@log_group_name.setter
def log_group_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "log_group_name", value)
@property
@pulumi.getter(name="logRoleArn")
def log_role_arn(self) -> Optional[pulumi.Input[str]]:
"""
The ARN of the role that CloudFormation should assume when sending log entries to CloudWatch logs.
"""
return pulumi.get(self, "log_role_arn")
@log_role_arn.setter
def log_role_arn(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "log_role_arn", value)
| 43.113924 | 377 | 0.689225 | 2,492 | 20,436 | 5.441814 | 0.08748 | 0.100583 | 0.062975 | 0.036428 | 0.781653 | 0.679375 | 0.641251 | 0.567584 | 0.530713 | 0.499078 | 0 | 0.000807 | 0.211832 | 20,436 | 473 | 378 | 43.205074 | 0.841125 | 0.26468 | 0 | 0.453947 | 1 | 0 | 0.134905 | 0.076311 | 0 | 0 | 0 | 0 | 0 | 1 | 0.207237 | false | 0 | 0.019737 | 0.032895 | 0.348684 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
7d7d7947fd914b5cc9bc0a9ba606fddf21a1c2c5 | 1,578 | py | Python | Tests/Test_CheckIfMyUserValid.py | rohitgs28/FindMyEmployer | d4b369eb488f44e40ef371ac09847f8ccc39994c | [
"MIT"
] | null | null | null | Tests/Test_CheckIfMyUserValid.py | rohitgs28/FindMyEmployer | d4b369eb488f44e40ef371ac09847f8ccc39994c | [
"MIT"
] | null | null | null | Tests/Test_CheckIfMyUserValid.py | rohitgs28/FindMyEmployer | d4b369eb488f44e40ef371ac09847f8ccc39994c | [
"MIT"
] | null | null | null | import unittest
import mock
from mock import MagicMock,patch
import os.path
import logging
import sys, os
from MockData import Emailid, Password, Messages1
sys.path.append(os.path.abspath(os.path.join('..', 'extensions/')))
import extensions
sys.path.append(os.path.abspath(os.path.join('..', 'LoggingDatabase/')))
import LoggingErrorsinDatabase
sys.path.append(os.path.abspath(os.path.join('..', 'Databaselayer/')))
import CheckIfUserValid
sys.path.append(os.path.abspath(os.path.join('..', 'Businesslayer/')))
import CheckIfMyUserValid
class Test_CheckIfMyUserValid(unittest.TestCase):
def test_isValid_1(self):
ifUserValid = CheckIfMyUserValid.CheckIfMyUserValid(Emailid[1], Password[1], True,True)
result = ifUserValid.isValid()
assert result == True
def test_isValid_2(self):
ifUserValid = CheckIfMyUserValid.CheckIfMyUserValid('qwe', Password[1], False,'Invalid UserId / Password')
result = ifUserValid.isValid()
assert result == 'Invalid UserId / Password'
def test_isValid_3(self):
ifUserValid = CheckIfMyUserValid.CheckIfMyUserValid(Emailid[1], 'qwe', False,'Invalid UserId / Password')
result = ifUserValid.isValid()
assert result == 'Invalid UserId / Password'
def test_isValid_4(self):
ifUserValid = CheckIfMyUserValid.CheckIfMyUserValid(Emailid[1], Password[2], False,'Invalid UserId / Password')
result = ifUserValid.isValid()
assert result == 'Invalid UserId / Password'
if __name__ == '__main__':
unittest.main()
| 36.697674 | 119 | 0.719899 | 176 | 1,578 | 6.357955 | 0.255682 | 0.048257 | 0.112601 | 0.053619 | 0.580876 | 0.548704 | 0.495979 | 0.376229 | 0.376229 | 0.247542 | 0 | 0.008258 | 0.155894 | 1,578 | 42 | 120 | 37.571429 | 0.831832 | 0 | 0 | 0.25 | 0 | 0 | 0.143853 | 0 | 0 | 0 | 0 | 0 | 0.111111 | 1 | 0.111111 | false | 0.222222 | 0.361111 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 2 |
7d88b55959846e1664306da8d5d17c64b3e13ceb | 4,828 | py | Python | cgi-bin/any/tmp.py | 5610110083/Safety-in-residential-project | 000a48f8c5e94f69497a40529f3540d6b1603ad1 | [
"Apache-2.0"
] | null | null | null | cgi-bin/any/tmp.py | 5610110083/Safety-in-residential-project | 000a48f8c5e94f69497a40529f3540d6b1603ad1 | [
"Apache-2.0"
] | null | null | null | cgi-bin/any/tmp.py | 5610110083/Safety-in-residential-project | 000a48f8c5e94f69497a40529f3540d6b1603ad1 | [
"Apache-2.0"
] | null | null | null | #!/usr/bin/python
#Import modules for CGI handling
import cgi, cgitb
import Cookie, os, time
form = cgi.FieldStorage()
device1 = form.getvalue('device1')
if device1 is None:
device1 = 'on'
cookie = Cookie.SimpleCookie()
cookie_string = os.environ.get('HTTP_COOKIE')
def getCookies():
if not cookie_string:
return False
else:
# load() parses the cookie string
cookie.load(cookie_string)
# Use the value attribute of the cookie to get it
txt = str(cookie['login'].value)
if txt == 'success':
return True
else:
return False
if getCookies() == False:
print 'Content-Type: text/html\n'
print '<html><head>'
homeIP = 'siczones.coe.psu.ac.th'
    print ('''<meta http-equiv="refresh" content="0.1;url=http://%s">''' % (homeIP))
print '</head></html>'
else:
print ("Content-type:text/html\r\n\r\n")
print ('''
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="utf-8">
<title>Welcome to server</title>
<link href="../favicon.ico" rel="icon" type="image/x-icon"/>
<link href="../favicon.ico" rel="shortcut icon" type="image/x-icon"/>
<!-- This file has been downloaded from Bootsnipp.com. Enjoy! -->
<meta name="viewport" content="width=device-width, initial-scale=1">
<link href="http://maxcdn.bootstrapcdn.com/bootstrap/3.3.0/css/bootstrap.min.css" rel="stylesheet">
<!-- Custom Fonts -->
<link href="/vendor/font-awesome/css/font-awesome.min.css" rel="stylesheet" type="text/css">
<link href="https://fonts.googleapis.com/css?family=Montserrat:400,700" rel="stylesheet" type="text/css">
<link href='https://fonts.googleapis.com/css?family=Kaushan+Script' rel='stylesheet' type='text/css'>
<link href='https://fonts.googleapis.com/css?family=Droid+Serif:400,700,400italic,700italic' rel='stylesheet' type='text/css'>
<link href='https://fonts.googleapis.com/css?family=Roboto+Slab:400,100,300,700' rel='stylesheet' type='text/css'>
<!-- Theme CSS -->
<link href="../css/agency.css" rel="stylesheet">
<link href="../css/siczones.css" rel="stylesheet">
<script src="http://code.jquery.com/jquery-1.11.1.min.js"></script>
<script src="http://maxcdn.bootstrapcdn.com/bootstrap/3.3.0/js/bootstrap.min.js"></script>
<script>
$(document).ready(function(){
$(window).scroll(function () {
if ($(this).scrollTop() > 50) {
$('#back-to-top').fadeIn();
} else {
$('#back-to-top').fadeOut();
}
});
// scroll body to 0px on click
$('#back-to-top').click(function () {
$('#back-to-top').tooltip('hide');
$('body,html').animate({
scrollTop: 0
}, 800);
return false;
});
$('#back-to-top').tooltip('show');
});
</script>
</head>''')
    print ('''
<body>
<!-- ==================== Nav Tabs ======================= -->
<nav class="nav nav-tabs navbar-default navbar-fixed-top">
<div class = "container">
<ul class="nav nav-tabs">
<li role="presentation" class="active"><a href="index.py"><span class="glyphicon glyphicon-home"/> Home</a></li>
<li role="presentation"><a href="mode.py">Mode</a></li>
<li role="presentation" class="dropdown">
<a class="dropdown-toggle" data-toggle="dropdown" href="#" role="button" aria-haspopup="true" aria-expanded="false">
Other<span class="caret"></span>
</a>
<ul class="dropdown-menu">
<li><a href="status.py">Status</a></li>
<li><a href="device.py">Device</a></li>
<li><a href="alert.py">Alert</a></li>
<li role="separator" class="divider"></li>
<li><a href="logout.py" onmouseover="style.color='red'" onmouseout="style.color='black'">Log out</a></li>
</ul>
</li>
</ul>
</div>
</nav>
<br><br><br>
<div class="container-fluid">
<div class="container">
<div class="row">
<div class="col-sm-4 col-md-3 col-xs-5">
<!-- <img src="/img/brand.png" width="50px" height="50px" alt="Brand" style="display: block; margin-left: auto; margin-right: auto;"> -->
<img src="/img/brand/Brand.png" style="max-height: 100px; display: block; margin-left: auto; margin-right: auto;" class="img-responsive" alt="Header">
<br>
</div>
<div class="col-sm-8 col-md-9 col-xxs-7">
<br>
<brand style="display: block; margin-left: auto; margin-right: auto;">
Safety in residential system
</brand>
<hr>
</div>
</div>
</div>
</div>
<!-- ========================== Nav Tabs ======================= -->
<div class = "container bg-all">
<div class="wrapper">''')
    print ("</html>")
| 37.138462 | 162 | 0.569594 | 619 | 4,828 | 4.436187 | 0.373183 | 0.029133 | 0.030954 | 0.038237 | 0.251639 | 0.183176 | 0.172251 | 0.172251 | 0.130371 | 0.130371 | 0 | 0.017895 | 0.212925 | 4,828 | 129 | 163 | 37.426357 | 0.704737 | 0.026926 | 0 | 0.191304 | 0 | 0.121739 | 0.870232 | 0.214362 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.017391 | null | null | 0.069565 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
7d9317c7026756017842ec057af9ac93f44511ea | 3,724 | py | Python | concordance.py | fjanoos/python | d75d6159cbf3c91b1ca3507ee4bc45afc1ab85dc | [
"Unlicense"
] | 1 | 2020-02-12T02:34:35.000Z | 2020-02-12T02:34:35.000Z | concordance.py | fjanoos/python | d75d6159cbf3c91b1ca3507ee4bc45afc1ab85dc | [
"Unlicense"
] | null | null | null | concordance.py | fjanoos/python | d75d6159cbf3c91b1ca3507ee4bc45afc1ab85dc | [
"Unlicense"
] | null | null | null | #!/usr/bin/python
# Program to generate a concordance list from the text formatted as:
# word_1 { sentence_number_1 : occurrence_count, sentence_number_2: occurrence_count, ... }
# word_2 { sentence_number_1 : occurrence_count, sentence_number_2: occurrence_count, ... }
# Note: A sentence is defined as starting with a word with its first letter capitalized
# and ending with a period. In case this rule is not respected, sentence counts will
# be incorrect
from optparse import OptionParser
import sys

def computeConcordance(file_input, verbose_flag=True):
    word_dictionary = {}
    strip_list = ',:;`\'"()[]{}'
    sentence_number = 1  # keep track of the current sentence
    # read the file and tokenize
    word_list = file_input.read().split()
    curr_word = word_list[0]
    for next_word in word_list[1:]:
        inc_count = 0
        # test for a new sentence
        if curr_word.endswith('.') and next_word.istitle():
            inc_count = 1
            # strip out the trailing .
            curr_word = curr_word.rstrip('.')
            if verbose_flag:
                print 'reading sentence #%d ' % sentence_number
        # convert word to lower-case and strip off certain characters
        word_ = curr_word.lower().strip(strip_list)
        if word_ in word_dictionary.keys():
            if sentence_number in word_dictionary[word_].keys():
                word_dictionary[word_][sentence_number] += 1
            else:
                word_dictionary[word_][sentence_number] = 1
        else:
            word_dictionary[word_] = {sentence_number: 1}
        curr_word = next_word
        sentence_number += inc_count
    # process the last word - remove the trailing fullstop
    word_ = curr_word.lower().strip(strip_list + '.')
    if word_ in word_dictionary.keys():
        if sentence_number in word_dictionary[word_].keys():
            word_dictionary[word_][sentence_number] += 1
        else:
            word_dictionary[word_][sentence_number] = 1
    else:
        word_dictionary[word_] = {sentence_number: 1}
    # do a cleanup: merge entries for 'word.' into the entry for 'word'
    word_list = word_dictionary.keys()
    for word in word_list:
        if word + '.' in word_list:
            cc_word = word_dictionary[word + '.']
            for (k, v) in cc_word.iteritems():
                if k in word_dictionary[word]:
                    word_dictionary[word][k] += v
                else:
                    word_dictionary[word][k] = v
            del word_dictionary[word + '.']
    return word_dictionary

if __name__ == "__main__":
    usage = \
''' %prog [options] input_filename
    Prints the concordance list of the given text file '''
    option_parser = OptionParser(usage)
    option_parser.add_option("-v", "--verbose", action="store_true", dest="verbose", default=False)
    (options, args) = option_parser.parse_args()
    if len(args) != 1:
        #option_parser.error("incorrect number of arguments")
        option_parser.print_help()
        sys.exit()
    try:
        file_input = open(args[0], 'r')
    except IOError:
        print 'cannot open ' + args[0]
    else:
        if options.verbose:
            print 'generating concordance for ' + args[0]
        concordance_list = computeConcordance(file_input, options.verbose)
        word_list = concordance_list.keys()
        word_list.sort()
        for word in word_list:
            print '%25s' % word, ' : ', concordance_list[word]
| 35.807692 | 102 | 0.575994 | 423 | 3,724 | 4.813239 | 0.307329 | 0.123772 | 0.114931 | 0.076621 | 0.299607 | 0.257367 | 0.257367 | 0.257367 | 0.257367 | 0.257367 | 0 | 0.009134 | 0.323845 | 3,724 | 103 | 103 | 36.15534 | 0.799444 | 0.200054 | 0 | 0.276923 | 1 | 0 | 0.041712 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.030769 | null | null | 0.076923 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
7db64f98144c4ec917b1b6f2fa323afee782f798 | 40,419 | py | Python | pysnmp/CXFrameRelay-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 11 | 2021-02-02T16:27:16.000Z | 2021-08-31T06:22:49.000Z | pysnmp/CXFrameRelay-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 75 | 2021-02-24T17:30:31.000Z | 2021-12-08T00:01:18.000Z | pysnmp/CXFrameRelay-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 10 | 2019-04-30T05:51:36.000Z | 2022-02-16T03:33:41.000Z | #
# PySNMP MIB module CXFrameRelay-MIB (http://snmplabs.com/pysmi)
# ASN.1 source file:///Users/davwang4/Dev/mibs.snmplabs.com/asn1/CXFrameRelay-MIB
# Produced by pysmi-0.3.4 at Mon Apr 29 18:17:11 2019
# On host DAVWANG4-M-1475 platform Darwin version 18.5.0 by user davwang4
# Using Python version 3.7.3 (default, Mar 27 2019, 09:23:15)
#
ObjectIdentifier, OctetString, Integer = mibBuilder.importSymbols("ASN1", "ObjectIdentifier", "OctetString", "Integer")
NamedValues, = mibBuilder.importSymbols("ASN1-ENUMERATION", "NamedValues")
ValueSizeConstraint, ValueRangeConstraint, ConstraintsIntersection, ConstraintsUnion, SingleValueConstraint = mibBuilder.importSymbols("ASN1-REFINEMENT", "ValueSizeConstraint", "ValueRangeConstraint", "ConstraintsIntersection", "ConstraintsUnion", "SingleValueConstraint")
cxModuleHwPhysSlot, = mibBuilder.importSymbols("CXModuleHardware-MIB", "cxModuleHwPhysSlot")
SapIndex, Alias, cxFrameRelay = mibBuilder.importSymbols("CXProduct-SMI", "SapIndex", "Alias", "cxFrameRelay")
NotificationGroup, ModuleCompliance = mibBuilder.importSymbols("SNMPv2-CONF", "NotificationGroup", "ModuleCompliance")
Gauge32, iso, NotificationType, TimeTicks, IpAddress, MibIdentifier, Counter64, NotificationType, ModuleIdentity, Counter32, MibScalar, MibTable, MibTableRow, MibTableColumn, Integer32, Bits, ObjectIdentity, Unsigned32 = mibBuilder.importSymbols("SNMPv2-SMI", "Gauge32", "iso", "NotificationType", "TimeTicks", "IpAddress", "MibIdentifier", "Counter64", "NotificationType", "ModuleIdentity", "Counter32", "MibScalar", "MibTable", "MibTableRow", "MibTableColumn", "Integer32", "Bits", "ObjectIdentity", "Unsigned32")
TextualConvention, DisplayString = mibBuilder.importSymbols("SNMPv2-TC", "TextualConvention", "DisplayString")
class DLCI(Integer32):
    subtypeSpec = Integer32.subtypeSpec + ValueRangeConstraint(1, 1022)
frpSapTable = MibTable((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1), )
if mibBuilder.loadTexts: frpSapTable.setStatus('mandatory')
frpSapEntry = MibTableRow((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1), ).setIndexNames((0, "CXFrameRelay-MIB", "frpSapNumber"))
if mibBuilder.loadTexts: frpSapEntry.setStatus('mandatory')
frpSapNumber = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 1), SapIndex()).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpSapNumber.setStatus('mandatory')
frpSapRowStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 2), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("invalid", 1), ("valid", 2)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: frpSapRowStatus.setStatus('mandatory')
frpSapAlias = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 3), Alias()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: frpSapAlias.setStatus('mandatory')
frpSapCompanionAlias = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 4), Alias()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: frpSapCompanionAlias.setStatus('mandatory')
frpSapType = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 5), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("lower", 1), ("upper", 2)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: frpSapType.setStatus('mandatory')
frpSapAddressLength = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 6), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(2, 3, 4))).clone(namedValues=NamedValues(("two-octets", 2), ("three-octets", 3), ("four-octets", 4))).clone('two-octets')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: frpSapAddressLength.setStatus('mandatory')
frpSapMaxSupportedVCs = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 7), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 1022))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: frpSapMaxSupportedVCs.setStatus('deprecated')
frpSapVCBase = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 8), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 1022))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: frpSapVCBase.setStatus('deprecated')
frpSapOutCongestionManagement = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 9), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("disabled", 1), ("enabled", 2))).clone('disabled')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: frpSapOutCongestionManagement.setStatus('mandatory')
frpSapResourceAllocation = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 10), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 99)).clone(80)).setMaxAccess("readwrite")
if mibBuilder.loadTexts: frpSapResourceAllocation.setStatus('mandatory')
frpSapLinkManagement = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 11), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4, 5, 6, 7))).clone(namedValues=NamedValues(("none", 1), ("frameRelayForum", 2), ("ansiAnnexD", 3), ("ccittAnnexA", 4), ("dama1", 5), ("dama2", 6), ("auto", 7))).clone('none')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: frpSapLinkManagement.setStatus('mandatory')
frpSapInterfaceType = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 12), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3))).clone(namedValues=NamedValues(("uniUser", 1), ("uniNetwork", 2), ("nni", 3))).clone('uniUser')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: frpSapInterfaceType.setStatus('mandatory')
frpSapPollingInterval = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 13), Integer32().subtype(subtypeSpec=ValueRangeConstraint(5, 30)).clone(10)).setMaxAccess("readwrite")
if mibBuilder.loadTexts: frpSapPollingInterval.setStatus('mandatory')
frpSapPollingVerification = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 14), Integer32().subtype(subtypeSpec=ValueRangeConstraint(5, 30)).clone(15)).setMaxAccess("readwrite")
if mibBuilder.loadTexts: frpSapPollingVerification.setStatus('mandatory')
frpSapFullEnquiryInterval = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 15), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 255)).clone(6)).setMaxAccess("readwrite")
if mibBuilder.loadTexts: frpSapFullEnquiryInterval.setStatus('mandatory')
frpSapErrorThreshold = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 16), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 10)).clone(3)).setMaxAccess("readwrite")
if mibBuilder.loadTexts: frpSapErrorThreshold.setStatus('mandatory')
frpSapMonitoredEvents = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 17), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 10)).clone(4)).setMaxAccess("readwrite")
if mibBuilder.loadTexts: frpSapMonitoredEvents.setStatus('mandatory')
frpSapMode = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 18), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3))).clone(namedValues=NamedValues(("frameRelay", 1), ("transparent", 2), ("frameRelayAtmNIwf", 3))).clone('frameRelay')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: frpSapMode.setStatus('mandatory')
frpSapPrioQueue1HitRatio = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 19), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: frpSapPrioQueue1HitRatio.setStatus('mandatory')
frpSapPrioQueue2HitRatio = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 20), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: frpSapPrioQueue2HitRatio.setStatus('mandatory')
frpSapPrioQueue3HitRatio = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 21), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: frpSapPrioQueue3HitRatio.setStatus('mandatory')
frpSapPrioQueue4HitRatio = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 22), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: frpSapPrioQueue4HitRatio.setStatus('mandatory')
frpSapDialEntry = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 23), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 255))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: frpSapDialEntry.setStatus('mandatory')
frpSapFilterBitMap = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 24), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: frpSapFilterBitMap.setStatus('mandatory')
frpSapLmiFlavor = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 25), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3))).clone(namedValues=NamedValues(("other", 1), ("strict", 2), ("tolerant", 3))).clone('tolerant')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: frpSapLmiFlavor.setStatus('mandatory')
frpSapGenerator = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 33), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3))).clone(namedValues=NamedValues(("disabled", 1), ("enabled", 2), ("retrigger", 3))).clone('disabled')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: frpSapGenerator.setStatus('mandatory')
frpSapGeneratorDlciNumber = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 34), DLCI().clone(16)).setMaxAccess("readwrite")
if mibBuilder.loadTexts: frpSapGeneratorDlciNumber.setStatus('mandatory')
frpSapGeneratorFrameSize = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 35), Integer32().subtype(subtypeSpec=ValueRangeConstraint(32, 4096)).clone(32)).setMaxAccess("readwrite")
if mibBuilder.loadTexts: frpSapGeneratorFrameSize.setStatus('mandatory')
frpSapGeneratorNumberOfFrames = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 36), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 200)).clone(1)).setMaxAccess("readwrite")
if mibBuilder.loadTexts: frpSapGeneratorNumberOfFrames.setStatus('mandatory')
frpSapGeneratorInterFrameDelay = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 37), Integer32().subtype(subtypeSpec=ValueRangeConstraint(50, 60000)).clone(50)).setMaxAccess("readwrite")
if mibBuilder.loadTexts: frpSapGeneratorInterFrameDelay.setStatus('mandatory')
frpSapBillingTimer = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 38), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 525600)).clone(1440)).setMaxAccess("readwrite")
if mibBuilder.loadTexts: frpSapBillingTimer.setStatus('mandatory')
frpSapSdLmMessageInterval = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 39), Integer32().subtype(subtypeSpec=ValueRangeConstraint(10, 65535)).clone(50)).setMaxAccess("readwrite")
if mibBuilder.loadTexts: frpSapSdLmMessageInterval.setStatus('obsolete')
frpSapSdLmActiveTimer = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 40), Integer32().subtype(subtypeSpec=ValueRangeConstraint(10, 65535)).clone(1000)).setMaxAccess("readwrite")
if mibBuilder.loadTexts: frpSapSdLmActiveTimer.setStatus('obsolete')
frpSaptrapTrap1 = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 48), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("disabled", 1), ("enabled", 2))).clone('disabled')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: frpSaptrapTrap1.setStatus('mandatory')
frpSapControl = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 53), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1))).clone(namedValues=NamedValues(("retriggerBillingTimer", 1)))).setMaxAccess("writeonly")
if mibBuilder.loadTexts: frpSapControl.setStatus('mandatory')
frpSapControlStats = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 54), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("clearSapStats", 1), ("clearAllCircuitStats", 2)))).setMaxAccess("writeonly")
if mibBuilder.loadTexts: frpSapControlStats.setStatus('mandatory')
frpSapstatLinkManagementState = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 55), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("linkDown", 1), ("linkUp", 2)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpSapstatLinkManagementState.setStatus('mandatory')
frpSapstatCurrentLinkManagementType = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 56), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4, 5, 6, 7))).clone(namedValues=NamedValues(("none", 1), ("frameRelayForum", 2), ("ansiAnnexD", 3), ("ccittAnnexA", 4), ("dama1", 5), ("dama2", 6), ("discovering", 7)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpSapstatCurrentLinkManagementType.setStatus('mandatory')
frpSapstatTxDataFrames = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 61), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpSapstatTxDataFrames.setStatus('mandatory')
frpSapstatRxDataFrames = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 62), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpSapstatRxDataFrames.setStatus('mandatory')
frpSapstatTxDataOctets = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 63), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpSapstatTxDataOctets.setStatus('mandatory')
frpSapstatRxDataOctets = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 64), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpSapstatRxDataOctets.setStatus('mandatory')
frpSapstatTxLmiFrames = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 65), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpSapstatTxLmiFrames.setStatus('mandatory')
frpSapstatRxLmiFrames = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 66), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpSapstatRxLmiFrames.setStatus('mandatory')
frpSapstatTxQueuedDiscards = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 67), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpSapstatTxQueuedDiscards.setStatus('mandatory')
frpSapstatRxCIRExceededDiscards = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 79), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpSapstatRxCIRExceededDiscards.setStatus('mandatory')
frpSapstatRxSysCongestionDiscards = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 80), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpSapstatRxSysCongestionDiscards.setStatus('mandatory')
frpSapstatRxUnavailInboundDiscards = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 81), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpSapstatRxUnavailInboundDiscards.setStatus('mandatory')
frpSapstatRxUnavailOutboundDiscards = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 82), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpSapstatRxUnavailOutboundDiscards.setStatus('mandatory')
frpSapstatRxInvalidVCDiscards = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 83), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpSapstatRxInvalidVCDiscards.setStatus('mandatory')
frpSapstatRxBadStatusDiscards = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 84), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpSapstatRxBadStatusDiscards.setStatus('mandatory')
frpSapstatRxMiscellaneousDiscards = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 85), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpSapstatRxMiscellaneousDiscards.setStatus('mandatory')
frpSapstatRxCIRExceeds = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 86), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpSapstatRxCIRExceeds.setStatus('mandatory')
frpSapstatRxShortFrameDiscards = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 87), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpSapstatRxShortFrameDiscards.setStatus('mandatory')
frpSapstatLmiInvalidFieldDiscards = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 97), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpSapstatLmiInvalidFieldDiscards.setStatus('mandatory')
frpSapstatLmiInvalidSequenceDiscards = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 98), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpSapstatLmiInvalidSequenceDiscards.setStatus('mandatory')
frpSapstatLmiTimeouts = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 99), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpSapstatLmiTimeouts.setStatus('mandatory')
frpSapstatLmiInvalidStatusDiscards = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 100), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpSapstatLmiInvalidStatusDiscards.setStatus('mandatory')
frpSapstatLmiInvalidStatusEnqDiscards = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 101), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpSapstatLmiInvalidStatusEnqDiscards.setStatus('mandatory')
frpSapstatLmiInvalidUpdStatusDiscards = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 1, 1, 102), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpSapstatLmiInvalidUpdStatusDiscards.setStatus('mandatory')
frpCircuitTable = MibTable((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 2), )
if mibBuilder.loadTexts: frpCircuitTable.setStatus('mandatory')
frpCircuitEntry = MibTableRow((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 2, 1), ).setIndexNames((0, "CXFrameRelay-MIB", "frpCircuitSapNumber"), (0, "CXFrameRelay-MIB", "frpCircuitDlci"))
if mibBuilder.loadTexts: frpCircuitEntry.setStatus('mandatory')
frpCircuitSapNumber = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 2, 1, 1), SapIndex()).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpCircuitSapNumber.setStatus('mandatory')
frpCircuitDlci = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 2, 1, 2), DLCI()).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpCircuitDlci.setStatus('mandatory')
frpCircuitRowStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 2, 1, 3), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("invalid", 1), ("valid", 2)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: frpCircuitRowStatus.setStatus('mandatory')
frpCircuitPriorityLevel = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 2, 1, 4), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4))).clone(namedValues=NamedValues(("veryHigh", 1), ("high", 2), ("medium", 3), ("low", 4))).clone('medium')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: frpCircuitPriorityLevel.setStatus('mandatory')
frpCircuitCommittedBurst = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 2, 1, 5), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: frpCircuitCommittedBurst.setStatus('mandatory')
frpCircuitExcessBurst = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 2, 1, 6), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: frpCircuitExcessBurst.setStatus('mandatory')
frpCircuitCommittedInformationRate = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 2, 1, 7), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: frpCircuitCommittedInformationRate.setStatus('mandatory')
frpCircuitCIRManagement = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 2, 1, 8), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4))).clone(namedValues=NamedValues(("disabled", 1), ("enabled-inbound", 2), ("monitor-inbound", 3), ("enabled-outbound", 4))).clone('disabled')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: frpCircuitCIRManagement.setStatus('mandatory')
frpCircuitMultiProtEncaps = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 2, 1, 9), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("disabled", 1), ("enabled", 2))).clone('disabled')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: frpCircuitMultiProtEncaps.setStatus('mandatory')
frpCircuitHighPriorityBurst = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 2, 1, 10), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: frpCircuitHighPriorityBurst.setStatus('mandatory')
frpCircuitLowPriorityBurst = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 2, 1, 11), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: frpCircuitLowPriorityBurst.setStatus('mandatory')
frpCircuitFragmentationSize = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 2, 1, 18), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: frpCircuitFragmentationSize.setStatus('mandatory')
frpCircuitAlias = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 2, 1, 19), Alias()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: frpCircuitAlias.setStatus('mandatory')
frpCircuitCompanionSapNumber = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 2, 1, 20), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: frpCircuitCompanionSapNumber.setStatus('mandatory')
frpCircuitCompanionDlci = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 2, 1, 21), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: frpCircuitCompanionDlci.setStatus('mandatory')
frpCircuitAlternateSapNumber = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 2, 1, 22), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: frpCircuitAlternateSapNumber.setStatus('mandatory')
frpCircuitAlternateDlci = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 2, 1, 23), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: frpCircuitAlternateDlci.setStatus('mandatory')
frpCircuitMulticastGroupId = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 2, 1, 24), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 255))).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpCircuitMulticastGroupId.setStatus('mandatory')
frpCircuitMulticastType = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 2, 1, 25), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4, 5, 6, 7, 8))).clone(namedValues=NamedValues(("noMulticastAssociation", 1), ("rootOneWay", 2), ("leafOneWay", 3), ("rootTwoWay", 4), ("leafTwoWay", 5), ("rootNWay", 6), ("rootTwoWaySinglePass", 7), ("leafTwoWaySinglePass", 8))).clone('noMulticastAssociation')).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpCircuitMulticastType.setStatus('mandatory')
frpCircuitCompressionPort = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 2, 1, 26), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: frpCircuitCompressionPort.setStatus('mandatory')
frpCircuitExpressService = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 2, 1, 27), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("disabled", 1), ("enabled", 2))).clone('enabled')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: frpCircuitExpressService.setStatus('mandatory')
frpCircuittrapTrap1 = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 2, 1, 32), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("disabled", 1), ("enabled", 2))).clone('disabled')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: frpCircuittrapTrap1.setStatus('mandatory')
frpCircuittrapTrap2 = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 2, 1, 33), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("disabled", 1), ("enabled", 2))).clone('disabled')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: frpCircuittrapTrap2.setStatus('mandatory')
frpCircuitControlStats = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 2, 1, 39), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1))).clone(namedValues=NamedValues(("clearCircuitStats", 1)))).setMaxAccess("writeonly")
if mibBuilder.loadTexts: frpCircuitControlStats.setStatus('mandatory')
frpCircuitstatReportedState = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 2, 1, 40), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3))).clone(namedValues=NamedValues(("notReported", 1), ("reportedActive", 2), ("reportedInactive", 3))).clone('notReported')).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpCircuitstatReportedState.setStatus('mandatory')
frpCircuitstatRouteState = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 2, 1, 41), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3))).clone(namedValues=NamedValues(("noRoute", 1), ("routeNotOperational", 2), ("routeOperational", 3))).clone('noRoute')).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpCircuitstatRouteState.setStatus('mandatory')
frpCircuitstatAlternateRouteState = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 2, 1, 42), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4))).clone(namedValues=NamedValues(("noRoute", 1), ("routeNotOperational", 2), ("routeOperational", 3), ("alternateCircuit", 4))).clone('noRoute')).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpCircuitstatAlternateRouteState.setStatus('mandatory')
frpCircuitstatLocalCreationTime = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 2, 1, 47), TimeTicks()).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpCircuitstatLocalCreationTime.setStatus('mandatory')
frpCircuitstatRemoteCreationTime = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 2, 1, 48), TimeTicks()).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpCircuitstatRemoteCreationTime.setStatus('mandatory')
frpCircuitstatTxFrames = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 2, 1, 49), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpCircuitstatTxFrames.setStatus('mandatory')
frpCircuitstatRxFrames = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 2, 1, 50), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpCircuitstatRxFrames.setStatus('mandatory')
frpCircuitstatTxOctets = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 2, 1, 51), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpCircuitstatTxOctets.setStatus('mandatory')
frpCircuitstatRxOctets = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 2, 1, 52), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpCircuitstatRxOctets.setStatus('mandatory')
frpCircuitstatTxFECNs = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 2, 1, 53), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpCircuitstatTxFECNs.setStatus('mandatory')
frpCircuitstatRxFECNs = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 2, 1, 54), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpCircuitstatRxFECNs.setStatus('mandatory')
frpCircuitstatTxBECNs = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 2, 1, 55), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpCircuitstatTxBECNs.setStatus('mandatory')
frpCircuitstatRxBECNs = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 2, 1, 56), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpCircuitstatRxBECNs.setStatus('mandatory')
frpCircuitstatTxQueuedDiscards = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 2, 1, 63), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpCircuitstatTxQueuedDiscards.setStatus('mandatory')
frpCircuitstatRxCIRExceededDiscards = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 2, 1, 70), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpCircuitstatRxCIRExceededDiscards.setStatus('mandatory')
frpCircuitstatRxSysCongestionDiscards = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 2, 1, 71), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpCircuitstatRxSysCongestionDiscards.setStatus('mandatory')
frpCircuitstatRxUnavailInboundDiscards = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 2, 1, 72), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpCircuitstatRxUnavailInboundDiscards.setStatus('mandatory')
frpCircuitstatRxUnavailOutboundDiscards = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 2, 1, 73), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpCircuitstatRxUnavailOutboundDiscards.setStatus('mandatory')
frpCircuitstatRxCIRExceeds = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 2, 1, 74), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpCircuitstatRxCIRExceeds.setStatus('mandatory')
frpCircuitstatFragmentationFailures = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 2, 1, 75), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpCircuitstatFragmentationFailures.setStatus('mandatory')
frpCircuitstatDeFragmentationFailures = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 2, 1, 76), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpCircuitstatDeFragmentationFailures.setStatus('mandatory')
frpReportedPvcTable = MibTable((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 3), )
if mibBuilder.loadTexts: frpReportedPvcTable.setStatus('mandatory')
frpReportedPvcEntry = MibTableRow((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 3, 1), ).setIndexNames((0, "CXFrameRelay-MIB", "frpReportedPvcSapNumber"))
if mibBuilder.loadTexts: frpReportedPvcEntry.setStatus('mandatory')
frpReportedPvcSapNumber = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 3, 1, 1), SapIndex()).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpReportedPvcSapNumber.setStatus('mandatory')
frpReportedPvcDlci = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 3, 1, 2), DLCI()).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpReportedPvcDlci.setStatus('mandatory')
frpReportedPvcLocallyConfigured = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 3, 1, 3), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("no", 1), ("yes", 2)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpReportedPvcLocallyConfigured.setStatus('mandatory')
frpReportedPvcStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 3, 1, 4), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("inactive", 1), ("active", 2)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpReportedPvcStatus.setStatus('mandatory')
frpMulticastTable = MibTable((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 4), )
if mibBuilder.loadTexts: frpMulticastTable.setStatus('mandatory')
frpMulticastEntry = MibTableRow((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 4, 1), ).setIndexNames((0, "CXFrameRelay-MIB", "frpMulticastGroupId"), (0, "CXFrameRelay-MIB", "frpMulticastSapNumber"), (0, "CXFrameRelay-MIB", "frpMulticastDlci"))
if mibBuilder.loadTexts: frpMulticastEntry.setStatus('mandatory')
frpMulticastGroupId = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 4, 1, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 255))).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpMulticastGroupId.setStatus('mandatory')
frpMulticastSapNumber = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 4, 1, 2), SapIndex()).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpMulticastSapNumber.setStatus('mandatory')
frpMulticastDlci = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 4, 1, 3), DLCI()).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpMulticastDlci.setStatus('mandatory')
frpMulticastRowStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 4, 1, 4), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("invalid", 1), ("valid", 2)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: frpMulticastRowStatus.setStatus('mandatory')
frpMulticastMemberType = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 4, 1, 5), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("root", 1), ("leaf", 2)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: frpMulticastMemberType.setStatus('mandatory')
frpMulticastServiceType = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 4, 1, 6), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4))).clone(namedValues=NamedValues(("oneWay", 1), ("twoWay", 2), ("nWay", 3), ("twoWaySinglePass", 4)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: frpMulticastServiceType.setStatus('mandatory')
frpMulticastMemberStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 4, 1, 7), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("inactive", 1), ("active", 2)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpMulticastMemberStatus.setStatus('mandatory')
frpMibLevel = MibScalar((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 5), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: frpMibLevel.setStatus('mandatory')
frpSapInterfaceStatusChange = NotificationType((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3) + (0,1)).setObjects(("CXModuleHardware-MIB", "cxModuleHwPhysSlot"), ("CXFrameRelay-MIB", "frpSapNumber"), ("CXFrameRelay-MIB", "frpSapstatLinkManagementState"))
frpPvcReportedStatusChange = NotificationType((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3) + (0,2)).setObjects(("CXModuleHardware-MIB", "cxModuleHwPhysSlot"), ("CXFrameRelay-MIB", "frpCircuitSapNumber"), ("CXFrameRelay-MIB", "frpCircuitDlci"), ("CXFrameRelay-MIB", "frpCircuitstatReportedState"))
frpPvcBillingStats = NotificationType((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3) + (0,3)).setObjects(("CXModuleHardware-MIB", "cxModuleHwPhysSlot"), ("CXFrameRelay-MIB", "frpCircuitSapNumber"), ("CXFrameRelay-MIB", "frpCircuitDlci"), ("CXFrameRelay-MIB", "frpCircuitstatTxFrames"), ("CXFrameRelay-MIB", "frpCircuitstatRxFrames"), ("CXFrameRelay-MIB", "frpCircuitstatTxOctets"), ("CXFrameRelay-MIB", "frpCircuitstatRxOctets"))
mibBuilder.exportSymbols("CXFrameRelay-MIB", frpCircuitstatRxCIRExceededDiscards=frpCircuitstatRxCIRExceededDiscards, frpSapGenerator=frpSapGenerator, frpSapstatLinkManagementState=frpSapstatLinkManagementState, frpPvcBillingStats=frpPvcBillingStats, frpSapInterfaceType=frpSapInterfaceType, frpMulticastTable=frpMulticastTable, frpCircuitstatRxBECNs=frpCircuitstatRxBECNs, frpSapstatRxUnavailOutboundDiscards=frpSapstatRxUnavailOutboundDiscards, frpSapstatLmiTimeouts=frpSapstatLmiTimeouts, frpSapControlStats=frpSapControlStats, frpCircuitstatRouteState=frpCircuitstatRouteState, frpCircuitstatRemoteCreationTime=frpCircuitstatRemoteCreationTime, frpSapSdLmMessageInterval=frpSapSdLmMessageInterval, frpCircuitstatDeFragmentationFailures=frpCircuitstatDeFragmentationFailures, frpSapPrioQueue1HitRatio=frpSapPrioQueue1HitRatio, frpSapTable=frpSapTable, frpSapstatRxDataOctets=frpSapstatRxDataOctets, frpCircuitLowPriorityBurst=frpCircuitLowPriorityBurst, frpMulticastDlci=frpMulticastDlci, frpCircuitCompanionSapNumber=frpCircuitCompanionSapNumber, frpSapGeneratorFrameSize=frpSapGeneratorFrameSize, frpReportedPvcEntry=frpReportedPvcEntry, frpSapControl=frpSapControl, frpSapGeneratorInterFrameDelay=frpSapGeneratorInterFrameDelay, frpMibLevel=frpMibLevel, frpSapGeneratorNumberOfFrames=frpSapGeneratorNumberOfFrames, frpMulticastRowStatus=frpMulticastRowStatus, frpSapstatLmiInvalidUpdStatusDiscards=frpSapstatLmiInvalidUpdStatusDiscards, frpMulticastGroupId=frpMulticastGroupId, frpSapPollingInterval=frpSapPollingInterval, frpSapstatRxLmiFrames=frpSapstatRxLmiFrames, frpSapNumber=frpSapNumber, frpSapFilterBitMap=frpSapFilterBitMap, frpCircuitFragmentationSize=frpCircuitFragmentationSize, frpCircuitstatAlternateRouteState=frpCircuitstatAlternateRouteState, frpSapMode=frpSapMode, frpSapLinkManagement=frpSapLinkManagement, frpCircuitstatRxOctets=frpCircuitstatRxOctets, frpCircuittrapTrap2=frpCircuittrapTrap2, frpSapstatRxMiscellaneousDiscards=frpSapstatRxMiscellaneousDiscards, 
frpSapRowStatus=frpSapRowStatus, frpCircuitRowStatus=frpCircuitRowStatus, frpSapstatRxSysCongestionDiscards=frpSapstatRxSysCongestionDiscards, frpSapPrioQueue4HitRatio=frpSapPrioQueue4HitRatio, frpSapstatRxDataFrames=frpSapstatRxDataFrames, frpSapOutCongestionManagement=frpSapOutCongestionManagement, frpSapstatTxDataFrames=frpSapstatTxDataFrames, frpCircuitExpressService=frpCircuitExpressService, frpCircuitAlternateSapNumber=frpCircuitAlternateSapNumber, frpCircuitCompanionDlci=frpCircuitCompanionDlci, frpSapAddressLength=frpSapAddressLength, frpSapPollingVerification=frpSapPollingVerification, frpCircuitCommittedBurst=frpCircuitCommittedBurst, frpCircuitSapNumber=frpCircuitSapNumber, frpPvcReportedStatusChange=frpPvcReportedStatusChange, frpReportedPvcDlci=frpReportedPvcDlci, frpSapstatRxCIRExceededDiscards=frpSapstatRxCIRExceededDiscards, frpSapPrioQueue3HitRatio=frpSapPrioQueue3HitRatio, frpReportedPvcTable=frpReportedPvcTable, frpCircuitstatRxFrames=frpCircuitstatRxFrames, frpCircuitExcessBurst=frpCircuitExcessBurst, frpCircuitstatFragmentationFailures=frpCircuitstatFragmentationFailures, frpCircuitstatLocalCreationTime=frpCircuitstatLocalCreationTime, frpSapstatLmiInvalidStatusEnqDiscards=frpSapstatLmiInvalidStatusEnqDiscards, frpSapstatTxLmiFrames=frpSapstatTxLmiFrames, frpReportedPvcStatus=frpReportedPvcStatus, frpSapGeneratorDlciNumber=frpSapGeneratorDlciNumber, frpMulticastServiceType=frpMulticastServiceType, frpSapMonitoredEvents=frpSapMonitoredEvents, frpReportedPvcSapNumber=frpReportedPvcSapNumber, frpSaptrapTrap1=frpSaptrapTrap1, frpSapstatLmiInvalidStatusDiscards=frpSapstatLmiInvalidStatusDiscards, frpCircuitCommittedInformationRate=frpCircuitCommittedInformationRate, frpCircuittrapTrap1=frpCircuittrapTrap1, frpReportedPvcLocallyConfigured=frpReportedPvcLocallyConfigured, frpCircuitstatRxSysCongestionDiscards=frpCircuitstatRxSysCongestionDiscards, frpMulticastMemberType=frpMulticastMemberType, 
frpSapstatLmiInvalidFieldDiscards=frpSapstatLmiInvalidFieldDiscards, frpSapstatRxBadStatusDiscards=frpSapstatRxBadStatusDiscards, frpSapstatCurrentLinkManagementType=frpSapstatCurrentLinkManagementType, frpSapstatRxCIRExceeds=frpSapstatRxCIRExceeds, frpSapErrorThreshold=frpSapErrorThreshold, frpSapAlias=frpSapAlias, frpSapMaxSupportedVCs=frpSapMaxSupportedVCs, frpCircuitstatTxFECNs=frpCircuitstatTxFECNs, frpSapResourceAllocation=frpSapResourceAllocation, frpSapBillingTimer=frpSapBillingTimer, frpSapEntry=frpSapEntry, frpSapstatRxInvalidVCDiscards=frpSapstatRxInvalidVCDiscards, frpSapstatTxDataOctets=frpSapstatTxDataOctets, frpSapVCBase=frpSapVCBase, frpCircuitAlias=frpCircuitAlias, frpCircuitstatRxFECNs=frpCircuitstatRxFECNs, frpSapstatLmiInvalidSequenceDiscards=frpSapstatLmiInvalidSequenceDiscards, frpCircuitstatReportedState=frpCircuitstatReportedState, frpMulticastEntry=frpMulticastEntry, DLCI=DLCI, frpMulticastSapNumber=frpMulticastSapNumber, frpCircuitstatRxUnavailOutboundDiscards=frpCircuitstatRxUnavailOutboundDiscards, frpMulticastMemberStatus=frpMulticastMemberStatus, frpCircuitMulticastType=frpCircuitMulticastType, frpCircuitMultiProtEncaps=frpCircuitMultiProtEncaps, frpSapPrioQueue2HitRatio=frpSapPrioQueue2HitRatio, frpCircuitTable=frpCircuitTable, frpCircuitCIRManagement=frpCircuitCIRManagement, frpSapstatTxQueuedDiscards=frpSapstatTxQueuedDiscards, frpCircuitstatRxCIRExceeds=frpCircuitstatRxCIRExceeds, frpSapSdLmActiveTimer=frpSapSdLmActiveTimer, frpCircuitPriorityLevel=frpCircuitPriorityLevel, frpCircuitHighPriorityBurst=frpCircuitHighPriorityBurst, frpCircuitstatRxUnavailInboundDiscards=frpCircuitstatRxUnavailInboundDiscards, frpSapInterfaceStatusChange=frpSapInterfaceStatusChange, frpSapFullEnquiryInterval=frpSapFullEnquiryInterval, frpSapCompanionAlias=frpSapCompanionAlias, frpCircuitAlternateDlci=frpCircuitAlternateDlci, frpCircuitMulticastGroupId=frpCircuitMulticastGroupId, frpCircuitControlStats=frpCircuitControlStats, 
frpCircuitEntry=frpCircuitEntry, frpCircuitstatTxOctets=frpCircuitstatTxOctets, frpCircuitstatTxFrames=frpCircuitstatTxFrames, frpSapstatRxShortFrameDiscards=frpSapstatRxShortFrameDiscards, frpSapDialEntry=frpSapDialEntry, frpCircuitDlci=frpCircuitDlci, frpCircuitstatTxQueuedDiscards=frpCircuitstatTxQueuedDiscards, frpSapstatRxUnavailInboundDiscards=frpSapstatRxUnavailInboundDiscards, frpCircuitstatTxBECNs=frpCircuitstatTxBECNs, frpSapLmiFlavor=frpSapLmiFlavor, frpSapType=frpSapType, frpCircuitCompressionPort=frpCircuitCompressionPort)
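Every object exported above is registered under an OID tuple. When debugging a generated module like this it is handy to see the dotted form of those tuples; the helper below is a standalone formatting convenience, not part of the generated MIB or of pysnmp:

```python
def oid_to_str(oid):
    """Render an OID tuple such as (1, 3, 6, 1, ...) in dotted notation."""
    return '.'.join(str(arc) for arc in oid)

# The column frpCircuitExpressService defined above lives at:
print(oid_to_str((1, 3, 6, 1, 4, 1, 495, 2, 1, 6, 3, 2, 1, 27)))  # 1.3.6.1.4.1.495.2.1.6.3.2.1.27
```

For resolving symbolic names against loaded modules, pysnmp's `MibViewController` is the proper tool; this helper only formats a tuple.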

# === solvebio/version.py (PolinaBevad/solvebio-python, MIT license) ===
# Note that this file is multi-lingual and can be used in both Python
# and POSIX shell.
# This file should define a variable VERSION which we use as the
# debugger version number.
VERSION='2.9.0'  # no spaces around '=' so the line also parses as POSIX shell
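Because the assignment parses in both languages, packaging scripts can pick the version out of this file without importing the package. A sketch of that pattern (the file contents are inlined as a string here so the example runs standalone):

```python
import re

version_py = "VERSION = '2.9.0'\n"  # literal contents of solvebio/version.py
match = re.search(r"VERSION\s*=\s*'([^']*)'", version_py)
print(match.group(1))  # 2.9.0
```

From POSIX shell the same file can simply be sourced, e.g. `. solvebio/version.py && echo "$VERSION"`, provided the assignment has no spaces around `=`.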

# === TweetPoster/reddit.py (joealcorn/TweetPoster, MIT license) ===
import time
from socket import timeout
from requests.exceptions import RequestException
from TweetPoster import User, Database
from TweetPoster.signals import pre_request
db = Database()
class Redditor(User):
authenticated = False
last_request = None
def __init__(self, bypass_ratelimit=False, *a, **kw):
super(Redditor, self).__init__(*a, **kw)
if not bypass_ratelimit:
pre_request.connect(self._ratelimit, sender=self)
def login(self, username, password):
"""
Logs a user in, stores modhash in Redditor.modhash
"""
login_url = 'https://ssl.reddit.com/api/login'
params = {
'passwd': password,
'rem': False,
'user': username,
'api_type': 'json',
}
        print('Logging in...')
r = self.post(login_url, params)
if 'data' not in r.json()['json']:
raise Exception('login failed')
self.modhash = r.json()['json']['data']['modhash']
self.authenticated = True
return self
def comment(self, thing_id, comment):
"""
Replies to :thing_id: with :comment:
"""
url = 'http://www.reddit.com/api/comment'
params = {
'uh': self.modhash,
'thing_id': thing_id,
'comment': comment,
'api_type': 'json',
}
        print('Commenting on ' + thing_id)
return self.post(url, params)
def get_new_posts(self, db=db):
"""
Returns a list of posts that haven't already
been processed
"""
        print('Fetching new posts...')
url = 'http://www.reddit.com/domain/twitter.com/new.json'
try:
r = self.get(url, params=dict(limit=100))
assert r.status_code == 200
all_posts = r.json()['data']['children']
except (RequestException, ValueError, AssertionError, timeout):
return []
posts = [
Submission(p) for p in all_posts
if not db.has_processed(p['data']['name'])
]
return posts
def _ratelimit(self, sender):
"""
Helps us abide by reddit's API usage limitations.
https://github.com/reddit/reddit/wiki/API#rules
"""
if self.last_request is not None:
diff = time.time() - self.last_request
if diff < 2:
time.sleep(2 - diff)
self.last_request = time.time()
class Submission(object):
def __init__(self, json):
self.title = json['data']['title']
self.url = json['data']['url']
self.id = json['data']['id']
self.fullname = json['data']['name']
def mark_as_processed(self, db=db):
db.mark_as_processed(self.fullname)
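The `_ratelimit` hook above enforces reddit's one-request-per-two-seconds guideline by sleeping away the remainder of the interval. The same logic, isolated into a standalone sketch with injectable clock and sleep functions (the names below are hypothetical, chosen for illustration and testability):

```python
import time

class RateLimiter:
    """Standalone sketch of the 2-second spacing used by Redditor._ratelimit."""

    def __init__(self, min_interval=2.0):
        self.min_interval = min_interval
        self.last_request = None

    def wait(self, sleep=time.sleep, now=time.time):
        # Sleep just long enough that consecutive calls are at least
        # `min_interval` seconds apart, then record the request time.
        if self.last_request is not None:
            elapsed = now() - self.last_request
            if elapsed < self.min_interval:
                sleep(self.min_interval - elapsed)
        self.last_request = now()

# Demonstration with an injected fake clock (no real waiting):
clock = [100.0]
slept = []
limiter = RateLimiter()
limiter.wait(sleep=slept.append, now=lambda: clock[0])  # first call: no sleep
clock[0] = 100.5
limiter.wait(sleep=slept.append, now=lambda: clock[0])
print(slept)  # [1.5]
```

Injecting `now` and `sleep` keeps the timing logic testable without actually waiting, which the signal-based hook in the class above does not allow.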

# === plot/event/concat.py (cxxixi/Online-opinions-on-weibo, MIT license) ===
import pandas as pd
event_dir = 'D:/my_documents/competition/government/Report/event'

# concatenate the six per-event CSVs into a single comment file
csv_parts = [pd.read_csv(f'{event_dir}/{i}.csv', header=None) for i in range(1, 7)]
comment_data = pd.concat(csv_parts, axis=0)
comment_data.to_csv(f'{event_dir}/comment_data.csv', header=None)
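If the directory grows beyond six numbered files, the input paths can be built programmatically instead of enumerated by hand. A small stdlib sketch (the helper name is hypothetical; the directory is the one hard-coded above):

```python
from pathlib import Path

def numbered_csvs(directory, count=6):
    """Build the paths 1.csv .. count.csv under `directory`."""
    d = Path(directory)
    return [d / f'{i}.csv' for i in range(1, count + 1)]

paths = numbered_csvs('D:/my_documents/competition/government/Report/event')
print([p.name for p in paths])  # ['1.csv', '2.csv', '3.csv', '4.csv', '5.csv', '6.csv']
```

`pathlib` also sidesteps the backslash-escaping pitfalls of hand-written Windows path strings.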

# === app/crud/session.py (coolexplorer/py-session, MIT license) ===
from aioredis import Redis
class SessionCrud():
def __init__(self, redis: Redis) -> None:
self.redis = redis
    async def set_dict(self, key: str, data: dict):
        # NOTE: hmset is deprecated in recent Redis clients;
        # hset(key, mapping=data) is the preferred replacement when upgrading.
        return await self.redis.hmset(key, data)
async def get_len(self, key: str):
return await self.redis.hlen(key)
async def get_all(self, key: str):
return await self.redis.hgetall(key)
async def touch(self, key: str):
        return await self.redis.touch(key)
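A minimal usage sketch of the class above. The real code takes an `aioredis.Redis` instance; here a hypothetical in-memory stub stands in for it, and `SessionCrud` is repeated so the block runs on its own:

```python
import asyncio

class FakeRedis:
    """In-memory stand-in exposing the hash commands used above (illustrative)."""
    def __init__(self):
        self._store = {}

    async def hmset(self, key, data):
        self._store.setdefault(key, {}).update(data)
        return True

    async def hgetall(self, key):
        return dict(self._store.get(key, {}))

    async def hlen(self, key):
        return len(self._store.get(key, {}))

class SessionCrud:
    """Repeated from the module above so the sketch is self-contained."""
    def __init__(self, redis):
        self.redis = redis

    async def set_dict(self, key, data):
        return await self.redis.hmset(key, data)

    async def get_all(self, key):
        return await self.redis.hgetall(key)

async def demo():
    crud = SessionCrud(FakeRedis())
    await crud.set_dict('session:42', {'user': 'alice', 'role': 'admin'})
    return await crud.get_all('session:42')

print(asyncio.run(demo()))  # {'user': 'alice', 'role': 'admin'}
```

Swapping in a stub this way is also how the wrapper can be unit-tested without a running Redis server.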

# === firmware/serializers.py (fitahol/fitahol, MIT license) ===
#!/usr/bin/env python
# coding=utf-8
"""
__created__ = '23/10/2016'
__author__ = 'deling.ma'
"""
from rest_framework import serializers
from .models import ClientVersion, Feedback
class ClientVersionSerializer(serializers.ModelSerializer):
class Meta:
model = ClientVersion
        fields = ("id", "url")
class FeedbackSerializer(serializers.ModelSerializer):
class Meta:
model = Feedback
fields = ("nickname", "contact", "content")
7de459f4fedc920007feabd18a04581069c66b93 | 4,927 | py | Python | python_src/__init__.py | LovelyA72/ScoreDraft | dd344a49a5eec2670110cc43d672936cd1c27844 | [
"MIT"
] | 1 | 2020-03-26T15:48:49.000Z | 2020-03-26T15:48:49.000Z | python_test/ScoreDraft/__init__.py | LovelyA72/ScoreDraft | dd344a49a5eec2670110cc43d672936cd1c27844 | [
"MIT"
] | null | null | null | python_test/ScoreDraft/__init__.py | LovelyA72/ScoreDraft | dd344a49a5eec2670110cc43d672936cd1c27844 | [
"MIT"
] | null | null | null | import os
from . import PyScoreDraft
ScoreDraftPath_old= os.path.dirname(__file__)
ScoreDraftPath=""
#\\escaping fix
for ch in ScoreDraftPath_old:
if ch=="\\":
ScoreDraftPath+="/"
else:
ScoreDraftPath+=ch
if os.name == 'nt':
os.environ["PATH"]+=";"+ScoreDraftPath
elif os.name == "posix":
os.environ["PATH"]+=":"+ScoreDraftPath
PyScoreDraft.ScanExtensions(ScoreDraftPath)
from .PyScoreDraft import TellDuration
'''
TellDuration(seq) takes in a single input "seq"
It can be a note-sequence, a beat-sequence, or a singing-sequence,
anything acceptable by Instrument.play(), Percussion.play(), Singer.sing()
as the "seq" parameter
The return value is the total duration of the sequence as an integer
'''
from .TrackBuffer import setDefaultNumberOfChannels
from .TrackBuffer import TrackBuffer
from .TrackBuffer import MixTrackBufferList
from .TrackBuffer import WriteTrackBufferToWav
from .TrackBuffer import ReadTrackBufferFromWav
try:
from .Extensions import WriteNoteSequencesToMidi
except ImportError:
pass
try:
from .Extensions import PlayTrackBuffer
except ImportError:
pass
try:
from .Extensions import PlayGetRemainingTime
except ImportError:
pass
try:
from .Extensions import QPlayTrackBuffer
except ImportError:
pass
try:
from .Extensions import QPlayGetRemainingTime
except ImportError:
pass
from .Catalog import Catalog
from .Catalog import PrintCatalog
from .Instrument import Instrument
from .Percussion import Percussion
from .Singer import Singer
from .Document import Document
try:
from .Meteor import Document as MeteorDocument
except ImportError:
pass
from .InternalInstruments import PureSin, Square, Triangle, Sawtooth, NaivePiano, BottleBlow
try:
from .PercussionSampler import PercussionSampler
PERC_SAMPLE_ROOT=ScoreDraftPath+'/PercussionSamples'
if os.path.isdir(PERC_SAMPLE_ROOT):
for item in os.listdir(PERC_SAMPLE_ROOT):
file_path = PERC_SAMPLE_ROOT+'/'+item
if os.path.isfile(file_path) and item.endswith(".wav"):
name = item[0:len(item)-4]
definition="""
def """+name+"""():
return PercussionSampler('"""+file_path+"""')
"""
exec(definition)
Catalog['Percussions'] += [name+' - PercussionSampler']
except ImportError:
pass
try:
from .InstrumentSampler import InstrumentSampler_Single
from .InstrumentSampler import InstrumentSampler_Multi
INSTR_SAMPLE_ROOT=ScoreDraftPath+'/InstrumentSamples'
if os.path.isdir(INSTR_SAMPLE_ROOT):
for item in os.listdir(INSTR_SAMPLE_ROOT):
inst_path = INSTR_SAMPLE_ROOT+'/'+item
if os.path.isfile(inst_path) and item.endswith(".wav"):
name = item[0:len(item)-4]
definition="""
def """+name+"""():
return InstrumentSampler_Single('"""+inst_path+"""')
"""
exec(definition)
Catalog['Instruments'] += [name+' - InstrumentSampler_Single']
elif os.path.isdir(inst_path):
name = item
definition="""
def """+item+"""():
return InstrumentSampler_Multi('"""+inst_path+"""')
"""
exec(definition)
Catalog['Instruments'] += [name+' - InstrumentSampler_Multi']
except ImportError:
pass
try:
from .KeLa import KeLa
KELA_SAMPLE_ROOT=ScoreDraftPath+'/KeLaSamples'
if os.path.isdir(KELA_SAMPLE_ROOT):
for item in os.listdir(KELA_SAMPLE_ROOT):
kela_path = KELA_SAMPLE_ROOT+'/'+item
if os.path.isdir(kela_path):
definition="""
def """+item+"""():
return KeLa('"""+kela_path+"""')
"""
exec(definition)
Catalog['Singers'] += [item+' - KeLa']
except ImportError:
pass
try:
from .UtauDraft import UtauDraft
from .CVVCChineseConverter import CVVCChineseConverter
from .XiaYYConverter import XiaYYConverter
from .JPVCVConverter import JPVCVConverter
from .TsuroVCVConverter import TsuroVCVConverter
from .TTEnglishConverter import TTEnglishConverter
from .VCCVEnglishConverter import VCCVEnglishConverter
UTAU_VB_ROOT=ScoreDraftPath+'/UTAUVoice'
UTAU_VB_SUFFIX='_UTAU'
if os.path.isdir(UTAU_VB_ROOT):
for item in os.listdir(UTAU_VB_ROOT):
utau_path = UTAU_VB_ROOT+'/'+item
if os.path.isdir(utau_path):
definition="""
def """+item+UTAU_VB_SUFFIX+"""(useCuda=True):
return UtauDraft('"""+utau_path+"""',useCuda)
"""
exec(definition)
Catalog['Singers'] += [item+UTAU_VB_SUFFIX+' - UtauDraft']
except ImportError:
pass
try:
from .SF2Instrument import ListPresets as ListPresetsSF2
from .SF2Instrument import SF2Instrument
SF2_ROOT=ScoreDraftPath+'/SF2'
if os.path.isdir(SF2_ROOT):
for item in os.listdir(SF2_ROOT):
sf2_path = SF2_ROOT+'/'+item
if os.path.isfile(sf2_path) and item.endswith(".sf2"):
name = item[0:len(item)-4]
definition="""
def """+name+"""(preset_index):
return SF2Instrument('"""+sf2_path+"""', preset_index)
def """+name+"""_List():
ListPresetsSF2('"""+sf2_path+"""')
"""
exec(definition)
Catalog['Instruments'] += [name+' - SF2Instrument']
except ImportError:
pass
try:
from .KarplusStrongInstrument import KarplusStrongInstrument
except ImportError:
pass
| 26.068783 | 92 | 0.744266 | 581 | 4,927 | 6.180723 | 0.225473 | 0.02005 | 0.070175 | 0.06015 | 0.27736 | 0.208855 | 0.16597 | 0.077973 | 0.043999 | 0.034531 | 0 | 0.005364 | 0.129694 | 4,927 | 188 | 93 | 26.207447 | 0.83209 | 0.002841 | 0 | 0.401316 | 0 | 0 | 0.123756 | 0.021203 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.078947 | 0.309211 | 0 | 0.348684 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 2 |
7de4f8c5d117b4c130f144233897b11a1c074d09 | 1,988 | py | Python | src/deepcv/pipeline.py | PaulEmmanuelSotir/DeepCV | 4c0ed68d47dceb713d7f34ca258dad957bcd3611 | [
"MIT"
] | 6 | 2020-04-05T13:12:02.000Z | 2022-03-13T06:27:55.000Z | src/deepcv/pipeline.py | PaulEmmanuelSotir/DeepCV | 4c0ed68d47dceb713d7f34ca258dad957bcd3611 | [
"MIT"
] | null | null | null | src/deepcv/pipeline.py | PaulEmmanuelSotir/DeepCV | 4c0ed68d47dceb713d7f34ca258dad957bcd3611 | [
"MIT"
] | 2 | 2020-04-19T21:10:26.000Z | 2021-04-23T22:19:32.000Z | #!/usr/bin/env python
# -*- coding: utf-8 -*-
"""Construction of the master pipeline.
"""
import logging
import operator
from pathlib import Path
from functools import reduce
from typing import Dict, Union, Any, Optional, Callable
from kedro.pipeline import Pipeline, node
import kedro.pipeline.decorators as dec
import deepcv.utils
import deepcv.meta
import deepcv.keypoints
import deepcv.detection
import deepcv.classification
__all__ = ['create_pipelines', 'DECORATORS', 'GET_PIPELINE_FN_NAME', 'SUBPACKAGES_WITH_PIPELINES']
__author__ = 'Paul-Emmanuel Sotir'
DECORATORS = [dec.log_time] # Other decorator available: memory_profiler? ,retry, spark_to_pandas, pandas_to_spark
GET_PIPELINE_FN_NAME = 'get_pipelines'
SUBPACKAGES_WITH_PIPELINES = [deepcv.classification, deepcv.keypoints, deepcv.detection]
def create_pipelines(**kwargs) -> Dict[str, Pipeline]:
    """Create the project's pipelines.

    Args:
        kwargs: Ignores any additional arguments added in the future.

    Returns:
        A mapping from a pipeline name to a ``Pipeline`` object.

    NOTE: For MLflow experiment/run tracking support, pipeline(s) (or at least one node of the pipeline(s)) which involve training should have a 'train' tag (project hooks defined in `deepcv.run` create/end an MLflow run for each `train` pipeline).
    """
    pipeline_map = {}
    for subpackage in SUBPACKAGES_WITH_PIPELINES:
        get_pipelines: Optional[Callable[[], Dict[str, Pipeline]]] = getattr(subpackage, GET_PIPELINE_FN_NAME, None)
        if get_pipelines is None:
            logging.warning(f"Couldn't find `{GET_PIPELINE_FN_NAME}` function in `{subpackage}` DeepCV subpackage or submodule.")
            continue
        pipeline_map.update({n: p.decorate(*DECORATORS) for n, p in get_pipelines().items()})
    return {**pipeline_map, "__default__": reduce(operator.add, pipeline_map.values())}
if __name__ == '__main__':
    # Simply call `deepcv.pipeline.create_pipelines`
    pipelines = create_pipelines()
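The `"__default__"` entry returned by `create_pipelines` folds every registered sub-pipeline into one via `reduce(operator.add, ...)`. A minimal stand-alone sketch of that merging pattern, using plain lists in place of kedro `Pipeline` objects (both overload `+` to concatenate):

```python
import operator
from functools import reduce

# Stand-ins for kedro Pipeline objects; lists also concatenate under `+`,
# so reduce(operator.add, ...) folds them all into one "__default__" value.
pipeline_map = {'classification': [1, 2], 'detection': [3], 'keypoints': [4]}
default = reduce(operator.add, pipeline_map.values())
print(default)  # [1, 2, 3, 4]
```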
# === src/OTLMOW/OTLModel/Classes/Vlierstruweel.py (davidvlaminck/OTLMOW, MIT) ===
# coding=utf-8
from OTLMOW.OTLModel.Classes.Struweel import Struweel


# Generated with OTLClassCreator. To modify: extend, do not edit
class Vlierstruweel(Struweel):
    """S5 - common elder (dominant)."""

    typeURI = 'https://wegenenverkeer.data.vlaanderen.be/ns/onderdeel#Vlierstruweel'
    """The URI of the object according to https://www.w3.org/2001/XMLSchema#anyURI."""

    def __init__(self):
        super().__init__()
# === Chapter 10/Code/twilio-test.py (professor-li/book-dow-iot-projects, MIT) ===
from twilio.rest import Client
account_sid = '<<your account_sid>>'
auth_token = '<<your auth_token>>'

client = Client(account_sid, auth_token)

message = client.messages.create(
    body='Twilio says hello!',
    from_='<<your Twilio number>>',
    to='<<your cell number>>'
)
print(message.sid)
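Hard-coding `account_sid`/`auth_token` as in the snippet above works for a quick test, but a common variant reads them from the environment instead. A small sketch (the `TWILIO_ACCOUNT_SID`/`TWILIO_AUTH_TOKEN` variable names are a convention assumed here, not required by the library):

```python
import os

# Fall back to the book's placeholders when the environment variables
# are unset, so the snippet stays runnable either way.
account_sid = os.environ.get('TWILIO_ACCOUNT_SID', '<<your account_sid>>')
auth_token = os.environ.get('TWILIO_AUTH_TOKEN', '<<your auth_token>>')
print(bool(account_sid) and bool(auth_token))  # True
```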
# === tests/unit/customer/test_forms.py (iicc/django-oscar, BSD-3-Clause) ===
from unittest import mock
from django.core.exceptions import ValidationError
from django.test import TestCase
from oscar.apps.customer.forms import EmailUserCreationForm
class TestEmailUserCreationForm(TestCase):

    @mock.patch('oscar.apps.customer.forms.validate_password')
    def test_validator_passed_populated_user(self, mocked_validate):
        mocked_validate.side_effect = ValidationError('That password is rubbish')
        form = EmailUserCreationForm(data={'email': 'terry@boom.com', 'password1': 'terry', 'password2': 'terry'})
        self.assertFalse(form.is_valid())
        mocked_validate.assert_called_once_with('terry', form.instance)
        self.assertEqual(mocked_validate.call_args[0][1].email, 'terry@boom.com')
        self.assertEqual(form.errors['password2'], ['That password is rubbish'])
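The test above leans on two `unittest.mock` features: `side_effect` to make the patched validator raise, and `call_args` to inspect what it was called with. A library-free sketch of the same pattern (the argument values here are hypothetical, not oscar's actual API):

```python
from unittest import mock

# A Mock configured to raise, standing in for the patched validate_password.
validator = mock.Mock(side_effect=ValueError('That password is rubbish'))

try:
    validator('terry', 'user-instance')  # hypothetical arguments
except ValueError as exc:
    print(exc)  # That password is rubbish

# The mock recorded the call before raising, so the same assertions
# used in the django test work here too.
validator.assert_called_once_with('terry', 'user-instance')
print(validator.call_args[0][0])  # terry
```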
# === ex048.py (paulo-caixeta/Exercicios_Curso_Python, MIT) ===
# Computes the sum of all the odd numbers that are multiples of three
soma_impar = 0
cont = 0
for c in range(1, 501, 2):
    if (c % 3) == 0:
        cont += 1
        soma_impar += c
print('The sum of the {} odd multiples of 3 from 0 to 500 is {}'.format(cont, soma_impar))
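As a cross-check on the loop above: the odd multiples of 3 between 1 and 500 are exactly 3, 9, 15, ..., 495, an arithmetic progression with step 6, so the same count and sum fall out of a single `range`:

```python
# Odd multiples of 3 are multiples of 3 that are not multiples of 6,
# i.e. 3, 9, 15, ..., 495: start 3, step 6.
count = len(range(3, 501, 6))
total = sum(range(3, 501, 6))
print(count, total)  # 83 20667
```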
# === projdir/app/music.py (NITKOSG/InfoGami, MIT) ===
from django.shortcuts import render, redirect, get_object_or_404
from django.http import HttpResponse
from django.contrib.auth.models import User
from .models import MusicModel
from .views import loginRequired
@loginRequired
def music_list(request):
    music_list = MusicModel.objects.all()
    return render(request, 'music/music_list.html', {'music_list': music_list})
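`loginRequired` is imported from `.views` above, so its definition is not shown here. A hypothetical stand-in illustrating the usual shape of such a decorator (the names and the string return values are assumptions made so the sketch runs without Django, not the project's actual code):

```python
from functools import wraps

def login_required_sketch(view):
    """Hypothetical stand-in for the project's loginRequired decorator."""
    @wraps(view)
    def wrapper(request, *args, **kwargs):
        # In Django this check would be request.user.is_authenticated
        # and the return value a redirect(); strings keep the sketch runnable.
        if not getattr(request, 'user_is_authenticated', False):
            return 'redirect-to-login'
        return view(request, *args, **kwargs)
    return wrapper

@login_required_sketch
def music_list_sketch(request):
    return 'music/music_list.html'

class FakeRequest:
    user_is_authenticated = True

print(music_list_sketch(FakeRequest()))  # music/music_list.html
print(music_list_sketch(object()))       # redirect-to-login
```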
# === Unit 05 Lists & Dictionaries/02 A Day at the Supermarket/Looping and lists/1-beFOR we begin.py (lpython2006e/python-samples, MIT) ===
names = ["Adam", "Alex", "Mariah", "Martine", "Columbus"]
for word in names:
    print(word)
# === pysnmp-with-texts/HPNSAECC-MIB.py (agustinhenze/mibs.snmplabs.com, Apache-2.0) ===
#
# PySNMP MIB module HPNSAECC-MIB (http://snmplabs.com/pysmi)
# ASN.1 source file:///Users/davwang4/Dev/mibs.snmplabs.com/asn1/HPNSAECC-MIB
# Produced by pysmi-0.3.4 at Wed May 1 13:42:09 2019
# On host DAVWANG4-M-1475 platform Darwin version 18.5.0 by user davwang4
# Using Python version 3.7.3 (default, Mar 27 2019, 09:23:15)
#
Integer, OctetString, ObjectIdentifier = mibBuilder.importSymbols("ASN1", "Integer", "OctetString", "ObjectIdentifier")
NamedValues, = mibBuilder.importSymbols("ASN1-ENUMERATION", "NamedValues")
SingleValueConstraint, ConstraintsUnion, ConstraintsIntersection, ValueSizeConstraint, ValueRangeConstraint = mibBuilder.importSymbols("ASN1-REFINEMENT", "SingleValueConstraint", "ConstraintsUnion", "ConstraintsIntersection", "ValueSizeConstraint", "ValueRangeConstraint")
NotificationGroup, ModuleCompliance = mibBuilder.importSymbols("SNMPv2-CONF", "NotificationGroup", "ModuleCompliance")
Counter64, MibIdentifier, Bits, MibScalar, MibTable, MibTableRow, MibTableColumn, Counter32, NotificationType, iso, Integer32, Gauge32, IpAddress, enterprises, Unsigned32, ObjectIdentity, NotificationType, ModuleIdentity, TimeTicks = mibBuilder.importSymbols("SNMPv2-SMI", "Counter64", "MibIdentifier", "Bits", "MibScalar", "MibTable", "MibTableRow", "MibTableColumn", "Counter32", "NotificationType", "iso", "Integer32", "Gauge32", "IpAddress", "enterprises", "Unsigned32", "ObjectIdentity", "NotificationType", "ModuleIdentity", "TimeTicks")
DisplayString, TextualConvention = mibBuilder.importSymbols("SNMPv2-TC", "DisplayString", "TextualConvention")
hp = MibIdentifier((1, 3, 6, 1, 4, 1, 11))
nm = MibIdentifier((1, 3, 6, 1, 4, 1, 11, 2))
hpnsa = MibIdentifier((1, 3, 6, 1, 4, 1, 11, 2, 23))
hpnsaECC = MibIdentifier((1, 3, 6, 1, 4, 1, 11, 2, 23, 6))
hpnsaEccMibRev = MibIdentifier((1, 3, 6, 1, 4, 1, 11, 2, 23, 6, 1))
hpnsaEccAgent = MibIdentifier((1, 3, 6, 1, 4, 1, 11, 2, 23, 6, 2))
hpnsaEccLog = MibIdentifier((1, 3, 6, 1, 4, 1, 11, 2, 23, 6, 3))
hpnsaEccMibRevMajor = MibScalar((1, 3, 6, 1, 4, 1, 11, 2, 23, 6, 1, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 65535))).setMaxAccess("readonly")
if mibBuilder.loadTexts: hpnsaEccMibRevMajor.setStatus('mandatory')
if mibBuilder.loadTexts: hpnsaEccMibRevMajor.setDescription('The major revision level of the MIB.')
hpnsaEccMibRevMinor = MibScalar((1, 3, 6, 1, 4, 1, 11, 2, 23, 6, 1, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setMaxAccess("readonly")
if mibBuilder.loadTexts: hpnsaEccMibRevMinor.setStatus('mandatory')
if mibBuilder.loadTexts: hpnsaEccMibRevMinor.setDescription('The minor revision level of the MIB.')
hpnsaEccAgentTable = MibTable((1, 3, 6, 1, 4, 1, 11, 2, 23, 6, 2, 1), )
if mibBuilder.loadTexts: hpnsaEccAgentTable.setStatus('mandatory')
if mibBuilder.loadTexts: hpnsaEccAgentTable.setDescription('A table of SNMP Agents that satisfy requests for this MIB.')
hpnsaEccAgentEntry = MibTableRow((1, 3, 6, 1, 4, 1, 11, 2, 23, 6, 2, 1, 1), ).setIndexNames((0, "HPNSAECC-MIB", "hpnsaEccAgentIndex"))
if mibBuilder.loadTexts: hpnsaEccAgentEntry.setStatus('mandatory')
if mibBuilder.loadTexts: hpnsaEccAgentEntry.setDescription('A description of the agents that access ECC Memory related information.')
hpnsaEccAgentIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 11, 2, 23, 6, 2, 1, 1, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 255))).setMaxAccess("readonly")
if mibBuilder.loadTexts: hpnsaEccAgentIndex.setStatus('mandatory')
if mibBuilder.loadTexts: hpnsaEccAgentIndex.setDescription('A unique index for this module description.')
hpnsaEccAgentName = MibTableColumn((1, 3, 6, 1, 4, 1, 11, 2, 23, 6, 2, 1, 1, 2), DisplayString().subtype(subtypeSpec=ValueSizeConstraint(0, 255))).setMaxAccess("readonly")
if mibBuilder.loadTexts: hpnsaEccAgentName.setStatus('mandatory')
if mibBuilder.loadTexts: hpnsaEccAgentName.setDescription('Name of the Agent/Agents satisfying SNMP requests for this MIB.')
hpnsaEccAgentVersion = MibTableColumn((1, 3, 6, 1, 4, 1, 11, 2, 23, 6, 2, 1, 1, 3), DisplayString().subtype(subtypeSpec=ValueSizeConstraint(0, 5))).setMaxAccess("readonly")
if mibBuilder.loadTexts: hpnsaEccAgentVersion.setStatus('mandatory')
if mibBuilder.loadTexts: hpnsaEccAgentVersion.setDescription('Version number of the Agent/Agents satisfying SNMP requests for this MIB.')
hpnsaEccAgentDate = MibTableColumn((1, 3, 6, 1, 4, 1, 11, 2, 23, 6, 2, 1, 1, 4), OctetString().subtype(subtypeSpec=ValueSizeConstraint(6, 6)).setFixedLength(6)).setMaxAccess("readonly")
if mibBuilder.loadTexts: hpnsaEccAgentDate.setStatus('mandatory')
if mibBuilder.loadTexts: hpnsaEccAgentDate.setDescription('The date on which this Agent was created. field octets contents range _________________________________________________ 1 1 years since 1900 0..255 2 2 month 1..12 3 3 day 1..31 4 4 hour 0..23 5 5 minute 0..59 6 6 second 0..59 ')
hpnsaEccStatus = MibScalar((1, 3, 6, 1, 4, 1, 11, 2, 23, 6, 3, 1), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3))).clone(namedValues=NamedValues(("notSupported", 1), ("disabled", 2), ("enabled", 3)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: hpnsaEccStatus.setStatus('mandatory')
if mibBuilder.loadTexts: hpnsaEccStatus.setDescription('ECC memory system tracking status: 1 - ECC memory is not supported in this machine 2 - ECC memory logging is disabled due to some errors (example, too many single or multiple bits error occurred in a short period of time) 3 - ECC memory logging is enabled and functioning.')
hpnsaEccEraseLog = MibScalar((1, 3, 6, 1, 4, 1, 11, 2, 23, 6, 3, 2), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: hpnsaEccEraseLog.setStatus('mandatory')
if mibBuilder.loadTexts: hpnsaEccEraseLog.setDescription("Set this variable to integer value 1234 and without changing it again before hpnsaEccPollTime expired, will erase the system's Log area.")
hpnsaEccTotalErrCorrected = MibScalar((1, 3, 6, 1, 4, 1, 11, 2, 23, 6, 3, 3), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: hpnsaEccTotalErrCorrected.setStatus('mandatory')
if mibBuilder.loadTexts: hpnsaEccTotalErrCorrected.setDescription('Total number of ECC memory error had occurred and been corrected.')
hpnsaEccTrapEnable = MibScalar((1, 3, 6, 1, 4, 1, 11, 2, 23, 6, 3, 4), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 0))).clone(namedValues=NamedValues(("trapOn", 1), ("trapOff", 0)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: hpnsaEccTrapEnable.setStatus('mandatory')
if mibBuilder.loadTexts: hpnsaEccTrapEnable.setDescription('Set this variable to 1, the ECC memory errors are forwarded as SNMP traps. No trap are generated if this variable is set to 0.')
hpnsaEccTrapDelay = MibScalar((1, 3, 6, 1, 4, 1, 11, 2, 23, 6, 3, 5), Integer32().subtype(subtypeSpec=ValueRangeConstraint(100, 5000))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: hpnsaEccTrapDelay.setStatus('mandatory')
if mibBuilder.loadTexts: hpnsaEccTrapDelay.setDescription('Delay in milliseconds between the sending of ECC traps.')
hpnsaEccPollTime = MibScalar((1, 3, 6, 1, 4, 1, 11, 2, 23, 6, 3, 6), Integer32().subtype(subtypeSpec=ValueRangeConstraint(60, 2592000))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: hpnsaEccPollTime.setStatus('mandatory')
if mibBuilder.loadTexts: hpnsaEccPollTime.setDescription('Seconds between checking of ECC memory error.')
hpnsaEccMemErrTable = MibTable((1, 3, 6, 1, 4, 1, 11, 2, 23, 6, 3, 7), )
if mibBuilder.loadTexts: hpnsaEccMemErrTable.setStatus('mandatory')
if mibBuilder.loadTexts: hpnsaEccMemErrTable.setDescription('A table of ECC memory error descriptions.')
hpnsaEccMemErrEntry = MibTableRow((1, 3, 6, 1, 4, 1, 11, 2, 23, 6, 3, 7, 1), ).setIndexNames((0, "HPNSAECC-MIB", "hpnsaEccMemErrIndex"))
if mibBuilder.loadTexts: hpnsaEccMemErrEntry.setStatus('mandatory')
if mibBuilder.loadTexts: hpnsaEccMemErrEntry.setDescription('ECC memory error description.')
hpnsaEccMemErrIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 11, 2, 23, 6, 3, 7, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: hpnsaEccMemErrIndex.setStatus('mandatory')
if mibBuilder.loadTexts: hpnsaEccMemErrIndex.setDescription('A unique index for the ECC memory error log.')
hpnsaEccMemErrTime = MibTableColumn((1, 3, 6, 1, 4, 1, 11, 2, 23, 6, 3, 7, 1, 2), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: hpnsaEccMemErrTime.setStatus('mandatory')
if mibBuilder.loadTexts: hpnsaEccMemErrTime.setDescription('The Server local time when the ECC memory error occurred.')
hpnsaEccMemErrDesc = MibTableColumn((1, 3, 6, 1, 4, 1, 11, 2, 23, 6, 3, 7, 1, 3), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: hpnsaEccMemErrDesc.setStatus('mandatory')
if mibBuilder.loadTexts: hpnsaEccMemErrDesc.setDescription('A string indicating the SIMM location when ECC memory error occurred.')
hpnsaEccErrorCorrected = NotificationType((1, 3, 6, 1, 4, 1, 11, 2, 23, 6) + (0,4353))
if mibBuilder.loadTexts: hpnsaEccErrorCorrected.setDescription('An ECC single-bit error has been corrected in one of the memory modules')
hpnsaEccSBEOverflow = NotificationType((1, 3, 6, 1, 4, 1, 11, 2, 23, 6) + (0,4354))
if mibBuilder.loadTexts: hpnsaEccSBEOverflow.setDescription("Error logging for ECC single-bit errors has been disabled due to too many SBE's detected in a short time period")
hpnsaEccMemoryResize = NotificationType((1, 3, 6, 1, 4, 1, 11, 2, 23, 6) + (0,4355))
if mibBuilder.loadTexts: hpnsaEccMemoryResize.setDescription('ECC Memory size has been adjusted during the Power-On-Self-Test during the last boot due to a failed memory module')
hpnsaEccMultiBitError = NotificationType((1, 3, 6, 1, 4, 1, 11, 2, 23, 6) + (0,4357))
if mibBuilder.loadTexts: hpnsaEccMultiBitError.setDescription('An ECC double-bit error has occurred in one of the memory modules')
hpnsaEccMultiBitErrorOverflow = NotificationType((1, 3, 6, 1, 4, 1, 11, 2, 23, 6) + (0,4358))
if mibBuilder.loadTexts: hpnsaEccMultiBitErrorOverflow.setDescription("Error logging for ECC multiple-bit errors has been disabled due to too many MBE's detected in a short time period")
mibBuilder.exportSymbols("HPNSAECC-MIB", hpnsa=hpnsa, hpnsaEccMemErrDesc=hpnsaEccMemErrDesc, hpnsaEccAgentVersion=hpnsaEccAgentVersion, hp=hp, hpnsaEccMibRevMajor=hpnsaEccMibRevMajor, hpnsaEccMemErrTable=hpnsaEccMemErrTable, hpnsaEccLog=hpnsaEccLog, hpnsaEccAgentEntry=hpnsaEccAgentEntry, hpnsaEccAgentDate=hpnsaEccAgentDate, hpnsaEccAgentIndex=hpnsaEccAgentIndex, hpnsaEccMibRev=hpnsaEccMibRev, hpnsaEccMibRevMinor=hpnsaEccMibRevMinor, hpnsaEccMemErrEntry=hpnsaEccMemErrEntry, hpnsaEccErrorCorrected=hpnsaEccErrorCorrected, hpnsaEccAgent=hpnsaEccAgent, hpnsaEccMemErrTime=hpnsaEccMemErrTime, hpnsaEccMemErrIndex=hpnsaEccMemErrIndex, hpnsaEccMemoryResize=hpnsaEccMemoryResize, hpnsaEccMultiBitErrorOverflow=hpnsaEccMultiBitErrorOverflow, hpnsaEccMultiBitError=hpnsaEccMultiBitError, hpnsaEccTrapEnable=hpnsaEccTrapEnable, hpnsaEccAgentTable=hpnsaEccAgentTable, hpnsaEccTrapDelay=hpnsaEccTrapDelay, hpnsaEccEraseLog=hpnsaEccEraseLog, hpnsaEccSBEOverflow=hpnsaEccSBEOverflow, hpnsaEccTotalErrCorrected=hpnsaEccTotalErrCorrected, hpnsaEccAgentName=hpnsaEccAgentName, hpnsaEccPollTime=hpnsaEccPollTime, hpnsaEccStatus=hpnsaEccStatus, hpnsaECC=hpnsaECC, nm=nm)
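The generated module above is essentially a tree of OID tuples rooted under the HP enterprise arc. A dependency-free sketch of that hierarchy, using plain tuples instead of pysnmp `MibIdentifier` objects:

```python
# Mirror of the OID arcs declared above, as plain tuples.
hp = (1, 3, 6, 1, 4, 1, 11)   # enterprises.hp
nm = hp + (2,)                # hp.nm
hpnsa = nm + (23,)            # nm.hpnsa
hpnsaECC = hpnsa + (6,)       # hpnsa.hpnsaECC

print('.'.join(map(str, hpnsaECC)))  # 1.3.6.1.4.1.11.2.23.6
```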
# === mlxtend/mlxtend/regressor/tests/test_stacking_regression.py (WhiteWolf21/fp-growth, MIT) ===
# Sebastian Raschka 2014-2020
# mlxtend Machine Learning Library Extensions
# Author: Sebastian Raschka <sebastianraschka.com>
#
# License: BSD 3 clause
import pytest
import numpy as np
from mlxtend.externals.estimator_checks import NotFittedError
from mlxtend.utils import assert_raises
from mlxtend.regressor import StackingRegressor
from sklearn.linear_model import LinearRegression
from sklearn.linear_model import Ridge
from sklearn.linear_model import Lasso
from sklearn.svm import SVR
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import GridSearchCV
from sklearn.model_selection import train_test_split
from scipy import sparse
from numpy.testing import assert_almost_equal
from sklearn.base import clone
# Generating a sample dataset
np.random.seed(1)
X1 = np.sort(5 * np.random.rand(40, 1), axis=0)
X2 = np.sort(5 * np.random.rand(40, 2), axis=0)
y = np.sin(X1).ravel()
y[::5] += 3 * (0.5 - np.random.rand(8))
y2 = np.sin(X2)
w = np.random.random(40)
def test_different_models():
lr = LinearRegression()
svr_lin = SVR(kernel='linear', gamma='auto')
ridge = Ridge(random_state=1)
svr_rbf = SVR(kernel='rbf', gamma='auto')
stregr = StackingRegressor(regressors=[svr_lin, lr, ridge],
meta_regressor=svr_rbf)
stregr.fit(X1, y).predict(X1)
mse = 0.21
got = np.mean((stregr.predict(X1) - y) ** 2)
assert round(got, 2) == mse
def test_multivariate():
lr = LinearRegression()
svr_lin = SVR(kernel='linear', gamma='auto')
ridge = Ridge(random_state=1)
svr_rbf = SVR(kernel='rbf', gamma='auto')
stregr = StackingRegressor(regressors=[svr_lin, lr, ridge],
meta_regressor=svr_rbf)
stregr.fit(X2, y).predict(X2)
mse = 0.22
got = np.mean((stregr.predict(X2) - y) ** 2)
assert round(got, 2) == mse
def test_multivariate_class():
lr = LinearRegression()
ridge = Ridge(random_state=1)
meta = LinearRegression(normalize=True)
stregr = StackingRegressor(regressors=[lr, ridge],
meta_regressor=meta)
stregr.fit(X2, y2).predict(X2)
mse = 0.12
got = np.mean((stregr.predict(X2) - y2) ** 2.)
# there seems to be an issue with the following test on Windows
# sometimes via Appveyor
assert round(got, 2) == mse, got
def test_sample_weight():
lr = LinearRegression()
svr_lin = SVR(kernel='linear', gamma='auto')
ridge = Ridge(random_state=1)
svr_rbf = SVR(kernel='rbf', gamma='auto')
stregr = StackingRegressor(regressors=[svr_lin, lr, ridge],
meta_regressor=svr_rbf)
pred1 = stregr.fit(X1, y, sample_weight=w).predict(X1)
mse = 0.22
got = np.mean((stregr.predict(X1) - y) ** 2)
assert round(got, 2) == mse
# make sure that this is not equivalent to the model with no weight
pred2 = stregr.fit(X1, y).predict(X1)
maxdiff = np.max(np.abs(pred1 - pred2))
assert maxdiff > 1e-3, "max diff is %.4f" % maxdiff
def test_weight_ones():
# sample weight of ones should produce equivalent outcome as no weight
lr = LinearRegression()
svr_lin = SVR(kernel='linear', gamma='auto')
ridge = Ridge(random_state=1)
svr_rbf = SVR(kernel='rbf', gamma='auto')
stregr = StackingRegressor(regressors=[svr_lin, lr, ridge],
meta_regressor=svr_rbf)
pred1 = stregr.fit(X1, y).predict(X1)
pred2 = stregr.fit(X1, y, sample_weight=np.ones(40)).predict(X1)
maxdiff = np.max(np.abs(pred1 - pred2))
assert maxdiff < 1e-3, "max diff is %.4f" % maxdiff
def test_weight_unsupported_regressor():
# including regressor that does not support
# sample_weight should raise error
lr = LinearRegression()
svr_lin = SVR(kernel='linear', gamma='auto')
ridge = Ridge(random_state=1)
svr_rbf = SVR(kernel='rbf', gamma='auto')
lasso = Lasso(random_state=1)
stregr = StackingRegressor(regressors=[svr_lin, lr, ridge, lasso],
meta_regressor=svr_rbf)
with pytest.raises(TypeError):
stregr.fit(X1, y, sample_weight=w).predict(X1)
def test_weight_unsupported_meta():
# meta regressor with no support for
# sample_weight should raise error
lr = LinearRegression()
svr_lin = SVR(kernel='linear', gamma='auto')
ridge = Ridge(random_state=1)
lasso = Lasso(random_state=1)
stregr = StackingRegressor(regressors=[svr_lin, lr, ridge],
meta_regressor=lasso)
with pytest.raises(TypeError):
stregr.fit(X1, y, sample_weight=w).predict(X1)
def test_weight_unsupported_with_no_weight():
# pass no weight to regressors with no weight support
# should not be a problem
lr = LinearRegression()
svr_lin = SVR(kernel='linear', gamma='auto')
ridge = Ridge(random_state=1)
svr_rbf = SVR(kernel='rbf', gamma='auto')
lasso = Lasso(random_state=1)
stregr = StackingRegressor(regressors=[svr_lin, lr, ridge, lasso],
meta_regressor=svr_rbf)
stregr.fit(X1, y).predict(X1)
stregr = StackingRegressor(regressors=[svr_lin, lr, ridge],
meta_regressor=lasso)
stregr.fit(X1, y).predict(X1)
def test_gridsearch():
lr = LinearRegression()
svr_lin = SVR(kernel='linear', gamma='auto')
ridge = Ridge(random_state=1)
svr_rbf = SVR(kernel='rbf', gamma='auto')
stregr = StackingRegressor(regressors=[svr_lin, lr, ridge],
meta_regressor=svr_rbf)
params = {'ridge__alpha': [0.01, 1.0],
'svr__C': [0.01, 1.0],
'meta_regressor__C': [0.01, 1.0]}
grid = GridSearchCV(estimator=stregr,
param_grid=params,
cv=5,
iid=False,
refit=True,
verbose=0)
grid = grid.fit(X1, y)
best = 0.1
got = round(grid.best_score_, 2)
assert best == got
def test_gridsearch_numerate_regr():
svr_lin = SVR(kernel='linear', gamma='auto')
ridge = Ridge(random_state=1)
svr_rbf = SVR(kernel='rbf', gamma='auto')
stregr = StackingRegressor(regressors=[svr_lin, ridge, ridge],
meta_regressor=svr_rbf)
params = {'ridge-1__alpha': [0.01, 1.0],
'ridge-2__alpha': [0.01, 1.0],
'svr__C': [0.01, 1.0],
'meta_regressor__C': [0.01, 1.0]}
grid = GridSearchCV(estimator=stregr,
param_grid=params,
cv=5,
iid=False,
refit=True,
verbose=0)
grid = grid.fit(X1, y)
best = 0.1
got = round(grid.best_score_, 2)
assert best == got
def test_get_coeff():
lr = LinearRegression()
svr_lin = SVR(kernel='linear', gamma='auto')
ridge = Ridge(random_state=1)
stregr = StackingRegressor(regressors=[svr_lin, lr],
meta_regressor=ridge)
stregr.fit(X1, y)
got = stregr.coef_
expect = np.array([0.4874216, 0.45518317])
assert_almost_equal(got, expect)
def test_get_intercept():
lr = LinearRegression()
svr_lin = SVR(kernel='linear', gamma='auto')
ridge = Ridge(random_state=1)
stregr = StackingRegressor(regressors=[svr_lin, lr],
meta_regressor=ridge)
stregr.fit(X1, y)
got = stregr.intercept_
expect = 0.02
assert round(got, 2) == expect
# ValueError was changed to AttributeError in sklearn >= 0.19
def test_get_coeff_fail():
lr = LinearRegression()
svr_rbf = SVR(kernel='rbf', gamma='auto')
ridge = Ridge(random_state=1)
stregr = StackingRegressor(regressors=[ridge, lr],
meta_regressor=svr_rbf)
with pytest.raises(AttributeError):
stregr = stregr.fit(X1, y)
r = stregr.coef_
assert r
def test_get_params():
lr = LinearRegression()
svr_rbf = SVR(kernel='rbf', gamma='auto')
ridge = Ridge(random_state=1)
stregr = StackingRegressor(regressors=[ridge, lr],
meta_regressor=svr_rbf)
got = sorted(list({s.split('__')[0] for s in stregr.get_params().keys()}))
expect = ['linearregression',
'meta_regressor',
'refit',
'regressors',
'ridge',
'store_train_meta_features',
'use_features_in_secondary',
'verbose']
assert got == expect, got
def test_regressor_gridsearch():
lr = LinearRegression()
svr_rbf = SVR(kernel='rbf', gamma='auto')
ridge = Ridge(random_state=1)
stregr = StackingRegressor(regressors=[lr],
meta_regressor=svr_rbf)
params = {'regressors': [[lr], [lr, ridge]]}
grid = GridSearchCV(estimator=stregr,
param_grid=params,
cv=5,
iid=False,
refit=True)
grid.fit(X1, y)
assert len(grid.best_params_['regressors']) == 2
def test_predict_meta_features():
lr = LinearRegression()
svr_rbf = SVR(kernel='rbf', gamma='auto')
ridge = Ridge(random_state=1)
stregr = StackingRegressor(regressors=[lr, ridge],
meta_regressor=svr_rbf)
X_train, X_test, y_train, y_test = train_test_split(X2, y, test_size=0.3)
stregr.fit(X_train, y_train)
test_meta_features = stregr.predict(X_test)
assert test_meta_features.shape[0] == X_test.shape[0]
def test_train_meta_features_():
lr = LinearRegression()
svr_rbf = SVR(kernel='rbf', gamma='auto')
ridge = Ridge(random_state=1)
stregr = StackingRegressor(regressors=[lr, ridge],
meta_regressor=svr_rbf,
store_train_meta_features=True)
X_train, X_test, y_train, y_test = train_test_split(X2, y, test_size=0.3)
stregr.fit(X_train, y_train)
train_meta_features = stregr.train_meta_features_
assert train_meta_features.shape[0] == X_train.shape[0]
def test_not_fitted_predict():
lr = LinearRegression()
svr_rbf = SVR(kernel='rbf', gamma='auto')
ridge = Ridge(random_state=1)
stregr = StackingRegressor(regressors=[lr, ridge],
meta_regressor=svr_rbf,
store_train_meta_features=True)
X_train, X_test, y_train, y_test = train_test_split(X2, y, test_size=0.3)
expect = ("This StackingRegressor instance is not fitted yet. Call "
"'fit' with appropriate arguments before using this method.")
assert_raises(NotFittedError,
expect,
stregr.predict,
X_train)
assert_raises(NotFittedError,
expect,
stregr.predict_meta_features,
X_train)
def test_clone():
lr = LinearRegression()
svr_rbf = SVR(kernel='rbf', gamma='auto')
ridge = Ridge(random_state=1)
stregr = StackingRegressor(regressors=[lr, ridge],
meta_regressor=svr_rbf,
store_train_meta_features=True)
clone(stregr)
def test_features_in_secondary():
lr = LinearRegression()
svr_lin = SVR(kernel='linear', gamma='auto')
rf = RandomForestRegressor(n_estimators=10, random_state=2)
ridge = Ridge(random_state=0)
svr_rbf = SVR(kernel='rbf', gamma='auto')
stack = StackingRegressor(regressors=[svr_lin, lr, ridge, rf],
meta_regressor=svr_rbf,
use_features_in_secondary=True)
stack.fit(X1, y).predict(X1)
mse = 0.14
got = np.mean((stack.predict(X1) - y) ** 2)
print(got)
assert round(got, 2) == mse
stack = StackingRegressor(regressors=[svr_lin, lr, ridge, rf],
meta_regressor=svr_rbf,
use_features_in_secondary=False)
# dense
stack.fit(X1, y).predict(X1)
mse = 0.12
got = np.mean((stack.predict(X1) - y) ** 2)
print(got)
assert round(got, 2) == mse
def test_predictions_from_sparse_matrix():
lr = LinearRegression()
svr_lin = SVR(kernel='linear', gamma='auto')
ridge = Ridge(random_state=1)
stregr = StackingRegressor(regressors=[svr_lin, lr],
meta_regressor=ridge)
# dense
stregr.fit(X1, y)
print(stregr.score(X1, y))
assert round(stregr.score(X1, y), 2) == 0.61
# sparse
stregr.fit(sparse.csr_matrix(X1), y)
print(stregr.score(X1, y))
assert round(stregr.score(X1, y), 2) == 0.61
def test_sparse_matrix_inputs_and_features_in_secondary():
lr = LinearRegression()
svr_lin = SVR(kernel='linear', gamma='auto')
rf = RandomForestRegressor(n_estimators=10, random_state=2)
ridge = Ridge(random_state=0)
svr_rbf = SVR(kernel='rbf', gamma='auto')
stack = StackingRegressor(regressors=[svr_lin, lr, ridge, rf],
meta_regressor=svr_rbf,
use_features_in_secondary=True)
# dense
stack.fit(X1, y).predict(X1)
mse = 0.14
got = np.mean((stack.predict(X1) - y) ** 2)
assert round(got, 2) == mse
# sparse
stack.fit(sparse.csr_matrix(X1), y)
mse = 0.14
got = np.mean((stack.predict(sparse.csr_matrix(X1)) - y) ** 2)
assert round(got, 2) == mse
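# The tests above all exercise the same stacking idea: each base regressor's
# prediction becomes a meta-feature fed to a meta-regressor. A dependency-free
# sketch of that data flow (illustrative only, not mlxtend's implementation):

```python
# Two toy "base models" and a trivial "meta model" that averages their outputs.
def base_a(x):
    return 2 * x

def base_b(x):
    return 2 * x + 1

def meta(row):
    # the meta-model consumes the base predictions (the meta-features)
    return sum(row) / len(row)

X = [1.0, 2.0, 3.0]
meta_features = [[base_a(x), base_b(x)] for x in X]  # one row per sample
preds = [meta(row) for row in meta_features]
print(preds)  # [2.5, 4.5, 6.5]
```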

# src/genotype/neat/node.py (repo: sash-a/CoDeepNEAT, license: MIT)
from __future__ import annotations
from typing import TYPE_CHECKING
import random
from enum import Enum
from configuration import config
from src.genotype.mutagen.option import Option
from src.genotype.neat.gene import Gene
if TYPE_CHECKING:
pass
class NodeType(Enum):
INPUT = 0
HIDDEN = 1
OUTPUT = 2
class Node(Gene):
"""General neat node"""
def __init__(self, id, type: NodeType = NodeType.HIDDEN):
super().__init__(id)
self.node_type: NodeType = type
# TODO
        self.lossy_aggregation = Option(
            'lossy', False, True,
            current_value=random.choices(
                [False, True],
                weights=[1 - config.lossy_chance, config.lossy_chance])[0],
            mutation_chance=0.3 if config.mutate_lossy_values else 0)
        self.try_conv_aggregation = Option(
            'conv_aggregation', False, True,
            current_value=random.choice([False, True]))

        mult_chance = config.element_wise_multiplication_chance
        mult_weights = [1 - mult_chance, mult_chance]
        self.element_wise_multiplication_aggregation = Option(
            'element_wise_multiplication_aggregation', False, True,
            current_value=random.choices([False, True], weights=mult_weights)[0],
            mutation_chance=0.2 if mult_chance > 0 else 0,
            probability_weighting=mult_weights)
def is_output_node(self):
return self.node_type == NodeType.OUTPUT
def is_input_node(self):
return self.node_type == NodeType.INPUT
def get_all_mutagens(self):
return [self.lossy_aggregation, self.try_conv_aggregation]
def convert_node(self, **kwargs):
        raise NotImplementedError()
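# The weighted initialisation above (random.choices with [1 - chance, chance])
# is the core trick; a self-contained sketch of it, with the repo's config and
# Option classes stubbed out since they live elsewhere:

```python
import random
from enum import Enum

class NodeType(Enum):
    INPUT = 0
    HIDDEN = 1
    OUTPUT = 2

# weighted coin flip, as used to initialise lossy_aggregation
lossy_chance = 0.3
current_value = random.choices([False, True],
                               weights=[1 - lossy_chance, lossy_chance])[0]
node_type = NodeType.HIDDEN
print(node_type == NodeType.OUTPUT, isinstance(current_value, bool))  # False True
```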

# hparams.py (repo: jayg996/wave-u-net-pytorch, license: MIT)
import yaml
class HParams(object):
    # Hyperparameter class using yaml
    def __init__(self, **kwargs):
        self.__dict__ = kwargs

    def add(self, **kwargs):
        # only add keys that do not exist yet; existing values are kept
        for key, value in kwargs.items():
            self.__dict__.setdefault(key, value)
def update(self, **kwargs):
self.__dict__.update(kwargs)
return self
def save(self, path):
with open(path, 'w') as f:
yaml.dump(self.__dict__, f)
return self
def __repr__(self):
return '\nHyperparameters:\n' + '\n'.join([' {}={}'.format(k, v) for k, v in self.__dict__.items()])
@classmethod
def load(cls, path):
with open(path, 'r') as f:
            return cls(**yaml.safe_load(f))
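# HParams works by assigning the keyword arguments straight into __dict__, so
# every key becomes an attribute; a minimal illustration of that pattern:

```python
class AttrBag:
    # same trick as HParams: keyword arguments become attributes
    def __init__(self, **kwargs):
        self.__dict__ = kwargs

bag = AttrBag(lr=0.001, batch_size=16)
print(bag.lr, bag.batch_size)  # 0.001 16
```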
if __name__ == '__main__':
hparams = HParams.load('hparams.yaml')
print(hparams)
d = {"MemoryNetwork": 0, "c": 1}
hparams.add(**d)
    print(hparams)

# code/util/defn.py (repo: jiavila/hpc-client-1, license: MIT)
from pydantic import BaseModel, Field
from typing import Any, List, Optional
class ConfigFileCast(BaseModel):
"""
Represents cast.yml cast settings
"""
cluster: str
dry_run: bool
admin_contact_email: str
group_whitelist: bool
cast_on_tag: bool
cast_gear_whitelist: List[str] = Field(default_factory=list)
show_script_template_values: bool
show_script_template_result: bool
show_commnd_template_result: bool
command: Optional[List[str]]
command_script_stdin: Optional[bool]
script: Optional[str]
script_executable: Optional[bool]
use_hold_engine: bool
class ConfigFile(BaseModel):
"""
Represents a cast.yml settings file
"""
cast: ConfigFileCast
class Paths(BaseModel):
"""
Represents various absolute paths used by the application
"""
cast_path: str
yaml_path: str
scripts_path: str
hpc_logs_path: str
engine_run_path: str
class CredentialEnv(BaseModel):
"""
Represents parsed environment variable settings
"""
host: str
port: int
credential: str
class Config(BaseModel):
"""
Represents all cast settings as used by the application
"""
cast: ConfigFileCast
paths: Paths
creds: CredentialEnv
sdk: Optional[Any]
class JobSettings(BaseModel):
"""
Represents all HPC-relevant job settings.
"""
fw_id: str
singularity_debug: bool
singularity_writable: bool
# The meaning of the following values vary by cluster type.
ram: Optional[str]
cpu: Optional[str]
class ScriptTemplate(BaseModel):
"""
Reresents the values available when templating an HPC script.
"""
job: JobSettings
script_path: str
script_log_path: str
cast_path: str
engine_run_path: str

# django_socketio_chat/serializers.py (repo: leukeleu/django-socketio-chat, license: Apache-2.0)
from django.contrib.auth.models import User
from rest_framework import serializers
from rest_framework.fields import Field, CharField
from .models import ChatSession, Chat, UserChatStatus, Message
class UUIDFieldSerializerMixin(serializers.ModelSerializer):
"""
Django REST Framework does not know what to do with UUIDFields.
TODO: can this be made into a real Mixin that doesn't inherit from the serializers.ModelSerializer base class?
"""
def get_field(self, model_field):
if (model_field.name == 'uuid'):
return CharField()
return super(UUIDFieldSerializerMixin, self).get_field(model_field)
# ---[ viewpoint = User ]--- #
class ChatSessionSerializer(serializers.ModelSerializer):
username = Field(source='user.username')
class Meta:
model = ChatSession
fields = ('username', 'status')
class UserSerializer(serializers.ModelSerializer):
# TODO: add `availability` field / property to User (via UserProfile?): availability = Field(source='get_availability')
status = serializers.SerializerMethodField('get_status')
class Meta:
model = User
fields = ('username', 'status')
def get_status(self, obj):
pass
# ---[ viewpoint = Chat ]--- #
class UserChatStatusSerializer(UUIDFieldSerializerMixin, serializers.ModelSerializer):
user = UserSerializer()
unread_messages = Field(source='unread_messages')
class Meta:
model = UserChatStatus
fields = ('user', 'status', 'joined', 'unread_messages')
class ChatMessageSerializer(UUIDFieldSerializerMixin, serializers.ModelSerializer):
user_from__username = Field(source='user_from.username')
chat__uuid = Field(source='chat.uuid')
class Meta:
model = Message
fields = ('uuid', 'chat__uuid', 'timestamp', 'user_from__username', 'message_body',)
class ChatSerializer(UUIDFieldSerializerMixin, serializers.ModelSerializer):
user_chat_statuses = UserChatStatusSerializer()
messages = ChatMessageSerializer()
class Meta:
model = Chat
queryset = Chat.objects.filter(uuid='74546e66ed5546ddb70faaca326a4b95')
fields = ('uuid', 'started', 'user_chat_statuses', 'messages')
# ---[ viewpoint = Message ]-------- #
class MessageSerializer(UUIDFieldSerializerMixin, serializers.ModelSerializer):
chat__uuid = Field(source='chat.uuid')
user_from__username = Field(source='user_from.username')
class Meta:
model = Message
fields = ('uuid', 'timestamp', 'chat__uuid', 'user_from__username', 'message_body',)

# models/laser.py (repo: debaraj-barua/Atari-Space-Invaders, license: MIT)
import pygame
from screens.background import slow_bg_obj
from utils.collide import collide
class Laser:
def __init__(self, x, y, img):
self.x = x
self.y = y
self.img = img
self.mask = pygame.mask.from_surface(self.img)
def draw(self, window):
# making laser's coordinates centered in the sprite
background_width = slow_bg_obj.rectBGimg.width
screen_rect = window.get_rect()
center_x = screen_rect.centerx
starting_x = center_x - background_width//2
x_offset, y_offset = self.img.get_size()
window.blit(self.img, (starting_x+self.x-x_offset/2, self.y-y_offset/2))
def move(self, vel):
self.y += vel
def off_screen(self, height):
return not(height >= self.y >= 0)
def collision(self, obj):
return collide(self, obj)
def get_width(self):
return self.img.get_width()
def get_height(self):
        return self.img.get_height()

# src/ebonite/client/helpers.py (repo: koskotG/ebonite, license: Apache-2.0)
from typing import Any, Dict
from ebonite.core.objects.core import Model
def create_model(model_object, input_data, model_name: str = None, params: Dict[str, Any] = None,
description: str = None) -> Model:
"""
Creates Model instance from arbitrary model objects and sample of input data
:param model_object: model object (function, sklearn model, tensorflow output tensor list etc)
:param input_data: sample of input data (numpy array, pandas dataframe, feed dict etc)
:param model_name: name for model in database, if not provided will be autogenerated
:param params: dict with arbitrary parameters. Must be json-serializable
:param description: text description of this model
:return: :class:`~ebonite.core.objects.core.Model` instance
"""
return Model.create(model_object, input_data, model_name, params, description)
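# The docstring notes that params must be json-serializable; that invariant can
# be checked cheaply before calling create_model (a generic snippet, not part of
# ebonite's API):

```python
import json

def is_json_serializable(params):
    # json.dumps raises TypeError for values JSON cannot represent
    try:
        json.dumps(params)
        return True
    except TypeError:
        return False

print(is_json_serializable({'epochs': 10, 'lr': 0.01}))  # True
print(is_json_serializable({'bad': object()}))           # False
```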

# src/sane_fin_site/fin_storage/admin.py (repo: StanleySane/sane-fin-site, license: BSD-3-Clause)
from django.contrib import admin
from .models import (
Exporter, InstrumentValue, DownloadedInterval, CachedItem, SourceApiActuality)
admin.site.register(Exporter)
admin.site.register(InstrumentValue)
admin.site.register(DownloadedInterval)
admin.site.register(CachedItem)
admin.site.register(SourceApiActuality)

# src/server/app/repository/KeybindingRepository.py (repo: MatthiasRiener/Slidea, license: MIT)
from ..db.settings import mongoclient
import json
import os
from bson import json_util
class KeybindingRepository():
def __init__(self, testing):
self.testing = testing
def createKeybindings(self, u_id):
keybindings = self.readJson("shortcuts.json")
mongoclient.db['keybinding'].insert_one(
{'u_id': u_id, 'bindings': keybindings})
def loadEasterEggs(self):
return self.readJson("eastereggs.json")
def getKeybindings(self, u_id):
res = mongoclient.db['keybinding'].find_one({"u_id": u_id})
return json.loads(json_util.dumps(res))
def updateKeybinds(self, keybinds, u_id):
        # Collection.update was deprecated in PyMongo 3 and removed in PyMongo 4;
        # update_one is the direct replacement for a single-document update.
        mongoclient.db['keybinding'].update_one(
            {"u_id": u_id}, {"$set": {"bindings": keybinds}})
return 1
def readJson(self, jsonfile_path):
__location__ = os.path.realpath(
os.path.join(os.getcwd(), os.path.dirname(__file__)))
with open(os.path.join(__location__, jsonfile_path)) as json_file:
data = json.load(json_file)
res = list()
for bind in data:
res.append(bind)
return res
| 30.342105 | 74 | 0.623591 | 138 | 1,153 | 4.971014 | 0.405797 | 0.039359 | 0.100583 | 0.026239 | 0.026239 | 0 | 0 | 0 | 0 | 0 | 0 | 0.00116 | 0.252385 | 1,153 | 37 | 75 | 31.162162 | 0.794664 | 0 | 0 | 0 | 0 | 0 | 0.078925 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.206897 | false | 0 | 0.137931 | 0.034483 | 0.517241 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
81dda7f3e7674268450ce78d23d04e382556e6f7 | 1,986 | py | Python | sandbox/order/migrations/0009_surcharge.py | thelabnyc/django-oscar-cch | d98832c9cf642c6d241e3aaf4b1dc631c3d5ce0e | [
"0BSD"
] | 14 | 2017-04-19T01:20:47.000Z | 2021-03-31T13:19:55.000Z | sandbox/order/migrations/0009_surcharge.py | thelabnyc/django-oscar-wfrs | 9abd4ecbdafd597407fdf60657103cb5d29c4c8b | [
"0BSD"
] | 24 | 2019-12-04T21:37:01.000Z | 2022-03-11T23:16:20.000Z | sandbox/sandbox/order/migrations/0009_surcharge.py | thelabnyc/django-oscar-api-checkout | bf66901cff4467b26e2c50260b3e1a61404b8b74 | [
"0BSD"
] | 4 | 2017-07-22T19:47:10.000Z | 2021-10-30T14:20:54.000Z | # Generated by Django 2.2.6 on 2020-02-19 09:16
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
dependencies = [
("order", "0008_auto_20190301_1035"),
]
replaces = [
("order", "0008_surcharge"),
]
operations = [
migrations.CreateModel(
name="Surcharge",
fields=[
(
"id",
models.AutoField(
auto_created=True,
primary_key=True,
serialize=False,
verbose_name="ID",
),
),
("name", models.CharField(max_length=128, verbose_name="Surcharge")),
(
"code",
models.CharField(max_length=128, verbose_name="Surcharge code"),
),
(
"incl_tax",
models.DecimalField(
decimal_places=2,
default=0,
max_digits=12,
verbose_name="Surcharge (inc. tax)",
),
),
(
"excl_tax",
models.DecimalField(
decimal_places=2,
default=0,
max_digits=12,
verbose_name="Surcharge (excl. tax)",
),
),
(
"order",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
related_name="surcharges",
to="order.Order",
verbose_name="Surcharges",
),
),
],
options={"abstract": False, "ordering": ["pk"]},
),
]

# migrations/20210527_01_WsYmD-add-account-role-table.py (repo: ThatOneAnimeGuy/seiso, license: BSD-3-Clause)
"""
Add account_role table
"""
from yoyo import step
__depends__ = {'20210524_01_9elM3-add-account-artist-subscription-table'}
steps = [
step("""
INSERT INTO account (id, username, password_hash) VALUES (1, 'admin', '$2y$12$NcBlnm9W1sJ14R99c/jTTuGEBz6YFCxOVrtxKyhr3cdb654vRfX1u') ON CONFLICT DO NOTHING;
CREATE TABLE account_role (
id serial NOT NULL PRIMARY KEY,
account_id int NOT NULL REFERENCES account(id),
role varchar(20) NOT NULL,
UNIQUE (account_id, role)
);
INSERT INTO account_role (account_id, role) VALUES (1, 'admin');
""")
]
| 30.047619 | 165 | 0.649762 | 74 | 631 | 5.364865 | 0.581081 | 0.11335 | 0.098237 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.065126 | 0.245642 | 631 | 20 | 166 | 31.55 | 0.768908 | 0.034865 | 0 | 0 | 0 | 0.071429 | 0.876872 | 0.196339 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.071429 | 0.071429 | 0 | 0.071429 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
c48cd23da45abc794ac035b5452691e53293b621 | 429 | py | Python | server/tests/api/test_tasks.py | tedmiston/prefect | a2cb40c28c942b1d170db42a55bab99598a4dcd6 | [
"ECL-2.0",
"Apache-2.0"
] | 1 | 2020-05-10T14:32:32.000Z | 2020-05-10T14:32:32.000Z | server/tests/api/test_tasks.py | tedmiston/prefect | a2cb40c28c942b1d170db42a55bab99598a4dcd6 | [
"ECL-2.0",
"Apache-2.0"
] | 3 | 2022-02-14T11:25:57.000Z | 2022-02-27T16:25:14.000Z | server/tests/api/test_tasks.py | tedmiston/prefect | a2cb40c28c942b1d170db42a55bab99598a4dcd6 | [
"ECL-2.0",
"Apache-2.0"
] | 1 | 2020-05-04T13:22:11.000Z | 2020-05-04T13:22:11.000Z | # Licensed under the Prefect Community License, available at
# https://www.prefect.io/legal/prefect-community-license
import prefect
import prefect_server
from prefect_server.database import models
class TestCreateTask:
async def test_task_auto_generated_default_to_false(self, task_id):
        # rename the local so it is not confused with the task_id fixture
        task = await models.Task.where(id=task_id).first({"id", "auto_generated"})
        assert task.auto_generated is False

# example1.py (repo: FreakX23/EBook_Training, license: MIT)
print('Hello my dear')  # comments are essential, but I am always too lazy
print('what is your name?')
myName = input()
print('it is nice to meet you, ' + myName)
print('the length of your name is:')
print(len(myName))
print('what is your age?')
myAge = input()
print('you will be ' + str(int(myAge) + 1) + ' in a year')

# impgrpc/compiled/federate_pb2.py (repo: sako0938/imp-grpc-client, license: MIT)
# -*- coding: utf-8 -*-
# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: federate.proto
"""Generated protocol buffer code."""
from google.protobuf import descriptor as _descriptor
from google.protobuf import message as _message
from google.protobuf import reflection as _reflection
from google.protobuf import symbol_database as _symbol_database
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
from google.api import annotations_pb2 as google_dot_api_dot_annotations__pb2
from protoc_gen_openapiv2.options import annotations_pb2 as protoc__gen__openapiv2_dot_options_dot_annotations__pb2
DESCRIPTOR = _descriptor.FileDescriptor(
name='federate.proto',
package='federate',
syntax='proto3',
serialized_options=b'Z#github.com/imperviousai/freeimp/gen\222A\237\001\022@\n\021Federate Services\"&\n\rImpervious AI\022\025https://impervious.ai2\0031.0*\003\001\002\0042\020application/json:\020application/jsonr2\n\024Documentation on IMP\022\032https://docs.impervious.ai',
create_key=_descriptor._internal_create_key,
serialized_pb=b'\n\x0e\x66\x65\x64\x65rate.proto\x12\x08\x66\x65\x64\x65rate\x1a\x1cgoogle/api/annotations.proto\x1a.protoc-gen-openapiv2/options/annotations.proto\"(\n\x16RequestFederateRequest\x12\x0e\n\x06pubkey\x18\x01 \x01(\t\"%\n\x17RequestFederateResponse\x12\n\n\x02id\x18\x01 \x01(\t\"(\n\x16LeaveFederationRequest\x12\x0e\n\x06pubkey\x18\x01 \x01(\t\"%\n\x17LeaveFederationResponse\x12\n\n\x02id\x18\x01 \x01(\t2\xfa\x01\n\x08\x46\x65\x64\x65rate\x12w\n\x0fRequestFederate\x12 .federate.RequestFederateRequest\x1a!.federate.RequestFederateResponse\"\x1f\x82\xd3\xe4\x93\x02\x19\"\x14/v1/federate/request:\x01*\x12u\n\x0fLeaveFederation\x12 .federate.LeaveFederationRequest\x1a!.federate.LeaveFederationResponse\"\x1d\x82\xd3\xe4\x93\x02\x17\"\x12/v1/federate/leave:\x01*B\xc8\x01Z#github.com/imperviousai/freeimp/gen\x92\x41\x9f\x01\x12@\n\x11\x46\x65\x64\x65rate Services\"&\n\rImpervious AI\x12\x15https://impervious.ai2\x03\x31.0*\x03\x01\x02\x04\x32\x10\x61pplication/json:\x10\x61pplication/jsonr2\n\x14\x44ocumentation on IMP\x12\x1ahttps://docs.impervious.aib\x06proto3'
,
dependencies=[google_dot_api_dot_annotations__pb2.DESCRIPTOR,protoc__gen__openapiv2_dot_options_dot_annotations__pb2.DESCRIPTOR,])
_REQUESTFEDERATEREQUEST = _descriptor.Descriptor(
name='RequestFederateRequest',
full_name='federate.RequestFederateRequest',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='pubkey', full_name='federate.RequestFederateRequest.pubkey', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=106,
serialized_end=146,
)
_REQUESTFEDERATERESPONSE = _descriptor.Descriptor(
name='RequestFederateResponse',
full_name='federate.RequestFederateResponse',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='id', full_name='federate.RequestFederateResponse.id', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=148,
serialized_end=185,
)
_LEAVEFEDERATIONREQUEST = _descriptor.Descriptor(
name='LeaveFederationRequest',
full_name='federate.LeaveFederationRequest',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='pubkey', full_name='federate.LeaveFederationRequest.pubkey', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=187,
serialized_end=227,
)
_LEAVEFEDERATIONRESPONSE = _descriptor.Descriptor(
name='LeaveFederationResponse',
full_name='federate.LeaveFederationResponse',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='id', full_name='federate.LeaveFederationResponse.id', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=229,
serialized_end=266,
)
DESCRIPTOR.message_types_by_name['RequestFederateRequest'] = _REQUESTFEDERATEREQUEST
DESCRIPTOR.message_types_by_name['RequestFederateResponse'] = _REQUESTFEDERATERESPONSE
DESCRIPTOR.message_types_by_name['LeaveFederationRequest'] = _LEAVEFEDERATIONREQUEST
DESCRIPTOR.message_types_by_name['LeaveFederationResponse'] = _LEAVEFEDERATIONRESPONSE
_sym_db.RegisterFileDescriptor(DESCRIPTOR)
RequestFederateRequest = _reflection.GeneratedProtocolMessageType('RequestFederateRequest', (_message.Message,), {
'DESCRIPTOR' : _REQUESTFEDERATEREQUEST,
'__module__' : 'federate_pb2'
# @@protoc_insertion_point(class_scope:federate.RequestFederateRequest)
})
_sym_db.RegisterMessage(RequestFederateRequest)
RequestFederateResponse = _reflection.GeneratedProtocolMessageType('RequestFederateResponse', (_message.Message,), {
'DESCRIPTOR' : _REQUESTFEDERATERESPONSE,
'__module__' : 'federate_pb2'
# @@protoc_insertion_point(class_scope:federate.RequestFederateResponse)
})
_sym_db.RegisterMessage(RequestFederateResponse)
LeaveFederationRequest = _reflection.GeneratedProtocolMessageType('LeaveFederationRequest', (_message.Message,), {
'DESCRIPTOR' : _LEAVEFEDERATIONREQUEST,
'__module__' : 'federate_pb2'
# @@protoc_insertion_point(class_scope:federate.LeaveFederationRequest)
})
_sym_db.RegisterMessage(LeaveFederationRequest)
LeaveFederationResponse = _reflection.GeneratedProtocolMessageType('LeaveFederationResponse', (_message.Message,), {
'DESCRIPTOR' : _LEAVEFEDERATIONRESPONSE,
'__module__' : 'federate_pb2'
# @@protoc_insertion_point(class_scope:federate.LeaveFederationResponse)
})
_sym_db.RegisterMessage(LeaveFederationResponse)
DESCRIPTOR._options = None
_FEDERATE = _descriptor.ServiceDescriptor(
name='Federate',
full_name='federate.Federate',
file=DESCRIPTOR,
index=0,
serialized_options=None,
create_key=_descriptor._internal_create_key,
serialized_start=269,
serialized_end=519,
methods=[
_descriptor.MethodDescriptor(
name='RequestFederate',
full_name='federate.Federate.RequestFederate',
index=0,
containing_service=None,
input_type=_REQUESTFEDERATEREQUEST,
output_type=_REQUESTFEDERATERESPONSE,
serialized_options=b'\202\323\344\223\002\031\"\024/v1/federate/request:\001*',
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name='LeaveFederation',
full_name='federate.Federate.LeaveFederation',
index=1,
containing_service=None,
input_type=_LEAVEFEDERATIONREQUEST,
output_type=_LEAVEFEDERATIONRESPONSE,
serialized_options=b'\202\323\344\223\002\027\"\022/v1/federate/leave:\001*',
create_key=_descriptor._internal_create_key,
),
])
_sym_db.RegisterServiceDescriptor(_FEDERATE)
DESCRIPTOR.services_by_name['Federate'] = _FEDERATE
# @@protoc_insertion_point(module_scope)
| 37.848485 | 1,089 | 0.780281 | 984 | 8,743 | 6.622967 | 0.207317 | 0.033144 | 0.037901 | 0.049716 | 0.445911 | 0.405094 | 0.396195 | 0.363664 | 0.353537 | 0.294614 | 0 | 0.047637 | 0.099623 | 8,743 | 230 | 1,090 | 38.013043 | 0.780234 | 0.05593 | 0 | 0.589744 | 1 | 0.020513 | 0.281917 | 0.204854 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.030769 | 0 | 0.030769 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
c4af7d52cdb73dcb35c726e2aa79c469bf78fa18 | 649 | py | Python | app/constants/web_services.py | CityOfNewYork/NYCOpenRecords | 476a236a573e6f3a2f96c6537a30ee27b2bd3a2b | [
"Apache-2.0"
] | 37 | 2016-01-21T18:33:56.000Z | 2021-10-24T01:43:20.000Z | app/constants/web_services.py | CityOfNewYork/NYCOpenRecords | 476a236a573e6f3a2f96c6537a30ee27b2bd3a2b | [
"Apache-2.0"
] | 179 | 2016-01-21T21:33:31.000Z | 2022-02-15T21:31:35.000Z | app/constants/web_services.py | CityOfNewYork/NYCOpenRecords | 476a236a573e6f3a2f96c6537a30ee27b2bd3a2b | [
"Apache-2.0"
] | 13 | 2017-05-19T17:27:31.000Z | 2020-07-05T00:55:29.000Z | from urllib.parse import urljoin
OAUTH_BASE_ENDPOINT = "account/api/oauth/"
AUTH_ENDPOINT = urljoin(OAUTH_BASE_ENDPOINT, 'authorize.htm')
USER_ENDPOINT = urljoin(OAUTH_BASE_ENDPOINT, 'user.htm')
EMAIL_VALIDATION_ENDPOINT = "/account/validateEmail.htm"
EMAIL_VALIDATION_STATUS_ENDPOINT = "/account/api/isEmailValidated.htm"
TOU_ENDPOINT = "/account/user/termsOfUse.htm"
TOU_STATUS_ENDPOINT = "/account/api/isTermsOfUseCurrent.htm"
ENROLLMENT_ENDPOINT = "/account/api/enrollment.htm"
ENROLLMENT_STATUS_ENDPOINT = "/account/api/getEnrollment.htm"
USER_SEARCH_ENDPOINT = "/account/api/user.htm"
USERS_SEARCH_ENDPOINT = "/account/api/getUsers.htm"
| 34.157895 | 70 | 0.813559 | 81 | 649 | 6.234568 | 0.333333 | 0.267327 | 0.249505 | 0.142574 | 0.126733 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.069337 | 649 | 18 | 71 | 36.055556 | 0.836093 | 0 | 0 | 0 | 0 | 0 | 0.40832 | 0.348228 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.083333 | 0 | 0.083333 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
c4b7dc0c7a717c06139826632a890575c6b24259 | 761 | py | Python | labtool_ex2/src/labtool_ex2/uarray.py | Bierbunker/Lab-Tool | cb65fa4b39135708045c0da161e9e5e0c908623c | [
"MIT"
] | 1 | 2022-01-12T21:26:37.000Z | 2022-01-12T21:26:37.000Z | labtool_ex2/src/labtool_ex2/uarray.py | Bierbunker/Lab-Tool | cb65fa4b39135708045c0da161e9e5e0c908623c | [
"MIT"
] | 10 | 2022-01-11T20:02:38.000Z | 2022-01-17T14:18:12.000Z | labtool_ex2/src/labtool_ex2/uarray.py | etschgi1/Lab-Tool | cb65fa4b39135708045c0da161e9e5e0c908623c | [
"MIT"
] | null | null | null | from numpy import asarray, ndarray
class UArray(ndarray):
def __new__(cls, input_array):
# Input array is an already formed ndarray instance
# We first cast to be our class type
return asarray(input_array).view(cls)
@property
def n(self):
return asarray([catch(lambda x: x.n, value) for value in self])
# ufloat_from_str(x.__format__()).n
@property
def s(self):
return asarray([catch(lambda x: x.s, value, handle=lambda _: 0) for value in self])
# https://stackoverflow.com/questions/1528237/how-to-handle-exceptions-in-a-list-comprehensions
def catch(func, arg, handle=lambda x: x):
try:
return func(arg)
except AttributeError:
return handle(arg)
| 26.241379 | 95 | 0.65046 | 106 | 761 | 4.54717 | 0.528302 | 0.062241 | 0.049793 | 0.091286 | 0.124481 | 0.124481 | 0.124481 | 0 | 0 | 0 | 0 | 0.013937 | 0.245729 | 761 | 28 | 96 | 27.178571 | 0.825784 | 0.278581 | 0 | 0.133333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.266667 | false | 0 | 0.066667 | 0.2 | 0.733333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
c4c19eb3f262e6d056b85447f39800b1eefc0f79 | 6,151 | py | Python | digsby/src/tests/testutil/testfsm.py | ifwe/digsby | f5fe00244744aa131e07f09348d10563f3d8fa99 | [
"Python-2.0"
] | 35 | 2015-08-15T14:32:38.000Z | 2021-12-09T16:21:26.000Z | digsby/src/tests/testutil/testfsm.py | niterain/digsby | 16a62c7df1018a49eaa8151c0f8b881c7e252949 | [
"Python-2.0"
] | 4 | 2015-09-12T10:42:57.000Z | 2017-02-27T04:05:51.000Z | digsby/src/tests/testutil/testfsm.py | niterain/digsby | 16a62c7df1018a49eaa8151c0f8b881c7e252949 | [
"Python-2.0"
] | 15 | 2015-07-10T23:58:07.000Z | 2022-01-23T22:16:33.000Z | from util.fsm import StateMachine
if __name__ == "__main__":
machine = StateMachine('status', ["off", "fleft", "left", "bleft",
"fright", "right", "bright"], "left")
machine.create_trans("left", "fleft", "otherleft")
print machine.process("otherleft")
#>>> states = ["off", "fleft", "left", "bleft", "fright", "right", "bright"]
#>>> ops = ["buddy_icon_disabled", "buddy_icon_enabled"]
#>>> ops2 = ["enabled", "disabled", "off", "fleft", "left", "bleft", "fright", "right", "bright"]
#>>> ["status_" + op for op in ops2]
#['status_enabled', 'status_disabled', 'status_off', 'status_fleft', 'status_left', 'status_bleft', 'status_fright', 'status_right', 'status_bright']
#>>> ops3 = ["status_" + op for op in ops2]
#>>> for state in states:
#... for op in ops + ops3:
#... print "machine.create_transition('%s', '%s', None)" % (state, op)
import sys
sys.exit(0)
"""
Below is what I believe to be a complete state machine definition for
a status or service icon, neglecting the off states.
So far, my idea for the implementation is incomplete: it lacks
the interaction between state machines necessary for the whole thing to
work. One can see, however, that the number of needed transitions is
much smaller than the number of possible ones.
"""
machine = None
to_state = None
'buddy_icon_disabled'
'buddy_icon_left'
'buddy_icon_right'
states = ['fleft', 'left', 'bleft_l', 'bright_l', 'bleft_r', 'bright_r', 'right', 'fright']
#definition of simple transitions (someone else wants my spot)
#I'm in the far left, other bumps me
machine.create_trans('fleft', 'left', 'other_fleft')
#I'm in the left, other bumps me
machine.create_trans('left', 'fleft', 'other_left')
#buddy icon is on the left
#badge on left
machine.create_trans('bleft_l', 'bright_l', 'other_bleft_l')
#badge on right
machine.create_trans('bright_l', 'bleft_l', 'other_bright_l')
#buddy icon is on the right
#badge on left
machine.create_trans('bleft_r', 'bright_r', 'other_bleft_r')
#badge on right
machine.create_trans('bright_r', 'bleft_r', 'other_bright_r')
#I'm in the far right, other bumps me
machine.create_trans('fright', 'right', 'other_fright')
#I'm in the right, other bumps me
machine.create_trans('right', 'fright', 'other_right')
#definition of buddy icon translation
#badge on left
machine.create_trans('bleft_l', 'bleft_r', 'buddy_icon_right')
#badge on right
machine.create_trans('bright_l', 'bright_r', 'buddy_icon_right')
#badge on left
machine.create_trans('bleft_r', 'bleft_l', 'buddy_icon_left')
#badge on right
machine.create_trans('bright_r', 'bright_l', 'buddy_icon_left')
#these are the hard ones
#buddy icon disabled. where to go
#ok, the definition is easy, the trouble is, you have to tell them in the
#correct order. If you do, the state machines do the heavy lifting for you
#else, you get the wrong result
machine.create_trans('bleft_l', 'left', 'buddy_icon_disabled')
machine.create_trans('bright_l', 'left', 'buddy_icon_disabled')
machine.create_trans('bleft_r', 'right', 'buddy_icon_disabled')
machine.create_trans('bright_r', 'right', 'buddy_icon_disabled')
#example
states1 = ['off', 'fleft', 'left', 'bleft_l', 'bright_l',
'bleft_r', 'bright_r', 'right', 'fright']
states2 = ['off', 'left', 'right']
manager = StateManager()
status_machine = StateMachine("status", states1, "off")
service_machine = StateMachine("service", states1, "off")
buddy_icon_machine = StateMachine("buddy_icon", states2, "off")
manager.add_machine(status_machine)
manager.add_machine(service_machine)
manager.add_machine(buddy_icon_machine)
status_machine.create_trans('fleft', 'left', 'service_fleft')
status_machine.create_trans('left', 'fleft', 'service_left')
status_machine.create_trans('bleft_l', 'bright_l', 'service_bleft_l')
status_machine.create_trans('bright_l', 'bleft_l', 'service_bright_l')
status_machine.create_trans('bleft_r', 'bright_r', 'service_bleft_r')
status_machine.create_trans('bright_r', 'bleft_r', 'service_bright_r')
status_machine.create_trans('fright', 'right', 'service_fright')
status_machine.create_trans('right', 'fright', 'service_right')
status_machine.create_trans('bleft_l', 'bleft_r', 'buddy_icon_right')
status_machine.create_trans('bright_l', 'bright_r', 'buddy_icon_right')
status_machine.create_trans('bleft_r', 'bleft_l', 'buddy_icon_left')
status_machine.create_trans('bright_r', 'bright_l', 'buddy_icon_left')
status_machine.create_trans('bleft_l', 'left', 'buddy_icon_off')
status_machine.create_trans('bright_l', 'left', 'buddy_icon_off')
status_machine.create_trans('bleft_r', 'right', 'buddy_icon_off')
status_machine.create_trans('bright_r', 'right', 'buddy_icon_off')
service_machine.create_trans('fleft', 'left', 'status_fleft')
service_machine.create_trans('left', 'fleft', 'status_left')
service_machine.create_trans('bleft_l', 'bright_l', 'status_bleft_l')
service_machine.create_trans('bright_l', 'bleft_l', 'status_bright_l')
service_machine.create_trans('bleft_r', 'bright_r', 'status_bleft_r')
service_machine.create_trans('bright_r', 'bleft_r', 'status_bright_r')
service_machine.create_trans('fright', 'right', 'status_fright')
service_machine.create_trans('right', 'fright', 'status_right')
service_machine.create_trans('bleft_l', 'bleft_r', 'buddy_icon_right')
service_machine.create_trans('bright_l', 'bright_r', 'buddy_icon_right')
service_machine.create_trans('bleft_r', 'bleft_l', 'buddy_icon_left')
service_machine.create_trans('bright_r', 'bright_l', 'buddy_icon_left')
service_machine.create_trans('bleft_l', 'left', 'buddy_icon_off')
service_machine.create_trans('bright_l', 'left', 'buddy_icon_off')
service_machine.create_trans('bleft_r', 'right', 'buddy_icon_off')
service_machine.create_trans('bright_r', 'right', 'buddy_icon_off')
| 45.902985 | 149 | 0.696635 | 848 | 6,151 | 4.727594 | 0.145047 | 0.162135 | 0.220005 | 0.103268 | 0.619107 | 0.517835 | 0.492891 | 0.357446 | 0.265153 | 0.156149 | 0 | 0.002121 | 0.156722 | 6,151 | 133 | 150 | 46.24812 | 0.770773 | 0.196716 | 0 | 0 | 0 | 0 | 0.353072 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.027778 | null | null | 0.013889 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
c4d5e809cb98ba9a807b1ef26917065883da3a0e | 2,125 | py | Python | apps/comment/migrations/0002_auto_20200405_1025.py | Iamacode/blog | a679cc25b89dde548a7668e291af9c5d14530bdb | [
"Apache-2.0"
] | null | null | null | apps/comment/migrations/0002_auto_20200405_1025.py | Iamacode/blog | a679cc25b89dde548a7668e291af9c5d14530bdb | [
"Apache-2.0"
] | 8 | 2020-06-06T00:57:59.000Z | 2022-01-13T02:26:12.000Z | apps/comment/migrations/0002_auto_20200405_1025.py | Iamacode/blog | a679cc25b89dde548a7668e291af9c5d14530bdb | [
"Apache-2.0"
] | null | null | null | # -*- coding: utf-8 -*-
# Generated by Django 1.11.12 on 2020-04-05 10:25
from __future__ import unicode_literals
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
initial = True
dependencies = [
('comment', '0001_initial'),
('storm', '0001_initial'),
]
operations = [
migrations.AddField(
model_name='articlecomment',
name='belong',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='article_comments', to='storm.Article', verbose_name='所属文章'),
),
migrations.AddField(
model_name='articlecomment',
name='parent',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='articlecomment_child_comments', to='comment.ArticleComment', verbose_name='父评论'),
),
migrations.AddField(
model_name='articlecomment',
name='rep_to',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='articlecomment_rep_comments', to='comment.ArticleComment', verbose_name='回复'),
),
migrations.AddField(
model_name='aboutcomment',
name='author',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='aboutcomment_related', to='comment.CommentUser', verbose_name='评论人'),
),
migrations.AddField(
model_name='aboutcomment',
name='parent',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='aboutcomment_child_comments', to='comment.AboutComment', verbose_name='父评论'),
),
migrations.AddField(
model_name='aboutcomment',
name='rep_to',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='aboutcomment_rep_comments', to='comment.AboutComment', verbose_name='回复'),
),
]
| 42.5 | 199 | 0.659294 | 229 | 2,125 | 5.930131 | 0.257642 | 0.047128 | 0.072165 | 0.113402 | 0.73785 | 0.73785 | 0.490427 | 0.430044 | 0.430044 | 0.430044 | 0 | 0.015532 | 0.212235 | 2,125 | 49 | 200 | 43.367347 | 0.795699 | 0.032471 | 0 | 0.536585 | 1 | 0 | 0.207988 | 0.074038 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.073171 | 0 | 0.170732 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
c4ddd303d32d869ec04012023d959afeb4f20217 | 2,188 | py | Python | ros/src/twist_controller/twist_controller.py | canersu/carnd_capstone | d7afcd2fd2a0c0a6274f6c1a5233b525107b75a0 | [
"MIT"
] | null | null | null | ros/src/twist_controller/twist_controller.py | canersu/carnd_capstone | d7afcd2fd2a0c0a6274f6c1a5233b525107b75a0 | [
"MIT"
] | 7 | 2020-09-26T01:25:54.000Z | 2022-03-12T00:45:16.000Z | ros/src/twist_controller/twist_controller.py | canersu/carnd_capstone | d7afcd2fd2a0c0a6274f6c1a5233b525107b75a0 | [
"MIT"
] | null | null | null | from yaw_controller import YawController
from pid import PID
from lowpass import LowPassFilter
import rospy
GAS_DENSITY = 2.858
ONE_MPH = 0.44704
class Controller(object):
def __init__(self, vehicle_mass, fuel_capacity, brake_deadband, decel_limit, accel_limit,
wheel_radius, wheel_base,steer_ratio, max_lat_accel, max_steer_angle):
# TODO: Implement
self.yaw_controller = YawController(wheel_base, steer_ratio, 0.1, max_lat_accel, max_steer_angle)
# make a PID object to be used later for throttle control (parameters can be tuned)
kp = 0.3
ki = 0.1
kd = 0
mn_throttle = 0
mx_throttle = 0.2
self.pid_controller = PID(kp, ki, kd, mn_throttle, mx_throttle)
# LOW PASS FILTER to get rid of velocity high frequency noise
tau = 0.5
ts = 0.02
self.low_pass = LowPassFilter(tau, ts)
# Variables
self.vehicle_mass = vehicle_mass
self.fuel_capacity = fuel_capacity
self.brake_deadband = brake_deadband
self.decel_limit = decel_limit
self.accel_limit = accel_limit
self.wheel_radius = wheel_radius
self.previous_time = rospy.get_time()
def control(self, current_vel, dwb_enabled, linear_vel, angular_vel):
# TODO: Change the arg, kwarg list to suit your needs
# Return throttle, brake, steer
if not dwb_enabled:
self.pid_controller.reset()
return 0, 0, 0
current_vel = self.low_pass.filt(current_vel)
steer = self.yaw_controller.get_steering(linear_vel, angular_vel, current_vel)
vel_err = linear_vel - current_vel
current_time = rospy.get_time()
sample_time = current_time - self.previous_time
self.previous_time = current_time
throttle = self.pid_controller.step(vel_err, sample_time)
brake = 0
if linear_vel == 0 and current_vel < 0.1:
throttle = 0
brake = 400  # Carla will roll forward with only 400 N*m of torque
elif throttle < 0.1 and vel_err < 0:
throttle = 0
decel = max(vel_err, self.decel_limit)
brake = abs(decel) * self.vehicle_mass * self.wheel_radius
return throttle, brake, steer
| 33.661538 | 99 | 0.682358 | 314 | 2,188 | 4.5 | 0.359873 | 0.042463 | 0.031847 | 0.026893 | 0.03397 | 0.03397 | 0 | 0 | 0 | 0 | 0 | 0.026076 | 0.246344 | 2,188 | 64 | 100 | 34.1875 | 0.830807 | 0.137112 | 0 | 0.043478 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.015625 | 0 | 1 | 0.043478 | false | 0.065217 | 0.086957 | 0 | 0.195652 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
c4de533f8cbbfc696875c0bdb9cde147a19ab7e0 | 240 | py | Python | kolibri/plugins/coach/urls.py | arceduardvincent/kolibri | 26073dda2569bb38bfe1e08ba486e96f650d10ce | [
"MIT"
] | null | null | null | kolibri/plugins/coach/urls.py | arceduardvincent/kolibri | 26073dda2569bb38bfe1e08ba486e96f650d10ce | [
"MIT"
] | 3 | 2016-05-24T21:12:01.000Z | 2017-03-09T22:43:08.000Z | kolibri/plugins/coach/urls.py | DXCanas/kolibri | 4571fc5e5482a2dc9cd8f93dd45222a69d8a68b4 | [
"MIT"
] | 1 | 2021-07-26T11:38:29.000Z | 2021-07-26T11:38:29.000Z | from django.conf.urls import include
from django.conf.urls import url
from . import views
from .api_urls import urlpatterns
urlpatterns = [
url('^api/', include(urlpatterns)),
url('^$', views.CoachView.as_view(), name='coach'),
]
| 21.818182 | 55 | 0.708333 | 32 | 240 | 5.25 | 0.46875 | 0.178571 | 0.166667 | 0.214286 | 0.285714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.145833 | 240 | 10 | 56 | 24 | 0.819512 | 0 | 0 | 0 | 0 | 0 | 0.05 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
c4e9b2077e969a1d1fc395fa65c00cecd09f3c3e | 728 | py | Python | acam/acam/doctype/acam_factor/acam_factor.py | josephalbaph/acam | 04366f80bca2cd596ec1a272fd396cf9de8abcaf | [
"MIT"
] | null | null | null | acam/acam/doctype/acam_factor/acam_factor.py | josephalbaph/acam | 04366f80bca2cd596ec1a272fd396cf9de8abcaf | [
"MIT"
] | null | null | null | acam/acam/doctype/acam_factor/acam_factor.py | josephalbaph/acam | 04366f80bca2cd596ec1a272fd396cf9de8abcaf | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
# Copyright (c) 2020, Joseph Marie Alba and contributors
# For license information, please see license.txt
from __future__ import unicode_literals
import frappe
from frappe.model.document import Document
from frappe.utils import flt
class AcamFactor(Document):
def validate(self):
self.validate_check_factor()
def validate_check_factor(self):
self.check_factors = flt(self.distribution_services)+flt(self.distribution_connection_services) \
+flt(self.regulated_retail_services)+flt(self.non_regulated_retail_services)+flt(self.supplier_of_last_resort) \
+flt(self.wholesale_aggregator)+flt(self.related_business)+flt(self.generation)+flt(self.supply_services)+flt(self.general_purpose)
c4ea3aeed9a429ff802c0abba630e20c976bcf24 | 224 | py | Python | Chapter07/transformer/transformer/__init__.py | bpbpublications/Getting-started-with-Deep-Learning-for-Natural-Language-Processing | 89f35a8e327bd9143fdb44e84b8f7b4fdc8ae58d | [
"MIT"
] | null | null | null | Chapter07/transformer/transformer/__init__.py | bpbpublications/Getting-started-with-Deep-Learning-for-Natural-Language-Processing | 89f35a8e327bd9143fdb44e84b8f7b4fdc8ae58d | [
"MIT"
] | 1 | 2021-10-14T10:15:10.000Z | 2021-10-14T10:15:10.000Z | Chapter07/transformer/transformer/__init__.py | bpbpublications/Getting-started-with-Deep-Learning-for-Natural-Language-Processing | 89f35a8e327bd9143fdb44e84b8f7b4fdc8ae58d | [
"MIT"
] | 1 | 2022-01-02T20:57:01.000Z | 2022-01-02T20:57:01.000Z | import transformer.Optim
__all__ = [
transformer.Constants, transformer.Modules, transformer.Layers,
transformer.SubLayers, transformer.Models, transformer.Optim,
transformer.Translator, transformer.Beam]
| 32 | 68 | 0.772321 | 20 | 224 | 8.45 | 0.55 | 0.189349 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.142857 | 224 | 6 | 69 | 37.333333 | 0.880208 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.2 | 0 | 0.2 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
c4fc99fecd841b00f315010b67b204a6f307948a | 631 | py | Python | Spark/3d.py | bcspragu/Machine-Learning-Projects | b6832cbb9bb27d7e8253300f97a3ab84b1a555dc | [
"MIT"
] | null | null | null | Spark/3d.py | bcspragu/Machine-Learning-Projects | b6832cbb9bb27d7e8253300f97a3ab84b1a555dc | [
"MIT"
] | null | null | null | Spark/3d.py | bcspragu/Machine-Learning-Projects | b6832cbb9bb27d7e8253300f97a3ab84b1a555dc | [
"MIT"
] | 1 | 2018-09-26T13:13:03.000Z | 2018-09-26T13:13:03.000Z | def en_even(r):
return r[0] == "en" and len(r[1]) % 2 == 0
def en_odd(r):
return r[0] == "en" and len(r[1]) % 2 == 1
def predict(w):
def result(r):
return (r[0],r[1], np.dot(w.T, r[2])[0][0], r[3])
return result
train = rdd.filter(en_even)
test = rdd.filter(en_odd)
nxxt = train.map(x_xtranspose)
nres = nxxt.reduce(np.add)
nxy = train.map(xy_scale)
nres2 = nxy.reduce(np.add)
nweights = np.dot(np.linalg.inv(nres), nres2.T)
# Make predictions on test with our train weights
pred = test.map(predict(nweights))
# Because we already filtered by code "en"
pred = pred.filter(lambda r: r[1] == "yahoo")
| 22.535714 | 57 | 0.630745 | 118 | 631 | 3.322034 | 0.432203 | 0.020408 | 0.061224 | 0.068878 | 0.102041 | 0.102041 | 0.102041 | 0.102041 | 0.102041 | 0.102041 | 0 | 0.03301 | 0.183835 | 631 | 27 | 58 | 23.37037 | 0.728155 | 0.139461 | 0 | 0 | 0 | 0 | 0.016667 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.235294 | false | 0 | 0 | 0.176471 | 0.470588 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
f20f68709f1775a216b31d63787b51857ee2f573 | 709 | py | Python | src/codebase/models.py | ooclab/ga.npr | c23f19eabfc6220eab01a60cea8d9128e8ec14a5 | [
"MIT"
] | 1 | 2019-09-20T04:32:52.000Z | 2019-09-20T04:32:52.000Z | src/codebase/models.py | ooclab/ga.npr | c23f19eabfc6220eab01a60cea8d9128e8ec14a5 | [
"MIT"
] | null | null | null | src/codebase/models.py | ooclab/ga.npr | c23f19eabfc6220eab01a60cea8d9128e8ec14a5 | [
"MIT"
] | 1 | 2019-09-20T04:32:53.000Z | 2019-09-20T04:32:53.000Z | # pylint: disable=R0902,E1101,W0201,too-few-public-methods,W0613
import datetime
from sqlalchemy_utils import UUIDType
from sqlalchemy import (
Column,
DateTime,
Integer,
Sequence,
)
from codebase.utils.sqlalchemy import ORMBase
class User(ORMBase):
"""
Users are created and authenticated by the AuthN service; they are
stored here only to make relationship mapping convenient.
1. The uuid used as the user ID should not be placed in other association tables; an Integer primary key should be used instead
2. SQLAlchemy can then provide convenient queries
"""
__tablename__ = "authz_user"
id = Column(Integer, Sequence("authz_user_id_seq"), primary_key=True)
# TODO: although we treat the user id as a uuid here, in practice it may not be one.
# Consider creating the uid field dynamically based on user configuration instead of forcing uuid.
uuid = Column(UUIDType(), unique=True)
created = Column(DateTime(), default=datetime.datetime.utcnow)
| 22.870968 | 73 | 0.716502 | 87 | 709 | 5.724138 | 0.632184 | 0.036145 | 0.044177 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.031359 | 0.190409 | 709 | 30 | 74 | 23.633333 | 0.836237 | 0.341326 | 0 | 0 | 0 | 0 | 0.061224 | 0 | 0 | 0 | 0 | 0.033333 | 0 | 1 | 0 | false | 0 | 0.285714 | 0 | 0.642857 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
f2155c15a0ed8eab74ad7cd5242fea9605c61765 | 3,764 | py | Python | plugins/base.py | tom-tan/drill-hawk | 000bb030503d60ab952f8862fe5b4b9096e4d39d | [
"Apache-2.0"
] | 3 | 2020-04-17T09:41:37.000Z | 2020-05-16T06:07:11.000Z | plugins/base.py | tom-tan/drill-hawk | 000bb030503d60ab952f8862fe5b4b9096e4d39d | [
"Apache-2.0"
] | null | null | null | plugins/base.py | tom-tan/drill-hawk | 000bb030503d60ab952f8862fe5b4b9096e4d39d | [
"Apache-2.0"
] | null | null | null |
class DHFetchPlugin(object):
""" Drill-Hawk plugin
"""
def get_es_source(self):
""" Return the list of fields to pass as ``_source`` when
searching data in the metrics ElasticSearch.
:return: list of ElasticSearch field names to specify in ``_source`` (string list)
"""
raise NotImplementedError()
def build(self, cwl_workflow_data):
""" Add the data this plugin needs to cwl_workflow_data.
:param cwl_workflow_data: workflow information fetched from the metrics ElasticSearch
:return: the given cwl_workflow_data with the plugin's information added.
.. note::
For the standard layout of cwl_workflow_data, see
<https://github.com/inutano/cwl-metrics/tree/master/docs>
An example of cwl_workflow_data is shown below.
.. code-block:: none
:linenos:
{'workflow':
{
'prepare': {
'start_time': '2020-05-05T14:22:15',
'end_date': '2020-05-05T15:06:28',
'end_time': '2020-05-05T14:22:22'
},
...
'steps': {
'HISAT2-3': {
'stepname': 'HISAT2-3',
'start_date': '2020-05-05T14:24:37',
'end_date': '2020-05-05T14:54:06',
'reconf': {
'start_time': '2020-05-05T14:22:23',
'end_time': '2020-05-05T14:24:36',
'ra': {'start_time': '2020-05-05T14:22:23.409770',
'end_time': '2020-05-05T14:24:29.092136'}
},
},
...
"""
raise NotImplementedError()
class DHGraphPlugin(object):
def build(self, workflow_data, graph_data, steps, total_keys):
""" Add the data this plugin wants to attach to the graph data.
graph_data holds the values to be graphed (execution time, usage cost),
keyed by the step name prefixed with one of the following:
* ``id-`` : step name
* ``time-`` : execution time (seconds)
* ``cost-`` : usage cost (USD)
* ``start-`` : start time
* ``end-`` : end time
:param workflow_data: data after processing by DHFetchPlugin
:param graph_data: data for the graph (d3)
:param steps: per-step data
:param total_keys: sorted list of the steps appearing across all workflows
:return: the processed (graph_data, steps, total_keys)
"""
raise NotImplementedError()
class DHTablePlugin(object):
def build(self, workflow_table_data):
""" Process the table cells and convert each cell into HTML.
Put any columns the plugin adds into workflow_table_data["ext_columns"].
For each column's value, add a field named after the column to each step and store the value there.
.. code-block:: python
:linenos:
workflow_table_data["ext_columns"].append(column_name)
template = jinja2_env.from_string(reconf_cell_template)
for step in workflow_table_data["steps"]:
step[column_name] = template.render(step=step, content=workflow_table_data)
:param workflow_table_data: the data to be rendered as a table
:return: the processed workflow_table_data (dict)
"""
raise NotImplementedError()
class DHPlugin(object):
""" Define ``create_plugin(*args, **kwargs)`` in the plugin's Python module
and implement it so that the function returns a plugin instance.
.. code-block:: python
:linenos:
def create_plugin(*args, **kwargs):
plugin = base.DHPlugin(fetch=FetchPlugin(),
table=TablePlugin(),
graph=GraphPlugin())
return plugin
"""
def __init__(self, fetch=None, table=None, graph=None):
""" Create a Plugin instance.
:param fetch: a DHFetchPlugin instance
:param table: a DHTablePlugin instance
:param graph: a DHGraphPlugin instance
"""
self.fetch = fetch
self.table = table
self.graph = graph
| 29.873016 | 90 | 0.547821 | 322 | 3,764 | 6.21118 | 0.428571 | 0.027 | 0.044 | 0.045 | 0.141 | 0.055 | 0.024 | 0 | 0 | 0 | 0 | 0.057994 | 0.340329 | 3,764 | 125 | 91 | 30.112 | 0.747483 | 0.677471 | 0 | 0.25 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.3125 | false | 0 | 0 | 0 | 0.5625 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
f215e14d4ac7d052d5a5ea340bf8305c3cd78f11 | 1,204 | py | Python | var/spack/repos/builtin/packages/r-e1071/package.py | player1537-forks/spack | 822b7632222ec5a91dc7b7cda5fc0e08715bd47c | [
"ECL-2.0",
"Apache-2.0",
"MIT-0",
"MIT"
] | 11 | 2015-10-04T02:17:46.000Z | 2018-02-07T18:23:00.000Z | var/spack/repos/builtin/packages/r-e1071/package.py | player1537-forks/spack | 822b7632222ec5a91dc7b7cda5fc0e08715bd47c | [
"ECL-2.0",
"Apache-2.0",
"MIT-0",
"MIT"
] | 22 | 2017-08-01T22:45:10.000Z | 2022-03-10T07:46:31.000Z | var/spack/repos/builtin/packages/r-e1071/package.py | player1537-forks/spack | 822b7632222ec5a91dc7b7cda5fc0e08715bd47c | [
"ECL-2.0",
"Apache-2.0",
"MIT-0",
"MIT"
] | 4 | 2016-06-10T17:57:39.000Z | 2018-09-11T04:59:38.000Z | # Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack import *
class RE1071(RPackage):
"""Misc Functions of the Department of Statistics, Probability Theory Group
(Formerly: E1071), TU Wien.
Functions for latent class analysis, short time Fourier transform, fuzzy
clustering, support vector machines, shortest path computation, bagged
clustering, naive Bayes classifier, generalized k-nearest neighbour ..."""
cran = "e1071"
version('1.7-9', sha256='9bf9a15e7ce0b9b1a57ce3048d29cbea7f2a5bb2e91271b1b6aaafe07c852226')
version('1.7-4', sha256='e6ab871b06f500dc65f8f781cc7253f43179698784c06dab040b4aa6592f2309')
version('1.7-2', sha256='721c299ce83047312acfa3e0c4b3d4c223d84a4c53400c73465cca2c92913752')
version('1.7-1', sha256='5c5f04a51c1cd2c7dbdf69987adef9bc07116804c63992cd36d804a1daf89dfe')
version('1.6-7', sha256='7048fbc0ac17d7e3420fe68081d0e0a2176b1154ee3191d53558ea9724c7c980')
depends_on('r-class', type=('build', 'run'))
depends_on('r-proxy', type=('build', 'run'), when='@1.7-9:')
| 44.592593 | 95 | 0.76412 | 125 | 1,204 | 7.344 | 0.704 | 0.043573 | 0.039216 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.245247 | 0.126246 | 1,204 | 26 | 96 | 46.307692 | 0.627376 | 0.421096 | 0 | 0 | 0 | 0 | 0.575893 | 0.47619 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.1 | 0 | 0.3 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
f220b132bcc29d0ebdfbecf10af4b23118a286fd | 534 | py | Python | Project_2/pipelines/emojis.py | TitoGrine/IART_Project | a387cface38473fa90e132207847887b43a69cec | [
"MIT"
] | null | null | null | Project_2/pipelines/emojis.py | TitoGrine/IART_Project | a387cface38473fa90e132207847887b43a69cec | [
"MIT"
] | null | null | null | Project_2/pipelines/emojis.py | TitoGrine/IART_Project | a387cface38473fa90e132207847887b43a69cec | [
"MIT"
] | null | null | null | from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from preprocessors.emojis import Emojis
from preprocessors.tokenizer import Tokenizer
from preprocessors.utils import fit
def emojis_pipeline(x, y, clsf):
model = make_pipeline(Tokenizer(preserve_case=False, strip_handles=False), Emojis(),
TfidfVectorizer(lowercase=False, tokenizer=lambda _: _))
vectorized_x = model.fit_transform(x, y)
return fit(vectorized_x, y, clsf, oversample=True)
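The `tokenizer=lambda _: _` passed to `TfidfVectorizer` is an identity function: the pipeline's `Tokenizer` and `Emojis` steps already produce token lists, so the vectorizer must not re-tokenize or lowercase them. A tiny pure-Python TF-IDF over pre-tokenized documents illustrates the idea (simplified weighting, not sklearn's exact smoothing or normalisation):

```python
import math
from collections import Counter

def tfidf(pretokenized_docs):
    """Tiny TF-IDF over documents that are already token lists, mirroring
    TfidfVectorizer(tokenizer=lambda _: _): the "tokenizer" is the identity
    because tokenization happened in an earlier pipeline step."""
    n = len(pretokenized_docs)
    # document frequency: in how many documents each token appears
    df = Counter(tok for doc in pretokenized_docs for tok in set(doc))
    vectors = []
    for doc in pretokenized_docs:
        tf = Counter(doc)
        # rarer tokens (e.g. a specific emoji) get a higher idf weight
        vectors.append({t: tf[t] * (math.log(n / df[t]) + 1) for t in tf})
    return vectors

docs = [["happy", "😀"], ["sad", "😢"], ["happy", "day"]]
vecs = tfidf(docs)
print(vecs[0]["😀"] > vecs[0]["happy"])  # True: "😀" is rarer than "happy"
```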
| 35.6 | 88 | 0.76779 | 65 | 534 | 6.138462 | 0.476923 | 0.12782 | 0.030075 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.157303 | 534 | 14 | 89 | 38.142857 | 0.886667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.1 | false | 0 | 0.5 | 0 | 0.7 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
f2220df09f92fe0864297e956c366264130ebe59 | 354 | py | Python | day03/day3_part1.py | briannamcdonald/advent-of-code-2020 | 9699137817e750191d0d5c28633181cdade25191 | [
"MIT"
] | null | null | null | day03/day3_part1.py | briannamcdonald/advent-of-code-2020 | 9699137817e750191d0d5c28633181cdade25191 | [
"MIT"
] | null | null | null | day03/day3_part1.py | briannamcdonald/advent-of-code-2020 | 9699137817e750191d0d5c28633181cdade25191 | [
"MIT"
] | null | null | null | def main():
data = open("day3/input.txt", "r")
data = [line.strip() for line in data.readlines()]
tree_counter = 0
x = 0
for line in data:
if x >= len(line):
x = x % (len(line))
if line[x] == "#":
tree_counter += 1
x += 3
print(tree_counter)
if __name__ == "__main__":
main() | 20.823529 | 54 | 0.483051 | 48 | 354 | 3.333333 | 0.479167 | 0.20625 | 0.1125 | 0.1625 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.02193 | 0.355932 | 354 | 17 | 55 | 20.823529 | 0.679825 | 0 | 0 | 0 | 0 | 0 | 0.067606 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.071429 | false | 0 | 0 | 0 | 0.071429 | 0.071429 | 0 | 0 | 0 | null | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
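A self-contained version of the same slope walk, run against the Advent of Code 2020 day 3 sample grid (generalised to any `right`/`down` step; the sample's expected count for right 3, down 1 is 7):

```python
grid = [
    "..##.......",
    "#...#...#..",
    ".#....#..#.",
    "..#.#...#.#",
    ".#...##..#.",
    "..#.##.....",
    ".#.#.#....#",
    ".#........#",
    "#.##...#...",
    "#...##....#",
    ".#..#...#.#",
]

def count_trees(grid, right=3, down=1):
    """Count '#' cells hit when moving (right, down) per step; the map
    repeats to the right, hence the modulo on the column index."""
    trees = 0
    x = 0
    for line in grid[::down]:
        if line[x % len(line)] == "#":
            trees += 1
        x += right
    return trees

print(count_trees(grid))  # 7 trees on the sample for right 3, down 1
```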
f22896dcb2f03b010e608c1eb7af7fb1ec47ab41 | 109 | py | Python | src/config_train.py | RaulRC/genetic-neural-optimizer | fa169cdc9b43c58470c3e7a7214185d56e61579a | [
"MIT"
] | 1 | 2021-04-30T09:07:15.000Z | 2021-04-30T09:07:15.000Z | src/config_train.py | RaulRC/genetic-neural-optimizer | fa169cdc9b43c58470c3e7a7214185d56e61579a | [
"MIT"
] | 5 | 2020-01-28T23:00:10.000Z | 2022-02-10T00:16:05.000Z | src/config_train.py | RaulRC/genetic-neural-optimizer | fa169cdc9b43c58470c3e7a7214185d56e61579a | [
"MIT"
] | null | null | null | iterations = 1
generations_list = [500]
populations_list = [30]
elitism_list = [0.2, 0.8]
mutables_list = [1] | 21.8 | 25 | 0.715596 | 17 | 109 | 4.352941 | 0.705882 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.117021 | 0.137615 | 109 | 5 | 26 | 21.8 | 0.670213 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
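These lists presumably enumerate a hyper-parameter grid for the genetic-algorithm training runs; exactly how the trainer combines them is an assumption, but a Cartesian product over them would look like:

```python
import itertools

iterations = 1
generations_list = [500]
populations_list = [30]
elitism_list = [0.2, 0.8]
mutables_list = [1]

# one tuple (generations, population, elitism, mutables) per run
grid = list(itertools.product(generations_list, populations_list,
                              elitism_list, mutables_list))
print(grid)  # [(500, 30, 0.2, 1), (500, 30, 0.8, 1)]
```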
1efb11ff4ae7eb6dcc0b68a7c7f69dd951c03bad | 599 | gyp | Python | binding.gyp | Natim/node-sodium | a69acc128b2588fa1e52315155cc170e844325d8 | [
"MIT"
] | 11 | 2015-04-05T17:32:35.000Z | 2020-12-27T01:12:18.000Z | binding.gyp | Natim/node-sodium | a69acc128b2588fa1e52315155cc170e844325d8 | [
"MIT"
] | null | null | null | binding.gyp | Natim/node-sodium | a69acc128b2588fa1e52315155cc170e844325d8 | [
"MIT"
] | 2 | 2016-01-27T21:18:37.000Z | 2020-10-15T21:46:16.000Z | {
'variables': { 'target_arch%': 'ia32', 'naclversion': '0.4.5' },
'targets': [
{
'target_name': 'sodium',
'sources': [
'sodium.cc',
],
"dependencies": [
"<(module_root_dir)/deps/libsodium.gyp:libsodium"
],
'include_dirs': [
'./deps/libsodium-<(naclversion)/src/libsodium/include'
],
'cflags!': [ '-fno-exceptions' ],
}
]
}
| 28.52381 | 78 | 0.338898 | 34 | 599 | 5.823529 | 0.764706 | 0.131313 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.017123 | 0.512521 | 599 | 20 | 79 | 29.95 | 0.660959 | 0 | 0 | 0.166667 | 0 | 0 | 0.378965 | 0.166945 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
480d6781696eb74c5f6c70e662d21c9ef1de859c | 1,068 | py | Python | src/dynatrace/metric/utils/__init__.py | dynatrace-oss/dynatrace-metric-utils-python | d59cd910c55fd0042e98e5a7e61dd23d4555f530 | [
"Apache-2.0"
] | null | null | null | src/dynatrace/metric/utils/__init__.py | dynatrace-oss/dynatrace-metric-utils-python | d59cd910c55fd0042e98e5a7e61dd23d4555f530 | [
"Apache-2.0"
] | 1 | 2021-10-14T11:37:10.000Z | 2021-10-14T11:37:10.000Z | src/dynatrace/metric/utils/__init__.py | dynatrace-oss/dynatrace-metric-utils-python | d59cd910c55fd0042e98e5a7e61dd23d4555f530 | [
"Apache-2.0"
] | null | null | null | # Copyright 2021 Dynatrace LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# Ignore "imported but not used" errors. This is here, so all the types
# can directly be imported from the module, without specifying the files.
from .dynatrace_metrics_factory import DynatraceMetricsFactory # noqa: F401
from .dynatrace_metrics_serializer import \
DynatraceMetricsSerializer # noqa: F401
from .metric_error import MetricError # noqa: F401
from .dynatrace_metrics_api_constants import \
DynatraceMetricsApiConstants # noqa: F401
VERSION = "0.1.0b0"
| 41.076923 | 76 | 0.768727 | 150 | 1,068 | 5.42 | 0.633333 | 0.073801 | 0.073801 | 0.03936 | 0.068881 | 0 | 0 | 0 | 0 | 0 | 0 | 0.027119 | 0.171348 | 1,068 | 25 | 77 | 42.72 | 0.891525 | 0.697566 | 0 | 0 | 0 | 0 | 0.023179 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.571429 | 0 | 0.571429 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
480e95c749d65ccf16293c5c5839373b6040ca26 | 169 | py | Python | PythonExercicios/ex002.py | Luis-Emanuel/Python | 92936dfb005b9755a53425d16c3ff54119eebe78 | [
"MIT"
] | null | null | null | PythonExercicios/ex002.py | Luis-Emanuel/Python | 92936dfb005b9755a53425d16c3ff54119eebe78 | [
"MIT"
] | null | null | null | PythonExercicios/ex002.py | Luis-Emanuel/Python | 92936dfb005b9755a53425d16c3ff54119eebe78 | [
"MIT"
] | null | null | null | #Faça um programa que leia o nome de cada pessoa e mostre uma mensagem de boas-vindas
nome = input('Qual é seu nome?')
print('É um prazer te conhecer,{}!'.format(nome))
| 42.25 | 85 | 0.727811 | 31 | 169 | 3.967742 | 0.806452 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.159763 | 169 | 3 | 86 | 56.333333 | 0.866197 | 0.497041 | 0 | 0 | 0 | 0 | 0.511905 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.5 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 2 |
481a5b0c46da146f43e23f59a6d1d5e7117589af | 3,046 | py | Python | Example-University-System/people.py | s-c-23/Elements-of-Software-Design | 4a29b6e864b792f7dc3bafd25b13e9abd8d79798 | [
"MIT"
] | null | null | null | Example-University-System/people.py | s-c-23/Elements-of-Software-Design | 4a29b6e864b792f7dc3bafd25b13e9abd8d79798 | [
"MIT"
] | null | null | null | Example-University-System/people.py | s-c-23/Elements-of-Software-Design | 4a29b6e864b792f7dc3bafd25b13e9abd8d79798 | [
"MIT"
] | null | null | null | class Person:
'''
This class represents a person
'''
def __init__(self, id, firstname, lastname, dob):
self.id = id
self.firstname = firstname
self.lastname = lastname
self.dob = dob
def __str__(self):
return "University ID Number: " + self.id + "\nName: " + self.firstname + " " + self.lastname
def __repr__(self):
return self.firstname + " " + self.lastname
def get_salary(self):
return 0
class Student(Person):
'''
This class represents a Student
'''
def __init__(self, id, firstname, lastname, dob, start_year):
self.start_year = start_year
self.courses = []
# invoking the __init__ of the parent class
# Person.__init__(self, firstname, lastname, dob)
# or better call super()
super().__init__(id, firstname, lastname, dob)
def add_course(self, course_id):
self.courses.append(course_id)
    def get_courses(self):
        return self.courses
def __str__(self):
return super().__str__() + ". This student has the following courses on records: " + str(list(self.courses))
# A student has no salary.
def get_salary(self):
return 0
class Professor(Person):
'''
This class represents a Professor in the university system.
'''
def __init__(self, id, firstname, lastname, dob, hiring_year, salary):
self.hiring_year = hiring_year
self.salary = salary
self.courses = set()
self.research_projects = set()
super().__init__(id, firstname, lastname, dob)
def __str__(self):
return super().__str__() + ". This Professor is the instructor of record of following courses : " + str(
list(self.courses))
def add_course(self, course_id):
self.courses.add(course_id)
def add_courses(self, courses):
for course in courses:
self.courses.add(course)
    def get_courses(self):
        return self.courses
def get_salary(self):
return self.salary
class Staff(Person):
'''
This class represents a staff member.
'''
def __init__(self, id, firstname, lastname, dob, hiring_year, salary):
self.hiring_year = hiring_year
self.salary = salary
super().__init__(id, firstname, lastname, dob)
def __str__(self):
        return super().__str__() + ". This Staff member has a salary of " + str(self.salary)
def get_salary(self):
return self.salary
class Teaching_Assistant(Staff, Student):
'''
A Teaching Assistant is a student and is a staff member.
'''
def __init__(self, id, firstname, lastname, dob, start_year, hiring_year, salary):
Student.__init__(self, id, firstname, lastname, dob, start_year)
self.hiring_year = hiring_year
self.salary = salary
# Staff().__init__(self, id, firstname, lastname, dob, hiring_year, salary)
def __str__(self):
return Student.__str__(self) + Staff.__str__(self)
| 24.564516 | 116 | 0.623112 | 368 | 3,046 | 4.826087 | 0.160326 | 0.105293 | 0.123874 | 0.123874 | 0.603604 | 0.505068 | 0.498311 | 0.381194 | 0.279842 | 0.198761 | 0 | 0.000901 | 0.271504 | 3,046 | 123 | 117 | 24.764228 | 0.799459 | 0.141169 | 0 | 0.5 | 0 | 0 | 0.075069 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0 | 0.2 | 0.616667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
483bee89287d9bc690eb4dc979ee33344a130255 | 737 | py | Python | Data Structure/Binary Tree/1376. Time Needed to Inform All Employees.py | smsubham/Data-Structure-Algorithms-Questions | 45da68231907068ef4e4a0444ffdac69b337fa7c | [
"Apache-2.0"
] | null | null | null | Data Structure/Binary Tree/1376. Time Needed to Inform All Employees.py | smsubham/Data-Structure-Algorithms-Questions | 45da68231907068ef4e4a0444ffdac69b337fa7c | [
"Apache-2.0"
] | null | null | null | Data Structure/Binary Tree/1376. Time Needed to Inform All Employees.py | smsubham/Data-Structure-Algorithms-Questions | 45da68231907068ef4e4a0444ffdac69b337fa7c | [
"Apache-2.0"
] | null | null | null | #https://leetcode.com/problems/time-needed-to-inform-all-employees/
#Source: https://leetcode.com/problems/time-needed-to-inform-all-employees/discuss/532560/JavaC%2B%2BPython-DFS
from typing import List

class Solution:
def numOfMinutes(self, n: int, headID: int, manager: List[int], informTime: List[int]) -> int:
children = [[] for i in range(n)]
for i, m in enumerate(manager):
if m >= 0: children[m].append(i)
def dfs(value):
"""
            We use "or [0]" so that max() does not raise "ValueError: max()
            arg is an empty sequence" for leaf nodes with no children.
"""
return max( [dfs(i) for i in children[value]] or [0] )+informTime[value]
return dfs(headID) | 52.642857 | 146 | 0.628223 | 105 | 737 | 4.409524 | 0.561905 | 0.025918 | 0.069114 | 0.103672 | 0.233261 | 0.233261 | 0.233261 | 0.233261 | 0.233261 | 0.233261 | 0 | 0.019435 | 0.232022 | 737 | 14 | 147 | 52.642857 | 0.798587 | 0.421981 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0 | 0 | 0.625 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
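A standalone version of the same DFS, checked against a LeetCode example (the snake_case name is just for this sketch; the logic mirrors the solution above):

```python
def num_of_minutes(n, head_id, manager, inform_time):
    # build the adjacency list: children[b] holds b's direct reports
    children = [[] for _ in range(n)]
    for employee, boss in enumerate(manager):
        if boss >= 0:
            children[boss].append(employee)

    def dfs(node):
        # "or [0]" guards the leaf case, where max() would otherwise
        # raise on an empty sequence
        return max([dfs(c) for c in children[node]] or [0]) + inform_time[node]

    return dfs(head_id)

# head (id 2) directly manages everyone and needs 1 minute to inform them
print(num_of_minutes(6, 2, [2, 2, -1, 2, 2, 2], [0, 0, 1, 0, 0, 0]))  # 1
```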
4843b47bb1181e8cb7f920ef8b13c5e648cce3eb | 6,861 | py | Python | appengine/monorail/framework/test/filecontent_test.py | allaparthi/monorail | e18645fc1b952a5a6ff5f06e0c740d75f1904473 | [
"BSD-3-Clause"
] | 2 | 2021-04-13T21:22:18.000Z | 2021-09-07T02:11:57.000Z | appengine/monorail/framework/test/filecontent_test.py | allaparthi/monorail | e18645fc1b952a5a6ff5f06e0c740d75f1904473 | [
"BSD-3-Clause"
] | 21 | 2020-09-06T02:41:05.000Z | 2022-03-02T04:40:01.000Z | appengine/monorail/framework/test/filecontent_test.py | allaparthi/monorail | e18645fc1b952a5a6ff5f06e0c740d75f1904473 | [
"BSD-3-Clause"
] | null | null | null | # Copyright 2016 The Chromium Authors. All rights reserved.
# Use of this source code is governed by a BSD-style
# license that can be found in the LICENSE file or at
# https://developers.google.com/open-source/licenses/bsd
"""Tests for the filecontent module."""
from __future__ import print_function
from __future__ import division
from __future__ import absolute_import
import unittest
from framework import filecontent
class MimeTest(unittest.TestCase):
"""Test methods for the mime module."""
_TEST_EXTENSIONS_TO_CTYPES = {
'html': 'text/plain',
'htm': 'text/plain',
'jpg': 'image/jpeg',
'jpeg': 'image/jpeg',
'pdf': 'application/pdf',
}
_CODE_EXTENSIONS = [
'py', 'java', 'mf', 'bat', 'sh', 'php', 'vb', 'pl', 'sql',
'patch', 'diff',
]
def testCommonExtensions(self):
"""Tests some common extensions for their expected content types."""
for ext, ctype in self._TEST_EXTENSIONS_TO_CTYPES.items():
self.assertEqual(
filecontent.GuessContentTypeFromFilename('file.%s' % ext),
ctype)
def testCaseDoesNotMatter(self):
"""Ensure that case (upper/lower) of extension does not matter."""
for ext, ctype in self._TEST_EXTENSIONS_TO_CTYPES.items():
ext = ext.upper()
self.assertEqual(
filecontent.GuessContentTypeFromFilename('file.%s' % ext),
ctype)
for ext in self._CODE_EXTENSIONS:
ext = ext.upper()
self.assertEqual(
filecontent.GuessContentTypeFromFilename('code.%s' % ext),
'text/plain')
def testCodeIsText(self):
"""Ensure that code extensions are text/plain."""
for ext in self._CODE_EXTENSIONS:
self.assertEqual(
filecontent.GuessContentTypeFromFilename('code.%s' % ext),
'text/plain')
def testNoExtensionIsText(self):
"""Ensure that no extension indicates text/plain."""
self.assertEqual(
filecontent.GuessContentTypeFromFilename('noextension'),
'text/plain')
def testUnknownExtension(self):
"""Ensure that an obviously unknown extension returns is binary."""
self.assertEqual(
filecontent.GuessContentTypeFromFilename('f.madeupextension'),
'application/octet-stream')
def testNoShockwaveFlash(self):
"""Ensure that Shockwave files will NOT be served w/ that content type."""
self.assertEqual(
filecontent.GuessContentTypeFromFilename('bad.swf'),
'application/octet-stream')
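The rules exercised by `MimeTest` boil down to: code and HTML extensions are forced to `text/plain` (so uploads are never served as active content), Shockwave is never served with its real type, and unknown extensions fall back to `application/octet-stream`. A standalone sketch of such a guesser built on the stdlib `mimetypes` module (this is not Monorail's implementation):

```python
import mimetypes

_CODE_EXTS = {'py', 'java', 'mf', 'bat', 'sh', 'php', 'vb', 'pl', 'sql',
              'patch', 'diff'}
_FORCED_TEXT = {'html', 'htm'}  # never serve uploads as active HTML
_BLOCKED = {'swf'}              # never serve Shockwave with its real type

def guess_content_type(filename):
    # extension check is case-insensitive; no extension means plain text
    ext = filename.rsplit('.', 1)[-1].lower() if '.' in filename else ''
    if not ext or ext in _CODE_EXTS or ext in _FORCED_TEXT:
        return 'text/plain'
    if ext in _BLOCKED:
        return 'application/octet-stream'
    ctype, _ = mimetypes.guess_type(filename)
    return ctype or 'application/octet-stream'

print(guess_content_type('page.HTML'))   # text/plain
print(guess_content_type('photo.jpg'))   # image/jpeg
print(guess_content_type('bad.swf'))     # application/octet-stream
```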
class DecodeFileContentsTest(unittest.TestCase):
def IsBinary(self, contents):
_contents, is_binary, _is_long = (
filecontent.DecodeFileContents(contents))
return is_binary
def testFileIsBinaryEmpty(self):
self.assertFalse(self.IsBinary(''))
def testFileIsBinaryShortText(self):
self.assertFalse(self.IsBinary('This is some plain text.'))
def testLineLengthDetection(self):
unicode_str = (
u'Some non-ascii chars - '
u'\xa2\xfa\xb6\xe7\xfc\xea\xd0\xf4\xe6\xf0\xce\xf6\xbe')
short_line = unicode_str.encode('iso-8859-1')
long_line = (unicode_str * 100)[:filecontent._MAX_SOURCE_LINE_LEN_LOWER+1]
long_line = long_line.encode('iso-8859-1')
lines = [short_line] * 100
lines.append(long_line)
# High lower ratio - text
self.assertFalse(self.IsBinary('\n'.join(lines)))
lines.extend([long_line] * 99)
# 50/50 lower/upper ratio - binary
self.assertTrue(self.IsBinary('\n'.join(lines)))
# Single line too long - binary
lines = [short_line] * 100
lines.append(short_line * 100) # Very long line
self.assertTrue(self.IsBinary('\n'.join(lines)))
def testFileIsBinaryLongText(self):
self.assertFalse(self.IsBinary('This is plain text. \n' * 100))
# long utf-8 lines are OK
self.assertFalse(self.IsBinary('This one long line. ' * 100))
def testFileIsBinaryLongBinary(self):
bin_string = ''.join([chr(c) for c in range(122, 252)])
self.assertTrue(self.IsBinary(bin_string * 100))
def testFileIsTextByPath(self):
bin_string = ''.join([chr(c) for c in range(122, 252)] * 100)
unicode_str = (
u'Some non-ascii chars - '
u'\xa2\xfa\xb6\xe7\xfc\xea\xd0\xf4\xe6\xf0\xce\xf6\xbe')
long_line = (unicode_str * 100)[:filecontent._MAX_SOURCE_LINE_LEN_LOWER+1]
long_line = long_line.encode('iso-8859-1')
for contents in [bin_string, long_line]:
self.assertTrue(filecontent.DecodeFileContents(contents, path=None)[1])
self.assertTrue(filecontent.DecodeFileContents(contents, path='')[1])
self.assertTrue(filecontent.DecodeFileContents(contents, path='foo')[1])
self.assertTrue(
filecontent.DecodeFileContents(contents, path='foo.bin')[1])
self.assertTrue(
filecontent.DecodeFileContents(contents, path='foo.zzz')[1])
for path in ['a/b/Makefile.in', 'README', 'a/file.js', 'b.txt']:
self.assertFalse(
filecontent.DecodeFileContents(contents, path=path)[1])
def testFileIsBinaryByCommonExtensions(self):
contents = 'this is not examined'
self.assertTrue(filecontent.DecodeFileContents(
contents, path='junk.zip')[1])
self.assertTrue(filecontent.DecodeFileContents(
contents, path='JUNK.ZIP')[1])
self.assertTrue(filecontent.DecodeFileContents(
contents, path='/build/HelloWorld.o')[1])
self.assertTrue(filecontent.DecodeFileContents(
contents, path='/build/Hello.class')[1])
self.assertTrue(filecontent.DecodeFileContents(
contents, path='/trunk/libs.old/swing.jar')[1])
self.assertFalse(filecontent.DecodeFileContents(
contents, path='HelloWorld.cc')[1])
self.assertFalse(filecontent.DecodeFileContents(
contents, path='Hello.java')[1])
self.assertFalse(filecontent.DecodeFileContents(
contents, path='README')[1])
self.assertFalse(filecontent.DecodeFileContents(
contents, path='READ.ME')[1])
self.assertFalse(filecontent.DecodeFileContents(
contents, path='README.txt')[1])
self.assertFalse(filecontent.DecodeFileContents(
contents, path='README.TXT')[1])
self.assertFalse(filecontent.DecodeFileContents(
contents, path='/trunk/src/com/monorail/Hello.java')[1])
self.assertFalse(filecontent.DecodeFileContents(
contents, path='/branches/1.2/resource.el')[1])
self.assertFalse(filecontent.DecodeFileContents(
contents, path='/wiki/PageName.wiki')[1])
def testUnreasonablyLongFile(self):
contents = '\n' * (filecontent.SOURCE_FILE_MAX_LINES + 2)
_contents, is_binary, is_long = filecontent.DecodeFileContents(
contents)
self.assertFalse(is_binary)
self.assertTrue(is_long)
contents = '\n' * 100
_contents, is_binary, is_long = filecontent.DecodeFileContents(
contents)
self.assertFalse(is_binary)
self.assertFalse(is_long)
| 36.301587 | 78 | 0.687218 | 791 | 6,861 | 5.857143 | 0.283186 | 0.143967 | 0.183682 | 0.176991 | 0.551047 | 0.530758 | 0.483488 | 0.37384 | 0.278437 | 0.249946 | 0 | 0.019437 | 0.182626 | 6,861 | 188 | 79 | 36.494681 | 0.806705 | 0.110188 | 0 | 0.431655 | 0 | 0.014388 | 0.129624 | 0.03897 | 0 | 0 | 0 | 0 | 0.280576 | 1 | 0.107914 | false | 0 | 0.035971 | 0 | 0.179856 | 0.007194 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
484548cb633a83ccd1078f757b293e78bef29d17 | 3,797 | py | Python | landlab/components/stream_power/tests/test_voronoi_sp.py | cctrunz/landlab | 4e4ef12f4bae82bc5194f1dcc9af8ff1a7c20939 | [
"MIT"
] | null | null | null | landlab/components/stream_power/tests/test_voronoi_sp.py | cctrunz/landlab | 4e4ef12f4bae82bc5194f1dcc9af8ff1a7c20939 | [
"MIT"
] | 1 | 2016-03-16T02:34:08.000Z | 2016-04-20T19:31:30.000Z | landlab/components/stream_power/tests/test_voronoi_sp.py | cctrunz/landlab | 4e4ef12f4bae82bc5194f1dcc9af8ff1a7c20939 | [
"MIT"
] | null | null | null | """Test the Voronoi compatibility of the grid."""
import os
import numpy as np
from numpy.testing import assert_array_almost_equal
from six.moves import range
from landlab import VoronoiDelaunayGrid
from landlab.components import FlowAccumulator, StreamPowerEroder
_THIS_DIR = os.path.abspath(os.path.dirname(__file__))
def test_sp_voronoi():
nnodes = 100
np.random.seed(0)
x = np.random.rand(nnodes)
np.random.seed(1)
y = np.random.rand(nnodes)
mg = VoronoiDelaunayGrid(x, y)
np.random.seed(2)
z = mg.add_field(
"node", "topographic__elevation", np.random.rand(nnodes) / 10000.0, copy=False
)
fr = FlowAccumulator(mg)
spe = StreamPowerEroder(mg, os.path.join(_THIS_DIR, "drive_sp_params_voronoi.txt"))
for i in range(10):
z[mg.core_nodes] += 0.01
fr.run_one_step()
spe.erode(mg, 1.0)
z_tg = np.array(
[
4.35994902e-05,
2.59262318e-06,
5.49662478e-05,
6.56738615e-03,
4.20367802e-05,
1.21371424e-02,
2.16596169e-02,
4.73320898e-02,
6.00389761e-02,
5.22007356e-02,
5.37507115e-02,
5.95794752e-02,
5.29862904e-02,
6.76465914e-02,
7.31720024e-02,
6.18730861e-02,
8.53975293e-05,
5.32189275e-02,
7.34302556e-02,
8.07385044e-02,
5.05246090e-05,
4.08940657e-02,
7.39971005e-02,
3.31915602e-02,
6.72650419e-02,
5.96745309e-05,
4.72752445e-02,
3.60359567e-02,
7.59432065e-02,
7.24461985e-02,
7.80305760e-02,
4.93866869e-02,
8.69642467e-02,
7.21627626e-02,
8.96368291e-02,
4.65142080e-02,
6.07720217e-02,
8.83372939e-02,
2.35887558e-02,
7.97616193e-02,
8.35615355e-02,
4.61809032e-02,
6.34634214e-02,
9.25711770e-02,
4.11717225e-03,
7.24493623e-02,
7.97908053e-02,
9.10375623e-02,
9.13155023e-02,
7.10567915e-02,
7.35271752e-02,
6.13091341e-02,
9.45498463e-02,
8.48532386e-02,
8.82702021e-02,
7.14969941e-02,
2.22640943e-02,
8.53311932e-02,
7.49161159e-02,
3.48837223e-02,
9.30132692e-02,
6.01817121e-05,
3.87455443e-02,
8.44673586e-02,
9.35213577e-02,
6.76075824e-02,
1.58614508e-02,
8.51346837e-02,
8.83645680e-02,
8.69944117e-02,
5.04000439e-05,
5.02319084e-02,
8.63882765e-02,
5.00991880e-02,
7.65156630e-02,
5.07591983e-02,
6.54909962e-02,
6.91505342e-02,
7.33358371e-02,
5.30109890e-02,
2.99074601e-02,
2.55509418e-06,
8.21523907e-02,
8.09368483e-02,
4.35073025e-02,
3.04096109e-02,
3.26298627e-02,
4.92259177e-02,
5.48690358e-02,
6.44732130e-02,
6.28133567e-02,
4.17977098e-06,
5.37149677e-02,
4.32828136e-02,
1.30559903e-02,
2.62405261e-02,
2.86079272e-02,
6.61481327e-05,
1.70477133e-05,
8.81652236e-05,
]
)
assert_array_almost_equal(mg.at_node["topographic__elevation"], z_tg)
| 26.739437 | 87 | 0.494338 | 443 | 3,797 | 4.1693 | 0.413093 | 0.024364 | 0.019491 | 0.029237 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.488433 | 0.396629 | 3,797 | 141 | 88 | 26.929078 | 0.317765 | 0.011325 | 0 | 0 | 0 | 0 | 0.020011 | 0.018943 | 0 | 0 | 0 | 0 | 0.015504 | 1 | 0.007752 | false | 0 | 0.046512 | 0 | 0.054264 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
48462e4e9a8dafb46c0eaf5a7da5f59337dcbd7b | 258 | py | Python | mailing/urls.py | HerbyDE/jagdreisencheck-webapp | 9af5deda2423b787da88a0c893f3c474d8e4f73f | [
"BSD-3-Clause"
] | null | null | null | mailing/urls.py | HerbyDE/jagdreisencheck-webapp | 9af5deda2423b787da88a0c893f3c474d8e4f73f | [
"BSD-3-Clause"
] | null | null | null | mailing/urls.py | HerbyDE/jagdreisencheck-webapp | 9af5deda2423b787da88a0c893f3c474d8e4f73f | [
"BSD-3-Clause"
] | null | null | null | from django.conf.urls import re_path
from mailing.views import unsubscribe, render_mail
app_name = 'mailing'
urlpatterns = [
re_path(r'^unsubscribe/$', unsubscribe, name='unsubscribe'),
re_path(r'^render/mail/$', render_mail, name='render_mail'),
] | 28.666667 | 64 | 0.736434 | 35 | 258 | 5.228571 | 0.457143 | 0.218579 | 0.076503 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.120155 | 258 | 9 | 65 | 28.666667 | 0.806167 | 0 | 0 | 0 | 0 | 0 | 0.220077 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.285714 | 0 | 0.285714 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
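Each `re_path` pattern is a raw regex matched against the URL path (Django strips the leading slash before matching). The anchors matter: `^unsubscribe/$` matches exactly `unsubscribe/` and nothing longer, as plain `re` shows:

```python
import re

pattern = re.compile(r'^unsubscribe/$')
print(bool(pattern.match('unsubscribe/')))    # True
print(bool(pattern.match('unsubscribe')))     # False: trailing slash required
print(bool(pattern.match('x/unsubscribe/')))  # False: anchored at the start
```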
4860c1702ac499453acfb0b466d929c74e794d7c | 5,115 | py | Python | Data/PVdata/add_generated_perdate.py | Thomasvdw/ProgProject | bbaddbf4c1b066a65a2ef182611210f6d2d3d61d | [
"MIT"
] | null | null | null | Data/PVdata/add_generated_perdate.py | Thomasvdw/ProgProject | bbaddbf4c1b066a65a2ef182611210f6d2d3d61d | [
"MIT"
] | 3 | 2015-06-03T09:33:08.000Z | 2015-06-26T08:16:59.000Z | Data/PVdata/add_generated_perdate.py | Thomasvdw/ProgProject | bbaddbf4c1b066a65a2ef182611210f6d2d3d61d | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
"""
Created on Wed May 27 20:06:01 2015
@author: Thomas
"""
# Python standard library imports
import csv
import os
# year buckets, newest first; each bucket is cumulative up to that year
YEAR_BUCKETS = [str(year) for year in range(2015, 1999, -1)]


def main():
    generated = []
    # list the directory once so the read and write loops stay aligned
    files = os.listdir("reformatted/")
    for file in files:
        print('getting data from.. ' + file)
        # the "2015" bucket counts every installation up to and including
        # 2015; earlier buckets count installations strictly before that year
        totals = dict.fromkeys(YEAR_BUCKETS, 0.0)
        grand_total = 0.0
        name = "reformatted/" + file
        with open(name, 'r') as csvfile:
            reader = csv.reader(csvfile, delimiter=",")
            next(csvfile)  # skip the header row
            for row in reader:
                size = float(row[2].replace(';', '').replace('.', ''))
                date = str(row[4])[-4:]
                # assumes the same ';' decimal separator as the kWh column
                cost_size = float(row[10].replace(';', '.'))
                generate = float(row[11].replace(';', '.')) / 1000
                if cost_size > 1000 and date > "1995" and size < 1000:
                    grand_total += generate
                    for year in YEAR_BUCKETS:
                        in_bucket = date <= year if year == "2015" else date < year
                        if in_bucket:
                            totals[year] += generate
        all_generated = [int(totals[year]) for year in YEAR_BUCKETS]
        all_generated.append(int(grand_total))
        generated.append(all_generated)

    dates = ['1/1/' + year for year in YEAR_BUCKETS] + ["total"]

    for x, file in enumerate(files):
        name = "annual_generated_kwh_growth/" + "generated_" + file
        with open(name, 'w', newline='') as f:
            writer = csv.writer(f)
            writer.writerow(['Date', 'Annual generated'])
            for i in range(len(dates)):
                writer.writerow([dates[i], generated[x][i]])


if __name__ == '__main__':
    main()
| 36.535714 | 100 | 0.470381 | 457 | 5,115 | 5.118162 | 0.242888 | 0.101753 | 0.102608 | 0.12826 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.175534 | 0.423069 | 5,115 | 140 | 101 | 36.535714 | 0.617079 | 0.010362 | 0 | 0 | 1 | 0 | 0.0662 | 0.0056 | 0.009009 | 0 | 0 | 0 | 0 | 0 | null | null | 0.009009 | 0.018018 | null | null | 0.009009 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
48612b65d41d008c8b8c4376ab0c023aff7a09f8 | 276 | py | Python | jsonclasses_server/excs.py | zhichao-github/jsonclasses-server | 142b792dee60735703986b05cb9ded1b4cab13f0 | [
"MIT"
] | 2 | 2021-11-02T02:54:06.000Z | 2021-12-02T10:38:25.000Z | jsonclasses_server/excs.py | zhichao-github/jsonclasses-server | 142b792dee60735703986b05cb9ded1b4cab13f0 | [
"MIT"
] | 1 | 2021-12-15T13:59:14.000Z | 2021-12-15T13:59:14.000Z | jsonclasses_server/excs.py | zhichao-github/jsonclasses-server | 142b792dee60735703986b05cb9ded1b4cab13f0 | [
"MIT"
] | 3 | 2021-12-07T02:38:26.000Z | 2021-12-28T06:18:29.000Z | from __future__ import annotations
class AuthenticationException(Exception):
"""Authentication exception is throwed when user is not authorized.
"""
def __init__(self, message: str) -> None:
self.message = message
super().__init__(self.message)
| 25.090909 | 71 | 0.702899 | 29 | 276 | 6.275862 | 0.724138 | 0.181319 | 0.164835 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.206522 | 276 | 10 | 72 | 27.6 | 0.83105 | 0.231884 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.2 | 0 | 0.6 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
48647249c10f15b223539931a599f3d45eda09a0 | 2,336 | py | Python | src/libcore/tests/test_frame.py | tizian/layer-laboratory | 008cc94b76127e9eb74227fcd3d0145da8ddec30 | [
"CNRI-Python"
] | 7 | 2020-07-24T03:19:59.000Z | 2022-03-30T10:56:12.000Z | src/libcore/tests/test_frame.py | tizian/layer-laboratory | 008cc94b76127e9eb74227fcd3d0145da8ddec30 | [
"CNRI-Python"
] | 1 | 2021-04-07T22:30:23.000Z | 2021-04-08T00:55:36.000Z | src/libcore/tests/test_frame.py | tizian/layer-laboratory | 008cc94b76127e9eb74227fcd3d0145da8ddec30 | [
"CNRI-Python"
] | 2 | 2020-06-08T08:25:09.000Z | 2021-04-05T22:13:08.000Z | import enoki as ek
import pytest
import mitsuba
def test01_construction(variant_scalar_rgb):
from mitsuba.core import Frame3f
# Uninitialized frame
_ = Frame3f()
# Frame3f from the 3 vectors: no normalization should be performed
f1 = Frame3f([0.005, 50, -6], [0.01, -13.37, 1], [0.5, 0, -6.2])
assert ek.allclose(f1.s, [0.005, 50, -6])
assert ek.allclose(f1.t, [0.01, -13.37, 1])
assert ek.allclose(f1.n, [0.5, 0, -6.2])
# Frame3f from the Normal component only
f2 = Frame3f([0, 0, 1])
assert ek.allclose(f2.s, [1, 0, 0])
assert ek.allclose(f2.t, [0, 1, 0])
assert ek.allclose(f2.n, [0, 0, 1])
# Copy constructor
f3 = Frame3f(f2)
assert f2 == f3
def test02_unit_frame(variant_scalar_rgb):
from mitsuba.core import Frame3f, Vector2f, Vector3f
for theta in [30 * mitsuba.core.math.Pi / 180, 95 * mitsuba.core.math.Pi / 180]:
phi = 73 * mitsuba.core.math.Pi / 180
sin_theta, cos_theta = ek.sin(theta), ek.cos(theta)
sin_phi, cos_phi = ek.sin(phi), ek.cos(phi)
v = Vector3f(
cos_phi * sin_theta,
sin_phi * sin_theta,
cos_theta
)
f = Frame3f(Vector3f(1.0, 2.0, 3.0) / ek.sqrt(14))
v2 = f.to_local(v)
v3 = f.to_world(v2)
assert ek.allclose(v3, v)
assert ek.allclose(Frame3f.cos_theta(v), cos_theta)
assert ek.allclose(Frame3f.sin_theta(v), sin_theta)
assert ek.allclose(Frame3f.cos_phi(v), cos_phi)
assert ek.allclose(Frame3f.sin_phi(v), sin_phi)
assert ek.allclose(Frame3f.cos_theta_2(v), cos_theta * cos_theta)
assert ek.allclose(Frame3f.sin_theta_2(v), sin_theta * sin_theta)
assert ek.allclose(Frame3f.cos_phi_2(v), cos_phi * cos_phi)
assert ek.allclose(Frame3f.sin_phi_2(v), sin_phi * sin_phi)
assert ek.allclose(Vector2f(Frame3f.sincos_phi(v)), [sin_phi, cos_phi])
assert ek.allclose(Vector2f(Frame3f.sincos_phi_2(v)), [sin_phi * sin_phi, cos_phi * cos_phi])
def test03_frame_equality(variant_scalar_rgb):
from mitsuba.core import Frame3f
f1 = Frame3f([1, 0, 0], [0, 1, 0], [0, 0, 1])
f2 = Frame3f([0, 0, 1])
f3 = Frame3f([0, 0, 1], [0, 1, 0], [1, 0, 0])
assert f1 == f2
assert f2 == f1
assert not f1 == f3
assert not f2 == f3
| 32.444444 | 101 | 0.619007 | 381 | 2,336 | 3.648294 | 0.2021 | 0.097842 | 0.195683 | 0.132374 | 0.507914 | 0.385612 | 0.334532 | 0.316547 | 0 | 0 | 0 | 0.092499 | 0.24101 | 2,336 | 71 | 102 | 32.901408 | 0.691483 | 0.059932 | 0 | 0.08 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.44 | 1 | 0.06 | false | 0 | 0.12 | 0 | 0.18 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
486edd47987229e67979fac1c61a3dbe5ef108af | 1,868 | py | Python | classification/smoke_tests.py | skorokithakis/Spamnesty | 65d27ca1b906ef429c997f148902f30e8609845b | [
"MIT"
] | 34 | 2016-11-23T13:30:39.000Z | 2021-12-08T15:23:13.000Z | classification/smoke_tests.py | skorokithakis/Spamnesty | 65d27ca1b906ef429c997f148902f30e8609845b | [
"MIT"
] | 9 | 2020-03-24T16:21:14.000Z | 2021-06-20T10:37:53.000Z | classification/smoke_tests.py | skorokithakis/Spamnesty | 65d27ca1b906ef429c997f148902f30e8609845b | [
"MIT"
] | 5 | 2018-03-11T18:54:34.000Z | 2020-02-07T09:46:35.000Z | from django.contrib.auth.models import User
from django.test import TestCase
from django.urls import reverse
from main.models import Conversation
from main.models import Domain
class SmokeTests(TestCase):
def setUp(self):
self.user = User.objects.create_user("user", "user@user.com", "password")
self.user.is_staff = True
self.user.save()
def test_urls(self):
response = self.client.get(reverse("classification:classify"))
self.assertEqual(response.status_code, 302)
self.assertTrue("login" in response.url)
self.client.login(username="user", password="password")
response = self.client.get(reverse("classification:classify"))
self.assertEqual(response.status_code, 200)
self.client.logout()
response = self.client.post(reverse("classification:delete"))
self.assertEqual(response.status_code, 302)
self.assertTrue("login" in response.url)
class DeleteTests(TestCase):
def setUp(self):
Domain.objects.create(name="example.com", company_name="Company")
self.user = User.objects.create_user("user", "user@user.com", "password")
self.user.is_staff = True
self.user.save()
def test_delete(self):
conversation = Conversation.objects.create()
self.client.login(username="user", password="password")
response = self.client.post(
reverse("classification:delete"), data={"conversation_id": conversation.id}
)
self.assertEqual(response.status_code, 200)
self.assertRaises(
Conversation.DoesNotExist, Conversation.objects.get, id=conversation.id
)
response = self.client.post(
reverse("classification:delete"), data={"conversation_id": conversation.id}
)
self.assertEqual(response.status_code, 404)
| 33.963636 | 87 | 0.672377 | 212 | 1,868 | 5.858491 | 0.254717 | 0.05153 | 0.072464 | 0.116747 | 0.665056 | 0.665056 | 0.665056 | 0.614332 | 0.614332 | 0.614332 | 0 | 0.010115 | 0.206103 | 1,868 | 54 | 88 | 34.592593 | 0.827377 | 0 | 0 | 0.536585 | 0 | 0 | 0.129015 | 0.058351 | 0 | 0 | 0 | 0 | 0.195122 | 1 | 0.097561 | false | 0.097561 | 0.121951 | 0 | 0.268293 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
488007ba68f74e56f4e9b6c17aa2807c2f8a69a7 | 4,904 | py | Python | joerd/source/gmted.py | Hivemapper/HM-colony-joerd | 0b86765156d0612d837548c2cf70376c43b3405c | [
"MIT"
] | 207 | 2016-07-21T07:55:31.000Z | 2022-03-21T20:59:02.000Z | joerd/source/gmted.py | Hivemapper/HM-colony-joerd | 0b86765156d0612d837548c2cf70376c43b3405c | [
"MIT"
] | 126 | 2016-07-22T15:58:32.000Z | 2022-03-19T09:52:30.000Z | joerd/source/gmted.py | Hivemapper/HM-colony-joerd | 0b86765156d0612d837548c2cf70376c43b3405c | [
"MIT"
] | 47 | 2016-10-06T17:37:52.000Z | 2022-02-28T19:26:22.000Z | from joerd.util import BoundingBox
import joerd.download as download
import joerd.check as check
import joerd.srs as srs
import joerd.mask as mask
from joerd.mkdir_p import mkdir_p
from shutil import copyfileobj
import os.path
import os
import requests
import logging
import re
import tempfile
import sys
import traceback
import subprocess
import glob
from osgeo import gdal
class GMTEDTile(object):
def __init__(self, parent, x, y):
self.url = parent.url
self.download_options = parent.download_options
self.base_dir = parent.base_dir
self.x = x
self.y = y
def __key(self):
return (self.x, self.y)
def __eq__(a, b):
return isinstance(b, type(a)) and \
a.__key() == b.__key()
def __hash__(self):
return hash(self.__key())
def _res(self):
return '300' if self.y == -90 else '075'
def _file_name(self):
res = self._res()
xname = "%03d%s" % (abs(self.x), "E" if self.x >= 0 else "W")
yname = "%02d%s" % (abs(self.y), "N" if self.y >= 0 else "S")
return "%(y)s%(x)s_20101117_gmted_mea%(res)s.tif" % \
dict(res=res, x=xname, y=yname)
def urls(self):
dir = "%s%03d" % ("E" if self.x >= 0 else "W", abs(self.x))
res = self._res()
dname = "/%(res)sdarcsec/mea/%(dir)s/" % dict(res=res, dir=dir)
return [self.url + dname + self._file_name()]
def verifier(self):
return check.is_gdal
def options(self):
return self.download_options
def output_file(self):
fname = self._file_name()
return os.path.join(self.base_dir, fname)
def unpack(self, store, tmp):
with store.upload_dir() as target:
mkdir_p(os.path.join(target, self.base_dir))
output_file = os.path.join(target, self.output_file())
mask.negative(tmp.name, "GTiff", output_file)
def freeze_dry(self):
return dict(type='gmted', x=self.x, y=self.y)
class GMTED(object):
def __init__(self, options={}):
self.num_download_threads = options.get('num_download_threads')
self.base_dir = options.get('base_dir', 'gmted')
self.url = options['url']
self.xs = options['xs']
self.ys = options['ys']
self.download_options = options
def get_index(self):
# GMTED is a static set of files - there's no need for an index, but we
# do need a directory to store stuff in.
if not os.path.isdir(self.base_dir):
os.makedirs(self.base_dir)
def existing_files(self):
for base, dirs, files in os.walk(self.base_dir):
for f in files:
if f.endswith('tif'):
yield os.path.join(base, f)
def rehydrate(self, data):
assert data.get('type') == 'gmted', \
"Unable to rehydrate %r from GMTED." % data
return GMTEDTile(self, data['x'], data['y'])
def downloads_for(self, tile):
tiles = set()
# if the tile scale is greater than 20x the GMTED scale, then there's no
# point in including GMTED, it'll be far too fine to make a difference.
# GMTED is 7.5 arc seconds at best (30 at the poles).
if tile.max_resolution() > 20 * 7.5 / 3600:
return tiles
# buffer by 0.1 degrees (48px) to grab neighbouring tiles to ensure
# that there's no tile edge artefacts.
tile_bbox = tile.latlon_bbox().buffer(0.1)
for y in self.ys:
for x in self.xs:
bbox = BoundingBox(x, y, x + 30, y + 20)
if tile_bbox.intersects(bbox):
tiles.add(GMTEDTile(self, x, y))
return tiles
def vrts_for(self, tile):
"""
Returns a list of sets of tiles, with each list element intended as a
separate VRT for use in GDAL.
The reason for this is that GDAL doesn't do any compositing _within_
a single VRT, so if there are multiple overlapping source rasters in
the VRT, only one will be chosen. This isn't often the case - most
raster datasets are non-overlapping apart from deliberately duplicated
margins.
"""
return [self.downloads_for(tile)]
def srs(self):
return srs.wgs84()
def filter_type(self, src_res, dst_res):
# seems like GRA_Lanczos has trouble with nodata, which is causing
# "ringing" near the edges of the data.
return gdal.GRA_Bilinear if src_res > dst_res else gdal.GRA_Cubic
def _parse_bbox(self, ns_deg, is_ns, ew_deg, is_ew, res):
bottom = int(ns_deg)
left = int(ew_deg)
if is_ns == 'S':
bottom = -bottom
if is_ew == 'W':
left = -left
b = BoundingBox(left, bottom, left + 30, bottom + 20)
return b
def create(options):
return GMTED(options)
| 31.037975 | 80 | 0.599715 | 721 | 4,904 | 3.951456 | 0.327323 | 0.022113 | 0.027027 | 0.011934 | 0.023868 | 0.009828 | 0.009828 | 0 | 0 | 0 | 0 | 0.015832 | 0.291599 | 4,904 | 157 | 81 | 31.235669 | 0.80426 | 0.1823 | 0 | 0.037383 | 0 | 0 | 0.050382 | 0.017303 | 0 | 0 | 0 | 0 | 0.009346 | 1 | 0.205607 | false | 0 | 0.168224 | 0.093458 | 0.560748 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
6f7f447b1fa0fc9422454b244b15beea079bfb03 | 301 | py | Python | api/exceptions.py | SirSkaro/smogon.py | 4249e67628f9e024890fbc49c33fee833de41cdc | [
"MIT"
] | null | null | null | api/exceptions.py | SirSkaro/smogon.py | 4249e67628f9e024890fbc49c33fee833de41cdc | [
"MIT"
] | 1 | 2018-08-26T05:07:26.000Z | 2018-08-26T05:07:26.000Z | api/exceptions.py | SirSkaro/smogon.py | 4249e67628f9e024890fbc49c33fee833de41cdc | [
"MIT"
] | 1 | 2021-04-08T02:45:03.000Z | 2021-04-08T02:45:03.000Z | class Error(Exception):
"""Base class for other exceptions"""
pass
class InvalidSpecies(Error):
"""Species out of bounds of legitimate species"""
pass
class InvalidForm(Error):
"""Form is invalid"""
pass
class APIError(Error):
"""Something wrong with the API"""
pass | 20.066667 | 53 | 0.66113 | 36 | 301 | 5.527778 | 0.666667 | 0.135678 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.219269 | 301 | 15 | 54 | 20.066667 | 0.846809 | 0.398671 | 0 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.5 | 0 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
6f8f1f4dd830e709ad10d8587278084aca74c16a | 4,011 | py | Python | examples/jwt-client/cloudant-client.py | stefanberger/trusted-service-identity | 06027e3deee21328cfbc37784530a9dfe39c8889 | [
"Apache-2.0"
] | null | null | null | examples/jwt-client/cloudant-client.py | stefanberger/trusted-service-identity | 06027e3deee21328cfbc37784530a9dfe39c8889 | [
"Apache-2.0"
] | null | null | null | examples/jwt-client/cloudant-client.py | stefanberger/trusted-service-identity | 06027e3deee21328cfbc37784530a9dfe39c8889 | [
"Apache-2.0"
] | null | null | null | # This is just a very simple client to generate HTML file just as proof of concept,
# to demonstrate Trusted Identity use case.
#
# This should not be used in production environment.
import os
import json
# It is helpful to have access to tools
# for formatting date and time values.
from time import gmtime, strftime
from cloudant.client import Cloudant
from cloudant.error import CloudantException
from cloudant.result import Result, ResultByKey
# Functions:
def process_database(databaseName, target):
myDatabase = client.create_database(databaseName)
if myDatabase.exists():
print("'{0}' database exists.\n".format(databaseName))
result_collection = Result(myDatabase.all_docs, include_docs=True)
for a in result_collection:
print "Retrieved full document:\n{0}\n".format(a)
print "**:\n{0}\n".format(a["doc"]["lastName"])
doc = a["doc"]
"{name} and {phone} and {ssn}".format(name="brandon", ssn="123-456-78", phone="8888888")
line = "\t{lastName} {firstName} \tSSN:{ssn} phone:{phone} rating:{rating}".format(lastName=doc["lastName"], firstName=doc["firstName"],
ssn=doc["ssn"], phone=doc["phone"], rating=doc["rating"])
target.write(line)
#target.write("\t%s %s \tSSN:%s phone:%s rating:%s" % (doc["lastName"], doc["firstName"],
#doc["ssn"], doc["phone"], doc["rating"]))
target.write("\n")
claimsfilename = "all-claims"
claims = open(claimsfilename, 'r')
# Change current directory to avoid exposure of control files
try:
os.mkdir('static')
except OSError:
# The directory already exists,
# no need to create it.
pass
os.chdir('static')
# Begin creating a very simple web page.
filename = "index.html.new"
target = open(filename, 'w')
target.truncate()
target.write("<html><head><title>Trusted Identity Demo</title><meta http-equiv=\"refresh\" content=\"5\" /></head>\n")
target.write("<body><p>Executing access to Cloudant tables...</p><pre>")
# Put a clear indication of the current date and time at the top of the page.
target.write("====\n")
target.write(strftime("%Y-%m-%d %H:%M:%S", gmtime()))
target.write("\n====\n\n")
target.write("</pre><h3>Container Identity</h3><p>\n")
target.write(claims.read())
target.write("</p>\n")
# Start working with the IBM Cloudant service instance.
# IBM Cloudant Legacy authentication
# client = Cloudant("<username>", "<password>", url="<url>")
myurl=os.environ["TARGET_URL"]
username=os.environ["USERNAME"]
API_KEY=os.environ["API_KEY"]
try:
client = Cloudant(username,API_KEY, url=myurl)
client.connect()
except:
print "Error, no matching policy for this identity"
target.write("<pre>\n====\n\n")
target.write("\tNO MATCHING POLICIES FOR THIS IDENTITY!!\n\n")
target.write("</pre></p>")
target.write("<h3>US data results</h3><p><pre>\n")
try:
databaseName = "ti-users-us"
process_database(databaseName, target)
except:
print "Error, no full access to the US DB, trying limited access..."
try:
databaseName = "ti-users-us-limit"
process_database(databaseName, target)
except:
print "Error, no access to the DB"
target.write("\tNO DATABASE ACCESS!!\n")
target.write("\n")
# Put another clear indication of the current date and time at the bottom of the page.
target.write("\n====\n")
target.write("</pre></p><h3>EU data results</h3><p><pre>")
try:
databaseName = "ti-users-eu"
process_database(databaseName, target)
except:
print "Error, no full access to the EU DB, trying limited access..."
try:
databaseName = "ti-users-eu-limit"
process_database(databaseName, target)
except:
print "Error, no access to the DB"
target.write("\tNO DATABASE ACCESS!!\n")
target.write("\n")
target.write("\n====\n")
target.write(strftime("%Y-%m-%d %H:%M:%S", gmtime()))
target.write("\n====\n")
# Finish creating the web page.
target.write("</pre></body></html>")
target.close()
client.disconnect()
| 33.991525 | 144 | 0.66891 | 565 | 4,011 | 4.723894 | 0.318584 | 0.103035 | 0.049457 | 0.061821 | 0.312851 | 0.262271 | 0.230049 | 0.230049 | 0.197827 | 0.197827 | 0 | 0.007458 | 0.164298 | 4,011 | 117 | 145 | 34.282051 | 0.788783 | 0.219147 | 0 | 0.329268 | 1 | 0.012195 | 0.335797 | 0.015424 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.012195 | 0.073171 | null | null | 0.097561 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
6f8fd8f4761804f66250d53a361c03286e7915fe | 3,610 | py | Python | tests/test_expectation.py | sneak-it/cachet-url-monitor | 5cfef6392ef84ad633c19c815e40f0a3d181a2d3 | [
"MIT"
] | null | null | null | tests/test_expectation.py | sneak-it/cachet-url-monitor | 5cfef6392ef84ad633c19c815e40f0a3d181a2d3 | [
"MIT"
] | null | null | null | tests/test_expectation.py | sneak-it/cachet-url-monitor | 5cfef6392ef84ad633c19c815e40f0a3d181a2d3 | [
"MIT"
] | 1 | 2019-10-20T13:03:44.000Z | 2019-10-20T13:03:44.000Z | #!/usr/bin/env python
import re
import unittest
import mock
import pytest
from cachet_url_monitor.configuration import HttpStatus, Regex
from cachet_url_monitor.configuration import Latency
class LatencyTest(unittest.TestCase):
def setUp(self):
self.expectation = Latency({'type': 'LATENCY', 'threshold': 1})
def test_init(self):
assert self.expectation.threshold == 1
def test_get_status_healthy(self):
def total_seconds():
return 0.1
request = mock.Mock()
elapsed = mock.Mock()
request.elapsed = elapsed
elapsed.total_seconds = total_seconds
assert self.expectation.get_status(request) == 1
def test_get_status_unhealthy(self):
def total_seconds():
return 2
request = mock.Mock()
elapsed = mock.Mock()
request.elapsed = elapsed
elapsed.total_seconds = total_seconds
assert self.expectation.get_status(request) == 2
def test_get_message(self):
def total_seconds():
return 0.1
request = mock.Mock()
elapsed = mock.Mock()
request.elapsed = elapsed
elapsed.total_seconds = total_seconds
assert self.expectation.get_message(request) == ('Latency above '
'threshold: 0.1000 seconds')
class HttpStatusTest(unittest.TestCase):
def setUp(self):
self.expectation = HttpStatus({'type': 'HTTP_STATUS', 'status_range': "200-300"})
def test_init(self):
assert self.expectation.status_range == (200, 300)
def test_init_with_one_status(self):
"""With only one value, we still expect a valid tuple"""
self.expectation = HttpStatus({'type': 'HTTP_STATUS', 'status_range': "200"})
assert self.expectation.status_range == (200, 201)
def test_init_with_invalid_number(self):
"""Invalid values should just fail with a ValueError, as we can't convert it to int."""
with pytest.raises(ValueError):
self.expectation = HttpStatus({'type': 'HTTP_STATUS', 'status_range': "foo"})
def test_get_status_healthy(self):
request = mock.Mock()
request.status_code = 200
assert self.expectation.get_status(request) == 1
def test_get_status_unhealthy(self):
request = mock.Mock()
request.status_code = 400
assert self.expectation.get_status(request) == 3
def test_get_message(self):
request = mock.Mock()
request.status_code = 400
assert self.expectation.get_message(request) == ('Unexpected HTTP '
'status (400)')
class RegexTest(unittest.TestCase):
def setUp(self):
self.expectation = Regex({'type': 'REGEX', 'regex': '.*(find stuff).*'})
def test_init(self):
assert self.expectation.regex == re.compile('.*(find stuff).*', re.UNICODE + re.DOTALL)
def test_get_status_healthy(self):
request = mock.Mock()
request.text = 'We could find stuff\n in this body.'
assert self.expectation.get_status(request) == 1
def test_get_status_unhealthy(self):
request = mock.Mock()
request.text = 'We will not find it here'
assert self.expectation.get_status(request) == 3
def test_get_message(self):
request = mock.Mock()
request.text = 'We will not find it here'
assert self.expectation.get_message(request) == ('Regex did not match '
'anything in the body')
| 30.854701 | 95 | 0.619114 | 422 | 3,610 | 5.135071 | 0.234597 | 0.124596 | 0.125981 | 0.099677 | 0.751269 | 0.733272 | 0.651131 | 0.526996 | 0.503922 | 0.455007 | 0 | 0.019466 | 0.274238 | 3,610 | 116 | 96 | 31.12069 | 0.807634 | 0.042382 | 0 | 0.602564 | 0 | 0 | 0.101567 | 0 | 0 | 0 | 0 | 0 | 0.166667 | 1 | 0.25641 | false | 0 | 0.076923 | 0.038462 | 0.410256 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
6f94cfaa5e6fee3e97088acc35c00e1a468b30ce | 253 | py | Python | src/factiva/core/__init__.py | wizeline/factiva-core-python | 818c7e02d8da9aec96e2280f8a45256dbebc5829 | [
"MIT"
] | 1 | 2021-06-01T17:21:46.000Z | 2021-06-01T17:21:46.000Z | src/factiva/core/__init__.py | wizeline/factiva-core-python | 818c7e02d8da9aec96e2280f8a45256dbebc5829 | [
"MIT"
] | null | null | null | src/factiva/core/__init__.py | wizeline/factiva-core-python | 818c7e02d8da9aec96e2280f8a45256dbebc5829 | [
"MIT"
] | null | null | null | """Implement core capabilities for classes."""
__all__ = ['const', 'dicts']
from factiva.core.apikeyuser import (
APIKeyUser
)
from factiva.core.streamuser import (
StreamUser
)
from factiva.core.stream_response import (
StreamResponse
)
| 16.866667 | 46 | 0.727273 | 27 | 253 | 6.62963 | 0.592593 | 0.184358 | 0.251397 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.166008 | 253 | 14 | 47 | 18.071429 | 0.848341 | 0.158103 | 0 | 0 | 0 | 0 | 0.048309 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.3 | 0 | 0.3 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
6f95281f4c73ca03c35f7e8abe5af3441af44a6a | 358 | py | Python | Dataset/Leetcode/train/11/72.py | kkcookies99/UAST | fff81885aa07901786141a71e5600a08d7cb4868 | [
"MIT"
] | null | null | null | Dataset/Leetcode/train/11/72.py | kkcookies99/UAST | fff81885aa07901786141a71e5600a08d7cb4868 | [
"MIT"
] | null | null | null | Dataset/Leetcode/train/11/72.py | kkcookies99/UAST | fff81885aa07901786141a71e5600a08d7cb4868 | [
"MIT"
] | null | null | null | from typing import List
class Solution:
def XXX(self, height: List[int]) -> int:
left = 0
right = len(height)-1
temp = 0
while left<right:
temp = max(temp,min(height[left],height[right])*(right-left))
if height[left] < height[right]:
left+=1
else:
right-=1
return temp
| 25.571429 | 73 | 0.47486 | 42 | 358 | 4.047619 | 0.47619 | 0.117647 | 0.188235 | 0.247059 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.023364 | 0.402235 | 358 | 13 | 74 | 27.538462 | 0.771028 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.083333 | false | 0 | 0 | 0 | 0.25 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
6f998c313527db2ae43adbb452251f1e43923246 | 3,807 | py | Python | geonames/models.py | davidegalletti/django_geonames_cities | 4bdd5ac5680aa816f85e5e35941f97498064788a | [
"BSD-3-Clause"
] | null | null | null | geonames/models.py | davidegalletti/django_geonames_cities | 4bdd5ac5680aa816f85e5e35941f97498064788a | [
"BSD-3-Clause"
] | null | null | null | geonames/models.py | davidegalletti/django_geonames_cities | 4bdd5ac5680aa816f85e5e35941f97498064788a | [
"BSD-3-Clause"
] | null | null | null | from django.db import models
class Country(models.Model):
name = models.CharField(max_length=200, db_index=True)
code = models.CharField(max_length=20, db_index=True)
# municipality_levels is a blank-separated list of strings with
# possible values GeonamesAdm1-GeonamesAdm5 and PopulatedPlace,
# and tells which models hold the municipalities for this country.
# For Italy the town of Pisa (like all other municipalities) is
# Adm3, so municipality_levels for Italy has to be GeonamesAdm3.
# GB likely has cities at two levels, 2 and 3, so we use a string like "GeonamesAdm2 GeonamesAdm3",
# or at levels 3 and PPL, hence "GeonamesAdm3 PopulatedPlace"
municipality_levels = models.CharField(max_length=200, default='')
# data_loaded is True if we have loaded from geonames data for this country
data_loaded = models.BooleanField(default=False, db_index=True)
# Foreign countries have a code used to calculate Italian Codice Fiscale
it_codice_catastale = models.CharField(max_length=4, blank=True, null=True)
def __str__(self):
return self.name
class GeonamesAdm(models.Model):
name = models.CharField(max_length=200, db_index=True)
alternate_names = models.CharField(max_length=2000, default='', db_index=True)
suppressed = models.BooleanField(default=True)
class Meta:
abstract = True
class GeonamesAdm1(GeonamesAdm):
code = models.CharField(max_length=20, db_index=True)
feature_code = models.CharField(max_length=20)
country = models.ForeignKey(Country, on_delete=models.CASCADE)
def __str__(self):
return self.name
class GeonamesAdm2(GeonamesAdm):
code = models.CharField(max_length=20, db_index=True)
feature_code = models.CharField(max_length=20)
adm1 = models.ForeignKey(GeonamesAdm1, on_delete=models.CASCADE)
@property
def country(self):
return self.adm1.country
def __str__(self):
return self.name
class GeonamesAdm3(GeonamesAdm):
code = models.CharField(max_length=20, db_index=True)
adm2 = models.ForeignKey(GeonamesAdm2, on_delete=models.CASCADE)
postal_code = models.CharField("Postal Code", max_length=10)
# Italian municipalities are adm3 and have a code used to calculate Codice Fiscale
it_codice_catastale = models.CharField(max_length=4, blank=True, null=True)
@property
def country(self):
return self.adm2.country
def __str__(self):
return self.name
class GeonamesAdm4(GeonamesAdm):
code = models.CharField(max_length=20, db_index=True)
# GB has some ADM4 with adm2 parent and without adm3
adm2 = models.ForeignKey(GeonamesAdm2, blank=True, null=True, on_delete=models.CASCADE)
adm3 = models.ForeignKey(GeonamesAdm3, blank=True, null=True, on_delete=models.CASCADE)
@property
def country(self):
if self.adm2:
return self.adm2.country
else:
return self.adm3.country
def __str__(self):
return self.name
class GeonamesAdm5(GeonamesAdm):
adm4 = models.ForeignKey(GeonamesAdm4, on_delete=models.CASCADE)
@property
def country(self):
return self.adm4.country
def __str__(self):
return self.name
class PopulatedPlace(GeonamesAdm):
feature_code = models.CharField(max_length=20, db_index=True)
country = models.ForeignKey(Country, on_delete=models.CASCADE)
adm1 = models.ForeignKey(GeonamesAdm1, blank=True, null=True, on_delete=models.CASCADE)
adm2 = models.ForeignKey(GeonamesAdm2, blank=True, null=True, on_delete=models.CASCADE)
adm3 = models.ForeignKey(GeonamesAdm3, blank=True, null=True, on_delete=models.CASCADE)
adm4 = models.ForeignKey(GeonamesAdm4, blank=True, null=True, on_delete=models.CASCADE)
def __str__(self):
return self.name
| 35.25 | 98 | 0.727607 | 502 | 3,807 | 5.36255 | 0.2251 | 0.083581 | 0.093611 | 0.124814 | 0.572065 | 0.562036 | 0.532318 | 0.517088 | 0.404903 | 0.374443 | 0 | 0.0245 | 0.185185 | 3,807 | 107 | 99 | 35.579439 | 0.843327 | 0.189388 | 0 | 0.602941 | 0 | 0 | 0.003578 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.161765 | false | 0 | 0.014706 | 0.147059 | 0.897059 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 2 |
6f9dd439d497f23f06d8a3fd432bdc50e4f055d9 | 151 | py | Python | May09.py | Ainevsia/RSA-related | 816101e3951b95b79b59057199179859941f460a | [
"MIT"
] | 7 | 2019-12-31T13:45:36.000Z | 2021-11-14T20:09:02.000Z | May09.py | Ainevsia/Algebraic-Number-Theory | 816101e3951b95b79b59057199179859941f460a | [
"MIT"
] | null | null | null | May09.py | Ainevsia/Algebraic-Number-Theory | 816101e3951b95b79b59057199179859941f460a | [
"MIT"
] | null | null | null | from toolkit import *
if __name__ == '__main__':
ls = [113, 167, 2017]
for i in ls:
print(i, root(i), phi(i - 1), len(root(i)))
| 21.571429 | 52 | 0.523179 | 24 | 151 | 2.958333 | 0.75 | 0.140845 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.104762 | 0.304636 | 151 | 6 | 53 | 25.166667 | 0.571429 | 0 | 0 | 0 | 0 | 0 | 0.055172 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.2 | 0 | 0.2 | 0.2 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
6fb9a1a89e10d74eb814be62145b54b4664d49a2 | 2,192 | py | Python | packer_builder/templates.py | mrlesmithjr/packer-builder | ff07b48740062c1a3050e670187bcd620a901b1c | [
"MIT"
] | 11 | 2019-10-16T21:05:58.000Z | 2021-08-03T16:14:51.000Z | packer_builder/templates.py | mrlesmithjr/packer-builder | ff07b48740062c1a3050e670187bcd620a901b1c | [
"MIT"
] | 35 | 2019-10-16T13:15:30.000Z | 2020-05-21T05:20:29.000Z | packer_builder/templates.py | mrlesmithjr/packer-builder | ff07b48740062c1a3050e670187bcd620a901b1c | [
"MIT"
] | 4 | 2020-03-02T15:45:26.000Z | 2021-08-24T17:58:36.000Z | """Generate Packer templates for offline execution/review."""
import os
import logging
import shutil
from packer_builder.template import Template
# pylint: disable=too-few-public-methods
class Templates():
"""Generate Packer templates without building."""
def __init__(self, args, distros):
"""Init a thing."""
# Setup logger
self.logger = logging.getLogger(__name__)
self.distros = distros
self.build_dir = args.outputdir
self.password_override = args.password
def generate(self):
"""Generate templates and rename them into the defined output dir."""
# Iterate through defined distros
for distro, distro_spec in self.distros.items():
# Iterate through versions defined in distros
for version, version_spec in distro_spec['versions'].items():
version = str(version)
# Define data to pass to class
data = {'output_dir': self.build_dir,
'password_override': self.password_override,
'distro': distro, 'distro_spec': distro_spec,
'version': version, 'version_spec': version_spec}
# Generate the template
template = Template(data=data)
# Save template for processing
template.save()
# Validate the generated template
self.logger.info(
'Validating distro: %s, distro_spec: %s', distro,
distro_spec)
template.validate()
# Rename the generated template as distro and version
generated_template = os.path.join(
self.build_dir, 'template.json')
self.logger.info('generated_template: %s', generated_template)
renamed_template = os.path.join(
self.build_dir, f'{distro}-{version}.json') # noqa: E999
self.logger.info('renamed_template: %s', renamed_template)
shutil.move(generated_template, renamed_template)
| 38.45614 | 78 | 0.58531 | 225 | 2,192 | 5.546667 | 0.342222 | 0.048077 | 0.038462 | 0.028846 | 0.048077 | 0.048077 | 0.048077 | 0 | 0 | 0 | 0 | 0.00203 | 0.32573 | 2,192 | 56 | 79 | 39.142857 | 0.842355 | 0.241332 | 0 | 0 | 1 | 0 | 0.114654 | 0.014102 | 0 | 0 | 0 | 0 | 0 | 1 | 0.064516 | false | 0.064516 | 0.129032 | 0 | 0.225806 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
6fc0b08bd02ea3dda31378f041322fee0d094a78 | 840 | py | Python | tests/src/elements/select.py | Alpaca00/alpaca_web | 1db33978f774addbe456f750ffadf32d2f223610 | [
"MIT"
] | 4 | 2021-09-14T07:36:27.000Z | 2021-09-18T15:10:24.000Z | tests/src/elements/select.py | Alpaca00/alpaca_web | 1db33978f774addbe456f750ffadf32d2f223610 | [
"MIT"
] | 1 | 2021-12-14T20:32:29.000Z | 2021-12-15T13:28:29.000Z | tests/src/elements/select.py | Alpaca00/squash-opponent | 1db33978f774addbe456f750ffadf32d2f223610 | [
"MIT"
] | null | null | null | from selene import have
from tests.src import BasePage


class SelectList(BasePage):
    def __init__(self, name=None):
        super().__init__()
        loc = f'//select[@name="{name}"]'
        self._element = self.element(loc)

    def open(self):
        self._element.click()
        return self

    def _options(self):
        return self._element.all('option')

    def select_by_value(self, value):
        self._options().element_by(have.value(value)).click()
        return self

    def select_by_text(self, text):
        self._options().element_by(have.text(text)).click()
        return self

    def select_by_exact_text(self, text):
        self._options().element_by(have.exact_text(text)).click()
        return self

    def set(self, value):
        self.open()
        self.select_by_value(value)
        return self
# ---- File: BOJ/18000~18999/18300~18399/18312.py (repo: shinkeonkim/today-ps, Apache-2.0) ----
e_h, K = map(int, input().split())
h, m, s, e_m, e_s, ans = 0, 0, 0, 59, 59, 0
while e_h != h or e_m != m or e_s != s:
    z = "%02d%02d%02d" % (h, m, s)
    if z.count(str(K)) > 0:
        ans += 1
    s += 1
    if s == 60:
        m += 1
        s = 0
        if m == 60:
            h += 1
            m = 0
# the loop exits before counting the end time itself, so check it here
z = "%02d%02d%02d" % (h, m, s)
if z.count(str(K)) > 0:
    ans += 1
print(ans)
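The second-by-second simulation above can be cross-checked with a compact brute-force count over the same range of clock readings (the function name `count_times` is illustrative, not from the original solution):

```python
def count_times(e_h, k):
    """Count readings hh:mm:ss from 00:00:00 through e_h:59:59
    whose six-digit form contains the digit k."""
    kd = str(k)
    return sum(
        kd in "%02d%02d%02d" % (h, m, s)
        for h in range(e_h + 1)
        for m in range(60)
        for s in range(60)
    )

print(count_times(0, 3))  # 1575: mm:ss pairs within hour 00 containing a 3
print(count_times(0, 0))  # 3600: the hour field "00" always contains a 0
```

Since only 45 of the 60 minute values (and likewise second values) avoid the digit 3, hour 00 contributes 3600 - 45 * 45 = 1575 matches, agreeing with the loop version.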
# ---- File: pisces/__init__.py (repo: omarmarcillo/piscesWIN, MIT) ----
# coding: utf-8
"""
Pisces is a practical and extensible data management library in Python. It
leverages existing widely-used free and open-source technologies, such as SQL
databases and Python, in order to provide a seismological data management
solution that:
1) allows the user to both manage and analyze data with a single easy-to-learn
language, Python,
2) leverages large existing user communities to facilitate adoption and
problem solving in code development, and
3) imposes no expensive or restrictive licensing constraints on users.
The ultimate goal of Pisces is to allow the user to write code that will not
eventually have to be abandoned due to different project scales, system
architectures, or licensing concerns.
"""
__version__ = '0.2.1'
from pisces.util import db_connect, get_tables, travel_times, make_table
from pisces.schema.util import string_formatter
from pisces.io.trace import wfdisc2trace
from pisces.schema import kbcore
from pisces.io.readwaveform import read_waveform
# ---- File: tickets/urls.py (repo: vshalt/bugtrakr, MIT) ----
from django.urls import path
from . import views

app_name = 'tickets'

urlpatterns = [
    path('', views.ticket_list, name='list'),
    path('detail/<int:id>', views.ticket_detail, name='detail'),
    path('create/', views.ticket_create, name='create'),
    path('edit/<int:id>', views.ticket_edit, name='edit'),
    path('assign/<int:id>', views.ticket_assign, name='assign'),
    path('history/<int:id>', views.ticket_history, name='history'),
]
# ---- File: examples/flask/run_server.py (repo: CircuitSacul/patreon-python, Apache-2.0) ----
#!venv/bin/python
from my_site.app import create_app
from my_site import config
my_app = create_app()
my_app.run(debug=True, host='0.0.0.0', port=config.port)
# ---- File: metanic/collector/collection.py (repo: LimpidTech/melody, BSD-3-Clause) ----
from zope.interface import Interface
from zope.interface import Attribute


class ICollection(Interface):
    """ Provides an interface for getting named lists of resources. """

    name = Attribute(""" A friendly name for this collection """)

    def __call__(request):
        """ Create a new Collection for the given request. """

    def items(request):
        """ Gets a list of resources in this collection. """


class Collection(object):
    """ Implements boilerplate behaviors for Collection objects. """

    def __init__(self):
        self.pk = self.__class__.name
        self.resource_url = self.name.lower()

    def __call__(self, request):
        return self.items(request)
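A minimal usage sketch of the boilerplate base class follows. The `ArticleCollection` subclass, its `name`, and the stub `items` body are hypothetical, and the zope interface declaration is omitted so the snippet stands alone:

```python
class Collection(object):
    """Standalone restatement of the boilerplate base class above."""

    def __init__(self):
        self.pk = self.__class__.name
        self.resource_url = self.name.lower()

    def __call__(self, request):
        return self.items(request)


class ArticleCollection(Collection):
    name = "Articles"  # hypothetical friendly name for this collection

    def items(self, request):
        # A real implementation would query storage; return a stub list here
        return ["first-article", "second-article"]


articles = ArticleCollection()
print(articles.pk)            # Articles
print(articles.resource_url)  # articles
print(articles("fake-request"))
```

Calling the collection instance delegates to `items`, which is the behavior the base class' `__call__` provides.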
# ---- File: others/numpy-basics/numpy-slicing.py (repo: bt3gl/Resources-Numerical_Methods_for_Physics, Apache-2.0) ----
import numpy as np
# example from the scipy.org NumPy tutorial
def f(x, y):
    return 10 * x + y

a = np.fromfunction(f, (5, 4), dtype=int)
print(a)
print(" ")
print(a[0:2, 0:2])
print(" ")
print(a[:, 1])
print(" ")
print(a.flatten())
print(" ")
for row in a:  # iteration is done over the first axis
    print(row)
print(" ")
for element in a.flat:
    print(element, end=' ')
# ---- File: insights/parsers/dirsrv_sysconfig.py (repo: mglantz/insights-core, Apache-2.0) ----
"""
dirsrv_sysconfig - file ``/etc/sysconfig/dirsrv``
=================================================

This module provides the ``DirsrvSysconfig`` class parser, for reading the
options in the ``/etc/sysconfig/dirsrv`` file.

Sample input::

    # how many seconds to wait for the startpid file to show
    # up before we assume there is a problem and fail to start
    # if using systemd, omit the "; export VARNAME" at the end
    #STARTPID_TIME=10 ; export STARTPID_TIME
    # how many seconds to wait for the pid file to show
    # up before we assume there is a problem and fail to start
    # if using systemd, omit the "; export VARNAME" at the end
    #PID_TIME=600 ; export PID_TIME

    KRB5CCNAME=/tmp/krb5cc_995
    KRB5_KTNAME=/etc/dirsrv/ds.keytab

Examples:

    >>> dirsrv_conf = shared[DirsrvSysconfig]
    >>> dirsrv_conf.KRB5_KTNAME
    '/etc/dirsrv/ds.keytab'
    >>> 'PID_TIME' in dirsrv_conf.data
    False
"""
from .. import parser, SysconfigOptions
from insights.specs import Specs


@parser(Specs.dirsrv)
class DirsrvSysconfig(SysconfigOptions):
    """
    Parse the `dirsrv` service's start-up configuration.
    """
    set_properties = True
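The kind of `KEY=VALUE` parsing that `SysconfigOptions` performs can be sketched as a standalone function. This is a simplification for illustration; the real insights-core parser handles quoting and other edge cases:

```python
def parse_sysconfig(lines):
    """Parse KEY=VALUE lines, skipping blanks and '#' comments."""
    data = {}
    for line in lines:
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # comments (including commented-out options) are ignored
        key, _, value = line.partition("=")
        data[key.strip()] = value.strip()
    return data

sample = [
    "# how many seconds to wait for the startpid file",
    "#PID_TIME=600 ; export PID_TIME",
    "KRB5CCNAME=/tmp/krb5cc_995",
    "KRB5_KTNAME=/etc/dirsrv/ds.keytab",
]
print(parse_sysconfig(sample))
```

Note that `#PID_TIME=600` is skipped as a comment, which is why the docstring example reports `'PID_TIME' in dirsrv_conf.data` as `False`.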
# ---- File: config.template.py (repo: roman-76/mqtt-screen-power, MIT) ----
mqtt_host = "IP_OR_DOMAIN"
mqtt_port = 1883
mqtt_topic = "screen/rpi"
mqtt_username = "USERNAME"
mqtt_password = "PASSWORD"

# Raspberry Pi
power_on_command = "vcgencmd display_power 1"
power_off_command = "vcgencmd display_power 0"

# Other HDMI Linux devices
# power_on_command = "xset -display :0 dpms force on"
# power_off_command = "xset -display :0 dpms force off"
# ---- File: pdc/settings_test.py (repo: tzhaoredhat/automation, MIT) ----
#
# Copyright (c) 2015 Red Hat
# Licensed under The MIT License (MIT)
# http://opensource.org/licenses/MIT
#
"""
Extra Django settings for test environment of pdc project.
"""
from settings import *

# Database settings
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': 'test.sqlite3',
        'USER': '',
        'PASSWORD': '',
        'HOST': '',
        'PORT': '',
    }
}

# disable PERMISSION while testing
REST_FRAMEWORK = {
    'DEFAULT_AUTHENTICATION_CLASSES': (
        'pdc.apps.auth.authentication.TokenAuthenticationWithChangeSet',
        'rest_framework.authentication.SessionAuthentication',
    ),
    # 'DEFAULT_PERMISSION_CLASSES': [
    #     'rest_framework.permissions.DjangoModelPermissions'
    # ],
    'DEFAULT_FILTER_BACKENDS': ('rest_framework.filters.DjangoFilterBackend',),
    'DEFAULT_METADATA_CLASS': 'contrib.bulk_operations.metadata.BulkMetadata',
    'DEFAULT_RENDERER_CLASSES': (
        'rest_framework.renderers.JSONRenderer',
        'pdc.apps.common.renderers.ReadOnlyBrowsableAPIRenderer',
    ),
    'EXCEPTION_HANDLER': 'pdc.apps.common.handlers.exception_handler',
    'DEFAULT_PAGINATION_CLASS': 'pdc.apps.common.pagination.AutoDetectedPageNumberPagination',
    'NON_FIELD_ERRORS_KEY': 'detail',
}
# ---- File: src/citrine/resources/object_runs.py (repo: jspeerless/citrine-python, Apache-2.0) ----
"""Top-level class for all object run objects and collections thereof."""
from abc import ABC
from typing import TypeVar

from citrine.resources.data_objects import DataObject, DataObjectCollection


class ObjectRun(DataObject, ABC):
    """
    An abstract object run object.

    ObjectRun must be extended along with `Resource`.
    """


ObjectRunResourceType = TypeVar("ObjectRunResourceType", bound="ObjectRun")


class ObjectRunCollection(DataObjectCollection[ObjectRunResourceType], ABC):
    """A collection of one kind of object run object."""
# ---- File: config.py (repo: d4wner/axefuzzer, BSD-2-Clause) ----
# encoding: utf-8
# page show record size
""" show_cnt = 15
"""

# mysql database connection info
""" mysqldb_conn = {
    'host'     : 'localhost',
    'user'     : 'root',
    'password' : '',
    'db'       : '',
    'charset'  : 'utf8'
}
"""

# whether to save http response content to the database
""" save_content = True """

# http map filenames to MIME types
# https://docs.python.org/2/library/mimetypes.html
http_mimes = ['text', 'image', 'application', 'video', 'message', 'audio']

# http static resource file extensions
# keep swf?
static_ext = ['js', 'css', 'ico', 'txt', 'svg', 'flv', 'jpg', 'png', 'jpeg',
              'gif', 'pdf', 'ss3', 'rar', 'zip', 'avi', 'mp4', 'wmi', 'exe',
              'mpeg', 'wav', 'mp3', 'json', 'appcache', 'cache']

# media resource file types
media_types = ['image', 'video', 'audio']

# http static resource files
static_files = [
    'text/css',
    # 'application/javascript',
    # 'application/x-javascript',
    'application/msword',
    'application/vnd.ms-excel',
    'application/vnd.ms-powerpoint',
    'application/x-ms-wmd',
    'application/x-shockwave-flash',
    # 'image/x-cmu-raster',
    # 'image/x-ms-bmp',
    # 'image/x-portable-graymap',
    # 'image/x-portable-bitmap',
    # 'image/jpeg',
    # 'image/gif',
    # 'image/x-xwindowdump',
    # 'image/png',
    # 'image/vnd.microsoft.icon',
    # 'image/x-portable-pixmap',
    # 'image/x-xpixmap',
    # 'image/ief',
    # 'image/x-portable-anymap',
    # 'image/x-rgb',
    # 'image/x-xbitmap',
    # 'image/tiff',
    # 'video/mpeg',
    # 'video/x-sgi-movie',
    # 'video/mp4',
    # 'video/x-msvideo',
    # 'video/quicktime'
    # 'audio/mpeg',
    # 'audio/x-wav',
    # 'audio/x-aiff',
    # 'audio/basic',
    # 'audio/x-pn-realaudio',
]

# snow_listener
snow_listener_url = "http://localhost:8083/listener"

# root domain for SSRF, blind SQL injection, and blind command-execution
# callbacks, e.g. vscode.baidu.com
blind_reverse_domain = "pz35ac.ceye.io"

# sqlmap api server address
sqlmap_api_address = 'http://127.0.0.1:8775'

# API endpoint for querying blind-injection callback records
blind_reverse_api = 'http://api.ceye.io/v1/records?token=0c28dc05dc90d6ecaab7fa1f28d09d9b&type=%s&filter=%s'

# Detecting every type is overkill, and command-execution detection is
# time-consuming, so consider trimming this list.
detect_types = ['xss', 'dom_xss', 'url_redirect', 'file_download', 'file_read',
                'pass_by', 'sqli', 'ssrf', 'xxe', 'ssi', 'ssti', 'crlf',
                'command_exec']


def logo():
    print('''\n
 _____ ___________
 / _ \ ___ ___ ____\_ _____/_ __________________ ___________
 / /_\ \\ \/ // __ \| __)| | \___ /\___ // __ \_ __ \
 / | \> <\ ___/| \ | | // / / /\ ___/| | \/
 \____|__ /__/\_ \\___ >___ / |____//_____ \/_____ \\___ >__|
 \/ \/ \/ \/ \/ \/ \/
 [+]axeproxy v1.1 based on ring04h@wyproxy@mitmproxy, thx all.
 [+]AxeFuzzer v4.0 based on demon@prf)
''')
# ---- File: app/config.py (repo: Lakatu-io/example-pg-docker, MIT) ----
import secrets
from typing import Any, Union

from pydantic import AnyHttpUrl, BaseSettings, PostgresDsn, validator


class Settings(BaseSettings):
    class Config:
        env_file = ".env"
        env_file_encoding = "utf-8"
        case_sensitive = True

    SECRET_KEY: str = secrets.token_urlsafe(32)
    ACCESS_TOKEN_EXPIRE_MINUTES: int = 60 * 24
    SERVER_NAME: str
    SERVER_HOST: AnyHttpUrl
    BACKEND_CORS_ORIGINS: list[AnyHttpUrl] = []

    @validator("BACKEND_CORS_ORIGINS", pre=True)
    def assemble_cors_origins(cls, v: Union[str, list[str]]) -> Union[list[str], str]:
        if isinstance(v, str) and not v.startswith("["):
            return [i.strip() for i in v.split(",")]
        if isinstance(v, (list, str)):
            return v
        raise ValueError(v)

    PROJECT_NAME: str

    # Database Connection
    POSTGRES_SERVER: str
    POSTGRES_USER: str
    POSTGRES_PASSWORD: str
    POSTGRES_DB: str
    DATABASE_URI: Union[PostgresDsn, None] = None
    SYNC_DATABASE_URI: Union[PostgresDsn, None] = None

    @validator("DATABASE_URI", pre=True, check_fields=False)
    def assemble_db_connection(cls, v: Union[str, None], values: dict[str, Any]) -> Any:
        if isinstance(v, str):
            return v
        return PostgresDsn.build(
            scheme="postgresql+asyncpg",
            user=values.get("POSTGRES_USER"),
            password=values.get("POSTGRES_PASSWORD"),
            host=values.get("POSTGRES_SERVER"),
            path=f"/{values.get('POSTGRES_DB') or ''}",
        )

    @validator("SYNC_DATABASE_URI", pre=True, check_fields=False)
    def assemble_sync_db_connection(cls, v: Union[str, None], values: dict[str, Any]) -> Any:
        if isinstance(v, str):
            return v
        return PostgresDsn.build(
            scheme="postgresql",
            user=values.get("POSTGRES_USER"),
            password=values.get("POSTGRES_PASSWORD"),
            host=values.get("POSTGRES_SERVER"),
            path=f"/{values.get('POSTGRES_DB') or ''}",
        )

    FIRST_SUPERUSER: str
    FIRST_SUPERUSER_PASSWORD: str
    USERS_OPEN_REGISTRATION: bool = False


settings = Settings()
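The `BACKEND_CORS_ORIGINS` validator's parsing rule can be exercised in isolation. Here it is restated as a plain function, outside pydantic, so the three branches are easy to see:

```python
from typing import Union

def assemble_cors_origins(v: Union[str, list]) -> Union[list, str]:
    # Comma-separated string -> list of trimmed origins.
    # JSON-style strings (starting with '[') and lists pass through unchanged.
    if isinstance(v, str) and not v.startswith("["):
        return [i.strip() for i in v.split(",")]
    if isinstance(v, (list, str)):
        return v
    raise ValueError(v)

print(assemble_cors_origins("http://localhost, http://example.com"))
# ['http://localhost', 'http://example.com']
print(assemble_cors_origins('["http://localhost"]'))
# '["http://localhost"]' is passed through for pydantic's own JSON parsing
```

This lets a `.env` file supply either `BACKEND_CORS_ORIGINS=http://a,http://b` or a JSON list; in the real Settings class, pydantic then coerces each item to `AnyHttpUrl`.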
# ---- File: numbas/urls.py (repo: NabeelSait/editor, Apache-2.0) ----
from django.conf.urls import include, url
from django.conf.urls.static import static
from django.conf import settings
from django.contrib import admin
from django.contrib.auth.views import LogoutView
from django.urls import path
import django.contrib.auth.views
import notifications.urls

admin.autodiscover()

urlpatterns = [
    url(r'^admin/', admin.site.urls),
    path('logout/', LogoutView.as_view(), name='logout'),
    path('', include('django.contrib.auth.urls')),
    url(r'', include('accounts.urls')),
]

if 'editor_rest_api' in settings.INSTALLED_APPS:
    try:
        from editor_rest_api.urls import urls as rest_urls
        urlpatterns += [
            url('^api/', include(rest_urls)),
        ]
    except ImportError:
        pass

if 'feature_survey' in settings.INSTALLED_APPS:
    try:
        from feature_survey.urls import urlpatterns as feature_survey_urls
        urlpatterns += [
            url('^feature-survey/', include(feature_survey_urls)),
        ]
    except ImportError:
        pass

urlpatterns += [
    url(r'', include('editor.urls')),
    url(r'^migrate/', include('migration.urls')),
    url(r'^notifications/', include(notifications.urls, namespace='notifications')),
]

urlpatterns += static(settings.MEDIA_URL, document_root=settings.MEDIA_ROOT)

if settings.DEBUG:
    try:
        import debug_toolbar
        urlpatterns = [
            url(r'^__debug__/', include(debug_toolbar.urls)),
        ] + urlpatterns
    except ImportError:
        pass
# ---- File: portfolio/templatetags/portfolio_tags.py (repo: samshultz/techbitsdata, BSD-3-Clause) ----
from django import template
from ..models import Portfolio
from ..helpers import grouped

register = template.Library()


@register.simple_tag
def get_projects(count=None):
    # Slice only when a count is given; otherwise return all projects
    if count:
        portfolio = Portfolio.objects.all()[:count]
    else:
        portfolio = Portfolio.objects.all()
    return grouped(portfolio, 2)
# ---- File: src/scripts/paths.py (repo: katiebreivik/showyourwork, MIT) ----
"""
Exposes common paths useful for manipulating datasets and generating figures.
"""
from pathlib import Path

# Absolute path to the top level of the repository
root = Path(__file__).resolve().parents[2].absolute()

# Absolute path to the `src` folder
src = root / "src"

# Absolute path to the `src/data` folder (contains datasets)
data = src / "data"

# Absolute path to the `src/static` folder (contains static images)
static = src / "static"

# Absolute path to the `src/scripts` folder (contains figure/pipeline scripts)
scripts = src / "scripts"

# Absolute path to the `src/tex` folder (contains the manuscript)
tex = src / "tex"

# Absolute path to the `src/tex/figures` folder (contains figure output)
figures = tex / "figures"
# ---- File: ch2/lineards/bit_manipulation.py (repo: lyskevin/cpbook-code, UPL-1.0) ----
import math
def isOn(S, j):
    return S & (1 << j)

def setBit(S, j):
    return S | (1 << j)

def clearBit(S, j):
    return S & (~(1 << j))

def toggleBit(S, j):
    return S ^ (1 << j)

def lowBit(S):
    return S & (-S)

def setAll(n):
    return (1 << n) - 1

def modulo(S, N):  # returns S % N, where N is a power of 2
    return S & (N - 1)

def isPowerOfTwo(S):
    return not (S & (S - 1))

def nearestPowerOfTwo(S):
    return 1 << round(math.log2(S))

def turnOffLastBit(S):
    return S & (S - 1)

def turnOnLastZero(S):
    return S | (S + 1)

def turnOffLastConsecutiveBits(S):
    return S & (S + 1)

def turnOnLastConsecutiveZeroes(S):
    return S | (S - 1)

def printSet(vS):  # in binary representation
    print("S = {} = {:b}".format(vS, vS))

def main():
    print("1. Representation (all indexing are 0-based and counted from right)")
    S = 34
    printSet(S)
    print()

    print("2. Multiply S by 2, then divide S by 4 (2x2), then by 2")
    S = 34
    printSet(S)
    S = S << 1
    printSet(S)
    S = S >> 2
    printSet(S)
    S = S >> 1
    printSet(S)
    print()

    print("3. Set/turn on the 3-rd item of the set")
    S = 34
    printSet(S)
    S = setBit(S, 3)
    printSet(S)
    print()

    print("4. Check if the 3-rd and then 2-nd item of the set is on?")
    S = 42
    printSet(S)
    T = isOn(S, 3)
    print("T = {}, {}".format(T, "ON" if T else "OFF"))
    T = isOn(S, 2)
    print("T = {}, {}".format(T, "ON" if T else "OFF"))
    print()

    print("5. Clear/turn off the 1-st item of the set")
    S = 42
    printSet(S)
    S = clearBit(S, 1)
    printSet(S)
    print()

    print("6. Toggle the 2-nd item and then 3-rd item of the set")
    S = 40
    printSet(S)
    S = toggleBit(S, 2)
    printSet(S)
    S = toggleBit(S, 3)
    printSet(S)
    print()

    print("7. Check the first bit from right that is on")
    S = 40
    printSet(S)
    T = lowBit(S)
    print("T = {} (this is always a power of 2)".format(T))
    S = 52
    printSet(S)
    T = lowBit(S)
    print("T = {} (this is always a power of 2)".format(T))
    print()

    print("8. Turn on all bits in a set of size n = 6")
    S = setAll(6)
    printSet(S)
    print()

    print("9. Other tricks (not shown in the book)")
    print("8 % 4 = {}".format(modulo(8, 4)))
    print("7 % 4 = {}".format(modulo(7, 4)))
    print("6 % 4 = {}".format(modulo(6, 4)))
    print("5 % 4 = {}".format(modulo(5, 4)))
    print("is {} power of two? {}".format(9, isPowerOfTwo(9)))
    print("is {} power of two? {}".format(8, isPowerOfTwo(8)))
    print("is {} power of two? {}".format(7, isPowerOfTwo(7)))
    for i in range(1, 17):
        print("Nearest power of two of {} is {}".format(i, nearestPowerOfTwo(i)))
    print("S = {}, turn off last bit in S, S = {}".format(40, turnOffLastBit(40)))
    print("S = {}, turn on last zero in S, S = {}".format(41, turnOnLastZero(41)))
    print("S = {}, turn off last consecutive bits in S, S = {}".format(39, turnOffLastConsecutiveBits(39)))
    print("S = {}, turn on last consecutive zeroes in S, S = {}".format(36, turnOnLastConsecutiveZeroes(36)))

if __name__ == '__main__':
    main()
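A few of the bit-trick identities demonstrated above, checked directly as standalone restatements (helper names here are illustrative):

```python
def low_bit(s):
    """Isolate the lowest set bit: S & (-S)."""
    return s & -s

def turn_off_last_bit(s):
    """Clear the lowest set bit: S & (S - 1)."""
    return s & (s - 1)

# 40 = 0b101000, so its lowest set bit is 0b1000 = 8
assert low_bit(40) == 8
# 52 = 0b110100, lowest set bit 0b100 = 4
assert low_bit(52) == 4
# Clearing 40's lowest set bit leaves 0b100000 = 32
assert turn_off_last_bit(40) == 32
# Turn on the last zero: 41 | (41 + 1) -> 0b101001 | 0b101010 = 43
assert (41 | (41 + 1)) == 43
# Turn off last consecutive bits: 39 & (39 + 1) -> 0b100111 & 0b101000 = 32
assert (39 & (39 + 1)) == 32
# Turn on last consecutive zeroes: 36 | (36 - 1) -> 0b100100 | 0b100011 = 39
assert (36 | (36 - 1)) == 39

print("all identities hold")
```

These are the same values the script's section 9 prints, so the assertions double as a regression check on the helpers.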
# ---- File: Chapter05/myunittest/tests/tests_myadd/test_myadd3.py (repo: MichaelRW/Python-for-Geeks, MIT) ----
# test_myadd3.py test suite for myadd2 class method to validate errors
import unittest
from myunittest.src.myadd.myadd2 import MyAdd
class MyAddTestSuite(unittest.TestCase):
def setUp(self):
self.myadd = MyAdd()
def test_typeerror1(self):
""" test case to check if we can handle non number input"""
self.assertRaises(TypeError, self.myadd.add, 'a', -5)
def test_typeerror2(self):
""" test case to check if we can handle non number input"""
self.assertRaises(TypeError, self.myadd.add, 'a', 'b')
if __name__ == '__main__':
unittest.main()
| 30 | 69 | 0.68 | 82 | 600 | 4.841463 | 0.5 | 0.06801 | 0.060453 | 0.070529 | 0.423174 | 0.423174 | 0.423174 | 0.423174 | 0.423174 | 0.423174 | 0 | 0.012685 | 0.211667 | 600 | 19 | 70 | 31.578947 | 0.826638 | 0.29 | 0 | 0 | 0 | 0 | 0.026699 | 0 | 0 | 0 | 0 | 0 | 0.181818 | 1 | 0.272727 | false | 0 | 0.181818 | 0 | 0.545455 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
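The `MyAdd` class under test is imported from `myunittest.src.myadd.myadd2` and is not part of this excerpt. A minimal sketch of an implementation that would satisfy both TypeError test cases — the body is an assumption, not the book's actual code:

```python
class MyAdd:
    """Hypothetical stand-in for myunittest.src.myadd.myadd2.MyAdd."""

    def add(self, x, y):
        # Reject non-numeric input so assertRaises(TypeError, ...) passes.
        if not isinstance(x, (int, float)) or not isinstance(y, (int, float)):
            raise TypeError("add() expects numeric arguments")
        return x + y
```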
d27ca161b8d8ddd1633186f6974d719607573422 | 6,493 | py | Python | src/wissenslandkarte/settings.py | Mattan-Qwer/test1 | 16bc7642a18d632181480644d1f188c9fb3785bc | [
"Apache-2.0"
] | 1 | 2021-07-25T12:46:08.000Z | 2021-07-25T12:46:08.000Z | src/wissenslandkarte/settings.py | Mattan-Qwer/test1 | 16bc7642a18d632181480644d1f188c9fb3785bc | [
"Apache-2.0"
] | 3 | 2021-03-04T21:15:32.000Z | 2021-05-15T22:01:11.000Z | src/wissenslandkarte/settings.py | Mattan-Qwer/test1 | 16bc7642a18d632181480644d1f188c9fb3785bc | [
"Apache-2.0"
] | 2 | 2021-03-17T18:02:58.000Z | 2021-07-15T17:58:28.000Z | """
Django settings for wissenslandkarte project.
Generated by 'django-admin startproject' using Django 3.1.7.
For more information on this file, see
https://docs.djangoproject.com/en/3.1/topics/settings/
For the full list of settings and their values, see
https://docs.djangoproject.com/en/3.1/ref/settings/
"""
from pathlib import Path
from django.core.management.utils import get_random_secret_key
import pickle
import os
import json
# Build paths inside the project like this: BASE_DIR / 'subdir'.
BASE_DIR = Path(__file__).resolve().parent.parent
# Quick-start development settings - unsuitable for production
# See https://docs.djangoproject.com/en/3.1/howto/deployment/checklist/
DEBUG_FILE = BASE_DIR.joinpath("./data/ACTIVATE_DEBUG_MODE")
# If you want to debug; create a file in the directory indicated above.
DEBUG = DEBUG_FILE.exists()
ENABLE_LIVE_JS = DEBUG and False
# this allows using {% if debug %} in django templates.
INTERNAL_IPS = ['127.0.0.1', '::1']
SECRET_KEY_FILE = BASE_DIR.joinpath("./data/django-secret-key.pickle")
def load_or_create_secret_key() -> str:
# TODO we might want to record hostname and time of the secret creation in this pickle, to allow us to recognize if
# it becomes a constant during docker builds. Also, we might want to delete/recreate it explicitly during
# first startup.
try:
secret = pickle.load(open(SECRET_KEY_FILE, "rb"))
return secret
except FileNotFoundError:
secret = get_random_secret_key()
pickle.dump(secret, open(SECRET_KEY_FILE, "wb"))
return secret
SECRET_KEY = load_or_create_secret_key()
ALLOWED_HOSTS = [
'localhost',
'127.0.0.1',
'[::1]',
'python',
# 'wissenslandkarte.betreiberverein.de',
]
# Application definition
INSTALLED_APPS = [
'django.contrib.admin', # https://docs.djangoproject.com/en/3.2/ref/contrib/admin/
'django.contrib.auth', # https://docs.djangoproject.com/en/3.2/ref/contrib/auth/
'django.contrib.contenttypes', # https://docs.djangoproject.com/en/3.2/ref/contrib/contenttypes/
'django.contrib.sessions', # https://docs.djangoproject.com/en/3.2/topics/http/sessions/
'django.contrib.messages', # https://docs.djangoproject.com/en/3.2/ref/contrib/messages/
'django.contrib.staticfiles', # https://docs.djangoproject.com/en/3.2/ref/contrib/staticfiles/
'fontawesome_5',
'api',
'compliance',
'web_homepage',
'accounts',
]
MIDDLEWARE = [
'django.contrib.sessions.middleware.SessionMiddleware',
'django.middleware.common.CommonMiddleware',
'django.middleware.csrf.CsrfViewMiddleware',
'django.contrib.auth.middleware.AuthenticationMiddleware',
'django.contrib.messages.middleware.MessageMiddleware',
'django.middleware.clickjacking.XFrameOptionsMiddleware',
# TODO both of these are not active for static files, and have to be configured in nginx again.
# this applies especially to:
# * CSP on static html files that are served to be displayed
# * User Uploads, e.g. images.
# * HSTS on every resource.
'django.middleware.security.SecurityMiddleware',
'csp.middleware.CSPMiddleware',
]
ROOT_URLCONF = 'wissenslandkarte.urls'
AUTH_USER_MODEL = "accounts.User"
TEMPLATES = [
{
'BACKEND': 'django.template.backends.jinja2.Jinja2',
'DIRS': ["jinja-templates"],
'APP_DIRS': True,
'OPTIONS': {
'environment': 'web_homepage.jinja.environment'
},
},
{
'BACKEND': 'django.template.backends.django.DjangoTemplates',
'DIRS': ["templates"],
'APP_DIRS': True,
'OPTIONS': {
'context_processors': [
'django.template.context_processors.debug',
'django.template.context_processors.request',
'django.contrib.auth.context_processors.auth',
'django.contrib.messages.context_processors.messages',
],
},
},
]
WSGI_APPLICATION = 'wissenslandkarte.wsgi.application'
# Database
# https://docs.djangoproject.com/en/3.1/ref/settings/#databases
DATABASES = {
'default': {
'ENGINE': 'django.db.backends.sqlite3',
'NAME': BASE_DIR / 'data' / 'db.sqlite3',
}
}
# Password validation
# https://docs.djangoproject.com/en/3.1/ref/settings/#auth-password-validators
AUTH_PASSWORD_VALIDATORS = [
{
'NAME': 'django.contrib.auth.password_validation.UserAttributeSimilarityValidator',
},
{
'NAME': 'django.contrib.auth.password_validation.MinimumLengthValidator',
},
{
# TODO we should use a german list for our target group, however, these are difficult to find.
# Use Duden, given names, surnames, sports clubs and qwertz-Keywalks?
# https://docs.djangoproject.com/en/3.1/topics/auth/passwords/#password-validation
'NAME': 'django.contrib.auth.password_validation.CommonPasswordValidator',
# 'password_list_path' : '...'
# This file should contain one lowercase password per line and may be plain text or gzipped.
},
{
'NAME': 'django.contrib.auth.password_validation.NumericPasswordValidator',
},
]
# Internationalization
# https://docs.djangoproject.com/en/3.1/topics/i18n/
LANGUAGE_CODE = 'en-us'
TIME_ZONE = 'UTC'
USE_I18N = True
USE_L10N = True
USE_TZ = True
# Static files (CSS, JavaScript, Images)
# https://docs.djangoproject.com/en/3.1/howto/static-files/
STATIC_URL = '/static/'
# STATICFILES_DIRS = ["static"]
STATIC_ROOT = os.getenv("COLLECTSTATIC_DIR", None)
# TODO decide how to handle development environments.
# SECURE_HSTS_SECONDS = 0
# TODO
# [Vue warn]: It seems you are using the standalone build of Vue.js in an environment with Content Security Policy
# that prohibits unsafe-eval. The template compiler cannot work in this environment. Consider relaxing the policy to
# allow unsafe-eval or pre-compiling your templates into render functions.
CSP_DEFAULT_SRC = ("'self'", "'unsafe-eval'")
CSP_IMG_SRC = ("'self'", "https://betreiberverein.de")
if DEBUG:
# this is required for live.js.
CSP_DEFAULT_SRC = ("'self'", "'unsafe-inline'", "'unsafe-eval'")
CSP_SCRIPT_SRC = ("'self'", "'unsafe-inline'", "'unsafe-eval'")
if not DEBUG:
# This should only be enabled from certain docker container builds, due to a quota; and should be configurable by the admin.
# CSP_REPORT_URI = "https://p.report-uri.com/r/d/csp/enforce"
pass
# CSP_REPORT_TO = "default"
| 34.354497 | 128 | 0.696442 | 813 | 6,493 | 5.458795 | 0.383764 | 0.043939 | 0.069401 | 0.078864 | 0.209329 | 0.166742 | 0.118522 | 0.111987 | 0.070978 | 0 | 0 | 0.010676 | 0.17773 | 6,493 | 188 | 129 | 34.537234 | 0.820566 | 0.442939 | 0 | 0.055046 | 1 | 0 | 0.450028 | 0.324537 | 0 | 0 | 0 | 0.005319 | 0 | 1 | 0.009174 | false | 0.055046 | 0.045872 | 0 | 0.073395 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
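The `load_or_create_secret_key` pattern in the settings file above — generate a secret once, pickle it to disk, and reload the same value on every later start — can be exercised outside Django. This standalone sketch substitutes `secrets.token_urlsafe` for Django's `get_random_secret_key` and a temporary directory for the project's `data/` directory (both are assumptions), and uses context managers so the file handles are closed:

```python
import pickle
import secrets
import tempfile
from pathlib import Path

def load_or_create_secret(path: Path) -> str:
    """Return the persisted secret, creating and saving one on first run."""
    try:
        with open(path, "rb") as f:
            return pickle.load(f)
    except FileNotFoundError:
        secret = secrets.token_urlsafe(50)
        with open(path, "wb") as f:
            pickle.dump(secret, f)
        return secret

key_file = Path(tempfile.mkdtemp()) / "django-secret-key.pickle"
first = load_or_create_secret(key_file)   # generates and persists
second = load_or_create_secret(key_file)  # reloads the same value
print(first == second)  # True
```

This is why the settings file's TODO warns about docker builds: if the pickle is baked into an image, every container shares one "random" key.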
962a79129b93d61d589f382410c63909adda91f1 | 278 | py | Python | pinax/projects/social_project/__init__.py | skabber/pinax | 6fdee6b7bbbb597074d45122badf3a6dd75e0b92 | [
"MIT"
] | 2 | 2015-12-27T23:07:51.000Z | 2016-05-09T08:57:28.000Z | pinax/projects/social_project/__init__.py | SMiGL/pinax | d08b2655fe661566bd13c5c170b1a4cad9e67a1d | [
"MIT"
] | null | null | null | pinax/projects/social_project/__init__.py | SMiGL/pinax | d08b2655fe661566bd13c5c170b1a4cad9e67a1d | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
__about__ = """
This project comes fully-featured, with everything that Pinax provides enabled
by default. It provides all tabs available, etc. From here you can remove
applications that you do not want to use, and add your own applications as well.
""" | 39.714286 | 80 | 0.748201 | 43 | 278 | 4.744186 | 0.906977 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004329 | 0.169065 | 278 | 7 | 81 | 39.714286 | 0.878788 | 0.07554 | 0 | 0 | 0 | 0 | 0.917969 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
963e63a165b779f9781423640c6ce74225c96b97 | 202 | py | Python | blog/accounts/forms.py | dasap89/db2_blog | 01a2eeb7b1fa3a91311c87024e5416f66603aa3a | [
"MIT"
] | null | null | null | blog/accounts/forms.py | dasap89/db2_blog | 01a2eeb7b1fa3a91311c87024e5416f66603aa3a | [
"MIT"
] | null | null | null | blog/accounts/forms.py | dasap89/db2_blog | 01a2eeb7b1fa3a91311c87024e5416f66603aa3a | [
"MIT"
] | null | null | null | import account.forms
class SignupForm(account.forms.SignupForm):
def __init__(self, *args, **kwargs):
super(SignupForm, self).__init__(*args, **kwargs)
del self.fields["username"] | 25.25 | 57 | 0.683168 | 23 | 202 | 5.652174 | 0.608696 | 0.184615 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.173267 | 202 | 8 | 58 | 25.25 | 0.778443 | 0 | 0 | 0 | 0 | 0 | 0.039409 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.2 | 0 | 0.6 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
964ba521c54eceed3f8228a16e627611ff524ce6 | 1,337 | py | Python | patterns/coursera_training/abc.py | adrianoff/python_learning | 428d7eded162c034287b85dbe00b54bdfe9fc3a6 | [
"MIT"
] | null | null | null | patterns/coursera_training/abc.py | adrianoff/python_learning | 428d7eded162c034287b85dbe00b54bdfe9fc3a6 | [
"MIT"
] | null | null | null | patterns/coursera_training/abc.py | adrianoff/python_learning | 428d7eded162c034287b85dbe00b54bdfe9fc3a6 | [
"MIT"
] | null | null | null | import math
import abc
class Base(abc.ABC):
def __init__(self, data, result):
self.data = data
self.result = result
def get_answer(self):
return [int(x >= 0.5) for x in self.data]
def get_score(self):
ans = self.get_answer()
return sum([int(x == y) for (x, y) in zip(ans, self.result)]) \
/ len(ans)
@abc.abstractmethod
def get_loss(self):
pass
class A(Base):
def get_loss(self):
return sum(
[(x - y) * (x - y) for (x, y) in zip(self.data, self.result)])
class B(Base):
def get_loss(self):
return -sum([
y * math.log(x) + (1 - y) * math.log(1 - x)
for (x, y) in zip(self.data, self.result)
])
def get_pre(self):
ans = self.get_answer()
res = [int(x == 1 and y == 1) for (x, y) in zip(ans, self.result)]
return sum(res) / sum(ans)
def get_rec(self):
ans = self.get_answer()
res = [int(x == 1 and y == 1) for (x, y) in zip(ans, self.result)]
return sum(res) / sum(self.result)
def get_score(self):
pre = self.get_pre()
rec = self.get_rec()
return 2 * pre * rec / (pre + rec)
class C(Base):
def get_loss(self):
return sum([abs(x - y) for (x, y) in zip(self.data, self.result)])
| 22.661017 | 74 | 0.521316 | 209 | 1,337 | 3.248804 | 0.186603 | 0.029455 | 0.044183 | 0.061856 | 0.5243 | 0.494845 | 0.494845 | 0.372607 | 0.338733 | 0.297496 | 0 | 0.009956 | 0.323859 | 1,337 | 58 | 75 | 23.051724 | 0.74115 | 0 | 0 | 0.275 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0.025 | 0.05 | 0.1 | 0.6 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
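The hierarchy above uses `abc.abstractmethod` to force every subclass to supply `get_loss`, while `get_answer` and a default `get_score` live in the base. A short usage sketch with invented inputs (condensed copies of `Base` and `A` from the file are included so the snippet runs on its own):

```python
import abc

# Condensed copies of Base and A from the file above, for a self-contained run.
class Base(abc.ABC):
    def __init__(self, data, result):
        self.data, self.result = data, result

    def get_answer(self):
        return [int(x >= 0.5) for x in self.data]

    def get_score(self):
        ans = self.get_answer()
        return sum(int(x == y) for x, y in zip(ans, self.result)) / len(ans)

    @abc.abstractmethod
    def get_loss(self):
        pass

class A(Base):
    def get_loss(self):  # sum of squared errors
        return sum((x - y) ** 2 for x, y in zip(self.data, self.result))

a = A([0.9, 0.2, 0.8, 0.4], [1, 0, 1, 1])
print(a.get_answer())          # [1, 0, 1, 0]
print(a.get_score())           # 0.75 — 3 of 4 thresholded predictions match
print(round(a.get_loss(), 2))  # 0.45

# The abstract method makes the base class uninstantiable:
try:
    Base([], [])
except TypeError as err:
    print("cannot instantiate:", err)
```

Note that in the original file, class `B` overrides `get_score` with an F1-style combination of precision and recall instead of the plain accuracy shown here.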
965503cf9a9866bb5569655b0f44555dd6af56b1 | 248 | py | Python | views/test/urls.py | Cjwpython/WordlessBook | 3426ccf3ab2f8848caef98bbc7635407774d32b2 | [
"MIT"
] | 2 | 2021-05-19T10:53:25.000Z | 2022-01-20T01:20:08.000Z | views/test/urls.py | Cjwpython/WordlessBook | 3426ccf3ab2f8848caef98bbc7635407774d32b2 | [
"MIT"
] | null | null | null | views/test/urls.py | Cjwpython/WordlessBook | 3426ccf3ab2f8848caef98bbc7635407774d32b2 | [
"MIT"
] | 1 | 2022-01-20T01:19:56.000Z | 2022-01-20T01:19:56.000Z | # coding: utf-8
from apps import app
from .views import *
url_prefix = "/abc/123"
app.add_url_rule("/index", view_func=index1, methods=['GET', ], strict_slashes=False)
app.add_url_rule("/app", view_func=Task.as_view("apps"), strict_slashes=False)
| 31 | 85 | 0.737903 | 41 | 248 | 4.219512 | 0.609756 | 0.069364 | 0.104046 | 0.150289 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.022222 | 0.092742 | 248 | 7 | 86 | 35.428571 | 0.746667 | 0.052419 | 0 | 0 | 0 | 0 | 0.107296 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.4 | 0 | 0.4 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
965bd57cd61f04cde6a2083dfb9b92df9ead251c | 1,609 | py | Python | Python3/982.py | rakhi2001/ecom7 | 73790d44605fbd51e8f7e804b9808e364fcfc680 | [
"MIT"
] | 854 | 2018-11-09T08:06:16.000Z | 2022-03-31T06:05:53.000Z | Python3/982.py | rakhi2001/ecom7 | 73790d44605fbd51e8f7e804b9808e364fcfc680 | [
"MIT"
] | 29 | 2019-06-02T05:02:25.000Z | 2021-11-15T04:09:37.000Z | Python3/982.py | rakhi2001/ecom7 | 73790d44605fbd51e8f7e804b9808e364fcfc680 | [
"MIT"
] | 347 | 2018-12-23T01:57:37.000Z | 2022-03-12T14:51:21.000Z | __________________________________________________________________________________________________
sample 476 ms submission
class Solution:
def countTriplets(self, A: List[int]) -> int:
n=len(A)
tmp=[bin(a)[2:].zfill(16) for a in A]
one={}
for i,a in enumerate(zip(*tmp)):
one[i]=set([i for i,v in enumerate(list(a)) if v=='1'])
Venn = collections.defaultdict(list)
cnt = 0
for j in range(len(one)):
if len(one[j]) != 0:
cnt += (len(one[j]))**3
for i in range(j, 0, -1):
for prv in Venn[i]:
intersec = prv & one[j]
if len(intersec) != 0:
cnt += ((-1)**i)*(len(intersec))**3
Venn[i+1].append(intersec)
Venn[1].append(one[j])
return n**3 - cnt
__________________________________________________________________________________________________
sample 16336 kb submission
class Solution:
def countTriplets(self, A: 'List[int]') -> 'int':
counts = {}
for a1 in A:
for a2 in A:
x = a1 & a2
if x in counts:
counts[x] += 1
else:
counts[x] = 1
tot = 0
for key in counts:
for a in A:
if a & key == 0:
tot += counts[key]
return tot
__________________________________________________________________________________________________
| 36.568182 | 98 | 0.505283 | 165 | 1,609 | 3.145455 | 0.30303 | 0.023121 | 0.088632 | 0.100193 | 0.208092 | 0.208092 | 0.208092 | 0.208092 | 0.208092 | 0.208092 | 0 | 0.0316 | 0.390305 | 1,609 | 43 | 99 | 37.418605 | 0.497452 | 0 | 0 | 0.121951 | 0 | 0 | 0.00808 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
9662c573ff191e3785a992a2ce4d1b062e49045b | 1,178 | py | Python | tests/utils_tests.py | WhiteApfel/python-sdk | 66545c42759e5638b9a0b025da663d5f1f10bc3d | [
"MIT"
] | 1 | 2021-10-13T11:47:25.000Z | 2021-10-13T11:47:25.000Z | tests/utils_tests.py | WhiteApfel/python-sdk | 66545c42759e5638b9a0b025da663d5f1f10bc3d | [
"MIT"
] | null | null | null | tests/utils_tests.py | WhiteApfel/python-sdk | 66545c42759e5638b9a0b025da663d5f1f10bc3d | [
"MIT"
] | null | null | null | from __future__ import absolute_import, unicode_literals
from pyfondy import utils
from .tests_helper import TestCase
class UtilTest(TestCase):
def setUp(self):
self.data = self.get_dummy_data()
def test_to_xml(self):
xml = utils.to_xml(self.data['checkout_data'])
self.assertEqual(xml, '<?xml version="1.0" encoding="UTF-8"?><amount>100</amount><currency>USD</currency>')
def test_from_xml(self):
xml = utils.to_xml({'req': self.data['checkout_data']})
json = utils.from_xml(xml)
self.assertEqual(json, {'req': self.data['checkout_data']})
def test_to_form(self):
form = utils.to_form(self.data['checkout_data'])
self.assertEqual(form, 'amount=100¤cy=USD')
def test_from_from(self):
form = utils.to_form(self.data['checkout_data'])
json = utils.from_form(form)
self.assertEqual(json, self.data['checkout_data'])
def test_join_url(self):
joined_url = utils.join_url("checkout", "order")
self.assertEqual(joined_url, "checkout/order")
joined_url = utils.join_url("order", "/3ds")
self.assertEqual(joined_url, "order/3ds")
| 35.69697 | 115 | 0.666384 | 160 | 1,178 | 4.675 | 0.2625 | 0.074866 | 0.128342 | 0.160428 | 0.42246 | 0.358289 | 0.165775 | 0.104278 | 0.104278 | 0 | 0 | 0.011518 | 0.189304 | 1,178 | 32 | 116 | 36.8125 | 0.771728 | 0 | 0 | 0.08 | 0 | 0.04 | 0.198642 | 0.072156 | 0 | 0 | 0 | 0 | 0.24 | 1 | 0.24 | false | 0 | 0.12 | 0 | 0.4 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
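The `pyfondy.utils` helpers exercised by this suite are not part of the excerpt. Minimal stand-ins consistent with the form and URL assertions are sketched below; the bodies are assumptions (notably, this naive `from_form` returns string values rather than the original ints, unlike what the test above expects of the real library):

```python
def to_form(data):
    """Serialize a flat dict as application/x-www-form-urlencoded text."""
    return "&".join("{}={}".format(k, v) for k, v in data.items())

def from_form(form):
    """Naive inverse of to_form; all values come back as strings."""
    return dict(pair.split("=", 1) for pair in form.split("&"))

def join_url(*parts):
    """Join path segments with single slashes, trimming stray slashes."""
    return "/".join(part.strip("/") for part in parts)

checkout = {"amount": 100, "currency": "USD"}
print(to_form(checkout))              # amount=100&currency=USD
print(join_url("checkout", "order"))  # checkout/order
print(join_url("order", "/3ds"))      # order/3ds
```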
9664d8e999cd7dee5df06b6a6e3c21b141dec21b | 223 | py | Python | screens/HomeScreen.py | inuitwallet/bippy | fc21da4cdbe02cd9d7e73ff6e57e957d033b235e | [
"MIT"
] | 3 | 2015-03-05T08:24:04.000Z | 2015-11-05T11:32:17.000Z | screens/HomeScreen.py | inuitwallet/bippy | fc21da4cdbe02cd9d7e73ff6e57e957d033b235e | [
"MIT"
] | 2 | 2015-09-17T17:00:37.000Z | 2021-04-15T11:19:58.000Z | screens/HomeScreen.py | inuitwallet/bippy | fc21da4cdbe02cd9d7e73ff6e57e957d033b235e | [
"MIT"
] | 1 | 2015-09-17T15:47:13.000Z | 2015-09-17T15:47:13.000Z | from kivy.uix.screenmanager import Screen
class HomeScreen(Screen):
"""
The Welcome Screen
"""
def __init__(self, BippyApp, **kwargs):
super(HomeScreen, self).__init__(**kwargs)
self.BippyApp = BippyApp
return
| 18.583333 | 44 | 0.721973 | 26 | 223 | 5.884615 | 0.653846 | 0.156863 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.152466 | 223 | 11 | 45 | 20.272727 | 0.809524 | 0.080717 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0.166667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
9666365b172ca9c176be0c4e447a6acd73e37189 | 611 | py | Python | test/mynet.py | wnma3mz/flearn | df3c837bb164ec81736a3a64aaa85f574e7f67fa | [
"Apache-2.0"
] | 6 | 2021-11-11T15:09:28.000Z | 2022-03-16T02:15:06.000Z | test/mynet.py | wnma3mz/flearn | df3c837bb164ec81736a3a64aaa85f574e7f67fa | [
"Apache-2.0"
] | null | null | null | test/mynet.py | wnma3mz/flearn | df3c837bb164ec81736a3a64aaa85f574e7f67fa | [
"Apache-2.0"
] | null | null | null | import os
import torch
import torch.nn as nn
import torch.optim as optim
from flearn.client import net
class MyNet(net):
def __init__(self, model_fpath, init_model_name):
super(MyNet, self).__init__(model_fpath, init_model_name)
self.criterion = nn.CrossEntropyLoss()
def get(self):
seq = False
# net_local = MLP(28 * 28, 10) # mnist
net_local = MLP(3 * 224 * 224, 2)  # covid2019; MLP is assumed to be defined elsewhere in the project (not imported in this excerpt)
torch.save(net_local.state_dict(), self.init_model_name)
self.optimizer = optim.SGD(net_local.parameters(), lr=1e-3, momentum=0.9)
return net_local, seq
| 27.772727 | 81 | 0.666121 | 90 | 611 | 4.277778 | 0.5 | 0.103896 | 0.101299 | 0.098701 | 0.119481 | 0 | 0 | 0 | 0 | 0 | 0 | 0.046709 | 0.229133 | 611 | 21 | 82 | 29.095238 | 0.770701 | 0.07365 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.133333 | false | 0 | 0.333333 | 0 | 0.6 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
967008edf71e27cbfe4cf0591a9f93b881200c53 | 308 | py | Python | tensorbay/opendataset/Flower/__init__.py | machearn/tensorbay-python-sdk | 5c96a5f4c0028c7bec0764f2d0142b29597ec3a9 | [
"MIT"
] | 73 | 2021-02-24T12:23:26.000Z | 2022-03-12T13:00:31.000Z | tensorbay/opendataset/Flower/__init__.py | machearn/tensorbay-python-sdk | 5c96a5f4c0028c7bec0764f2d0142b29597ec3a9 | [
"MIT"
] | 681 | 2021-02-25T07:34:17.000Z | 2022-03-25T07:08:23.000Z | tensorbay/opendataset/Flower/__init__.py | machearn/tensorbay-python-sdk | 5c96a5f4c0028c7bec0764f2d0142b29597ec3a9 | [
"MIT"
] | 35 | 2021-02-24T12:00:45.000Z | 2022-03-30T06:43:13.000Z | #!/usr/bin/env python3
#
# Copyright 2021 Graviti. Licensed under MIT License.
#
# pylint: disable=invalid-name
"""Dataloaders of the 17 Category Flower dataset and the 102 Category Flower dataset."""
from tensorbay.opendataset.Flower.loader import Flower17, Flower102
__all__ = ["Flower17", "Flower102"]
| 25.666667 | 88 | 0.762987 | 39 | 308 | 5.923077 | 0.820513 | 0.121212 | 0.181818 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.074627 | 0.12987 | 308 | 11 | 89 | 28 | 0.787313 | 0.600649 | 0 | 0 | 0 | 0 | 0.150442 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
9674136b297de5ffdecd52581b0046e7c24ac355 | 1,290 | py | Python | notion.py | takez0o/neustrukt-web | dbce557ae401759efba2f47b9b3fe9e01ef69000 | [
"MIT"
] | null | null | null | notion.py | takez0o/neustrukt-web | dbce557ae401759efba2f47b9b3fe9e01ef69000 | [
"MIT"
] | null | null | null | notion.py | takez0o/neustrukt-web | dbce557ae401759efba2f47b9b3fe9e01ef69000 | [
"MIT"
] | null | null | null |
## notion stuff
# notion_token = os.environ["NOTION_TOKEN"]
# sub_list_url = os.environ["NOTION_SUBS_PAGE"]
# messages_url = os.environ["NOTION_MESSAGES_PAGE"]
# app_list_url = os.environ["NOTION_APPLICATIONS_PAGE"]
# client = NotionClient(token_v2=notion_token)
# collection_view = client.get_collection_view(sub_list_url)
# messages_view = client.get_collection_view(messages_url)
# applications_view = client.get_collection_view(app_list_url)
# def updateNotion(id,email):
# new_row = collection_view.collection.add_row()
# new_row.id = str(id)
# new_row.email = str(email)
# def updateNotionMessages(id, firstname, lastname, email, message):
# new_row = messages_view.collection.add_row()
# new_row.id = str(id)
# new_row.firstname = str(firstname)
# new_row.lastname = str(lastname)
# new_row.email = str(email)
# new_row.message = str(message)
# def updateNotionApplications(id, firstname, lastname, email, profession, message, cv_url):
# new_row = applications_view.collection.add_row()
# new_row.id = str(id)
# new_row.firstname = str(firstname)
# new_row.lastname = str(lastname)
# new_row.email = str(email)
# new_row.profession = str(profession)
# new_row.message = str(message)
# new_row.cv = str(cv_url) | 36.857143 | 92 | 0.724031 | 176 | 1,290 | 5 | 0.193182 | 0.115909 | 0.068182 | 0.061364 | 0.496591 | 0.294318 | 0.294318 | 0.294318 | 0.294318 | 0.294318 | 0 | 0.000913 | 0.151163 | 1,290 | 35 | 93 | 36.857143 | 0.80274 | 0.949612 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |