hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
c29de757d5cc39eac04944228c8bcefe9b5042da | 544 | py | Python | net/dataset/build.py | sdjsngs/Cross-Epoch-Learning-for-Weakly-Supervised-Anomaly-Detection-in-Surveillance-Videos | f734db8d440f2974cb6b4234b30da6856ef62ce3 | [
"MIT"
] | 3 | 2021-07-30T04:45:08.000Z | 2022-02-23T12:44:16.000Z | net/dataset/build.py | sdjsngs/Cross-Epoch-Learning-for-Weakly-Supervised-Anomaly-Detection-in-Surveillance-Videos | f734db8d440f2974cb6b4234b30da6856ef62ce3 | [
"MIT"
] | null | null | null | net/dataset/build.py | sdjsngs/Cross-Epoch-Learning-for-Weakly-Supervised-Anomaly-Detection-in-Surveillance-Videos | f734db8d440f2974cb6b4234b30da6856ef62ce3 | [
"MIT"
] | 3 | 2021-07-30T09:26:45.000Z | 2022-03-16T15:31:41.000Z | """
dataset construction function
"""
import torch
import torch.nn as nn
from fvcore.common.registry import Registry
DATASET_REGISTRY = Registry("DATASET")


def build_dataset(dataset_name, mode, cfg):
    """
    :param dataset_name: name of a registered dataset, e.g. avenue
    :param mode: train / test
    :param cfg: config node passed through to the dataset constructor
    :return: the dataset instance looked up in DATASET_REGISTRY
    """
    return DATASET_REGISTRY.get(dataset_name)(mode, cfg)


if __name__ == "__main__":
    print("dataset register")
| 19.428571 | 55 | 0.698529 | 66 | 544 | 5.439394 | 0.484848 | 0.122563 | 0.083565 | 0.100279 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.185662 | 544 | 27 | 56 | 20.148148 | 0.810384 | 0.380515 | 0 | 0 | 0 | 0 | 0.103679 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0 | 0.375 | 0 | 0.625 | 0.125 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
c2d0b298d3d676435e879f7ed92bd0b9efbaf94d | 942 | py | Python | skip/migrations/0012_auto_20210430_1827.py | LCOGT/skip | 2524ba71c39876aae8a31fff3de55e6cb7aa1f83 | [
"BSD-3-Clause"
] | null | null | null | skip/migrations/0012_auto_20210430_1827.py | LCOGT/skip | 2524ba71c39876aae8a31fff3de55e6cb7aa1f83 | [
"BSD-3-Clause"
] | 4 | 2020-09-10T20:31:54.000Z | 2022-02-27T18:40:23.000Z | skip/migrations/0012_auto_20210430_1827.py | scimma/skip | aa9437d8c4f7d5edbffaec20e6651339241bbb95 | [
"BSD-3-Clause"
] | null | null | null | # Generated by Django 3.1 on 2021-04-30 18:27
from django.db import migrations, models
class Migration(migrations.Migration):
    dependencies = [
        ('skip', '0011_auto_20210430_1746'),
    ]

    operations = [
        migrations.RemoveIndex(
            model_name='alert',
            name='alert_timestamp_idx',
        ),
        migrations.RenameField(
            model_name='alert',
            old_name='alert_identifier',
            new_name='identifier',
        ),
        migrations.RenameField(
            model_name='alert',
            old_name='alert_timestamp',
            new_name='timestamp',
        ),
        migrations.RenameField(
            model_name='event',
            old_name='event_identifier',
            new_name='identifier',
        ),
        migrations.AddIndex(
            model_name='alert',
            index=models.Index(fields=['timestamp'], name='timestamp_idx'),
        ),
    ]
| 25.459459 | 75 | 0.553079 | 86 | 942 | 5.825581 | 0.453488 | 0.125749 | 0.111776 | 0.179641 | 0.315369 | 0.187625 | 0.187625 | 0.187625 | 0 | 0 | 0 | 0.047771 | 0.333333 | 942 | 36 | 76 | 26.166667 | 0.75 | 0.045648 | 0 | 0.466667 | 1 | 0 | 0.188406 | 0.025641 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.033333 | 0 | 0.133333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
c2f471f5a2e273595e92144d174b5fef7a299b8d | 269 | py | Python | cvm/commands/command.py | composer-version-manager/cvm | 291137c8717fd63d869557667d012e25d1d66142 | [
"MIT"
] | 1 | 2021-07-13T03:24:52.000Z | 2021-07-13T03:24:52.000Z | cvm/commands/command.py | composer-version-manager/cvm | 291137c8717fd63d869557667d012e25d1d66142 | [
"MIT"
] | 1 | 2021-03-02T19:27:16.000Z | 2021-03-02T19:47:38.000Z | cvm/commands/command.py | composer-version-manager/cvm | 291137c8717fd63d869557667d012e25d1d66142 | [
"MIT"
] | 1 | 2021-11-13T11:18:40.000Z | 2021-11-13T11:18:40.000Z | from abc import ABC, abstractmethod
from argparse import Action, Namespace
class Command(ABC):
    @abstractmethod
    def exec(self, args: Namespace) -> None:
        pass

    @staticmethod
    @abstractmethod
    def define_signature(parser: Action):
        pass
| 19.214286 | 44 | 0.687732 | 29 | 269 | 6.344828 | 0.655172 | 0.184783 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.241636 | 269 | 13 | 45 | 20.692308 | 0.901961 | 0 | 0 | 0.4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0.2 | 0.2 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
c2f85d8551a7acce9dc52ba83afcde3d15a86ced | 132 | py | Python | CursoEmVideo/ex049.py | ElivanLimaJunior/Python | 57c277f3ec0da06d6c8aa125b50d01a5ab88934d | [
"MIT"
] | null | null | null | CursoEmVideo/ex049.py | ElivanLimaJunior/Python | 57c277f3ec0da06d6c8aa125b50d01a5ab88934d | [
"MIT"
] | null | null | null | CursoEmVideo/ex049.py | ElivanLimaJunior/Python | 57c277f3ec0da06d6c8aa125b50d01a5ab88934d | [
"MIT"
] | null | null | null | n1 = int(input('Digite um número para exibir sua tabuada: '))
for c in range(1, 10 + 1):
    print('{} x {} = {}'.format(n1, c, n1 * c)) | 44 | 61 | 0.583333 | 24 | 132 | 3.208333 | 0.791667 | 0.077922 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.065421 | 0.189394 | 132 | 3 | 62 | 44 | 0.654206 | 0 | 0 | 0 | 0 | 0 | 0.406015 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
6c12a9a7ebf97edad3b76c53dc36c8709acb3e03 | 3,066 | py | Python | build/lib/npd_wraper/npd_fields.py | miroine/npd_data | a7c95ef97087ead01557b6f6fd28df028ffdf349 | [
"MIT"
] | 6 | 2020-09-24T07:10:20.000Z | 2022-02-16T13:49:11.000Z | build/lib/npd_wraper/npd_fields.py | AnneEstoppey/npd_data | a7c95ef97087ead01557b6f6fd28df028ffdf349 | [
"MIT"
] | null | null | null | build/lib/npd_wraper/npd_fields.py | AnneEstoppey/npd_data | a7c95ef97087ead01557b6f6fd28df028ffdf349 | [
"MIT"
] | 2 | 2020-09-24T07:11:34.000Z | 2020-10-08T11:39:03.000Z | from .npd_wraper import npd
from datetime import datetime
import pandas as pd
class field(npd):
    def get_field_production_monthly(self):
        '''
        get monthly production
        '''
        url_dataset = self.npd_path + "field/production-monthly-by-field"
        df = self._get_dataframe_data(url_dataset)
        df["Date"] = df.apply(lambda row: datetime(int(row['prfYear']), int(row['prfMonth']), 1), axis=1)
        df["Date"] = pd.to_datetime(df.Date)
        df.set_index("Date", inplace=True)
        cols = ["prfInformationCarrier", "prfPrdOilNetMillSm3", "prfPrdGasNetBillSm3",
                "prfPrdNGLNetMillSm3", "prfPrdCondensateNetMillSm3", "prfPrdOeNetMillSm3",
                "prfPrdProducedWaterInFieldMillSm3"]
        return df[cols]

    def get_field_production_yearly(self):
        '''
        return: production yearly data
        '''
        url_dataset = self.npd_path + "field/production-yearly-by-field"
        return self._get_dataframe_data(url_dataset).set_index('prfYear')

    def get_field_cumulative_production(self):
        '''
        return: cumulative production
        '''
        url_dataset = self.npd_path + "field/production-yearly-total"
        return self._get_dataframe_data(url_dataset).set_index('prfYear')

    def get_field_description(self):
        '''
        return: field description
        '''
        url_dataset = self.npd_path + "field/description"
        return self._get_dataframe_data(url_dataset)

    def get_field_inplace_volume(self):
        '''
        return: get field in place volume
        '''
        url_dataset = self.npd_path + "field/in-place-volumes"
        return self._get_dataframe_data(url_dataset)

    def get_field_licenses(self):
        '''
        return field licensees
        '''
        url_dataset = self.npd_path + "field/licensees"
        return self._get_dataframe_data(url_dataset)

    def get_field_operators(self):
        '''
        return field operators
        '''
        url_dataset = self.npd_path + "field/operators"
        return self._get_dataframe_data(url_dataset)

    def get_field_overview(self):
        '''
        return: field overview
        '''
        url_dataset = self.npd_path + "field/overview"
        return self._get_dataframe_data(url_dataset)

    def get_field_owners(self):
        '''
        return field owners
        '''
        url_dataset = self.npd_path + "field/owners"
        return self._get_dataframe_data(url_dataset)

    def get_field_reserves(self):
        '''
        return field reserves
        '''
        url_dataset = self.npd_path + "field/reserves"
        return self._get_dataframe_data(url_dataset)

    def get_field_status(self):
        '''
        return field status
        '''
        url_dataset = self.npd_path + "field/status"
        return self._get_dataframe_data(url_dataset)

    def get_field_investments(self):
        '''
        get field investment yearly
        '''
        url_dataset = self.npd_path + "investments/yearly-by-field"
        return self._get_dataframe_data(url_dataset)
| 29.76699 | 103 | 0.640248 | 347 | 3,066 | 5.354467 | 0.178674 | 0.129171 | 0.097955 | 0.109795 | 0.513994 | 0.502691 | 0.374596 | 0.374596 | 0.302476 | 0.302476 | 0 | 0.00351 | 0.256686 | 3,066 | 102 | 104 | 30.058824 | 0.81176 | 0.0985 | 0 | 0.234043 | 0 | 0 | 0.176329 | 0.089775 | 0 | 0 | 0 | 0 | 0 | 1 | 0.255319 | false | 0 | 0.06383 | 0 | 0.595745 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
6c13935ba2ad975f222b50a4f1921d884ba2c0bb | 684 | py | Python | tests/test_strblackout.py | jojoee/strblackout | d813975afe157fe245aab26a50d3935e8e70e6fa | [
"MIT"
] | 2 | 2021-09-14T12:55:30.000Z | 2021-11-08T10:15:54.000Z | tests/test_strblackout.py | jojoee/strblackout | d813975afe157fe245aab26a50d3935e8e70e6fa | [
"MIT"
] | null | null | null | tests/test_strblackout.py | jojoee/strblackout | d813975afe157fe245aab26a50d3935e8e70e6fa | [
"MIT"
] | null | null | null | import unittest
from strblackout import blackout
class TestBlackout(unittest.TestCase):
    def test_blackout_default(self):
        self.assertEqual(blackout("123456789"), "123456789")

    def test_blackout_left(self):
        self.assertEqual(blackout("123456789", left=5), "*****6789")

    def test_blackout_right(self):
        self.assertEqual(blackout("123456789", right=3), "123456***")

    def test_blackout_replacement(self):
        self.assertEqual(blackout("123456789", left=3, replacement="x"), "xxx456789")

    def test_blackout_short_text(self):
        self.assertEqual(blackout("123", left=10, right=20), "***")


if __name__ == "__main__":
    unittest.main()
| 28.5 | 85 | 0.688596 | 77 | 684 | 5.87013 | 0.402597 | 0.077434 | 0.165929 | 0.298673 | 0.336283 | 0.176991 | 0 | 0 | 0 | 0 | 0 | 0.123909 | 0.162281 | 684 | 23 | 86 | 29.73913 | 0.664921 | 0 | 0 | 0 | 0 | 0 | 0.127193 | 0 | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0.333333 | false | 0 | 0.133333 | 0 | 0.533333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
6c22b37b04b34eb5570518d6fac6b8f5e3e85790 | 229 | py | Python | python/ql/test/3/library-tests/PointsTo/regressions/subprocess-assert/mwe_failure.py | vadi2/codeql | a806a4f08696d241ab295a286999251b56a6860c | [
"MIT"
] | 4,036 | 2020-04-29T00:09:57.000Z | 2022-03-31T14:16:38.000Z | python/ql/test/3/library-tests/PointsTo/regressions/subprocess-assert/mwe_failure.py | vadi2/codeql | a806a4f08696d241ab295a286999251b56a6860c | [
"MIT"
] | 2,970 | 2020-04-28T17:24:18.000Z | 2022-03-31T22:40:46.000Z | python/ql/test/3/library-tests/PointsTo/regressions/subprocess-assert/mwe_failure.py | ScriptBox99/github-codeql | 2ecf0d3264db8fb4904b2056964da469372a235c | [
"MIT"
] | 794 | 2020-04-29T00:28:25.000Z | 2022-03-30T08:21:46.000Z | import subprocess
assert subprocess.call(['run-backup']) == 0
class TestCase:
    pass


class MyTest(TestCase):
    pass
# found by /home/rasmus/code/ql/python/ql/test/query-tests/Statements/asserts/AssertLiteralConstant.qlref
| 20.818182 | 105 | 0.759825 | 30 | 229 | 5.8 | 0.833333 | 0.137931 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004951 | 0.117904 | 229 | 10 | 106 | 22.9 | 0.856436 | 0.449782 | 0 | 0.333333 | 0 | 0 | 0.080645 | 0 | 0 | 0 | 0 | 0 | 0.166667 | 1 | 0 | true | 0.333333 | 0.166667 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
6c2e6348457187107524c78bcb3f263434b05deb | 223 | py | Python | tests/testing-populational.py | D3Mlab/ppandas | 98a35f4fe6f43e865870f9af3a83ba08e894db53 | [
"MIT"
] | 1 | 2021-05-04T01:26:27.000Z | 2021-05-04T01:26:27.000Z | tests/testing-populational.py | D3Mlab/ppandas | 98a35f4fe6f43e865870f9af3a83ba08e894db53 | [
"MIT"
] | null | null | null | tests/testing-populational.py | D3Mlab/ppandas | 98a35f4fe6f43e865870f9af3a83ba08e894db53 | [
"MIT"
] | null | null | null | import pandas as pd
from ppandas import PDataFrame
df1 = pd.read_csv("testing/populational1.csv")
df1 = df1.drop(columns=["Gender"])
pd1 = PDataFrame.from_populational_data(["Age"], df1, 600)
pd1.visualise(show_tables=True) | 27.875 | 56 | 0.775785 | 33 | 223 | 5.121212 | 0.727273 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.04902 | 0.085202 | 223 | 8 | 57 | 27.875 | 0.779412 | 0 | 0 | 0 | 0 | 0 | 0.151786 | 0.111607 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
6c30e5e73fec25e8740dfeaae58851d357ac9539 | 563 | py | Python | corehq/apps/userreports/ui/widgets.py | kkrampa/commcare-hq | d64d7cad98b240325ad669ccc7effb07721b4d44 | [
"BSD-3-Clause"
] | 1 | 2020-05-05T13:10:01.000Z | 2020-05-05T13:10:01.000Z | corehq/apps/userreports/ui/widgets.py | kkrampa/commcare-hq | d64d7cad98b240325ad669ccc7effb07721b4d44 | [
"BSD-3-Clause"
] | 1 | 2019-12-09T14:00:14.000Z | 2019-12-09T14:00:14.000Z | corehq/apps/userreports/ui/widgets.py | MaciejChoromanski/commcare-hq | fd7f65362d56d73b75a2c20d2afeabbc70876867 | [
"BSD-3-Clause"
] | 5 | 2015-11-30T13:12:45.000Z | 2019-07-01T19:27:07.000Z | from __future__ import absolute_import
import json
from django import forms
import six
from corehq.util.python_compatibility import soft_assert_type_text
class JsonWidget(forms.Textarea):
    def render(self, name, value, attrs=None, renderer=None):
        if isinstance(value, six.string_types):
            soft_assert_type_text(value)
            # It's probably invalid JSON
            return super(JsonWidget, self).render(name, value, attrs, renderer)
        return super(JsonWidget, self).render(name, json.dumps(value, indent=2), attrs, renderer)
| 31.277778 | 97 | 0.726465 | 74 | 563 | 5.351351 | 0.540541 | 0.050505 | 0.070707 | 0.090909 | 0.176768 | 0.176768 | 0 | 0 | 0 | 0 | 0 | 0.002193 | 0.190053 | 563 | 17 | 98 | 33.117647 | 0.866228 | 0.046181 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.181818 | 1 | 0.090909 | false | 0 | 0.454545 | 0 | 0.818182 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
6c38ed5e4a72c7ba22df2b1c46329e925986882d | 333 | py | Python | agent/utils.py | JIElite/RL-gridworld | dd8a84b20f0bb19b4423ceeb5738155c86a52f4a | [
"MIT"
] | 3 | 2018-01-01T18:03:53.000Z | 2019-05-14T09:24:05.000Z | agent/utils.py | JIElite/RL-gridworld | dd8a84b20f0bb19b4423ceeb5738155c86a52f4a | [
"MIT"
] | null | null | null | agent/utils.py | JIElite/RL-gridworld | dd8a84b20f0bb19b4423ceeb5738155c86a52f4a | [
"MIT"
] | null | null | null |
def soft_update_network(target, source, tau):
    # Polyak averaging: target <- (1 - tau) * target + tau * source
    for target_param, source_param in zip(target.parameters(), source.parameters()):
        target_param.data.copy_(
            target_param.data * (1 - tau) + source_param.data * tau
        )


def hard_update_network(target, source):
    target.load_state_dict(source.state_dict()) | 33.3 | 84 | 0.696697 | 44 | 333 | 4.977273 | 0.431818 | 0.150685 | 0.173516 | 0.228311 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003704 | 0.189189 | 333 | 10 | 85 | 33.3 | 0.807407 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.285714 | false | 0 | 0 | 0 | 0.285714 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
6c61014d8c361599717d4da618b54d5d7faa501c | 4,496 | py | Python | mywork_python/prohl.py | DongZhizhen/Rotor-Dynamic | fdcbea02b81af42eaddfac73a36d0a014d71a981 | [
"MIT"
] | null | null | null | mywork_python/prohl.py | DongZhizhen/Rotor-Dynamic | fdcbea02b81af42eaddfac73a36d0a014d71a981 | [
"MIT"
] | null | null | null | mywork_python/prohl.py | DongZhizhen/Rotor-Dynamic | fdcbea02b81af42eaddfac73a36d0a014d71a981 | [
"MIT"
] | null | null | null | """
prohl.py
Description:
This program is used to calculate the rotor mode, based on the transfer matrix method.
The following code shows the calculation method of Prohl.
All of the code was written by Zhizhen Dong in 2022
"""
import math
import matplotlib.pyplot as plt
import numpy as np
from pylab import mpl
from tqdm import tqdm
mpl.rcParams['font.sans-serif'] = ['SimHei']
mpl.rcParams['axes.unicode_minus'] = False
l1 = 0.12
d = 0.04
A = math.pi * d * d / 4
D = 0.5  # disk diameter
h = 0.025  # disk thickness
a = 1
u = 0.3
rou = 7800  # density
E = 2.0e11  # elastic modulus
G = E / (2 * (1 + u))  # shear modulus
I = math.pi * (d ** 4) / 64  # second moment of area of the shaft
K1 = 2.0e7  # spring stiffness
v1 = 6 * E * I / (a * G * A * l1 * l1)
mi = rou * math.pi * D ** 2 / 4
Jp = mi * D ** 2 / 8
Jd = Jp / 2
Ji = Jp - Jd
L = np.array([l1, l1, l1, l1, l1, l1, l1, l1, l1, l1, l1, l1, l1, 0, 0])
M = np.array([0, mi, mi, mi, mi, mi, mi, 0, 0, 0, 0, 0, mi, mi, 0])
K = np.array([K1, 0, 0, 0, 0, 0, 0, K1, 0, 0, 0, K1, 0, 0, 0])
v = np.array([v1, v1, v1, v1, v1, v1, v1, v1, v1, v1, v1, v1, v1, 0, 0])
J = np.array([0, Ji, Ji, Ji, Ji, Ji, Ji, 0, 0, 0, 0, 0, Ji, Ji, 0])
k = 0
T = np.zeros((len(L), 4, 4))
Z = np.mat(np.zeros((4, 4)))
H = np.mat(np.zeros((4, 4)))
Tit = ['First mode', 'Second mode', 'Third mode']
wi = []
ni = []
def prohl_T(w):
    for i in range(15):
        T[i, :, :] = np.mat([[1 + (L[i] ** 3) * (1 - v[i]) * (M[i] * w ** 2 - K[i]) / (6 * E * I),
                              L[i] + L[i] ** 2 * J[i] * w ** 2 / (2 * E * I), L[i] ** 2 / (2 * E * I),
                              L[i] ** 3 * (1 - v[i]) / (6 * E * I)],
                             [(L[i] ** 2) * (M[i] * w ** 2 - K[i]) / (2 * E * I), 1 + L[i] * J[i] * w ** 2 / (E * I),
                              L[i] / (E * I), L[i] ** 2 / (2 * E * I)],
                             [L[i] * (M[i] * w ** 2 - K[i]), J[i] * w ** 2, 1.0, L[i]],
                             [M[i] * w ** 2 - K[i], 0.0, 0.0, 1.0]])
    H = np.mat(T[0, :, :])
    for i in range(1, 15):
        H = np.mat(T[i, :, :]) * H
    return H
for w in tqdm(np.arange(0, 4000 + 0.01, 0.01)):
    Z = prohl_T(w)
    F = Z[2, 0] * Z[3, 1] - Z[2, 1] * Z[3, 0]
    if F * (-1) ** k < 0:  # sign change of the frequency determinant: critical speed found
        wi.append(w)
        w = wi[k]
        ni.append(wi[k] * 30 / math.pi)
        k += 1
x = np.zeros(len(L))
y = np.zeros(len(L))
z = np.zeros(len(L))
if len(Tit) > k:
    tk = k
else:
    tk = len(Tit)
for i in range(tk):
    w = wi[i]
    for j in range(14):
        T[j, :, :] = np.mat([[1 + (L[j] ** 3) * (1 - v[j]) * (M[j] * w ** 2 - K[j]) / (6 * E * I),
                              L[j] + L[j] ** 2 * J[j] * w ** 2 / (2 * E * I), L[j] ** 2 / (2 * E * I),
                              L[j] ** 3 * (1 - v[j]) / (6 * E * I)],
                             [(L[j] ** 2) * (M[j] * w ** 2 - K[j]) / (2 * E * I), 1 + L[j] * J[j] * w ** 2 / (E * I),
                              L[j] / (E * I), L[j] ** 2 / (2 * E * I)],
                             [L[j] * (M[j] * w ** 2 - K[j]), J[j] * w ** 2, 1, L[j]],
                             [M[j] * w ** 2 - K[j], 0, 0, 1]])
    Z = np.mat(T[0, :, :])
    for j in range(1, 15):
        Z = np.mat(T[j, :, :]) * Z
    b = -Z[3, 0] / Z[3, 1]
    X = np.mat([[1], [b], [0], [0]])
    for n in range(1, 15):
        X = np.c_[X, np.mat(T[n - 1, :, :]) * np.mat(X[:, X.shape[1] - 1])]
    for j in range(15):
        y[j] = X[1, j]
        z[j] = X[3, j]
        x[j] = (j - 1) * l1
    # y[15] = X[1, 15]
    # x[15] = 1.56
    # z[15] = X[3, 15]
    y = y / max(abs(y))
    z = z / max(abs(z))
    plt.subplot(len(Tit), 1, i + 1)
    plt.title(Tit[i] + ' ' + str(round(ni[i], 2)) + 'rpm')
    plt.plot(x, y, 'b-')
    plt.plot(x, z, 'r:')
    plt.xlabel('Shaft length / mm')
    plt.ylabel('Unbalance / kg')
    plt.axis([0, 1.6, -1.2, 1.2])
    plt.legend(['Mode shape', 'Bending moment'])
    plt.grid()
plt.tight_layout()
plt.show()
# T11 = 1 + (L[i] ** 3) * (1 - v[i]) * (M[i] * w ** 2 - K[i]) / (6 * E * I)
# T12 = L[i] + L[i] ** 2 * J[i] * w ** 2 / (2 * E * I)
# T13 = L[i] ** 2 / (2 * E * I)
# T14 = L[i] ** 3 * (1 - v[i]) / (6 * E * I)
# T21 = (L[i] ** 2) * (M[i] * w ** 2 - K[i]) / (2 * E * I)
# T22 = 1 + L[i] * J[i] * w ** 2 / (E * I)
# T23 = L[i] / (E * I)
# T24 = L[i] ** 2 / (2 * E * I)
# T31 = L[i] * (M[i] * w ** 2 - K[i])
# T32 = J[i] * w ** 2
# T33 = 1.0
# T34 = L[i]
# T41 = M[i] * w ** 2 - K[i]
# T42 = 0.0
# T43 = 0.0
# T44 = 1.0
| 29.194805 | 118 | 0.356762 | 856 | 4,496 | 1.867991 | 0.19743 | 0.03252 | 0.028143 | 0.050031 | 0.260163 | 0.208881 | 0.162602 | 0.150094 | 0.12758 | 0.106316 | 0 | 0.114099 | 0.3879 | 4,496 | 153 | 119 | 29.385621 | 0.466933 | 0.173488 | 0 | 0 | 0 | 0 | 0.023375 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.010204 | false | 0 | 0.05102 | 0 | 0.071429 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
6c6860f2dbf59c422ea67e8ef4a3c8ba30f58e95 | 1,555 | py | Python | EmailEncrypt.py | brandonskerritt51/Everything | c77141309f48d7cf4791bd73c574a8985d86cdc9 | [
"MIT"
] | 3 | 2020-12-26T18:54:12.000Z | 2021-12-22T16:10:01.000Z | EmailEncrypt.py | brandonskerritt51/Everything | c77141309f48d7cf4791bd73c574a8985d86cdc9 | [
"MIT"
] | null | null | null | EmailEncrypt.py | brandonskerritt51/Everything | c77141309f48d7cf4791bd73c574a8985d86cdc9 | [
"MIT"
] | 1 | 2020-02-28T10:58:11.000Z | 2020-02-28T10:58:11.000Z | """This program was created on 12/04/2015
It takes a user-inputted string,
encrypts it with the Transposition Cipher,
and emails it to the user's chosen recipient
https://www.facebook.com/AiiYourBaseRBel0ngToUs
"""
# SECURITY NOTICE
# THE EMAIL SENDS THE KEY NUMBER
# GET RID OF "myKey" in msg under main()
# to fix this
import sys
import smtplib
import random
import time
import transpositionEncrypt
def main():
    message = str(input("enter message here "))
    # recipient address (the original called Email(message) without this argument)
    toaddrs = str(input("enter recipient address here "))
    Email(message, toaddrs)


def Email(msg, toaddrs):
    # starts a timer
    startTime = time.time()
    # Encrypts message with a random key
    # TODO use Affine cipher as more secure
    myKey = random.randint(1, 26)
    msg = transpositionEncrypt.encryptMessage(myKey, msg)
    # Email credentials & the message with KEY
    fromaddr = ''
    print("\nThis may take a few seconds")
    username = ''
    password = ''
    KeySTR = str(myKey)
    msg = "The key is " + KeySTR + "\n\n" + msg
    # The actual mail send
    server = smtplib.SMTP('smtp.gmail.com:587')
    server.starttls()
    server.login(username, password)
    server.sendmail(fromaddr, toaddrs, msg)
    server.quit()
    # stops timer and prints time
    totalTime = round(time.time() - startTime, 2)
    print(totalTime)
    # closes program, see close()
    close()


def close():
    print("\nProgram is now exiting\n")
    sys.exit()


if __name__ == '__main__':
    main()
| 23.560606 | 58 | 0.641158 | 203 | 1,555 | 4.871921 | 0.581281 | 0.012133 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.013925 | 0.261093 | 1,555 | 65 | 59 | 23.923077 | 0.846823 | 0.367846 | 0 | 0 | 0 | 0 | 0.12792 | 0 | 0 | 0 | 0 | 0.015385 | 0 | 1 | 0.096774 | false | 0.064516 | 0.16129 | 0 | 0.258065 | 0.096774 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
6c68c459072f855c93d05b358334984f6e2e1337 | 283 | py | Python | server/server.py | krabo0om/flashcards | 2ceff567c14e1021949b3bb3d459623e8ba642f5 | [
"MIT"
] | 1 | 2019-05-10T22:31:27.000Z | 2019-05-10T22:31:27.000Z | server/server.py | krabo0om/flashcards | 2ceff567c14e1021949b3bb3d459623e8ba642f5 | [
"MIT"
] | null | null | null | server/server.py | krabo0om/flashcards | 2ceff567c14e1021949b3bb3d459623e8ba642f5 | [
"MIT"
] | null | null | null | import http
__author__ = 'pgenssler'
class FlashcardsServer(http.server.HTTPServer):
    def __init__(self, server_address, req_handler, cardhandler, config):
        super().__init__(server_address, req_handler)
        self.config = config
        self.cardhandler = cardhandler
| 28.3 | 73 | 0.731449 | 30 | 283 | 6.366667 | 0.566667 | 0.136126 | 0.167539 | 0.240838 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.180212 | 283 | 9 | 74 | 31.444444 | 0.823276 | 0 | 0 | 0 | 0 | 0 | 0.031802 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0.142857 | 0 | 0.428571 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
6c7f2c7d5399eda5ba67cbc95bb33b59fd1a49e3 | 1,218 | py | Python | field_calculator/strip_concatenate.py | Dan-Patterson/Tools_for_ArcGIS_Pro | b5c253d59d57bd1abe7e2433a77aed7d3ea22567 | [
"Info-ZIP"
] | 23 | 2020-05-15T18:40:25.000Z | 2022-03-31T08:44:39.000Z | field_calculator/strip_concatenate.py | Dan-Patterson/Tools_for_ArcGIS_Pro | b5c253d59d57bd1abe7e2433a77aed7d3ea22567 | [
"Info-ZIP"
] | 1 | 2021-12-14T16:47:00.000Z | 2021-12-15T03:06:26.000Z | field_calculator/strip_concatenate.py | Dan-Patterson/Tools_for_ArcGIS_Pro | b5c253d59d57bd1abe7e2433a77aed7d3ea22567 | [
"Info-ZIP"
] | 3 | 2021-08-09T05:42:19.000Z | 2022-03-31T08:44:59.000Z | d e f s t r i p _ c o n c a t e n a t e ( i n _ f l d s , s t r i p _ l i s t = [ " " , " , " , N o n e ] ) :
" " " P r o v i d e t h e f i e l d s a s a l i s t i e [ a , b , c ] t o s t r i p s p a c e s
: a n d r e m o v e n u l l s
: u s e : p y t h o n p a r s e r
: s y n t a x : s t r i p _ s t u f f ( ' ! a ! , ! b ! , ! c ! ] ) a s s u m e d f i e l d n a m e s
" " "
f i x e d = [ ]
f m t = [ ]
f o r i i n i n _ f l d s :
i f i n o t i n s t r i p _ l i s t :
f i x e d . a p p e n d ( i )
f m t . a p p e n d ( " { } " )
f r m t = " " . j o i n ( [ f f o r f i n f m t ] )
f r m t . s t r i p ( )
f l d s = [ s t r ( i ) . s t r i p ( ) f o r i i n f i x e d ]
r e s u l t = f r m t . f o r m a t ( * f i x e d )
r e t u r n r e s u l t
_ _ e s r i _ f i e l d _ c a l c u l a t o r _ s p l i t t e r _ _
s t r i p _ c o n c a t e n a t e ( ) | 64.105263 | 131 | 0.281609 | 83 | 1,218 | 3.963855 | 0.542169 | 0.097264 | 0.018237 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.148604 | 1,218 | 19 | 132 | 64.105263 | 0.317261 | 0 | 0 | 0 | 0 | 0.108108 | 0.013126 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
6c805d773912557a84a5fb4b56a7374fc3f32be3 | 1,552 | py | Python | api/models.py | Mutugiii/Awards | 92d97c3212588963c768449e57932ce5c0929904 | [
"MIT"
] | null | null | null | api/models.py | Mutugiii/Awards | 92d97c3212588963c768449e57932ce5c0929904 | [
"MIT"
] | 3 | 2021-03-30T13:03:03.000Z | 2021-06-04T22:48:29.000Z | api/models.py | Mutugiii/Awards | 92d97c3212588963c768449e57932ce5c0929904 | [
"MIT"
] | null | null | null | from django.db import models
from django.contrib.auth.models import User
from cloudinary.models import CloudinaryField
from django.dispatch import receiver
from django.db.models.signals import post_save
class Project(models.Model):
    '''Model class for Projects that the user posts'''
user = models.ForeignKey(User, on_delete=models.CASCADE, related_name='project')
title = models.CharField(max_length=50)
project_image = CloudinaryField('image')
description = models.TextField()
live_link = models.CharField(max_length=100)
def __str__(self):
return '{} project {}'.format(self.user.username, self.title)
class Profile(models.Model):
'''Model class for User profile'''
user = models.OneToOneField(User, on_delete=models.CASCADE, related_name='profile')
profile_picture = CloudinaryField('image')
profile_bio = models.TextField()
contact_info = models.EmailField()
def __str__(self):
return self.user.username
# @receiver(post_save, sender=User)
# def create_profile(sender, instance, created, **kwargs):
# if created:
# Profile.objects.create(user=instance)
class Rating(models.Model):
'''Model class for rating values'''
user = models.ForeignKey(User, on_delete=models.CASCADE, related_name='rating')
project = models.ForeignKey(Project, on_delete=models.CASCADE, related_name='projectrating')
design = models.IntegerField()
usability = models.IntegerField()
content = models.IntegerField()
def __str__(self):
        return self.user.username | 36.093023 | 96 | 0.728737 | 188 | 1,552 | 5.856383 | 0.351064 | 0.036331 | 0.050863 | 0.076294 | 0.27248 | 0.207084 | 0.134423 | 0.101726 | 0.101726 | 0.101726 | 0 | 0.003826 | 0.157861 | 1,552 | 43 | 97 | 36.093023 | 0.838562 | 0.162371 | 0 | 0.107143 | 0 | 0 | 0.043682 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.107143 | false | 0 | 0.178571 | 0.107143 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 2 |
6c82d60f90d88e271ae0e06aa8e3f431c8b453eb | 162 | py | Python | spinner.py | mjholtkamp/spinner | af4dbf7658ecf8d0cb17cedea0ea478f90181837 | [
"MIT"
] | null | null | null | spinner.py | mjholtkamp/spinner | af4dbf7658ecf8d0cb17cedea0ea478f90181837 | [
"MIT"
] | null | null | null | spinner.py | mjholtkamp/spinner | af4dbf7658ecf8d0cb17cedea0ea478f90181837 | [
"MIT"
] | null | null | null | #!/usr/bin/python
import sys, time
while True:
    for i in [r'\o/', r'\o>', '<o>', '<o/']:
        sys.stdout.write('\r%s' % i)
        sys.stdout.flush()
time.sleep(0.1)
| 18 | 40 | 0.530864 | 28 | 162 | 3.071429 | 0.714286 | 0.069767 | 0.069767 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.014925 | 0.17284 | 162 | 8 | 41 | 20.25 | 0.626866 | 0.098765 | 0 | 0 | 0 | 0 | 0.110345 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.166667 | 0 | 0.166667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
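The same spinner can be written more idiomatically with `itertools.cycle`; this bounded sketch (frame list taken from the script above; the loop count and delay are arbitrary) replaces the infinite `while True` so it terminates:

```python
import itertools
import sys
import time

# Cycle through the spinner frames a fixed number of times instead of forever.
frames = itertools.cycle([r'\o/', r'\o>', '<o>', '<o/'])
for frame in itertools.islice(frames, 8):
    sys.stdout.write('\r%s' % frame)  # '\r' rewinds to the line start
    sys.stdout.flush()
    time.sleep(0.01)
sys.stdout.write('\n')
```

Raw strings (`r'\o/'`) avoid the invalid `\o` escape sequence that plain string literals would warn about in newer Pythons.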
6c845e77dd1781585115fe72cf064221224cfde1 | 311 | py | Python | src/v8unpack/MetaDataObject/Constant.py | saby/v8uncode | 82e398e5d3291c3671db1ea443e5a38def7510d7 | [
"MIT"
] | 10 | 2021-11-01T11:06:56.000Z | 2022-03-25T17:55:05.000Z | src/v8unpack/MetaDataObject/Constant.py | saby/v8uncode | 82e398e5d3291c3671db1ea443e5a38def7510d7 | [
"MIT"
] | 1 | 2021-10-16T05:39:16.000Z | 2021-10-16T05:39:16.000Z | src/v8unpack/MetaDataObject/Constant.py | saby/v8uncode | 82e398e5d3291c3671db1ea443e5a38def7510d7 | [
"MIT"
] | 4 | 2021-12-09T06:58:15.000Z | 2022-01-12T15:14:55.000Z | from ..MetaDataObject.core.Simple import Simple
class Constant(Simple):
ext_code = {
        'mgr': '1',  # Constant manager module
        'obj': '0',  # Constant value manager module
}
@classmethod
def get_decode_header(cls, header_data):
return header_data[0][1][1][1][1]
| 23.923077 | 58 | 0.636656 | 38 | 311 | 5.078947 | 0.657895 | 0.031088 | 0.031088 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.029536 | 0.237942 | 311 | 12 | 59 | 25.916667 | 0.78481 | 0.199357 | 0 | 0 | 0 | 0 | 0.03252 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.111111 | false | 0 | 0.111111 | 0.111111 | 0.555556 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 2 |
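`get_decode_header` pulls one element out of a deeply nested header list by chained indexing. A toy structure (the shape below is hypothetical; real 1C headers differ) makes the path `[0][1][1][1][1]` visible:

```python
# Toy illustration of the chained indexing in Constant.get_decode_header
# (written as a plain function here rather than a classmethod).
def get_decode_header(header_data):
    return header_data[0][1][1][1][1]

# Each [1] steps into the second element of the next nesting level.
header_data = [
    ['skipped', ['skipped', ['skipped', ['skipped', 'uuid-of-constant']]]],
]
print(get_decode_header(header_data))  # -> uuid-of-constant
```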
665eb3db1d8992943f25c005e0ddb5343af2c789 | 815 | py | Python | lyapy/outputs/pd_output.py | vdorobantu/lyapy | 485e8490ef77f2286a804b2bd506cc3f46b74c6d | [
"BSD-3-Clause"
] | 36 | 2019-03-09T07:38:05.000Z | 2022-01-06T05:33:51.000Z | lyapy/outputs/pd_output.py | HaiminClack/lyapy | 485e8490ef77f2286a804b2bd506cc3f46b74c6d | [
"BSD-3-Clause"
] | 8 | 2020-01-28T22:35:26.000Z | 2022-02-10T00:08:17.000Z | lyapy/outputs/pd_output.py | HaiminClack/lyapy | 485e8490ef77f2286a804b2bd506cc3f46b74c6d | [
"BSD-3-Clause"
] | 15 | 2019-04-21T00:31:48.000Z | 2021-12-29T10:26:31.000Z | """Base class for outputs with proportional and derivative components."""
from .output import Output
class PDOutput(Output):
"""Base class for outputs with proportional and derivative components.
Override eta, proportional, derivative.
Let n be the number of states, k be the proportional/derivative error sizes.
"""
def proportional(self, x, t):
"""Compute proportional error component of output dynamics.
Outputs a numpy array (k,).
Inputs:
State, x: numpy array (n,)
Time, t: float
"""
pass
def derivative(self, x, t):
"""Compute derivative error component of output dynamics.
Outputs a numpy array (k,).
Inputs:
State, x: numpy array (n,)
Time, t: float
"""
pass
| 22.638889 | 80 | 0.61227 | 97 | 815 | 5.14433 | 0.412371 | 0.08016 | 0.048096 | 0.076152 | 0.577154 | 0.577154 | 0.577154 | 0.577154 | 0.577154 | 0.344689 | 0 | 0 | 0.299387 | 815 | 35 | 81 | 23.285714 | 0.873905 | 0.645399 | 0 | 0.333333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0.333333 | 0.166667 | 0 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
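A subclass only has to supply the two error components. The sketch below re-implements a minimal standalone version of the pattern (not the real lyapy base class; the `eta` stacking and the double-integrator example are illustrative assumptions):

```python
import numpy as np

# Minimal standalone re-sketch of the PDOutput pattern above.
class PDOutput:
    def proportional(self, x, t):
        pass

    def derivative(self, x, t):
        pass

    def eta(self, x, t):
        # Assumed composition: stack the two error components, shape (2k,).
        return np.concatenate([self.proportional(x, t), self.derivative(x, t)])

class DoubleIntegratorOutput(PDOutput):
    """Track a constant reference position r with zero velocity (hypothetical)."""
    def __init__(self, r):
        self.r = r

    def proportional(self, x, t):
        return np.array([x[0] - self.r])  # position error, shape (1,)

    def derivative(self, x, t):
        return np.array([x[1]])           # velocity error, shape (1,)

out = DoubleIntegratorOutput(r=1.0)
print(out.eta(np.array([0.5, -0.2]), 0.0))  # -> [-0.5 -0.2]
```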
666135e38f2b80585168b82eb26581a3cf3a372e | 5,329 | py | Python | ooobuild/lo/embed/x_embedded_object.py | Amourspirit/ooo_uno_tmpl | 64e0c86fd68f24794acc22d63d8d32ae05dd12b8 | [
"Apache-2.0"
] | null | null | null | ooobuild/lo/embed/x_embedded_object.py | Amourspirit/ooo_uno_tmpl | 64e0c86fd68f24794acc22d63d8d32ae05dd12b8 | [
"Apache-2.0"
] | null | null | null | ooobuild/lo/embed/x_embedded_object.py | Amourspirit/ooo_uno_tmpl | 64e0c86fd68f24794acc22d63d8d32ae05dd12b8 | [
"Apache-2.0"
] | null | null | null | # coding: utf-8
#
# Copyright 2022 :Barry-Thomas-Paul: Moss
#
# Licensed under the Apache License, Version 2.0 (the "License")
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http: // www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
# Interface Class
# this is an auto-generated file, generated by Cheetah
# Libre Office Version: 7.3
# Namespace: com.sun.star.embed
import typing
from abc import abstractmethod
from ..document.x_event_broadcaster import XEventBroadcaster as XEventBroadcaster_2b120f2b
from .x_classified_object import XClassifiedObject as XClassifiedObject_fa3b0dab
from .x_component_supplier import XComponentSupplier as XComponentSupplier_adb0e64
from .x_state_change_broadcaster import XStateChangeBroadcaster as XStateChangeBroadcaster_539f100e
from .x_visual_object import XVisualObject as XVisualObject_c6c80c28
from ..util.x_closeable import XCloseable as XCloseable_99ee0aa8
if typing.TYPE_CHECKING:
from .verb_descriptor import VerbDescriptor as VerbDescriptor_d3680cb3
from .x_embedded_client import XEmbeddedClient as XEmbeddedClient_ddea0cc6
class XEmbeddedObject(XEventBroadcaster_2b120f2b, XClassifiedObject_fa3b0dab, XComponentSupplier_adb0e64, XStateChangeBroadcaster_539f100e, XVisualObject_c6c80c28, XCloseable_99ee0aa8):
"""
represents common functionality for embedded objects.
See Also:
`API XEmbeddedObject <https://api.libreoffice.org/docs/idl/ref/interfacecom_1_1sun_1_1star_1_1embed_1_1XEmbeddedObject.html>`_
"""
__ooo_ns__: str = 'com.sun.star.embed'
__ooo_full_ns__: str = 'com.sun.star.embed.XEmbeddedObject'
__ooo_type_name__: str = 'interface'
__pyunointerface__: str = 'com.sun.star.embed.XEmbeddedObject'
@abstractmethod
def changeState(self, nNewState: int) -> None:
"""
changes the state of the object to the requested one.
Raises:
com.sun.star.embed.UnreachableStateException: ``UnreachableStateException``
com.sun.star.embed.WrongStateException: ``WrongStateException``
com.sun.star.uno.Exception: ``Exception``
"""
@abstractmethod
def doVerb(self, nVerbID: int) -> None:
"""
lets object perform an action referenced by nVerbID.
Raises:
com.sun.star.lang.IllegalArgumentException: ``IllegalArgumentException``
com.sun.star.embed.WrongStateException: ``WrongStateException``
com.sun.star.embed.UnreachableStateException: ``UnreachableStateException``
com.sun.star.uno.Exception: ``Exception``
"""
@abstractmethod
def getClientSite(self) -> 'XEmbeddedClient_ddea0cc6':
"""
provides access to the internal link to the container client.
Raises:
com.sun.star.embed.WrongStateException: ``WrongStateException``
"""
@abstractmethod
def getCurrentState(self) -> int:
"""
returns the current state of the object.
Raises:
com.sun.star.embed.WrongStateException: ``WrongStateException``
"""
@abstractmethod
def getReachableStates(self) -> 'typing.Tuple[int, ...]':
"""
returns supported states for the object.
Raises:
com.sun.star.embed.NeedsRunningStateException: ``NeedsRunningStateException``
com.sun.star.embed.WrongStateException: ``WrongStateException``
"""
@abstractmethod
def getStatus(self, nAspect: int) -> int:
"""
retrieves the status of the object.
Raises:
com.sun.star.embed.WrongStateException: ``WrongStateException``
"""
@abstractmethod
def getSupportedVerbs(self) -> 'typing.Tuple[VerbDescriptor_d3680cb3, ...]':
"""
returns supported verbs for the object.
Raises:
com.sun.star.embed.NeedsRunningStateException: ``NeedsRunningStateException``
com.sun.star.embed.WrongStateException: ``WrongStateException``
"""
@abstractmethod
def setClientSite(self, xClient: 'XEmbeddedClient_ddea0cc6') -> None:
"""
sets a connection to the container's client.
Raises:
com.sun.star.embed.WrongStateException: ``WrongStateException``
"""
@abstractmethod
def setContainerName(self, sName: str) -> None:
"""
provides object with the name of container document.
"""
@abstractmethod
def setUpdateMode(self, nMode: int) -> None:
"""
specifies how often the object's representation should be updated.
Raises:
com.sun.star.embed.WrongStateException: ``WrongStateException``
"""
@abstractmethod
def update(self) -> None:
"""
updates object's representations.
Raises:
com.sun.star.embed.WrongStateException: ``WrongStateException``
com.sun.star.uno.Exception: ``Exception``
"""
__all__ = ['XEmbeddedObject']
| 37.794326 | 185 | 0.693564 | 545 | 5,329 | 6.658716 | 0.385321 | 0.036374 | 0.060623 | 0.074401 | 0.341692 | 0.340039 | 0.315789 | 0.314136 | 0.29898 | 0.224855 | 0 | 0.02034 | 0.2158 | 5,329 | 140 | 186 | 38.064286 | 0.84805 | 0.512479 | 0 | 0.282051 | 0 | 0 | 0.106885 | 0.073664 | 0 | 0 | 0 | 0 | 0 | 1 | 0.282051 | false | 0 | 0.25641 | 0 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
666d0dd483259f42bb5c69d9d749bd1454052593 | 4,253 | py | Python | test/test_gradcam_guided_backprop.py | giladcohen/darkon | 856cdf632449c09c1259b09a4526ea361e13239b | [
"Apache-2.0"
] | 254 | 2017-11-14T04:54:53.000Z | 2022-03-29T03:11:34.000Z | test/test_gradcam_guided_backprop.py | giladcohen/darkon | 856cdf632449c09c1259b09a4526ea361e13239b | [
"Apache-2.0"
] | 44 | 2017-11-13T13:20:20.000Z | 2020-06-15T13:01:31.000Z | test/test_gradcam_guided_backprop.py | giladcohen/darkon | 856cdf632449c09c1259b09a4526ea361e13239b | [
"Apache-2.0"
] | 46 | 2017-11-15T03:40:05.000Z | 2022-01-08T06:22:00.000Z | # Copyright 2017 Neosapience, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ========================================================================
import unittest
import darkon
import tensorflow as tf
import numpy as np
_classes = 2
def nn_graph(activation):
# create graph
x = tf.placeholder(tf.float32, (1, 2, 2, 3), 'x_placeholder')
y = tf.placeholder(tf.int32, name='y_placeholder', shape=[1, 2])
with tf.name_scope('conv1'):
conv_1 = tf.layers.conv2d(
inputs=x,
filters=10,
kernel_size=[2, 2],
padding="same",
activation=activation)
with tf.name_scope('fc2'):
flatten = tf.layers.flatten(conv_1)
top = tf.layers.dense(flatten, _classes)
logits = tf.nn.softmax(top)
return x
class GradcamGuidedBackprop(unittest.TestCase):
def setUp(self):
tf.reset_default_graph()
    def tearDown(self):
        # Each test_* method only sets the activation/guided-backprop config;
        # the graph build and gradient check run here, after the test returns.
x = nn_graph(activation=self.activation_fn)
image = np.random.uniform(size=(2, 2, 3))
with tf.Session() as sess:
sess.run(tf.global_variables_initializer())
gradcam_ops = darkon.Gradcam.candidate_featuremap_op_names(sess)
if self.enable_guided_backprop:
_ = darkon.Gradcam(x, _classes, gradcam_ops[-1])
g = tf.get_default_graph()
from_ts = g.get_operation_by_name(gradcam_ops[-1]).outputs
to_ts = g.get_operation_by_name(gradcam_ops[-2]).outputs
max_output = tf.reduce_max(from_ts, axis=3)
y = tf.reduce_sum(-max_output * 1e2)
grad = tf.gradients(y, to_ts)[0]
grad_val = sess.run(grad, feed_dict={x: np.expand_dims(image, 0)})
if self.enable_guided_backprop:
self.assertTrue(not np.any(grad_val))
else:
self.assertTrue(np.any(grad_val))
def test_relu(self):
self.activation_fn = tf.nn.relu
self.enable_guided_backprop = False
def test_relu_guided(self):
self.activation_fn = tf.nn.relu
self.enable_guided_backprop = True
def test_tanh(self):
self.activation_fn = tf.nn.tanh
self.enable_guided_backprop = False
def test_tanh_guided(self):
self.activation_fn = tf.nn.tanh
self.enable_guided_backprop = True
def test_sigmoid(self):
self.activation_fn = tf.nn.sigmoid
self.enable_guided_backprop = False
def test_sigmoid_guided(self):
self.activation_fn = tf.nn.sigmoid
self.enable_guided_backprop = True
def test_relu6(self):
self.activation_fn = tf.nn.relu6
self.enable_guided_backprop = False
def test_relu6_guided(self):
self.activation_fn = tf.nn.relu6
self.enable_guided_backprop = True
def test_elu(self):
self.activation_fn = tf.nn.elu
self.enable_guided_backprop = False
def test_elu_guided(self):
self.activation_fn = tf.nn.elu
self.enable_guided_backprop = True
def test_selu(self):
self.activation_fn = tf.nn.selu
self.enable_guided_backprop = False
def test_selu_guided(self):
self.activation_fn = tf.nn.selu
self.enable_guided_backprop = True
def test_softplus(self):
self.activation_fn = tf.nn.softplus
self.enable_guided_backprop = False
    def test_softplus_guided(self):
self.activation_fn = tf.nn.softplus
self.enable_guided_backprop = True
def test_softsign(self):
self.activation_fn = tf.nn.softsign
self.enable_guided_backprop = False
def test_softsign_guided(self):
self.activation_fn = tf.nn.softsign
self.enable_guided_backprop = True
| 30.597122 | 78 | 0.645897 | 570 | 4,253 | 4.603509 | 0.287719 | 0.068598 | 0.109756 | 0.164634 | 0.454649 | 0.433308 | 0.433308 | 0.349085 | 0.325457 | 0.325457 | 0 | 0.013421 | 0.246649 | 4,253 | 138 | 79 | 30.818841 | 0.805556 | 0.150482 | 0 | 0.377778 | 0 | 0 | 0.010564 | 0 | 0 | 0 | 0 | 0 | 0.022222 | 1 | 0.211111 | false | 0 | 0.044444 | 0 | 0.277778 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
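What these tests check is the guided-backprop property itself: a gradient only flows back where both the forward activation passes the plain backprop rule and the incoming gradient is positive. For ReLU the rule reduces to two elementwise masks (a simplified numpy sketch, not darkon's TensorFlow graph surgery):

```python
import numpy as np

# Simplified guided-backprop rule for a ReLU layer: pass a gradient back only
# where BOTH the forward input was positive (plain ReLU backprop) AND the
# incoming gradient is positive (the "guided" part).
def guided_relu_grad(upstream_grad, forward_input):
    relu_mask = (forward_input > 0).astype(upstream_grad.dtype)
    guide_mask = (upstream_grad > 0).astype(upstream_grad.dtype)
    return upstream_grad * relu_mask * guide_mask

g = np.array([1.0, -2.0, 3.0, -4.0])   # upstream gradients
z = np.array([0.5, 0.5, -0.5, -0.5])   # forward pre-activations
print(guided_relu_grad(g, z))  # only the first entry survives both masks
```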
66712b5448bf85bac168cd4d7220459f0e41d382 | 494 | py | Python | microfreshener/core/importer/jsontype.py | di-unipi-socc/micro-tosca | 5d5c9361b34eeabaed8955ddc62282607672bd81 | [
"MIT"
] | null | null | null | microfreshener/core/importer/jsontype.py | di-unipi-socc/micro-tosca | 5d5c9361b34eeabaed8955ddc62282607672bd81 | [
"MIT"
] | 3 | 2019-10-02T13:55:39.000Z | 2021-06-01T22:55:20.000Z | microfreshener/core/importer/jsontype.py | di-unipi-socc/microFreshener-core | 5d5c9361b34eeabaed8955ddc62282607672bd81 | [
"MIT"
] | null | null | null | # Relationship instance name
JSON_RELATIONSHIP_INTERACT_WITH = "interaction"
JSON_RUN_TIME = "runtime"
JSON_DEPLOYMENT_TIME = "deploymenttime"
JSON_NODE_DATABASE = "datastore"
JSON_NODE_SERVICE= "service"
JSON_NODE_MESSAGE_BROKER = "messagebroker"
JSON_NODE_MESSAGE_ROUTER = "messagerouter"
JSON_NODE_MESSAGE_ROUTER_KSERVICE = "kservice"
JSON_NODE_MESSAGE_ROUTER_KPROXY = "kproxy"
JSON_NODE_MESSAGE_ROUTER_KINGRESS = "kingress"
JSON_GROUPS_EDGE = "edgegroup"
JSON_GROUPS_TEAM = "squadgroup"
| 29.058824 | 47 | 0.840081 | 60 | 494 | 6.366667 | 0.483333 | 0.146597 | 0.196335 | 0.219895 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.08502 | 494 | 16 | 48 | 30.875 | 0.845133 | 0.052632 | 0 | 0 | 0 | 0 | 0.246781 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
66716c966492aa53ad746645a7639df69415268a | 362 | py | Python | api/message/__init__.py | fhzhang/staticwebA | 3e96a5cdde60f5ee645e84076df4c2257084e731 | [
"MIT"
] | null | null | null | api/message/__init__.py | fhzhang/staticwebA | 3e96a5cdde60f5ee645e84076df4c2257084e731 | [
"MIT"
] | null | null | null | api/message/__init__.py | fhzhang/staticwebA | 3e96a5cdde60f5ee645e84076df4c2257084e731 | [
"MIT"
] | null | null | null | import logging
import requests
import json
import azure.functions as func
def main(req: func.HttpRequest) -> func.HttpResponse:
logging.info('Python HTTP trigger function processed a request.')
res = requests.get("https://61n.azurewebsites.net/get")
return func.HttpResponse(json.dumps(res.json()), status_code=200, mimetype="application/json")
| 25.857143 | 98 | 0.754144 | 48 | 362 | 5.666667 | 0.708333 | 0.117647 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.015823 | 0.127072 | 362 | 13 | 99 | 27.846154 | 0.844937 | 0 | 0 | 0 | 0 | 0 | 0.270718 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0 | 0.5 | 0 | 0.75 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
668bba64417e76332375be16790d1d9b67a05237 | 1,955 | py | Python | trajminer/tests/test_segmentation.py | ybj94/trajminer | 7355344be8fe763ba2583b6f508fefc3290c9849 | [
"MIT"
] | 37 | 2019-04-04T19:27:26.000Z | 2021-12-22T07:10:13.000Z | trajminer/tests/test_segmentation.py | ybj94/trajminer | 7355344be8fe763ba2583b6f508fefc3290c9849 | [
"MIT"
] | 26 | 2019-04-04T19:20:44.000Z | 2021-12-22T07:56:53.000Z | trajminer/tests/test_segmentation.py | ybj94/trajminer | 7355344be8fe763ba2583b6f508fefc3290c9849 | [
"MIT"
] | 9 | 2019-04-04T19:17:05.000Z | 2019-11-05T15:06:21.000Z | from trajminer import TrajectoryData
from trajminer.preprocessing import TrajectorySegmenter
data = TrajectoryData(attributes=['poi', 'hour', 'rating'],
data=[[['Bakery', 8, 8.6], ['Work', 9, 8.9],
['Restaurant', 12, 7.7], ['Bank', 12, 5.6],
['Work', 13, 8.9], ['Home', 19, 0]],
[['Home', 8, 0], ['Mall', 10, 9.3],
['Home', 19, 0], ['Pub', 21, 9.5]]],
tids=[20, 24],
labels=[1, 2])
class TestTrajectorySegmenter(object):
def test_missing(self):
assert True
def test_ignore_missing(self):
assert True
def test_strict_no_thresholds(self):
segmenter = TrajectorySegmenter(attributes=data.get_attributes(),
thresholds=None, mode='strict',
n_jobs=1)
print(segmenter.fit_transform(data))
assert True # TO-DO
def test_strict_subset_thresholds(self):
segmenter = TrajectorySegmenter(attributes=data.get_attributes(),
thresholds=None, mode='strict',
n_jobs=1)
print(segmenter.fit_transform(data))
assert True # TO-DO
def test_any_no_thresholds(self):
segmenter = TrajectorySegmenter(attributes=data.get_attributes(),
thresholds=None, mode='any',
n_jobs=1)
print(segmenter.fit_transform(data))
assert True # TO-DO
def test_any_subset_thresholds(self):
segmenter = TrajectorySegmenter(attributes=data.get_attributes(),
thresholds=None, mode='any',
n_jobs=1)
print(segmenter.fit_transform(data))
assert True # TO-DO
| 39.1 | 73 | 0.499744 | 187 | 1,955 | 5.080214 | 0.331551 | 0.044211 | 0.096842 | 0.176842 | 0.7 | 0.7 | 0.641053 | 0.641053 | 0.641053 | 0.641053 | 0 | 0.03682 | 0.388747 | 1,955 | 49 | 74 | 39.897959 | 0.758159 | 0.011765 | 0 | 0.564103 | 0 | 0 | 0.040477 | 0 | 0 | 0 | 0 | 0 | 0.153846 | 1 | 0.153846 | false | 0 | 0.051282 | 0 | 0.230769 | 0.102564 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
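The idea the segmenter tests exercise can be shown with a toy threshold-based splitter (this is an illustration of the concept, not trajminer's `TrajectorySegmenter`): start a new segment whenever the gap in one attribute between consecutive points exceeds a threshold. It reuses the second trajectory from the fixture above.

```python
# Toy segmenter: split a trajectory wherever the 'hour' attribute jumps by
# more than `threshold` between consecutive points. Attribute index and
# threshold values are illustrative assumptions.
def segment_by_hour_gap(points, hour_idx=1, threshold=3):
    segments = [[points[0]]]
    for prev, cur in zip(points, points[1:]):
        if cur[hour_idx] - prev[hour_idx] > threshold:
            segments.append([cur])       # gap too large: start a new segment
        else:
            segments[-1].append(cur)     # same segment
    return segments

traj = [['Home', 8, 0], ['Mall', 10, 9.3], ['Home', 19, 0], ['Pub', 21, 9.5]]
print(len(segment_by_hour_gap(traj)))  # -> 2 (split at the 10 -> 19 gap)
```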
669ae9c66756070ea7b82154d703fda978bed22a | 332 | py | Python | python/learn_visualization.py | tingjianlau/Caffe-commons | f23864d2cb4b94302c10f3047693c36a32331be4 | [
"BSD-2-Clause"
] | null | null | null | python/learn_visualization.py | tingjianlau/Caffe-commons | f23864d2cb4b94302c10f3047693c36a32331be4 | [
"BSD-2-Clause"
] | null | null | null | python/learn_visualization.py | tingjianlau/Caffe-commons | f23864d2cb4b94302c10f3047693c36a32331be4 | [
"BSD-2-Clause"
] | null | null | null | import numpy as np
import matplotlib.pyplot as plt
#%matplotlib inline
import caffe
caffe_root = '../'
import os, sys
os.chdir(caffe_root)
sys.path.insert(0, caffe_root + 'python')
sys.path.append('/usr/local/lib/python2.7/site-packages/')
im = caffe.io.load_image('examples/images/cat.jpg')
print(im.shape)
plt.imshow(im)
plt.axis('off')
| 23.714286 | 57 | 0.762048 | 57 | 332 | 4.368421 | 0.666667 | 0.108434 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.009804 | 0.078313 | 332 | 13 | 58 | 25.538462 | 0.803922 | 0.054217 | 0 | 0 | 0 | 0 | 0.233227 | 0.194888 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.333333 | null | null | 0.083333 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
66a10908238b5f5482f8d7c0822b7fdb7e0ff438 | 198 | py | Python | survivethevoid/characters/_base_character.py | LMikeH/SurviveTheVoid | e517870d1dca388a54f8f523879c6c8583101a02 | [
"bzip2-1.0.6"
] | null | null | null | survivethevoid/characters/_base_character.py | LMikeH/SurviveTheVoid | e517870d1dca388a54f8f523879c6c8583101a02 | [
"bzip2-1.0.6"
] | null | null | null | survivethevoid/characters/_base_character.py | LMikeH/SurviveTheVoid | e517870d1dca388a54f8f523879c6c8583101a02 | [
"bzip2-1.0.6"
] | null | null | null |
import pygame
class _BaseCharacter(pygame.sprite.Sprite):
    def __init__(self, screen, x, y, angle):
        pygame.sprite.Sprite.__init__(self)  # required before using Sprite features
self.x = x
self.y = y
self.theta = angle
self.screen = screen | 22 | 44 | 0.60101 | 26 | 198 | 4.384615 | 0.5 | 0.175439 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.29798 | 198 | 9 | 45 | 22 | 0.820144 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0.142857 | 0 | 0.428571 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
66af3dbeb1b3ebcfcd5e1cfa55504095ab248ab6 | 1,041 | py | Python | 12_Intermediate Data Visualization with Seaborn/03_Additional Plot Types/06_Binning data.py | mohd-faizy/DataScience-With-Python | 13ebb10cf9083343056d5b782957241de1d595f9 | [
"MIT"
] | 5 | 2021-02-03T14:36:58.000Z | 2022-01-01T10:29:26.000Z | 12_Intermediate Data Visualization with Seaborn/03_Additional Plot Types/06_Binning data.py | mohd-faizy/DataScience-With-Python | 13ebb10cf9083343056d5b782957241de1d595f9 | [
"MIT"
] | null | null | null | 12_Intermediate Data Visualization with Seaborn/03_Additional Plot Types/06_Binning data.py | mohd-faizy/DataScience-With-Python | 13ebb10cf9083343056d5b782957241de1d595f9 | [
"MIT"
] | 3 | 2021-02-08T00:31:16.000Z | 2022-03-17T13:52:32.000Z | '''
06 - Binning data
When the data on the x axis is a continuous value, it can
be useful to break it into different bins in order to get
a better visualization of the changes in the data.
For this exercise, we will look at the relationship between
tuition and the Undergraduate population abbreviated as UG in
this data. We will start by looking at a scatter plot of the
data and examining the impact of different bin sizes on the
visualization.
'''
# 1 - Create a regplot of Tuition and UG and set the fit_reg parameter to False to disable the regression line.
sns.regplot(data=df,
y='Tuition',
x="UG",
fit_reg=False)
plt.show()
plt.clf()
# 2 - Create another plot with the UG data divided into 5 bins.
sns.regplot(data=df,
y='Tuition',
x="UG",
x_bins=5)
plt.show()
plt.clf()
# 3 - Create a regplot() with the data divided into 8 bins.
sns.regplot(data=df,
y='Tuition',
x="UG",
x_bins=8)
plt.show()
plt.clf()
| 26.025 | 111 | 0.650336 | 170 | 1,041 | 3.958824 | 0.452941 | 0.041605 | 0.062407 | 0.071322 | 0.147103 | 0.147103 | 0.147103 | 0.147103 | 0.106984 | 0.106984 | 0 | 0.011842 | 0.269933 | 1,041 | 39 | 112 | 26.692308 | 0.873684 | 0.662824 | 0 | 0.833333 | 0 | 0 | 0.079179 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
66b4b20424a0cf0db725cfddc68d9dc96de85e5f | 4,345 | py | Python | src/dbconnector/dbconnector.py | dubovyk/Quoterly | ce49db042fa95e3ebc6cf826eb8d7f9e4f10a057 | [
"MIT"
] | null | null | null | src/dbconnector/dbconnector.py | dubovyk/Quoterly | ce49db042fa95e3ebc6cf826eb8d7f9e4f10a057 | [
"MIT"
] | null | null | null | src/dbconnector/dbconnector.py | dubovyk/Quoterly | ce49db042fa95e3ebc6cf826eb8d7f9e4f10a057 | [
"MIT"
] | null | null | null | import sqlite3
import time
class sqliteConnector:
def __init__(self, dbpath):
self.__dbpath = dbpath
self.__connection = sqlite3.connect(self.__dbpath) # initialize 'connection' (actually, open file)
self.__cursor = self.__connection.cursor()
def add_user(self, username, usermail, userpass):
self.__cursor.execute("INSERT INTO User (username, email, regdate, pass_hash) VALUES (\"{}\", \"{}\", \"{}\", \"{}\");".format(username, usermail, time.ctime(), userpass))
self.__connection.commit() # should be done to finish operation
def is_available_user(self, username, usermail):
self.__cursor.execute(("SELECT * FROM User WHERE username=\"{}\" OR email=\"{}\";").format(username, usermail))
data = self.__cursor.fetchall()
return len(data) == 0
def get_user(self, username):
self.__cursor.execute(("SELECT * FROM User WHERE username=\"{}\" OR email=\"{}\";").format(username, username))
data = self.__cursor.fetchone()
return data
def drop(self):
self.__cursor.execute("DROP TABLE IF EXISTS User;")
self.__cursor.execute("DROP TABLE IF EXISTS Quote;")
self.__connection.commit()
def create(self):
self.__cursor.execute("CREATE TABLE User (username varchar,email varchar,regdate datetime,pass_hash varchar,is_admin boolean default false)")
self.__cursor.execute("CREATE TABLE Quote (id integer PRIMARY KEY AUTOINCREMENT,"
"quote_text text,author text,username varchar,publication_date datetime);")
self.__connection.commit()
def match_password(self, username, password):
self.__cursor.execute(("SELECT * FROM User WHERE username=\"{}\" AND pass_hash=\"{}\";").format(username, password))
data = self.__cursor.fetchall()
return len(data) != 0
def is_admin(self, username):
self.__cursor.execute("SELECT is_admin FROM User WHERE username=\"{}\";".format(username))
data = self.__cursor.fetchall()
if len(data) == 0:
return False
return data[0][0] == 'true'
def update_user(self, username, fieldname, fieldvalue):
self.__cursor.execute("UPDATE User SET {}=\"{}\" WHERE username=\"{}\";".format(fieldname, fieldvalue, username))
self.__connection.commit()
def delete_user(self, username):
self.__cursor.execute("DELETE FROM User WHERE username=\"{}\";".format(username))
self.__connection.commit()
def add_quote(self, text, author, username):
self.__cursor.execute("INSERT INTO Quote (quote_text, author, username, publication_date) VALUES (\"{}\", \"{}\", \"{}\", \"{}\");".format(text, author, username, time.ctime()))
self.__connection.commit()
self.__cursor.execute("SELECT * FROM Quote WHERE quote_text=\"{}\" AND author=\"{}\";".format(text, author))
data = self.__cursor.fetchall()
return data[-1]
def get_random_quote(self):
self.__cursor.execute("SELECT * FROM Quote ORDER BY RANDOM() LIMIT 1;")
data = self.__cursor.fetchone()
if not data:
return None
return data
def get_quote_by_id(self, id):
self.__cursor.execute("SELECT * FROM Quote WHERE id=\"{}\";".format(id))
data = self.__cursor.fetchone()
if not data:
return None
return data
def get_quotes_by_user(self, username):
self.__cursor.execute("SELECT * FROM Quote WHERE username=\"{}\";".format(username))
data = self.__cursor.fetchall()
if len(data) == 0:
return None
return data
def update_quote_field(self, quote_id, field, value):
self.__cursor.execute("UPDATE Quote SET {}=\"{}\" WHERE id=\"{}\";".format(field, value, quote_id))
self.__connection.commit()
def delete_quote(self, quote_id):
self.__cursor.execute("DELETE FROM Quote WHERE id=\"{}\";".format(quote_id))
self.__connection.commit()
def get_user_by_quote(self, quote_id):
query = "SELECT username FROM Quote WHERE id=\"{}\";".format(quote_id)
self.__cursor.execute(query)
data = self.__cursor.fetchone()
if not data:
return None
return data[0]
def reset(self):
self.drop()
self.create()
| 42.184466 | 185 | 0.629919 | 504 | 4,345 | 5.18254 | 0.184524 | 0.111026 | 0.12366 | 0.070444 | 0.493492 | 0.38706 | 0.313553 | 0.257274 | 0.192573 | 0.162711 | 0 | 0.003258 | 0.223015 | 4,345 | 102 | 186 | 42.598039 | 0.770438 | 0.018412 | 0 | 0.361446 | 0 | 0.012048 | 0.23252 | 0.005631 | 0 | 0 | 0 | 0 | 0 | 1 | 0.216867 | false | 0.060241 | 0.024096 | 0 | 0.421687 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
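One caveat with `sqliteConnector` is that it interpolates values straight into the SQL text, which breaks on embedded quotes and invites SQL injection. The sketch below redoes just the `add_user`/`get_user` pair with sqlite3's `?` placeholders (same table layout as `create()` above; this is an illustration, not a drop-in replacement for the class):

```python
import sqlite3
import time

# Parameterized-query sketch of the add_user/get_user pattern above.
# sqlite3 substitutes "?" placeholders safely, so quotes in usernames
# cannot alter the query the way string formatting allows.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE User (username varchar, email varchar, "
            "regdate datetime, pass_hash varchar, "
            "is_admin boolean default false)")

def add_user(username, email, pass_hash):
    cur.execute("INSERT INTO User (username, email, regdate, pass_hash) "
                "VALUES (?, ?, ?, ?);",
                (username, email, time.ctime(), pass_hash))
    conn.commit()

def get_user(username):
    cur.execute("SELECT * FROM User WHERE username=? OR email=?;",
                (username, username))
    return cur.fetchone()

add_user("alice", "alice@example.com", "hash123")
print(get_user("alice")[0])  # -> alice
```

An injection attempt like `'x" OR "1"="1'` is treated as an ordinary (non-matching) username rather than as SQL.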
66de744a8578f5562ebfdc373eb337242bafb8d7 | 199 | py | Python | rajk_appman/urls.py | rajk-apps/rajk-appman | 2053aa15b6dc17747022f15840cfaead06e6e8c6 | [
"MIT"
] | null | null | null | rajk_appman/urls.py | rajk-apps/rajk-appman | 2053aa15b6dc17747022f15840cfaead06e6e8c6 | [
"MIT"
] | null | null | null | rajk_appman/urls.py | rajk-apps/rajk-appman | 2053aa15b6dc17747022f15840cfaead06e6e8c6 | [
"MIT"
] | null | null | null | from django.urls import path, include
from . import views
app_name = "rajk-appman"
urlpatterns = [
path("", views.home, name="home"),
path("user_page", views.user_page, name="user_page"),
]
| 22.111111 | 57 | 0.683417 | 28 | 199 | 4.714286 | 0.535714 | 0.181818 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.155779 | 199 | 8 | 58 | 24.875 | 0.785714 | 0 | 0 | 0 | 0 | 0 | 0.165829 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.285714 | 0 | 0.285714 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
dd021406e1aa5f3eed2aa1dd23e5b21e6e8b2ee3 | 1,956 | py | Python | auto_stars.py | fjwCode/auto-stars | 17895cf07676c79af721fff69c0607286fd7d302 | [
"Apache-2.0"
] | 7 | 2018-06-17T12:15:19.000Z | 2019-05-03T06:05:08.000Z | auto_stars.py | fjwCode/auto-stars | 17895cf07676c79af721fff69c0607286fd7d302 | [
"Apache-2.0"
] | 2 | 2020-10-05T07:21:41.000Z | 2021-11-22T09:08:42.000Z | auto_stars.py | ffujiawei/auto-stars | 17895cf07676c79af721fff69c0607286fd7d302 | [
"Apache-2.0"
] | 2 | 2018-08-19T21:06:21.000Z | 2019-02-14T22:34:11.000Z | from time import sleep
from faker import Faker
from platinum import Chromium
from selenium import webdriver
def generate_account(mark='yeah'):
faker = Faker()
user = faker.name().replace(' ', mark)
email = mark + faker.email()
password = faker.password()
return user, email, password
def auto_stars(*repos, headless=True):
options = webdriver.ChromeOptions()
if headless:
options.add_argument(Chromium.HEADLESS)
else:
options.add_argument(Chromium.START_MAXIMIZED)
options.add_argument(Chromium.DISABLE_INFOBARS)
driver = webdriver.Chrome(chrome_options=options)
driver.get('https://github.com/join?source=header-home')
user, email, password = generate_account()
driver.find_element_by_id('user_login').send_keys(user)
driver.find_element_by_id('user_email').send_keys(email)
driver.find_element_by_id('user_password').send_keys(password)
sleep(2)
driver.find_element_by_id('signup_button').click()
sleep(2)
driver.find_element_by_xpath('//div[@class="SignUpContinueActions"]/button[@type="submit"]').click()
sleep(2)
driver.find_element_by_xpath('//form[@class="setup-form"]/input[@type="submit"]').click()
sleep(2)
for repo in repos:
driver.get(repo)
driver.find_element_by_xpath('//form[@class="unstarred js-social-form"]/button[@type="submit"]').click()
print('Star {}'.format(repo))
sleep(2)
driver.quit()
if __name__ == '__main__':
repos = ['https://github.com/rtfd/readthedocs.org', 'https://github.com/fjwCode/cerium', 'https://github.com/fjwCode/auto-answer-tnwz', 'https://github.com/fjwCode/platinum', 'https://github.com/fjwCode/wireless-control', 'https://github.com/SeleniumHQ/selenium', 'https://github.com/requests/requests', 'https://github.com/faif/python-patterns', 'https://github.com/kennethreitz/requests-html', 'https://github.com/joke2k/faker']
auto_stars(*repos) | 37.615385 | 434 | 0.704499 | 252 | 1,956 | 5.285714 | 0.369048 | 0.090841 | 0.115616 | 0.09985 | 0.183934 | 0.159159 | 0.084084 | 0.052553 | 0 | 0 | 0 | 0.003548 | 0.135481 | 1,956 | 52 | 435 | 37.615385 | 0.784151 | 0 | 0 | 0.128205 | 1 | 0 | 0.338784 | 0.08789 | 0 | 0 | 0 | 0 | 0 | 1 | 0.051282 | false | 0.102564 | 0.102564 | 0 | 0.179487 | 0.025641 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
dd29af68b99d52251b7d2da61dcb00d74cf282c0 | 138 | py | Python | tkinter/index.py | Sudani-Coder/python | 9c35f04a0521789ba91b7058695139ed074f7796 | [
"MIT"
] | null | null | null | tkinter/index.py | Sudani-Coder/python | 9c35f04a0521789ba91b7058695139ed074f7796 | [
"MIT"
] | null | null | null | tkinter/index.py | Sudani-Coder/python | 9c35f04a0521789ba91b7058695139ed074f7796 | [
"MIT"
] | null | null | null | from tkinter import *
# create window object
app = Tk()
app.title("Part Manager")
app.geometry("700x350")
# start program
app.mainloop()
| 13.8 | 25 | 0.724638 | 19 | 138 | 5.263158 | 0.842105 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.05042 | 0.137681 | 138 | 9 | 26 | 15.333333 | 0.789916 | 0.23913 | 0 | 0 | 0 | 0 | 0.186275 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.2 | 0 | 0.2 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
dd2cce58a9990bb69b41f25c3f63368590c44f17 | 873 | py | Python | stratifi_tests/test_login.py | DashkevichBy/Advisors.stratiFi.frontend | e59892e8c1ff0d15ce6ef520fe0b00355e1a6870 | [
"Apache-2.0"
] | null | null | null | stratifi_tests/test_login.py | DashkevichBy/Advisors.stratiFi.frontend | e59892e8c1ff0d15ce6ef520fe0b00355e1a6870 | [
"Apache-2.0"
] | null | null | null | stratifi_tests/test_login.py | DashkevichBy/Advisors.stratiFi.frontend | e59892e8c1ff0d15ce6ef520fe0b00355e1a6870 | [
"Apache-2.0"
] | null | null | null | import logging
import os
import pytest
import time
from common import BaseTest
from pages.advisors_stratifi_pages.sign_in_page import SignInPage
from pages.advisors_stratifi_pages.stratifi_page import StratifiPage
class TestLogin(BaseTest):
stratifi_page = StratifiPage()
sign_in_page = SignInPage()
def test_login(self):
self.sign_in_page.open()
self.sign_in_page.enter_login('akhil@stratifi.com')
self.sign_in_page.enter_password('Hell0w0rld123!')
self.sign_in_page.press_sign_in()
self.stratifi_page = StratifiPage()
time.sleep(5)
self.stratifi_page.check_if_the_page_was_loaded()
# self.stratifi_page.check_if_page_is_loaded()
# print('Successfully logged in: ' + str(stratifi_page.is_element_present('startAnalise', timeout=60)))
if __name__ == "__main__":
pytest.main()
| 27.28125 | 111 | 0.736541 | 115 | 873 | 5.191304 | 0.417391 | 0.070352 | 0.100503 | 0.093802 | 0.241206 | 0 | 0 | 0 | 0 | 0 | 0 | 0.01108 | 0.172967 | 873 | 31 | 112 | 28.16129 | 0.815789 | 0.167239 | 0 | 0 | 0 | 0 | 0.055325 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.05 | false | 0.05 | 0.35 | 0 | 0.55 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
dd2f0d7177a9e00337f8cb290f7553ff469c1432 | 249 | py | Python | examples/using_classes/DeleteObject/delete_object.py | praveenbommalibits/JSONManipulator | 51c6d9ad916d07d96643599ec05faa1abd6da82b | [
"MIT"
] | 2 | 2020-07-21T08:03:41.000Z | 2020-09-27T18:19:06.000Z | examples/using_classes/DeleteObject/delete_object.py | praveenbommalibits/JSONManipulator | 51c6d9ad916d07d96643599ec05faa1abd6da82b | [
"MIT"
] | 8 | 2020-09-05T11:47:26.000Z | 2020-10-19T21:25:12.000Z | examples/using_classes/DeleteObject/delete_object.py | praveenbommalibits/JSONManipulator | 51c6d9ad916d07d96643599ec05faa1abd6da82b | [
"MIT"
] | 3 | 2020-07-24T05:24:14.000Z | 2020-10-31T20:07:58.000Z | import sys
import os
from JSONManipulator import DeleteObject
DeleteObject(
key="isbn", value=1935182927,
full_path=os.path.join(
sys.path[0],
"examples/using_classes/DeleteObject/books_after_deletion.json"
)
)
| 19.153846 | 71 | 0.690763 | 29 | 249 | 5.793103 | 0.724138 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.05641 | 0.216867 | 249 | 12 | 72 | 20.75 | 0.805128 | 0 | 0 | 0 | 0 | 0 | 0.261044 | 0.24498 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.3 | 0 | 0.3 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
dd38e973a494991bd893ce279bc79a464dd40467 | 754 | py | Python | lnbits/extensions/events/models.py | blackcoffeexbt/lnbits-legend | a9f2877af77ea56d1900e2b5bc1c21b9b7ac2f64 | [
"MIT"
] | 76 | 2021-11-02T22:19:59.000Z | 2022-03-30T18:01:33.000Z | lnbits/extensions/events/models.py | blackcoffeexbt/lnbits-legend | a9f2877af77ea56d1900e2b5bc1c21b9b7ac2f64 | [
"MIT"
] | 100 | 2021-11-04T16:33:28.000Z | 2022-03-30T15:03:52.000Z | lnbits/extensions/events/models.py | blackcoffeexbt/lnbits-legend | a9f2877af77ea56d1900e2b5bc1c21b9b7ac2f64 | [
"MIT"
] | 57 | 2021-11-08T06:43:59.000Z | 2022-03-31T08:53:16.000Z | from fastapi.param_functions import Query
from pydantic import BaseModel
class CreateEvent(BaseModel):
wallet: str
name: str
info: str
closing_date: str
event_start_date: str
event_end_date: str
amount_tickets: int = Query(..., ge=0)
price_per_ticket: int = Query(..., ge=0)
class CreateTicket(BaseModel):
name: str
email: str
class Events(BaseModel):
id: str
wallet: str
name: str
info: str
closing_date: str
event_start_date: str
event_end_date: str
amount_tickets: int
price_per_ticket: int
sold: int
time: int
class Tickets(BaseModel):
id: str
wallet: str
event: str
name: str
email: str
registered: bool
paid: bool
time: int
| 17.136364 | 44 | 0.65252 | 102 | 754 | 4.656863 | 0.352941 | 0.088421 | 0.101053 | 0.067368 | 0.435789 | 0.357895 | 0.357895 | 0.357895 | 0.357895 | 0.357895 | 0 | 0.003636 | 0.270557 | 754 | 43 | 45 | 17.534884 | 0.86 | 0 | 0 | 0.6 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.057143 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
dd3ceb4df61e5cb33efb561593d4ac38db9c252b | 512 | py | Python | tcp_windows_scan.py | froyo75/scapy_scan_dos | 7ddd2516aea51bf2dc0e750cb1ac8af0c822e9c5 | [
"MIT"
] | 1 | 2016-10-19T16:58:07.000Z | 2016-10-19T16:58:07.000Z | tcp_windows_scan.py | froyo75/scapy_scan_dos | 7ddd2516aea51bf2dc0e750cb1ac8af0c822e9c5 | [
"MIT"
] | null | null | null | tcp_windows_scan.py | froyo75/scapy_scan_dos | 7ddd2516aea51bf2dc0e750cb1ac8af0c822e9c5 | [
"MIT"
] | null | null | null | #! /usr/bin/python
import logging
logging.getLogger("scapy.runtime").setLevel(logging.ERROR)
from scapy.all import *
dst_ip = "10.0.0.1"
src_port = RandShort()
dst_port = 80
window_scan_resp = sr1(IP(dst=dst_ip)/TCP(dport=dst_port,flags="A"),timeout=10)
if window_scan_resp is None:
    print("No response")
elif window_scan_resp.haslayer(TCP):
    if window_scan_resp.getlayer(TCP).window == 0:
        print("Closed")
    elif window_scan_resp.getlayer(TCP).window > 0:
        print("Open") | 28.444444 | 80 | 0.716797 | 81 | 512 | 4.345679 | 0.518519 | 0.142045 | 0.198864 | 0.102273 | 0.210227 | 0.210227 | 0.210227 | 0.210227 | 0 | 0 | 0 | 0.026432 | 0.113281 | 512 | 18 | 81 | 28.444444 | 0.748899 | 0.033203 | 0 | 0 | 0 | 0 | 0.125523 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.142857 | null | null | 0.214286 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
dd5a5309f9ffe318d210aaf966e17baa5e7aaa12 | 392 | py | Python | requests_/auth_.py | bailihua93/pylearn | dc19ec9b9f3155f89ed6a09e3f0a04135cea14e6 | [
"MIT"
] | null | null | null | requests_/auth_.py | bailihua93/pylearn | dc19ec9b9f3155f89ed6a09e3f0a04135cea14e6 | [
"MIT"
] | null | null | null | requests_/auth_.py | bailihua93/pylearn | dc19ec9b9f3155f89ed6a09e3f0a04135cea14e6 | [
"MIT"
] | null | null | null | import requests
from requests.auth import HTTPBasicAuth
r = requests.get('http://localhost:5000', auth=HTTPBasicAuth('username', 'password'))
r = requests.get('http://localhost:5000', auth=('username', 'password'))  # shorthand: a plain tuple defaults to HTTP Basic auth
print(r.status_code)
from requests_oauthlib import OAuth1
url = ''
auth = OAuth1('APP_key', 'App_secret', 'user_auth_key', 'user_token_secret')
requests.get(url, auth=auth) | 35.636364 | 85 | 0.752551 | 54 | 392 | 5.333333 | 0.481481 | 0.114583 | 0.083333 | 0.111111 | 0.229167 | 0.229167 | 0.229167 | 0 | 0 | 0 | 0 | 0.027701 | 0.079082 | 392 | 11 | 86 | 35.636364 | 0.770083 | 0.020408 | 0 | 0 | 0 | 0 | 0.318538 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.222222 | 0.333333 | 0 | 0.333333 | 0.111111 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 2 |
dd5be5647c873397600a93049e370750b010f655 | 2,106 | py | Python | shimmer/apps/BtStream/python/accelCalRead.py | tinyos-io/tinyos-3.x-contrib | 3aaf036722a2afc0c0aad588459a5c3e00bd3c01 | [
"BSD-3-Clause",
"MIT"
] | null | null | null | shimmer/apps/BtStream/python/accelCalRead.py | tinyos-io/tinyos-3.x-contrib | 3aaf036722a2afc0c0aad588459a5c3e00bd3c01 | [
"BSD-3-Clause",
"MIT"
] | null | null | null | shimmer/apps/BtStream/python/accelCalRead.py | tinyos-io/tinyos-3.x-contrib | 3aaf036722a2afc0c0aad588459a5c3e00bd3c01 | [
"BSD-3-Clause",
"MIT"
] | null | null | null | #!/usr/bin/python
import sys, struct, array, time, serial
def wait_for_ack():
    # block until the shimmer acknowledges the command with a 0xff byte
    ddata = b""
    ack = struct.pack('B', 0xff)
    while ddata != ack:
        ddata = ser.read(1)
    return
if len(sys.argv) < 2:
    print("no device specified")
    print("You need to specify the serial port of the shimmer you wish to connect to")
    print("example:")
    print("   accelCalRead.py Com5")
    print("or")
    print("   accelCalRead.py /dev/rfcomm0")
    print()
else:
    ser = serial.Serial(sys.argv[1], 115200)
    ser.flushInput()
    # send get accel calibration command
    ser.write(struct.pack('B', 0x13))
    # read the acknowledgement
    wait_for_ack()
    print("Acknowledgement received to get accel calibration command")
    # wait for calibration response
    ddata = b""
    response = struct.pack('B', 0x12)
    while ddata != response:
        ddata = ser.read(1)
    print("Accel calibration response:")
    # read incoming data
    ddata = b""
    numbytes = 0
    framesize = 21
    while numbytes < framesize:
        ddata += ser.read(framesize)
        numbytes = len(ddata)
    data = ddata[0:framesize]
    ddata = ddata[framesize:]
    numbytes = len(ddata)
    print("Raw packet received from shimmer:", end=' ')
    print(",".join("0x{:02x}".format(b) for b in data))
    print()
    (Xoffset, Yoffset, Zoffset, Xsensitivity, Ysensitivity, Zsensitivity, align0, align1, align2, align3, align4, align5, align6, align7, align8) = struct.unpack('>hhhhhhbbbbbbbbb', data)
    print("Offset Vector (ba) | Sensitivity Matrix (Ka) | Alignment Matrix (Ra)")
    print(" %4d |  %4d     0     0 |" % (Xoffset, Xsensitivity), end=' ')
    print('{: .2f} {: .2f} {: .2f}'.format(float(align0) / 100, float(align1) / 100, float(align2) / 100))
    print(" %4d |     0  %4d     0 |" % (Yoffset, Ysensitivity), end=' ')
    print('{: .2f} {: .2f} {: .2f}'.format(float(align3) / 100, float(align4) / 100, float(align5) / 100))
    print(" %4d |     0     0  %4d |" % (Zoffset, Zsensitivity), end=' ')
    print('{: .2f} {: .2f} {: .2f}'.format(float(align6) / 100, float(align7) / 100, float(align8) / 100))
    print()
    ser.close()
| 31.909091 | 187 | 0.614435 | 263 | 2,106 | 4.904943 | 0.406844 | 0.018605 | 0.025581 | 0.025581 | 0.051163 | 0.051163 | 0 | 0 | 0 | 0 | 0 | 0.05768 | 0.24264 | 2,106 | 65 | 188 | 32.4 | 0.751097 | 0.059829 | 0 | 0.208333 | 0 | 0 | 0.293611 | 0 | 0 | 0 | 0.006085 | 0 | 0 | 0 | null | null | 0 | 0.020833 | null | null | 0.416667 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 2 |
dd71cc71b09574dbccbe2f8e9392587a4b1fb366 | 1,095 | py | Python | core/report.py | alnvdl/oks | 4e00b2c2561b0745dcc81b7583de79f7e2f898bb | [
"Apache-2.0"
] | 2 | 2021-05-12T00:02:14.000Z | 2021-08-06T12:49:29.000Z | core/report.py | alnvdl/oks | 4e00b2c2561b0745dcc81b7583de79f7e2f898bb | [
"Apache-2.0"
] | null | null | null | core/report.py | alnvdl/oks | 4e00b2c2561b0745dcc81b7583de79f7e2f898bb | [
"Apache-2.0"
] | 1 | 2021-05-12T00:02:39.000Z | 2021-05-12T00:02:39.000Z | #!/usr/bin/env python
#-*- coding:utf-8 -*-
import datetime
from core.output import *
class Report:
NAME = ""
DESCRIPTION = ""
OPTIONS = []
def __init__(self, db):
self.db = db
self.data = []
self.bits = []
# An option has the following syntax:
# (name, type_, label, value)
#
# name is an ID for the option
# type_ is the option type
# label is the label shown to the user
# value is the default value for the option
for (name, type_, label, value) in self.OPTIONS:
self.set_option(name, value)
def register_option(self, option, type_, label, value):
self.OPTIONS.append((option, type_, label, value))
self.set_option(option, value)
def get_option(self, option):
return getattr(self, option)
def set_option(self, option, value):
setattr(self, option, value)
def make(self):
self.data = []
self.bits = []
def make_output(self):
self.make()
| 24.886364 | 59 | 0.547032 | 132 | 1,095 | 4.424242 | 0.363636 | 0.077055 | 0.09589 | 0.054795 | 0.082192 | 0 | 0 | 0 | 0 | 0 | 0 | 0.001397 | 0.346119 | 1,095 | 43 | 60 | 25.465116 | 0.814246 | 0.217352 | 0 | 0.166667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.083333 | 0.041667 | 0.541667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
dd732d7dfc1e0fdf4c7b864d24325edb7b1c9989 | 342 | py | Python | tsne_precompute/1_precompute_tsne.py | emcoglab/sensorimotor-distance-calculator | 496390f99a0641cb7ff0efc16e100b32be2cdb51 | [
"CC-BY-4.0"
] | 1 | 2021-09-28T15:30:15.000Z | 2021-09-28T15:30:15.000Z | tsne_precompute/1_precompute_tsne.py | emcoglab/sensorimotor-distance-calculator | 496390f99a0641cb7ff0efc16e100b32be2cdb51 | [
"CC-BY-4.0"
] | null | null | null | tsne_precompute/1_precompute_tsne.py | emcoglab/sensorimotor-distance-calculator | 496390f99a0641cb7ff0efc16e100b32be2cdb51 | [
"CC-BY-4.0"
] | null | null | null | from config import Config
from sensorimotor_norms.config.config import Config as SMConfig; SMConfig(use_config_overrides_from_file=Config.path)
from tsne import valid_distance_names, SensorimotorTSNE
if __name__ == '__main__':
for distance in valid_distance_names:
for dim in [2, 3]:
SensorimotorTSNE(dim, distance)
| 34.2 | 117 | 0.77193 | 45 | 342 | 5.488889 | 0.533333 | 0.097166 | 0.145749 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.007018 | 0.166667 | 342 | 9 | 118 | 38 | 0.859649 | 0 | 0 | 0 | 0 | 0 | 0.023392 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.428571 | 0 | 0.428571 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
dd84600b9617818e191cff7ec208314c7743e8df | 693 | py | Python | mysite/CustomerApps/migrations/0004_auto_20201207_2301.py | denandreychuk/Django | 792483d11b6c06fe16c47846e53f7809c54c526e | [
"MIT"
] | null | null | null | mysite/CustomerApps/migrations/0004_auto_20201207_2301.py | denandreychuk/Django | 792483d11b6c06fe16c47846e53f7809c54c526e | [
"MIT"
] | null | null | null | mysite/CustomerApps/migrations/0004_auto_20201207_2301.py | denandreychuk/Django | 792483d11b6c06fe16c47846e53f7809c54c526e | [
"MIT"
] | null | null | null | # Generated by Django 3.1.4 on 2020-12-07 21:01
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('CustomerApps', '0003_auto_20201207_1922'),
]
operations = [
migrations.RemoveField(
model_name='customerapp',
name='paid_status',
),
migrations.AddField(
model_name='customerapp',
name='status',
field=models.CharField(default='Test', max_length=20),
),
migrations.AlterField(
model_name='customerapp',
name='token',
field=models.CharField(max_length=10, unique=True),
),
]
| 24.75 | 66 | 0.575758 | 67 | 693 | 5.820896 | 0.671642 | 0.069231 | 0.153846 | 0.184615 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.073069 | 0.308802 | 693 | 27 | 67 | 25.666667 | 0.741127 | 0.064935 | 0 | 0.285714 | 1 | 0 | 0.145511 | 0.035604 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.047619 | 0 | 0.190476 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
dd84ecd4d4aa1107b38abe544e7f25183d58c7bc | 15,241 | py | Python | frameworks/hdfs/tests/test_shakedown.py | akshitjain/dcos-commons_edited | 371675b07971afc1604800b0fa6b6ce11ae3a705 | [
"Apache-2.0"
] | null | null | null | frameworks/hdfs/tests/test_shakedown.py | akshitjain/dcos-commons_edited | 371675b07971afc1604800b0fa6b6ce11ae3a705 | [
"Apache-2.0"
] | null | null | null | frameworks/hdfs/tests/test_shakedown.py | akshitjain/dcos-commons_edited | 371675b07971afc1604800b0fa6b6ce11ae3a705 | [
"Apache-2.0"
] | null | null | null | import pytest
import time
import xml.etree.ElementTree as etree
import shakedown
import sdk_cmd as cmd
import sdk_hosts as hosts
import sdk_install as install
import sdk_marathon as marathon
import sdk_plan as plan
import sdk_tasks as tasks
import sdk_utils as utils
from tests.config import *
def setup_module(module):
install.uninstall(FOLDERED_SERVICE_NAME, package_name=PACKAGE_NAME)
utils.gc_frameworks()
install.install(
PACKAGE_NAME,
DEFAULT_TASK_COUNT,
service_name=FOLDERED_SERVICE_NAME,
additional_options={"service": { "name": FOLDERED_SERVICE_NAME } })
plan.wait_for_completed_deployment(FOLDERED_SERVICE_NAME)
def setup_function(function):
check_healthy()
def teardown_module(module):
install.uninstall(FOLDERED_SERVICE_NAME, package_name=PACKAGE_NAME)
@pytest.mark.sanity
def test_endpoints():
# check that we can reach the scheduler via admin router, and that returned endpoints are sanitized:
core_site = etree.fromstring(cmd.run_cli('hdfs --name={} endpoints core-site.xml'.format(FOLDERED_SERVICE_NAME)))
check_properties(core_site, {
'ha.zookeeper.parent-znode': '/dcos-service-test__integration__hdfs/hadoop-ha'
})
hdfs_site = etree.fromstring(cmd.run_cli('hdfs --name={} endpoints hdfs-site.xml'.format(FOLDERED_SERVICE_NAME)))
expect = {
'dfs.namenode.shared.edits.dir': 'qjournal://' + ';'.join([
hosts.autoip_host(FOLDERED_SERVICE_NAME, 'journal-{}-node'.format(i), 8485) for i in range(3)]) + '/hdfs',
}
for i in range(2):
expect['dfs.namenode.rpc-address.hdfs.name-{}-node'.format(i)] = hosts.autoip_host(FOLDERED_SERVICE_NAME, 'name-{}-node'.format(i), 9001)
expect['dfs.namenode.http-address.hdfs.name-{}-node'.format(i)] = hosts.autoip_host(FOLDERED_SERVICE_NAME, 'name-{}-node'.format(i), 9002)
check_properties(hdfs_site, expect)
def check_properties(xml, expect):
found = {}
for prop in xml.findall('property'):
name = prop.find('name').text
if name in expect:
found[name] = prop.find('value').text
utils.out('expect: {}\nfound: {}'.format(expect, found))
assert expect == found
@pytest.mark.skip(reason="HDFS-451")
@pytest.mark.data_integrity
@pytest.mark.sanity
def test_integrity_on_data_node_failure():
write_some_data('data-0-node', TEST_FILE_1_NAME)
    # give the write a chance to succeed and for replication to occur
time.sleep(9)
tasks.kill_task_with_pattern("DataNode", hosts.system_host(FOLDERED_SERVICE_NAME, 'data-0-node'))
tasks.kill_task_with_pattern("DataNode", hosts.system_host(FOLDERED_SERVICE_NAME, 'data-1-node'))
time.sleep(1) # give DataNode a chance to die
read_some_data('data-2-node', TEST_FILE_1_NAME)
check_healthy()
@pytest.mark.skip(reason="HDFS-451")
@pytest.mark.data_integrity
@pytest.mark.sanity
def test_integrity_on_name_node_failure():
"""
The first name node (name-0-node) is the active name node by default when HDFS gets installed.
This test checks that it is possible to write and read data after the first name node fails.
"""
tasks.kill_task_with_pattern("NameNode", hosts.system_host(FOLDERED_SERVICE_NAME, 'name-0-node'))
time.sleep(1) # give NameNode a chance to die
write_some_data('data-0-node', TEST_FILE_2_NAME)
read_some_data('data-2-node', TEST_FILE_2_NAME)
check_healthy()
@pytest.mark.recovery
def test_kill_journal_node():
journal_ids = tasks.get_task_ids(FOLDERED_SERVICE_NAME, 'journal-0')
name_ids = tasks.get_task_ids(FOLDERED_SERVICE_NAME, 'name')
data_ids = tasks.get_task_ids(FOLDERED_SERVICE_NAME, 'data')
tasks.kill_task_with_pattern('journalnode', hosts.system_host(FOLDERED_SERVICE_NAME, 'journal-0-node'))
check_healthy()
tasks.check_tasks_updated(FOLDERED_SERVICE_NAME, 'journal', journal_ids)
tasks.check_tasks_not_updated(FOLDERED_SERVICE_NAME, 'name', name_ids)
tasks.check_tasks_not_updated(FOLDERED_SERVICE_NAME, 'data', data_ids)
@pytest.mark.sanity
@pytest.mark.recovery
def test_kill_name_node():
name_ids = tasks.get_task_ids(FOLDERED_SERVICE_NAME, 'name-0')
journal_ids = tasks.get_task_ids(FOLDERED_SERVICE_NAME, 'journal')
data_ids = tasks.get_task_ids(FOLDERED_SERVICE_NAME, 'data')
tasks.kill_task_with_pattern('namenode', hosts.system_host(FOLDERED_SERVICE_NAME, 'name-0-node'))
check_healthy()
tasks.check_tasks_updated(FOLDERED_SERVICE_NAME, 'name', name_ids)
tasks.check_tasks_not_updated(FOLDERED_SERVICE_NAME, 'journal', journal_ids)
tasks.check_tasks_not_updated(FOLDERED_SERVICE_NAME, 'data', data_ids)
@pytest.mark.sanity
@pytest.mark.recovery
def test_kill_data_node():
data_ids = tasks.get_task_ids(FOLDERED_SERVICE_NAME, 'data-0')
journal_ids = tasks.get_task_ids(FOLDERED_SERVICE_NAME, 'journal')
name_ids = tasks.get_task_ids(FOLDERED_SERVICE_NAME, 'name')
tasks.kill_task_with_pattern('datanode', hosts.system_host(FOLDERED_SERVICE_NAME, 'data-0-node'))
check_healthy()
tasks.check_tasks_updated(FOLDERED_SERVICE_NAME, 'data', data_ids)
tasks.check_tasks_not_updated(FOLDERED_SERVICE_NAME, 'journal', journal_ids)
tasks.check_tasks_not_updated(FOLDERED_SERVICE_NAME, 'name', name_ids)
@pytest.mark.sanity
@pytest.mark.recovery
def test_kill_scheduler():
tasks.kill_task_with_pattern('hdfs.scheduler.Main', shakedown.get_service_ips('marathon').pop())
check_healthy()
@pytest.mark.sanity
@pytest.mark.recovery
def test_kill_all_journalnodes():
journal_ids = tasks.get_task_ids(FOLDERED_SERVICE_NAME, 'journal')
data_ids = tasks.get_task_ids(FOLDERED_SERVICE_NAME, 'data')
for host in shakedown.get_service_ips(FOLDERED_SERVICE_NAME):
tasks.kill_task_with_pattern('journalnode', host)
check_healthy()
# name nodes fail and restart, so don't check those
tasks.check_tasks_updated(FOLDERED_SERVICE_NAME, 'journal', journal_ids)
tasks.check_tasks_not_updated(FOLDERED_SERVICE_NAME, 'data', data_ids)
@pytest.mark.sanity
@pytest.mark.recovery
def test_kill_all_namenodes():
journal_ids = tasks.get_task_ids(FOLDERED_SERVICE_NAME, 'journal')
name_ids = tasks.get_task_ids(FOLDERED_SERVICE_NAME, 'name')
data_ids = tasks.get_task_ids(FOLDERED_SERVICE_NAME, 'data')
for host in shakedown.get_service_ips(FOLDERED_SERVICE_NAME):
tasks.kill_task_with_pattern('namenode', host)
check_healthy()
tasks.check_tasks_updated(FOLDERED_SERVICE_NAME, 'name', name_ids)
tasks.check_tasks_not_updated(FOLDERED_SERVICE_NAME, 'journal', journal_ids)
tasks.check_tasks_not_updated(FOLDERED_SERVICE_NAME, 'data', data_ids)
@pytest.mark.sanity
@pytest.mark.recovery
def test_kill_all_datanodes():
journal_ids = tasks.get_task_ids(FOLDERED_SERVICE_NAME, 'journal')
name_ids = tasks.get_task_ids(FOLDERED_SERVICE_NAME, 'name')
data_ids = tasks.get_task_ids(FOLDERED_SERVICE_NAME, 'data')
for host in shakedown.get_service_ips(FOLDERED_SERVICE_NAME):
tasks.kill_task_with_pattern('datanode', host)
check_healthy()
tasks.check_tasks_updated(FOLDERED_SERVICE_NAME, 'data', data_ids)
tasks.check_tasks_not_updated(FOLDERED_SERVICE_NAME, 'journal', journal_ids)
tasks.check_tasks_not_updated(FOLDERED_SERVICE_NAME, 'name', name_ids)
@pytest.mark.sanity
@pytest.mark.recovery
def test_permanently_replace_namenodes():
replace_name_node(0)
replace_name_node(1)
replace_name_node(0)
@pytest.mark.sanity
@pytest.mark.recovery
def test_permanent_and_transient_namenode_failures_0_1():
check_healthy()
name_0_ids = tasks.get_task_ids(FOLDERED_SERVICE_NAME, 'name-0')
name_1_ids = tasks.get_task_ids(FOLDERED_SERVICE_NAME, 'name-1')
journal_ids = tasks.get_task_ids(FOLDERED_SERVICE_NAME, 'journal')
data_ids = tasks.get_task_ids(FOLDERED_SERVICE_NAME, 'data')
cmd.run_cli('hdfs --name={} pods replace name-0'.format(FOLDERED_SERVICE_NAME))
cmd.run_cli('hdfs --name={} pods restart name-1'.format(FOLDERED_SERVICE_NAME))
check_healthy()
tasks.check_tasks_updated(FOLDERED_SERVICE_NAME, 'name-0', name_0_ids)
tasks.check_tasks_updated(FOLDERED_SERVICE_NAME, 'name-1', name_1_ids)
tasks.check_tasks_not_updated(FOLDERED_SERVICE_NAME, 'journal', journal_ids)
tasks.check_tasks_not_updated(FOLDERED_SERVICE_NAME, 'data', data_ids)
@pytest.mark.sanity
@pytest.mark.recovery
def test_permanent_and_transient_namenode_failures_1_0():
check_healthy()
name_0_ids = tasks.get_task_ids(FOLDERED_SERVICE_NAME, 'name-0')
name_1_ids = tasks.get_task_ids(FOLDERED_SERVICE_NAME, 'name-1')
journal_ids = tasks.get_task_ids(FOLDERED_SERVICE_NAME, 'journal')
data_ids = tasks.get_task_ids(FOLDERED_SERVICE_NAME, 'data')
cmd.run_cli('hdfs --name={} pods replace name-1'.format(FOLDERED_SERVICE_NAME))
cmd.run_cli('hdfs --name={} pods restart name-0'.format(FOLDERED_SERVICE_NAME))
check_healthy()
tasks.check_tasks_updated(FOLDERED_SERVICE_NAME, 'name-0', name_0_ids)
tasks.check_tasks_updated(FOLDERED_SERVICE_NAME, 'name-1', name_1_ids)
tasks.check_tasks_not_updated(FOLDERED_SERVICE_NAME, 'journal', journal_ids)
tasks.check_tasks_not_updated(FOLDERED_SERVICE_NAME, 'data', data_ids)
@pytest.mark.smoke
def test_install():
check_healthy()
@pytest.mark.sanity
def test_bump_journal_cpus():
journal_ids = tasks.get_task_ids(FOLDERED_SERVICE_NAME, 'journal')
utils.out('journal ids: ' + str(journal_ids))
marathon.bump_cpu_count_config(FOLDERED_SERVICE_NAME, 'JOURNAL_CPUS')
tasks.check_tasks_updated(FOLDERED_SERVICE_NAME, 'journal', journal_ids)
check_healthy()
@pytest.mark.sanity
def test_bump_data_nodes():
data_ids = tasks.get_task_ids(FOLDERED_SERVICE_NAME, 'data')
utils.out('data ids: ' + str(data_ids))
marathon.bump_task_count_config(FOLDERED_SERVICE_NAME, 'DATA_COUNT')
check_healthy(DEFAULT_TASK_COUNT + 1)
tasks.check_tasks_not_updated(FOLDERED_SERVICE_NAME, 'data', data_ids)
@pytest.mark.sanity
def test_modify_app_config():
app_config_field = 'TASKCFG_ALL_CLIENT_READ_SHORTCIRCUIT_STREAMS_CACHE_SIZE_EXPIRY_MS'
journal_ids = tasks.get_task_ids(FOLDERED_SERVICE_NAME, 'journal')
name_ids = tasks.get_task_ids(FOLDERED_SERVICE_NAME, 'name')
config = marathon.get_config(FOLDERED_SERVICE_NAME)
utils.out('marathon config: ')
utils.out(config)
expiry_ms = int(config['env'][app_config_field])
config['env'][app_config_field] = str(expiry_ms + 1)
marathon.update_app(FOLDERED_SERVICE_NAME, config)
# All tasks should be updated because hdfs-site.xml has changed
check_healthy()
tasks.check_tasks_updated(FOLDERED_SERVICE_NAME, 'journal', journal_ids)
tasks.check_tasks_updated(FOLDERED_SERVICE_NAME, 'name', name_ids)
tasks.check_tasks_updated(FOLDERED_SERVICE_NAME, 'data', journal_ids)
@pytest.mark.sanity
def test_modify_app_config_rollback():
app_config_field = 'TASKCFG_ALL_CLIENT_READ_SHORTCIRCUIT_STREAMS_CACHE_SIZE_EXPIRY_MS'
journal_ids = tasks.get_task_ids(FOLDERED_SERVICE_NAME, 'journal')
data_ids = tasks.get_task_ids(FOLDERED_SERVICE_NAME, 'data')
old_config = marathon.get_config(FOLDERED_SERVICE_NAME)
config = marathon.get_config(FOLDERED_SERVICE_NAME)
utils.out('marathon config: ')
utils.out(config)
expiry_ms = int(config['env'][app_config_field])
utils.out('expiry ms: ' + str(expiry_ms))
config['env'][app_config_field] = str(expiry_ms + 1)
marathon.update_app(FOLDERED_SERVICE_NAME, config)
# Wait for journal nodes to be affected by the change
tasks.check_tasks_updated(FOLDERED_SERVICE_NAME, 'journal', journal_ids)
journal_ids = tasks.get_task_ids(FOLDERED_SERVICE_NAME, 'journal')
utils.out('old config: ')
utils.out(old_config)
# Put the old config back (rollback)
marathon.update_app(FOLDERED_SERVICE_NAME, old_config)
# Wait for the journal nodes to return to their old configuration
tasks.check_tasks_updated(FOLDERED_SERVICE_NAME, 'journal', journal_ids)
check_healthy()
config = marathon.get_config(FOLDERED_SERVICE_NAME)
assert int(config['env'][app_config_field]) == expiry_ms
# Data tasks should not have been affected
tasks.check_tasks_not_updated(FOLDERED_SERVICE_NAME, 'data', data_ids)
def replace_name_node(index):
check_healthy()
name_node_name = 'name-' + str(index)
name_id = tasks.get_task_ids(FOLDERED_SERVICE_NAME, name_node_name)
journal_ids = tasks.get_task_ids(FOLDERED_SERVICE_NAME, 'journal')
data_ids = tasks.get_task_ids(FOLDERED_SERVICE_NAME, 'data')
cmd.run_cli('hdfs --name={} pods replace {}'.format(FOLDERED_SERVICE_NAME, name_node_name))
check_healthy()
tasks.check_tasks_updated(FOLDERED_SERVICE_NAME, name_node_name, name_id)
tasks.check_tasks_not_updated(FOLDERED_SERVICE_NAME, 'journal', journal_ids)
tasks.check_tasks_not_updated(FOLDERED_SERVICE_NAME, 'data', data_ids)
def write_some_data(data_node_name, file_name):
def write_data_to_hdfs():
write_command = "echo '{}' | ./bin/hdfs dfs -put - /{}".format(TEST_CONTENT_SMALL, file_name)
rc, _ = run_hdfs_command(data_node_name, write_command)
        # shakedown returns rc as a boolean; True corresponds to exit status 0
return rc
shakedown.wait_for(lambda: write_data_to_hdfs(), timeout_seconds=HDFS_CMD_TIMEOUT_SEC)
def read_some_data(data_node_name, file_name):
def read_data_from_hdfs():
read_command = "./bin/hdfs dfs -cat /{}".format(file_name)
rc, output = run_hdfs_command(data_node_name, read_command)
return rc and output.rstrip() == TEST_CONTENT_SMALL
    shakedown.wait_for(read_data_from_hdfs, timeout_seconds=HDFS_CMD_TIMEOUT_SEC)
def run_hdfs_command(task_name, command):
"""
Go into the Data Node hdfs directory, set JAVA_HOME, and execute the command.
"""
host = hosts.system_host(FOLDERED_SERVICE_NAME, task_name)
java_home = find_java_home(host)
# Find hdfs home directory by looking up the Data Node process.
# Hdfs directory is found in an arg to the java command.
hdfs_dir_cmd = """ps -ef | grep hdfs | grep DataNode \
| awk 'BEGIN {RS=" "}; /-Dhadoop.home.dir/' | sed s/-Dhadoop.home.dir=//"""
full_command = """cd $({}) &&
export JAVA_HOME={} &&
{}""".format(hdfs_dir_cmd, java_home, command)
rc, output = shakedown.run_command_on_agent(host, full_command)
return rc, output
def find_java_home(host):
"""
Find java home by looking up the Data Node process.
Java home is found in the process command.
"""
java_home_cmd = """ps -ef | grep hdfs | grep DataNode | grep -v grep \
| awk '{print $8}' | sed s:/bin/java::"""
rc, output = shakedown.run_command_on_agent(host, java_home_cmd)
assert rc
java_home = output.rstrip()
utils.out("java_home: {}".format(java_home))
return java_home
def check_healthy(count=DEFAULT_TASK_COUNT):
plan.wait_for_completed_deployment(FOLDERED_SERVICE_NAME, timeout_seconds=20 * 60)
plan.wait_for_completed_recovery(FOLDERED_SERVICE_NAME, timeout_seconds=20 * 60)
tasks.check_running(FOLDERED_SERVICE_NAME, count)
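The write and read helpers above lean on `shakedown.wait_for` to retry a flaky HDFS command until it succeeds or a timeout expires. A minimal standalone sketch of that polling pattern (a hypothetical helper, not shakedown's actual implementation):

```python
import time

def wait_for(predicate, timeout_seconds=10, sleep_seconds=0.01):
    """Poll predicate until it returns a truthy value or the timeout expires."""
    deadline = time.monotonic() + timeout_seconds
    while time.monotonic() < deadline:
        result = predicate()
        if result:
            return result
        time.sleep(sleep_seconds)
    raise TimeoutError('predicate did not become true within {}s'.format(timeout_seconds))

# A flaky operation that only succeeds on the third attempt.
attempts = {'count': 0}

def flaky_write():
    attempts['count'] += 1
    return attempts['count'] >= 3

assert wait_for(flaky_write, timeout_seconds=5)
```

Because the predicate's truthiness drives the loop, returning the boolean `rc` (as `write_data_to_hdfs` does) is enough to make the retry work.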
| 38.78117 | 146 | 0.753231 | 2,202 | 15,241 | 4.840599 | 0.118074 | 0.113519 | 0.192513 | 0.087813 | 0.73262 | 0.703068 | 0.666854 | 0.642931 | 0.590018 | 0.575664 | 0 | 0.006168 | 0.138311 | 15,241 | 392 | 147 | 38.880102 | 0.805452 | 0.068434 | 0 | 0.505455 | 0 | 0.003636 | 0.122913 | 0.023917 | 0 | 0 | 0 | 0 | 0.010909 | 1 | 0.109091 | false | 0 | 0.043636 | 0 | 0.167273 | 0.003636 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
dd8a1dee57b91dd2fdbcf654cb5eef4bd921ea57 | 172 | py | Python | declarative/version.py | mccullerlp/python-declarative | 1c17c2ceb5ec5c0faac8160f57c03cc069e293cc | [
"Apache-2.0"
] | 6 | 2018-02-28T18:32:06.000Z | 2022-03-20T13:04:05.000Z | declarative/version.py | mccullerlp/python-declarative | 1c17c2ceb5ec5c0faac8160f57c03cc069e293cc | [
"Apache-2.0"
] | 2 | 2021-02-22T17:18:59.000Z | 2021-03-03T16:39:22.000Z | declarative/version.py | mccullerlp/python-declarative | 1c17c2ceb5ec5c0faac8160f57c03cc069e293cc | [
"Apache-2.0"
] | 1 | 2021-02-09T18:58:53.000Z | 2021-02-09T18:58:53.000Z | """
"""
from __future__ import division, print_function, unicode_literals
version_info = (1, 3, 2)
version = '.'.join(str(v) for v in version_info)
__version__ = version
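Keeping `version_info` as a tuple of ints alongside the dotted string is deliberate: tuples compare component-wise, while dotted strings compare character-by-character and mis-order double-digit components. A small illustration:

```python
version_info = (1, 3, 2)
version = '.'.join(str(v) for v in version_info)

assert version == '1.3.2'
# Tuples of ints order versions correctly...
assert (1, 3, 2) < (1, 10, 0)
# ...while the dotted strings do not ('3' > '1' decides the comparison).
assert not ('1.3.2' < '1.10.0')
```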
| 19.111111 | 65 | 0.72093 | 24 | 172 | 4.666667 | 0.75 | 0.196429 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.020408 | 0.145349 | 172 | 8 | 66 | 21.5 | 0.741497 | 0 | 0 | 0 | 0 | 0 | 0.006098 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.25 | 0 | 0.25 | 0.25 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
dd8b52f4796e32cb00663af19a5acf3af1348375 | 2,725 | py | Python | mdssdk/parsers/interface/show_interface_transceiver_detail.py | akshatha-s13/mdssdk | 615a5528d0af1201e8fe8f305c62b258e5433990 | [
"Apache-2.0"
] | 4 | 2020-12-13T20:02:43.000Z | 2022-02-27T23:36:58.000Z | mdssdk/parsers/interface/show_interface_transceiver_detail.py | akshatha-s13/mdssdk | 615a5528d0af1201e8fe8f305c62b258e5433990 | [
"Apache-2.0"
] | 13 | 2020-09-23T07:30:15.000Z | 2022-03-30T01:12:25.000Z | mdssdk/parsers/interface/show_interface_transceiver_detail.py | akshatha-s13/mdssdk | 615a5528d0af1201e8fe8f305c62b258e5433990 | [
"Apache-2.0"
] | 12 | 2020-05-11T09:33:21.000Z | 2022-03-18T11:11:28.000Z | import logging
import re
log = logging.getLogger(__name__)
ALL_PAT = [
"^fc\d+\/\d+\s+(?P<sfp_present>.*)",
"Name is (?P<name>\S+)",
"Manufacturer's part number is (?P<part_number>\S+)",
"Cisco extended id is (?P<cisco_id>.*)",
"Cisco part number is (?P<cisco_part_number>\S+)",
"Cisco pid is (?P<cisco_product_id>\S+)",
"Nominal bit rate is (?P<bit_rate>\d+)",
"Min speed:\s+(?P<min_speed>\d+)\s+Mb/s,\s+Max speed:\s+(?P<max_speed>\d+)",
"Temperature\s+(?P<temperature>\S+ C)",
"Voltage\s+(?P<voltage>\S+ V)",
"Current\s+(?P<current>\S+ mA)",
"Tx Power\s+(?P<tx_power>\S+ dBm)",
"Rx Power\s+(?P<rx_power>\S+ dBm)",
]
class ShowInterfaceTransceiverDetail(object):
def __init__(self, outlines, vsan_id=None):
self._group_dict = {}
self.process_all(outlines)
log.debug(self._group_dict)
def process_all(self, outlines):
for eachline in outlines:
eachline = eachline.strip()
for eachpat in ALL_PAT:
match = re.search(eachpat, eachline)
if match:
self._group_dict = {**self._group_dict, **match.groupdict()}
break
@property
def sfp_present(self):
sfp = self._group_dict.get("sfp_present", None)
if sfp is None:
return None
return "sfp is present" in sfp
@property
def name(self):
return self._group_dict.get("name", None)
@property
def part_number(self):
return self._group_dict.get("part_number", None)
@property
def cisco_id(self):
return self._group_dict.get("cisco_id", None)
@property
def cisco_part_number(self):
return self._group_dict.get("cisco_part_number", None)
@property
def cisco_product_id(self):
return self._group_dict.get("cisco_product_id", None)
@property
def bit_rate(self):
bit_rate = self._group_dict.get("bit_rate", None)
if bit_rate is not None:
return int(bit_rate)
return None
@property
def min_speed(self):
return self._group_dict.get("min_speed", None)
@property
def max_speed(self):
return self._group_dict.get("max_speed", None)
@property
def temperature(self):
return self._group_dict.get("temperature", None)
@property
def voltage(self):
return self._group_dict.get("voltage", None)
@property
def current(self):
return self._group_dict.get("current", None)
@property
def tx_power(self):
return self._group_dict.get("tx_power", None)
@property
def rx_power(self):
return self._group_dict.get("rx_power", None)
| 27.525253 | 80 | 0.606972 | 373 | 2,725 | 4.198391 | 0.182306 | 0.103448 | 0.149425 | 0.14304 | 0.275224 | 0.275224 | 0.170498 | 0.088123 | 0 | 0 | 0 | 0 | 0.251009 | 2,725 | 98 | 81 | 27.806122 | 0.767271 | 0 | 0 | 0.202532 | 0 | 0.012658 | 0.235229 | 0.103853 | 0 | 0 | 0 | 0 | 0 | 1 | 0.202532 | false | 0 | 0.025316 | 0.151899 | 0.443038 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
dd8e7940705b82390e66a30f34236b2ceb8e4993 | 71 | py | Python | medusa/func/keyword_only.py | deadwind4/medusa | f2d2041349e70a7321f0e1f1c30137ac8769edd5 | [
"MIT"
] | null | null | null | medusa/func/keyword_only.py | deadwind4/medusa | f2d2041349e70a7321f0e1f1c30137ac8769edd5 | [
"MIT"
] | null | null | null | medusa/func/keyword_only.py | deadwind4/medusa | f2d2041349e70a7321f0e1f1c30137ac8769edd5 | [
"MIT"
] | null | null | null |
def foo(a, b, *, bar=True):
print(bar)
# Calling with a third positional argument raises a TypeError.
foo(1, 2, 3)
| 6.454545 | 27 | 0.492958 | 13 | 71 | 2.692308 | 0.846154 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.06 | 0.295775 | 71 | 10 | 28 | 7.1 | 0.64 | 0.084507 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0 | 0 | 0.333333 | 0.333333 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
dd9547da00f29f206304758b935bd35185ebb4db | 1,594 | py | Python | python/process_simulation/src/defender/defender.py | amir-heinisch/snippets | a193def6fa16ba85e045940ee1f6450e80eaa0d3 | [
"MIT"
] | null | null | null | python/process_simulation/src/defender/defender.py | amir-heinisch/snippets | a193def6fa16ba85e045940ee1f6450e80eaa0d3 | [
"MIT"
] | null | null | null | python/process_simulation/src/defender/defender.py | amir-heinisch/snippets | a193def6fa16ba85e045940ee1f6450e80eaa0d3 | [
"MIT"
] | null | null | null | """
This is the abstract defender base class.
A defender can see all values at the beginning of
each simulation round and can try to detect an attack.
It is also possible to first let the defender learn
before running an attack.
"""
from abc import ABC, abstractmethod
__author__ = 'Amir Heinisch'
__email__ = 'mail@amir-heinisch.de'
__date__ = '2020/03/18'
class AttackDetected(Exception):
""" This exception should be raised if the defender detects an attack. """
def __init__(self, message):
self.message = message
class Defender(ABC):
    # Number of rounds before starting to defend.
learningRounds = 0
    # Ignore the first few detections.
ignore = 0
def __init__(self, config=None):
""" Init defender with given config """
        try:
            self.learningRounds = int(config['roundsToLearn'])
            self.ignore = int(config['ignore'])
        except (TypeError, KeyError, ValueError):
            # Keep the class defaults when no (or an incomplete) config is given.
            pass
def attackDetected(self, msg="Attack detected"):
if self.ignore > 0:
self.ignore -= 1
else:
raise AttackDetected(msg)
def learn(self, processValues):
""" This method should implement a learning algorithm. """
pass
# TODO: make this abstract.
def detect(self, processValues):
"""
This method should implement the detection algorithm.
        TODO: maybe it makes sense to implement more detection hooks,
        which the attacker calls after running the control loops, or
        even after each control loop.
"""
pass
| 27.964912 | 78 | 0.629235 | 192 | 1,594 | 5.119792 | 0.53125 | 0.024415 | 0.02238 | 0.054934 | 0.085453 | 0.085453 | 0 | 0 | 0 | 0 | 0 | 0.010676 | 0.294856 | 1,594 | 56 | 79 | 28.464286 | 0.863879 | 0.427227 | 0 | 0.12 | 0 | 0 | 0.097257 | 0.026185 | 0 | 0 | 0 | 0.035714 | 0 | 1 | 0.2 | false | 0.12 | 0.04 | 0 | 0.4 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
06bbbbffdc4db446b17fea8dd01dca23ba961c3d | 362 | py | Python | examples/operator_matmul.py | igfish/toyvm | bb1ab371a8c71ba01522556235fc9f017c9b6b8f | [
"MIT"
] | null | null | null | examples/operator_matmul.py | igfish/toyvm | bb1ab371a8c71ba01522556235fc9f017c9b6b8f | [
"MIT"
] | null | null | null | examples/operator_matmul.py | igfish/toyvm | bb1ab371a8c71ba01522556235fc9f017c9b6b8f | [
"MIT"
] | null | null | null | # TODO
class A:
def __init__(self, value):
self.value = value
def __matmul__(self, other):
print('__matmul__')
return A(self.value * other.value)
def __imatmul__(self, other):
print('__imatmul__')
self.value *= other.value
return self
a = A(1)
b = A(2)
print((a @ b).value)
a @= b
print(a.value) | 15.73913 | 42 | 0.569061 | 49 | 362 | 3.795918 | 0.306122 | 0.193548 | 0.150538 | 0.204301 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.007813 | 0.292818 | 362 | 23 | 43 | 15.73913 | 0.71875 | 0.01105 | 0 | 0 | 0 | 0 | 0.058824 | 0 | 0 | 0 | 0 | 0.043478 | 0 | 1 | 0.2 | false | 0 | 0 | 0 | 0.4 | 0.266667 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
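One detail worth noting: when a class defines no `__imatmul__`, `a @= b` falls back to `a = a @ b`, rebinding the name to a new object instead of mutating in place. A small sketch:

```python
class M:
    def __init__(self, value):
        self.value = value

    def __matmul__(self, other):
        return M(self.value * other.value)

a = M(3)
b = M(4)
original = a
a @= b                      # no __imatmul__: equivalent to a = a @ b
assert a.value == 12
assert a is not original    # the name was rebound to a fresh object
```

In the class above, by contrast, `__imatmul__` returns `self`, so `a @= b` keeps the same object identity.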
06c0c1c28304617bdd92a59fbb6241d67a42d19b | 8,107 | py | Python | lib/Parser.py | gabsmoreira/tornado-docs-generator | c6ed6fa453016c0088fa0738ad6854db3206d802 | [
"MIT"
] | null | null | null | lib/Parser.py | gabsmoreira/tornado-docs-generator | c6ed6fa453016c0088fa0738ad6854db3206d802 | [
"MIT"
] | null | null | null | lib/Parser.py | gabsmoreira/tornado-docs-generator | c6ed6fa453016c0088fa0738ad6854db3206d802 | [
"MIT"
] | null | null | null | from Tokenizer import Tokenizer
from writer import MKDocsWriter
import json
import jsonutils
class Parser:
def run(code, file, path):
Parser.tokens = Tokenizer(code)
Parser.file = file
Parser.writer = MKDocsWriter()
Parser.path = path
ret = Parser.parseDocstring()
return ret
def parseDocstring():
while Parser.tokens.actual.type != 'EOF':
if Parser.tokens.actual.value == 'Description':
Parser.parseDescription()
continue
if Parser.tokens.actual.value == 'Parameters':
Parser.parseParameters()
continue
if Parser.tokens.actual.value == 'Response':
Parser.parseResponse()
continue
Parser.tokens.select_next()
def parseParameters():
parameters = {}
Parser.tokens.select_next()
Parser.file.write(Parser.writer.heading('Parameters:', level=4))
while Parser.tokens.actual.type != 'TITLE':
if Parser.tokens.actual.value == 'Header':
Parser.tokens.select_next()
header = Parser.parseHeader()
parameters['header'] = header
if Parser.tokens.actual.value == 'Body':
Parser.tokens.select_next()
body = Parser.parseBody()
parameters['body'] = body
if Parser.tokens.actual.value == 'Path':
Parser.tokens.select_next()
path = Parser.parsePath()
parameters['path'] = path
else:
return parameters
return parameters
def parsePath():
path = {}
Parser.file.write(Parser.writer.heading('Path parameters:', level=5))
while Parser.tokens.actual.type not in ['EOF', 'TITLE', 'SUB']:
key = Parser.tokens.actual.value
Parser.tokens.select_next()
if Parser.tokens.actual.type != 'SEPARATOR':
raise SyntaxError(f'Missing - between key and value for path, instead got {Parser.tokens.actual.value}')
Parser.tokens.select_next()
value = []
while Parser.tokens.actual.type != 'ENDLINE':
value.append(Parser.tokens.actual.value)
Parser.tokens.select_next()
value.append(Parser.tokens.actual.value)
value = ' '.join(value)
Parser.tokens.select_next()
path[key] = value
Parser.file.write(Parser.writer.table(path, keyname='Name', valuename='Description'))
return path
def parseHeader():
header = {}
Parser.file.write(Parser.writer.heading('Header parameters:', level=5))
while Parser.tokens.actual.type not in ['EOF', 'TITLE', 'SUB']:
key = Parser.tokens.actual.value
Parser.tokens.select_next()
if Parser.tokens.actual.type != 'SEPARATOR':
raise SyntaxError(f'Missing - between key and value for header, instead got {Parser.tokens.actual.value}')
Parser.tokens.select_next()
value = []
while Parser.tokens.actual.type != 'ENDLINE':
                value.append(Parser.tokens.actual.value)
Parser.tokens.select_next()
value.append(Parser.tokens.actual.value)
value = ' '.join(value)
Parser.tokens.select_next()
header[key] = value
Parser.file.write(Parser.writer.table(header))
return header
def parseSchema():
try:
file_path = Parser.tokens.actual.value
if Parser.tokens.actual.value[:2] == './':
file_path = Parser.tokens.actual.value[2:]
with open(Parser.path + file_path) as f:
data = json.load(f)
return data
except FileNotFoundError as err:
raise err
def parseSchemaBody():
try:
file_path = Parser.tokens.actual.value
if Parser.tokens.actual.value[:2] == './':
file_path = Parser.tokens.actual.value[2:]
body = jsonutils.make_payload(Parser.path + file_path)
return body
except FileNotFoundError as err:
raise err
def parseBody():
body = {}
Parser.file.write(Parser.writer.heading('Body parameters:', level=5))
if Parser.tokens.actual.type == 'FILE':
Parser.tokens.select_next()
body = Parser.parseSchemaBody()
Parser.file.write(Parser.writer.json_code(json.dumps(body)))
return body
while Parser.tokens.actual.type not in ['EOF', 'TITLE', 'SUB']:
key = Parser.tokens.actual
Parser.tokens.select_next()
if Parser.tokens.actual.type != 'SEPARATOR':
raise SyntaxError(f'Missing - between key and value for body, instead got {Parser.tokens.actual.value}')
Parser.tokens.select_next()
value = []
while Parser.tokens.actual.type != 'ENDLINE':
                value.append(Parser.tokens.actual.value)
Parser.tokens.select_next()
value.append(Parser.tokens.actual.value)
value = ' '.join(value)
Parser.tokens.select_next()
body[key.value] = value
Parser.file.write(Parser.writer.json_code(json.dumps(body)))
return body
def parseResponse():
response = {}
Parser.tokens.select_next()
Parser.file.write(Parser.writer.heading('Response:', level=4))
while Parser.tokens.actual.type not in ['EOF', 'TITLE']:
actual = Parser.tokens.actual.value
if not actual.isnumeric():
raise SyntaxError(f'Response must have a code, instead got {Parser.tokens.actual.value}')
try:
color = Parser.writer.RESPONSE_COLOR[str(actual)]
except KeyError:
color = 'red'
Parser.file.write(Parser.writer.heading(Parser.writer.text_color(f'Code: {actual}', color=color), level=5))
key_reponse = str(actual)
Parser.tokens.select_next()
body = {}
if Parser.tokens.actual.type == 'FILE':
Parser.tokens.select_next()
body = Parser.parseSchema()
Parser.tokens.select_next()
response[key_reponse] = body
while Parser.tokens.actual.type not in ['EOF', 'TITLE', 'SUB']:
key = Parser.tokens.actual.value
Parser.tokens.select_next()
if Parser.tokens.actual.type != 'SEPARATOR':
raise SyntaxError(f'Missing - between key and value for reponse, instead got {Parser.tokens.actual.value}')
Parser.tokens.select_next()
value = []
while Parser.tokens.actual.type != 'ENDLINE':
value.append(Parser.tokens.actual.value)
Parser.tokens.select_next()
value.append(Parser.tokens.actual.value)
value = ' '.join(value)
Parser.tokens.select_next()
body[key] = value
response[key_reponse] = body
try:
color = Parser.writer.RESPONSE_COLOR[str(actual)]
Parser.file.write(Parser.writer.RESPONSE_NOTATION[str(actual)](Parser.writer.code(json.dumps(body))))
except KeyError:
Parser.file.write(Parser.writer.failure(Parser.writer.code(json.dumps(body))))
return response
def parseDescription():
description = []
Parser.tokens.select_next()
Parser.file.write(Parser.writer.heading('Description:', level=4))
while Parser.tokens.actual.type != 'TITLE':
description.append(Parser.tokens.actual.value)
Parser.tokens.select_next()
description = ' '.join(description)
Parser.file.write(Parser.writer.text(description))
return description | 41.574359 | 127 | 0.565561 | 833 | 8,107 | 5.451381 | 0.111645 | 0.206122 | 0.19423 | 0.141819 | 0.720766 | 0.646113 | 0.573442 | 0.554944 | 0.502092 | 0.482052 | 0 | 0.002 | 0.321697 | 8,107 | 195 | 128 | 41.574359 | 0.823786 | 0 | 0 | 0.536723 | 0 | 0 | 0.088431 | 0.017267 | 0 | 0 | 0 | 0 | 0 | 1 | 0.056497 | false | 0 | 0.022599 | 0 | 0.146893 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
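`parseHeader`, `parseBody`, and `parsePath` all share one idiom: read a key, require a `-` separator, then collect words until the end of the line. Stripped of the tokenizer, the same shape can be sketched over plain text (the sample parameter lines are illustrative):

```python
def parse_key_value_lines(lines):
    """Parse 'key - some description words' lines into a dict."""
    result = {}
    for line in lines:
        key, sep, value = line.partition(' - ')
        if not sep:
            raise SyntaxError('Missing - between key and value: {!r}'.format(line))
        result[key.strip()] = value.strip()
    return result

params = parse_key_value_lines([
    'user_id - Unique identifier of the user',
    'limit - Maximum number of results',
])
assert params['limit'] == 'Maximum number of results'
```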
06cd38dbbe54053ce0ae6120b49eb90528ac04c6 | 1,140 | py | Python | Cryptography/2.Hill Codes New/driver.py | swethapraba/SeniorYearCSElectives | 67b989ffecd5cf7508258783b0ec26468cdf94fc | [
"CNRI-Python"
] | null | null | null | Cryptography/2.Hill Codes New/driver.py | swethapraba/SeniorYearCSElectives | 67b989ffecd5cf7508258783b0ec26468cdf94fc | [
"CNRI-Python"
] | null | null | null | Cryptography/2.Hill Codes New/driver.py | swethapraba/SeniorYearCSElectives | 67b989ffecd5cf7508258783b0ec26468cdf94fc | [
"CNRI-Python"
] | null | null | null | ffrom sympy import *
# import MatrixCiphers
from MatrixCiphers import *
from Cryptoalpha import *
print("-"*50)
print("Testing Hill Codes")
code1 = Cryptoalpha("ABCDEFGHIJKLMNOPQRSTUVWXYZ!' ")
plaintext = "Don't Mine at Night!"
E = Matrix([[4,19],[13,10]])
ciphertext = encrypt(E, plaintext, code1)
print("'%s' encodes as '%s'" % (plaintext, ciphertext))
print(" using encryption matrix")
pprint(E)
print("And %s decrypts to %s" % (ciphertext, decrypt(E.inv_mod(code1.m),ciphertext, code1)))
print("-"*50)
print("Cracking a code using crib text")
ciphertext = "!NITFOITTFW!ITFULBAY"
answer = decrypt(get_decryption_matrix("ESAT", "EIZS", code1),ciphertext, code1)
print("ciphertext %s is %s" % (ciphertext, answer))
print("-"*50)
print("New messages")
newEncryptMe = "Isabelle is weird!"
encryptM = get_random_invertible_matrix(len(code1.alphabet))
encryptedText = encrypt(encryptM, newEncryptMe, code1)
print("'%s' encodes as '%s'" % (newEncryptMe, encryptedText))
print(" using encryption matrix")
pprint(encryptM)
print("And %s decrypts to %s" % (encryptedText, decrypt(encryptM.inv_mod(code1.m),encryptedText, code1)))
print("-"*50) | 35.625 | 105 | 0.72807 | 146 | 1,140 | 5.636986 | 0.438356 | 0.060753 | 0.043742 | 0.043742 | 0.1774 | 0.099635 | 0 | 0 | 0 | 0 | 0 | 0.024558 | 0.107018 | 1,140 | 32 | 106 | 35.625 | 0.78389 | 0.017544 | 0 | 0.214286 | 0 | 0 | 0.277033 | 0.025022 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.107143 | null | null | 0.571429 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 2 |
06ce9364c5e0eeef0e6ed063552f6f6e0f113248 | 851 | py | Python | A_source_code/core/make_y0.py | vanHoek-dgnm/CARBON-DISC | 3ecd5f4efba5e032d43679ee977064d6b25154a9 | [
"Naumen",
"Condor-1.1",
"MS-PL"
] | null | null | null | A_source_code/core/make_y0.py | vanHoek-dgnm/CARBON-DISC | 3ecd5f4efba5e032d43679ee977064d6b25154a9 | [
"Naumen",
"Condor-1.1",
"MS-PL"
] | null | null | null | A_source_code/core/make_y0.py | vanHoek-dgnm/CARBON-DISC | 3ecd5f4efba5e032d43679ee977064d6b25154a9 | [
"Naumen",
"Condor-1.1",
"MS-PL"
] | null | null | null | # ******************************************************
## Revision "$LastChangedDate: 2018-07-08 18:08:17 +0200 (zo, 08 jul 2018) $"
## Date "$LastChangedRevision: 1 $"
## Author "$LastChangedBy: arthurbeusen $"
## URL "$HeadURL: https://pbl.sliksvn.com/dgnm/core/make_y0.py $"
## Copyright 2019, PBL Netherlands Environmental Assessment Agency and Utrecht University.
## Reuse permitted under Gnu Public License, GPL v3.
# ******************************************************
# Import local modules.
import general_func
def make_y0(params,species):
y0 = general_func.get_amount(species)
# Append spool and rpool
#for i in params.phytoindex:
#y0.append(species[i].get_spool())
#y0.append(species[i].get_rpool())
#for i in params.pocindex:
#y0.append(species[i].get_dissolved())
return y0
| 32.730769 | 90 | 0.594595 | 99 | 851 | 5.030303 | 0.646465 | 0.048193 | 0.090361 | 0.096386 | 0.182731 | 0 | 0 | 0 | 0 | 0 | 0 | 0.051893 | 0.162162 | 851 | 25 | 91 | 34.04 | 0.646564 | 0.774383 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.25 | 0 | 0.75 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
06e453a9211c8b2e6236ebbd83beaf2e831b8b91 | 1,053 | py | Python | GOTE/utils/logger.py | Lenferd/ANSYS-OpenFOAM | b133cfa48aff7e734ccbd1f3b7fd6d6de029fe16 | [
"MIT"
] | null | null | null | GOTE/utils/logger.py | Lenferd/ANSYS-OpenFOAM | b133cfa48aff7e734ccbd1f3b7fd6d6de029fe16 | [
"MIT"
] | null | null | null | GOTE/utils/logger.py | Lenferd/ANSYS-OpenFOAM | b133cfa48aff7e734ccbd1f3b7fd6d6de029fe16 | [
"MIT"
] | null | null | null | from enum import IntEnum
class LogLvl(IntEnum):
LOG_ERROR = 0
LOG_INFO = 1
LOG_DEBUG = 2
def to_str(self):
return "[" + self.name + "] "
class Logger:
def __init__(self, log_lvl=LogLvl.LOG_INFO):
self.log_lvl = log_lvl
def log(self, msg_log_lvl=LogLvl.LOG_INFO, message=""):
generated_msg = self._generate_message(msg_log_lvl, message)
if len(generated_msg):
print(generated_msg)
def error(self, message):
self.log(LogLvl.LOG_ERROR, message)
def info(self, message):
self.log(LogLvl.LOG_INFO, message)
def debug(self, message):
self.log(LogLvl.LOG_DEBUG, message)
def _generate_message(self, msg_log_lvl=LogLvl.LOG_INFO, message=""):
if self.log_lvl >= msg_log_lvl:
message = message.replace('\n', '\n' + msg_log_lvl.to_str())
return "{header}{body}".format(header=msg_log_lvl.to_str(), body=message)
else:
return ""
def set_level(self, log_lvl):
self.log_lvl = log_lvl
| 26.325 | 85 | 0.625831 | 147 | 1,053 | 4.190476 | 0.251701 | 0.126623 | 0.087662 | 0.073052 | 0.366883 | 0.238636 | 0.107143 | 0.107143 | 0 | 0 | 0 | 0.003817 | 0.253561 | 1,053 | 39 | 86 | 27 | 0.779898 | 0 | 0 | 0.071429 | 0 | 0 | 0.019943 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.285714 | false | 0 | 0.035714 | 0.035714 | 0.607143 | 0.035714 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
06ea00e12e4b36e521ed9c8cb9bcd38a3d83c382 | 72 | py | Python | norns/__about__.py | simonvh/norns | dc89eaa3351ca423c787d44967ca1e51a1d1c8a0 | [
"MIT"
] | null | null | null | norns/__about__.py | simonvh/norns | dc89eaa3351ca423c787d44967ca1e51a1d1c8a0 | [
"MIT"
] | 4 | 2017-05-02T09:01:31.000Z | 2022-01-28T12:47:01.000Z | norns/__about__.py | simonvh/norns | dc89eaa3351ca423c787d44967ca1e51a1d1c8a0 | [
"MIT"
] | 4 | 2019-03-27T07:59:28.000Z | 2020-03-25T19:00:00.000Z | """Metadata"""
__version__ = '0.1.4'
__author__ = "Simon van Heeringen"
| 18 | 34 | 0.680556 | 9 | 72 | 4.555556 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.047619 | 0.125 | 72 | 3 | 35 | 24 | 0.603175 | 0.111111 | 0 | 0 | 0 | 0 | 0.413793 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
06f90c3ccc2767140db564ab01596cc7a95a39ef | 38,340 | py | Python | google/cloud/asset_v1p2beta1/proto/asset_service_pb2.py | vam-google/python-asset | dbf236ecd633a5fa5dd2371493425ef0adfcc266 | [
"Apache-2.0"
] | null | null | null | google/cloud/asset_v1p2beta1/proto/asset_service_pb2.py | vam-google/python-asset | dbf236ecd633a5fa5dd2371493425ef0adfcc266 | [
"Apache-2.0"
] | null | null | null | google/cloud/asset_v1p2beta1/proto/asset_service_pb2.py | vam-google/python-asset | dbf236ecd633a5fa5dd2371493425ef0adfcc266 | [
"Apache-2.0"
] | null | null | null | # -*- coding: utf-8 -*-
# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: google/cloud/asset_v1p2beta1/proto/asset_service.proto
from google.protobuf.internal import enum_type_wrapper
from google.protobuf import descriptor as _descriptor
from google.protobuf import message as _message
from google.protobuf import reflection as _reflection
from google.protobuf import symbol_database as _symbol_database
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
from google.api import annotations_pb2 as google_dot_api_dot_annotations__pb2
from google.api import client_pb2 as google_dot_api_dot_client__pb2
from google.api import field_behavior_pb2 as google_dot_api_dot_field__behavior__pb2
from google.api import resource_pb2 as google_dot_api_dot_resource__pb2
from google.cloud.asset_v1p2beta1.proto import (
assets_pb2 as google_dot_cloud_dot_asset__v1p2beta1_dot_proto_dot_assets__pb2,
)
from google.longrunning import (
operations_pb2 as google_dot_longrunning_dot_operations__pb2,
)
from google.protobuf import empty_pb2 as google_dot_protobuf_dot_empty__pb2
from google.protobuf import field_mask_pb2 as google_dot_protobuf_dot_field__mask__pb2
from google.protobuf import timestamp_pb2 as google_dot_protobuf_dot_timestamp__pb2
DESCRIPTOR = _descriptor.FileDescriptor(
name="google/cloud/asset_v1p2beta1/proto/asset_service.proto",
package="google.cloud.asset.v1p2beta1",
syntax="proto3",
serialized_options=b"\n com.google.cloud.asset.v1p2beta1B\021AssetServiceProtoP\001ZAgoogle.golang.org/genproto/googleapis/cloud/asset/v1p2beta1;asset\252\002\034Google.Cloud.Asset.V1p2Beta1\312\002\034Google\\Cloud\\Asset\\V1p2Beta1",
serialized_pb=b'\n6google/cloud/asset_v1p2beta1/proto/asset_service.proto\x12\x1cgoogle.cloud.asset.v1p2beta1\x1a\x1cgoogle/api/annotations.proto\x1a\x17google/api/client.proto\x1a\x1fgoogle/api/field_behavior.proto\x1a\x19google/api/resource.proto\x1a/google/cloud/asset_v1p2beta1/proto/assets.proto\x1a#google/longrunning/operations.proto\x1a\x1bgoogle/protobuf/empty.proto\x1a google/protobuf/field_mask.proto\x1a\x1fgoogle/protobuf/timestamp.proto"u\n\x11\x43reateFeedRequest\x12\x13\n\x06parent\x18\x01 \x01(\tB\x03\xe0\x41\x02\x12\x14\n\x07\x66\x65\x65\x64_id\x18\x02 \x01(\tB\x03\xe0\x41\x02\x12\x35\n\x04\x66\x65\x65\x64\x18\x03 \x01(\x0b\x32".google.cloud.asset.v1p2beta1.FeedB\x03\xe0\x41\x02"F\n\x0eGetFeedRequest\x12\x34\n\x04name\x18\x01 \x01(\tB&\xe0\x41\x02\xfa\x41 \n\x1e\x63loudasset.googleapis.com/Feed"\'\n\x10ListFeedsRequest\x12\x13\n\x06parent\x18\x01 \x01(\tB\x03\xe0\x41\x02"F\n\x11ListFeedsResponse\x12\x31\n\x05\x66\x65\x65\x64s\x18\x01 \x03(\x0b\x32".google.cloud.asset.v1p2beta1.Feed"\x80\x01\n\x11UpdateFeedRequest\x12\x35\n\x04\x66\x65\x65\x64\x18\x01 \x01(\x0b\x32".google.cloud.asset.v1p2beta1.FeedB\x03\xe0\x41\x02\x12\x34\n\x0bupdate_mask\x18\x02 \x01(\x0b\x32\x1a.google.protobuf.FieldMaskB\x03\xe0\x41\x02"I\n\x11\x44\x65leteFeedRequest\x12\x34\n\x04name\x18\x01 \x01(\tB&\xe0\x41\x02\xfa\x41 \n\x1e\x63loudasset.googleapis.com/Feed"f\n\x0cOutputConfig\x12G\n\x0fgcs_destination\x18\x01 \x01(\x0b\x32,.google.cloud.asset.v1p2beta1.GcsDestinationH\x00\x42\r\n\x0b\x64\x65stination"-\n\x0eGcsDestination\x12\r\n\x03uri\x18\x01 \x01(\tH\x00\x42\x0c\n\nobject_uri""\n\x11PubsubDestination\x12\r\n\x05topic\x18\x01 \x01(\t"p\n\x10\x46\x65\x65\x64OutputConfig\x12M\n\x12pubsub_destination\x18\x01 \x01(\x0b\x32/.google.cloud.asset.v1p2beta1.PubsubDestinationH\x00\x42\r\n\x0b\x64\x65stination"\xe9\x02\n\x04\x46\x65\x65\x64\x12\x11\n\x04name\x18\x01 \x01(\tB\x03\xe0\x41\x02\x12\x13\n\x0b\x61sset_names\x18\x02 \x03(\t\x12\x13\n\x0b\x61sset_types\x18\x03 
\x03(\t\x12?\n\x0c\x63ontent_type\x18\x04 \x01(\x0e\x32).google.cloud.asset.v1p2beta1.ContentType\x12O\n\x12\x66\x65\x65\x64_output_config\x18\x05 \x01(\x0b\x32..google.cloud.asset.v1p2beta1.FeedOutputConfigB\x03\xe0\x41\x02:\x91\x01\xea\x41\x8d\x01\n\x1e\x63loudasset.googleapis.com/Feed\x12\x1fprojects/{project}/feeds/{feed}\x12\x1d\x66olders/{folder}/feeds/{feed}\x12)organizations/{organization}/feeds/{feed} \x01*I\n\x0b\x43ontentType\x12\x1c\n\x18\x43ONTENT_TYPE_UNSPECIFIED\x10\x00\x12\x0c\n\x08RESOURCE\x10\x01\x12\x0e\n\nIAM_POLICY\x10\x02\x32\xbf\x06\n\x0c\x41ssetService\x12\x94\x01\n\nCreateFeed\x12/.google.cloud.asset.v1p2beta1.CreateFeedRequest\x1a".google.cloud.asset.v1p2beta1.Feed"1\x82\xd3\xe4\x93\x02""\x1d/v1p2beta1/{parent=*/*}/feeds:\x01*\xda\x41\x06parent\x12\x89\x01\n\x07GetFeed\x12,.google.cloud.asset.v1p2beta1.GetFeedRequest\x1a".google.cloud.asset.v1p2beta1.Feed",\x82\xd3\xe4\x93\x02\x1f\x12\x1d/v1p2beta1/{name=*/*/feeds/*}\xda\x41\x04name\x12\x9c\x01\n\tListFeeds\x12..google.cloud.asset.v1p2beta1.ListFeedsRequest\x1a/.google.cloud.asset.v1p2beta1.ListFeedsResponse".\x82\xd3\xe4\x93\x02\x1f\x12\x1d/v1p2beta1/{parent=*/*}/feeds\xda\x41\x06parent\x12\x97\x01\n\nUpdateFeed\x12/.google.cloud.asset.v1p2beta1.UpdateFeedRequest\x1a".google.cloud.asset.v1p2beta1.Feed"4\x82\xd3\xe4\x93\x02\'2"/v1p2beta1/{feed.name=*/*/feeds/*}:\x01*\xda\x41\x04\x66\x65\x65\x64\x12\x83\x01\n\nDeleteFeed\x12/.google.cloud.asset.v1p2beta1.DeleteFeedRequest\x1a\x16.google.protobuf.Empty",\x82\xd3\xe4\x93\x02\x1f*\x1d/v1p2beta1/{name=*/*/feeds/*}\xda\x41\x04name\x1aM\xca\x41\x19\x63loudasset.googleapis.com\xd2\x41.https://www.googleapis.com/auth/cloud-platformB\xb8\x01\n com.google.cloud.asset.v1p2beta1B\x11\x41ssetServiceProtoP\x01ZAgoogle.golang.org/genproto/googleapis/cloud/asset/v1p2beta1;asset\xaa\x02\x1cGoogle.Cloud.Asset.V1p2Beta1\xca\x02\x1cGoogle\\Cloud\\Asset\\V1p2Beta1b\x06proto3',
dependencies=[
google_dot_api_dot_annotations__pb2.DESCRIPTOR,
google_dot_api_dot_client__pb2.DESCRIPTOR,
google_dot_api_dot_field__behavior__pb2.DESCRIPTOR,
google_dot_api_dot_resource__pb2.DESCRIPTOR,
google_dot_cloud_dot_asset__v1p2beta1_dot_proto_dot_assets__pb2.DESCRIPTOR,
google_dot_longrunning_dot_operations__pb2.DESCRIPTOR,
google_dot_protobuf_dot_empty__pb2.DESCRIPTOR,
google_dot_protobuf_dot_field__mask__pb2.DESCRIPTOR,
google_dot_protobuf_dot_timestamp__pb2.DESCRIPTOR,
],
)
_CONTENTTYPE = _descriptor.EnumDescriptor(
name="ContentType",
full_name="google.cloud.asset.v1p2beta1.ContentType",
filename=None,
file=DESCRIPTOR,
values=[
_descriptor.EnumValueDescriptor(
name="CONTENT_TYPE_UNSPECIFIED",
index=0,
number=0,
serialized_options=None,
type=None,
),
_descriptor.EnumValueDescriptor(
name="RESOURCE", index=1, number=1, serialized_options=None, type=None
),
_descriptor.EnumValueDescriptor(
name="IAM_POLICY", index=2, number=2, serialized_options=None, type=None
),
],
containing_type=None,
serialized_options=None,
serialized_start=1560,
serialized_end=1633,
)
_sym_db.RegisterEnumDescriptor(_CONTENTTYPE)
ContentType = enum_type_wrapper.EnumTypeWrapper(_CONTENTTYPE)
CONTENT_TYPE_UNSPECIFIED = 0
RESOURCE = 1
IAM_POLICY = 2
_CREATEFEEDREQUEST = _descriptor.Descriptor(
name="CreateFeedRequest",
full_name="google.cloud.asset.v1p2beta1.CreateFeedRequest",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="parent",
full_name="google.cloud.asset.v1p2beta1.CreateFeedRequest.parent",
index=0,
number=1,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"".decode("utf-8"),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=b"\340A\002",
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="feed_id",
full_name="google.cloud.asset.v1p2beta1.CreateFeedRequest.feed_id",
index=1,
number=2,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"".decode("utf-8"),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=b"\340A\002",
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="feed",
full_name="google.cloud.asset.v1p2beta1.CreateFeedRequest.feed",
index=2,
number=3,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=b"\340A\002",
file=DESCRIPTOR,
),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=385,
serialized_end=502,
)
_GETFEEDREQUEST = _descriptor.Descriptor(
name="GetFeedRequest",
full_name="google.cloud.asset.v1p2beta1.GetFeedRequest",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="name",
full_name="google.cloud.asset.v1p2beta1.GetFeedRequest.name",
index=0,
number=1,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"".decode("utf-8"),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=b"\340A\002\372A \n\036cloudasset.googleapis.com/Feed",
file=DESCRIPTOR,
)
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=504,
serialized_end=574,
)
_LISTFEEDSREQUEST = _descriptor.Descriptor(
name="ListFeedsRequest",
full_name="google.cloud.asset.v1p2beta1.ListFeedsRequest",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="parent",
full_name="google.cloud.asset.v1p2beta1.ListFeedsRequest.parent",
index=0,
number=1,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"".decode("utf-8"),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=b"\340A\002",
file=DESCRIPTOR,
)
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=576,
serialized_end=615,
)
_LISTFEEDSRESPONSE = _descriptor.Descriptor(
name="ListFeedsResponse",
full_name="google.cloud.asset.v1p2beta1.ListFeedsResponse",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="feeds",
full_name="google.cloud.asset.v1p2beta1.ListFeedsResponse.feeds",
index=0,
number=1,
type=11,
cpp_type=10,
label=3,
has_default_value=False,
default_value=[],
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
)
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=617,
serialized_end=687,
)
_UPDATEFEEDREQUEST = _descriptor.Descriptor(
name="UpdateFeedRequest",
full_name="google.cloud.asset.v1p2beta1.UpdateFeedRequest",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="feed",
full_name="google.cloud.asset.v1p2beta1.UpdateFeedRequest.feed",
index=0,
number=1,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=b"\340A\002",
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="update_mask",
full_name="google.cloud.asset.v1p2beta1.UpdateFeedRequest.update_mask",
index=1,
number=2,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=b"\340A\002",
file=DESCRIPTOR,
),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=690,
serialized_end=818,
)
_DELETEFEEDREQUEST = _descriptor.Descriptor(
name="DeleteFeedRequest",
full_name="google.cloud.asset.v1p2beta1.DeleteFeedRequest",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="name",
full_name="google.cloud.asset.v1p2beta1.DeleteFeedRequest.name",
index=0,
number=1,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"".decode("utf-8"),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=b"\340A\002\372A \n\036cloudasset.googleapis.com/Feed",
file=DESCRIPTOR,
)
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=820,
serialized_end=893,
)
_OUTPUTCONFIG = _descriptor.Descriptor(
name="OutputConfig",
full_name="google.cloud.asset.v1p2beta1.OutputConfig",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="gcs_destination",
full_name="google.cloud.asset.v1p2beta1.OutputConfig.gcs_destination",
index=0,
number=1,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
)
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[
_descriptor.OneofDescriptor(
name="destination",
full_name="google.cloud.asset.v1p2beta1.OutputConfig.destination",
index=0,
containing_type=None,
fields=[],
)
],
serialized_start=895,
serialized_end=997,
)
_GCSDESTINATION = _descriptor.Descriptor(
name="GcsDestination",
full_name="google.cloud.asset.v1p2beta1.GcsDestination",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="uri",
full_name="google.cloud.asset.v1p2beta1.GcsDestination.uri",
index=0,
number=1,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"".decode("utf-8"),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
)
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[
_descriptor.OneofDescriptor(
name="object_uri",
full_name="google.cloud.asset.v1p2beta1.GcsDestination.object_uri",
index=0,
containing_type=None,
fields=[],
)
],
serialized_start=999,
serialized_end=1044,
)
_PUBSUBDESTINATION = _descriptor.Descriptor(
name="PubsubDestination",
full_name="google.cloud.asset.v1p2beta1.PubsubDestination",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="topic",
full_name="google.cloud.asset.v1p2beta1.PubsubDestination.topic",
index=0,
number=1,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"".decode("utf-8"),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
)
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=1046,
serialized_end=1080,
)
_FEEDOUTPUTCONFIG = _descriptor.Descriptor(
name="FeedOutputConfig",
full_name="google.cloud.asset.v1p2beta1.FeedOutputConfig",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="pubsub_destination",
full_name="google.cloud.asset.v1p2beta1.FeedOutputConfig.pubsub_destination",
index=0,
number=1,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
)
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[
_descriptor.OneofDescriptor(
name="destination",
full_name="google.cloud.asset.v1p2beta1.FeedOutputConfig.destination",
index=0,
containing_type=None,
fields=[],
)
],
serialized_start=1082,
serialized_end=1194,
)
_FEED = _descriptor.Descriptor(
name="Feed",
full_name="google.cloud.asset.v1p2beta1.Feed",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="name",
full_name="google.cloud.asset.v1p2beta1.Feed.name",
index=0,
number=1,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"".decode("utf-8"),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=b"\340A\002",
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="asset_names",
full_name="google.cloud.asset.v1p2beta1.Feed.asset_names",
index=1,
number=2,
type=9,
cpp_type=9,
label=3,
has_default_value=False,
default_value=[],
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="asset_types",
full_name="google.cloud.asset.v1p2beta1.Feed.asset_types",
index=2,
number=3,
type=9,
cpp_type=9,
label=3,
has_default_value=False,
default_value=[],
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="content_type",
full_name="google.cloud.asset.v1p2beta1.Feed.content_type",
index=3,
number=4,
type=14,
cpp_type=8,
label=1,
has_default_value=False,
default_value=0,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="feed_output_config",
full_name="google.cloud.asset.v1p2beta1.Feed.feed_output_config",
index=4,
number=5,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=b"\340A\002",
file=DESCRIPTOR,
),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=b"\352A\215\001\n\036cloudasset.googleapis.com/Feed\022\037projects/{project}/feeds/{feed}\022\035folders/{folder}/feeds/{feed}\022)organizations/{organization}/feeds/{feed} \001",
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=1197,
serialized_end=1558,
)
_CREATEFEEDREQUEST.fields_by_name["feed"].message_type = _FEED
_LISTFEEDSRESPONSE.fields_by_name["feeds"].message_type = _FEED
_UPDATEFEEDREQUEST.fields_by_name["feed"].message_type = _FEED
_UPDATEFEEDREQUEST.fields_by_name[
"update_mask"
].message_type = google_dot_protobuf_dot_field__mask__pb2._FIELDMASK
_OUTPUTCONFIG.fields_by_name["gcs_destination"].message_type = _GCSDESTINATION
_OUTPUTCONFIG.oneofs_by_name["destination"].fields.append(
_OUTPUTCONFIG.fields_by_name["gcs_destination"]
)
_OUTPUTCONFIG.fields_by_name[
"gcs_destination"
].containing_oneof = _OUTPUTCONFIG.oneofs_by_name["destination"]
_GCSDESTINATION.oneofs_by_name["object_uri"].fields.append(
_GCSDESTINATION.fields_by_name["uri"]
)
_GCSDESTINATION.fields_by_name["uri"].containing_oneof = _GCSDESTINATION.oneofs_by_name[
"object_uri"
]
_FEEDOUTPUTCONFIG.fields_by_name["pubsub_destination"].message_type = _PUBSUBDESTINATION
_FEEDOUTPUTCONFIG.oneofs_by_name["destination"].fields.append(
_FEEDOUTPUTCONFIG.fields_by_name["pubsub_destination"]
)
_FEEDOUTPUTCONFIG.fields_by_name[
"pubsub_destination"
].containing_oneof = _FEEDOUTPUTCONFIG.oneofs_by_name["destination"]
_FEED.fields_by_name["content_type"].enum_type = _CONTENTTYPE
_FEED.fields_by_name["feed_output_config"].message_type = _FEEDOUTPUTCONFIG
DESCRIPTOR.message_types_by_name["CreateFeedRequest"] = _CREATEFEEDREQUEST
DESCRIPTOR.message_types_by_name["GetFeedRequest"] = _GETFEEDREQUEST
DESCRIPTOR.message_types_by_name["ListFeedsRequest"] = _LISTFEEDSREQUEST
DESCRIPTOR.message_types_by_name["ListFeedsResponse"] = _LISTFEEDSRESPONSE
DESCRIPTOR.message_types_by_name["UpdateFeedRequest"] = _UPDATEFEEDREQUEST
DESCRIPTOR.message_types_by_name["DeleteFeedRequest"] = _DELETEFEEDREQUEST
DESCRIPTOR.message_types_by_name["OutputConfig"] = _OUTPUTCONFIG
DESCRIPTOR.message_types_by_name["GcsDestination"] = _GCSDESTINATION
DESCRIPTOR.message_types_by_name["PubsubDestination"] = _PUBSUBDESTINATION
DESCRIPTOR.message_types_by_name["FeedOutputConfig"] = _FEEDOUTPUTCONFIG
DESCRIPTOR.message_types_by_name["Feed"] = _FEED
DESCRIPTOR.enum_types_by_name["ContentType"] = _CONTENTTYPE
_sym_db.RegisterFileDescriptor(DESCRIPTOR)
CreateFeedRequest = _reflection.GeneratedProtocolMessageType(
"CreateFeedRequest",
(_message.Message,),
{
"DESCRIPTOR": _CREATEFEEDREQUEST,
"__module__": "google.cloud.asset_v1p2beta1.proto.asset_service_pb2",
"__doc__": """Create asset feed request.
Attributes:
parent:
Required. The name of the project/folder/organization where
this feed should be created in. It can only be an organization
        number (such as “organizations/123”), a folder number (such as
        “folders/123”), a project ID (such as “projects/my-project-id”),
        or a project number (such as “projects/12345”).
feed_id:
Required. This is the client-assigned asset feed identifier
and it needs to be unique under a specific parent
project/folder/organization.
feed:
Required. The feed details. The field ``name`` must be empty
and it will be generated in the format of:
projects/project_number/feeds/feed_id
folders/folder_number/feeds/feed_id
organizations/organization_number/feeds/feed_id
""",
# @@protoc_insertion_point(class_scope:google.cloud.asset.v1p2beta1.CreateFeedRequest)
},
)
_sym_db.RegisterMessage(CreateFeedRequest)
GetFeedRequest = _reflection.GeneratedProtocolMessageType(
"GetFeedRequest",
(_message.Message,),
{
"DESCRIPTOR": _GETFEEDREQUEST,
"__module__": "google.cloud.asset_v1p2beta1.proto.asset_service_pb2",
"__doc__": """Get asset feed request.
Attributes:
name:
Required. The name of the Feed and it must be in the format
of: projects/project_number/feeds/feed_id
folders/folder_number/feeds/feed_id
organizations/organization_number/feeds/feed_id
""",
# @@protoc_insertion_point(class_scope:google.cloud.asset.v1p2beta1.GetFeedRequest)
},
)
_sym_db.RegisterMessage(GetFeedRequest)
ListFeedsRequest = _reflection.GeneratedProtocolMessageType(
"ListFeedsRequest",
(_message.Message,),
{
"DESCRIPTOR": _LISTFEEDSREQUEST,
"__module__": "google.cloud.asset_v1p2beta1.proto.asset_service_pb2",
"__doc__": """List asset feeds request.
Attributes:
parent:
Required. The parent project/folder/organization whose feeds
        are to be listed. It can only be a
        project/folder/organization number (such as “folders/12345”),
        or a project ID (such as “projects/my-project-id”).
""",
# @@protoc_insertion_point(class_scope:google.cloud.asset.v1p2beta1.ListFeedsRequest)
},
)
_sym_db.RegisterMessage(ListFeedsRequest)
ListFeedsResponse = _reflection.GeneratedProtocolMessageType(
"ListFeedsResponse",
(_message.Message,),
{
"DESCRIPTOR": _LISTFEEDSRESPONSE,
"__module__": "google.cloud.asset_v1p2beta1.proto.asset_service_pb2",
"__doc__": """
Attributes:
feeds:
A list of feeds.
""",
# @@protoc_insertion_point(class_scope:google.cloud.asset.v1p2beta1.ListFeedsResponse)
},
)
_sym_db.RegisterMessage(ListFeedsResponse)
UpdateFeedRequest = _reflection.GeneratedProtocolMessageType(
"UpdateFeedRequest",
(_message.Message,),
{
"DESCRIPTOR": _UPDATEFEEDREQUEST,
"__module__": "google.cloud.asset_v1p2beta1.proto.asset_service_pb2",
"__doc__": """Update asset feed request.
Attributes:
feed:
Required. The new values of feed details. It must match an
existing feed and the field ``name`` must be in the format of:
projects/project_number/feeds/feed_id or
folders/folder_number/feeds/feed_id or
organizations/organization_number/feeds/feed_id.
update_mask:
Required. Only updates the ``feed`` fields indicated by this
mask. The field mask must not be empty, and it must not
contain fields that are immutable or only set by the server.
""",
# @@protoc_insertion_point(class_scope:google.cloud.asset.v1p2beta1.UpdateFeedRequest)
},
)
_sym_db.RegisterMessage(UpdateFeedRequest)
DeleteFeedRequest = _reflection.GeneratedProtocolMessageType(
"DeleteFeedRequest",
(_message.Message,),
{
"DESCRIPTOR": _DELETEFEEDREQUEST,
"__module__": "google.cloud.asset_v1p2beta1.proto.asset_service_pb2",
"__doc__": """
Attributes:
name:
Required. The name of the feed and it must be in the format
of: projects/project_number/feeds/feed_id
folders/folder_number/feeds/feed_id
organizations/organization_number/feeds/feed_id
""",
# @@protoc_insertion_point(class_scope:google.cloud.asset.v1p2beta1.DeleteFeedRequest)
},
)
_sym_db.RegisterMessage(DeleteFeedRequest)
OutputConfig = _reflection.GeneratedProtocolMessageType(
"OutputConfig",
(_message.Message,),
{
"DESCRIPTOR": _OUTPUTCONFIG,
"__module__": "google.cloud.asset_v1p2beta1.proto.asset_service_pb2",
"__doc__": """Output configuration for export assets destination.
Attributes:
destination:
Asset export destination.
gcs_destination:
Destination on Cloud Storage.
""",
# @@protoc_insertion_point(class_scope:google.cloud.asset.v1p2beta1.OutputConfig)
},
)
_sym_db.RegisterMessage(OutputConfig)
GcsDestination = _reflection.GeneratedProtocolMessageType(
"GcsDestination",
(_message.Message,),
{
"DESCRIPTOR": _GCSDESTINATION,
"__module__": "google.cloud.asset_v1p2beta1.proto.asset_service_pb2",
"__doc__": """A Cloud Storage location.
Attributes:
object_uri:
Required.
uri:
The uri of the Cloud Storage object. It’s the same uri that is
used by gsutil. For example: “gs://bucket_name/object_name”.
See `Viewing and Editing Object Metadata
<https://cloud.google.com/storage/docs/viewing-editing-
metadata>`__ for more information.
""",
# @@protoc_insertion_point(class_scope:google.cloud.asset.v1p2beta1.GcsDestination)
},
)
_sym_db.RegisterMessage(GcsDestination)
PubsubDestination = _reflection.GeneratedProtocolMessageType(
"PubsubDestination",
(_message.Message,),
{
"DESCRIPTOR": _PUBSUBDESTINATION,
"__module__": "google.cloud.asset_v1p2beta1.proto.asset_service_pb2",
"__doc__": """A Cloud Pubsub destination.
Attributes:
topic:
The name of the Cloud Pub/Sub topic to publish to. For
example: ``projects/PROJECT_ID/topics/TOPIC_ID``.
""",
# @@protoc_insertion_point(class_scope:google.cloud.asset.v1p2beta1.PubsubDestination)
},
)
_sym_db.RegisterMessage(PubsubDestination)
FeedOutputConfig = _reflection.GeneratedProtocolMessageType(
"FeedOutputConfig",
(_message.Message,),
{
"DESCRIPTOR": _FEEDOUTPUTCONFIG,
"__module__": "google.cloud.asset_v1p2beta1.proto.asset_service_pb2",
"__doc__": """Output configuration for asset feed destination.
Attributes:
destination:
Asset feed destination.
pubsub_destination:
Destination on Cloud Pubsub.
""",
# @@protoc_insertion_point(class_scope:google.cloud.asset.v1p2beta1.FeedOutputConfig)
},
)
_sym_db.RegisterMessage(FeedOutputConfig)
Feed = _reflection.GeneratedProtocolMessageType(
"Feed",
(_message.Message,),
{
"DESCRIPTOR": _FEED,
"__module__": "google.cloud.asset_v1p2beta1.proto.asset_service_pb2",
"__doc__": """An asset feed used to export asset updates to a
    destination. An asset feed filter controls what updates are exported.
The asset feed must be created within a project, organization, or
folder. Supported destinations are: Cloud Pub/Sub topics.
Attributes:
name:
Required. The format will be projects/{project_number}/feeds
/{client-assigned_feed_identifier} or
folders/{folder_number}/feeds/{client-
assigned_feed_identifier} or
organizations/{organization_number}/feeds/{client-
assigned_feed_identifier} The client-assigned feed identifier
must be unique within the parent project/folder/organization.
asset_names:
A list of the full names of the assets to receive updates. You
must specify either or both of asset_names and asset_types.
Only asset updates matching specified asset_names and
asset_types are exported to the feed. For example: ``//compute
.googleapis.com/projects/my_project_123/zones/zone1/instances/
instance1``. See `Resource Names <https://cloud.google.com/api
s/design/resource_names#full_resource_name>`__ for more info.
asset_types:
A list of types of the assets to receive updates. You must
specify either or both of asset_names and asset_types. Only
asset updates matching specified asset_names and asset_types
are exported to the feed. For example:
        “compute.googleapis.com/Disk”. See `Introduction to Cloud Asset
Inventory <https://cloud.google.com/resource-manager/docs
/cloud-asset-inventory/overview>`__ for all supported asset
types.
content_type:
Asset content type. If not specified, no content but the asset
name and type will be returned.
feed_output_config:
Required. Feed output configuration defining where the asset
updates are published to.
""",
# @@protoc_insertion_point(class_scope:google.cloud.asset.v1p2beta1.Feed)
},
)
_sym_db.RegisterMessage(Feed)
DESCRIPTOR._options = None
_CREATEFEEDREQUEST.fields_by_name["parent"]._options = None
_CREATEFEEDREQUEST.fields_by_name["feed_id"]._options = None
_CREATEFEEDREQUEST.fields_by_name["feed"]._options = None
_GETFEEDREQUEST.fields_by_name["name"]._options = None
_LISTFEEDSREQUEST.fields_by_name["parent"]._options = None
_UPDATEFEEDREQUEST.fields_by_name["feed"]._options = None
_UPDATEFEEDREQUEST.fields_by_name["update_mask"]._options = None
_DELETEFEEDREQUEST.fields_by_name["name"]._options = None
_FEED.fields_by_name["name"]._options = None
_FEED.fields_by_name["feed_output_config"]._options = None
_FEED._options = None
_ASSETSERVICE = _descriptor.ServiceDescriptor(
name="AssetService",
full_name="google.cloud.asset.v1p2beta1.AssetService",
file=DESCRIPTOR,
index=0,
serialized_options=b"\312A\031cloudasset.googleapis.com\322A.https://www.googleapis.com/auth/cloud-platform",
serialized_start=1636,
serialized_end=2467,
methods=[
_descriptor.MethodDescriptor(
name="CreateFeed",
full_name="google.cloud.asset.v1p2beta1.AssetService.CreateFeed",
index=0,
containing_service=None,
input_type=_CREATEFEEDREQUEST,
output_type=_FEED,
serialized_options=b'\202\323\344\223\002""\035/v1p2beta1/{parent=*/*}/feeds:\001*\332A\006parent',
),
_descriptor.MethodDescriptor(
name="GetFeed",
full_name="google.cloud.asset.v1p2beta1.AssetService.GetFeed",
index=1,
containing_service=None,
input_type=_GETFEEDREQUEST,
output_type=_FEED,
serialized_options=b"\202\323\344\223\002\037\022\035/v1p2beta1/{name=*/*/feeds/*}\332A\004name",
),
_descriptor.MethodDescriptor(
name="ListFeeds",
full_name="google.cloud.asset.v1p2beta1.AssetService.ListFeeds",
index=2,
containing_service=None,
input_type=_LISTFEEDSREQUEST,
output_type=_LISTFEEDSRESPONSE,
serialized_options=b"\202\323\344\223\002\037\022\035/v1p2beta1/{parent=*/*}/feeds\332A\006parent",
),
_descriptor.MethodDescriptor(
name="UpdateFeed",
full_name="google.cloud.asset.v1p2beta1.AssetService.UpdateFeed",
index=3,
containing_service=None,
input_type=_UPDATEFEEDREQUEST,
output_type=_FEED,
serialized_options=b"\202\323\344\223\002'2\"/v1p2beta1/{feed.name=*/*/feeds/*}:\001*\332A\004feed",
),
_descriptor.MethodDescriptor(
name="DeleteFeed",
full_name="google.cloud.asset.v1p2beta1.AssetService.DeleteFeed",
index=4,
containing_service=None,
input_type=_DELETEFEEDREQUEST,
output_type=google_dot_protobuf_dot_empty__pb2._EMPTY,
serialized_options=b"\202\323\344\223\002\037*\035/v1p2beta1/{name=*/*/feeds/*}\332A\004name",
),
],
)
_sym_db.RegisterServiceDescriptor(_ASSETSERVICE)
DESCRIPTOR.services_by_name["AssetService"] = _ASSETSERVICE
# @@protoc_insertion_point(module_scope)
# File: junsu/battle_ai_test/WebClientServer.py (repo: GreedyOsori/Chat, license: MIT)
import tornado.websocket
import tornado.ioloop
from Room import Room
class WebClientServer(tornado.websocket.WebSocketHandler):
def initialize(self, web_client_list=set(), battle_ai_list=dict(), player_server=None):
self.web_client_list = web_client_list # set()
self.battle_ai_list = battle_ai_list # dict()
self.player_server = player_server
# accept web_client
def open(self, *args, **kwargs):
self.web_client_list.add(self)
pass
def on_message(self, message):
# all received message is battle request
# make attendee object
# !! case 1 : make new room
# !! make player object
# !! !! get player stream
        try:
            p1 = self.battle_ai_list.pop('player1')
            p2 = self.battle_ai_list.pop('player2')
            # !! make player list
            player_list = [p1, p2]
        except Exception as e:
            # concurrent access error (e.g. KeyError when a player is missing)
            print(e)
            return
# make room
room = Room(player_list)
room.add_attendee(self)
        # fire and forget go!! *** PlayerServer.__game_handler()
        # NOTE: spawn_callback is an instance method, so go through IOLoop.current();
        # the double-underscore handler must be reached via its mangled name
        # (assuming the owning class is named PlayerServer).
        tornado.ioloop.IOLoop.current().spawn_callback(
            self.player_server._PlayerServer__game_handler, room)
# =================================================================
# case 2 : join existing room
pass
    def on_close(self):
        # tornado invokes on_close() when the connection drops;
        # set.pop() takes no argument, so use discard() to remove this client
        self.web_client_list.discard(self)
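Accessing a double-underscore method across classes, as this handler does with `__game_handler`, trips Python's name mangling: inside `WebClientServer`, the attribute lookup is rewritten against *this* class, not the target's. A standalone sketch with hypothetical `Server`/`Client` classes (not part of the original module) shows the pitfall:

```python
class Server:
    def __game_handler(self, room):  # stored as _Server__game_handler
        return 'handled ' + room


class Client:
    def __init__(self, server):
        self.server = server

    def broken(self):
        # compiled inside Client, this looks up _Client__game_handler -> AttributeError
        return self.server.__game_handler('room1')

    def working(self):
        # use the explicitly mangled name (or make the handler public)
        return self.server._Server__game_handler('room1')


c = Client(Server())
print(c.working())  # handled room1
try:
    c.broken()
except AttributeError:
    print('AttributeError')
```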
# File: czsc/utils/__init__.py (repo: vercity/czsc, license: MIT)
# coding: utf-8
from .echarts_plot import kline_pro, heat_map
from .ta import KDJ, MACD, EMA, SMA
from .io import read_pkl, save_pkl, read_json, save_json
from .log import create_logger
from .word_writer import WordWriter
def x_round(x, digit=4):
    """Truncate ``x`` to ``digit`` decimal places (round toward zero).

    :param x: the number to truncate
    :param digit: number of decimal places to keep
    :return: the truncated number
    """
if isinstance(x, int):
return x
try:
digit_ = pow(10, digit)
x = int(x * digit_) / digit_
    except (TypeError, ValueError):
        print(f"x_round error: x = {x}")
return x
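A quick standalone check of `x_round`'s truncate-toward-zero behaviour (the function is restated so the snippet runs on its own):

```python
def x_round(x, digit=4):
    """Truncate x to `digit` decimal places (round toward zero)."""
    if isinstance(x, int):
        return x
    try:
        digit_ = pow(10, digit)
        x = int(x * digit_) / digit_
    except (TypeError, ValueError):
        print(f"x_round error: x = {x}")
    return x


print(x_round(3.14159, 2))  # 3.14 -- truncated, not rounded
print(x_round(2.679, 2))    # 2.67 -- plain rounding would give 2.68
print(x_round(7))           # 7 -- ints pass through unchanged
```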
# File: py/manipulation/props/object_collection.py (repo: wx-b/dm_robotics, license: Apache-2.0)
# Copyright 2020 DeepMind Technologies Limited.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Manage collections of props."""
import collections
from typing import Any, Callable, Union
class Singleton(type):
_instances = {}
def __call__(cls, *args, **kwargs):
if cls not in cls._instances:
cls._instances[cls] = super(Singleton, cls).__call__(*args, **kwargs)
return cls._instances[cls]
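Any class that takes `Singleton` as its metaclass yields exactly one instance per class. A sketch with a hypothetical `PropRegistry` class (the metaclass is restated so the snippet runs on its own):

```python
class Singleton(type):
    _instances = {}

    def __call__(cls, *args, **kwargs):
        if cls not in cls._instances:
            cls._instances[cls] = super(Singleton, cls).__call__(*args, **kwargs)
        return cls._instances[cls]


class PropRegistry(metaclass=Singleton):
    """Hypothetical consumer: __init__ runs only on the first instantiation."""

    def __init__(self):
        self.props = []


a = PropRegistry()
b = PropRegistry()
a.props.append('cube')
print(a is b)   # True
print(b.props)  # ['cube']
```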
VersionedSequence = collections.namedtuple('VersionedSequence',
['version', 'ids'])
class PropSetDict(dict):
"""A dictionary that supports a function evaluation on every key access.
Extends the standard dictionary to provide dynamic behaviour for object sets.
"""
def __getitem__(self, key: Any) -> VersionedSequence:
# The method is called during [] access.
# Provides a collection of prop names.
return self._evaluate(dict.__getitem__(self, key))
def __repr__(self) -> str:
return f'{type(self).__name__}({super().__repr__()})'
def get(self, key) -> VersionedSequence:
return self.__getitem__(key)
def values(self):
values = super().values()
return [self._evaluate(x) for x in values]
def items(self):
new_dict = {k: self._evaluate(v) for k, v in super().items()}
return new_dict.items()
def _evaluate(
self, sequence_or_function: Union[VersionedSequence,
Callable[[], VersionedSequence]]
) -> VersionedSequence:
"""Based on the type of an argument, execute different actions.
    Supports static sequence containers or functions that create them. When the
    argument is a container, the function returns the argument "as is". When a
    callable is provided as an argument, it is evaluated to create a
    container.
Args:
sequence_or_function: A sequence or a function that creates a sequence.
Returns:
A versioned set of names.
"""
if isinstance(sequence_or_function, VersionedSequence):
return sequence_or_function
new_sequence = sequence_or_function()
return new_sequence
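The `_evaluate` hook means a value stored as a callable is re-run on every key access, while a plain `VersionedSequence` is returned unchanged. A self-contained sketch (a minimal re-statement of `PropSetDict`, with hypothetical prop names, so the snippet runs on its own):

```python
import collections

VersionedSequence = collections.namedtuple('VersionedSequence', ['version', 'ids'])


class PropSetDict(dict):
    """Minimal re-statement of the class above for a standalone demo."""

    def __getitem__(self, key):
        return self._evaluate(dict.__getitem__(self, key))

    def _evaluate(self, sequence_or_function):
        if isinstance(sequence_or_function, VersionedSequence):
            return sequence_or_function
        return sequence_or_function()  # callables are evaluated on every access


static_set = VersionedSequence(version=1, ids=('cube', 'sphere'))
calls = []

def dynamic_set():
    calls.append(1)
    return VersionedSequence(version=len(calls), ids=('cylinder',))

props = PropSetDict(static=static_set, dynamic=dynamic_set)
print(props['static'].ids)       # ('cube', 'sphere')
print(props['dynamic'].version)  # 1 on the first access
print(props['dynamic'].version)  # 2 -- the callable ran again
```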
661903c9d82a160b0423709dd9d82c95f0cb3a11 | 5,048 | py | Python | ecssweb/settings.example.py | Lewes/ecssweb | 62c332757c24d7edac52a04121d8b77eced783a1 | [
"MIT"
] | 4 | 2021-03-17T21:09:18.000Z | 2022-03-03T17:10:51.000Z | ecssweb/settings.example.py | Lewes/ecssweb | 62c332757c24d7edac52a04121d8b77eced783a1 | [
"MIT"
] | 15 | 2018-08-21T19:01:06.000Z | 2022-03-11T23:29:26.000Z | ecssweb/settings.example.py | Lewes/ecssweb | 62c332757c24d7edac52a04121d8b77eced783a1 | [
"MIT"
] | 2 | 2018-08-21T18:46:36.000Z | 2021-11-13T16:23:53.000Z | """
Django settings for ecssweb project.
Generated by 'django-admin startproject' using Django 2.0.5.
For more information on this file, see
https://docs.djangoproject.com/en/2.0/topics/settings/
For the full list of settings and their values, see
https://docs.djangoproject.com/en/2.0/ref/settings/
"""
import os
# Build paths inside the project like this: os.path.join(BASE_DIR, ...)
BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
# Quick-start development settings - unsuitable for production
# See https://docs.djangoproject.com/en/2.0/howto/deployment/checklist/
# SECURITY WARNING: keep the secret key used in production secret!
SECRET_KEY = 'secret_key'
# SECURITY WARNING: don't run with debug turned on in production!
DEBUG = True
SERVER_EMAIL = 'ecssweb@example.com'
EMAIL_SUBJECT_PREFIX = '[ECSSWEB] '
ADMINS = [('Example', 'example@example.com')]
ALLOWED_HOSTS = ['localhost']
# Sites
SITE_ID = 1
# Set to None to use session-based CSRF cookies
# https://docs.djangoproject.com/en/2.0/ref/settings/#csrf-cookie-age
CSRF_COOKIE_AGE = None
CSRF_COOKIE_SECURE = False
# Application definition
INSTALLED_APPS = [
'django.contrib.admin',
'django.contrib.auth',
'django.contrib.contenttypes',
'django.contrib.sessions',
'django.contrib.messages',
'django.contrib.staticfiles',
'django.contrib.sites',
'django.contrib.sitemaps',
'website.apps.WebsiteConfig',
'ecsswebauth.apps.EcsswebauthConfig',
'ecsswebadmin.apps.EcsswebadminConfig',
'portal.apps.PortalConfig',
'feedback.apps.FeedbackConfig',
'auditlog.apps.AuditlogConfig',
'fbevents.apps.FbeventsConfig',
'jumpstart.apps.JumpstartConfig',
'shop.apps.ShopConfig',
'election.apps.ElectionConfig',
]
MIDDLEWARE = [
'django.middleware.security.SecurityMiddleware',
'django.contrib.sessions.middleware.SessionMiddleware',
'django.middleware.common.CommonMiddleware',
'django.middleware.csrf.CsrfViewMiddleware',
'django.contrib.auth.middleware.AuthenticationMiddleware',
'django.contrib.messages.middleware.MessageMiddleware',
'django.middleware.clickjacking.XFrameOptionsMiddleware',
]
ROOT_URLCONF = 'ecssweb.urls'
TEMPLATES = [
{
'BACKEND': 'django.template.backends.django.DjangoTemplates',
'DIRS': [],
'APP_DIRS': True,
'OPTIONS': {
'context_processors': [
'django.template.context_processors.debug',
'django.template.context_processors.request',
'django.contrib.auth.context_processors.auth',
'django.contrib.messages.context_processors.messages',
],
},
},
]
WSGI_APPLICATION = 'ecssweb.wsgi.application'
# Database
# https://docs.djangoproject.com/en/2.0/ref/settings/#databases
DATABASES = {
'default': {
'ENGINE': 'django.db.backends.sqlite3',
'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
}
}
# Password validation
# https://docs.djangoproject.com/en/2.0/ref/settings/#auth-password-validators
AUTH_PASSWORD_VALIDATORS = [
{
'NAME': 'django.contrib.auth.password_validation.UserAttributeSimilarityValidator',
},
{
'NAME': 'django.contrib.auth.password_validation.MinimumLengthValidator',
},
{
'NAME': 'django.contrib.auth.password_validation.CommonPasswordValidator',
},
{
'NAME': 'django.contrib.auth.password_validation.NumericPasswordValidator',
},
]
# Internationalization
# https://docs.djangoproject.com/en/2.0/topics/i18n/
LANGUAGE_CODE = 'en-gb'
TIME_ZONE = 'Europe/London'
USE_I18N = True
USE_L10N = True
USE_TZ = True
# Auth
AUTHENTICATION_BACKENDS = [
'ecsswebauth.backends.SamlBackend',
'django.contrib.auth.backends.ModelBackend',
]
LOGIN_REDIRECT_URL = 'portal:overview'
LOGIN_URL = 'ecsswebauth:auth'
LOGOUT_REDIRECT_URL = 'ecsswebauth:auth'
# Messages
MESSAGE_STORAGE = 'django.contrib.messages.storage.session.SessionStorage'
# Sessions
SESSION_COOKIE_SECURE = False
SESSION_EXPIRE_AT_BROWSER_CLOSE = True
# Logging
# LOGGING = {
# 'version': 1,
# 'disable_existing_loggers': False,
# 'handlers': {
# 'console': {
# 'class': 'logging.StreamHandler',
# },
# 'mail_admins': {
# 'level': 'ERROR',
# 'class': 'django.utils.log.AdminEmailHandler',
# },
# },
# 'loggers': {
# 'django': {
# 'handlers': ['console', 'mail_admins'],
# 'level': 'WARN',
# 'propagate': True,
# },
# },
# }
# Static files (CSS, JavaScript, Images)
# https://docs.djangoproject.com/en/2.0/howto/static-files/
STATIC_URL = '/static/'
STATIC_ROOT = ''
MEDIA_URL = '/media/'
MEDIA_ROOT = os.path.join(BASE_DIR, 'media')
# SAML
# SAML config file folders
SAML_FOLDER = os.path.join(BASE_DIR, 'ecsswebauth', 'saml_config')
SAML_GROUP_PREFIX = 'saml_'
# FB
FB_PAGE_ID = ''
FB_ACCESS_TOKEN = ''
# Face Detection
FACE_DETECT_ENABLED = False
FACE_DETECT_API = ''
| 22.043668 | 91 | 0.682052 | 547 | 5,048 | 6.159049 | 0.411335 | 0.073316 | 0.052241 | 0.059365 | 0.167112 | 0.13743 | 0.091125 | 0.091125 | 0.047492 | 0 | 0 | 0.007 | 0.179279 | 5,048 | 228 | 92 | 22.140351 | 0.806179 | 0.320721 | 0 | 0 | 1 | 0 | 0.518376 | 0.410492 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.04902 | 0.009804 | 0 | 0.009804 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
66244c6d628ac135f26a1502ff5016116fde5d77 | 897 | py | Python | src/icolos/utils/enums/general_utils_enums.py | CMargreitter/Icolos | fd7b664ce177df875fefa910dc4d5c574b521cb3 | [
"Apache-2.0"
] | 11 | 2022-01-30T14:36:13.000Z | 2022-03-22T09:40:57.000Z | src/icolos/utils/enums/general_utils_enums.py | CMargreitter/Icolos | fd7b664ce177df875fefa910dc4d5c574b521cb3 | [
"Apache-2.0"
] | 2 | 2022-03-23T07:56:49.000Z | 2022-03-24T12:01:42.000Z | src/icolos/utils/enums/general_utils_enums.py | CMargreitter/Icolos | fd7b664ce177df875fefa910dc4d5c574b521cb3 | [
"Apache-2.0"
] | 8 | 2022-01-28T10:32:31.000Z | 2022-03-22T09:40:59.000Z | class CheckFileGenerationEnum:
GENERATED_SUCCESS = "generated_success"
GENERATED_EMPTY = "generated_empty"
NOT_GENERATED = "not_generated"
# try to find the internal value and return
def __getattr__(self, name):
if name in self:
return name
raise AttributeError
# prohibit any attempt to set any values
def __setattr__(self, key, value):
raise ValueError("No changes allowed.")
class JSONSchemasEnum:
WORKFLOW_SCHEMA = "workflow"
# sub-schemas
HEADER_SCHEMA = "header"
STEPS_SCHEMA = "steps"
STEP_SCHEMA = "step"
# try to find the internal value and return
def __getattr__(self, name):
if name in self:
return name
raise AttributeError
# prohibit any attempt to set any values
def __setattr__(self, key, value):
raise ValueError("No changes allowed.")
| 24.916667 | 47 | 0.662207 | 104 | 897 | 5.461538 | 0.394231 | 0.056338 | 0.088028 | 0.042254 | 0.626761 | 0.626761 | 0.626761 | 0.626761 | 0.626761 | 0.626761 | 0 | 0 | 0.272018 | 897 | 35 | 48 | 25.628571 | 0.869832 | 0.192865 | 0 | 0.571429 | 1 | 0 | 0.147632 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.190476 | false | 0 | 0 | 0 | 0.714286 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
66270b5a42f78028a68027c72f4fef97c8b7bba7 | 7,165 | py | Python | twitter_countryGeo/twitter-geo/embers/utils.py | nwself/geocoding | 0919dc2dc209a01a05930bfe21783fc324a584a0 | [
"MIT"
] | 3 | 2018-03-13T00:51:24.000Z | 2020-04-01T16:40:01.000Z | twitter_countryGeo/twitter-geo/embers/utils.py | nwself/geocoding | 0919dc2dc209a01a05930bfe21783fc324a584a0 | [
"MIT"
] | 2 | 2020-05-14T01:28:02.000Z | 2020-09-24T21:56:38.000Z | twitter_countryGeo/twitter-geo/embers/utils.py | nwself/geocoding | 0919dc2dc209a01a05930bfe21783fc324a584a0 | [
"MIT"
] | 4 | 2018-03-13T00:03:48.000Z | 2020-05-13T18:00:16.000Z | #!/usr/bin/env python
# -*- coding: UTF-8 -*-
# vim: ts=4 sts=4 sw=4 tw=79 sta et
"""%prog [options]
Python source code - @todo
"""
__author__ = 'Patrick Butler'
__email__ = 'pbutler@killertux.org'
import collections
import os
import unicodedata
import time
import datetime
import json
import calendar
import copy
from .logging_conf import gen_logger
import atexit
import weakref
import hashlib
import pytz
import warnings
logger = gen_logger("utils")
TIME_FORMAT = "%a, %d %b %Y %H:%M:%S +0000"
TIME_ISOFORMAT = "%Y-%m-%dT%H:%M:%S.%f"
TIME_ISOFORMAT2 = "%Y-%m-%dT%H:%M:%S"
minjsondump = lambda o: json.dumps(o, skipkeys=True, indent=None,
separators=(",", ":"))
def twittime_to_dt(t):
return datetime.datetime(*(time.strptime(t, TIME_FORMAT)[0:6]),
tzinfo=pytz.UTC)
def isotime_to_dt(t):
warnings.warn("This function returns a naive datetime"
" and should not be used", FutureWarning)
try:
d = datetime.datetime.strptime(t, TIME_ISOFORMAT)
return d.replace(tzinfo=pytz.UTC)
except:
d = datetime.datetime.strptime(t, TIME_ISOFORMAT2)
return d.replace(tzinfo=pytz.UTC)
def uuid_to_dt(u):
t = (u.time - 0x01b21dd213814000L) / 1e7
return datetime.datetime.fromtimestamp(t, pytz.UTC)
def now():
return datetime.datetime.now(pytz.UTC)
def dt_to_ts(dt):
"""Convert a datetime to a timestamp
:param dt: datetime object
:returns: floating point timestamp in seconds/epoch
"""
ts = calendar.timegm(dt.utctimetuple())
ts += dt.microsecond / 1.e6
return ts
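For illustration, a quick check of the conversion (the function is repeated here so the snippet runs standalone; the example datetime is invented):

```python
import calendar
import datetime

def dt_to_ts(dt):
    # Seconds since the epoch, keeping sub-second precision.
    ts = calendar.timegm(dt.utctimetuple())
    ts += dt.microsecond / 1e6
    return ts

dt = datetime.datetime(2021, 1, 1, 0, 0, 0, 500000,
                       tzinfo=datetime.timezone.utc)
print(dt_to_ts(dt))  # -> 1609459200.5
```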
def normalize_str(s, lower=True):
if isinstance(s, str):
s = s.decode("utf8")
s = unicodedata.normalize("NFKD", s)
if lower:
return s.encode('ASCII', 'ignore').lower()
else:
return s.encode('ASCII', 'ignore')
def flatten_dict(d, prefix=''):
o = {(prefix + k): v for k, v in d.iteritems() if not isinstance(v, dict)}
for k, v in d.items():
if isinstance(v, dict):
i = flatten_dict(v, prefix=prefix + k + "_")
o.update({k: v for k, v in i.items()})
return o
def normalize_payload(payload):
if payload['interaction']['type'] != "twitter":
logger.error("Non twitter payload enecountered '%s'" %
minjsondump(payload))
tweet = copy.deepcopy(payload)
users = []
users += [tweet['interaction']['author']]
del tweet['interaction']['author']
if 'retweet' in tweet['twitter']:
tweet[u'is_retweet'] = 1
tweet['twitter'].update(tweet['twitter']['retweet'])
del tweet['twitter']['retweet']
users += [tweet['twitter']['retweeted']['user']]
tweet['twitter']['retweeted']['user'] = users[1]['screen_name']
tweet['twitter']['retweeted']['uid'] = users[1]['id']
tweet['twitter']['retweeted']['created_at'] = \
twittime_to_dt(tweet['twitter']['retweeted']['created_at'])
else:
tweet[u'is_retweet'] = 0
users[0].update(tweet['twitter']['user'])
tweet['interaction']['created_at'] = \
twittime_to_dt(tweet['interaction']['created_at'])
if 'links' in tweet['twitter']:
tweet['twitter']['links'] = u" ".join(tweet['twitter']['links'])
if 'domains' in tweet['twitter']:
tweet['twitter']['domains'] = u" ".join(tweet['twitter']['domains'])
if 'mentions' in tweet['twitter']:
tweet['twitter']['mentions'] = u" ".join(tweet['twitter']['mentions'])
tweet['twitter']['user'] = users[0]['screen_name']
tweet['twitter']['uid'] = users[0]['id']
tweet = flatten_dict(tweet)
del tweet['twitter_text']
del tweet['twitter_created_at']
del tweet['twitter_source']
if 'twitter_geo_longitude' in tweet:
del tweet['twitter_geo_longitude']
del tweet['twitter_geo_latitude']
if 'twitter_retweeted_geo_longitude' in tweet:
del tweet['twitter_retweeted_geo_longitude']
del tweet['twitter_retweeted_geo_latitude']
for u in users:
u['created_at'] = twittime_to_dt(u['created_at'])
if 'geo_enabled' in u and u['geo_enabled']:
u['geo_enabled'] = 1
if 'verified' in u and u['verified']:
u['geo_enabled'] = 1
return tweet, users
def datetimeU(*args, **kwargs):
kwargs['tzinfo'] = pytz.UTC
return datetime.datetime(*args, **kwargs)
def filter_date(src, start_t, stop_t, field="date"):
for data in src:
tdate = isotime_to_dt(data[field])
if tdate >= start_t and tdate < stop_t:
yield data
def read_mjson(fname):
with open(fname) as f:
for l in f:
yield json.loads(l)
class FlatFileSaver(object):
"""A class for handling saving blocks of text into files blocked by a
period"""
def __init__(self, dir, prefix, keyfunc=None):
"""@todo: to be defined
:param dir: @todo
:param prefix: @todo
"""
self._dir = dir
self._prefix = prefix
self._keyfunc = keyfunc or (lambda t: (t.strftime("%Y-%m-%d") +
("-%02d" % (t.hour / 12 * 12))))
self._files = collections.OrderedDict()
self._maxopen = 2
atexit.register(FlatFileSaver.cleanup, weakref.ref(self))
@staticmethod
def cleanup(self):
self = self()
if self:
for k, f in self._files.iteritems():
f.close()
def open(self, key):
print self._dir, self._prefix + "." + key
filename = os.path.join(self._dir, self._prefix + "." + key)
return open(filename, "a")
def writeline(self, time, line):
k = self._keyfunc(time)
f = None
if k not in self._files:
if len(self._files) >= self._maxopen:
_, f = self._files.popitem(last=False)
f.close()
self._files[k] = self.open(k)
f = self._files[k]
f.write(line)
f.write("\n")
def annotate_msg(msg):
"""Enriches a message per EMBERS policy
:param msg: message to be enriched
:returns: enriched message
"""
if 'data' in msg:
data = msg.data['data']
else:
data = msg
if 'embers_id' in data:
data['embersId'] = data['embers_id']
del data['embers_id']
elif 'embersId' not in data:
data['embersId'] = hashlib.sha1(str(data)).hexdigest()
if 'date' not in data:
try:
created_interaction = data["interaction"]["created_at"]
dt = twittime_to_dt(created_interaction)
created_interaction_iso = dt.isoformat()
data['date'] = created_interaction_iso
except:
data['date'] = "NULL"
return data
def fileobj(strobj, mode="r"):
"""@todo: Docstring for fileobj
:param strobj: either a string or a file object, if the former open a file
return if the latter pass it through
:returns: a file object
"""
if isinstance(strobj, basestring):
return open(strobj, mode)
else:
assert(hasattr(strobj, 'read'))
return strobj
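A Python 3 sketch of the same helper (the original targets Python 2, hence `basestring`; here `str` is used, and the pass-through branch is exercised with an in-memory buffer):

```python
import io

def fileobj(strobj, mode="r"):
    """Open strings as file paths; pass existing file objects through."""
    if isinstance(strobj, str):
        return open(strobj, mode)
    assert hasattr(strobj, 'read')
    return strobj

buf = io.StringIO("hello")
print(fileobj(buf).read())  # -> hello
```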
| 28.098039 | 78 | 0.592463 | 917 | 7,165 | 4.505998 | 0.283533 | 0.081317 | 0.029042 | 0.018393 | 0.141094 | 0.064376 | 0.016457 | 0 | 0 | 0 | 0 | 0.009445 | 0.261131 | 7,165 | 254 | 79 | 28.208661 | 0.771062 | 0.010607 | 0 | 0.081395 | 0 | 0 | 0.172365 | 0.0242 | 0 | 0 | 0 | 0.019685 | 0.005814 | 0 | null | null | 0 | 0.081395 | null | null | 0.005814 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
6633205e20a873d90ce6fc546139b3448195fdd7 | 337 | py | Python | movies/redbox/admin.py | thinkjson/movies.thinkjson.com | 5c17a4af931a6a504ae1ce36566173f3df8d25d8 | [
"MIT"
] | null | null | null | movies/redbox/admin.py | thinkjson/movies.thinkjson.com | 5c17a4af931a6a504ae1ce36566173f3df8d25d8 | [
"MIT"
] | null | null | null | movies/redbox/admin.py | thinkjson/movies.thinkjson.com | 5c17a4af931a6a504ae1ce36566173f3df8d25d8 | [
"MIT"
] | null | null | null | from django.contrib import admin
from redbox.models import Movie
class MovieAdmin(admin.ModelAdmin):
ordering = ('-score',)
list_display = ('title', 'productid', 'metascore', 'critics_score', 'audience_score', 'score', 'format', 'mpaarating',)
list_filter = ('format', 'mpaarating',)
admin.site.register(Movie, MovieAdmin)
| 33.7 | 123 | 0.712166 | 37 | 337 | 6.378378 | 0.675676 | 0.135593 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.127596 | 337 | 9 | 124 | 37.444444 | 0.802721 | 0 | 0 | 0 | 0 | 0 | 0.275964 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.285714 | 0 | 0.857143 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
6637a7ca569dd83163c88f04d7ca1825d54d233d | 731 | py | Python | drivers/oasissiren.py | BuloZB/turhouse | e76db0cdc96d9c9acfc5bd99ed94d9ad1dfecfa1 | [
"Apache-2.0"
] | null | null | null | drivers/oasissiren.py | BuloZB/turhouse | e76db0cdc96d9c9acfc5bd99ed94d9ad1dfecfa1 | [
"Apache-2.0"
] | null | null | null | drivers/oasissiren.py | BuloZB/turhouse | e76db0cdc96d9c9acfc5bd99ed94d9ad1dfecfa1 | [
"Apache-2.0"
] | 1 | 2016-11-21T16:56:07.000Z | 2016-11-21T16:56:07.000Z | # -*- coding: utf-8 -*-
from oasisbase import *
class OasisSiren(OasisBase):
__mapper_args__ = {
'polymorphic_identity': 'OasisSiren'
}
def __init__(self, device_name):
super(OasisSiren, self).__init__(device_name)
def processMessage(self, msg):
'''
process message
'''
self.processTamper(msg)
self.processButton(msg)
self.processBeacon(msg)
self.processBlackout(msg)
def processBlackout(self, msgDict):
"""
process blackout message
"""
if msgDict['msg'][3] == 'BLACKOUT:1':
pass
# TODO
if msgDict['msg'][3] == 'BLACKOUT:0':
pass
# TODO
| 21.5 | 53 | 0.53762 | 66 | 731 | 5.712121 | 0.515152 | 0.055703 | 0.06366 | 0.068966 | 0.111406 | 0 | 0 | 0 | 0 | 0 | 0 | 0.010373 | 0.340629 | 731 | 33 | 54 | 22.151515 | 0.771784 | 0.099863 | 0 | 0.117647 | 0 | 0 | 0.093023 | 0 | 0 | 0 | 0 | 0.030303 | 0 | 1 | 0.176471 | false | 0.117647 | 0.058824 | 0 | 0.352941 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
6642ebe6149fbc1a24df3b39f586d0387bd1c822 | 579 | py | Python | xbbg/core/pdblp.py | jkassies/xbbg | e30bec32e7b5674986b4aaec6ea28c1f8f2d1714 | [
"Apache-2.0"
] | null | null | null | xbbg/core/pdblp.py | jkassies/xbbg | e30bec32e7b5674986b4aaec6ea28c1f8f2d1714 | [
"Apache-2.0"
] | null | null | null | xbbg/core/pdblp.py | jkassies/xbbg | e30bec32e7b5674986b4aaec6ea28c1f8f2d1714 | [
"Apache-2.0"
] | null | null | null | from abc import abstractmethod
class Session(object):
@abstractmethod
def start(self): return False
class BCon(object):
def __init__(self, port=8194, timeout=500, **kwargs):
self.host = kwargs.pop('host', 'localhost')
self.port = port
self.timeout = timeout
self.debug = kwargs.pop('debug', False)
self.session = kwargs.pop('session', None)
self.identity = kwargs.pop('identity', None)
self._session = Session()
@abstractmethod
def start(self): pass
@abstractmethod
def stop(self): pass
| 22.269231 | 57 | 0.632124 | 67 | 579 | 5.38806 | 0.402985 | 0.099723 | 0.121884 | 0.144044 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.016055 | 0.246978 | 579 | 25 | 58 | 23.16 | 0.811927 | 0 | 0 | 0.176471 | 0 | 0 | 0.056995 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.235294 | false | 0.117647 | 0.058824 | 0.058824 | 0.411765 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
664a37ef927fc87f28d05bf9725082537e008ed1 | 857 | py | Python | ndnu-connect-backend/tutor_match/admin.py | NDNUSeniorProj2020/ndnu-connect | 28ec7e7dce3677a2a58a8cf79518873bcbf41521 | [
"MIT"
] | 2 | 2020-02-12T03:33:55.000Z | 2020-02-12T03:37:54.000Z | ndnu-connect-backend/tutor_match/admin.py | NDNUSeniorProj2020/ndnu-connect | 28ec7e7dce3677a2a58a8cf79518873bcbf41521 | [
"MIT"
] | 41 | 2020-01-22T04:53:28.000Z | 2021-09-22T18:33:34.000Z | ndnu-connect-backend/tutor_match/admin.py | NDNUSeniorProj2020/ndnu-connect | 28ec7e7dce3677a2a58a8cf79518873bcbf41521 | [
"MIT"
] | 2 | 2020-02-12T03:07:06.000Z | 2020-02-12T22:24:29.000Z | from django.contrib import admin
from .models import Tutor
from .models import Student
from .models import Department
from .models import Subject
from .models import Schedule
from .models import SubjToDept
admin.site.site_header = "NDNU Connect: Admin Portal"
admin.site.index_title = "NDNU Connect: Admin Portal"
admin.site.site_title = "NDNU Connect: Admin"
class StudentAdmin(admin.ModelAdmin):
list_display = ('user', 'major', 'standing', 'method',)
list_filter = ('major', 'standing',)
class TutorAdmin(admin.ModelAdmin):
list_display = ('user', 'get_subjects', 'rating', 'num_of_ratings', )
list_filter = ('rating', 'subject',)
admin.site.register(Tutor, TutorAdmin)
admin.site.register(Student, StudentAdmin)
admin.site.register(Department)
admin.site.register(Subject)
admin.site.register(Schedule)
admin.site.register(SubjToDept)
| 28.566667 | 73 | 0.763127 | 109 | 857 | 5.908257 | 0.33945 | 0.125776 | 0.149068 | 0.068323 | 0.189441 | 0.096273 | 0 | 0 | 0 | 0 | 0 | 0 | 0.112019 | 857 | 29 | 74 | 29.551724 | 0.846255 | 0 | 0 | 0 | 0 | 0 | 0.18203 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.318182 | 0 | 0.590909 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
b0781fefc933340aa178cacf26a7dae0b20881c4 | 16,754 | py | Python | endless_piano.py | asigalov61/Endless-Piano | 61cccf92fb0221cded36aef3730ab5798c977b17 | [
"Apache-2.0"
] | 5 | 2021-05-21T09:41:50.000Z | 2022-01-06T14:58:54.000Z | endless_piano.py | asigalov61/Endless-Piano | 61cccf92fb0221cded36aef3730ab5798c977b17 | [
"Apache-2.0"
] | null | null | null | endless_piano.py | asigalov61/Endless-Piano | 61cccf92fb0221cded36aef3730ab5798c977b17 | [
"Apache-2.0"
] | 1 | 2021-05-23T10:52:50.000Z | 2021-05-23T10:52:50.000Z | # -*- coding: utf-8 -*-
"""Endless_Piano.ipynb
Automatically generated by Colaboratory.
Original file is located at
https://colab.research.google.com/drive/1K32GFmZDDvEjMX1ZjIaa5Qgmvwi5NIFK
# Endless Piano (ver. 7.0)
***
## Endless Semi-Generative Performance Piano Music Maker
***
### Powered by tegridy-tools TMIDI Optimus processors & GiantMIDI Dataset https://github.com/bytedance/GiantMIDI-Piano
***
#### Project Los Angeles
#### Tegridy Code 2021
***
# Setup environment
"""
#@title Install tegridy-tools
!git clone https://github.com/asigalov61/tegridy-tools
#@title Import all needed modules
print('Loading needed modules. Please wait...')
import os
import copy
import tqdm
from tqdm import auto
import secrets
import random
if not os.path.exists('/content/Dataset'):
os.makedirs('/content/Dataset')
os.chdir('/content/tegridy-tools/tegridy-tools')
import TMIDI
# stats stuff
import statistics
from scipy.spatial import distance
os.chdir('/content/')
print('Loading complete. Enjoy! :)')
"""# (BEST OPTION) Download and load processed GiantMIDI dataset
## NOTE: Loading will take about 5 minutes and 10GB RAM
"""
# Commented out IPython magic to ensure Python compatibility.
#@title Download processed GiantMIDI dataset
# %cd /content/
!wget --no-check-certificate -O GiantMIDI.zip "https://onedrive.live.com/download?cid=8A0D502FC99C608F&resid=8A0D502FC99C608F%2118491&authkey=AKrxNM53z9DGX2Y"
!unzip -j GiantMIDI.zip
#@title Load the dataset
#@markdown NOTE: This may take a while. Please wait...
slices_length_in_miliseconds = 4000 #@param {type:"slider", min:1000, max:8000, step:1000}
overlap_notes_per_slice = 2
print('=' * 70)
print('Loading GiantMIDI Dataset. Please wait...')
print('=' * 70)
quarter_pairs1 = TMIDI.Tegridy_Any_Pickle_File_Loader('/content/GiantMIDI')
print('=' * 70)
print('Randomizing the dataset...')
random.shuffle(quarter_pairs1[0])
print('=' * 70)
print('Slicing the dataset...')
quarter_pairs = []
for qp in auto.tqdm(quarter_pairs1[0]):
quarter_pairs.extend(TMIDI.Tegridy_Score_Slicer(qp, slices_length_in_miliseconds, overlap_notes=overlap_notes_per_slice)[0])
print('=' * 70)
#print('Transforming the dataset...')
#quarter_pairs = []
#for qp in auto.tqdm(quarter_pairs2):
# quarter_pairs.append(TMIDI.Tegridy_Transform(qp))
print('Randomizing the score slices...')
random.shuffle(quarter_pairs)
print('=' * 70)
print('Processing finished! Enjoy! :)')
print('=' * 70)
"""# (ALTERNATIVE) Download and load processed POP909/POP17k dataset
## NOTE: This dataset is small so use it if you have a limited RAM
"""
# Commented out IPython magic to ensure Python compatibility.
#@title Download processed POP909/POP17k dataset
# %cd /content/
!wget https://github.com/asigalov61/Endless-Piano/raw/main/Model/Endless-Piano-Music-Dataset.zip
!unzip -j Endless-Piano-Music-Dataset.zip
#@title Load POP909/POP17k dataset
#@markdown NOTE: This may take a while. Please wait...
slices_length_in_miliseconds = 4000 #@param {type:"slider", min:1000, max:8000, step:1000}
overlap_notes_per_slice = 2
print('=' * 70)
print('Loading POP909/POP17k Dataset. Please wait...')
print('=' * 70)
quarter_pairs1 = TMIDI.Tegridy_Any_Pickle_File_Loader('/content/Endless-Piano-Music-Dataset')
print('=' * 70)
print('Randomizing the dataset...')
random.shuffle(quarter_pairs1[0])
print('=' * 70)
print('Slicing the dataset...')
quarter_pairs = []
for qp in auto.tqdm(quarter_pairs1[0]):
quarter_pairs.extend(TMIDI.Tegridy_Score_Slicer(qp, slices_length_in_miliseconds, overlap_notes=overlap_notes_per_slice)[0])
print('=' * 70)
#print('Transforming the dataset...')
#quarter_pairs = []
#for qp in auto.tqdm(quarter_pairs2):
# quarter_pairs.append(TMIDI.Tegridy_Transform(qp))
print('Randomizing the score slices...')
random.shuffle(quarter_pairs)
print('=' * 70)
print('Processing finished! Enjoy! :)')
print('=' * 70)
"""# (ALTERNATIVE) Download and load preprocessed MAESTRO 3.0 dataset"""
# Commented out IPython magic to ensure Python compatibility.
#@title Download processed MAESTRO 3.0 dataset
# %cd /content/
!wget --no-check-certificate -O Endless-Piano-Music-Dataset-2.zip "https://onedrive.live.com/download?cid=8A0D502FC99C608F&resid=8A0D502FC99C608F%2118492&authkey=APgohbotB54fmeE"
!unzip -j Endless-Piano-Music-Dataset-2.zip
#@title Load MAESTRO 3.0 dataset
#@markdown NOTE: This may take a while. Please wait...
slices_length_in_miliseconds = 4000 #@param {type:"slider", min:1000, max:8000, step:1000}
overlap_notes_per_slice = 2
print('=' * 70)
print('Loading MAESTRO 3.0 Dataset. Please wait...')
print('=' * 70)
quarter_pairs1 = TMIDI.Tegridy_Any_Pickle_File_Loader('/content/Endless-Piano-Music-Dataset-2')
print('=' * 70)
print('Randomizing the dataset...')
random.shuffle(quarter_pairs1[0])
print('=' * 70)
print('Slicing the dataset...')
quarter_pairs = []
for qp in auto.tqdm(quarter_pairs1[0]):
quarter_pairs.extend(TMIDI.Tegridy_Score_Slicer(qp, slices_length_in_miliseconds, overlap_notes=overlap_notes_per_slice)[0])
print('=' * 70)
#print('Transforming the dataset...')
#quarter_pairs = []
#for qp in auto.tqdm(quarter_pairs2):
# quarter_pairs.append(TMIDI.Tegridy_Transform(qp))
print('Randomizing the score slices...')
random.shuffle(quarter_pairs)
print('=' * 70)
print('Processing finished! Enjoy! :)')
print('=' * 70)
"""# (CUSTOM) Process your own dataset"""
#@title Process MIDIs to special MIDI dataset with Optimus MIDI Processor
desired_dataset_name = "Endless-Piano-Music-Dataset" #@param {type:"string"}
file_name_to_output_dataset_to = "/content/Endless-Piano-Music-Dataset" #@param {type:"string"}
desired_MIDI_channel_to_process = 0 #@param {type:"slider", min:-1, max:16, step:1}
encode_MIDI_channels = False #@param {type:"boolean"}
encode_velocities = False #@param {type:"boolean"}
chordify_input_MIDIs = False #@param {type:"boolean"}
melody_conditioned_encoding = False #@param {type:"boolean"}
melody_pitch_baseline = 60 #@param {type:"slider", min:1, max:127, step:1}
time_denominator = 1 #@param {type:"slider", min:1, max:20, step:1}
chars_encoding_offset = 33 #@param {type:"number"}
slices_length_in_miliseconds = 4000 #@param {type:"slider", min:1000, max:8000, step:1000}
transform_to_pitch = 0 #@param {type:"slider", min:0, max:127, step:1}
perfect_timings = False #@param {type:"boolean"}
MuseNet_encoding = False #@param {type:"boolean"}
print('=' * 70)
print('TMIDI Optimus Processor')
print('Starting up...')
print('=' * 70)
###########
average_note_pitch = 0
min_note = 127
max_note = 0
files_count = 0
ev = 0
notes_list_f = []
chords_list_f = []
melody_list_f = []
chords_list = []
chords_count = 0
melody_chords = []
melody_count = 0
TXT = ''
melody = []
chords = []
bf = 0
###########
print('Loading MIDI files...')
print('This may take a while on a large dataset in particular.')
dataset_addr = "/content/Dataset/"
os.chdir(dataset_addr)
filez = list()
for (dirpath, dirnames, filenames) in os.walk(dataset_addr):
filez += [os.path.join(dirpath, file) for file in filenames]
print('=' * 70)
# Stamping the dataset
print('Stamping the dataset...')
TXT_String = 'DATASET=' + str(desired_dataset_name) + chr(10)
TXT_String += 'CHARS_ENCODING_OFFSET=' + str(chars_encoding_offset) + chr(10)
TXT_String += 'LEGEND=STA-DUR-PTC'
if encode_velocities:
TXT_String += '-VEL'
if encode_MIDI_channels:
TXT_String += '-CHA'
TXT_String += chr(10)
pf = []
kar_ev = []
pxp_ev = []
print('=' * 70)
print('Processing MIDI files. Please wait...')
for f in tqdm.auto.tqdm(filez):
try:
fn = os.path.basename(f)
fnn = fn
fn1 = fnn.split('.')[0]
fn3 = ['MIDI']
TXT, melody, chords = TMIDI.Optimus_MIDI_TXT_Processor(f,
line_by_line_output=False,
chordify_TXT=chordify_input_MIDIs,
output_MIDI_channels=encode_MIDI_channels,
char_offset=chars_encoding_offset,
dataset_MIDI_events_time_denominator=time_denominator,
output_velocity=encode_velocities,
MIDI_channel=desired_MIDI_channel_to_process,
MIDI_patch=range(0,127),
melody_conditioned_encoding=melody_conditioned_encoding,
melody_pitch_baseline=melody_pitch_baseline,
song_name=fn1,
perfect_timings=perfect_timings,
musenet_encoding=MuseNet_encoding,
transform=transform_to_pitch)
chords_list_f.append(chords)
melody_list_f.append(melody)
pf.append([fn1, f.split('/')[-2], fn3])
files_count += 1
except KeyboardInterrupt:
print('Exiting...Saving progress...')
break
except:
bf += 1
print('Bad MIDI:', f)
print('Count:', bf)
continue
print('Task complete! :)')
print('=' * 70)
print('Number of processed dataset MIDI files:', files_count)
print('First chord event:', chords_list_f[0][0], 'Last chord event:', chords_list_f[-1][-1])
print('First melody event:', melody_list_f[0][0], 'Last Melody event:', melody_list_f[-1][-1])
print('=' * 70)
# Dataset
print('Finalizing the dataset...')
MusicDataset = [chords_list_f, melody_list_f, kar_ev, filez, pf, bf, files_count]
print('=' * 70)
print('Randomizing the dataset...')
random.shuffle(chords_list_f)
print('=' * 70)
quarter_pairs1 = [chords_list_f]
print('Slicing the dataset...')
quarter_pairs = []
for d in auto.tqdm(quarter_pairs1[0]):
quarter_pairs.extend(TMIDI.Tegridy_Score_Slicer(d, slices_length_in_miliseconds)[0])
print('=' * 70)
print('Randomizing the score slices...')
random.shuffle(quarter_pairs)
print('=' * 70)
TMIDI.Tegridy_Pickle_File_Writer(MusicDataset, file_name_to_output_dataset_to)
"""# Generate Endless Classical Piano Music"""
#@title Generate music with Optimus score slices signatures matching
#@markdown NOTE: If nothing is being generated or if the song is too short: re-run the generator.
#@markdown NOTE: Yes, it is slow, and yes, you may need to re-run many times before generating anything decent. This is the real price of present-day generative music.
number_of_slices_to_try_to_generate = 20 #@param {type:"slider", min:1, max:100, step:1}
slow_extra_match = "minkowski" #@param ["velocities", "minkowski"]
print('=' * 70)
print('Endless Piano')
print('=' * 70)
print('Starting up...')
print('=' * 70)
print('Number of MIDI compositions in the dataset:', len(quarter_pairs1[0]))
print('Number of compositions scores slices in the dataset:', len(quarter_pairs))
print('=' * 70)
print('Randomizing score slices...')
random.shuffle(quarter_pairs)
print('=' * 70)
# Constants
c = 2
total_notes = 0
###########
idx = secrets.randbelow(len(quarter_pairs))
song = []
song.append(quarter_pairs[idx])
print('Starting slice index:', idx, 'out of', len(quarter_pairs))
print('=' * 70)
print('Starting main search...')
print('=' * 70)
print('Extra match type requested:', slow_extra_match)
print('=' * 70)
for i in auto.tqdm(range(number_of_slices_to_try_to_generate)):
    try:
        sig1 = TMIDI.Optimus_Signature(song[-1])[1] # [1] == Best Optimus Signature
        # Decoding sig...
        p1mh = sig1[0] # PMedH
        p1m = sig1[1] # PMed
        p1ml = sig1[2] # PMedL
        d1 = sig1[3] # Duration
        v1 = sig1[4] # Velocity
        mtds1 = sig1[5] # Beat
        for qp in quarter_pairs:
            if len(qp) > 1:
                sig2 = TMIDI.Optimus_Signature(qp)[1]
                p2mh = sig2[0]
                p2m = sig2[1]
                p2ml = sig2[2]
                d2 = sig2[3]
                v2 = sig2[4]
                mtds2 = sig2[5]
                # Search with exact velocity matching
                if slow_extra_match == 'velocities':
                    if p1m == p2m and p1mh == p2mh and p1ml == p2ml:
                        if d1 == d2:
                            if v1 == v2:
                                if mtds1 == mtds2:
                                    if qp not in song:
                                        song.append(qp)
                                        total_notes += len(song[-1])
                                        print('Found', c, 'slices /', total_notes, 'notes...')
                                        c += 1
                                        break
                # Search with Minkowski distance matching instead of exact velocities
                else:
                    if p1m == p2m and p1mh == p2mh and p1ml == p2ml:
                        if d1 == d2:
                            if mtds1 == mtds2:
                                if distance.minkowski(sig1, sig2) < 2:
                                    if qp not in song:
                                        song.append(qp)
                                        total_notes += len(song[-1])
                                        print('Found', c, 'slices /', total_notes, 'notes...')
                                        c += 1
                                        break
        if c == i + 1:
            print('=' * 70)
            print('Generator exhausted. Stopping...')
            break
    except KeyboardInterrupt:
        print('=' * 70)
        print('Keyboard interrupt requested...')
        print('Saving progress and writing resulting MIDI...')
        break
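The two match modes above differ only in their final gates: 'velocities' requires all six signature components to match exactly, while 'minkowski' drops the exact velocity check and instead accepts any candidate whose signature lies within Minkowski distance 2 of the current one (SciPy's `distance.minkowski`, default order p=2, i.e. Euclidean). A minimal pure-Python sketch of that metric, using made-up flat signature values:

```python
def minkowski_distance(u, v, p=2):
    """Minkowski distance of order p between two equal-length vectors.
    p=2 is the Euclidean distance, SciPy's default for distance.minkowski."""
    return sum(abs(a - b) ** p for a, b in zip(u, v)) ** (1 / p)

# Hypothetical flat signatures that agree everywhere except velocity:
sig_a = [60, 64, 67, 480, 90, 4]
sig_b = [60, 64, 67, 480, 89, 4]
print(minkowski_distance(sig_a, sig_b))  # → 1.0, i.e. under the < 2 threshold
```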
print('=' * 70)
if c >= i + 1:
    print('Finalizing generated song...')
    song1 = []
    for s in song:
        song1.extend(s)
    # Keep only proper event lists (drop any stray non-list entries)
    song1 = [s for s in song1 if isinstance(s, list)]
    print('=' * 70)
    print('Final song stats...')
    sig1 = TMIDI.Optimus_Signature(song1)[1] # [1] == Best Optimus Signature
    # Decoding sig...
    p1mh = sig1[0] # PMedH
    p1m = sig1[1] # PMed
    p1ml = sig1[2] # PMedL
    d1 = sig1[3] # Duration
    v1 = sig1[4] # Velocity
    mtds1 = sig1[5] # Beat
    print('Song PMH:', p1mh)
    print('Song PM:', p1m)
    print('Song PML:', p1ml)
    print('Song DUR:', d1)
    print('Song VEL:', v1)
    print('Song BT:', mtds1)
    print('=' * 70)
    print('Analyzing generated song...')
    count = 1
    ptime = song1[0][1]
    for s in song1:
        if abs(s[1] - ptime) > 2000:
            count += 1
        ptime = s[1]
    print('Song has', count, 'unique pieces.')
    if count < 4:
        print('PLAGIARISM WARNING: Your composition is most likely plagiarism')
    print('=' * 70)
    print('Adding unique pieces labels to the song...')
    song2 = []
    ptime = song1[0][1]
    song2.append(['text_event', song1[0][1], str(song1[0])])
    for s in song1:
        if abs(s[1] - ptime) > 2000:
            song2.append(['text_event', s[1], str(s)])
        song2.append(s)
        ptime = s[1]
    print('=' * 70)
    print("Recalculating the song's timings...")
    song3 = TMIDI.Tegridy_Timings_Converter(song2)[0]
    print('=' * 70)
    print('Total song length:', len(song3))
    print('=' * 70)
    comp_numb = sum([y[4] for y in song3 if y[0] == 'note'])
    comp_length = len(song3)
    print('Endless Piano Composition #:', comp_numb, '-', comp_length)
    print('=' * 70)
    stats = TMIDI.Tegridy_SONG_to_MIDI_Converter(song3,
                                                 output_signature='Endless Piano',
                                                 output_file_name='/content/Endless-Piano-Music-Composition',
                                                 list_of_MIDI_patches=[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
                                                 track_name='Composition #:' + str(comp_numb) + '-' + str(comp_length))
    print('=' * 70)
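The song analysis above counts "unique pieces" by watching for jumps of more than 2000 ms between consecutive event start times; the same gap rule is reused when inserting `text_event` labels. The segmentation logic in isolation (event layout assumed to be `[type, start_time_ms, ...]`, as in TMIDI song lists):

```python
def count_pieces(events, gap_ms=2000):
    """Count contiguous pieces: a new piece begins whenever the start time
    jumps by more than gap_ms relative to the previous event."""
    if not events:
        return 0
    count = 1
    ptime = events[0][1]
    for e in events:
        if abs(e[1] - ptime) > gap_ms:
            count += 1
        ptime = e[1]
    return count

# Two runs of notes separated by a 4.9 s silence form two pieces:
print(count_pieces([['note', 0], ['note', 100], ['note', 5000], ['note', 5100]]))  # → 2
```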
"""# Plot and Listen"""
#@title Install prerequisites
!apt install fluidsynth # pip install does not work here for some reason; only apt does
!pip install midi2audio
!pip install pretty_midi
#@title Plot and listen to the last generated composition
#@markdown NOTE: May be very slow with the long compositions
import os
from midi2audio import FluidSynth
from IPython.display import display, Javascript, HTML, Audio
import pretty_midi
import librosa.display
import matplotlib.pyplot as plt
from mpl_toolkits import mplot3d
import numpy as np
print('Synthesizing the last output MIDI... ')
fname = '/content/Endless-Piano-Music-Composition'
fn = os.path.basename(fname + '.mid')
fn1 = fn.split('.')[0]
print('Plotting the composition. Please wait...')
pm = pretty_midi.PrettyMIDI(fname + '.mid')
# Retrieve piano roll of the MIDI file
piano_roll = pm.get_piano_roll()
plt.figure(figsize=(14, 5))
librosa.display.specshow(piano_roll, x_axis='time', y_axis='cqt_note', fmin=1, hop_length=160, sr=16000, cmap=plt.cm.hot)
plt.title('Composition: ' + fn1)
FluidSynth("/usr/share/sounds/sf2/FluidR3_GM.sf2", 16000).midi_to_audio(str(fname + '.mid'), str(fname + '.wav'))
Audio(str(fname + '.wav'), rate=16000)
"""# Congrats! You did it! :)""" | 29.758437 | 178 | 0.642712 | 2,190 | 16,754 | 4.776256 | 0.208676 | 0.033461 | 0.037859 | 0.006501 | 0.424092 | 0.371224 | 0.340153 | 0.323614 | 0.307648 | 0.307648 | 0 | 0.043982 | 0.216963 | 16,754 | 563 | 179 | 29.758437 | 0.753335 | 0.16026 | 0 | 0.402857 | 1 | 0.005714 | 0.197232 | 0.023914 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.045714 | null | null | 0.325714 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
# === ex3.py (BryanQ98/Learning_Python, MIT) ===
print "I will now count my chickens:"
# + is for addition
# - is for subtraction
# / is for division
# * is for multiplication
# < is less than
# > is greater than
# <= is less than or equal
# >= is greater than or equal
# % is the symbol for modulus
# 25 plus (30 divided by 6) = 30
print "Hens", 25 + 30 / 6
# 100 minus (25 times 3, modulo 4) = 97
print "Roosters", 100 - 25 * 3 % 4
print "Now I will count the eggs:"
# long equations become easier to solve
print 3 + 2 + 1 - 5 + 4 % 2 - 1 / 4 + 6
print "Is it true that 3 + 2 < 5 - 7?"
# is 3 plus 2 less than 5 minus 7? (no: 5 < -2 is False)
print 3 + 2 < 5 - 7
# The quotes will print the question, then the program does and prints the math
print "What is 3 + 2?", 3 + 2
print "What is 5 - 7?", 5 - 7
print "Oh, that's why it's False."
print "How about some more."
# Question is printed, inequality is solved
print "Is it greater?", 5 > -2
print "Is it greater or equal?", 5>= -2
print "Is it less or equal?", 5<= -2 | 24.925 | 80 | 0.615848 | 179 | 997 | 3.430168 | 0.391061 | 0.016287 | 0.058632 | 0.013029 | 0.035831 | 0 | 0 | 0 | 0 | 0 | 0 | 0.0875 | 0.277834 | 997 | 40 | 81 | 24.925 | 0.765278 | 0.45336 | 0 | 0 | 0 | 0 | 0.464358 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 2 |
# === btrdbextras/eventproc/protobuff/api_pb2.py (PingThingsIO/btrdbextras, BSD-3-Clause) ===
# -*- coding: utf-8 -*-
# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: api.proto
"""Generated protocol buffer code."""
from google.protobuf import descriptor as _descriptor
from google.protobuf import message as _message
from google.protobuf import reflection as _reflection
from google.protobuf import symbol_database as _symbol_database
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
DESCRIPTOR = _descriptor.FileDescriptor(
name='api.proto',
package='eventprocapi',
syntax='proto3',
serialized_options=None,
create_key=_descriptor._internal_create_key,
serialized_pb=b'\n\tapi.proto\x12\x0c\x65ventprocapi\"#\n\x13ListHandlersRequest\x12\x0c\n\x04hook\x18\x01 \x01(\t\"?\n\x14ListHandlersResponse\x12\'\n\x08handlers\x18\x01 \x03(\x0b\x32\x15.eventprocapi.Handler\"\xe7\x01\n\x07Handler\x12\n\n\x02id\x18\x01 \x01(\x05\x12\x0c\n\x04hook\x18\x02 \x01(\t\x12\x0c\n\x04name\x18\x03 \x01(\t\x12\x0f\n\x07version\x18\x04 \x01(\x05\x12\x10\n\x08\x63\x61llable\x18\x05 \x01(\t\x12\x0b\n\x03tag\x18\x06 \x03(\t\x12\x19\n\x11notify_on_success\x18\x07 \x01(\t\x12\x19\n\x11notify_on_failure\x18\x08 \x01(\t\x12\x12\n\ncreated_by\x18\t \x01(\t\x12\x12\n\ncreated_at\x18\n \x01(\x03\x12\x12\n\nupdated_by\x18\x0b \x01(\t\x12\x12\n\nupdated_at\x18\x0c \x01(\x03\"\x12\n\x10ListHooksRequest\"6\n\x11ListHooksResponse\x12!\n\x05hooks\x18\x01 \x03(\x0b\x32\x12.eventprocapi.Hook\"\x14\n\x04Hook\x12\x0c\n\x04name\x18\x01 \x01(\t\"\xb1\x01\n\x0cRegistration\x12\x0c\n\x04name\x18\x01 \x01(\t\x12\x0c\n\x04hook\x18\x02 \x01(\t\x12\x0c\n\x04tags\x18\x03 \x03(\t\x12\x0c\n\x04\x62lob\x18\x04 \x01(\x0c\x12\x19\n\x11notify_on_success\x18\x05 \x01(\t\x12\x19\n\x11notify_on_failure\x18\x06 \x01(\t\x12\x14\n\x0c\x64\x65pendencies\x18\x07 \x01(\t\x12\x0c\n\x04user\x18\x08 \x01(\t\x12\x0f\n\x07\x61pi_key\x18\t \x01(\t\"C\n\x0fRegisterRequest\x12\x30\n\x0cregistration\x18\x01 \x01(\x0b\x32\x1a.eventprocapi.Registration\":\n\x10RegisterResponse\x12&\n\x07handler\x18\x01 \x01(\x0b\x32\x15.eventprocapi.Handler\"\x1f\n\x11\x44\x65registerRequest\x12\n\n\x02id\x18\x01 \x01(\x05\" \n\x12\x44\x65registerResponse\x12\n\n\x02id\x18\x01 \x01(\x05\x32\xe1\x02\n\x16\x45ventProcessingService\x12N\n\tListHooks\x12\x1e.eventprocapi.ListHooksRequest\x1a\x1f.eventprocapi.ListHooksResponse\"\x00\x12W\n\x0cListHandlers\x12!.eventprocapi.ListHandlersRequest\x1a\".eventprocapi.ListHandlersResponse\"\x00\x12K\n\x08Register\x12\x1d.eventprocapi.RegisterRequest\x1a\x1e.eventprocapi.RegisterResponse\"\x00\x12Q\n\nDeregister\x12\x1f.eventprocapi.DeregisterRequest\x1a .eventprocapi.DeregisterResponse\"\x00\x62\x06proto3'
)
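The `serialized_pb` blob above is the wire-format `FileDescriptorProto` for `api.proto`, which is why the message and service names are readable inside it. In protobuf wire format every field begins with a tag; for field numbers up to 15 that tag is a single byte of the form `(field_number << 3) | wire_type`. A tiny decoder for such single-byte tags:

```python
def decode_tag(tag_byte):
    """Split a single-byte protobuf tag into (field_number, wire_type)."""
    return tag_byte >> 3, tag_byte & 0x07

# \x0a == 0x0A opens field 1 with wire type 2 (length-delimited: string/bytes/message)
print(decode_tag(0x0A))  # → (1, 2)
# \x12 opens field 2, also length-delimited
print(decode_tag(0x12))  # → (2, 2)
```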
_LISTHANDLERSREQUEST = _descriptor.Descriptor(
name='ListHandlersRequest',
full_name='eventprocapi.ListHandlersRequest',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='hook', full_name='eventprocapi.ListHandlersRequest.hook', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=27,
serialized_end=62,
)
_LISTHANDLERSRESPONSE = _descriptor.Descriptor(
name='ListHandlersResponse',
full_name='eventprocapi.ListHandlersResponse',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='handlers', full_name='eventprocapi.ListHandlersResponse.handlers', index=0,
number=1, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=64,
serialized_end=127,
)
_HANDLER = _descriptor.Descriptor(
name='Handler',
full_name='eventprocapi.Handler',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='id', full_name='eventprocapi.Handler.id', index=0,
number=1, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='hook', full_name='eventprocapi.Handler.hook', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='name', full_name='eventprocapi.Handler.name', index=2,
number=3, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='version', full_name='eventprocapi.Handler.version', index=3,
number=4, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='callable', full_name='eventprocapi.Handler.callable', index=4,
number=5, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='tag', full_name='eventprocapi.Handler.tag', index=5,
number=6, type=9, cpp_type=9, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='notify_on_success', full_name='eventprocapi.Handler.notify_on_success', index=6,
number=7, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='notify_on_failure', full_name='eventprocapi.Handler.notify_on_failure', index=7,
number=8, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='created_by', full_name='eventprocapi.Handler.created_by', index=8,
number=9, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='created_at', full_name='eventprocapi.Handler.created_at', index=9,
number=10, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='updated_by', full_name='eventprocapi.Handler.updated_by', index=10,
number=11, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='updated_at', full_name='eventprocapi.Handler.updated_at', index=11,
number=12, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=130,
serialized_end=361,
)
_LISTHOOKSREQUEST = _descriptor.Descriptor(
name='ListHooksRequest',
full_name='eventprocapi.ListHooksRequest',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=363,
serialized_end=381,
)
_LISTHOOKSRESPONSE = _descriptor.Descriptor(
name='ListHooksResponse',
full_name='eventprocapi.ListHooksResponse',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='hooks', full_name='eventprocapi.ListHooksResponse.hooks', index=0,
number=1, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=383,
serialized_end=437,
)
_HOOK = _descriptor.Descriptor(
name='Hook',
full_name='eventprocapi.Hook',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='name', full_name='eventprocapi.Hook.name', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=439,
serialized_end=459,
)
_REGISTRATION = _descriptor.Descriptor(
name='Registration',
full_name='eventprocapi.Registration',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='name', full_name='eventprocapi.Registration.name', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='hook', full_name='eventprocapi.Registration.hook', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='tags', full_name='eventprocapi.Registration.tags', index=2,
number=3, type=9, cpp_type=9, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='blob', full_name='eventprocapi.Registration.blob', index=3,
number=4, type=12, cpp_type=9, label=1,
has_default_value=False, default_value=b"",
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='notify_on_success', full_name='eventprocapi.Registration.notify_on_success', index=4,
number=5, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='notify_on_failure', full_name='eventprocapi.Registration.notify_on_failure', index=5,
number=6, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='dependencies', full_name='eventprocapi.Registration.dependencies', index=6,
number=7, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='user', full_name='eventprocapi.Registration.user', index=7,
number=8, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='api_key', full_name='eventprocapi.Registration.api_key', index=8,
number=9, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=462,
serialized_end=639,
)
_REGISTERREQUEST = _descriptor.Descriptor(
name='RegisterRequest',
full_name='eventprocapi.RegisterRequest',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='registration', full_name='eventprocapi.RegisterRequest.registration', index=0,
number=1, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=641,
serialized_end=708,
)
_REGISTERRESPONSE = _descriptor.Descriptor(
name='RegisterResponse',
full_name='eventprocapi.RegisterResponse',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='handler', full_name='eventprocapi.RegisterResponse.handler', index=0,
number=1, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=710,
serialized_end=768,
)
_DEREGISTERREQUEST = _descriptor.Descriptor(
name='DeregisterRequest',
full_name='eventprocapi.DeregisterRequest',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='id', full_name='eventprocapi.DeregisterRequest.id', index=0,
number=1, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=770,
serialized_end=801,
)
_DEREGISTERRESPONSE = _descriptor.Descriptor(
name='DeregisterResponse',
full_name='eventprocapi.DeregisterResponse',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='id', full_name='eventprocapi.DeregisterResponse.id', index=0,
number=1, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=803,
serialized_end=835,
)
_LISTHANDLERSRESPONSE.fields_by_name['handlers'].message_type = _HANDLER
_LISTHOOKSRESPONSE.fields_by_name['hooks'].message_type = _HOOK
_REGISTERREQUEST.fields_by_name['registration'].message_type = _REGISTRATION
_REGISTERRESPONSE.fields_by_name['handler'].message_type = _HANDLER
DESCRIPTOR.message_types_by_name['ListHandlersRequest'] = _LISTHANDLERSREQUEST
DESCRIPTOR.message_types_by_name['ListHandlersResponse'] = _LISTHANDLERSRESPONSE
DESCRIPTOR.message_types_by_name['Handler'] = _HANDLER
DESCRIPTOR.message_types_by_name['ListHooksRequest'] = _LISTHOOKSREQUEST
DESCRIPTOR.message_types_by_name['ListHooksResponse'] = _LISTHOOKSRESPONSE
DESCRIPTOR.message_types_by_name['Hook'] = _HOOK
DESCRIPTOR.message_types_by_name['Registration'] = _REGISTRATION
DESCRIPTOR.message_types_by_name['RegisterRequest'] = _REGISTERREQUEST
DESCRIPTOR.message_types_by_name['RegisterResponse'] = _REGISTERRESPONSE
DESCRIPTOR.message_types_by_name['DeregisterRequest'] = _DEREGISTERREQUEST
DESCRIPTOR.message_types_by_name['DeregisterResponse'] = _DEREGISTERRESPONSE
_sym_db.RegisterFileDescriptor(DESCRIPTOR)
ListHandlersRequest = _reflection.GeneratedProtocolMessageType('ListHandlersRequest', (_message.Message,), {
'DESCRIPTOR' : _LISTHANDLERSREQUEST,
'__module__' : 'api_pb2'
# @@protoc_insertion_point(class_scope:eventprocapi.ListHandlersRequest)
})
_sym_db.RegisterMessage(ListHandlersRequest)
ListHandlersResponse = _reflection.GeneratedProtocolMessageType('ListHandlersResponse', (_message.Message,), {
'DESCRIPTOR' : _LISTHANDLERSRESPONSE,
'__module__' : 'api_pb2'
# @@protoc_insertion_point(class_scope:eventprocapi.ListHandlersResponse)
})
_sym_db.RegisterMessage(ListHandlersResponse)
Handler = _reflection.GeneratedProtocolMessageType('Handler', (_message.Message,), {
'DESCRIPTOR' : _HANDLER,
'__module__' : 'api_pb2'
# @@protoc_insertion_point(class_scope:eventprocapi.Handler)
})
_sym_db.RegisterMessage(Handler)
ListHooksRequest = _reflection.GeneratedProtocolMessageType('ListHooksRequest', (_message.Message,), {
'DESCRIPTOR' : _LISTHOOKSREQUEST,
'__module__' : 'api_pb2'
# @@protoc_insertion_point(class_scope:eventprocapi.ListHooksRequest)
})
_sym_db.RegisterMessage(ListHooksRequest)
ListHooksResponse = _reflection.GeneratedProtocolMessageType('ListHooksResponse', (_message.Message,), {
'DESCRIPTOR' : _LISTHOOKSRESPONSE,
'__module__' : 'api_pb2'
# @@protoc_insertion_point(class_scope:eventprocapi.ListHooksResponse)
})
_sym_db.RegisterMessage(ListHooksResponse)
Hook = _reflection.GeneratedProtocolMessageType('Hook', (_message.Message,), {
'DESCRIPTOR' : _HOOK,
'__module__' : 'api_pb2'
# @@protoc_insertion_point(class_scope:eventprocapi.Hook)
})
_sym_db.RegisterMessage(Hook)
Registration = _reflection.GeneratedProtocolMessageType('Registration', (_message.Message,), {
'DESCRIPTOR' : _REGISTRATION,
'__module__' : 'api_pb2'
# @@protoc_insertion_point(class_scope:eventprocapi.Registration)
})
_sym_db.RegisterMessage(Registration)
RegisterRequest = _reflection.GeneratedProtocolMessageType('RegisterRequest', (_message.Message,), {
'DESCRIPTOR' : _REGISTERREQUEST,
'__module__' : 'api_pb2'
# @@protoc_insertion_point(class_scope:eventprocapi.RegisterRequest)
})
_sym_db.RegisterMessage(RegisterRequest)
RegisterResponse = _reflection.GeneratedProtocolMessageType('RegisterResponse', (_message.Message,), {
'DESCRIPTOR' : _REGISTERRESPONSE,
'__module__' : 'api_pb2'
# @@protoc_insertion_point(class_scope:eventprocapi.RegisterResponse)
})
_sym_db.RegisterMessage(RegisterResponse)
DeregisterRequest = _reflection.GeneratedProtocolMessageType('DeregisterRequest', (_message.Message,), {
'DESCRIPTOR' : _DEREGISTERREQUEST,
'__module__' : 'api_pb2'
# @@protoc_insertion_point(class_scope:eventprocapi.DeregisterRequest)
})
_sym_db.RegisterMessage(DeregisterRequest)
DeregisterResponse = _reflection.GeneratedProtocolMessageType('DeregisterResponse', (_message.Message,), {
'DESCRIPTOR' : _DEREGISTERRESPONSE,
'__module__' : 'api_pb2'
# @@protoc_insertion_point(class_scope:eventprocapi.DeregisterResponse)
})
_sym_db.RegisterMessage(DeregisterResponse)
_EVENTPROCESSINGSERVICE = _descriptor.ServiceDescriptor(
name='EventProcessingService',
full_name='eventprocapi.EventProcessingService',
file=DESCRIPTOR,
index=0,
serialized_options=None,
create_key=_descriptor._internal_create_key,
serialized_start=838,
serialized_end=1191,
methods=[
_descriptor.MethodDescriptor(
name='ListHooks',
full_name='eventprocapi.EventProcessingService.ListHooks',
index=0,
containing_service=None,
input_type=_LISTHOOKSREQUEST,
output_type=_LISTHOOKSRESPONSE,
serialized_options=None,
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name='ListHandlers',
full_name='eventprocapi.EventProcessingService.ListHandlers',
index=1,
containing_service=None,
input_type=_LISTHANDLERSREQUEST,
output_type=_LISTHANDLERSRESPONSE,
serialized_options=None,
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name='Register',
full_name='eventprocapi.EventProcessingService.Register',
index=2,
containing_service=None,
input_type=_REGISTERREQUEST,
output_type=_REGISTERRESPONSE,
serialized_options=None,
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name='Deregister',
full_name='eventprocapi.EventProcessingService.Deregister',
index=3,
containing_service=None,
input_type=_DEREGISTERREQUEST,
output_type=_DEREGISTERRESPONSE,
serialized_options=None,
create_key=_descriptor._internal_create_key,
),
])
_sym_db.RegisterServiceDescriptor(_EVENTPROCESSINGSERVICE)
DESCRIPTOR.services_by_name['EventProcessingService'] = _EVENTPROCESSINGSERVICE
# @@protoc_insertion_point(module_scope)
# === amorphous/tests/models.py (japrogramer/django-amorphous, BSD-3-Clause) ===
# -*- coding: utf-8 -*-
from django.db import models
from django.contrib.postgres.fields import JSONField
class TestModel(models.Model):
    # we only need one field; pass the dict callable so each instance
    # gets its own fresh default rather than one shared dict
    metadata = JSONField(default=dict)

    def get_absolute_url(self):
        return '/one/'
| 22.083333 | 52 | 0.690566 | 35 | 265 | 5.171429 | 0.828571 | 0.110497 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004673 | 0.192453 | 265 | 11 | 53 | 24.090909 | 0.841122 | 0.166038 | 0 | 0 | 0 | 0 | 0.022936 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0.333333 | 0.166667 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 2 |
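Passing `dict()` (evaluated once) instead of the callable `dict` to `JSONField(default=...)` is a classic Django pitfall: every model instance ends up sharing one mutable dict. A standalone sketch of the difference (the classes below are illustrative stand-ins, not Django models):

```python
# default=dict() evaluates once at definition time -> one shared dict;
# default=dict invokes the callable per instance -> fresh dict each time.
shared = dict()  # evaluated once, like JSONField(default=dict())

class WithSharedDefault:
    def __init__(self):
        self.metadata = shared

class WithCallableDefault:
    def __init__(self):
        self.metadata = dict()  # fresh dict per instance, like default=dict

a, b = WithSharedDefault(), WithSharedDefault()
a.metadata['k'] = 1
print(b.metadata)  # {'k': 1} -- the mutation leaks across instances

c, d = WithCallableDefault(), WithCallableDefault()
c.metadata['k'] = 1
print(d.metadata)  # {} -- each instance owns its dict
```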
b092ca076e4304c19d647c840b2587137fdedcd5 | 772 | py | Python | geoApp/migrations/0002_auto_20210831_2215.py | sebastian-konicz/geo-murals | 6c10832534242c8cdfccb21d05aebb317c6738d3 | [
"MIT"
] | null | null | null | geoApp/migrations/0002_auto_20210831_2215.py | sebastian-konicz/geo-murals | 6c10832534242c8cdfccb21d05aebb317c6738d3 | [
"MIT"
] | null | null | null | geoApp/migrations/0002_auto_20210831_2215.py | sebastian-konicz/geo-murals | 6c10832534242c8cdfccb21d05aebb317c6738d3 | [
"MIT"
] | null | null | null | # Generated by Django 3.2.6 on 2021-08-31 20:15
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('geoApp', '0001_initial'),
]
operations = [
migrations.RemoveField(
model_name='mural',
name='creation_date',
),
migrations.RemoveField(
model_name='mural',
name='file',
),
migrations.AlterField(
model_name='mural',
name='lat',
field=models.DecimalField(decimal_places=12, max_digits=15),
),
migrations.AlterField(
model_name='mural',
name='lon',
field=models.DecimalField(decimal_places=12, max_digits=15),
),
]
| 24.125 | 72 | 0.549223 | 74 | 772 | 5.594595 | 0.554054 | 0.086957 | 0.135266 | 0.173913 | 0.608696 | 0.608696 | 0.236715 | 0.236715 | 0.236715 | 0 | 0 | 0.052529 | 0.334197 | 772 | 31 | 73 | 24.903226 | 0.752918 | 0.05829 | 0 | 0.56 | 1 | 0 | 0.084138 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.04 | 0 | 0.16 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b0abcd0ec83889c30e116b9838fc3692bf0f366c | 779 | py | Python | categorical_label_to_id.py | furaoing/tensorflow-tutorial | 72e57b4dba50c579dbc24798d23dc86ceeaa9470 | [
"Apache-2.0"
] | null | null | null | categorical_label_to_id.py | furaoing/tensorflow-tutorial | 72e57b4dba50c579dbc24798d23dc86ceeaa9470 | [
"Apache-2.0"
] | null | null | null | categorical_label_to_id.py | furaoing/tensorflow-tutorial | 72e57b4dba50c579dbc24798d23dc86ceeaa9470 | [
"Apache-2.0"
] | null | null | null | import random
import pandas
import numpy as np
from sklearn import metrics, cross_validation
import tensorflow as tf
from tensorflow.contrib import layers
from tensorflow.contrib import learn
random.seed(42)
"""
data = pandas.read_csv('titanic_train.csv')
X = data[["Embarked"]]
y = data["Survived"]
X_train, X_test, y_train, y_test = cross_validation.train_test_split(X, y, test_size=0.2, random_state=42)
embarked_classes = X_train["Embarked"].unique()
n_classes = len(embarked_classes) + 1
print('Embarked has next classes: ', embarked_classes)
"""
X_train = ["s", "a", "s", "d"]
cat_processor = learn.preprocessing.CategoricalProcessor()
X_train = np.array(list(cat_processor.fit_transform(X_train)))
t = X_train[0][0]
result = cat_processor.vocabularies_[0].reverse(t)
| 27.821429 | 106 | 0.762516 | 119 | 779 | 4.773109 | 0.470588 | 0.06338 | 0.073944 | 0.09507 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.014368 | 0.106547 | 779 | 27 | 107 | 28.851852 | 0.801724 | 0 | 0 | 0 | 0 | 0 | 0.009195 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.538462 | 0 | 0.538462 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
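`learn.preprocessing.CategoricalProcessor` lived in `tensorflow.contrib`, which was removed in TensorFlow 2.x. The label-to-id mapping it performs can be sketched in plain Python (a hypothetical re-implementation, not the original API):

```python
# Map categorical labels to integer ids, reserving id 0 for unseen
# values (CategoricalProcessor likewise kept 0 for unknowns).
def fit_transform(labels):
    vocab = {}
    ids = []
    for label in labels:
        if label not in vocab:
            vocab[label] = len(vocab) + 1  # start ids at 1, reserve 0
        ids.append(vocab[label])
    return ids, vocab

ids, vocab = fit_transform(["s", "a", "s", "d"])
print(ids)    # [1, 2, 1, 3]
print(vocab)  # {'s': 1, 'a': 2, 'd': 3}
```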
b0b77f0e80ae2d3ba271189578bbd184103756e6 | 170 | py | Python | Python3/Books/Douson/chapter02/personal_greeter.py | neon1ks/Study | 5d40171cf3bf5e8d3a95539e91f5afec54d1daf3 | [
"MIT"
] | null | null | null | Python3/Books/Douson/chapter02/personal_greeter.py | neon1ks/Study | 5d40171cf3bf5e8d3a95539e91f5afec54d1daf3 | [
"MIT"
] | null | null | null | Python3/Books/Douson/chapter02/personal_greeter.py | neon1ks/Study | 5d40171cf3bf5e8d3a95539e91f5afec54d1daf3 | [
"MIT"
] | null | null | null | # Personal Greeter
# Demonstrates getting user input
name = input("Hi. What's your name? ")
print(name)
print("Hi,", name)
input("\n\nPress the enter key to exit.")
| 15.454545 | 41 | 0.682353 | 26 | 170 | 4.461538 | 0.730769 | 0.155172 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.170588 | 170 | 10 | 42 | 17 | 0.822695 | 0.282353 | 0 | 0 | 0 | 0 | 0.487395 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.5 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 2 |
b0b9ec82f9111063ad9216b2e7074bfe924d8b54 | 175 | py | Python | users/signals.py | uktrade/great-cms | f13fa335ddcb925bc33a5fa096fe73ef7bdd351a | [
"MIT"
] | 10 | 2020-04-30T12:04:35.000Z | 2021-07-21T12:48:55.000Z | users/signals.py | uktrade/great-cms | f13fa335ddcb925bc33a5fa096fe73ef7bdd351a | [
"MIT"
] | 1,461 | 2020-01-23T18:20:26.000Z | 2022-03-31T08:05:56.000Z | users/signals.py | uktrade/great-cms | f13fa335ddcb925bc33a5fa096fe73ef7bdd351a | [
"MIT"
] | 3 | 2020-04-07T20:11:36.000Z | 2020-10-16T16:22:59.000Z | def approve_new_user(sender, instance, created, *args, **kwarg):
if created:
instance.is_staff = True
instance.is_superuser = True
instance.save()
| 29.166667 | 64 | 0.651429 | 21 | 175 | 5.238095 | 0.714286 | 0.181818 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.245714 | 175 | 5 | 65 | 35 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0 | 0 | 0.2 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
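The receiver above is presumably wired to Django's `post_save` signal elsewhere in the app (e.g. `post_save.connect(approve_new_user, sender=User)`). A minimal stand-in shows its effect on a newly created instance; `FakeUser` is invented for illustration so the sketch runs without Django:

```python
# Manual call replaces Django's post_save dispatch for demonstration.
def approve_new_user(sender, instance, created, *args, **kwargs):
    if created:
        instance.is_staff = True
        instance.is_superuser = True
        instance.save()

class FakeUser:
    is_staff = False
    is_superuser = False

    def save(self):
        pass  # a real Django model would persist here

u = FakeUser()
approve_new_user(FakeUser, u, created=True)
print(u.is_staff, u.is_superuser)  # True True
```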
b0e38928137e6003ca10bbe2916ef3950a99086b | 375 | py | Python | OnlineStudy/utils/middlewares.py | NanRenTeam-9/MongoMicroCourse | 59053c88faf76de3592b5aa02b1425b126fe2f2d | [
"MIT"
] | 132 | 2019-07-11T01:17:09.000Z | 2022-03-28T02:49:21.000Z | OnlineStudy/utils/middlewares.py | liuqiao1995/onlinestudy | b8abfc7b4f2466e595be801bd9a19a509e03534e | [
"MIT"
] | 10 | 2019-07-18T06:50:45.000Z | 2022-01-29T08:31:31.000Z | OnlineStudy/utils/middlewares.py | liuqiao1995/onlinestudy | b8abfc7b4f2466e595be801bd9a19a509e03534e | [
"MIT"
] | 49 | 2019-07-11T00:31:26.000Z | 2022-03-05T19:25:35.000Z | from django.utils.deprecation import MiddlewareMixin
class MyCors(MiddlewareMixin):
    def process_response(self, request, response):
        response['Access-Control-Allow-Origin'] = '*'
        if request.method == 'OPTIONS':
            response['Access-Control-Allow-Headers'] = '*'
            response['Access-Control-Allow-Methods'] = '*'
return response
| 34.090909 | 58 | 0.658667 | 36 | 375 | 6.833333 | 0.638889 | 0.170732 | 0.256098 | 0.317073 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.210667 | 375 | 10 | 59 | 37.5 | 0.831081 | 0 | 0 | 0 | 0 | 0 | 0.248 | 0.221333 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0 | 0.125 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
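The middleware's `process_response` logic can be exercised without a Django stack by substituting plain objects — the `FakeRequest` class and dict response below are test doubles, not Django APIs:

```python
# Every response gets the Origin header; OPTIONS preflights also get
# the Headers and Methods CORS headers.
class FakeRequest:
    def __init__(self, method):
        self.method = method

class MyCors:
    def process_response(self, request, response):
        response['Access-Control-Allow-Origin'] = '*'
        if request.method == 'OPTIONS':
            response['Access-Control-Allow-Headers'] = '*'
            response['Access-Control-Allow-Methods'] = '*'
        return response

get_resp = MyCors().process_response(FakeRequest('GET'), {})
preflight = MyCors().process_response(FakeRequest('OPTIONS'), {})
print(len(get_resp), len(preflight))  # 1 3
```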
b0e6d739b70566ae9140e6b5ff2a5a228566dd78 | 1,809 | py | Python | tests/fixtures/sound.py | bcurnow/rfid-security-svc | d3806cb74d3d0cc2623ea425230dc8781ba4d8b4 | [
"Apache-2.0"
] | null | null | null | tests/fixtures/sound.py | bcurnow/rfid-security-svc | d3806cb74d3d0cc2623ea425230dc8781ba4d8b4 | [
"Apache-2.0"
] | null | null | null | tests/fixtures/sound.py | bcurnow/rfid-security-svc | d3806cb74d3d0cc2623ea425230dc8781ba4d8b4 | [
"Apache-2.0"
] | null | null | null | import os
from io import BytesIO
import pytest
from werkzeug.datastructures import FileStorage
from rfidsecuritysvc.model.sound import Sound
@pytest.fixture(scope='session')
def wav_content():
test_wav = os.path.join(os.path.dirname(__file__), 'test.wav')
with open(test_wav, 'rb') as f:
return f.read()
@pytest.fixture(scope='session')
def sounds(wav_content):
return [
Sound(1, 'test1.wav', '2021-09-25 23:13:25', wav_content),
Sound(2, 'test2.wav', '2021-09-25 23:13:25', wav_content),
]
@pytest.fixture(scope='session')
def creatable_sound(sounds, wav_content):
return Sound(len(sounds) + 1, 'creatable.wav', '2021-09-25 23:13:25', wav_content)
@pytest.fixture(scope='session')
def default_sound(sounds):
return sounds[0]
@pytest.fixture(autouse=True, scope='session')
def add_sound_helpers(monkeypatch_session):
def convert(self):
        # Can't use either of the existing to_json methods as one doesn't contain
# content and the other base64 encodes it
copy = self.__dict__.copy()
del copy['id']
del copy['last_update_timestamp']
return copy
def test_to_row(self):
copy = self.__dict__.copy()
return copy
def test_to_multipart(self, content_type='audio/wav'):
fs = FileStorage(BytesIO(self.content), 'local file name.wav', self.name, content_type, len(self.content))
return {'name': self.name, 'content': fs}
monkeypatch_session.setattr(Sound, 'test_create', convert, raising=False)
monkeypatch_session.setattr(Sound, 'test_update', convert, raising=False)
monkeypatch_session.setattr(Sound, 'test_to_row', test_to_row, raising=False)
monkeypatch_session.setattr(Sound, 'test_to_multipart', test_to_multipart, raising=False)
| 31.736842 | 114 | 0.694306 | 251 | 1,809 | 4.812749 | 0.346614 | 0.049669 | 0.062086 | 0.082781 | 0.393212 | 0.24255 | 0.24255 | 0.24255 | 0.113411 | 0.09106 | 0 | 0.033693 | 0.179657 | 1,809 | 56 | 115 | 32.303571 | 0.780323 | 0.060807 | 0 | 0.205128 | 0 | 0 | 0.144458 | 0.012382 | 0 | 0 | 0 | 0 | 0 | 1 | 0.205128 | false | 0 | 0.128205 | 0.076923 | 0.512821 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
b0ebc6e1817d377a8cf42408164f57be188861e2 | 484 | py | Python | window-functions/part-3/00-the-data.py | yorek/databricks-samples | 2e6dab922337ed67d19ab36d909273b6c597fa40 | [
"MIT"
] | null | null | null | window-functions/part-3/00-the-data.py | yorek/databricks-samples | 2e6dab922337ed67d19ab36d909273b6c597fa40 | [
"MIT"
] | null | null | null | window-functions/part-3/00-the-data.py | yorek/databricks-samples | 2e6dab922337ed67d19ab36d909273b6c597fa40 | [
"MIT"
] | null | null | null | # Databricks notebook source
# MAGIC %md
# MAGIC # Project Timesheet Source Data
# COMMAND ----------
# MAGIC %md
# MAGIC *Access to csv data source has been configured at the cluster level so that you don't need to set the Azure Blob auth info in every workbook*
# COMMAND ----------
# MAGIC %sql
# MAGIC describe table samples.project_timesheet
# COMMAND ----------
# MAGIC %sql
# MAGIC select
# MAGIC *
# MAGIC from
# MAGIC samples.project_timesheet
# COMMAND ----------
| 19.36 | 149 | 0.673554 | 64 | 484 | 5.0625 | 0.609375 | 0.148148 | 0.074074 | 0.123457 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.188017 | 484 | 24 | 150 | 20.166667 | 0.824427 | 0.913223 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b0f589e6abbbbd14184659797a72a1912600958d | 3,222 | py | Python | migrations/versions/d472e8cc6100_new_subscription_table.py | ourresearch/journalsdb | 169feb9be684eac59f3294dccdb319eb10fe1958 | [
"MIT"
] | 8 | 2021-02-01T21:00:20.000Z | 2022-01-25T09:51:24.000Z | migrations/versions/d472e8cc6100_new_subscription_table.py | ourresearch/journalsdb | 169feb9be684eac59f3294dccdb319eb10fe1958 | [
"MIT"
] | 43 | 2021-04-28T00:20:53.000Z | 2022-03-09T00:39:56.000Z | migrations/versions/d472e8cc6100_new_subscription_table.py | ourresearch/journalsdb | 169feb9be684eac59f3294dccdb319eb10fe1958 | [
"MIT"
] | null | null | null | """new subscription table
Revision ID: d472e8cc6100
Revises: e487c3f2bdd0
Create Date: 2021-08-28 10:34:02.566215
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = "d472e8cc6100"
down_revision = "e487c3f2bdd0"
branch_labels = None
depends_on = None
def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.create_table(
"subscription_price_new",
sa.Column("id", sa.Integer(), nullable=False),
sa.Column("journal_id", sa.Integer(), nullable=False),
sa.Column("price", sa.Numeric(precision=10, scale=2), nullable=False),
sa.Column("currency_id", sa.Integer(), nullable=False),
sa.Column("country_id", sa.Integer(), nullable=True),
sa.Column("region_id", sa.Integer(), nullable=True),
sa.Column("fte_from", sa.Integer(), nullable=True),
sa.Column("fte_to", sa.Integer(), nullable=True),
sa.Column("year", sa.Integer(), nullable=False),
sa.Column(
"created_at", sa.DateTime(), server_default=sa.text("now()"), nullable=False
),
sa.Column("updated_at", sa.DateTime(), nullable=True),
sa.ForeignKeyConstraint(
["country_id"],
["countries.id"],
),
sa.ForeignKeyConstraint(
["currency_id"],
["currency.id"],
),
sa.ForeignKeyConstraint(
["journal_id"],
["journals.id"],
),
sa.ForeignKeyConstraint(
["region_id"],
["regions.id"],
),
sa.PrimaryKeyConstraint("id"),
)
op.create_index(
op.f("ix_subscription_price_new_country_id"),
"subscription_price_new",
["country_id"],
unique=False,
)
op.create_index(
op.f("ix_subscription_price_new_currency_id"),
"subscription_price_new",
["currency_id"],
unique=False,
)
op.create_index(
op.f("ix_subscription_price_new_journal_id"),
"subscription_price_new",
["journal_id"],
unique=False,
)
op.create_index(
op.f("ix_subscription_price_new_region_id"),
"subscription_price_new",
["region_id"],
unique=False,
)
op.create_index(
op.f("ix_subscription_price_new_year"),
"subscription_price_new",
["year"],
unique=False,
)
# ### end Alembic commands ###
def downgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.drop_index(
op.f("ix_subscription_price_new_year"), table_name="subscription_price_new"
)
op.drop_index(
op.f("ix_subscription_price_new_region_id"), table_name="subscription_price_new"
)
op.drop_index(
op.f("ix_subscription_price_new_journal_id"),
table_name="subscription_price_new",
)
op.drop_index(
op.f("ix_subscription_price_new_currency_id"),
table_name="subscription_price_new",
)
op.drop_index(
op.f("ix_subscription_price_new_country_id"),
table_name="subscription_price_new",
)
op.drop_table("subscription_price_new")
# ### end Alembic commands ###
| 29.559633 | 88 | 0.611732 | 365 | 3,222 | 5.10137 | 0.213699 | 0.200859 | 0.236305 | 0.053706 | 0.590226 | 0.532223 | 0.500537 | 0.396885 | 0.327605 | 0.252417 | 0 | 0.021136 | 0.251086 | 3,222 | 108 | 89 | 29.833333 | 0.750518 | 0.094351 | 0 | 0.522222 | 0 | 0 | 0.297119 | 0.212426 | 0 | 0 | 0 | 0 | 0 | 1 | 0.022222 | false | 0 | 0.022222 | 0 | 0.044444 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
9fd797b781e8cbd7f961e3b67c0ecc431ed6f353 | 1,315 | py | Python | bitbucket/__init__.py | c9s/bitbucket-client | 045b1b5b1b186bfeb5aff392a16c49a4c8b0c072 | [
"MIT"
] | 3 | 2017-08-16T00:22:16.000Z | 2017-08-22T09:22:04.000Z | bitbucket/__init__.py | c9s/py-bitbucket | 045b1b5b1b186bfeb5aff392a16c49a4c8b0c072 | [
"MIT"
] | null | null | null | bitbucket/__init__.py | c9s/py-bitbucket | 045b1b5b1b186bfeb5aff392a16c49a4c8b0c072 | [
"MIT"
] | null | null | null | import oauthlib
from requests_oauthlib import OAuth1Session, OAuth2Session
from oauthlib.oauth2 import BackendApplicationClient
import click
import pickle
import os
import sys
import yaml
import json
authorization_base_url = 'https://bitbucket.org/site/oauth2/authorize'
token_url = 'https://bitbucket.org/site/oauth2/access_token'
class BitBucketClient:
def __init__(self, client_id, client_secret, token=None, scope=None, client=None):
self.client = client if client else BackendApplicationClient(client_id=client_id)
self.oauth = OAuth2Session(client=self.client, token=token, scope=scope)
self.token = self.oauth.fetch_token(token_url=token_url, client_id=client_id, client_secret=client_secret)
def hooks(self, user, repo):
resp = self.oauth.get('https://api.bitbucket.org/2.0/repositories/%s/%s/hooks' % (user, repo))
data = json.loads(resp.content)
return data['values']
def create_hook(self, user : str, repo : str, url, description, events):
resp = self.oauth.post('https://api.bitbucket.org/2.0/repositories/%s/%s/hooks' % (user, repo), data = json.dumps({
"url": url,
"description": description,
"active": True,
"events": events
}))
return json.loads(resp.content)
| 36.527778 | 123 | 0.697338 | 169 | 1,315 | 5.301775 | 0.366864 | 0.044643 | 0.0625 | 0.044643 | 0.194196 | 0.194196 | 0.127232 | 0.127232 | 0.127232 | 0.127232 | 0 | 0.009302 | 0.18251 | 1,315 | 35 | 124 | 37.571429 | 0.824186 | 0 | 0 | 0 | 0 | 0 | 0.174144 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.107143 | false | 0 | 0.321429 | 0 | 0.535714 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
9fdd193b6b1abff10be20cce917548b8be932827 | 507 | py | Python | test.py | markmelnic/Car-Indexes-Scraper | 93c0ec053651aa47dd7852f8450685eb959c2be9 | [
"MIT"
] | 2 | 2020-12-20T21:08:30.000Z | 2021-09-05T10:53:25.000Z | test.py | markmelnic/Car-Indexes-Scraper | 93c0ec053651aa47dd7852f8450685eb959c2be9 | [
"MIT"
] | null | null | null | test.py | markmelnic/Car-Indexes-Scraper | 93c0ec053651aa47dd7852f8450685eb959c2be9 | [
"MIT"
] | null | null | null |
import os
import json
import subprocess
import mobile_de
import autoscout24_ch
import anibis_ch
if __name__ == '__main__':
try:
#autoscout24_ch.scrape_makes()
#autoscout24_ch.scrape_models()
#mobile_de.scrape_makes()
#mobile_de.scrape_models()
anibis_ch.scrape_makes()
# check "models": []
except KeyboardInterrupt:
exit(0)
except json.decoder.JSONDecodeError:
os.remove('makes.json')
        subprocess.run(['python', 'test.py'])  # argv as a list works without a shell on all platforms
| 20.28 | 40 | 0.66075 | 58 | 507 | 5.413793 | 0.482759 | 0.076433 | 0.121019 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.018229 | 0.242604 | 507 | 24 | 41 | 21.125 | 0.799479 | 0.250493 | 0 | 0 | 0 | 0 | 0.085562 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.428571 | 0 | 0.428571 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
9fe3528bfc47fb560111d8f4da915471ed646453 | 1,586 | py | Python | shopify/products/objects/image.py | alikhan126/python-shopify-api | 656cdf1af99485b25be545e2ed527bcb653076fd | [
"Unlicense"
] | 10 | 2016-12-29T06:53:21.000Z | 2022-03-01T10:35:32.000Z | shopify/products/objects/image.py | alikhan126/python-shopify-api | 656cdf1af99485b25be545e2ed527bcb653076fd | [
"Unlicense"
] | 4 | 2016-12-30T15:12:47.000Z | 2021-07-24T07:14:20.000Z | shopify/products/objects/image.py | alikhan126/python-shopify-api | 656cdf1af99485b25be545e2ed527bcb653076fd | [
"Unlicense"
] | 8 | 2016-12-29T19:13:39.000Z | 2022-03-22T18:02:58.000Z | import base64
from ...base import BaseParser, datetime_to_string, string_to_datetime
class Image(BaseParser):
def __init__(self, d=None, **kwargs):
BaseParser.__init__(self, d, **kwargs)
self.image_location = None
@property
def product_id(self):
return self._dict.get('product_id')
# No product_id setter since that value shouldn't be modified.
@property
def position(self):
return self._dict.get('position')
@position.setter
def position(self, val):
self._dict['position'] = int(val)
@property
def created_at(self):
return string_to_datetime(self._dict.get('created_at'))
@created_at.setter
def created_at(self, val):
self._dict['created_at'] = datetime_to_string(val)
@property
def updated_at(self):
return string_to_datetime(self._dict.get('updated_at'))
@updated_at.setter
def updated_at(self, val):
self._dict['updated_at'] = datetime_to_string(val)
@property
def src(self):
return self._dict.get('src')
@src.setter
def src(self, val):
self._dict['src'] = val
def attach(self, f):
"""
Attach an image file instead of using a url.
:param f: Path to image file.
:return:
"""
        with open(f, 'rb') as fp:  # separate name for the handle so the path argument survives
            encoded = base64.b64encode(fp.read())
        self._dict['attachment'] = encoded
        self.image_location = f  # f is still the path, not a closed file handle
def __repr__(self):
return '<Image id={} src={} attachment={}>'.format(self.id, self.src, self.image_location)
| 25.174603 | 98 | 0.62169 | 206 | 1,586 | 4.543689 | 0.291262 | 0.08547 | 0.058761 | 0.064103 | 0.255342 | 0.151709 | 0.151709 | 0.083333 | 0.083333 | 0 | 0 | 0.005072 | 0.254098 | 1,586 | 62 | 99 | 25.580645 | 0.786137 | 0.091425 | 0 | 0.125 | 0 | 0 | 0.084226 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.3 | false | 0 | 0.05 | 0.15 | 0.525 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
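The `attach()` method base64-encodes a file while remembering its path, which is why the file handle needs a name distinct from the path argument. A self-contained sketch (a temp file stands in for a real image):

```python
import base64
import os
import tempfile

# Bind the handle to fp; rebinding it to f would leave f pointing at a
# closed file object instead of the original path.
def attach(f):
    with open(f, 'rb') as fp:
        encoded = base64.b64encode(fp.read())
    return encoded, f

tmp = tempfile.NamedTemporaryFile(delete=False)
tmp.write(b'fake image bytes')
tmp.close()
encoded, location = attach(tmp.name)
os.unlink(tmp.name)
print(base64.b64decode(encoded))  # b'fake image bytes'
print(location == tmp.name)       # True
```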
9feaf536c91421d1ae4813d4aad435b998574253 | 1,043 | py | Python | cisco-ios-xr/ydk/models/cisco_ios_xr/Cisco_IOS_XR_ipv6_nd_subscriber_cfg.py | Maikor/ydk-py | b86c4a7c570ae3b2c5557d098420446df5de4929 | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | cisco-ios-xr/ydk/models/cisco_ios_xr/Cisco_IOS_XR_ipv6_nd_subscriber_cfg.py | Maikor/ydk-py | b86c4a7c570ae3b2c5557d098420446df5de4929 | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | cisco-ios-xr/ydk/models/cisco_ios_xr/Cisco_IOS_XR_ipv6_nd_subscriber_cfg.py | Maikor/ydk-py | b86c4a7c570ae3b2c5557d098420446df5de4929 | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | """ Cisco_IOS_XR_ipv6_nd_subscriber_cfg
This module contains a collection of YANG definitions
for Cisco IOS\-XR ipv6\-nd\-subscriber package configuration.
This YANG module augments the
Cisco\-IOS\-XR\-subscriber\-infra\-tmplmgr\-cfg
module with configuration data.
Copyright (c) 2013\-2018 by Cisco Systems, Inc.
All rights reserved.
"""
from collections import OrderedDict
from ydk.types import Entity, EntityPath, Identity, Enum, YType, YLeaf, YLeafList, YList, LeafDataList, Bits, Empty, Decimal64
from ydk.filters import YFilter
from ydk.errors import YError, YModelError
from ydk.errors.error_handler import handle_type_error as _handle_type_error
class Ipv6NdRouterPrefTemplate(Enum):
"""
Ipv6NdRouterPrefTemplate (Enum Class)
Ipv6 nd router pref template
.. data:: high = 1
High preference
.. data:: medium = 2
Medium preference
.. data:: low = 3
Low preference
"""
high = Enum.YLeaf(1, "high")
medium = Enum.YLeaf(2, "medium")
low = Enum.YLeaf(3, "low")
| 20.45098 | 126 | 0.71908 | 136 | 1,043 | 5.426471 | 0.544118 | 0.03794 | 0.04065 | 0.03794 | 0.070461 | 0.070461 | 0 | 0 | 0 | 0 | 0 | 0.024793 | 0.187919 | 1,043 | 50 | 127 | 20.86 | 0.846517 | 0.507191 | 0 | 0 | 0 | 0 | 0.027837 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.555556 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
9ff7b173719cc24d48e55a44cee39a790259886b | 259 | py | Python | apps/job/admin.py | matheuslins/cuzjobs | 0f46402d534fefaef394ccd09b454fe361bb36f2 | [
"MIT"
] | 1 | 2018-07-10T20:30:52.000Z | 2018-07-10T20:30:52.000Z | apps/job/admin.py | matheuslins/cuscuzjobs | 0f46402d534fefaef394ccd09b454fe361bb36f2 | [
"MIT"
] | 10 | 2019-04-25T00:01:29.000Z | 2021-04-08T18:52:52.000Z | apps/job/admin.py | matheuslins/cuzjobs | 0f46402d534fefaef394ccd09b454fe361bb36f2 | [
"MIT"
] | null | null | null | from django.contrib import admin
from .models import Job
from .models import JobCandidate
class JobAdmin(admin.ModelAdmin):
ordering = ['date_created']
search_fields = ['title']
admin.site.register(Job, JobAdmin)
admin.site.register(JobCandidate)
| 19.923077 | 34 | 0.764479 | 32 | 259 | 6.125 | 0.59375 | 0.102041 | 0.163265 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.135135 | 259 | 12 | 35 | 21.583333 | 0.875 | 0 | 0 | 0 | 0 | 0 | 0.065637 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.375 | 0 | 0.75 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
b002d18724bd4efe53bdcb041c9e1477d9234919 | 5,476 | py | Python | RNAPuzzles/rnapuzzles/models/puzzles.py | whinyadventure/RNA-Puzzles | bbd147e1a0748a77b5e3424a93ad57bb430b5a0e | [
"Apache-2.0"
] | null | null | null | RNAPuzzles/rnapuzzles/models/puzzles.py | whinyadventure/RNA-Puzzles | bbd147e1a0748a77b5e3424a93ad57bb430b5a0e | [
"Apache-2.0"
] | 26 | 2019-10-08T11:11:25.000Z | 2022-03-12T00:52:30.000Z | RNAPuzzles/rnapuzzles/models/puzzles.py | whinyadventure/RNA-Puzzles | bbd147e1a0748a77b5e3424a93ad57bb430b5a0e | [
"Apache-2.0"
] | 1 | 2020-05-11T18:51:04.000Z | 2020-05-11T18:51:04.000Z | from django.core.validators import FileExtensionValidator
from django.db import models
from django.utils import timezone
from .user import CustomUser
import datetime
from six import text_type
import os
def puzzle_info_img_filename(instance, filename):
return 'puzzle_{}_img{}'.format(instance.pk, os.path.splitext(filename)[1])
class PuzzleInfo(models.Model):
description = models.CharField(verbose_name="Description", max_length=250)
sequence = models.TextField(verbose_name="RNA sequence (5' to 3')")
publish_date = models.DateTimeField(verbose_name='Target 3D structure publication date', blank=True, null=True)
reference = models.TextField(verbose_name="Reference", blank=True)
reference_url = models.URLField(verbose_name="Reference URL", blank=True)
pdb_id = models.CharField(verbose_name="PDB ID", max_length=4, blank=True)
pdb_url = models.URLField(verbose_name="PDB URL", blank=True)
pdb_file = models.FileField(verbose_name="Target 3D structure file",
validators=[FileExtensionValidator(allowed_extensions=['pdb'])])
img = models.ImageField(verbose_name="Target 3D structure graphic representation",
upload_to=puzzle_info_img_filename, blank=True,
validators=[FileExtensionValidator(allowed_extensions=['jpg', 'png'])])
author = models.ForeignKey(CustomUser, on_delete=models.SET_NULL, null=True, blank=True, editable=False)
metrics = models.ManyToManyField("rnapuzzles.Metric")
class Meta:
verbose_name = 'Puzzle Information'
def __get_label(self, field):
return text_type(self._meta.get_field(field).verbose_name)
@property
def description_label(self):
return self.__get_label('description')
@property
def sequence_label(self):
return self.__get_label('sequence')
@property
def publish_date_label(self):
return self.__get_label('publish_date')
@property
def reference_label(self):
return self.__get_label('reference')
@property
def reference_url_label(self):
return self.__get_label('reference_url')
@property
def pdb_id_label(self):
return self.__get_label('pdb_id')
@property
def pdb_url_label(self):
return self.__get_label('pdb_url')
@property
def pdb_file_label(self):
return self.__get_label('pdb_file')
@property
def img_label(self):
return self.__get_label('img')
def __str__(self):
return 'Puzzle %s' % (str(self.id))
class Challenge(models.Model):
CREATED = 0
OPEN = 1
EVALUATED = 2
COMPLETED = 3
round = models.IntegerField(default=1, editable=False)
created_at = models.DateTimeField(auto_now_add=True)
start_date = models.DateTimeField(verbose_name='Opening date')
end_date = models.DateTimeField(verbose_name='Closing date for human category')
end_automatic = models.DateTimeField(verbose_name='Closing date for server category')
result_published = models.BooleanField(default=False)
notification_email_send = models.BooleanField(default=False)
author = models.ForeignKey(CustomUser, on_delete=models.SET_NULL, null=True, blank=True, editable=False)
puzzle_info = models.ForeignKey(PuzzleInfo, on_delete=models.CASCADE, blank=True, null=True)
alignment = models.CharField(max_length=20, blank=True)
class Meta:
ordering = ['-puzzle_info', '-created_at']
permissions = [
("metrics_challenge", "Run computation of metrics"),
]
def __str__(self):
if self.round == 1:
return 'Puzzle %s' % self.puzzle_info_id
else:
return 'Puzzle %s-%s' % (self.puzzle_info_id, str(self.round))
def __get_label(self, field):
return text_type(self._meta.get_field(field).verbose_name)
def save(self, *args, **kwargs):
self.start_date = self.start_date.replace(second=0)
self.end_date = self.end_date.replace(second=0)
self.end_automatic = self.end_automatic.replace(second=0)
super(Challenge, self).save(*args, **kwargs)
@property
def current_status(self):
if timezone.now() < self.start_date:
return self.CREATED
if timezone.now() < self.end_date:
return self.OPEN
if not self.result_published:
return self.EVALUATED
return self.COMPLETED
@property
def current_status_label(self):
if timezone.now() < self.start_date:
return 'Created'
if timezone.now() < self.end_date:
return 'Open'
if not self.result_published:
return 'Evaluated'
return 'Completed'
@property
def start_date_label(self):
return self.__get_label('start_date')
@property
def end_date_label(self):
return self.__get_label('end_date')
@property
def end_automatic_label(self):
return self.__get_label('end_automatic')
class ChallengeFile(models.Model):
note = models.CharField(max_length=50, help_text='Information about file content. Maximum 50 characters.', blank=True)
file = models.FileField(blank=True)
challenge = models.ForeignKey(Challenge, on_delete=models.CASCADE, editable=False)
def __str__(self):
return 'challenge: %s file.id: %s' % (self.challenge, str(self.id))
| 32.595238 | 122 | 0.682615 | 673 | 5,476 | 5.309064 | 0.219911 | 0.04478 | 0.050378 | 0.063812 | 0.354884 | 0.282116 | 0.25021 | 0.124265 | 0.083403 | 0.083403 | 0 | 0.00579 | 0.211468 | 5,476 | 167 | 123 | 32.790419 | 0.821677 | 0 | 0 | 0.266129 | 0 | 0 | 0.112673 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.169355 | false | 0 | 0.056452 | 0.137097 | 0.709677 | 0.016129 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 2 |
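`Challenge.current_status` relies on the order of its checks: each branch is only reached when the earlier ones fail. A plain-`datetime` restatement of the same ladder (a simplified sketch, not the model code):

```python
from datetime import datetime, timedelta

# Same check order as the model: Created -> Open -> Evaluated -> Completed.
def current_status(now, start_date, end_date, result_published):
    if now < start_date:
        return 'Created'
    if now < end_date:
        return 'Open'
    if not result_published:
        return 'Evaluated'
    return 'Completed'

now = datetime(2021, 6, 1)
day = timedelta(days=1)
print(current_status(now, now + day, now + 2 * day, False))  # Created
print(current_status(now, now - day, now + day, False))      # Open
print(current_status(now, now - 2 * day, now - day, False))  # Evaluated
print(current_status(now, now - 2 * day, now - day, True))   # Completed
```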
b01eb43ca5e707902868dad66504b225396a1969 | 263 | py | Python | tests/test_config.py | Hellowlol/bw_plex | 86768d6ee89ee1c08d2f6e6468976e4c51135915 | [
"MIT"
] | 371 | 2018-01-01T17:07:06.000Z | 2022-03-30T01:52:28.000Z | tests/test_config.py | Hellowlol/bw_plex | 86768d6ee89ee1c08d2f6e6468976e4c51135915 | [
"MIT"
] | 144 | 2018-01-01T22:36:36.000Z | 2022-01-31T19:26:55.000Z | tests/test_config.py | Hellowlol/bw_plex | 86768d6ee89ee1c08d2f6e6468976e4c51135915 | [
"MIT"
] | 44 | 2018-01-02T07:58:45.000Z | 2021-11-30T10:53:52.000Z | import os
from conftest import TEST_DATA
from bw_plex.config import read_or_make
def test_config():
conf = read_or_make(os.path.join(TEST_DATA, 'test_config.ini'))
assert 'level' not in conf['general']
assert conf['general']['loglevel'] == 'info'
| 21.916667 | 67 | 0.718631 | 41 | 263 | 4.390244 | 0.585366 | 0.088889 | 0.111111 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.155894 | 263 | 11 | 68 | 23.909091 | 0.810811 | 0 | 0 | 0 | 0 | 0 | 0.174905 | 0 | 0 | 0 | 0 | 0 | 0.285714 | 1 | 0.142857 | false | 0 | 0.428571 | 0 | 0.571429 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
b0224431dbb2718740529e7328f4dbed5181d3cc | 2,113 | py | Python | examples/AgentTester.py | henrydeng/MonkeyHelper | 042f9794050c80773b04e09128c62c2796620aec | [
"Apache-2.0"
] | 5 | 2016-06-14T02:28:35.000Z | 2021-06-20T10:24:53.000Z | examples/AgentTester.py | henrydeng/MonkeyHelper | 042f9794050c80773b04e09128c62c2796620aec | [
"Apache-2.0"
] | null | null | null | examples/AgentTester.py | henrydeng/MonkeyHelper | 042f9794050c80773b04e09128c62c2796620aec | [
"Apache-2.0"
] | 2 | 2016-06-29T03:18:22.000Z | 2016-09-18T16:31:48.000Z | #
# Copyright 2014 Mingyuan Xia (http://mxia.me) and others
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
# Contributor(s):
# Mingyuan Xia
# Xinye Lin
#
""" This script tests all agents, hosted by monkeyrunner
"""
import os, sys, inspect
def module_path():
''' returns the module path without the use of __file__.
from http://stackoverflow.com/questions/729583/getting-file-path-of-imported-module'''
return os.path.abspath(os.path.dirname(inspect.getsourcefile(module_path)))
sys.path.append(module_path())
sys.path.append(os.path.join(module_path(), '..', 'src'))
from Agents import CellularAgent, ScreenAgent, SystemStatusAgent, WifiAgent
from MonkeyHelper import EMonkeyDevice
from time import sleep
device = EMonkeyDevice()
test = CellularAgent(device)
print 'current cellular data status', test.getCellularDataStatus()
print 'toggle cellular data status', test.toggleCellularDataStatus()
sleep(5)
print 'turn off cellular data status', test.turnOffCellularData()
sleep(5)
print 'turn on cellular data status', test.turnOnCellularData()
test = ScreenAgent(device)
print 'current screen rotation status', test.getScreenRotationStatus()
print 'current orientation', test.getOrientation()
test = SystemStatusAgent(device)
print 'current WIFI status', test.getWifiStatus()
print 'current battery level', test.getBatteryLevel()
test = WifiAgent(device)
print 'current WIFI status', test.getWiFiStatus()
print 'toggle WIFI status', test.changeWifiStatus()
sleep(5)
print 'turn off WIFI status', test.turnOffWifi()
sleep(5)
print 'turn on WIFI status', test.turnOnWifi()
| 33.015625 | 90 | 0.767629 | 281 | 2,113 | 5.743772 | 0.476868 | 0.061958 | 0.066915 | 0.081784 | 0.188352 | 0.159851 | 0.159851 | 0.159851 | 0.097893 | 0 | 0 | 0.009793 | 0.130147 | 2,113 | 63 | 91 | 33.539683 | 0.868335 | 0.292002 | 0 | 0.137931 | 0 | 0 | 0.236407 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.137931 | null | null | 0.413793 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 2 |
b025405ac550e6f5e067572549042346dca41a8c | 1,619 | py | Python | apps/utils/shopify_upload_product_utility/import_partspal.py | crisariasgg/RepinSolution | 27e9b04ccc887b4300d77dda8657e761f9523123 | [
"MIT"
] | null | null | null | apps/utils/shopify_upload_product_utility/import_partspal.py | crisariasgg/RepinSolution | 27e9b04ccc887b4300d77dda8657e761f9523123 | [
"MIT"
] | null | null | null | apps/utils/shopify_upload_product_utility/import_partspal.py | crisariasgg/RepinSolution | 27e9b04ccc887b4300d77dda8657e761f9523123 | [
"MIT"
] | 1 | 2021-12-09T21:27:35.000Z | 2021-12-09T21:27:35.000Z | # Local imports
import datetime
# Third party
import pandas as pd
import pymysql
from sqlalchemy import create_engine
class PartsPalImport():
def __init__(self,name,user,host,port,password):
self.user = user
self.password = password
self.host = host
self.port = port
self.name = name
self.connection_results = self.connection_to_db(self.name,self.user,self.host,self.port,self.password)
def connection_to_db(self,name,user,host,port,password):
connection_string = 'mysql+pymysql://{}:{}@{}:{}/{}?charset=utf8mb4'.format(user, password, host, port, name)
try:
print('establishing connection...')
connection = create_engine(connection_string)
connection.connect()
except Exception:
        raise Exception("Error, can't establish connection...")
else:
            print('No exception occurred')
return connection
def read_input_file(self,filename):
data=pd.read_csv(filename)
return data
def insert_date_to_column(self,data,day):
data['date'] = datetime.date.today()+datetime.timedelta(days=day)
return data
def import_to_sql(self,data,connection):
data.to_sql(name='import_excel_partsauthority', con=connection, if_exists = 'append', index=False)
# def main():
# # Database credentials
# NAME='excel_comparison'
# USER = 'root'
# PASSWORD = 'Minimalista1'
# HOST = 'localhost'
# PORT = 3306
# IMPORTING CSV FILE
# data = read_input_file('prueba.csv')
# add_future_date(data,1)
# # IMPORTING DATA TO SQL
# connection=connection_to_db(USER,PASSWORD,HOST,PORT,NAME)
# import_to_sql(data,connection)
# if __name__ == "__main__":
# main() | 19.987654 | 111 | 0.714639 | 211 | 1,619 | 5.28436 | 0.388626 | 0.0287 | 0.037668 | 0.0287 | 0.125561 | 0.050224 | 0 | 0 | 0 | 0 | 0 | 0.005857 | 0.156269 | 1,619 | 81 | 112 | 19.987654 | 0.810395 | 0.239654 | 0 | 0.064516 | 0 | 0 | 0.139439 | 0.060231 | 0 | 0 | 0 | 0 | 0 | 1 | 0.16129 | false | 0.16129 | 0.225806 | 0 | 0.516129 | 0.064516 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 2 |
b02f273813190341358cf922ed88152689db9e54 | 3,153 | py | Python | configs/aitod/v001.14.09_aitod_faster_rcnn_r50_giou_nms_only_train.py | jwwangchn/mmdet-aitod | a4f1cad1f6ba079037a58edda885981654885b9e | [
"Apache-2.0"
] | 2 | 2021-04-23T12:15:43.000Z | 2021-06-09T05:28:10.000Z | configs/aitod/v001.14.09_aitod_faster_rcnn_r50_giou_nms_only_train.py | jwwangchn/mmdet-aitod | a4f1cad1f6ba079037a58edda885981654885b9e | [
"Apache-2.0"
] | null | null | null | configs/aitod/v001.14.09_aitod_faster_rcnn_r50_giou_nms_only_train.py | jwwangchn/mmdet-aitod | a4f1cad1f6ba079037a58edda885981654885b9e | [
"Apache-2.0"
] | 1 | 2021-12-18T02:11:30.000Z | 2021-12-18T02:11:30.000Z | """
Faster R-CNN with Wasserstein NMS (only train)
Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=1500 ] = 0.115
Average Precision (AP) @[ IoU=0.25 | area= all | maxDets=1500 ] = -1.000
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1500 ] = 0.265
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1500 ] = 0.083
Average Precision (AP) @[ IoU=0.50:0.95 | area=verytiny | maxDets=1500 ] = 0.000
Average Precision (AP) @[ IoU=0.50:0.95 | area= tiny | maxDets=1500 ] = 0.076
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1500 ] = 0.238
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1500 ] = 0.345
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.173
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.178
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1500 ] = 0.178
Average Recall (AR) @[ IoU=0.50:0.95 | area=verytiny | maxDets=1500 ] = 0.000
Average Recall (AR) @[ IoU=0.50:0.95 | area= tiny | maxDets=1500 ] = 0.111
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1500 ] = 0.381
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1500 ] = 0.459
Optimal LRP @[ IoU=0.50 | area= all | maxDets=1500 ] = 0.893
Optimal LRP Loc @[ IoU=0.50 | area= all | maxDets=1500 ] = 0.304
Optimal LRP FP @[ IoU=0.50 | area= all | maxDets=1500 ] = 0.465
Optimal LRP FN @[ IoU=0.50 | area= all | maxDets=1500 ] = 0.722
# Class-specific LRP-Optimal Thresholds #
[0.723 0.678 0.538 0.533 0.804 0.427 0.47 0.056]
+----------+-------+---------------+-------+--------------+-------+
| category | AP | category | AP | category | AP |
+----------+-------+---------------+-------+--------------+-------+
| airplane | 0.224 | bridge | 0.029 | storage-tank | 0.204 |
| ship | 0.201 | swimming-pool | 0.086 | vehicle | 0.129 |
| person | 0.043 | wind-mill | 0.000 | None | None |
+----------+-------+---------------+-------+--------------+-------+
+----------+-------+---------------+-------+--------------+-------+
| category | oLRP | category | oLRP | category | oLRP |
+----------+-------+---------------+-------+--------------+-------+
| airplane | 0.803 | bridge | 0.964 | storage-tank | 0.813 |
| ship | 0.823 | swimming-pool | 0.910 | vehicle | 0.878 |
| person | 0.953 | wind-mill | 1.000 | None | None |
+----------+-------+---------------+-------+--------------+-------+
"""
_base_ = [
'../_base_/models/faster_rcnn_r50_fpn_aitod.py',
'../_base_/datasets/aitod_detection.py',
'../_base_/schedules/schedule_1x.py',
'../_base_/default_runtime.py'
]
model = dict(
train_cfg=dict(
rpn_proposal=dict(
nms_pre=3000,
max_per_img=3000,
nms=dict(type='giou_nms', iou_threshold=0.7))))
# optimizer
optimizer = dict(type='SGD', lr=0.01, momentum=0.9, weight_decay=0.0001)
# learning policy
checkpoint_config = dict(interval=4)
| 51.688525 | 81 | 0.502696 | 419 | 3,153 | 3.720764 | 0.317422 | 0.048749 | 0.065427 | 0.053881 | 0.490699 | 0.450289 | 0.450289 | 0.436818 | 0.356639 | 0.279025 | 0 | 0.148849 | 0.228671 | 3,153 | 60 | 82 | 52.55 | 0.492188 | 0.839201 | 0 | 0 | 0 | 0 | 0.311245 | 0.289157 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b03519205abfa9273f7ff66ccc2dd61c23ac6922 | 336 | py | Python | src/bananas/admin/api/schemas/__init__.py | beshrkayali/django-bananas | 8e832ca91287c5b3eed5af8de948c67fd026c4b9 | [
"MIT"
] | 26 | 2015-04-07T12:18:26.000Z | 2021-07-23T18:05:52.000Z | src/bananas/admin/api/schemas/__init__.py | beshrkayali/django-bananas | 8e832ca91287c5b3eed5af8de948c67fd026c4b9 | [
"MIT"
] | 55 | 2016-10-25T08:13:50.000Z | 2022-03-04T12:53:24.000Z | src/bananas/admin/api/schemas/__init__.py | beshrkayali/django-bananas | 8e832ca91287c5b3eed5af8de948c67fd026c4b9 | [
"MIT"
] | 16 | 2015-10-13T10:11:59.000Z | 2021-11-11T12:30:32.000Z | from drf_yasg.utils import (
swagger_auto_schema as schema,
swagger_serializer_method as schema_serializer_method,
)
from .yasg import (
BananasSimpleRouter as BananasRouter,
BananasSwaggerSchema as BananasSchema,
)
__all__ = (
"schema",
"schema_serializer_method",
"BananasRouter",
"BananasSchema",
)
| 19.764706 | 58 | 0.735119 | 33 | 336 | 7.090909 | 0.484848 | 0.205128 | 0.188034 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.190476 | 336 | 16 | 59 | 21 | 0.860294 | 0 | 0 | 0 | 0 | 0 | 0.166667 | 0.071429 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.142857 | 0 | 0.142857 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b04c5800c52d394339f8fb36ba1df845070fae71 | 1,662 | py | Python | src/anaplan_api/ImportTask.py | pieter-pot/anaplan-api | 1b099cb102f98b114afa0794a40aaf0de19956c1 | [
"BSD-2-Clause"
] | null | null | null | src/anaplan_api/ImportTask.py | pieter-pot/anaplan-api | 1b099cb102f98b114afa0794a40aaf0de19956c1 | [
"BSD-2-Clause"
] | null | null | null | src/anaplan_api/ImportTask.py | pieter-pot/anaplan-api | 1b099cb102f98b114afa0794a40aaf0de19956c1 | [
"BSD-2-Clause"
] | null | null | null | from .TaskFactory import TaskFactory
from .AnaplanConnection import AnaplanConnection
from .Action import Action
from .ParameterAction import ParameterAction
from .Parser import Parser
from .ImportParser import ImportParser
class ImportTask(TaskFactory):
"""Factory to generate an Anaplan import task"""
@staticmethod
def get_action(conn: AnaplanConnection, action_id: str, retry_count: int, mapping_params: dict = None) -> Action:
"""Get an instantiated Action object
:param conn: Object with authentication, workspace, and model details
:type conn: AnaplanConnection
:param action_id: ID of the specified Anaplan action
:type action_id: str
:param retry_count: Number of times to attempt to retry in case of network or server error
:type retry_count: int
:param mapping_params: Import parameters requested at runtime
:type mapping_params: dict
:return:
"""
if not mapping_params:
return Action(conn=conn, action_id=action_id, retry_count=retry_count, mapping_params=mapping_params)
elif mapping_params:
return ParameterAction(conn=conn, action_id=action_id, retry_count=retry_count, mapping_params=mapping_params)
@staticmethod
def get_parser(conn: AnaplanConnection, results: dict, url: str) -> Parser:
"""Get an instantiated Parser object for import tasks
:param conn: Object with authentication, workspace, and model details
:type conn: AnaplanConnection
:param results: JSON response with details of task results
:type results: dict
:param url: Anaplan task URL
:type url: str
:return: Instantiated Parser object for an import task
:rtype: ImportParser
"""
return ImportParser(results, url)
| 36.933333 | 114 | 0.781588 | 219 | 1,662 | 5.817352 | 0.30137 | 0.091837 | 0.028257 | 0.029827 | 0.246468 | 0.246468 | 0.246468 | 0.246468 | 0.246468 | 0.246468 | 0 | 0 | 0.150421 | 1,662 | 44 | 115 | 37.772727 | 0.902266 | 0.489771 | 0 | 0.125 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0 | 0.5 | 0 | 0.875 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
b04e36d25ffded27a3bbd1b598789a30048f99a1 | 168 | py | Python | tests/testsClean.py | ttm/gmaneLegacy | df9e0926c93358abdb632f7f9ce891330959c835 | [
"Unlicense"
] | 1 | 2019-04-25T16:31:12.000Z | 2019-04-25T16:31:12.000Z | tests/testsClean.py | ttm/gmaneLegacy | df9e0926c93358abdb632f7f9ce891330959c835 | [
"Unlicense"
] | null | null | null | tests/testsClean.py | ttm/gmaneLegacy | df9e0926c93358abdb632f7f9ce891330959c835 | [
"Unlicense"
] | null | null | null | import gmaneLegacy as g, importlib
dl=g.DownloadGmaneData('~/.gmane2/')
dl.downloadListIDS()
#dl.getDownloadedLists()
#dl.correctFilenames()
dl.cleanDownloadedLists()
| 21 | 36 | 0.785714 | 17 | 168 | 7.764706 | 0.705882 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.006369 | 0.065476 | 168 | 7 | 37 | 24 | 0.834395 | 0.261905 | 0 | 0 | 0 | 0 | 0.082645 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.25 | 0 | 0.25 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b04fd2b86635d1f4f03caa3bd7e3f16ba0f1405e | 2,000 | py | Python | Python2/Modulo_3_exe_pratico.py | Belaschich/SoulON | 9f908b025b34fc79187b4efd5ea93a78dca0ef7e | [
"MIT"
] | null | null | null | Python2/Modulo_3_exe_pratico.py | Belaschich/SoulON | 9f908b025b34fc79187b4efd5ea93a78dca0ef7e | [
"MIT"
] | null | null | null | Python2/Modulo_3_exe_pratico.py | Belaschich/SoulON | 9f908b025b34fc79187b4efd5ea93a78dca0ef7e | [
"MIT"
] | null | null | null | """
1. Crie uma base de dados chamada sistema_escolar_soul_on
2. Crie uma tabela alunos com os campos id, nome, matricula, turma.
3. Alimente a tabela com os seguintes dados:
"""
import mysql.connector
db = mysql.connector.connect(
host = "localhost",
user = "root",
password = "",
database = "sistema_escolar_soul_on"
)
mycursor = db.cursor()
#mycursor.execute("CREATE DATABASE sistema_escolar_soul_on")
#print("Database criada com sucesso!")
#mycursor.execute("CREATE TABLE alunos(id INT AUTO_INCREMENT PRIMARY KEY, name VARCHAR(255), matricula VARCHAR(255), turma VARCHAR(255))")
#print("Tabela criada com sucesso!")
#adicionar = "INSERT INTO alunos (name, matricula, turma) VALUES(%s, %s, %s)"
#val = [
# ("José Lima", "MAT90551", "BCW22"),
# ("Carlos Augusto", "MAT90552", "BCW22"),
# ("Lívia Lima", "MAT90553", "BCW22"),
# ("Sandra Gomes", "MAT90554", "BCW23"),
# ("João Augusto", "MAT90555", "BCW23"),
# ("Breno Lima", "MAT90556", "BCW24"),
# ("José Vinícius", "MAT90557", "BCW25")
#]
#mycursor.executemany(adicionar, val)
#print(mycursor.rowcount, "linha(s) alterada(s)!")
#db.commit()
"""4. Faça as seguintes consultas:
• Liste todos os registros de sua tabela.
• Liste apenas nome e matrícula dos alunos do BCW23.
• Liste apenas o nome dos alunos que tiverem o sobrenome Lima.
"""
#mycursor.execute("SELECT * FROM alunos")
#mycursor.execute("SELECT name FROM alunos WHERE turma = 'BCW23' ")
#adicionar = "SELECT name FROM alunos WHERE name LIKE '%Lima%'"
#mycursor.execute(adicionar)
#myresult = mycursor.fetchall()
#for x in myresult:
# print(x)
'''
5. O aluno Carlos Augusto está na turma errada. Matricule o mesmo no BCW25.
6. O aluno José Vinicius desistiu do curso, ele deve ser excluído do sistema.
'''
#adicionar = "UPDATE alunos SET turma = 'BCW25' WHERE name = 'Carlos Augusto'"
adicionar = "DELETE FROM alunos WHERE name = 'José Vinicius'"
mycursor.execute(adicionar)
#db.commit()
print(mycursor.rowcount, "Linha(s) afetada(s)")
| 33.333333 | 138 | 0.698 | 270 | 2,000 | 5.144444 | 0.485185 | 0.064795 | 0.038877 | 0.043197 | 0.115191 | 0 | 0 | 0 | 0 | 0 | 0 | 0.042578 | 0.1545 | 2,000 | 59 | 139 | 33.898305 | 0.777055 | 0.634 | 0 | 0 | 0 | 0 | 0.288515 | 0.064426 | 0 | 0 | 0 | 0.016949 | 0 | 1 | 0 | false | 0.090909 | 0.090909 | 0 | 0.090909 | 0.090909 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
c65d89d5ffc0d8e1edaf96ef349b5e2df83808a1 | 131 | py | Python | AtCoder/ABC/000-159/ABC144_C.py | sireline/PyCode | 8578467710c3c1faa89499f5d732507f5d9a584c | [
"MIT"
] | null | null | null | AtCoder/ABC/000-159/ABC144_C.py | sireline/PyCode | 8578467710c3c1faa89499f5d732507f5d9a584c | [
"MIT"
] | null | null | null | AtCoder/ABC/000-159/ABC144_C.py | sireline/PyCode | 8578467710c3c1faa89499f5d732507f5d9a584c | [
"MIT"
] | null | null | null | import math
N = int(input())
for i in range(1, int(math.sqrt(N))+1)[::-1]:
if N%i == 0:
print(N//i+i-2)
break
| 16.375 | 45 | 0.48855 | 26 | 131 | 2.461538 | 0.615385 | 0.0625 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.053191 | 0.282443 | 131 | 7 | 46 | 18.714286 | 0.62766 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.166667 | 0 | 0.166667 | 0.166667 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
c66470cb7bce43ae0647fcf9f829482cfbeb4e28 | 543 | py | Python | restaurant/schema.py | miguel550/restaurant | fe24940dd309e415366aec4f918162dcb45a9a29 | [
"MIT"
] | null | null | null | restaurant/schema.py | miguel550/restaurant | fe24940dd309e415366aec4f918162dcb45a9a29 | [
"MIT"
] | null | null | null | restaurant/schema.py | miguel550/restaurant | fe24940dd309e415366aec4f918162dcb45a9a29 | [
"MIT"
] | null | null | null | import graphene
import dishes.schema
class Query(dishes.schema.Query, graphene.ObjectType):
pass
class Mutations(graphene.ObjectType):
create_category = dishes.schema.CreateCategory.Field()
edit_category = dishes.schema.EditCategory.Field()
delete_category = dishes.schema.DeleteCategory.Field()
create_dish = dishes.schema.CreateDish.Field()
edit_dish = dishes.schema.EditDish.Field()
delete_dish = dishes.schema.DeleteDish.Field()
schema = graphene.Schema(query=Query, mutation=Mutations)
| 28.578947 | 59 | 0.744015 | 60 | 543 | 6.633333 | 0.366667 | 0.241206 | 0.150754 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.154696 | 543 | 18 | 60 | 30.166667 | 0.867102 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.083333 | 0.166667 | 0 | 0.833333 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
c66991e76e4c0c8ac5c322554ccda09a94d0373d | 759 | py | Python | pinga/_nbdev.py | hmelberg/pinga | c2dc61d9fdd5b3d418445705dedf46375d8a7393 | [
"Apache-2.0"
] | null | null | null | pinga/_nbdev.py | hmelberg/pinga | c2dc61d9fdd5b3d418445705dedf46375d8a7393 | [
"Apache-2.0"
] | 2 | 2021-09-28T05:43:47.000Z | 2022-02-26T10:20:40.000Z | pinga/_nbdev.py | hmelberg/pinga | c2dc61d9fdd5b3d418445705dedf46375d8a7393 | [
"Apache-2.0"
] | null | null | null | # AUTOGENERATED BY NBDEV! DO NOT EDIT!
__all__ = ["index", "modules", "custom_doc_links", "git_url"]
index = {"nothing_here": "00_core.ipynb",
"expand_hyphen": "notation.ipynb",
"del_dot": "notation.ipynb",
"del_zero": "notation.ipynb",
"get_unique": "notation.ipynb",
"expand_star": "notation.ipynb",
"expand_colon": "notation.ipynb",
"expand_regex": "notation.ipynb",
"expand_code": "notation.ipynb",
"get_rows": "query.ipynb"}
modules = ["core.py",
"charlson.py",
"notation.py",
"query.py"]
doc_url = "https://hmelberg.github.io/pinga/"
git_url = "https://github.com/hmelberg/pinga/tree/master/"
def custom_doc_links(name): return None
| 29.192308 | 61 | 0.600791 | 88 | 759 | 4.931818 | 0.534091 | 0.239631 | 0.175115 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003401 | 0.225296 | 759 | 25 | 62 | 30.36 | 0.734694 | 0.047431 | 0 | 0 | 1 | 0 | 0.542302 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.055556 | false | 0 | 0 | 0.055556 | 0.055556 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
c669fbd1c3287b07ca370ba1e0d35d92e0855af8 | 252 | py | Python | cumulative_function.py | lwoznicki/Python-simple-code | 72486f8f18f8ffc019838be4b1d7d45c68356a0e | [
"MIT"
] | null | null | null | cumulative_function.py | lwoznicki/Python-simple-code | 72486f8f18f8ffc019838be4b1d7d45c68356a0e | [
"MIT"
] | null | null | null | cumulative_function.py | lwoznicki/Python-simple-code | 72486f8f18f8ffc019838be4b1d7d45c68356a0e | [
"MIT"
] | 1 | 2020-01-04T20:45:26.000Z | 2020-01-04T20:45:26.000Z | def cumulative(list_of_numbers):
cumulative_sum = 0
new_list = []
for i in list_of_numbers:
cumulative_sum += i
new_list.append(cumulative_sum)
return new_list
numbers = [1,2,3,4,5,6,7,8,9]
print(cumulative(numbers))
c66d93d4232de45798dae38f27b77b93ef5ff26e | 285 | py | Python | plugins/example_collector/test/test_config.py | someengineering/resoto | ee17313f5376e9797ed305e7fdb62d40139a6608 | [
"Apache-2.0"
] | 126 | 2022-01-13T18:22:03.000Z | 2022-03-31T11:03:14.000Z | plugins/example_collector/test/test_config.py | someengineering/resoto | ee17313f5376e9797ed305e7fdb62d40139a6608 | [
"Apache-2.0"
] | 110 | 2022-01-13T22:27:55.000Z | 2022-03-30T22:26:50.000Z | plugins/example_collector/test/test_config.py | someengineering/resoto | ee17313f5376e9797ed305e7fdb62d40139a6608 | [
"Apache-2.0"
] | 8 | 2022-01-15T10:28:16.000Z | 2022-03-30T16:38:21.000Z | from resotolib.config import Config
from resoto_plugin_example_collector import ExampleCollectorPlugin
def test_config():
config = Config("dummy", "dummy")
ExampleCollectorPlugin.add_config(config)
Config.init_default_config()
# assert Config.example.region is None
| 23.75 | 66 | 0.785965 | 33 | 285 | 6.575758 | 0.575758 | 0.221198 | 0.165899 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.140351 | 285 | 11 | 67 | 25.909091 | 0.885714 | 0.126316 | 0 | 0 | 0 | 0 | 0.040984 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0.333333 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
c67160961b6ec649cd0279c044e32091e84f85f7 | 1,241 | py | Python | ezcliy/positional.py | kpostekk/ezcliy | f3038a4e1d482895a311bfb699d3c04c6975faea | [
"MIT"
] | null | null | null | ezcliy/positional.py | kpostekk/ezcliy | f3038a4e1d482895a311bfb699d3c04c6975faea | [
"MIT"
] | 2 | 2021-06-02T03:52:32.000Z | 2021-08-19T21:26:03.000Z | ezcliy/positional.py | kpostekk/ezcliy | f3038a4e1d482895a311bfb699d3c04c6975faea | [
"MIT"
] | null | null | null | from typing import Optional, Any
from ezcliy.exceptions import MissingPositional
class Positional:
"""Asign value (by source order) to object, allows asking for value or provide default one."""
value: str = None
"""Fetched value by positional"""
description: str = None
"""Description for help"""
def __init__(self, name: str, ask_if_missing: Optional[str] = None, optional: Optional[Any] = None):
self.name = name
self.ask_if_missing = ask_if_missing
self.optional = optional
def pass_values(self, values: list[str], position: int):
try:
self.value = values[position]
except IndexError:
if self.ask_if_missing:
self.value = input(self.ask_if_missing + ": ").strip()
elif self.optional is not None:
self.value = self.optional
else:
raise MissingPositional(self, position)
def __repr__(self):
return f'<Positional expecting {self.name}>'
def __str__(self):
return self.value
def __int__(self):
return int(self.value)
def __float__(self):
return float(self.value)
def __bool__(self):
return bool(self.value)
| 28.204545 | 104 | 0.620467 | 148 | 1,241 | 4.966216 | 0.385135 | 0.085714 | 0.081633 | 0.065306 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.282836 | 1,241 | 43 | 105 | 28.860465 | 0.825843 | 0.070911 | 0 | 0 | 0 | 0 | 0.033088 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.241379 | false | 0.034483 | 0.068966 | 0.172414 | 0.586207 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
c67ec08a8d631d900e1acc95f7a9b62927323937 | 563 | py | Python | python/tests/test_1154.py | SousaPedro11/urionlinejudge | 7422fb706d4f470ef492566e6e536b35cf5f62fd | [
"MIT"
] | null | null | null | python/tests/test_1154.py | SousaPedro11/urionlinejudge | 7422fb706d4f470ef492566e6e536b35cf5f62fd | [
"MIT"
] | null | null | null | python/tests/test_1154.py | SousaPedro11/urionlinejudge | 7422fb706d4f470ef492566e6e536b35cf5f62fd | [
"MIT"
] | null | null | null | import io
from unittest import TestCase
from unittest import mock
from unittest.mock import patch
from python.implementations.problem1154 import Problem1154
@patch('builtins.input', side_effect=['34', '56', '44', '23', '-2'])
class Test1154(TestCase):
def test_1154(self, mocked_input):
output = "39.25"
with mock.patch('sys.stdout', new=io.StringIO()) as fake_stdout:
solution_output = Problem1154().solv()
stdout_value = fake_stdout.getvalue()
assert solution_output == output or stdout_value == f'{output}\n'
| 33.117647 | 73 | 0.694494 | 73 | 563 | 5.232877 | 0.60274 | 0.094241 | 0.094241 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.071584 | 0.181172 | 563 | 16 | 74 | 35.1875 | 0.75705 | 0 | 0 | 0 | 0 | 0 | 0.087034 | 0 | 0 | 0 | 0 | 0 | 0.076923 | 1 | 0.076923 | false | 0 | 0.384615 | 0 | 0.538462 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
c68a2a505113ea82f50d0f6c2f771311cbd7134a | 731 | py | Python | setup.py | CS207-Project-Team-1/cs207-FinalProject | 417f315fe5efe34d27ffd59ed639d1208d0b300e | [
"MIT"
] | null | null | null | setup.py | CS207-Project-Team-1/cs207-FinalProject | 417f315fe5efe34d27ffd59ed639d1208d0b300e | [
"MIT"
] | 17 | 2018-10-16T23:52:15.000Z | 2018-12-12T04:02:23.000Z | setup.py | CS207-Project-Team-1/cs207-FinalProject | 417f315fe5efe34d27ffd59ed639d1208d0b300e | [
"MIT"
] | 1 | 2020-12-11T20:50:48.000Z | 2020-12-11T20:50:48.000Z | #!/usr/bin/env python
from setuptools import setup
setup(
name="AutoDiffX",
version="0.2",
packages=['ad'],
# metadata to display on PyPI
author="William Fu",
author_email="wfu@college.harvard.edu",
description="Lightweight Package for Automatic Differentiation",
license="MIT",
keywords="automatic differentiation, differentiation",
url="https://github.com/CS207-Project-Team-1/cs207-FinalProject/",
project_urls={
"Bug Tracker": "https://github.com/CS207-Project-Team-1/cs207-FinalProject/",
"Documentation": "https://github.com/CS207-Project-Team-1/cs207-FinalProject/",
"Source Code": "https://github.com/CS207-Project-Team-1/cs207-FinalProject/",
}
)
| 31.782609 | 87 | 0.683995 | 86 | 731 | 5.790698 | 0.604651 | 0.088353 | 0.11245 | 0.15261 | 0.385542 | 0.385542 | 0.385542 | 0.385542 | 0.385542 | 0 | 0 | 0.048701 | 0.157319 | 731 | 22 | 88 | 33.227273 | 0.75974 | 0.065663 | 0 | 0 | 0 | 0 | 0.604993 | 0.033774 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.058824 | 0 | 0.058824 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
c69932d5e54a24af1ea4acb2a09321bc75a0ecc2 | 1,290 | py | Python | main.py | CDargis/igrill | e7d840f0c8d7170c95a11dbb1fc4892873066803 | [
"MIT"
] | null | null | null | main.py | CDargis/igrill | e7d840f0c8d7170c95a11dbb1fc4892873066803 | [
"MIT"
] | null | null | null | main.py | CDargis/igrill | e7d840f0c8d7170c95a11dbb1fc4892873066803 | [
"MIT"
] | null | null | null | import os
from bluepy.btle import Scanner
from scan import DeviceForwardingDelegate
from persistence import DataPersistence
from igrill import IGrillHandler
from tokencube import TokenCubeHandler
device_settings = {
"70:91:8f:01:f7:30": {
"device": "iGrill Mini",
"addr": "70:91:8f:01:f7:30",
"type": "red probe"
},
}
CLOUD_URL = ""
INFLUX_SERVER = os.environ.get("INFLUX_SERVER", None) or "raspberrypi.local"
INFLUX_DATABASE = os.environ.get("INFLUX_DB", None) or "sensors"
INFLUX_USER = os.environ.get("INFLUX_USER", None) or "root"
INFLUX_PASSWORD = os.environ.get("INFLUX_PASSWORD", None) or "root"
if __name__ == "__main__":
print "Creating Scanner"
delegate = DeviceForwardingDelegate()
delegate.handlers.append(IGrillHandler(device_settings))
scanner = Scanner()
scanner.withDelegate(delegate)
print "Connecting to InfluxDB server"
persistence = DataPersistence(INFLUX_SERVER, INFLUX_DATABASE, INFLUX_USER, INFLUX_PASSWORD)
while True:
try:
print "Scanning..."
scanner.scan(15)
print "Persisting..."
for handler in delegate.handlers:
handler.persist_stats(persistence)
except Exception as ex:
print "exception: ", ex
| 28.043478 | 95 | 0.679845 | 147 | 1,290 | 5.802721 | 0.47619 | 0.042204 | 0.056272 | 0.084408 | 0.028136 | 0.028136 | 0 | 0 | 0 | 0 | 0 | 0.021718 | 0.214729 | 1,290 | 45 | 96 | 28.666667 | 0.820336 | 0 | 0 | 0 | 0 | 0 | 0.183088 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.057143 | 0.171429 | null | null | 0.142857 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
c69b484b591c8b359a20827b5d5c8b9175d15f2c | 3,535 | py | Python | home/hairygael/GESTURES/howmanyfingersdoihave.py | rv8flyboy/pyrobotlab | 4e04fb751614a5cb6044ea15dcfcf885db8be65a | [
"Apache-2.0"
] | 63 | 2015-02-03T18:49:43.000Z | 2022-03-29T03:52:24.000Z | home/hairygael/GESTURES/howmanyfingersdoihave.py | hirwaHenryChristian/pyrobotlab | 2debb381fc2db4be1e7ea6e5252a50ae0de6f4a9 | [
"Apache-2.0"
] | 16 | 2016-01-26T19:13:29.000Z | 2018-11-25T21:20:51.000Z | home/hairygael/GESTURES/howmanyfingersdoihave.py | hirwaHenryChristian/pyrobotlab | 2debb381fc2db4be1e7ea6e5252a50ae0de6f4a9 | [
"Apache-2.0"
] | 151 | 2015-01-03T18:55:54.000Z | 2022-03-04T07:04:23.000Z | def howmanyfingersdoihave():
  ear.pauseListening()
  sleep(1)
  fullspeed()
  i01.moveHead(49,74)
  i01.moveArm("left",75,83,79,24)
  i01.moveArm("right",65,82,71,24)
  i01.moveHand("left",74,140,150,157,168,92)
  i01.moveHand("right",89,80,98,120,114,0)
  sleep(2)
  i01.moveHand("right",0,80,98,120,114,0)
  i01.mouth.speakBlocking("ten")
  sleep(.1)
  i01.moveHand("right",0,0,98,120,114,0)
  i01.mouth.speakBlocking("nine")
  sleep(.1)
  i01.moveHand("right",0,0,0,120,114,0)
  i01.mouth.speakBlocking("eight")
  sleep(.1)
  i01.moveHand("right",0,0,0,0,114,0)
  i01.mouth.speakBlocking("seven")
  sleep(.1)
  i01.moveHand("right",0,0,0,0,0,0)
  i01.mouth.speakBlocking("six")
  sleep(.5)
  i01.setHeadSpeed(.70,.70)
  i01.moveHead(40,105)
  i01.moveArm("left",75,83,79,24)
  i01.moveArm("right",65,82,71,24)
  i01.moveHand("left",0,0,0,0,0,180)
  i01.moveHand("right",0,0,0,0,0,0)
  sleep(0.1)
  i01.mouth.speakBlocking("and five makes eleven")
  sleep(0.7)
  i01.setHeadSpeed(0.7,0.7)
  i01.moveHead(40,50)
  sleep(0.5)
  i01.setHeadSpeed(0.7,0.7)
  i01.moveHead(49,105)
  sleep(0.7)
  i01.setHeadSpeed(0.7,0.8)
  i01.moveHead(40,50)
  sleep(0.7)
  i01.setHeadSpeed(0.7,0.8)
  i01.moveHead(49,105)
  sleep(0.7)
  i01.setHeadSpeed(0.7,0.7)
  i01.moveHead(90,85)
  sleep(0.7)
  i01.mouth.speakBlocking("eleven")
  i01.moveArm("left",70,75,70,20)
  i01.moveArm("right",60,75,65,20)
  sleep(1)
  i01.mouth.speakBlocking("that doesn't seem right")
  sleep(2)
  i01.mouth.speakBlocking("I think I better try that again")
  i01.moveHead(40,105)
  i01.moveArm("left",75,83,79,24)
  i01.moveArm("right",65,82,71,24)
  i01.moveHand("left",140,168,168,168,158,90)
  i01.moveHand("right",87,138,109,168,158,25)
  sleep(2)
  i01.moveHand("left",10,140,168,168,158,90)
  i01.mouth.speakBlocking("one")
  sleep(.1)
  i01.moveHand("left",10,10,168,168,158,90)
  i01.mouth.speakBlocking("two")
  sleep(.1)
  i01.moveHand("left",10,10,10,168,158,90)
  i01.mouth.speakBlocking("three")
  sleep(.1)
  i01.moveHand("left",10,10,10,10,158,90)
  i01.mouth.speakBlocking("four")
  sleep(.1)
  i01.moveHand("left",10,10,10,10,10,90)
  i01.mouth.speakBlocking("five")
  sleep(.1)
  i01.setHeadSpeed(0.65,0.65)
  i01.moveHead(53,65)
  i01.moveArm("right",48,80,78,11)
  i01.setHandSpeed("left", 1.0, 1.0, 1.0, 1.0, 1.0, 1.0)
  i01.setHandSpeed("right", 1.0, 1.0, 1.0, 1.0, 1.0, 1.0)
  i01.moveHand("left",10,10,10,10,10,90)
  i01.moveHand("right",10,10,10,10,10,25)
  sleep(1)
  i01.mouth.speakBlocking("and five makes ten")
  sleep(.5)
  i01.mouth.speakBlocking("there that's better")
  i01.moveHead(95,85)
  i01.moveArm("left",75,83,79,24)
  i01.moveArm("right",40,70,70,10)
  sleep(0.5)
  i01.mouth.speakBlocking("inmoov has ten fingers")
  sleep(0.5)
  i01.moveHead(90,90)
  i01.setHandSpeed("left", 0.8, 0.8, 0.8, 0.8, 0.8, 0.8)
  i01.setHandSpeed("right", 0.8, 0.8, 0.8, 0.8, 0.8, 0.8)
  i01.moveHand("left",140,140,140,140,140,60)
  i01.moveHand("right",140,140,140,140,140,60)
  sleep(1.0)
  i01.setArmSpeed("right", 1.0, 1.0, 1.0, 1.0)
  i01.setArmSpeed("left", 1.0, 1.0, 1.0, 1.0)
  i01.moveArm("left",5,90,30,11)
  i01.moveArm("right",5,90,30,11)
  sleep(0.5)
  relax()
  sleep(0.5)
  ear.resumeListening()
| 28.28 | 63 | 0.596322 | 605 | 3,535 | 3.484298 | 0.152066 | 0.019924 | 0.169355 | 0.030361 | 0.582543 | 0.523245 | 0.473435 | 0.359108 | 0.325427 | 0.253795 | 0 | 0.242295 | 0.201414 | 3,535 | 124 | 64 | 28.508065 | 0.504428 | 0 | 0 | 0.453704 | 0 | 0 | 0.100481 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.009259 | true | 0 | 0 | 0 | 0.009259 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
c6aa6dc830d8243909c401d7811ff993beffd9ce | 9,006 | py | Python | search/util.py | The-Danx/comp30024-project-part-a | 352eaaf3c03d815a4b145ada4b5b49396f599fc3 | [
"MIT"
] | 1 | 2022-03-23T06:19:29.000Z | 2022-03-23T06:19:29.000Z | search/util.py | The-Danx/comp30024-project-part-a | 352eaaf3c03d815a4b145ada4b5b49396f599fc3 | [
"MIT"
] | null | null | null | search/util.py | The-Danx/comp30024-project-part-a | 352eaaf3c03d815a4b145ada4b5b49396f599fc3 | [
"MIT"
] | null | null | null | """
COMP30024 Artificial Intelligence, Semester 1, 2021
Project Part A: Searching
This module contains some helper functions for printing actions and boards.
"""
from search.node import distance
def get_board_dict(data):
    """Convert the data dictionary into a format that is used by the
    print_board function.
    """
    board_dict = {}
    for type, tokens in data.items():
        if type == "upper":
            for tok in tokens:
                board_dict[(tok[1], tok[2])] = "({})".format(tok[0].upper())
        elif type == "lower":
            for tok in tokens:
                board_dict[(tok[1], tok[2])] = "({})".format(tok[0])
        else:
            for tok in tokens:
                board_dict[(tok[1], tok[2])] = "(X)"
    return board_dict
def print_board(board_dict, message="", compact=True, ansi=False, **kwargs):
    """
For help with visualisation and debugging: output a board diagram with
any information you like (tokens, heuristic values, distances, etc.).
Arguments:
board_dict -- A dictionary with (r, q) tuples as keys (following axial
coordinate system from specification) and printable objects (e.g.
strings, numbers) as values.
This function will arrange these printable values on a hex grid
and output the result.
Note: At most the first 5 characters will be printed from the string
representation of each value.
message -- A printable object (e.g. string, number) that will be placed
above the board in the visualisation. Default is "" (no message).
ansi -- True if you want to use ANSI control codes to enrich the output.
Compatible with terminals supporting ANSI control codes. Default
False.
compact -- True if you want to use a compact board visualisation,
False to use a bigger one including axial coordinates along with
the printable information in each hex. Default True (small board).
Any other keyword arguments are passed through to the print function.
Example:
>>> board_dict = {
... ( 0, 0): "hello",
... ( 0, 2): "world",
... ( 3,-2): "(p)",
... ( 2,-1): "(S)",
... (-4, 0): "(R)",
... }
>>> print_board(board_dict, "message goes here", ansi=False)
# message goes here
# .-'-._.-'-._.-'-._.-'-._.-'-.
# | | | | | |
# .-'-._.-'-._.-'-._.-'-._.-'-._.-'-.
# | | | (p) | | | |
# .-'-._.-'-._.-'-._.-'-._.-'-._.-'-._.-'-.
# | | | | (S) | | | |
# .-'-._.-'-._.-'-._.-'-._.-'-._.-'-._.-'-._.-'-.
# | | | | | | | | |
# .-'-._.-'-._.-'-._.-'-._.-'-._.-'-._.-'-._.-'-._.-'-.
# | | | | |hello| |world| | |
# '-._.-'-._.-'-._.-'-._.-'-._.-'-._.-'-._.-'-._.-'-._.-'
# | | | | | | | | |
# '-._.-'-._.-'-._.-'-._.-'-._.-'-._.-'-._.-'-._.-'
# | | | | | | | |
# '-._.-'-._.-'-._.-'-._.-'-._.-'-._.-'-._.-'
# | | | | | | |
# '-._.-'-._.-'-._.-'-._.-'-._.-'-._.-'
# | (R) | | | | |
# '-._.-'-._.-'-._.-'-._.-'-._.-'
"""
    if compact:
        template = """# {00:}
# .-'-._.-'-._.-'-._.-'-._.-'-.
# |{57:}|{58:}|{59:}|{60:}|{61:}|
# .-'-._.-'-._.-'-._.-'-._.-'-._.-'-.
# |{51:}|{52:}|{53:}|{54:}|{55:}|{56:}|
# .-'-._.-'-._.-'-._.-'-._.-'-._.-'-._.-'-.
# |{44:}|{45:}|{46:}|{47:}|{48:}|{49:}|{50:}|
# .-'-._.-'-._.-'-._.-'-._.-'-._.-'-._.-'-._.-'-.
# |{36:}|{37:}|{38:}|{39:}|{40:}|{41:}|{42:}|{43:}|
# .-'-._.-'-._.-'-._.-'-._.-'-._.-'-._.-'-._.-'-._.-'-.
# |{27:}|{28:}|{29:}|{30:}|{31:}|{32:}|{33:}|{34:}|{35:}|
# '-._.-'-._.-'-._.-'-._.-'-._.-'-._.-'-._.-'-._.-'-._.-'
# |{19:}|{20:}|{21:}|{22:}|{23:}|{24:}|{25:}|{26:}|
# '-._.-'-._.-'-._.-'-._.-'-._.-'-._.-'-._.-'-._.-'
# |{12:}|{13:}|{14:}|{15:}|{16:}|{17:}|{18:}|
# '-._.-'-._.-'-._.-'-._.-'-._.-'-._.-'-._.-'
# |{06:}|{07:}|{08:}|{09:}|{10:}|{11:}|
# '-._.-'-._.-'-._.-'-._.-'-._.-'-._.-'
# |{01:}|{02:}|{03:}|{04:}|{05:}|
# '-._.-'-._.-'-._.-'-._.-'-._.-'"""
    else:
        template = """# {00:}
# ,-' `-._,-' `-._,-' `-._,-' `-._,-' `-.
# | {57:} | {58:} | {59:} | {60:} | {61:} |
# | 4,-4 | 4,-3 | 4,-2 | 4,-1 | 4, 0 |
# ,-' `-._,-' `-._,-' `-._,-' `-._,-' `-._,-' `-.
# | {51:} | {52:} | {53:} | {54:} | {55:} | {56:} |
# | 3,-4 | 3,-3 | 3,-2 | 3,-1 | 3, 0 | 3, 1 |
# ,-' `-._,-' `-._,-' `-._,-' `-._,-' `-._,-' `-._,-' `-.
# | {44:} | {45:} | {46:} | {47:} | {48:} | {49:} | {50:} |
# | 2,-4 | 2,-3 | 2,-2 | 2,-1 | 2, 0 | 2, 1 | 2, 2 |
# ,-' `-._,-' `-._,-' `-._,-' `-._,-' `-._,-' `-._,-' `-._,-' `-.
# | {36:} | {37:} | {38:} | {39:} | {40:} | {41:} | {42:} | {43:} |
# | 1,-4 | 1,-3 | 1,-2 | 1,-1 | 1, 0 | 1, 1 | 1, 2 | 1, 3 |
# ,-' `-._,-' `-._,-' `-._,-' `-._,-' `-._,-' `-._,-' `-._,-' `-._,-' `-.
# | {27:} | {28:} | {29:} | {30:} | {31:} | {32:} | {33:} | {34:} | {35:} |
# | 0,-4 | 0,-3 | 0,-2 | 0,-1 | 0, 0 | 0, 1 | 0, 2 | 0, 3 | 0, 4 |
# `-._,-' `-._,-' `-._,-' `-._,-' `-._,-' `-._,-' `-._,-' `-._,-' `-._,-'
# | {19:} | {20:} | {21:} | {22:} | {23:} | {24:} | {25:} | {26:} |
# | -1,-3 | -1,-2 | -1,-1 | -1, 0 | -1, 1 | -1, 2 | -1, 3 | -1, 4 |
# `-._,-' `-._,-' `-._,-' `-._,-' `-._,-' `-._,-' `-._,-' `-._,-'
# | {12:} | {13:} | {14:} | {15:} | {16:} | {17:} | {18:} |
# | -2,-2 | -2,-1 | -2, 0 | -2, 1 | -2, 2 | -2, 3 | -2, 4 |
# `-._,-' `-._,-' `-._,-' `-._,-' `-._,-' `-._,-' `-._,-'
# | {06:} | {07:} | {08:} | {09:} | {10:} | {11:} |
# | -3,-1 | -3, 0 | -3, 1 | -3, 2 | -3, 3 | -3, 4 | key:
# `-._,-' `-._,-' `-._,-' `-._,-' `-._,-' `-._,-' ,-' `-.
# | {01:} | {02:} | {03:} | {04:} | {05:} | | input |
# | -4, 0 | -4, 1 | -4, 2 | -4, 3 | -4, 4 | | r, q |
# `-._,-' `-._,-' `-._,-' `-._,-' `-._,-' `-._,-'"""
    # prepare the provided board contents as strings, formatted to size.
    ran = range(-4, +4+1)
    cells = []
    for rq in [(r, q) for r in ran for q in ran if -r-q in ran]:
        if rq in board_dict:
            cell = str(board_dict[rq]).center(5)
            if ansi:
                # put contents in bold
                cell = f"\033[1m{cell}\033[0m"
        else:
            cell = "     "  # 5 spaces will fill a cell
        cells.append(cell)
    # prepare the message, formatted across multiple lines
    multiline_message = "\n# ".join(message.splitlines())
    # fill in the template to create the board drawing, then print!
    board = template.format(multiline_message, *cells)
    print(board, **kwargs)
def print_move(turn, current, after):
    """Print swing or slide action"""
    for j in range(len(current)):
        if distance(current[j], after[j]) == 2:
            print_swing(turn, current[j][1], current[j][2],
                        after[j][1], after[j][2])
        else:
            print_slide(turn, current[j][1], current[j][2],
                        after[j][1], after[j][2])
def print_path(path):
    """Print the solution in required format"""
    i = len(path) - 1
    while i > 0:
        curr_tokens = path[i].upper
        next_tokens = path[i-1].upper
        curr_tokens_copy = path[i].upper_copy
        next_tokens_copy = path[i-1].upper_copy
        # when no upper tokens are defeated
        if len(curr_tokens) == len(next_tokens):
            print_move(-i + len(path), curr_tokens, next_tokens)
        # upper tokens are defeated
        else:
            print_move(-i + len(path), curr_tokens, next_tokens_copy)
        i -= 1
def print_priority_queue(pq):
    """Print out the priority queue."""
    print(pq[0][0], pq[0][1].upper, pq[0][1].lower)
def print_slide(t, r_a, q_a, r_b, q_b, **kwargs):
    """
    Output a slide action for turn t of a token from hex (r_a, q_a)
    to hex (r_b, q_b), according to the format instructions.
    Any keyword arguments are passed through to the print function.
    """
    print(f"Turn {t}: SLIDE from {(r_a, q_a)} to {(r_b, q_b)}", **kwargs)
def print_swing(t, r_a, q_a, r_b, q_b, **kwargs):
    """
    Output a swing action for turn t of a token from hex (r_a, q_a)
    to hex (r_b, q_b), according to the format instructions.
    Any keyword arguments are passed through to the print function.
    """
    print(f"Turn {t}: SWING from {(r_a, q_a)} to {(r_b, q_b)}", **kwargs)
| 43.298077 | 77 | 0.391739 | 963 | 9,006 | 3.390447 | 0.283489 | 0.033078 | 0.005513 | 0.007351 | 0.331087 | 0.309035 | 0.279632 | 0.25023 | 0.230628 | 0.181623 | 0 | 0.070704 | 0.321563 | 9,006 | 207 | 78 | 43.507246 | 0.463666 | 0.380746 | 0 | 0.266667 | 0 | 0.238095 | 0.583538 | 0.150426 | 0 | 0 | 0 | 0 | 0 | 1 | 0.066667 | false | 0 | 0.009524 | 0 | 0.085714 | 0.133333 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
c6c33ad46d4bef0c10d838ee531472b4de264a2b | 186 | py | Python | src/robotcommand.py | Team-Joker-JU/mower-raspberry-pi | 86542aeb27ab414d2b54ca5bef857b3b6baed37e | [
"Apache-2.0"
] | null | null | null | src/robotcommand.py | Team-Joker-JU/mower-raspberry-pi | 86542aeb27ab414d2b54ca5bef857b3b6baed37e | [
"Apache-2.0"
] | null | null | null | src/robotcommand.py | Team-Joker-JU/mower-raspberry-pi | 86542aeb27ab414d2b54ca5bef857b3b6baed37e | [
"Apache-2.0"
] | null | null | null | from enum import IntEnum
class RobotCommand(IntEnum):
    CONNECTED = 0,
    DISCONNECTED = 1,
    ACCELERATION = 2,
    STEERING = 3,
    COLLISION = 4,
    MODE = 5,
    POSITION = 6
| 16.909091 | 28 | 0.612903 | 21 | 186 | 5.428571 | 0.952381 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.053846 | 0.301075 | 186 | 10 | 29 | 18.6 | 0.823077 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.111111 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
c6c3c605e767f792bd4881bc70b32d670643fec7 | 403 | py | Python | CreatureRogue/data_layer/encounter.py | DaveTCode/CreatureRogue | 74ce2bf731b52b014198a2376dfba0d9782cbf01 | [
"MIT"
] | 1 | 2020-06-12T00:10:32.000Z | 2020-06-12T00:10:32.000Z | CreatureRogue/data_layer/encounter.py | DaveTCode/CreatureRogue | 74ce2bf731b52b014198a2376dfba0d9782cbf01 | [
"MIT"
] | null | null | null | CreatureRogue/data_layer/encounter.py | DaveTCode/CreatureRogue | 74ce2bf731b52b014198a2376dfba0d9782cbf01 | [
"MIT"
] | null | null | null | from CreatureRogue.data_layer.species import Species
class Encounter:
    def __init__(self, species: Species, min_level: int, max_level: int, rarity):
        self.species = species
        self.min_level = min_level
        self.max_level = max_level
        self.rarity = rarity

    def __str__(self):
        return "Encounter: {0} ({1},{2})".format(self.species, self.min_level, self.max_level)
| 31 | 94 | 0.677419 | 54 | 403 | 4.740741 | 0.407407 | 0.125 | 0.140625 | 0.148438 | 0.15625 | 0 | 0 | 0 | 0 | 0 | 0 | 0.009464 | 0.2134 | 403 | 12 | 95 | 33.583333 | 0.798107 | 0 | 0 | 0 | 0 | 0 | 0.059553 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.222222 | false | 0 | 0.111111 | 0.111111 | 0.555556 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |