hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
0478bf88cbefe164767e09f66a708ee577f1c1e0 | 47 | py | Python | qiling/qiling/debugger/qdb/__init__.py | mrTavas/owasp-fstm-auto | 6e9ff36e46d885701c7419db3eca15f12063a7f3 | [
"CC0-1.0"
] | 2 | 2021-05-05T12:03:01.000Z | 2021-06-04T14:27:15.000Z | qiling/qiling/debugger/qdb/__init__.py | mrTavas/owasp-fstm-auto | 6e9ff36e46d885701c7419db3eca15f12063a7f3 | [
"CC0-1.0"
] | null | null | null | qiling/qiling/debugger/qdb/__init__.py | mrTavas/owasp-fstm-auto | 6e9ff36e46d885701c7419db3eca15f12063a7f3 | [
"CC0-1.0"
] | 2 | 2021-05-05T12:03:09.000Z | 2021-06-04T14:27:21.000Z | #!/usr/bin/env python3
from .qdb import QlQdb
| 11.75 | 22 | 0.723404 | 8 | 47 | 4.25 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.025 | 0.148936 | 47 | 3 | 23 | 15.666667 | 0.825 | 0.446809 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
0491ca23ce62155c17534185d77519ed2c079be0 | 21 | py | Python | gsflow/crt/__init__.py | pygsflow/pygsflow | 83860cd58078017a65e1633b1192469777f1ce15 | [
"CC0-1.0",
"BSD-3-Clause"
] | 17 | 2019-11-11T02:49:29.000Z | 2022-02-17T03:45:19.000Z | gsflow/crt/__init__.py | jonathanqv/pygsflow | d671fdd84245ecb421a0fcab17a578425b514e93 | [
"Unlicense"
] | 21 | 2019-07-10T21:45:11.000Z | 2022-02-22T17:57:20.000Z | gsflow/crt/__init__.py | jonathanqv/pygsflow | d671fdd84245ecb421a0fcab17a578425b514e93 | [
"Unlicense"
] | 8 | 2019-11-11T02:49:36.000Z | 2021-09-30T18:43:45.000Z | from .crt import CRT
| 10.5 | 20 | 0.761905 | 4 | 21 | 4 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.190476 | 21 | 1 | 21 | 21 | 0.941176 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
04940c12969cd7978d97162055816578a11e76c8 | 184 | py | Python | django_postgres/__init__.py | zacharyvoase/django-postgres | 93e9f9809cabee0b327f18d181cbc9aeab1f8f2e | [
"Unlicense"
] | 51 | 2015-01-17T16:40:07.000Z | 2021-07-14T02:51:42.000Z | django_postgres/__init__.py | zacharyvoase/django-postgres | 93e9f9809cabee0b327f18d181cbc9aeab1f8f2e | [
"Unlicense"
] | 1 | 2015-02-23T11:01:42.000Z | 2015-02-23T11:01:42.000Z | django_postgres/__init__.py | zacharyvoase/django-postgres | 93e9f9809cabee0b327f18d181cbc9aeab1f8f2e | [
"Unlicense"
] | 14 | 2015-01-22T09:53:32.000Z | 2018-04-30T17:57:03.000Z | from django_postgres.view import View
from django_postgres.bitstrings import BitStringField, BitStringExpression as B, Bits
from django_postgres.citext import CaseInsensitiveTextField
| 46 | 85 | 0.88587 | 22 | 184 | 7.272727 | 0.590909 | 0.1875 | 0.3375 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.086957 | 184 | 3 | 86 | 61.333333 | 0.952381 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
04cdeec38ac8fc54883fccd423f26bd02cdd2ed1 | 209 | py | Python | courses/admin.py | nikolnikon/otus_web_like_otus | e9e87be013b890602e474197e87298730e7840fc | [
"MIT"
] | null | null | null | courses/admin.py | nikolnikon/otus_web_like_otus | e9e87be013b890602e474197e87298730e7840fc | [
"MIT"
] | null | null | null | courses/admin.py | nikolnikon/otus_web_like_otus | e9e87be013b890602e474197e87298730e7840fc | [
"MIT"
] | null | null | null | from django.contrib import admin
from courses.models import Course, Module, Lesson, Employer
admin.site.register(Course)
admin.site.register(Module)
admin.site.register(Lesson)
admin.site.register(Employer)
| 23.222222 | 59 | 0.818182 | 29 | 209 | 5.896552 | 0.448276 | 0.210526 | 0.397661 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.08134 | 209 | 8 | 60 | 26.125 | 0.890625 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
04e8fc192da63aed5eff6c88e63257e59c8e90eb | 7,719 | py | Python | aiops/utils/data_augmentation/bert.py | amandeep1991/aiops | 0f9959b740ef51a0aefb8dba1f7d5b4049ef495e | [
"MIT"
] | null | null | null | aiops/utils/data_augmentation/bert.py | amandeep1991/aiops | 0f9959b740ef51a0aefb8dba1f7d5b4049ef495e | [
"MIT"
] | 3 | 2020-06-02T10:23:57.000Z | 2021-10-18T20:42:40.000Z | aiops/utils/data_augmentation/bert.py | amandeep1991/aiops | 0f9959b740ef51a0aefb8dba1f7d5b4049ef495e | [
"MIT"
] | null | null | null | import random
import torch
from torchtext import data
from torchtext import datasets
from aiops.config import cache_dir_for_torch_text, logger
from aiops.constants import SEED
class DataSets:
def __init__(self, tokenizer) -> None:
super().__init__()
self.tokenizer = tokenizer
self.text_processor = self.tokenizer.get_text_processor()
self.label_processor = self.tokenizer.get_label_processor()
class TorchTextInbuiltClassificationDataSets(DataSets):
def __init__(self, tokenizer) -> None:
super().__init__(tokenizer)
def trec_split(self, overwrite_labels_by=dict(ABBR="urg", DESC="urg", ENTY="urg", HUM="urg", LOC="urg", NUM="urg"), **kwargs):
train_data_trec, test_data_trec = datasets.TREC.splits(self.text_processor, self.label_processor, fine_grained=False, root=cache_dir_for_torch_text, **kwargs)
train_data_trec, valid_data_trec = train_data_trec.split(random_state=random.seed(SEED))
for ex in (train_data_trec + valid_data_trec + test_data_trec):
ex.label = overwrite_labels_by.get(ex.label)
return train_data_trec, valid_data_trec, test_data_trec
def imdb_split(self, overwrite_labels_by=dict(pos="green", neg="amber"), **kwargs):
train_data_imdb, test_data_imdb = datasets.IMDB.splits(self.text_processor, self.label_processor, root=cache_dir_for_torch_text, **kwargs)
train_data_imdb, valid_data_imdb = train_data_imdb.split(random_state=random.seed(SEED))
for ex in (train_data_imdb + test_data_imdb + valid_data_imdb):
ex.label = overwrite_labels_by.get(ex.label)
return train_data_imdb, test_data_imdb, valid_data_imdb
class DomainSpecificClassificationDataSet(DataSets):
def __init__(self, tokenizer, path, format='json') -> None:
super().__init__(tokenizer)
loaded_tabular_dataset = data.TabularDataset(
path=path,
format=format,
fields=dict(text=('text', self.text_processor), label=('label', self.label_processor))
)
self.dataset = data.Dataset(
loaded_tabular_dataset.examples,
fields=dict(text=self.text_processor, label=self.label_processor)
)
class FiveClassesClassificationDataSet(DataSets):
def __init__(self, tokenizer, path, format='json') -> None:
super().__init__(tokenizer)
self.inbuilt_dataset = TorchTextInbuiltClassificationDataSets(tokenizer)
self.domain_dataset = DomainSpecificClassificationDataSet(tokenizer, path, format)
def get_merged_dataset(self):
logger.info("trec data loading.....")
train_data_trec, valid_data_trec, test_data_trec = self.inbuilt_dataset.trec_split()
logger.info("imdb data loading.....")
train_data_imdb, valid_data_imdb, test_data_imdb = self.inbuilt_dataset.imdb_split()
logger.info("merging data for training.....")
merged_train_data_examples_list = train_data_trec.examples + train_data_imdb.examples + self.domain_dataset.dataset.examples
merged_train_data = data.Dataset(merged_train_data_examples_list, fields=[("text", self.text_processor), ("label", self.label_processor)])
logger.info("merging data for validation.....")
merged_valid_data_examples_list = (valid_data_trec.examples + valid_data_imdb.examples)
merged_valid_data = data.Dataset(merged_valid_data_examples_list, fields=[("text", self.text_processor), ("label", self.label_processor)])
logger.info("merging data for testing.....")
merged_test_data_examples_list = (test_data_trec.examples + test_data_imdb.examples)
merged_test_data = data.Dataset(merged_test_data_examples_list, fields=[("text", self.text_processor), ("label", self.label_processor)])
return merged_train_data, merged_valid_data, merged_test_data
class FOUR(DataSets):
def __init__(self, tokenizer, path, format='json') -> None:
super().__init__(tokenizer)
self.inbuilt_dataset = TorchTextInbuiltClassificationDataSets(tokenizer)
self.domain_dataset = DomainSpecificClassificationDataSet(tokenizer, path, format)
def get_train_and_valid_datasets(self, train_ratio=0.8):
logger.info("get_train_and_valid_datasets.......")
logger.info("imdb data loading.....")
train_data_imdb, valid_data_imdb, _ = self.inbuilt_dataset.imdb_split()
total_examples = len(self.domain_dataset.dataset.examples)
training_examples_count = int(train_ratio * total_examples)
merged_train_data_examples_list = self.domain_dataset.dataset.examples[:training_examples_count] + train_data_imdb.examples
merged_train_data = data.Dataset(merged_train_data_examples_list, fields=[("text", self.text_processor), ("label", self.label_processor)])
merged_valid_data_examples_list = self.domain_dataset.dataset.examples[training_examples_count:] + valid_data_imdb.examples
merged_valid_data = data.Dataset(merged_valid_data_examples_list, fields=[("text", self.text_processor), ("label", self.label_processor)])
return merged_train_data, merged_valid_data
def get_train_and_valid_domain_dataset(self, train_ratio=0.8):
logger.info("get_train_and_valid_domain_dataset.......")
total_examples = len(self.domain_dataset.dataset.examples)
training_examples_count = int(train_ratio * total_examples)
merged_train_data_examples_list = self.domain_dataset.dataset.examples[:training_examples_count]
merged_train_data = data.Dataset(merged_train_data_examples_list, fields=[("text", self.text_processor), ("label", self.label_processor)])
merged_valid_data_examples_list = self.domain_dataset.dataset.examples[training_examples_count:]
merged_valid_data = data.Dataset(merged_valid_data_examples_list, fields=[("text", self.text_processor), ("label", self.label_processor)])
return merged_train_data, merged_valid_data
def get_train_and_valid_and_test_dataset_from_three(self, train_ratio=0.8):
logger.info("trec data loading.....")
train_data_trec, valid_data_trec, test_data_trec = self.inbuilt_dataset.trec_split(overwrite_labels_by=dict(ABBR="amber", DESC="amber", ENTY="amber", HUM="amber", LOC="amber", NUM="amber"))
logger.info("imdb data loading.....")
train_data_imdb, valid_data_imdb, test_data_imdb = self.inbuilt_dataset.imdb_split(overwrite_labels_by=dict(pos="green", neg="amber"))
total_examples = len(self.domain_dataset.dataset.examples)
training_examples_count = int(train_ratio * total_examples)
logger.info("merging data for training.....")
merged_train_data_examples_list = self.domain_dataset.dataset.examples[:training_examples_count] + train_data_trec.examples + valid_data_trec.examples
merged_train_data = data.Dataset(merged_train_data_examples_list, fields=[("text", self.text_processor), ("label", self.label_processor)])
logger.info("merging data for validation.....")
merged_valid_data_examples_list = self.domain_dataset.dataset.examples[training_examples_count:] + train_data_trec.examples + valid_data_imdb.examples
merged_valid_data = data.Dataset(merged_valid_data_examples_list, fields=[("text", self.text_processor), ("label", self.label_processor)])
logger.info("merging data for testing.....")
merged_test_data_examples_list = (test_data_trec.examples + test_data_imdb.examples)
merged_test_data = data.Dataset(merged_test_data_examples_list, fields=[("text", self.text_processor), ("label", self.label_processor)])
return merged_train_data, merged_valid_data, merged_test_data
| 55.532374 | 197 | 0.741288 | 983 | 7,719 | 5.403866 | 0.090539 | 0.060994 | 0.060241 | 0.04744 | 0.852221 | 0.818524 | 0.795181 | 0.771649 | 0.733998 | 0.697477 | 0 | 0.000915 | 0.150279 | 7,719 | 138 | 198 | 55.934783 | 0.808965 | 0 | 0 | 0.475248 | 0 | 0 | 0.070864 | 0.009846 | 0 | 0 | 0 | 0 | 0 | 1 | 0.108911 | false | 0 | 0.059406 | 0 | 0.277228 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
b6cbee7c8cfa8b586aab367761ec9cd47f437557 | 9,195 | py | Python | mlmachine/explore/eda_preprocessing.py | datablets/mlmachine | f9d98b3d5b1d7a81bd45a0624cb1202472742d6c | [
"MIT"
] | null | null | null | mlmachine/explore/eda_preprocessing.py | datablets/mlmachine | f9d98b3d5b1d7a81bd45a0624cb1202472742d6c | [
"MIT"
] | null | null | null | mlmachine/explore/eda_preprocessing.py | datablets/mlmachine | f9d98b3d5b1d7a81bd45a0624cb1202472742d6c | [
"MIT"
] | null | null | null | import numpy as np
import matplotlib.pyplot as plt
from scipy import stats
from scipy import special
from prettierplot.plotter import PrettierPlot
from prettierplot import style
def eda_missing_summary(self, training_data=True, color=style.style_grey, display_df=False, chart_scale=15):
"""
Documentation:
---
Description:
Creates vertical bar chart visualizing the percent of values missing for each feature.
Optionally displays the underlying Pandas DataFrame.
---
Parameters:
training_data : boolean, dafault=True
Controls which dataset (training or validation) is used for visualization.
color : str or color code, default=style.style_grey
Bar color.
display_df : boolean, default=False
Controls whether to display summary data in Pandas DataFrame in addition to chart.
chart_scale : int or float, default=15
Controls size and proportions of chart and chart elements. Higher value creates
larger plots and increases visual elements proportionally.
"""
# dynamically choose training data objects or validation data objects
data, _ = self.training_or_validation_dataset(training_data)
# return missingness summary
percent_missing = self.missing_summary(training_data)
# if missingness summary is not empty, create the visualization
if not percent_missing.empty:
# optionally display DataFrame summary
if display_df:
display(percent_missing)
# create prettierplot object
p = PrettierPlot(chart_scale=chart_scale, plot_orientation="wide_standard")
# add canvas to prettierplot object
ax = p.make_canvas(
title="Percent missing by feature",
y_shift=0.8,
title_scale=0.8,
)
# add vertical bar chart to canvas
p.bar_v(
x=percent_missing.index,
counts=percent_missing["Percent missing"],
label_rotate=45 if len(percent_missing.index) <=5 else 90,
color=color,
y_units="p",
x_tick_wrap=False,
ax=ax,
)
ax.set_ylim([0,100])
# if missingness summary is empty, just print "No Nulls"
else:
print("No nulls")
def eda_skew_summary(self, training_data=True, color=style.style_grey, display_df=False, chart_scale=15):
"""
Documentation:
---
Description:
Creates vertical bar chart visualizing the skew for each feature. Optionally
displaying the underlying Pandas DataFrame.
---
Parameters:
training_data : boolean, dafault=True
Controls which dataset (training or validation) is used for visualization.
color : str, color code, default=style.style_grey
Bar color.
display_df : boolean, default=False
Controls whether to display summary data in Pandas DataFrame along with chart.
chart_scale : int or float, default=15
Controls size and proportions of chart and chart elements. Higher value creates
larger plots and increases visual elements proportionally.
"""
# dynamically choose training data objects or validation data objects
data, _ = self.training_or_validation_dataset(training_data)
# return skewness summary
skew_summary = self.skew_summary(data)
# optionally display DataFrame summary
if display_df:
display(skew_summary)
# create prettierplot object
p = PrettierPlot(chart_scale=chart_scale, plot_orientation="wide_standard")
# add canvas to prettierplot object
ax = p.make_canvas(
title="Skew by feature",
y_shift=0.8,
title_scale=0.8,
)
# add vertical bar chart to canvas
p.bar_v(
x=skew_summary.index,
counts=skew_summary["Skew"],
label_rotate=45 if len(skew_summary.index) <=5 else 90,
color=color,
y_units="fff",
x_tick_wrap=False,
ax=ax,
)
def eda_transform_target(self, data, name, chart_scale=15):
"""
Documentation:
---
Description:
Creates a two_panel visualization. The left plot is the current distribution
overlayed on a normal distribution. The right plot is a qqplot overlayed
across a straight line.
---
Parameters:
data : Pandas Series
Target variable data object.
name : str
Name of target variable.
chart_scale : int or float, default=15
Controls size and proportions of chart and chart elements. Higher value
creates larger plots and increases visual elements proportionally.
"""
# create prettierplot object
p = PrettierPlot(chart_scale=chart_scale)
# add canvas to prettierplot object
ax = p.make_canvas(
title=f"dist/kde - {name} (initial)",
x_label="",
y_label="",
y_shift=0.8,
position=221,
)
# add distribution / kernel density plot to canvas
p.dist_plot(
data, color=style.style_grey, fit=stats.norm, x_rotate=True, ax=ax
)
# turn off x and y ticks
plt.xticks([])
plt.yticks([])
# add canvas to prettierplot object
ax = p.make_canvas(
title=f"probability plot - {name} (initial)",
x_label="",
y_label="",
y_shift=0.8,
position=222,
)
# add QQ / probability plot to canvas
p.prob_plot(data, plot=ax)
# turn off x and y ticks
plt.xticks([])
plt.yticks([])
def eda_transform_log1(self, data, name, chart_scale=15):
"""
Documentation:
---
Description:
Creates a two_panel visualization. The left plot is the log + 1 transformed
distribution overlayed on a normal distribution. The right plot is a log + 1
adjusted qqplot overlayed across a straight line.
---
Parameters:
data : Pandas Series
Target variable data object.
name : str
Name of target variable.
chart_scale : int or float, default=15
Controls size and proportions of chart and chart elements. Higher value creates
larger plots and increases visual elements proportionally.
"""
# create prettierplot object
p = PrettierPlot(chart_scale=chart_scale)
# add canvas to prettierplot object
ax = p.make_canvas(
title=f"dist/kde - {name} (log+1)",
x_label="",
y_label="",
y_shift=0.8,
position=223,
)
# add distribution / kernel density plot to canvas
p.dist_plot(
np.log1p(data), color=style.style_grey, fit=stats.norm, x_rotate=True, ax=ax
)
# turn off x and y ticks
plt.xticks([])
plt.yticks([])
# add canvas to prettierplot object
ax = p.make_canvas(
title=f"probability plot - {name} (log+1)",
x_label="",
y_label="",
y_shift=0.8,
position=224,
)
# add QQ / probability plot to canvas
p.prob_plot(np.log1p(data), plot=ax)
# turn off x and y ticks
plt.xticks([])
plt.yticks([])
def eda_transform_box_cox(self, data, name, lmbda, chart_scale=15):
"""
Documentation:
---
Description:
Creates a two_panel visualization. The left plot is the box-cox transformed
distribution overlayed on a normal distribution. The right plot is a Box-Cox
transformed qqplot overlayed across a straight line.
---
Parameters:
data : Pandas Series
Target variable data object.
name : str
Name of target variable.
lmbda : float
Box-Cox transformation parameter.
chart_scale : int or float, default=15
Controls size and proportions of chart and chart elements. Higher value
creates larger plots and increases visual elements proportionally.
"""
# create prettierplot object
p = PrettierPlot(chart_scale=chart_scale)
# add canvas to prettierplot object
ax = p.make_canvas(
title=f"dist/kde - {name} (box-cox, {lmbda})",
x_label="",
y_label="",
y_shift=0.8,
position=223,
)
# add distribution / kernel density plot to canvas
p.dist_plot(
special.boxcox1p(data, lmbda),
color=style.style_grey,
fit=stats.norm,
x_rotate=True,
ax=ax,
)
# turn off x and y ticks
plt.xticks([])
plt.yticks([])
# add canvas to prettierplot object
ax = p.make_canvas(
title=f"Probability plot - {name} (box-cox, {lmbda})",
x_label="",
y_label="",
y_shift=0.8,
position=224,
)
# add QQ / probability plot to canvas
p.prob_plot(special.boxcox1p(data, lmbda), plot=ax)
# turn off x and y ticks
plt.xticks([])
plt.yticks([]) | 30.752508 | 108 | 0.612942 | 1,110 | 9,195 | 4.961261 | 0.164865 | 0.036317 | 0.01598 | 0.033412 | 0.843835 | 0.828582 | 0.822045 | 0.822045 | 0.793354 | 0.786635 | 0 | 0.012774 | 0.310386 | 9,195 | 299 | 109 | 30.752508 | 0.855701 | 0.51354 | 0 | 0.531746 | 0 | 0 | 0.073255 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.039683 | false | 0 | 0.047619 | 0 | 0.087302 | 0.007937 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
b6f585127f16d778b6de545b938a133ab03dfa3d | 218 | py | Python | flloat/__init__.py | aadeshnpn/flloat | 5a84608400d401799421f872e561689e3159a513 | [
"MIT"
] | 3 | 2019-07-14T21:15:26.000Z | 2019-12-12T21:51:35.000Z | flloat/__init__.py | MarcoFavorito/flloat | 75e8ec9219763eba5feb362438604693b6cc7346 | [
"MIT"
] | 1 | 2019-09-03T16:35:59.000Z | 2019-09-03T16:35:59.000Z | flloat/__init__.py | MarcoFavorito/flloat | 75e8ec9219763eba5feb362438604693b6cc7346 | [
"MIT"
] | 1 | 2019-08-30T18:15:02.000Z | 2019-08-30T18:15:02.000Z | # -*- coding: utf-8 -*-
"""Top-level package for FLLOAT."""
from .__version__ import __title__, __description__, __url__, __version__
from .__version__ import __author__, __author_email__, __license__, __copyright__
| 31.142857 | 81 | 0.770642 | 23 | 218 | 5.521739 | 0.782609 | 0.173228 | 0.267717 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005181 | 0.114679 | 218 | 6 | 82 | 36.333333 | 0.65285 | 0.238532 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
f3fe6dff8cbe12d056eb00a46024eeb4ea5b5782 | 140 | py | Python | tests/tfTests/testPhrasePickle.py | Los-Phoenix/Word2vec_LP | c645bdc08bb81d5b70afe3f3e0b6a1549c1a1802 | [
"MIT"
] | 1 | 2018-09-09T08:34:02.000Z | 2018-09-09T08:34:02.000Z | tests/tfTests/testPhrasePickle.py | Los-Phoenix/Word2vec_LP | c645bdc08bb81d5b70afe3f3e0b6a1549c1a1802 | [
"MIT"
] | null | null | null | tests/tfTests/testPhrasePickle.py | Los-Phoenix/Word2vec_LP | c645bdc08bb81d5b70afe3f3e0b6a1549c1a1802 | [
"MIT"
] | null | null | null | import cPickle as pickle
f1 = file('../../data/phraseClusterMean.pkl', 'rb')
samP = pickle.load(f1)
samP = pickle.load(f1)
print len(samP) | 20 | 51 | 0.692857 | 21 | 140 | 4.619048 | 0.666667 | 0.206186 | 0.28866 | 0.329897 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.02439 | 0.121429 | 140 | 7 | 52 | 20 | 0.764228 | 0 | 0 | 0.4 | 0 | 0 | 0.241135 | 0.22695 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.2 | null | null | 0.2 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
edd6534f947a5f055da56faa7facea4d5c460cfa | 287 | py | Python | populus/cli/__init__.py | mandarvaze/populus | 4577bba67f6cba3c4800d250f2a228562a5d9a8f | [
"MIT"
] | 2 | 2018-08-15T21:27:59.000Z | 2018-08-21T17:56:12.000Z | populus/cli/__init__.py | mandarvaze/populus | 4577bba67f6cba3c4800d250f2a228562a5d9a8f | [
"MIT"
] | null | null | null | populus/cli/__init__.py | mandarvaze/populus | 4577bba67f6cba3c4800d250f2a228562a5d9a8f | [
"MIT"
] | 1 | 2021-12-06T04:03:32.000Z | 2021-12-06T04:03:32.000Z | from .main import main # NOQA
from .chain_cmd import chain_cmd # NOQA
from .compile_cmd import compile_cmd # NOQA
from .config_cmd import config_cmd # NOQA
from .deploy_cmd import deploy_cmd # NOQA
from .init_cmd import init_cmd # NOQA
from .upgrade_cmd import upgrade_cmd # NOQA
| 31.888889 | 44 | 0.780488 | 47 | 287 | 4.510638 | 0.234043 | 0.226415 | 0.259434 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.170732 | 287 | 8 | 45 | 35.875 | 0.890756 | 0.118467 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
edf702e72726cf88ac677e273a262f4b0465c870 | 3,080 | py | Python | kmip/core/factories/payloads/response.py | ondrap/PyKMIP | c8ea17d8faf827e0f9d004972835128a1a71569f | [
"Apache-2.0"
] | 179 | 2015-03-20T06:08:59.000Z | 2022-03-14T02:24:38.000Z | kmip/core/factories/payloads/response.py | imharshr/PyKMIP | 9403ff3d2aa83de4c786b8eedeb85d169fd4a594 | [
"Apache-2.0"
] | 600 | 2015-04-08T14:14:48.000Z | 2022-03-28T13:49:47.000Z | kmip/core/factories/payloads/response.py | imharshr/PyKMIP | 9403ff3d2aa83de4c786b8eedeb85d169fd4a594 | [
"Apache-2.0"
] | 131 | 2015-03-30T12:51:49.000Z | 2022-03-23T04:34:34.000Z | # Copyright (c) 2014 The Johns Hopkins University/Applied Physics Laboratory
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from kmip.core.factories.payloads import PayloadFactory
from kmip.core.messages import payloads
class ResponsePayloadFactory(PayloadFactory):
# TODO (peterhamilton) Alphabetize these
def _create_create_payload(self):
return payloads.CreateResponsePayload()
def _create_create_key_pair_payload(self):
return payloads.CreateKeyPairResponsePayload()
def _create_register_payload(self):
return payloads.RegisterResponsePayload()
def _create_derive_key_payload(self):
return payloads.DeriveKeyResponsePayload()
def _create_rekey_payload(self):
return payloads.RekeyResponsePayload()
def _create_rekey_key_pair_payload(self):
return payloads.RekeyKeyPairResponsePayload()
def _create_locate_payload(self):
return payloads.LocateResponsePayload()
def _create_check_payload(self):
return payloads.CheckResponsePayload()
def _create_get_payload(self):
return payloads.GetResponsePayload()
def _create_get_attribute_list_payload(self):
return payloads.GetAttributeListResponsePayload()
def _create_get_attributes_payload(self):
return payloads.GetAttributesResponsePayload()
def _create_delete_attribute_payload(self):
return payloads.DeleteAttributeResponsePayload()
def _create_set_attribute_payload(self):
return payloads.SetAttributeResponsePayload()
def _create_modify_attribute_payload(self):
return payloads.ModifyAttributeResponsePayload()
def _create_destroy_payload(self):
return payloads.DestroyResponsePayload()
def _create_query_payload(self):
return payloads.QueryResponsePayload()
def _create_discover_versions_payload(self):
return payloads.DiscoverVersionsResponsePayload()
def _create_activate_payload(self):
return payloads.ActivateResponsePayload()
def _create_revoke_payload(self):
return payloads.RevokeResponsePayload()
def _create_mac_payload(self):
return payloads.MACResponsePayload()
def _create_encrypt_payload(self):
return payloads.EncryptResponsePayload()
def _create_decrypt_payload(self):
return payloads.DecryptResponsePayload()
def _create_sign_payload(self):
return payloads.SignResponsePayload()
def _create_signature_verify_payload(self):
return payloads.SignatureVerifyResponsePayload()
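The factory above overrides one `_create_*_payload` hook per KMIP operation, and the base `PayloadFactory` dispatches to these hooks. A minimal, self-contained sketch of that dispatch pattern (hypothetical stand-in classes and a hypothetical `create` dispatcher, no PyKMIP dependency):

```python
from enum import Enum, auto

class Operation(Enum):
    CREATE = auto()
    GET = auto()

class CreateResponsePayload:  # stand-in for payloads.CreateResponsePayload
    pass

class GetResponsePayload:  # stand-in for payloads.GetResponsePayload
    pass

class PayloadFactory:
    # Base class: map each Operation to the matching _create_*_payload hook
    # that subclasses override.
    def create(self, operation):
        handlers = {
            Operation.CREATE: self._create_create_payload,
            Operation.GET: self._create_get_payload,
        }
        if operation not in handlers:
            raise ValueError("unsupported operation: {}".format(operation))
        return handlers[operation]()

class ResponsePayloadFactory(PayloadFactory):
    def _create_create_payload(self):
        return CreateResponsePayload()

    def _create_get_payload(self):
        return GetResponsePayload()

factory = ResponsePayloadFactory()
payload = factory.create(Operation.GET)
print(type(payload).__name__)  # GetResponsePayload
```

The real base class in `kmip.core.factories.payloads` performs the equivalent operation-to-hook mapping; the sketch only illustrates the shape of the pattern.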
# File: models/Users/Exceptions.py (repo: OkanyO/eden, license: MIT)
class IncompleteDetails(Exception):
def __init__(self, message):
super(IncompleteDetails, self).__init__(message)
class InvalidCustomer(Exception):
def __init__(self, message):
super(InvalidCustomer, self).__init__(message)
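Both classes simply forward a human-readable message to `Exception`. A self-contained demonstration (the classes are repeated so the snippet runs alone; the `validate` helper is hypothetical, invented to show when each exception would be raised):

```python
class IncompleteDetails(Exception):
    def __init__(self, message):
        super(IncompleteDetails, self).__init__(message)

class InvalidCustomer(Exception):
    def __init__(self, message):
        super(InvalidCustomer, self).__init__(message)

def validate(details):
    # Hypothetical validator illustrating typical use of the two exceptions.
    if "email" not in details:
        raise IncompleteDetails("email is required")
    if not details.get("customer_id"):
        raise InvalidCustomer("unknown customer")
    return True

try:
    validate({})
except IncompleteDetails as err:
    print(err)  # email is required
```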
# coding: utf-8
# File: tb_rest_client/rest_client_ce.py (repo: samson0v/python_tb_rest_client, license: Apache-2.0)
# Copyright 2020. ThingsBoard
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
from tb_rest_client.rest_client_base import *
logger = getLogger(__name__)
class RestClientCE(RestClientBase):
def __init__(self, base_url):
super().__init__(base_url)
# O Auth 2 Config Template Controller
def delete_client_registration_template(self, client_registration_template_id: EntityId):
client_registration_template_id = self.get_id(client_registration_template_id)
return self.o_auth2_config_template_controller.delete_client_registration_template_using_delete(
client_registration_template_id=client_registration_template_id)
def get_client_registration_templates(self, ):
return self.o_auth2_config_template_controller.get_client_registration_templates_using_get()
def save_client_registration_template(self, body=None):
return self.o_auth2_config_template_controller.save_client_registration_template_using_post(body=body)
# Asset Controller
def get_asset_info_by_id(self, asset_id: AssetId):
asset_id = self.get_id(asset_id)
return self.asset_controller.get_asset_info_by_id_using_get(asset_id=asset_id)
def delete_asset(self, asset_id: AssetId):
asset_id = self.get_id(asset_id)
return self.asset_controller.delete_asset_using_delete(asset_id=asset_id)
def assign_asset_to_edge(self, edge_id: EdgeId, asset_id: AssetId):
edge_id = self.get_id(edge_id)
asset_id = self.get_id(asset_id)
return self.asset_controller.assign_asset_to_edge_using_post(edge_id=edge_id, asset_id=asset_id)
def find_by_query(self, body=None):
return self.asset_controller.find_by_query_using_post(body=body)
def get_customer_asset_infos(self, customer_id: CustomerId, page_size: int, page: int, type=None, text_search=None,
sort_property=None, sort_order=None):
customer_id = self.get_id(customer_id)
return self.asset_controller.get_customer_asset_infos_using_get(customer_id=customer_id, page_size=page_size,
page=page, type=type, text_search=text_search,
sort_property=sort_property,
sort_order=sort_order)
def get_customer_assets(self, customer_id: CustomerId, page_size: int, page: int, type=None, text_search=None,
sort_property=None, sort_order=None):
customer_id = self.get_id(customer_id)
return self.asset_controller.get_customer_assets_using_get(customer_id=customer_id, page_size=page_size,
page=page, type=type, text_search=text_search,
sort_property=sort_property, sort_order=sort_order)
def get_tenant_asset_infos(self, page_size: int, page: int, type=None, text_search=None, sort_property=None,
sort_order=None):
return self.asset_controller.get_tenant_asset_infos_using_get(page_size=page_size, page=page, type=type,
text_search=text_search,
sort_property=sort_property,
sort_order=sort_order)
def process_assets_bulk_import(self, body=None):
return self.asset_controller.process_assets_bulk_import_using_post(body=body)
def unassign_asset_from_edge(self, edge_id: EdgeId, asset_id: AssetId):
edge_id = self.get_id(edge_id)
asset_id = self.get_id(asset_id)
return self.asset_controller.unassign_asset_from_edge_using_delete(edge_id=edge_id, asset_id=asset_id)
def get_tenant_asset(self, asset_name: str):
return self.asset_controller.get_tenant_asset_using_get(asset_name=asset_name)
def get_edge_assets(self, edge_id: EdgeId, page_size: int, page: int, type=None, text_search=None,
sort_property=None, sort_order=None, start_time=None, end_time=None):
edge_id = self.get_id(edge_id)
return self.asset_controller.get_edge_assets_using_get(edge_id=edge_id, page_size=page_size, page=page,
type=type, text_search=text_search,
sort_property=sort_property, sort_order=sort_order,
start_time=start_time, end_time=end_time)
def assign_asset_to_customer(self, customer_id: CustomerId, asset_id: AssetId):
customer_id = self.get_id(customer_id)
asset_id = self.get_id(asset_id)
return self.asset_controller.assign_asset_to_customer_using_post(customer_id=customer_id, asset_id=asset_id)
def get_asset_by_id(self, asset_id: AssetId):
asset_id = self.get_id(asset_id)
return self.asset_controller.get_asset_by_id_using_get(asset_id=asset_id)
def unassign_asset_from_customer(self, asset_id: AssetId):
asset_id = self.get_id(asset_id)
return self.asset_controller.unassign_asset_from_customer_using_delete(asset_id=asset_id)
def assign_asset_to_public_customer(self, asset_id: AssetId):
asset_id = self.get_id(asset_id)
return self.asset_controller.assign_asset_to_public_customer_using_post(asset_id=asset_id)
def get_assets_by_ids(self, asset_ids: list):
return self.asset_controller.get_assets_by_ids_using_get(asset_ids=asset_ids)
def save_asset(self, body=None):
return self.asset_controller.save_asset_using_post(body=body)
def get_tenant_assets(self, page_size: int, page: int, type=None, text_search=None, sort_property=None,
sort_order=None):
return self.asset_controller.get_tenant_assets_using_get(page_size=page_size, page=page, type=type,
text_search=text_search, sort_property=sort_property,
sort_order=sort_order)
def get_asset_types(self, ):
return self.asset_controller.get_asset_types_using_get()
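The listing endpoints in this controller all take `page_size`/`page` and return one page per call. A generic way to walk every page is sketched below, under assumptions: the response is a dict with `data` and `hasNext` keys (matching ThingsBoard's PageData JSON), and `fake_fetch` is a stub standing in for a call such as `get_tenant_assets`:

```python
def iter_pages(fetch, page_size=100):
    # Repeatedly call a page-oriented endpoint until hasNext is false,
    # yielding individual items.
    page = 0
    while True:
        result = fetch(page_size=page_size, page=page)
        for item in result["data"]:
            yield item
        if not result.get("hasNext"):
            break
        page += 1

def fake_fetch(page_size, page):
    # Stub serving two pages of fake asset names.
    pages = [{"data": ["a1", "a2"], "hasNext": True},
             {"data": ["a3"], "hasNext": False}]
    return pages[page]

print(list(iter_pages(fake_fetch, page_size=2)))  # ['a1', 'a2', 'a3']
```

With a real client, `fetch` would be a lambda closing over the controller call, e.g. `lambda page_size, page: client.get_tenant_assets(page_size=page_size, page=page)`, adapted to however the SDK models PageData.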
# Edge Controller
def get_tenant_edge_infos(self, page_size: int, page: int, type=None, text_search=None, sort_property=None,
sort_order=None):
return self.edge_controller.get_tenant_edge_infos_using_get(page_size=page_size, page=page, type=type,
text_search=text_search,
sort_property=sort_property, sort_order=sort_order)
def get_edge_info_by_id(self, edge_id: EdgeId):
edge_id = self.get_id(edge_id)
return self.edge_controller.get_edge_info_by_id_using_get(edge_id=edge_id)
def get_customer_edge_infos(self, customer_id: CustomerId, page_size: int, page: int, type=None, text_search=None,
sort_property=None, sort_order=None):
customer_id = self.get_id(customer_id)
return self.edge_controller.get_customer_edge_infos_using_get(customer_id=customer_id, page_size=page_size,
page=page, type=type, text_search=text_search,
sort_property=sort_property,
sort_order=sort_order)
def assign_edge_to_customer(self, customer_id: CustomerId, edge_id: EdgeId):
customer_id = self.get_id(customer_id)
edge_id = self.get_id(edge_id)
return self.edge_controller.assign_edge_to_customer_using_post(customer_id=customer_id, edge_id=edge_id)
def find_by_query_v2(self, body=None):
return self.edge_controller.find_by_query_using_post2(body=body)
def sync_edge(self, edge_id: EdgeId):
edge_id = self.get_id(edge_id)
return self.edge_controller.sync_edge_using_post(edge_id=edge_id)
def check_instance(self, body=None):
return self.edge_controller.check_instance_using_post(body=body)
def get_tenant_edges(self, page_size: int, page: int, type=None, text_search=None, sort_property=None,
sort_order=None):
return self.edge_controller.get_tenant_edges_using_get(page_size=page_size, page=page, type=type,
text_search=text_search, sort_property=sort_property,
sort_order=sort_order)
def find_missing_to_related_rule_chains(self, edge_id: EdgeId):
edge_id = self.get_id(edge_id)
return self.edge_controller.find_missing_to_related_rule_chains_using_get(edge_id=edge_id)
def get_customer_edges(self, customer_id: CustomerId, page_size: int, page: int, type=None, text_search=None,
sort_property=None, sort_order=None):
customer_id = self.get_id(customer_id)
return self.edge_controller.get_customer_edges_using_get(customer_id=customer_id, page_size=page_size,
page=page, type=type, text_search=text_search,
sort_property=sort_property, sort_order=sort_order)
def process_edges_bulk_import(self, body=None):
return self.edge_controller.process_edges_bulk_import_using_post(body=body)
def activate_instance(self, license_secret: str, release_date: str):
return self.edge_controller.activate_instance_using_post(license_secret=license_secret,
release_date=release_date)
def get_tenant_edge(self, edge_name: str):
return self.edge_controller.get_tenant_edge_using_get(edge_name=edge_name)
def get_edge_by_id(self, edge_id: EdgeId):
edge_id = self.get_id(edge_id)
return self.edge_controller.get_edge_by_id_using_get(edge_id=edge_id)
def delete_edge(self, edge_id: EdgeId):
edge_id = self.get_id(edge_id)
return self.edge_controller.delete_edge_using_delete(edge_id=edge_id)
def save_edge(self, body=None):
return self.edge_controller.save_edge_using_post(body=body)
def is_edges_support_enabled(self, ):
return self.edge_controller.is_edges_support_enabled_using_get()
def get_edges(self, page_size: int, page: int, text_search=None, sort_property=None, sort_order=None):
return self.edge_controller.get_edges_using_get(page_size=page_size, page=page, text_search=text_search,
sort_property=sort_property, sort_order=sort_order)
def unassign_edge_from_customer(self, edge_id: EdgeId):
edge_id = self.get_id(edge_id)
return self.edge_controller.unassign_edge_from_customer_using_delete(edge_id=edge_id)
def assign_edge_to_public_customer(self, edge_id: EdgeId):
edge_id = self.get_id(edge_id)
return self.edge_controller.assign_edge_to_public_customer_using_post(edge_id=edge_id)
def get_edge_types(self, ):
return self.edge_controller.get_edge_types_using_get()
def set_edge_root_rule_chain(self, edge_id: EdgeId, rule_chain_id: RuleChainId):
edge_id = self.get_id(edge_id)
rule_chain_id = self.get_id(rule_chain_id)
return self.edge_controller.set_edge_root_rule_chain_using_post(edge_id=edge_id, rule_chain_id=rule_chain_id)
def get_edges_by_ids(self, edge_ids: list):
return self.edge_controller.get_edges_by_ids_using_get(edge_ids=edge_ids)
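Nearly every wrapper above first normalizes its id argument through `self.get_id(...)`, so callers can pass either an id object or a raw string. The real helper lives in `RestClientBase`; the minimal version below is a hypothetical reconstruction of that normalization, with a stand-in `EntityId` class:

```python
class EntityId:
    # Hypothetical stand-in for the tb_rest_client id models (EdgeId, AssetId, ...).
    def __init__(self, id, entity_type):
        self.id = id
        self.entity_type = entity_type

def get_id(entity_id):
    # Accept either an id object carrying an `id` field or a raw UUID string.
    if hasattr(entity_id, "id"):
        return entity_id.id
    return entity_id

edge = EntityId("784f394c-42b6-435a-983c-b7beff2784f9", "EDGE")
print(get_id(edge))  # 784f394c-42b6-435a-983c-b7beff2784f9
```

The point of the pattern is that every public method accepts both forms, while the generated low-level controllers receive only the plain string id.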
# Rule Chain Controller
def export_rule_chains(self, limit: int):
return self.rule_chain_controller.export_rule_chains_using_get(limit=limit)
def save_rule_chain_meta_data(self, body=None, update_related=None):
return self.rule_chain_controller.save_rule_chain_meta_data_using_post(body=body, update_related=update_related)
def delete_rule_chain(self, rule_chain_id: RuleChainId):
rule_chain_id = self.get_id(rule_chain_id)
return self.rule_chain_controller.delete_rule_chain_using_delete(rule_chain_id=rule_chain_id)
def get_rule_chain_output_labels(self, rule_chain_id: RuleChainId):
rule_chain_id = self.get_id(rule_chain_id)
return self.rule_chain_controller.get_rule_chain_output_labels_using_get(rule_chain_id=rule_chain_id)
def set_edge_template_root_rule_chain(self, rule_chain_id: RuleChainId):
rule_chain_id = self.get_id(rule_chain_id)
return self.rule_chain_controller.set_edge_template_root_rule_chain_using_post(rule_chain_id=rule_chain_id)
def save_rule_chain(self, body=None):
return self.rule_chain_controller.save_rule_chain_using_post(body=body)
def assign_rule_chain_to_edge(self, edge_id: EdgeId, rule_chain_id: RuleChainId):
edge_id = self.get_id(edge_id)
rule_chain_id = self.get_id(rule_chain_id)
return self.rule_chain_controller.assign_rule_chain_to_edge_using_post(edge_id=edge_id,
rule_chain_id=rule_chain_id)
def unassign_rule_chain_from_edge(self, edge_id: EdgeId, rule_chain_id: RuleChainId):
edge_id = self.get_id(edge_id)
rule_chain_id = self.get_id(rule_chain_id)
return self.rule_chain_controller.unassign_rule_chain_from_edge_using_delete(edge_id=edge_id,
rule_chain_id=rule_chain_id)
def unset_auto_assign_to_edge_rule_chain(self, rule_chain_id: RuleChainId):
rule_chain_id = self.get_id(rule_chain_id)
return self.rule_chain_controller.unset_auto_assign_to_edge_rule_chain_using_delete(rule_chain_id=rule_chain_id)
def get_rule_chain_by_id(self, rule_chain_id: RuleChainId):
rule_chain_id = self.get_id(rule_chain_id)
return self.rule_chain_controller.get_rule_chain_by_id_using_get(rule_chain_id=rule_chain_id)
def test_script(self, body=None):
return self.rule_chain_controller.test_script_using_post(body=body)
def save_rule_chain_v1(self, body=None):
return self.rule_chain_controller.save_rule_chain_using_post1(body=body)
def get_edge_rule_chains(self, edge_id: EdgeId, page_size: int, page: int, text_search=None, sort_property=None,
sort_order=None):
edge_id = self.get_id(edge_id)
return self.rule_chain_controller.get_edge_rule_chains_using_get(edge_id=edge_id, page_size=page_size,
page=page, text_search=text_search,
sort_property=sort_property,
sort_order=sort_order)
def set_auto_assign_to_edge_rule_chain(self, rule_chain_id: RuleChainId):
rule_chain_id = self.get_id(rule_chain_id)
return self.rule_chain_controller.set_auto_assign_to_edge_rule_chain_using_post(rule_chain_id=rule_chain_id)
def import_rule_chains(self, body=None, overwrite=None):
return self.rule_chain_controller.import_rule_chains_using_post(body=body, overwrite=overwrite)
def set_root_rule_chain(self, rule_chain_id: RuleChainId):
rule_chain_id = self.get_id(rule_chain_id)
return self.rule_chain_controller.set_root_rule_chain_using_post(rule_chain_id=rule_chain_id)
def get_rule_chain_output_labels_usage(self, rule_chain_id: RuleChainId):
rule_chain_id = self.get_id(rule_chain_id)
return self.rule_chain_controller.get_rule_chain_output_labels_usage_using_get(rule_chain_id=rule_chain_id)
def get_rule_chains(self, page_size: int, page: int, type=None, text_search=None, sort_property=None,
sort_order=None):
return self.rule_chain_controller.get_rule_chains_using_get(page_size=page_size, page=page, type=type,
text_search=text_search,
sort_property=sort_property, sort_order=sort_order)
def get_auto_assign_to_edge_rule_chains(self, ):
return self.rule_chain_controller.get_auto_assign_to_edge_rule_chains_using_get()
def get_latest_rule_node_debug_input(self, rule_node_id: RuleNodeId):
rule_node_id = self.get_id(rule_node_id)
return self.rule_chain_controller.get_latest_rule_node_debug_input_using_get(rule_node_id=rule_node_id)
def get_rule_chain_meta_data(self, rule_chain_id: RuleChainId):
rule_chain_id = self.get_id(rule_chain_id)
return self.rule_chain_controller.get_rule_chain_meta_data_using_get(rule_chain_id=rule_chain_id)
# Auth Controller
def get_user(self, ):
return self.auth_controller.get_user_using_get()
def change_password(self, body=None):
return self.auth_controller.change_password_using_post(body=body)
def logout(self, ):
return self.auth_controller.logout_using_post()
def check_reset_token(self, reset_token: str):
return self.auth_controller.check_reset_token_using_get(reset_token=reset_token)
def reset_password(self, body=None):
return self.auth_controller.reset_password_using_post(body=body)
def activate_user(self, body=None, send_activation_mail=None):
return self.auth_controller.activate_user_using_post(body=body, send_activation_mail=send_activation_mail)
def get_user_password_policy(self, ):
return self.auth_controller.get_user_password_policy_using_get()
def check_activate_token(self, activate_token: str):
return self.auth_controller.check_activate_token_using_get(activate_token=activate_token)
def request_reset_password_by_email(self, body=None):
return self.auth_controller.request_reset_password_by_email_using_post(body=body)
# Event Controller
def get_events_post(self, tenant_id: TenantId, page_size: int, page: int, entity_type: str, entity_id: EntityId,
body=None, text_search=None, sort_property=None, sort_order=None, start_time=None,
end_time=None):
tenant_id = self.get_id(tenant_id)
entity_id = self.get_id(entity_id)
return self.event_controller.get_events_using_post(tenant_id=tenant_id, page_size=page_size, page=page,
entity_type=entity_type, entity_id=entity_id, body=body,
text_search=text_search, sort_property=sort_property,
sort_order=sort_order, start_time=start_time,
end_time=end_time)
def get_events_v1_get1(self, entity_type: str, entity_id: EntityId, event_type: str, tenant_id: TenantId,
page_size: int, page: int, text_search=None, sort_property=None, sort_order=None,
start_time=None, end_time=None):
entity_id = self.get_id(entity_id)
tenant_id = self.get_id(tenant_id)
return self.event_controller.get_events_using_get1(entity_type=entity_type, entity_id=entity_id,
event_type=event_type, tenant_id=tenant_id,
page_size=page_size, page=page, text_search=text_search,
sort_property=sort_property, sort_order=sort_order,
start_time=start_time, end_time=end_time)
def get_events_get(self, entity_type: str, entity_id: EntityId, tenant_id: TenantId, page_size: int, page: int,
text_search=None, sort_property=None, sort_order=None, start_time=None, end_time=None):
entity_id = self.get_id(entity_id)
tenant_id = self.get_id(tenant_id)
return self.event_controller.get_events_using_get(entity_type=entity_type, entity_id=entity_id,
tenant_id=tenant_id, page_size=page_size, page=page,
text_search=text_search, sort_property=sort_property,
sort_order=sort_order, start_time=start_time,
end_time=end_time)
# Telemetry Controller
def get_attribute_keys_by_scope(self, entity_type: str, entity_id: EntityId, scope: str):
entity_id = self.get_id(entity_id)
return self.telemetry_controller.get_attribute_keys_by_scope_using_get(entity_type=entity_type,
entity_id=entity_id, scope=scope)
def get_timeseries(self, entity_type: str, entity_id: EntityId, keys: str, start_ts: int, end_ts: int,
interval=None, limit=None, agg=None, order_by=None, use_strict_data_types=None):
entity_id = self.get_id(entity_id)
return self.telemetry_controller.get_timeseries_using_get(entity_type=entity_type, entity_id=entity_id,
keys=keys, start_ts=start_ts, end_ts=end_ts,
interval=interval, limit=limit, agg=agg,
order_by=order_by,
use_strict_data_types=use_strict_data_types)
def delete_device_attributes(self, device_id: DeviceId, scope: str, keys: str):
device_id = self.get_id(device_id)
return self.telemetry_controller.delete_device_attributes_using_delete(device_id=device_id, scope=scope,
keys=keys)
def delete_entity_timeseries(self, entity_type: str, entity_id: EntityId, keys: str, delete_all_data_for_keys=None,
start_ts=None, end_ts=None, rewrite_latest_if_deleted=None):
entity_id = self.get_id(entity_id)
return self.telemetry_controller.delete_entity_timeseries_using_delete(entity_type=entity_type,
entity_id=entity_id, keys=keys,
delete_all_data_for_keys=delete_all_data_for_keys,
start_ts=start_ts, end_ts=end_ts,
rewrite_latest_if_deleted=rewrite_latest_if_deleted)
def save_entity_attributes_v1(self, entity_type: str, entity_id: EntityId, scope: str, body=None):
entity_id = self.get_id(entity_id)
return self.telemetry_controller.save_entity_attributes_v1_using_post(entity_type=entity_type,
entity_id=entity_id, scope=scope,
body=body)
def save_device_attributes(self, device_id: DeviceId, scope: str, body=None):
device_id = self.get_id(device_id)
return self.telemetry_controller.save_device_attributes_using_post(device_id=device_id, scope=scope, body=body)
def get_latest_timeseries(self, entity_type: str, entity_id: EntityId, keys=None, use_strict_data_types=None):
entity_id = self.get_id(entity_id)
return self.telemetry_controller.get_latest_timeseries_using_get(entity_type=entity_type, entity_id=entity_id,
keys=keys,
use_strict_data_types=use_strict_data_types)
def get_timeseries_keys_v1(self, entity_type: str, entity_id: EntityId):
entity_id = self.get_id(entity_id)
return self.telemetry_controller.get_timeseries_keys_using_get1(entity_type=entity_type, entity_id=entity_id)
def get_attributes_by_scope(self, entity_type: str, entity_id: EntityId, scope: str, keys=None):
entity_id = self.get_id(entity_id)
return self.telemetry_controller.get_attributes_by_scope_using_get(entity_type=entity_type, entity_id=entity_id,
scope=scope, keys=keys)
def get_attribute_keys(self, entity_type: str, entity_id: EntityId):
entity_id = self.get_id(entity_id)
return self.telemetry_controller.get_attribute_keys_using_get(entity_type=entity_type, entity_id=entity_id)
def save_entity_attributes_v2(self, entity_type: str, entity_id: EntityId, scope: str, body=None):
entity_id = self.get_id(entity_id)
return self.telemetry_controller.save_entity_attributes_v2_using_post(entity_type=entity_type,
entity_id=entity_id, scope=scope,
body=body)
def save_entity_telemetry(self, entity_type: str, entity_id: EntityId, scope: str, body=None):
entity_id = self.get_id(entity_id)
return self.telemetry_controller.save_entity_telemetry_using_post(entity_type=entity_type, entity_id=entity_id,
scope=scope, body=body)
def save_entity_telemetry_with_ttl(self, entity_type: str, entity_id: EntityId, scope: str, ttl: int, body=None):
entity_id = self.get_id(entity_id)
return self.telemetry_controller.save_entity_telemetry_with_ttl_using_post(entity_type=entity_type,
entity_id=entity_id, scope=scope,
ttl=ttl, body=body)
def get_attributes(self, entity_type: str, entity_id: EntityId, keys=None):
entity_id = self.get_id(entity_id)
return self.telemetry_controller.get_attributes_using_get(entity_type=entity_type, entity_id=entity_id,
keys=keys)
def delete_entity_attributes(self, entity_type: str, entity_id: EntityId, scope: str, keys: str):
entity_id = self.get_id(entity_id)
return self.telemetry_controller.delete_entity_attributes_using_delete(entity_type=entity_type,
entity_id=entity_id, scope=scope,
keys=keys)
# Alarm Controller
def ack_alarm(self, alarm_id: AlarmId):
alarm_id = self.get_id(alarm_id)
return self.alarm_controller.ack_alarm_using_post(alarm_id=alarm_id)
def get_alarm_info_by_id(self, alarm_id: AlarmId):
alarm_id = self.get_id(alarm_id)
return self.alarm_controller.get_alarm_info_by_id_using_get(alarm_id=alarm_id)
def get_highest_alarm_severity(self, entity_type: str, entity_id: EntityId, search_status=None, status=None):
entity_id = self.get_id(entity_id)
return self.alarm_controller.get_highest_alarm_severity_using_get(entity_type=entity_type, entity_id=entity_id,
search_status=search_status, status=status)
def clear_alarm(self, alarm_id: AlarmId):
alarm_id = self.get_id(alarm_id)
return self.alarm_controller.clear_alarm_using_post(alarm_id=alarm_id)
def save_alarm(self, body=None):
return self.alarm_controller.save_alarm_using_post(body=body)
def get_alarms(self, entity_type: str, entity_id: EntityId, page_size: int, page: int, search_status=None,
status=None, text_search=None, sort_property=None, sort_order=None, start_time=None, end_time=None,
fetch_originator=None):
entity_id = self.get_id(entity_id)
return self.alarm_controller.get_alarms_using_get(entity_type=entity_type, entity_id=entity_id,
page_size=page_size, page=page, search_status=search_status,
status=status, text_search=text_search,
sort_property=sort_property, sort_order=sort_order,
start_time=start_time, end_time=end_time,
fetch_originator=fetch_originator)
def get_alarm_by_id(self, alarm_id: AlarmId):
alarm_id = self.get_id(alarm_id)
return self.alarm_controller.get_alarm_by_id_using_get(alarm_id=alarm_id)
def get_all_alarms(self, page_size: int, page: int, search_status=None, status=None, text_search=None,
sort_property=None, sort_order=None, start_time=None, end_time=None, fetch_originator=None):
return self.alarm_controller.get_all_alarms_using_get(page_size=page_size, page=page,
search_status=search_status, status=status,
text_search=text_search, sort_property=sort_property,
sort_order=sort_order, start_time=start_time,
end_time=end_time, fetch_originator=fetch_originator)
def delete_alarm(self, alarm_id: AlarmId):
alarm_id = self.get_id(alarm_id)
return self.alarm_controller.delete_alarm_using_delete(alarm_id=alarm_id)
# RPC v2 Controller
def get_persisted_rpc(self, rpc_id: RpcId):
rpc_id = self.get_id(rpc_id)
return self.rpc_v2_controller.get_persisted_rpc_using_get(rpc_id=rpc_id)
def handle_one_way_device_rpc_request_v1(self, device_id: DeviceId, body=None):
device_id = self.get_id(device_id)
return self.rpc_v2_controller.handle_one_way_device_rpc_request_using_post1(device_id=device_id, body=body)
def handle_two_way_device_rpc_request_v1(self, device_id: DeviceId, body=None):
device_id = self.get_id(device_id)
return self.rpc_v2_controller.handle_two_way_device_rpc_request_using_post1(device_id=device_id, body=body)
def get_persisted_rpc_by_device(self, device_id: DeviceId, page_size: int, page: int, rpc_status: str,
text_search=None, sort_property=None, sort_order=None):
device_id = self.get_id(device_id)
return self.rpc_v2_controller.get_persisted_rpc_by_device_using_get(device_id=device_id, page_size=page_size,
page=page, rpc_status=rpc_status,
text_search=text_search,
sort_property=sort_property,
sort_order=sort_order)
def delete_resource(self, rpc_id: RpcId):
rpc_id = self.get_id(rpc_id)
return self.rpc_v2_controller.delete_resource_using_delete(rpc_id=rpc_id)
# Edge Event Controller
def get_edge_events(self, edge_id: EdgeId, page_size: int, page: int, text_search=None, sort_property=None,
sort_order=None, start_time=None, end_time=None):
edge_id = self.get_id(edge_id)
return self.edge_event_controller.get_edge_events_using_get(edge_id=edge_id, page_size=page_size, page=page,
text_search=text_search,
sort_property=sort_property, sort_order=sort_order,
start_time=start_time, end_time=end_time)
# Customer Controller
def get_customer_title_by_id(self, customer_id: CustomerId):
customer_id = self.get_id(customer_id)
return self.customer_controller.get_customer_title_by_id_using_get(customer_id=customer_id)
def get_customers(self, page_size: int, page: int, text_search=None, sort_property=None, sort_order=None):
return self.customer_controller.get_customers_using_get(page_size=page_size, page=page, text_search=text_search,
sort_property=sort_property, sort_order=sort_order)
def get_customer_by_id(self, customer_id: CustomerId):
customer_id = self.get_id(customer_id)
return self.customer_controller.get_customer_by_id_using_get(customer_id=customer_id)
def get_short_customer_info_by_id(self, customer_id: CustomerId):
customer_id = self.get_id(customer_id)
return self.customer_controller.get_short_customer_info_by_id_using_get(customer_id=customer_id)
def save_customer(self, body=None):
return self.customer_controller.save_customer_using_post(body=body)
def get_tenant_customer(self, customer_title: str):
return self.customer_controller.get_tenant_customer_using_get(customer_title=customer_title)
def delete_customer(self, customer_id: CustomerId):
customer_id = self.get_id(customer_id)
return self.customer_controller.delete_customer_using_delete(customer_id=customer_id)
# User Controller
def get_user_token(self, user_id: UserId):
user_id = self.get_id(user_id)
return self.user_controller.get_user_token_using_get(user_id=user_id)
def get_activation_link(self, user_id: UserId):
user_id = self.get_id(user_id)
return self.user_controller.get_activation_link_using_get(user_id=user_id)
def delete_user(self, user_id: UserId):
user_id = self.get_id(user_id)
return self.user_controller.delete_user_using_delete(user_id=user_id)
def get_users(self, page_size: int, page: int, text_search=None, sort_property=None, sort_order=None):
return self.user_controller.get_users_using_get(page_size=page_size, page=page, text_search=text_search,
sort_property=sort_property, sort_order=sort_order)
def set_user_credentials_enabled(self, user_id: UserId, user_credentials_enabled=None):
user_id = self.get_id(user_id)
return self.user_controller.set_user_credentials_enabled_using_post(user_id=user_id,
user_credentials_enabled=user_credentials_enabled)
def get_customer_users(self, customer_id: CustomerId, page_size: int, page: int, text_search=None,
sort_property=None, sort_order=None):
customer_id = self.get_id(customer_id)
return self.user_controller.get_customer_users_using_get(customer_id=customer_id, page_size=page_size,
page=page, text_search=text_search,
sort_property=sort_property, sort_order=sort_order)
def get_user_by_id(self, user_id: UserId):
user_id = self.get_id(user_id)
return self.user_controller.get_user_by_id_using_get(user_id=user_id)
def get_tenant_admins(self, tenant_id: TenantId, page_size: int, page: int, text_search=None, sort_property=None,
sort_order=None):
tenant_id = self.get_id(tenant_id)
return self.user_controller.get_tenant_admins_using_get(tenant_id=tenant_id, page_size=page_size, page=page,
text_search=text_search, sort_property=sort_property,
sort_order=sort_order)
def is_user_token_access_enabled(self, ):
return self.user_controller.is_user_token_access_enabled_using_get()
def save_user(self, body=None, send_activation_mail=None):
return self.user_controller.save_user_using_post(body=body, send_activation_mail=send_activation_mail)
def send_activation_email(self, email: str):
return self.user_controller.send_activation_email_using_post(email=email)
# Queue Controller
def get_tenant_queues_by_service_type(self, service_type: str):
return self.queue_controller.get_tenant_queues_by_service_type_using_get(service_type=service_type)
# RPC v1 Controller
def handle_one_way_device_rpc_request(self, device_id: DeviceId, body=None):
device_id = self.get_id(device_id)
return self.rpc_v1_controller.handle_one_way_device_rpc_request_using_post(device_id=device_id, body=body)
def handle_two_way_device_rpc_request(self, device_id: DeviceId, body=None):
device_id = self.get_id(device_id)
return self.rpc_v1_controller.handle_two_way_device_rpc_request_using_post(device_id=device_id, body=body)
# Device Controller
def get_device_types(self):
return self.device_controller.get_device_types_using_get()
def process_devices_bulk_import(self, body=None):
return self.device_controller.process_devices_bulk_import_using_post(body=body)
def count_by_device_profile_and_empty_ota_package(self, ota_package_type: str, device_profile_id: DeviceProfileId):
device_profile_id = self.get_id(device_profile_id)
return self.device_controller.count_by_device_profile_and_empty_ota_package_using_get(
ota_package_type=ota_package_type, device_profile_id=device_profile_id)
def get_devices_by_ids(self, device_ids: list):
return self.device_controller.get_devices_by_ids_using_get(device_ids=device_ids)
def claim_device(self, device_name: str, body=None):
return self.device_controller.claim_device_using_post(device_name=device_name, body=body)
def save_device_with_credentials(self, body=None):
return self.device_controller.save_device_with_credentials_using_post(body=body)
def update_device_credentials(self, body=None):
return self.device_controller.update_device_credentials_using_post(body=body)
def save_device(self, body=None, access_token=None):
return self.device_controller.save_device_using_post(body=body, access_token=access_token)
def assign_device_to_public_customer(self, device_id: DeviceId):
device_id = self.get_id(device_id)
return self.device_controller.assign_device_to_public_customer_using_post(device_id=device_id)
def unassign_device_from_customer(self, device_id: DeviceId):
device_id = self.get_id(device_id)
return self.device_controller.unassign_device_from_customer_using_delete(device_id=device_id)
def get_device_by_id(self, device_id: DeviceId):
device_id = self.get_id(device_id)
return self.device_controller.get_device_by_id_using_get(device_id=device_id)
def get_tenant_device_infos(self, page_size: int, page: int, type=None, device_profile_id=None, text_search=None,
sort_property=None, sort_order=None):
device_profile_id = self.get_id(device_profile_id)
return self.device_controller.get_tenant_device_infos_using_get(page_size=page_size, page=page, type=type,
device_profile_id=device_profile_id,
text_search=text_search,
sort_property=sort_property,
sort_order=sort_order)
def get_customer_device_infos(self, customer_id: CustomerId, page_size: int, page: int, type=None,
device_profile_id=None, text_search=None, sort_property=None, sort_order=None):
customer_id = self.get_id(customer_id)
device_profile_id = self.get_id(device_profile_id)
return self.device_controller.get_customer_device_infos_using_get(customer_id=customer_id, page_size=page_size,
page=page, type=type,
device_profile_id=device_profile_id,
text_search=text_search,
sort_property=sort_property,
sort_order=sort_order)
def get_tenant_devices(self, page_size: int, page: int, type=None, text_search=None, sort_property=None,
sort_order=None):
return self.device_controller.get_tenant_devices_using_get(page_size=page_size, page=page, type=type,
text_search=text_search, sort_property=sort_property,
sort_order=sort_order)
def get_customer_devices(self, customer_id: CustomerId, page_size: int, page: int, type=None, text_search=None,
sort_property=None, sort_order=None):
customer_id = self.get_id(customer_id)
return self.device_controller.get_customer_devices_using_get(customer_id=customer_id, page_size=page_size,
page=page, type=type, text_search=text_search,
sort_property=sort_property, sort_order=sort_order)
def get_device_info_by_id(self, device_id: DeviceId):
device_id = self.get_id(device_id)
return self.device_controller.get_device_info_by_id_using_get(device_id=device_id)
def unassign_device_from_edge(self, edge_id: EdgeId, device_id: DeviceId):
edge_id = self.get_id(edge_id)
device_id = self.get_id(device_id)
return self.device_controller.unassign_device_from_edge_using_delete(edge_id=edge_id, device_id=device_id)
def assign_device_to_tenant(self, tenant_id: TenantId, device_id: DeviceId):
tenant_id = self.get_id(tenant_id)
device_id = self.get_id(device_id)
return self.device_controller.assign_device_to_tenant_using_post(tenant_id=tenant_id, device_id=device_id)
def find_by_query_v1(self, body=None):
return self.device_controller.find_by_query_using_post1(body=body)
def assign_device_to_edge(self, edge_id: EdgeId, device_id: DeviceId):
edge_id = self.get_id(edge_id)
device_id = self.get_id(device_id)
return self.device_controller.assign_device_to_edge_using_post(edge_id=edge_id, device_id=device_id)
def delete_device(self, device_id: DeviceId):
device_id = self.get_id(device_id)
return self.device_controller.delete_device_using_delete(device_id=device_id)
def re_claim_device(self, device_name: str):
return self.device_controller.re_claim_device_using_delete(device_name=device_name)
def assign_device_to_customer(self, customer_id: CustomerId, device_id: DeviceId):
customer_id = self.get_id(customer_id)
device_id = self.get_id(device_id)
return self.device_controller.assign_device_to_customer_using_post(customer_id=customer_id, device_id=device_id)
def get_edge_devices(self, edge_id: EdgeId, page_size: int, page: int, type=None, text_search=None,
sort_property=None, sort_order=None, start_time=None, end_time=None):
edge_id = self.get_id(edge_id)
return self.device_controller.get_edge_devices_using_get(edge_id=edge_id, page_size=page_size, page=page,
type=type, text_search=text_search,
sort_property=sort_property, sort_order=sort_order,
start_time=start_time, end_time=end_time)
def get_tenant_device(self, device_name: str):
return self.device_controller.get_tenant_device_using_get(device_name=device_name)
def get_device_credentials_by_device_id(self, device_id: DeviceId):
device_id = self.get_id(device_id)
return self.device_controller.get_device_credentials_by_device_id_using_get(device_id=device_id)
# Entity Relation Controller
def find_by_to_v1(self, to_id: EntityId, to_type: str, relation_type_group=None):
to_id = self.get_id(to_id)
return self.entity_relation_controller.find_by_to_using_get1(to_id=to_id, to_type=to_type,
relation_type_group=relation_type_group)
def find_info_by_to(self, to_id: EntityId, to_type: str, relation_type_group=None):
to_id = self.get_id(to_id)
return self.entity_relation_controller.find_info_by_to_using_get(to_id=to_id, to_type=to_type,
relation_type_group=relation_type_group)
def delete_relations(self, entity_id: EntityId, entity_type: str):
entity_id = self.get_id(entity_id)
return self.entity_relation_controller.delete_relations_using_delete(entity_id=entity_id,
entity_type=entity_type)
def delete_relation(self, from_id: EntityId, from_type: str, relation_type: str, to_id: EntityId, to_type: str,
relation_type_group=None):
from_id = self.get_id(from_id)
to_id = self.get_id(to_id)
return self.entity_relation_controller.delete_relation_using_delete(from_id=from_id, from_type=from_type,
relation_type=relation_type, to_id=to_id,
to_type=to_type,
relation_type_group=relation_type_group)
def find_by_from_v1(self, from_id: EntityId, from_type: str, relation_type_group=None):
from_id = self.get_id(from_id)
return self.entity_relation_controller.find_by_from_using_get1(from_id=from_id, from_type=from_type,
relation_type_group=relation_type_group)
def find_by_query_v3(self, body=None):
return self.entity_relation_controller.find_by_query_using_post3(body=body)
def find_info_by_query(self, body=None):
return self.entity_relation_controller.find_info_by_query_using_post(body=body)
def save_relation(self, body=None):
return self.entity_relation_controller.save_relation_using_post(body=body)
def find_by_to(self, to_id: EntityId, to_type: str, relation_type: str, relation_type_group=None):
to_id = self.get_id(to_id)
return self.entity_relation_controller.find_by_to_using_get(to_id=to_id, to_type=to_type,
relation_type=relation_type,
relation_type_group=relation_type_group)
def find_info_by_from(self, from_id: EntityId, from_type: str, relation_type_group=None):
from_id = self.get_id(from_id)
return self.entity_relation_controller.find_info_by_from_using_get(from_id=from_id, from_type=from_type,
relation_type_group=relation_type_group)
def get_relation(self, from_id: EntityId, from_type: str, relation_type: str, to_id: EntityId, to_type: str,
relation_type_group=None):
from_id = self.get_id(from_id)
to_id = self.get_id(to_id)
return self.entity_relation_controller.get_relation_using_get(from_id=from_id, from_type=from_type,
relation_type=relation_type, to_id=to_id,
to_type=to_type,
relation_type_group=relation_type_group)
def find_by_from(self, from_id: EntityId, from_type: str, relation_type: str, relation_type_group=None):
from_id = self.get_id(from_id)
return self.entity_relation_controller.find_by_from_using_get(from_id=from_id, from_type=from_type,
relation_type=relation_type,
relation_type_group=relation_type_group)
# Entity View Controller
def assign_entity_view_to_edge(self, edge_id: EdgeId, entity_view_id: EntityViewId):
edge_id = self.get_id(edge_id)
entity_view_id = self.get_id(entity_view_id)
return self.entity_view_controller.assign_entity_view_to_edge_using_post(edge_id=edge_id,
entity_view_id=entity_view_id)
def get_entity_view_types(self):
return self.entity_view_controller.get_entity_view_types_using_get()
def delete_entity_view(self, entity_view_id: EntityViewId):
entity_view_id = self.get_id(entity_view_id)
return self.entity_view_controller.delete_entity_view_using_delete(entity_view_id=entity_view_id)
def assign_entity_view_to_customer(self, customer_id: CustomerId, entity_view_id: EntityViewId):
customer_id = self.get_id(customer_id)
entity_view_id = self.get_id(entity_view_id)
return self.entity_view_controller.assign_entity_view_to_customer_using_post(customer_id=customer_id,
entity_view_id=entity_view_id)
def get_entity_view_by_id(self, entity_view_id: EntityViewId):
entity_view_id = self.get_id(entity_view_id)
return self.entity_view_controller.get_entity_view_by_id_using_get(entity_view_id=entity_view_id)
def get_customer_entity_view_infos(self, customer_id: CustomerId, page_size: int, page: int, type=None,
text_search=None, sort_property=None, sort_order=None):
customer_id = self.get_id(customer_id)
return self.entity_view_controller.get_customer_entity_view_infos_using_get(customer_id=customer_id,
page_size=page_size, page=page,
type=type, text_search=text_search,
sort_property=sort_property,
sort_order=sort_order)
def get_entity_view_info_by_id(self, entity_view_id: EntityViewId):
entity_view_id = self.get_id(entity_view_id)
return self.entity_view_controller.get_entity_view_info_by_id_using_get(entity_view_id=entity_view_id)
def get_tenant_entity_view_infos(self, page_size: int, page: int, type=None, text_search=None, sort_property=None,
sort_order=None):
return self.entity_view_controller.get_tenant_entity_view_infos_using_get(page_size=page_size, page=page,
type=type, text_search=text_search,
sort_property=sort_property,
sort_order=sort_order)
def get_tenant_entity_view(self, entity_view_name: str):
return self.entity_view_controller.get_tenant_entity_view_using_get(entity_view_name=entity_view_name)
def get_edge_entity_views(self, edge_id: EdgeId, page: int, page_size: int, type=None, text_search=None,
sort_property=None, sort_order=None, start_time=None, end_time=None):
edge_id = self.get_id(edge_id)
return self.entity_view_controller.get_edge_entity_views_using_get(edge_id=edge_id, page=page,
page_size=page_size, type=type,
text_search=text_search,
sort_property=sort_property,
sort_order=sort_order, start_time=start_time,
end_time=end_time)
def unassign_entity_view_from_customer(self, entity_view_id: EntityViewId):
entity_view_id = self.get_id(entity_view_id)
return self.entity_view_controller.unassign_entity_view_from_customer_using_delete(
entity_view_id=entity_view_id)
def save_entity_view(self, body=None):
return self.entity_view_controller.save_entity_view_using_post(body=body)
def unassign_entity_view_from_edge(self, edge_id: EdgeId, entity_view_id: EntityViewId):
edge_id = self.get_id(edge_id)
entity_view_id = self.get_id(entity_view_id)
return self.entity_view_controller.unassign_entity_view_from_edge_using_delete(edge_id=edge_id,
entity_view_id=entity_view_id)
def get_tenant_entity_views(self, page_size: int, page: int, type=None, text_search=None, sort_property=None,
sort_order=None):
return self.entity_view_controller.get_tenant_entity_views_using_get(page_size=page_size, page=page, type=type,
text_search=text_search,
sort_property=sort_property,
sort_order=sort_order)
def assign_entity_view_to_public_customer(self, entity_view_id: EntityViewId):
entity_view_id = self.get_id(entity_view_id)
return self.entity_view_controller.assign_entity_view_to_public_customer_using_post(
entity_view_id=entity_view_id)
def find_by_query_v4(self, body=None):
return self.entity_view_controller.find_by_query_using_post4(body=body)
def get_customer_entity_views(self, customer_id: CustomerId, page_size: int, page: int, type=None, text_search=None,
sort_property=None, sort_order=None):
customer_id = self.get_id(customer_id)
return self.entity_view_controller.get_customer_entity_views_using_get(customer_id=customer_id,
page_size=page_size, page=page,
type=type, text_search=text_search,
sort_property=sort_property,
sort_order=sort_order)
# Admin Controller
def get_admin_settings(self, key: str):
return self.admin_controller.get_admin_settings_using_get(key=key)
def check_updates(self):
return self.admin_controller.check_updates_using_get()
def send_test_sms(self, body=None):
return self.admin_controller.send_test_sms_using_post(body=body)
def get_security_settings(self):
return self.admin_controller.get_security_settings_using_get()
def send_test_mail(self, body=None):
return self.admin_controller.send_test_mail_using_post(body=body)
def save_admin_settings(self, body=None):
return self.admin_controller.save_admin_settings_using_post(body=body)
def save_security_settings(self, body=None):
return self.admin_controller.save_security_settings_using_post(body=body)
# Sign Up Controller
def get_recaptcha_public_key(self):
return self.sign_up_controller.get_recaptcha_public_key_using_get()
def sign_up(self, body=None):
return self.sign_up_controller.sign_up_using_post(body=body)
def accept_privacy_policy(self):
return self.sign_up_controller.accept_privacy_policy_using_post()
def resend_email_activation(self, email: str, pkg_name=None):
return self.sign_up_controller.resend_email_activation_using_post(email=email, pkg_name=pkg_name)
def activate_user_by_email_code(self, email_code: str, pkg_name=None):
return self.sign_up_controller.activate_user_by_email_code_using_post(email_code=email_code, pkg_name=pkg_name)
def delete_tenant_account(self):
return self.sign_up_controller.delete_tenant_account_using_delete()
def privacy_policy_accepted(self):
return self.sign_up_controller.privacy_policy_accepted_using_get()
def activate_email(self, email_code: str, pkg_name=None):
return self.sign_up_controller.activate_email_using_get(email_code=email_code, pkg_name=pkg_name)
def mobile_login(self, pkg_name: str):
return self.sign_up_controller.mobile_login_using_get(pkg_name=pkg_name)
# TB Resource Controller
def get_resource_info_by_id(self, resource_id: EntityId):
resource_id = self.get_id(resource_id)
return self.tb_resource_controller.get_resource_info_by_id_using_get(resource_id=resource_id)
def delete_resource_v1(self, resource_id: EntityId):
resource_id = self.get_id(resource_id)
return self.tb_resource_controller.delete_resource_using_delete1(resource_id=resource_id)
def get_resource_by_id(self, resource_id: EntityId):
resource_id = self.get_id(resource_id)
return self.tb_resource_controller.get_resource_by_id_using_get(resource_id=resource_id)
def save_resource(self, body=None):
return self.tb_resource_controller.save_resource_using_post(body=body)
def get_resources(self, page_size: int, page: int, text_search=None, sort_property=None, sort_order=None):
return self.tb_resource_controller.get_resources_using_get(page_size=page_size, page=page,
text_search=text_search, sort_property=sort_property,
sort_order=sort_order)
def get_lwm2m_list_objects(self, sort_order: str, sort_property: str, object_ids: list):
return self.tb_resource_controller.get_lwm2m_list_objects_using_get(sort_order=sort_order,
sort_property=sort_property,
object_ids=object_ids)
def download_resource(self, resource_id: EntityId):
resource_id = self.get_id(resource_id)
return self.tb_resource_controller.download_resource_using_get(resource_id=resource_id)
def get_lwm2m_list_objects_page(self, page_size: int, page: int, text_search=None, sort_property=None,
sort_order=None):
return self.tb_resource_controller.get_lwm2m_list_objects_page_using_get(page_size=page_size, page=page,
text_search=text_search,
sort_property=sort_property,
sort_order=sort_order)
# OAuth2 Controller
def get_login_processing_url(self):
return self.o_auth2_controller.get_login_processing_url_using_get()
def save_o_auth2_info(self, body=None):
return self.o_auth2_controller.save_o_auth2_info_using_post(body=body)
def get_o_auth2_clients(self, pkg_name=None, platform=None):
return self.o_auth2_controller.get_o_auth2_clients_using_post(pkg_name=pkg_name, platform=platform)
def get_current_o_auth2_info(self):
return self.o_auth2_controller.get_current_o_auth2_info_using_get()
# Tenant Profile Controller
def get_default_tenant_profile_info(self):
return self.tenant_profile_controller.get_default_tenant_profile_info_using_get()
def save_tenant_profile(self, body=None):
return self.tenant_profile_controller.save_tenant_profile_using_post(body=body)
def get_tenant_profiles(self, page_size: int, page: int, text_search=None, sort_property=None, sort_order=None):
return self.tenant_profile_controller.get_tenant_profiles_using_get(page_size=page_size, page=page,
text_search=text_search,
sort_property=sort_property,
sort_order=sort_order)
def delete_tenant_profile(self, tenant_profile_id: TenantProfileId):
tenant_profile_id = self.get_id(tenant_profile_id)
return self.tenant_profile_controller.delete_tenant_profile_using_delete(tenant_profile_id=tenant_profile_id)
def get_tenant_profile_info_by_id(self, tenant_profile_id: TenantProfileId):
tenant_profile_id = self.get_id(tenant_profile_id)
return self.tenant_profile_controller.get_tenant_profile_info_by_id_using_get(
tenant_profile_id=tenant_profile_id)
def get_tenant_profile_by_id(self, tenant_profile_id: TenantProfileId):
tenant_profile_id = self.get_id(tenant_profile_id)
return self.tenant_profile_controller.get_tenant_profile_by_id_using_get(tenant_profile_id=tenant_profile_id)
def set_default_tenant_profile(self, tenant_profile_id: TenantProfileId):
tenant_profile_id = self.get_id(tenant_profile_id)
return self.tenant_profile_controller.set_default_tenant_profile_using_post(tenant_profile_id=tenant_profile_id)
def get_tenant_profile_infos(self, page_size: int, page: int, text_search=None, sort_property=None,
sort_order=None):
return self.tenant_profile_controller.get_tenant_profile_infos_using_get(page_size=page_size, page=page,
text_search=text_search,
sort_property=sort_property,
sort_order=sort_order)
# Widgets Bundle Controller
def get_widgets_bundle_by_id(self, widgets_bundle_id: WidgetsBundleId):
widgets_bundle_id = self.get_id(widgets_bundle_id)
return self.widgets_bundle_controller.get_widgets_bundle_by_id_using_get(widgets_bundle_id=widgets_bundle_id)
def save_widgets_bundle(self, body=None):
return self.widgets_bundle_controller.save_widgets_bundle_using_post(body=body)
def get_widgets_bundles_v1(self, page_size: int, page: int, text_search=None, sort_property=None, sort_order=None):
return self.widgets_bundle_controller.get_widgets_bundles_using_get1(page_size=page_size, page=page,
text_search=text_search,
sort_property=sort_property,
sort_order=sort_order)
def delete_widgets_bundle(self, widgets_bundle_id: WidgetsBundleId):
widgets_bundle_id = self.get_id(widgets_bundle_id)
return self.widgets_bundle_controller.delete_widgets_bundle_using_delete(widgets_bundle_id=widgets_bundle_id)
def get_widgets_bundles(self):
return self.widgets_bundle_controller.get_widgets_bundles_using_get()
# Device Profile Controller
def get_device_profile_infos(self, page_size: int, page: int, text_search=None, sort_property=None, sort_order=None,
transport_type=None):
return self.device_profile_controller.get_device_profile_infos_using_get(page_size=page_size, page=page,
text_search=text_search,
sort_property=sort_property,
sort_order=sort_order,
transport_type=transport_type)
def set_default_device_profile(self, device_profile_id: DeviceProfileId):
device_profile_id = self.get_id(device_profile_id)
return self.device_profile_controller.set_default_device_profile_using_post(device_profile_id=device_profile_id)
def get_attributes_keys(self, device_profile_id=None):
device_profile_id = self.get_id(device_profile_id)
return self.device_profile_controller.get_attributes_keys_using_get(device_profile_id=device_profile_id)
def delete_device_profile(self, device_profile_id: DeviceProfileId):
device_profile_id = self.get_id(device_profile_id)
return self.device_profile_controller.delete_device_profile_using_delete(device_profile_id=device_profile_id)
def save_device_profile(self, body=None):
return self.device_profile_controller.save_device_profile_using_post(body=body)
def get_default_device_profile_info(self):
return self.device_profile_controller.get_default_device_profile_info_using_get()
def get_timeseries_keys(self, device_profile_id=None):
device_profile_id = self.get_id(device_profile_id)
return self.device_profile_controller.get_timeseries_keys_using_get(device_profile_id=device_profile_id)
def get_device_profile_info_by_id(self, device_profile_id: DeviceProfileId):
device_profile_id = self.get_id(device_profile_id)
return self.device_profile_controller.get_device_profile_info_by_id_using_get(
device_profile_id=device_profile_id)
def get_device_profiles(self, page_size: int, page: int, text_search=None, sort_property=None, sort_order=None):
return self.device_profile_controller.get_device_profiles_using_get(page_size=page_size, page=page,
text_search=text_search,
sort_property=sort_property,
sort_order=sort_order)
def get_device_profile_by_id(self, device_profile_id: DeviceProfileId):
device_profile_id = self.get_id(device_profile_id)
return self.device_profile_controller.get_device_profile_by_id_using_get(device_profile_id=device_profile_id)
# Dashboard Controller
def add_dashboard_customers(self, dashboard_id: DashboardId, body=None):
dashboard_id = self.get_id(dashboard_id)
return self.dashboard_controller.add_dashboard_customers_using_post(dashboard_id=dashboard_id, body=body)
def assign_dashboard_to_edge(self, edge_id: EdgeId, dashboard_id: DashboardId):
edge_id = self.get_id(edge_id)
dashboard_id = self.get_id(dashboard_id)
return self.dashboard_controller.assign_dashboard_to_edge_using_post(edge_id=edge_id, dashboard_id=dashboard_id)
def remove_dashboard_customers(self, dashboard_id: DashboardId, body=None):
dashboard_id = self.get_id(dashboard_id)
return self.dashboard_controller.remove_dashboard_customers_using_post(dashboard_id=dashboard_id, body=body)
def get_server_time(self):
return self.dashboard_controller.get_server_time_using_get()
def get_dashboard_by_id(self, dashboard_id: DashboardId):
dashboard_id = self.get_id(dashboard_id)
return self.dashboard_controller.get_dashboard_by_id_using_get(dashboard_id=dashboard_id)
def assign_dashboard_to_public_customer(self, dashboard_id: DashboardId):
dashboard_id = self.get_id(dashboard_id)
return self.dashboard_controller.assign_dashboard_to_public_customer_using_post(dashboard_id=dashboard_id)
def delete_dashboard(self, dashboard_id: DashboardId):
dashboard_id = self.get_id(dashboard_id)
return self.dashboard_controller.delete_dashboard_using_delete(dashboard_id=dashboard_id)
def update_dashboard_customers(self, dashboard_id: DashboardId, body=None):
dashboard_id = self.get_id(dashboard_id)
return self.dashboard_controller.update_dashboard_customers_using_post(dashboard_id=dashboard_id, body=body)
def unassign_dashboard_from_public_customer(self, dashboard_id: DashboardId):
dashboard_id = self.get_id(dashboard_id)
return self.dashboard_controller.unassign_dashboard_from_public_customer_using_delete(dashboard_id=dashboard_id)
def save_dashboard(self, body=None):
return self.dashboard_controller.save_dashboard_using_post(body=body)
def get_home_dashboard_info(self):
return self.dashboard_controller.get_home_dashboard_info_using_get()
def get_tenant_home_dashboard_info(self):
return self.dashboard_controller.get_tenant_home_dashboard_info_using_get()
def get_tenant_dashboards_v1(self, tenant_id: TenantId, page_size: int, page: int, text_search=None,
sort_property=None, sort_order=None):
tenant_id = self.get_id(tenant_id)
return self.dashboard_controller.get_tenant_dashboards_using_get1(tenant_id=tenant_id, page_size=page_size,
page=page, text_search=text_search,
sort_property=sort_property,
sort_order=sort_order)
def get_dashboard_info_by_id(self, dashboard_id: DashboardId):
dashboard_id = self.get_id(dashboard_id)
return self.dashboard_controller.get_dashboard_info_by_id_using_get(dashboard_id=dashboard_id)
def unassign_dashboard_from_edge(self, edge_id: EdgeId, dashboard_id: DashboardId):
edge_id = self.get_id(edge_id)
dashboard_id = self.get_id(dashboard_id)
return self.dashboard_controller.unassign_dashboard_from_edge_using_delete(edge_id=edge_id,
dashboard_id=dashboard_id)
def get_home_dashboard(self):
return self.dashboard_controller.get_home_dashboard_using_get()
def get_max_datapoints_limit(self):
return self.dashboard_controller.get_max_datapoints_limit_using_get()
def get_tenant_dashboards(self, page_size: int, page: int, mobile=None, text_search=None, sort_property=None,
sort_order=None):
return self.dashboard_controller.get_tenant_dashboards_using_get(page_size=page_size, page=page, mobile=mobile,
text_search=text_search,
sort_property=sort_property,
sort_order=sort_order)
def get_customer_dashboards(self, customer_id: CustomerId, page_size: int, page: int, mobile=None, text_search=None,
sort_property=None, sort_order=None):
customer_id = self.get_id(customer_id)
return self.dashboard_controller.get_customer_dashboards_using_get(customer_id=customer_id, page_size=page_size,
page=page, mobile=mobile,
text_search=text_search,
sort_property=sort_property,
sort_order=sort_order)
def assign_dashboard_to_customer(self, customer_id: CustomerId, dashboard_id: DashboardId):
customer_id = self.get_id(customer_id)
dashboard_id = self.get_id(dashboard_id)
return self.dashboard_controller.assign_dashboard_to_customer_using_post(customer_id=customer_id,
dashboard_id=dashboard_id)
def set_tenant_home_dashboard_info(self, body=None):
return self.dashboard_controller.set_tenant_home_dashboard_info_using_post(body=body)
def get_edge_dashboards(self, edge_id: EdgeId, page_size: int, page: int, text_search=None, sort_property=None,
sort_order=None):
edge_id = self.get_id(edge_id)
return self.dashboard_controller.get_edge_dashboards_using_get(edge_id=edge_id, page_size=page_size, page=page,
text_search=text_search,
sort_property=sort_property,
sort_order=sort_order)
def unassign_dashboard_from_customer(self, customer_id: CustomerId, dashboard_id: DashboardId):
customer_id = self.get_id(customer_id)
dashboard_id = self.get_id(dashboard_id)
return self.dashboard_controller.unassign_dashboard_from_customer_using_delete(customer_id=customer_id,
dashboard_id=dashboard_id)
# Entity Query Controller
def count_entities_by_query(self, body=None):
return self.entity_query_controller.count_entities_by_query_using_post(body=body)
def find_entity_timeseries_and_attributes_keys_by_query(self, timeseries: bool, attributes: bool, body=None):
return self.entity_query_controller.find_entity_timeseries_and_attributes_keys_by_query_using_post(
timeseries=timeseries, attributes=attributes, body=body)
def find_alarm_data_by_query(self, body=None):
return self.entity_query_controller.find_alarm_data_by_query_using_post(body=body)
def find_entity_data_by_query(self, body=None):
return self.entity_query_controller.find_entity_data_by_query_using_post(body=body)
# Widget Type Controller
def get_bundle_widget_types_infos(self, is_system: bool, bundle_alias: str):
return self.widget_type_controller.get_bundle_widget_types_infos_using_get(is_system=is_system,
bundle_alias=bundle_alias)
def get_bundle_widget_types_details(self, is_system: bool, bundle_alias: str):
return self.widget_type_controller.get_bundle_widget_types_details_using_get(is_system=is_system,
bundle_alias=bundle_alias)
def delete_widget_type(self, widget_type_id: WidgetTypeId):
widget_type_id = self.get_id(widget_type_id)
return self.widget_type_controller.delete_widget_type_using_delete(widget_type_id=widget_type_id)
def save_widget_type(self, body=None):
return self.widget_type_controller.save_widget_type_using_post(body=body)
def get_bundle_widget_types(self, is_system: bool, bundle_alias: str):
return self.widget_type_controller.get_bundle_widget_types_using_get(is_system=is_system,
bundle_alias=bundle_alias)
def get_widget_type(self, is_system: bool, bundle_alias: str, alias: str):
return self.widget_type_controller.get_widget_type_using_get(is_system=is_system, bundle_alias=bundle_alias,
alias=alias)
def get_widget_type_by_id(self, widget_type_id: WidgetTypeId):
widget_type_id = self.get_id(widget_type_id)
return self.widget_type_controller.get_widget_type_by_id_using_get(widget_type_id=widget_type_id)
# Audit Log Controller
def get_audit_logs_by_customer_id(self, customer_id: CustomerId, page_size: int, page: int, text_search=None,
sort_property=None, sort_order=None, start_time=None, end_time=None,
action_types=None):
customer_id = self.get_id(customer_id)
return self.audit_log_controller.get_audit_logs_by_customer_id_using_get(customer_id=customer_id,
page_size=page_size, page=page,
text_search=text_search,
sort_property=sort_property,
sort_order=sort_order,
start_time=start_time,
end_time=end_time,
action_types=action_types)
def get_audit_logs_by_user_id(self, user_id: UserId, page_size: int, page: int, text_search=None,
sort_property=None, sort_order=None, start_time=None, end_time=None,
action_types=None):
user_id = self.get_id(user_id)
return self.audit_log_controller.get_audit_logs_by_user_id_using_get(user_id=user_id, page_size=page_size,
page=page, text_search=text_search,
sort_property=sort_property,
sort_order=sort_order,
start_time=start_time, end_time=end_time,
action_types=action_types)
def get_audit_logs_by_entity_id(self, entity_type: str, entity_id: EntityId, page_size: int, page: int,
text_search=None, sort_property=None, sort_order=None, start_time=None,
end_time=None, action_types=None):
entity_id = self.get_id(entity_id)
return self.audit_log_controller.get_audit_logs_by_entity_id_using_get(entity_type=entity_type,
entity_id=entity_id, page_size=page_size,
page=page, text_search=text_search,
sort_property=sort_property,
sort_order=sort_order,
start_time=start_time, end_time=end_time,
action_types=action_types)
def get_audit_logs(self, page_size: int, page: int, text_search=None, sort_property=None, sort_order=None,
start_time=None, end_time=None, action_types=None):
return self.audit_log_controller.get_audit_logs_using_get(page_size=page_size, page=page,
text_search=text_search, sort_property=sort_property,
sort_order=sort_order, start_time=start_time,
end_time=end_time, action_types=action_types)
# LwM2M Controller
def get_lwm2m_bootstrap_security_info(self, is_bootstrap_server: bool):
return self.lwm2m_controller.get_lwm2m_bootstrap_security_info_using_get(
is_bootstrap_server=is_bootstrap_server)
# UI Settings Controller
def get_help_base_url(self):
return self.ui_settings_controller.get_help_base_url_using_get()
# Component Descriptor Controller
def get_component_descriptors_by_types(self, component_types: str, rule_chain_type=None):
return self.component_descriptor_controller.get_component_descriptors_by_types_using_get(
component_types=component_types, rule_chain_type=rule_chain_type)
def get_component_descriptor_by_clazz(self, component_descriptor_clazz: str):
return self.component_descriptor_controller.get_component_descriptor_by_clazz_using_get(
component_descriptor_clazz=component_descriptor_clazz)
def get_component_descriptors_by_type(self, component_type: str, rule_chain_type=None):
return self.component_descriptor_controller.get_component_descriptors_by_type_using_get(
component_type=component_type, rule_chain_type=rule_chain_type)
# Tenant Controller
def get_tenant_infos(self, page_size: int, page: int, text_search=None, sort_property=None, sort_order=None):
return self.tenant_controller.get_tenant_infos_using_get(page_size=page_size, page=page,
text_search=text_search, sort_property=sort_property,
sort_order=sort_order)
def get_tenant_by_id(self, tenant_id: TenantId):
tenant_id = self.get_id(tenant_id)
return self.tenant_controller.get_tenant_by_id_using_get(tenant_id=tenant_id)
def save_tenant(self, body=None):
return self.tenant_controller.save_tenant_using_post(body=body)
def get_tenants(self, page_size: int, page: int, text_search=None, sort_property=None, sort_order=None):
return self.tenant_controller.get_tenants_using_get(page_size=page_size, page=page, text_search=text_search,
sort_property=sort_property, sort_order=sort_order)
def get_tenant_info_by_id(self, tenant_id: TenantId):
tenant_id = self.get_id(tenant_id)
return self.tenant_controller.get_tenant_info_by_id_using_get(tenant_id=tenant_id)
def delete_tenant(self, tenant_id: TenantId):
tenant_id = self.get_id(tenant_id)
return self.tenant_controller.delete_tenant_using_delete(tenant_id=tenant_id)
# OTA Package Controller
def delete_ota_package(self, ota_package_id: OtaPackageId):
ota_package_id = self.get_id(ota_package_id)
return self.ota_package_controller.delete_ota_package_using_delete(ota_package_id=ota_package_id)
def get_ota_packages_v1(self, device_profile_id: DeviceProfileId, type: str, page_size: int, page: int,
text_search=None, sort_property=None, sort_order=None):
device_profile_id = self.get_id(device_profile_id)
return self.ota_package_controller.get_ota_packages_using_get1(device_profile_id=device_profile_id, type=type,
page_size=page_size, page=page,
text_search=text_search,
sort_property=sort_property,
sort_order=sort_order)
def save_ota_package_data(self, checksum_algorithm: str, ota_package_id: OtaPackageId, body=None, checksum=None):
ota_package_id = self.get_id(ota_package_id)
return self.ota_package_controller.save_ota_package_data_using_post(checksum_algorithm=checksum_algorithm,
ota_package_id=ota_package_id, body=body,
checksum=checksum)
def save_ota_package_info(self, body=None):
return self.ota_package_controller.save_ota_package_info_using_post(body=body)
def get_ota_packages(self, page_size: int, page: int, text_search=None, sort_property=None, sort_order=None):
return self.ota_package_controller.get_ota_packages_using_get(page_size=page_size, page=page,
text_search=text_search,
sort_property=sort_property,
sort_order=sort_order)
def get_ota_package_by_id(self, ota_package_id: OtaPackageId):
ota_package_id = self.get_id(ota_package_id)
return self.ota_package_controller.get_ota_package_by_id_using_get(ota_package_id=ota_package_id)
def download_ota_package(self, ota_package_id: OtaPackageId):
ota_package_id = self.get_id(ota_package_id)
return self.ota_package_controller.download_ota_package_using_get(ota_package_id=ota_package_id)
def get_ota_package_info_by_id(self, ota_package_id: OtaPackageId):
ota_package_id = self.get_id(ota_package_id)
return self.ota_package_controller.get_ota_package_info_by_id_using_get(ota_package_id=ota_package_id)

# ===== core/nn/blocks/__init__.py | repo: rdbch/tutorial_simclr | license: MIT =====
from .linear_block import LinearBlock
from .conv2d_block import Conv2dBlock
from .resblock import ResBlock

# ===== cringebot/__init__.py | repo: tolfino/cringebot | license: WTFPL =====
from .bot import main

# ===== src/gym_tictictoe/envs/__init__.py | repo: qorrect/boardgames | license: MIT =====
from .tictictoe import TicTicToe

# ===== 7_stg/model/utils/train_utils.py | repo: mackelab/IdentifyMechanisticModels_2020 | license: MIT =====
import numpy as np
import sys

from scipy.special import expit, logit

sys.path.append("model/setup")
import netio

# Originally: from delfi.utils.utils_prinzetal import inv_logistic_fct, logistic_fct
# Assumed equivalent to scipy's logistic/inverse-logistic pair, which the
# *_newTF variants below use directly.
logistic_fct = expit
inv_logistic_fct = logit
def load_trn_data(filedir, params):
# loading data. Files in the archive are 'params' and 'stats'
data = np.load(filedir) # samples_dir = results/samples/
# 7 parameters: Na+ current, CaT current (T-type Calcium, low-threshold), CaS current, A current (transient potassium current), KCa current, Kd current, H current (hyperpolarization current)
sample_params = data["params"] # there are 7 parameters in the network
sample_stats = data["stats"] # there are 15 summary_stats in 'PrinzStats' (see the params variable above).
# These 15 stats can be seen in summstats.py. They are: cycle_period, burst_length*3, end_to_start*2, start_to_end*2, duty_cycle*3, phase_gap*2, phase*2
prior = netio.create_prior(params, log=True)
sample_params = (sample_params - prior.lower) / (prior.upper - prior.lower)
sample_params = inv_logistic_fct(np.asarray(sample_params))
# normalize data
params_mean = np.mean(sample_params, axis=0)
params_std = np.std(sample_params, axis=0)
sample_params = (sample_params - params_mean) / params_std
# extract number of training samples
sample_params_pilot = sample_params[:params.pilot_samples]
sample_stats_pilot = sample_stats[:params.pilot_samples]
sample_params_train = sample_params[params.pilot_samples:params.pilot_samples + params.n_train]
sample_stats_train = sample_stats[params.pilot_samples:params.pilot_samples + params.n_train]
pilot_data = (sample_params_pilot, sample_stats_pilot)
    trn_data = [sample_params_train, sample_stats_train]  # parameters already logit-transformed and z-scored above
return pilot_data, trn_data, params_mean, params_std
from scipy.special import expit, logit
def load_trn_data_newTF(filedir, params):
# loading data. Files in the archive are 'params' and 'stats'
data = np.load(filedir) # samples_dir = results/samples/
# 7 parameters: Na+ current, CaT current (T-type Calcium, low-threshold), CaS current, A current (transient potassium current), KCa current, Kd current, H current (hyperpolarization current)
sample_params = data["params"] # there are 7 parameters in the network
sample_stats = data["stats"] # there are 15 summary_stats in 'PrinzStats' (see the params variable above).
# These 15 stats can be seen in summstats.py. They are: cycle_period, burst_length*3, end_to_start*2, start_to_end*2, duty_cycle*3, phase_gap*2, phase*2
prior = netio.create_prior(params, log=True)
lower = np.asarray(prior.lower)
upper = np.asarray(prior.upper)
inputscale = lambda x: (x - lower) / (upper - lower)
bijection = lambda x: logit(inputscale(x)) # logit function with scaled input
sample_params = bijection(sample_params)
# normalize data
params_mean = np.mean(sample_params, axis=0)
params_std = np.std(sample_params, axis=0)
sample_params = (sample_params - params_mean) / params_std
# extract number of training samples
sample_params_pilot = sample_params[:params.pilot_samples]
sample_stats_pilot = sample_stats[:params.pilot_samples]
sample_params_train = sample_params[params.pilot_samples:params.pilot_samples + params.n_train]
sample_stats_train = sample_stats[params.pilot_samples:params.pilot_samples + params.n_train]
pilot_data = (sample_params_pilot, sample_stats_pilot)
    trn_data = [sample_params_train, sample_stats_train]  # parameters already logit-transformed and z-scored above
return pilot_data, trn_data, params_mean, params_std
def load_trn_data_normalize(filedir, params):
# loading data. Files in the archive are 'params' and 'stats'
data = np.load(filedir) # samples_dir = results/samples/
# 7 parameters: Na+ current, CaT current (T-type Calcium, low-threshold), CaS current, A current (transient potassium current), KCa current, Kd current, H current (hyperpolarization current)
sample_params = data["params"] # there are 7 parameters in the network
sample_stats = data["stats"] # there are 15 summary_stats in 'PrinzStats' (see the params variable above).
# These 15 stats can be seen in summstats.py. They are: cycle_period, burst_length*3, end_to_start*2, start_to_end*2, duty_cycle*3, phase_gap*2, phase*2
prior = netio.create_prior(params, log=True)
# normalize data
params_mean = prior.mean
params_std = prior.std
sample_params = (sample_params - params_mean) / params_std
# extract number of training samples
sample_params_pilot = sample_params[:params.pilot_samples]
sample_stats_pilot = sample_stats[:params.pilot_samples]
sample_params_train = sample_params[params.pilot_samples:params.pilot_samples + params.n_train]
sample_stats_train = sample_stats[params.pilot_samples:params.pilot_samples + params.n_train]
pilot_data = (sample_params_pilot, sample_stats_pilot)
    trn_data = [sample_params_train, sample_stats_train]  # parameters already normalized with the prior mean/std above
return pilot_data, trn_data, params_mean, params_std
def forward_tf(cond_params, prior=None, params_mean=None, params_std=None, steps='111'):
if steps[0] == '1': cond_params = (cond_params - prior.lower) / (prior.upper - prior.lower)
if steps[1] == '1': cond_params = inv_logistic_fct(np.asarray(cond_params))
if steps[2] == '1': cond_params = (cond_params - params_mean) / params_std
return cond_params
def forward_tf_newTF(cond_params, prior=None):
lower = np.asarray(prior.lower)
upper = np.asarray(prior.upper)
inputscale = lambda x: (x - lower) / (upper - lower)
bijection = lambda x: logit(inputscale(x)) # logit function with scaled input
cond_params = bijection(cond_params)
return cond_params
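The scaled-logit bijection used by `forward_tf_newTF` is invertible via the logistic function; a self-contained round-trip check using numpy equivalents of scipy's `logit`/`expit` (the bounds here are made-up placeholders, not the actual Prinz-model prior):

```python
import numpy as np

# Placeholder prior bounds (illustrative only).
lower = np.array([1.0, 10.0])
upper = np.array([5.0, 100.0])

def forward(x):
    s = (x - lower) / (upper - lower)   # scale into (0, 1)
    return np.log(s / (1.0 - s))        # logit: map to unconstrained space

def backward(z):
    s = 1.0 / (1.0 + np.exp(-z))        # expit, the inverse of logit
    return lower + s * (upper - lower)  # rescale back to the prior box

x = np.array([2.0, 40.0])
x_back = backward(forward(x))
```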
def load_pair(filedir, **kwargs):
data_xo = np.load(filedir)
xo_params1 = data_xo["params1"]
xo_stats1 = data_xo["summstats1"]
xo_params1 = forward_tf(xo_params1, **kwargs)
xo_params2 = data_xo["params2"]
xo_stats2 = data_xo["summstats2"]
xo_params2 = forward_tf(xo_params2, **kwargs)
return xo_params1, xo_stats1, xo_params2, xo_stats2
def load_pair_newTF(filedir, **kwargs):
data_xo = np.load(filedir)
xo_params1 = data_xo["params1"]
xo_stats1 = data_xo["summstats1"]
xo_params1 = forward_tf_newTF(xo_params1, **kwargs)
xo_params2 = data_xo["params2"]
xo_stats2 = data_xo["summstats2"]
xo_params2 = forward_tf_newTF(xo_params2, **kwargs)
return xo_params1, xo_stats1, xo_params2, xo_stats2
def load_pair_normalize(filedir, prior):
params_mean = prior.mean
params_std = prior.std
data_xo = np.load(filedir)
xo_params1 = data_xo["params1"]
xo_stats1 = data_xo["summstats1"]
xo_params1 = (xo_params1 - params_mean) / params_std
xo_params2 = data_xo["params2"]
xo_stats2 = data_xo["summstats2"]
xo_params2 = (xo_params2 - params_mean) / params_std
return xo_params1, xo_stats1, xo_params2, xo_stats2
def load_single_sample_normalize(filedir, prior, log_synapses=False):
params_mean = prior.mean
params_std = prior.std
data_xo = np.load(filedir)
xo_params1 = data_xo["params1"]
xo_stats1 = data_xo["summstats1"]
if log_synapses:
xo_params1[-17:-10] = np.log(xo_params1[-17:-10])
xo_params1 = (xo_params1 - params_mean) / params_std
return xo_params1, xo_stats1
def load_single_sample(filedir, **kwargs):
data_xo = np.load(filedir)
xo_params1 = data_xo["params1"]
xo_stats1 = data_xo["summstats1"]
xo_params1 = forward_tf(xo_params1, **kwargs)
return xo_params1, xo_stats1
def save_samples(sample_params, sample_stats, sample_seed, index_fast, save_samples, case='fast'):
counter = 0
for inspect_num in range(len(save_samples)):
if save_samples[inspect_num]:
outfile_pair = '../../results/observations/31D/' + case + '_sample_{}.npz'.format(counter)
np.savez_compressed(outfile_pair, index1=int(index_fast[inspect_num]),
params1=sample_params[int(index_fast[inspect_num])],
summstats1=sample_stats[int(index_fast[inspect_num])],
seed1=sample_seed[int(index_fast[inspect_num])])
            counter += 1
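`save_samples` writes compressed `.npz` archives that the loaders above read back with `np.load`; a minimal round-trip sketch with the same archive keys, using a temporary file and made-up arrays:

```python
import os
import tempfile

import numpy as np

params = np.arange(5.0)
stats = np.ones(3)

with tempfile.TemporaryDirectory() as tmp:
    # Same archive keys as save_samples above.
    path = os.path.join(tmp, "fast_sample_0.npz")
    np.savez_compressed(path, index1=0, params1=params, summstats1=stats, seed1=42)

    data = np.load(path)
    loaded_params = data["params1"]
    loaded_seed = int(data["seed1"])
    data.close()
```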

# ===== tests/test_mongo_controller_audit.py | repo: lucafaggianelli/layabase | license: MIT =====
import datetime
import pytest
import layabase
import layabase.mongo
from layabase.testing import mock_mongo_audit_datetime
@pytest.fixture
def controller() -> layabase.CRUDController:
class TestCollection:
__collection_name__ = "test"
key = layabase.mongo.Column(str, is_primary_key=True)
mandatory = layabase.mongo.Column(int, is_nullable=False)
optional = layabase.mongo.Column(str)
controller = layabase.CRUDController(TestCollection, audit=True)
layabase.load("mongomock?ssl=True", [controller], replicaSet="globaldb")
return controller
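The audit assertions below all rely on one audit row being recorded per write, sharing a single collection-wide, monotonically increasing `revision` counter. A toy model of that bookkeeping (an illustration, not layabase's actual implementation):

```python
import datetime

class AuditTrail:
    """Toy stand-in for layabase's audit collection (not its real implementation)."""

    def __init__(self):
        self.rows = []

    def record(self, action, document):
        # One audit row per write; the revision counter is shared by the whole
        # collection, which is why the tests below expect 1, 2, 3, ...
        self.rows.append({
            "audit_action": action,
            "audit_date_utc": datetime.datetime(2018, 10, 11, 15, 5, 5, 663000).isoformat(),
            "audit_user": "",
            "revision": len(self.rows) + 1,
            **document,
        })

trail = AuditTrail()
trail.record("Insert", {"key": "my_key", "mandatory": 1})
trail.record("Update", {"key": "my_key", "mandatory": 2})
```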
def test_get_all_without_data_returns_empty_list(controller: layabase.CRUDController):
assert controller.get({}) == []
assert controller.get_audit({}) == []
def test_audit_table_name_is_forbidden():
class TestCollection:
__collection_name__ = "audit"
key = layabase.mongo.Column(str)
with pytest.raises(Exception) as exception_info:
layabase.load(
"mongomock?ssl=True",
[layabase.CRUDController(TestCollection)],
replicaSet="globaldb",
)
assert "audit is a reserved collection name." == str(exception_info.value)
def test_audited_table_name_is_forbidden():
class TestCollection:
__collection_name__ = "audit_test"
key = layabase.mongo.Column(str)
with pytest.raises(Exception) as exception_info:
layabase.load(
"mongomock?ssl=True",
[layabase.CRUDController(TestCollection)],
replicaSet="globaldb",
)
assert str(exception_info.value) == "audit_test is a reserved collection name."
def test_post_with_nothing_is_invalid(controller: layabase.CRUDController):
with pytest.raises(layabase.ValidationFailed) as exception_info:
controller.post(None)
assert exception_info.value.errors == {"": ["No data provided."]}
assert not exception_info.value.received_data
assert controller.get_audit({}) == []
def test_post_many_with_nothing_is_invalid(controller: layabase.CRUDController):
with pytest.raises(layabase.ValidationFailed) as exception_info:
controller.post_many(None)
assert exception_info.value.errors == {"": ["No data provided."]}
assert exception_info.value.received_data == []
assert controller.get_audit({}) == []
def test_post_with_empty_dict_is_invalid(controller: layabase.CRUDController):
with pytest.raises(layabase.ValidationFailed) as exception_info:
controller.post({})
assert {
"key": ["Missing data for required field."],
"mandatory": ["Missing data for required field."],
} == exception_info.value.errors
assert exception_info.value.received_data == {}
assert controller.get_audit({}) == []
def test_post_many_with_empty_list_is_invalid(controller: layabase.CRUDController):
with pytest.raises(layabase.ValidationFailed) as exception_info:
controller.post_many([])
assert exception_info.value.errors == {"": ["No data provided."]}
assert exception_info.value.received_data == []
assert controller.get_audit({}) == []
def test_put_with_nothing_is_invalid(controller: layabase.CRUDController):
with pytest.raises(layabase.ValidationFailed) as exception_info:
controller.put(None)
assert exception_info.value.errors == {"": ["No data provided."]}
assert not exception_info.value.received_data
assert controller.get_audit({}) == []
def test_put_with_empty_dict_is_invalid(controller: layabase.CRUDController):
with pytest.raises(layabase.ValidationFailed) as exception_info:
controller.put({})
assert exception_info.value.errors == {"key": ["Missing data for required field."]}
assert exception_info.value.received_data == {}
assert controller.get_audit({}) == []
def test_delete_without_nothing_do_not_fail(controller: layabase.CRUDController):
assert controller.delete({}) == 0
assert controller.get_audit({}) == []
def test_post_without_mandatory_field_is_invalid(controller: layabase.CRUDController):
with pytest.raises(layabase.ValidationFailed) as exception_info:
controller.post({"key": "my_key"})
assert exception_info.value.errors == {
"mandatory": ["Missing data for required field."]
}
assert exception_info.value.received_data == {"key": "my_key"}
assert controller.get_audit({}) == []
def test_post_many_without_mandatory_field_is_invalid(controller: layabase.CRUDController):
with pytest.raises(layabase.ValidationFailed) as exception_info:
controller.post_many([{"key": "my_key"}])
assert exception_info.value.errors == {
0: {"mandatory": ["Missing data for required field."]}
}
assert exception_info.value.received_data == [{"key": "my_key"}]
assert controller.get_audit({}) == []
def test_post_without_key_is_invalid(controller: layabase.CRUDController):
with pytest.raises(layabase.ValidationFailed) as exception_info:
controller.post({"mandatory": 1})
assert exception_info.value.errors == {"key": ["Missing data for required field."]}
assert exception_info.value.received_data == {"mandatory": 1}
assert controller.get_audit({}) == []
def test_post_many_without_key_is_invalid(controller: layabase.CRUDController):
with pytest.raises(layabase.ValidationFailed) as exception_info:
controller.post_many([{"mandatory": 1}])
assert exception_info.value.errors == {
0: {"key": ["Missing data for required field."]}
}
assert exception_info.value.received_data == [{"mandatory": 1}]
assert controller.get_audit({}) == []
def test_post_with_wrong_type_is_invalid(controller: layabase.CRUDController):
with pytest.raises(layabase.ValidationFailed) as exception_info:
controller.post({"key": datetime.date(2007, 12, 5), "mandatory": 1})
assert exception_info.value.errors == {"key": ["Not a valid str."]}
assert exception_info.value.received_data == {
"key": datetime.date(2007, 12, 5),
"mandatory": 1,
}
assert controller.get_audit({}) == []
def test_post_many_with_wrong_type_is_invalid(controller: layabase.CRUDController):
with pytest.raises(layabase.ValidationFailed) as exception_info:
controller.post_many([{"key": datetime.date(2007, 12, 5), "mandatory": 1}])
assert exception_info.value.errors == {0: {"key": ["Not a valid str."]}}
assert exception_info.value.received_data == [
{"key": datetime.date(2007, 12, 5), "mandatory": 1}
]
assert controller.get_audit({}) == []
def test_put_with_wrong_type_is_invalid(controller: layabase.CRUDController, mock_mongo_audit_datetime):
controller.post({"key": "value1", "mandatory": 1})
with pytest.raises(layabase.ValidationFailed) as exception_info:
controller.put({"key": "value1", "mandatory": "invalid_value"})
assert exception_info.value.errors == {"mandatory": ["Not a valid int."]}
assert exception_info.value.received_data == {
"key": "value1",
"mandatory": "invalid_value",
}
assert controller.get_audit({}) == [
{
"audit_action": "Insert",
"audit_date_utc": "2018-10-11T15:05:05.663000",
"audit_user": "",
"key": "value1",
"mandatory": 1,
"optional": None,
"revision": 1,
}
]
def test_post_without_optional_is_valid(controller: layabase.CRUDController, mock_mongo_audit_datetime):
assert controller.post({"key": "my_key", "mandatory": 1}) == {
"optional": None,
"mandatory": 1,
"key": "my_key",
}
assert controller.get_audit({}) == [
{
"audit_action": "Insert",
"audit_date_utc": "2018-10-11T15:05:05.663000",
"audit_user": "",
"key": "my_key",
"mandatory": 1,
"optional": None,
"revision": 1,
}
]
def test_post_many_without_optional_is_valid(controller: layabase.CRUDController, mock_mongo_audit_datetime):
assert controller.post_many([{"key": "my_key", "mandatory": 1}]) == [
{"optional": None, "mandatory": 1, "key": "my_key"}
]
assert controller.get_audit({}) == [
{
"audit_action": "Insert",
"audit_date_utc": "2018-10-11T15:05:05.663000",
"audit_user": "",
"key": "my_key",
"mandatory": 1,
"optional": None,
"revision": 1,
}
]
def test_put_many_is_valid(controller: layabase.CRUDController, mock_mongo_audit_datetime):
controller.post_many(
[{"key": "my_key", "mandatory": 1}, {"key": "my_key2", "mandatory": 2}]
)
controller.put_many(
[{"key": "my_key", "optional": "test"}, {"key": "my_key2", "mandatory": 3}]
)
assert controller.get_audit({}) == [
{
"audit_action": "Insert",
"audit_date_utc": "2018-10-11T15:05:05.663000",
"audit_user": "",
"key": "my_key",
"mandatory": 1,
"optional": None,
"revision": 1,
},
{
"audit_action": "Insert",
"audit_date_utc": "2018-10-11T15:05:05.663000",
"audit_user": "",
"key": "my_key2",
"mandatory": 2,
"optional": None,
"revision": 2,
},
{
"audit_action": "Update",
"audit_date_utc": "2018-10-11T15:05:05.663000",
"audit_user": "",
"key": "my_key",
"mandatory": 1,
"optional": "test",
"revision": 3,
},
{
"audit_action": "Update",
"audit_date_utc": "2018-10-11T15:05:05.663000",
"audit_user": "",
"key": "my_key2",
"mandatory": 3,
"optional": None,
"revision": 4,
},
]
def test_post_with_optional_is_valid(controller: layabase.CRUDController, mock_mongo_audit_datetime):
assert controller.post(
{"key": "my_key", "mandatory": 1, "optional": "my_value"}
) == {"mandatory": 1, "key": "my_key", "optional": "my_value"}
assert controller.get_audit({}) == [
{
"audit_action": "Insert",
"audit_date_utc": "2018-10-11T15:05:05.663000",
"audit_user": "",
"key": "my_key",
"mandatory": 1,
"optional": "my_value",
"revision": 1,
}
]
def test_post_many_with_optional_is_valid(controller: layabase.CRUDController, mock_mongo_audit_datetime):
assert controller.post_many(
[{"key": "my_key", "mandatory": 1, "optional": "my_value"}]
) == [{"mandatory": 1, "key": "my_key", "optional": "my_value"}]
assert controller.get_audit({}) == [
{
"audit_action": "Insert",
"audit_date_utc": "2018-10-11T15:05:05.663000",
"audit_user": "",
"key": "my_key",
"mandatory": 1,
"optional": "my_value",
"revision": 1,
}
]
def test_post_with_unknown_field_is_valid(controller: layabase.CRUDController, mock_mongo_audit_datetime):
assert controller.post(
{
"key": "my_key",
"mandatory": 1,
"optional": "my_value",
            # This field does not exist in the schema
"unknown": "my_value",
}
) == {"optional": "my_value", "mandatory": 1, "key": "my_key"}
assert controller.get_audit({}) == [
{
"audit_action": "Insert",
"audit_date_utc": "2018-10-11T15:05:05.663000",
"audit_user": "",
"key": "my_key",
"mandatory": 1,
"optional": "my_value",
"revision": 1,
}
]
def test_post_many_with_unknown_field_is_valid(controller: layabase.CRUDController, mock_mongo_audit_datetime):
assert controller.post_many(
[
{
"key": "my_key",
"mandatory": 1,
"optional": "my_value",
                # This field does not exist in the schema
"unknown": "my_value",
}
]
) == [{"optional": "my_value", "mandatory": 1, "key": "my_key"}]
assert controller.get_audit({}) == [
{
"audit_action": "Insert",
"audit_date_utc": "2018-10-11T15:05:05.663000",
"audit_user": "",
"key": "my_key",
"mandatory": 1,
"optional": "my_value",
"revision": 1,
}
]
def test_get_without_filter_is_retrieving_the_only_item(
controller: layabase.CRUDController, mock_mongo_audit_datetime
):
controller.post({"key": "my_key1", "mandatory": 1, "optional": "my_value1"})
assert controller.get({}) == [
{"mandatory": 1, "optional": "my_value1", "key": "my_key1"}
]
assert controller.get_audit({}) == [
{
"audit_action": "Insert",
"audit_date_utc": "2018-10-11T15:05:05.663000",
"audit_user": "",
"key": "my_key1",
"mandatory": 1,
"optional": "my_value1",
"revision": 1,
}
]
def test_get_without_filter_is_retrieving_everything_with_multiple_posts(
controller: layabase.CRUDController, mock_mongo_audit_datetime
):
controller.post({"key": "my_key1", "mandatory": 1, "optional": "my_value1"})
controller.post({"key": "my_key2", "mandatory": 2, "optional": "my_value2"})
assert controller.get({}) == [
{"key": "my_key1", "mandatory": 1, "optional": "my_value1"},
{"key": "my_key2", "mandatory": 2, "optional": "my_value2"},
]
assert controller.get_audit({}) == [
{
"audit_action": "Insert",
"audit_date_utc": "2018-10-11T15:05:05.663000",
"audit_user": "",
"key": "my_key1",
"mandatory": 1,
"optional": "my_value1",
"revision": 1,
},
{
"audit_action": "Insert",
"audit_date_utc": "2018-10-11T15:05:05.663000",
"audit_user": "",
"key": "my_key2",
"mandatory": 2,
"optional": "my_value2",
"revision": 2,
},
]
def test_get_without_filter_is_retrieving_everything(
controller: layabase.CRUDController, mock_mongo_audit_datetime
):
controller.post_many(
[
{"key": "my_key1", "mandatory": 1, "optional": "my_value1"},
{"key": "my_key2", "mandatory": 2, "optional": "my_value2"},
]
)
assert controller.get({}) == [
{"key": "my_key1", "mandatory": 1, "optional": "my_value1"},
{"key": "my_key2", "mandatory": 2, "optional": "my_value2"},
]
assert controller.get_audit({}) == [
{
"audit_action": "Insert",
"audit_date_utc": "2018-10-11T15:05:05.663000",
"audit_user": "",
"key": "my_key1",
"mandatory": 1,
"optional": "my_value1",
"revision": 1,
},
{
"audit_action": "Insert",
"audit_date_utc": "2018-10-11T15:05:05.663000",
"audit_user": "",
"key": "my_key2",
"mandatory": 2,
"optional": "my_value2",
"revision": 2,
},
]
def test_get_with_filter_is_retrieving_subset(controller: layabase.CRUDController, mock_mongo_audit_datetime):
controller.post({"key": "my_key1", "mandatory": 1, "optional": "my_value1"})
controller.post({"key": "my_key2", "mandatory": 2, "optional": "my_value2"})
assert controller.get({"optional": "my_value1"}) == [
{"key": "my_key1", "mandatory": 1, "optional": "my_value1"}
]
assert controller.get_audit({}) == [
{
"audit_action": "Insert",
"audit_date_utc": "2018-10-11T15:05:05.663000",
"audit_user": "",
"key": "my_key1",
"mandatory": 1,
"optional": "my_value1",
"revision": 1,
},
{
"audit_action": "Insert",
"audit_date_utc": "2018-10-11T15:05:05.663000",
"audit_user": "",
"key": "my_key2",
"mandatory": 2,
"optional": "my_value2",
"revision": 2,
},
]
def test_put_is_updating(controller: layabase.CRUDController, mock_mongo_audit_datetime):
controller.post({"key": "my_key1", "mandatory": 1, "optional": "my_value1"})
assert controller.put({"key": "my_key1", "optional": "my_value"}) == (
{"key": "my_key1", "mandatory": 1, "optional": "my_value1"},
{"key": "my_key1", "mandatory": 1, "optional": "my_value"},
)
assert controller.get({"mandatory": 1}) == [
{"key": "my_key1", "mandatory": 1, "optional": "my_value"}
]
assert controller.get_audit({}) == [
{
"audit_action": "Insert",
"audit_date_utc": "2018-10-11T15:05:05.663000",
"audit_user": "",
"key": "my_key1",
"mandatory": 1,
"optional": "my_value1",
"revision": 1,
},
{
"audit_action": "Update",
"audit_date_utc": "2018-10-11T15:05:05.663000",
"audit_user": "",
"key": "my_key1",
"mandatory": 1,
"optional": "my_value",
"revision": 2,
},
]


def test_put_is_updating_and_previous_value_cannot_be_used_to_filter(
    controller: layabase.CRUDController, mock_mongo_audit_datetime
):
    controller.post({"key": "my_key1", "mandatory": 1, "optional": "my_value1"})
    controller.put({"key": "my_key1", "optional": "my_value"})
    assert controller.get({"optional": "my_value1"}) == []
    assert controller.get_audit({}) == [
        {
            "audit_action": "Insert",
            "audit_date_utc": "2018-10-11T15:05:05.663000",
            "audit_user": "",
            "key": "my_key1",
            "mandatory": 1,
            "optional": "my_value1",
            "revision": 1,
        },
        {
            "audit_action": "Update",
            "audit_date_utc": "2018-10-11T15:05:05.663000",
            "audit_user": "",
            "key": "my_key1",
            "mandatory": 1,
            "optional": "my_value",
            "revision": 2,
        },
    ]


def test_delete_with_filter_is_removing_the_proper_row(
    controller: layabase.CRUDController, mock_mongo_audit_datetime
):
    controller.post({"key": "my_key1", "mandatory": 1, "optional": "my_value1"})
    controller.post({"key": "my_key2", "mandatory": 2, "optional": "my_value2"})
    assert controller.delete({"key": "my_key1"}) == 1
    assert controller.get({}) == [
        {"key": "my_key2", "mandatory": 2, "optional": "my_value2"}
    ]
    assert controller.get_audit({}) == [
        {
            "audit_action": "Insert",
            "audit_date_utc": "2018-10-11T15:05:05.663000",
            "audit_user": "",
            "key": "my_key1",
            "mandatory": 1,
            "optional": "my_value1",
            "revision": 1,
        },
        {
            "audit_action": "Insert",
            "audit_date_utc": "2018-10-11T15:05:05.663000",
            "audit_user": "",
            "key": "my_key2",
            "mandatory": 2,
            "optional": "my_value2",
            "revision": 2,
        },
        {
            "audit_action": "Delete",
            "audit_date_utc": "2018-10-11T15:05:05.663000",
            "audit_user": "",
            "key": "my_key1",
            "mandatory": 1,
            "optional": "my_value1",
            "revision": 3,
        },
    ]


def test_audit_filter_is_returning_only_selected_data(
    controller: layabase.CRUDController, mock_mongo_audit_datetime
):
    controller.post({"key": "my_key1", "mandatory": 1, "optional": "my_value1"})
    controller.put({"key": "my_key1", "mandatory": 2})
    controller.delete({"key": "my_key1"})
    assert controller.get_audit({"key": "my_key1"}) == [
        {
            "audit_action": "Insert",
            "audit_date_utc": "2018-10-11T15:05:05.663000",
            "audit_user": "",
            "key": "my_key1",
            "mandatory": 1,
            "optional": "my_value1",
            "revision": 1,
        },
        {
            "audit_action": "Update",
            "audit_date_utc": "2018-10-11T15:05:05.663000",
            "audit_user": "",
            "key": "my_key1",
            "mandatory": 2,
            "optional": "my_value1",
            "revision": 2,
        },
        {
            "audit_action": "Delete",
            "audit_date_utc": "2018-10-11T15:05:05.663000",
            "audit_user": "",
            "key": "my_key1",
            "mandatory": 2,
            "optional": "my_value1",
            "revision": 3,
        },
    ]


def test_audit_filter_on_audit_collection_is_returning_only_selected_data(
    controller: layabase.CRUDController, mock_mongo_audit_datetime
):
    controller.post({"key": "my_key1", "mandatory": 1, "optional": "my_value1"})
    controller.put({"key": "my_key1", "mandatory": 2})
    controller.delete({"key": "my_key1"})
    assert controller.get_audit({"audit_action": "Update"}) == [
        {
            "audit_action": "Update",
            "audit_date_utc": "2018-10-11T15:05:05.663000",
            "audit_user": "",
            "key": "my_key1",
            "mandatory": 2,
            "optional": "my_value1",
            "revision": 2,
        }
    ]


def test_value_can_be_updated_to_previous_value(
    controller: layabase.CRUDController, mock_mongo_audit_datetime
):
    controller.post({"key": "my_key1", "mandatory": 1, "optional": "my_value1"})
    controller.put({"key": "my_key1", "mandatory": 2})
    controller.put({"key": "my_key1", "mandatory": 1})  # Put back initial value
    assert controller.get_audit({}) == [
        {
            "audit_action": "Insert",
            "audit_date_utc": "2018-10-11T15:05:05.663000",
            "audit_user": "",
            "key": "my_key1",
            "mandatory": 1,
            "optional": "my_value1",
            "revision": 1,
        },
        {
            "audit_action": "Update",
            "audit_date_utc": "2018-10-11T15:05:05.663000",
            "audit_user": "",
            "key": "my_key1",
            "mandatory": 2,
            "optional": "my_value1",
            "revision": 2,
        },
        {
            "audit_action": "Update",
            "audit_date_utc": "2018-10-11T15:05:05.663000",
            "audit_user": "",
            "key": "my_key1",
            "mandatory": 1,
            "optional": "my_value1",
            "revision": 3,
        },
    ]


def test_update_index(controller: layabase.CRUDController):
    # Assert no error is thrown
    controller._model.update_indexes()


def test_delete_without_filter_is_removing_everything(
    controller: layabase.CRUDController, mock_mongo_audit_datetime
):
    controller.post({"key": "my_key1", "mandatory": 1, "optional": "my_value1"})
    controller.post({"key": "my_key2", "mandatory": 2, "optional": "my_value2"})
    assert controller.delete({}) == 2
    assert controller.get({}) == []
    assert controller.get_audit({}) == [
        {
            "audit_action": "Insert",
            "audit_date_utc": "2018-10-11T15:05:05.663000",
            "audit_user": "",
            "key": "my_key1",
            "mandatory": 1,
            "optional": "my_value1",
            "revision": 1,
        },
        {
            "audit_action": "Insert",
            "audit_date_utc": "2018-10-11T15:05:05.663000",
            "audit_user": "",
            "key": "my_key2",
            "mandatory": 2,
            "optional": "my_value2",
            "revision": 2,
        },
        {
            "audit_action": "Delete",
            "audit_date_utc": "2018-10-11T15:05:05.663000",
            "audit_user": "",
            "key": "my_key1",
            "mandatory": 1,
            "optional": "my_value1",
            "revision": 3,
        },
        {
            "audit_action": "Delete",
            "audit_date_utc": "2018-10-11T15:05:05.663000",
            "audit_user": "",
            "key": "my_key2",
            "mandatory": 2,
            "optional": "my_value2",
            "revision": 4,
        },
    ]


# Arithmatic/sub/Division.py (repo: MuditMaurya/Calculator, MIT license)
class Division:
    def __init__(self, fnum, snum):
        self.fnum = fnum
        self.snum = snum

    def allDiv(self):
        # Note: the result attribute is named "sub" in the original source,
        # even though it holds a quotient.
        self.sub = self.fnum / self.snum
        return self.sub
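For reference, a quick self-contained usage check of the class above (the class is restated so the snippet runs on its own):

```python
class Division:
    def __init__(self, fnum, snum):
        self.fnum = fnum
        self.snum = snum

    def allDiv(self):
        self.sub = self.fnum / self.snum
        return self.sub


# Python 3 "/" is true division, so the result is a float even for int inputs.
print(Division(10, 4).allDiv())  # 2.5
```

Note that `allDiv` raises `ZeroDivisionError` when `snum` is 0; the class has no guard for that case.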


# modules/healthcheck.py (repo: mmarukaw/oci-bulk-operations, MIT license)
import oci
from modules.common import *

client = oci.healthchecks.HealthChecksClient


def purge_http_monitors(config, signer, compartments, region_name):
    target = TargetResources()
    target.resource_names = ['healthcheck http monitors']
    target.action = 'DELETE'
    target.list_methods = [client(config, signer=signer).list_http_monitors]
    target.list_args = [None]
    target.dispname_keys = ['display_name']
    target.parentid_keys = [None]
    target.get_method = client(config, signer=signer).get_http_monitor
    target.action_method = client(config, signer=signer).delete_http_monitor
    target.action_args = {}

    def filter_logic(resource):
        # Only act on monitors whose home region is the region being processed.
        return resource.home_region == region_name

    target.filter_logics = [filter_logic]
    target_resources = target.list(compartments)
    target.commit_action(target_resources)
    target.wait_completion(target_resources)


def purge_ping_monitors(config, signer, compartments, region_name):
    target = TargetResources()
    target.resource_names = ['healthcheck ping monitors']
    target.action = 'DELETE'
    target.list_methods = [client(config, signer=signer).list_ping_monitors]
    target.list_args = [None]
    target.dispname_keys = ['display_name']
    target.parentid_keys = [None]
    target.get_method = client(config, signer=signer).get_ping_monitor
    target.action_method = client(config, signer=signer).delete_ping_monitor
    target.action_args = {}

    def filter_logic(resource):
        # Only act on monitors whose home region is the region being processed.
        return resource.home_region == region_name

    target.filter_logics = [filter_logic]
    target_resources = target.list(compartments)
    target.commit_action(target_resources)
    target.wait_completion(target_resources)


# appserver/neo4japp/services/annotations/data_transfer_objects/__init__.py
# (repo: SBRG/lifelike, MIT license)
from .dto import *
from .dto_func_params import *


# tests/datalake/test_channel.py
# (repo: abeja-inc/abeja-platform-sdk, Apache-2.0 license)
import os
from io import BytesIO
import json
import tempfile
from unittest import TestCase
from unittest.mock import Mock, patch

from abeja.datalake.file import DatalakeFile
from abeja.datalake.channel import Channel, Channels
from abeja.datalake.storage_type import StorageType

ORGANIZATION_ID = '1234567890123'
CHANNEL_ID = '1230000000000'
CHANNEL_NAME = 'test_channel'
CHANNEL_DISPLAY_NAME = 'test_display_channel_name'
CHANNEL_DESCRIPTION = 'description of a test channel'
CHANNEL_STORAGE_TYPE = 'datalake'
CHANNEL_ARCHIVED = False


class TestChannel(TestCase):
    def test_files(self):
        mock_api = Mock()
        mock_api.list_channel_files.side_effect = [
            {
                'next_page_token': 'dummy',
                'files': [
                    {
                        'url_expires_on': '2018-06-04T05:04:46+00:00',
                        'uploaded_at': '2018-06-01T05:22:44+00:00',
                        'metadata': {'x-abeja-meta-filename': 'DcZzLGkV4AA8FQc.jpg'},
                        'file_id': '20180601T052244-250482c0-d361-4c5b-a0f9-e796af1a5f0d',
                        'download_uri': 'http://example/dummy/download_url',
                        'content_type': 'image/jpeg',
                    }
                ],
            },
            {
                'next_page_token': None,
                'files': [
                    {
                        'url_expires_on': '2018-06-04T05:04:46+00:00',
                        'uploaded_at': '2018-06-01T05:22:44+00:00',
                        'metadata': {'x-abeja-meta-filename': 'DcZzLGkV4AA8FQc.jpg'},
                        'file_id': '20180601T052244-250482c0-d361-4c5b-a0f9-e796af1a5f0d',
                        'download_uri': 'http://example/dummy/download_url',
                        'content_type': 'image/jpeg',
                    }
                ],
            },
        ]

        channel = Channel(mock_api, ORGANIZATION_ID, CHANNEL_ID)
        self.assertIsInstance(channel, Channel)
        files = list(channel.list_files())
        for file in files:
            self.assertIsInstance(file, DatalakeFile)
        self.assertEqual(mock_api.list_channel_files.call_count, 2)
        call_args_1 = mock_api.list_channel_files.call_args_list[0]
        self.assertTupleEqual(call_args_1[0], (CHANNEL_ID,))
        self.assertDictEqual(call_args_1[1], {})
        call_args_2 = mock_api.list_channel_files.call_args_list[1]
        self.assertTupleEqual(call_args_2[0], (CHANNEL_ID,))
        self.assertDictEqual(call_args_2[1], {'next_page_token': 'dummy'})
        self.assertEqual(len(files), 2)

    def test_files_below_items_per_page(self):
        mock_api = Mock()
        mock_api.list_channel_files.side_effect = [
            {
                'next_page_token': None,
                'files': [
                    {
                        'url_expires_on': '2018-06-04T05:04:46+00:00',
                        'uploaded_at': '2018-06-01T05:22:44+00:00',
                        'metadata': {'x-abeja-meta-filename': 'DcZzLGkV4AA8FQc.jpg'},
                        'file_id': '20180601T052244-250482c0-d361-4c5b-a0f9-e796af1a5f0d',
                        'download_uri': 'http://example/dummy/download_url',
                        'content_type': 'image/jpeg',
                    }
                ],
            }
        ]

        channel = Channel(mock_api, ORGANIZATION_ID, CHANNEL_ID)
        self.assertIsInstance(channel, Channel)
        files = list(channel.list_files())
        for file in files:
            self.assertIsInstance(file, DatalakeFile)
        self.assertEqual(mock_api.list_channel_files.call_count, 1)
        self.assertEqual(
            mock_api.list_channel_files.call_args[0][0],
            CHANNEL_ID)
        self.assertEqual(len(files), 1)

    def test_files_with_both_items_per_page_and_next_page_token(self):
        mock_api = Mock()
        mock_api.list_channel_files.side_effect = [
            {
                'next_page_token': 'dummy',
                'files': [
                    {
                        'url_expires_on': '2018-06-04T05:04:46+00:00',
                        'uploaded_at': '2018-06-01T05:22:44+00:00',
                        'metadata': {'x-abeja-meta-filename': 'DcZzLGkV4AA8FQc.jpg'},
                        'file_id': '20180601T052244-250482c0-d361-4c5b-a0f9-e796af1a5f0d',
                        'download_uri': 'http://example/dummy/download_url',
                        'content_type': 'image/jpeg',
                    }
                ],
            },
            {
                'next_page_token': None,
                'files': [
                    {
                        'url_expires_on': '2018-06-04T05:04:46+00:00',
                        'uploaded_at': '2018-06-01T05:22:44+00:00',
                        'metadata': {'x-abeja-meta-filename': 'DcZzLGkV4AA8FQc.jpg'},
                        'file_id': '20180601T052244-250482c0-d361-4c5b-a0f9-e796af1a5f0d',
                        'download_uri': 'http://example/dummy/download_url',
                        'content_type': 'image/jpeg',
                    }
                ],
            },
        ]

        channel = Channel(mock_api, ORGANIZATION_ID, CHANNEL_ID)
        self.assertIsInstance(channel, Channel)
        files = list(channel.list_files(limit=1))
        for file in files:
            self.assertIsInstance(file, DatalakeFile)
        self.assertEqual(mock_api.list_channel_files.call_count, 2)
        call_args_1 = mock_api.list_channel_files.call_args_list[0]
        self.assertTupleEqual(call_args_1[0], (CHANNEL_ID,))
        self.assertDictEqual(call_args_1[1], {'items_per_page': 1})
        call_args_2 = mock_api.list_channel_files.call_args_list[1]
        self.assertTupleEqual(call_args_2[0], (CHANNEL_ID,))
        # items_per_page should not be passed as query parameter
        self.assertDictEqual(call_args_2[1], {'next_page_token': 'dummy'})
        self.assertEqual(len(files), 2)

    def test_files_with_empty_items(self):
        mock_api = Mock()
        mock_api.list_channel_files.side_effect = [
            {
                'next_page_token': None,
                'files': []
            }
        ]

        channel = Channel(mock_api, ORGANIZATION_ID, CHANNEL_ID)
        self.assertIsInstance(channel, Channel)
        files = list(channel.list_files())
        self.assertEqual(len(files), 0)
    def test_get_file(self):
        mock_api = Mock()
        dummy_file_id = "20180101T000000-00000000-1111-2222-3333-999999999999"
        dummy_content_type = "image/jpeg"
        dummy_download_uri = "http://example.com/dummy_upload_url"
        dummy_url_expires_on = "2018-01-01T00:00:00+00:00"
        dummy_metadata = {'x-abeja-meta-filename': 'test_filename'}
        mock_api.get_channel_file_download.return_value = {
            "url_expires_on": dummy_url_expires_on,
            "download_uri": dummy_download_uri,
            "uploaded_at": None,
            "metadata": dummy_metadata,
            "content_type": dummy_content_type,
            "file_id": dummy_file_id
        }

        channel = Channel(mock_api, ORGANIZATION_ID, CHANNEL_ID)
        file = channel.get_file(dummy_file_id)
        self.assertIsInstance(file, DatalakeFile)
        self.assertEqual(file.organization_id, ORGANIZATION_ID)
        self.assertEqual(file.channel_id, CHANNEL_ID)
        self.assertEqual(file.file_id, dummy_file_id)
        self.assertEqual(file.content_type, dummy_content_type)
        self.assertEqual(file.url_expires_on, dummy_url_expires_on)
        self.assertEqual(file.metadata['filename'], 'test_filename')

    def test_upload(self):
        mock_api = Mock()
        mock_api.post_channel_file_upload.return_value = {
            "uploaded_at": None,
            "metadata": {},
            "content_type": "image/jpeg",
            "lifetime": None,
            "file_id": "20180515T180605-f4acc798-9afa-40a1-b500-ebce42a4fa3f"
        }

        channel = Channel(mock_api, ORGANIZATION_ID, CHANNEL_ID)
        content_type = 'application/json'
        metadata = {'label': 'dummy label'}
        dummy_file_data = json.dumps({'data': 'dummy'}).encode('utf-8')
        dummy_file = BytesIO(dummy_file_data)
        file = channel.upload(
            dummy_file,
            content_type=content_type,
            metadata=metadata)
        self.assertIsInstance(file, DatalakeFile)
        expected_metadata = {
            'x-abeja-meta-label': 'dummy label'
        }
        mock_api.post_channel_file_upload.assert_called_once_with(
            CHANNEL_ID, dummy_file, content_type,
            metadata=expected_metadata, lifetime=None, conflict_target=None)

    def test_upload_file(self):
        mock_api = Mock()
        mock_api.post_channel_file_upload.return_value = {
            "url_expires_on": "2018-05-15T19:06:05+00:00",
            "uploaded_at": None,
            "metadata": {},
            "content_type": "image/jpeg",
            "lifetime": None,
            "file_id": "20180515T180605-f4acc798-9afa-40a1-b500-ebce42a4fa3f"
        }

        channel = Channel(mock_api, ORGANIZATION_ID, CHANNEL_ID)
        content_type = 'application/json'
        metadata = {'filename': 'dummy', 'label': 'dummy label'}
        dummy_file_data = json.dumps({'data': 'dummy'}).encode('utf-8')
        with tempfile.NamedTemporaryFile() as tmp:
            tmp.write(dummy_file_data)
            tmp.seek(0)
            file = channel.upload_file(
                tmp.name, metadata=metadata, content_type=content_type)
            self.assertIsInstance(file, DatalakeFile)
            expected_metadata = {
                'x-abeja-meta-filename': 'dummy',
                'x-abeja-meta-label': 'dummy label'
            }
            self.assertEqual(mock_api.post_channel_file_upload.call_count, 1)
            call_args = mock_api.post_channel_file_upload.call_args[0]
            call_kwargs = mock_api.post_channel_file_upload.call_args[1]
            self.assertEqual(call_args[0], CHANNEL_ID)
            self.assertDictEqual(call_kwargs, {
                'lifetime': None,
                'conflict_target': None,
                'metadata': expected_metadata
            })
    def test_upload_file_without_filename(self):
        mock_api = Mock()
        mock_api.post_channel_file_upload.return_value = {
            "uploaded_at": None,
            "metadata": {},
            "content_type": "image/jpeg",
            "file_id": "20180515T180605-f4acc798-9afa-40a1-b500-ebce42a4fa3f"
        }

        channel = Channel(mock_api, ORGANIZATION_ID, CHANNEL_ID)
        dummy_file_data = json.dumps({'data': 'dummy'}).encode('utf-8')
        with tempfile.NamedTemporaryFile(suffix='.json') as tmp:
            tmp.write(dummy_file_data)
            tmp.seek(0)
            filename = os.path.basename(tmp.name)
            file = channel.upload_file(tmp.name)
            self.assertIsInstance(file, DatalakeFile)
            content_type = 'application/json'
            metadata = {'x-abeja-meta-filename': filename}
            self.assertEqual(mock_api.post_channel_file_upload.call_count, 1)
            call_args = mock_api.post_channel_file_upload.call_args[0]
            call_kwargs = mock_api.post_channel_file_upload.call_args[1]
            self.assertEqual(call_args[0], CHANNEL_ID)
            self.assertEqual(call_args[2], content_type)
            self.assertDictEqual(call_kwargs, {
                'lifetime': None,
                'conflict_target': None,
                'metadata': metadata
            })

    def test_upload_file_with_lifetime(self):
        mock_api = Mock()
        dummy_lifetime = "1day"
        mock_api.post_channel_file_upload.return_value = {
            "uploaded_at": None,
            "metadata": {},
            "content_type": "image/jpeg",
            "lifetime": dummy_lifetime,
            "file_id": "20180515T180605-f4acc798-9afa-40a1-b500-ebce42a4fa3f"
        }

        channel = Channel(mock_api, ORGANIZATION_ID, CHANNEL_ID)
        dummy_file_data = json.dumps({'data': 'dummy'}).encode('utf-8')
        with tempfile.NamedTemporaryFile(suffix='.json') as tmp:
            tmp.write(dummy_file_data)
            tmp.seek(0)
            filename = os.path.basename(tmp.name)
            file = channel.upload_file(tmp.name, lifetime=dummy_lifetime)
            self.assertIsInstance(file, DatalakeFile)
            content_type = 'application/json'
            metadata = {'x-abeja-meta-filename': filename}
            self.assertEqual(mock_api.post_channel_file_upload.call_count, 1)
            call_args = mock_api.post_channel_file_upload.call_args[0]
            call_kwargs = mock_api.post_channel_file_upload.call_args[1]
            self.assertEqual(call_args[0], CHANNEL_ID)
            self.assertEqual(call_args[2], content_type)
            self.assertDictEqual(call_kwargs, {
                'lifetime': dummy_lifetime,
                'conflict_target': None,
                'metadata': metadata
            })

    def test_upload_file_with_conflict_target(self):
        conflict_target = 'filename'
        mock_api = Mock()
        mock_api.post_channel_file_upload.return_value = {
            "uploaded_at": None,
            "metadata": {},
            "content_type": "image/jpeg",
            "lifetime": None,
            "file_id": "20180515T180605-f4acc798-9afa-40a1-b500-ebce42a4fa3f"
        }

        channel = Channel(mock_api, ORGANIZATION_ID, CHANNEL_ID)
        dummy_file_data = json.dumps({'data': 'dummy'}).encode('utf-8')
        with tempfile.NamedTemporaryFile(suffix='.json') as tmp:
            tmp.write(dummy_file_data)
            tmp.seek(0)
            filename = os.path.basename(tmp.name)
            file = channel.upload_file(
                tmp.name, conflict_target=conflict_target)
            self.assertIsInstance(file, DatalakeFile)
            content_type = 'application/json'
            metadata = {'x-abeja-meta-filename': filename}
            self.assertEqual(mock_api.post_channel_file_upload.call_count, 1)
            call_args = mock_api.post_channel_file_upload.call_args[0]
            call_kwargs = mock_api.post_channel_file_upload.call_args[1]
            self.assertEqual(call_args[0], CHANNEL_ID)
            self.assertEqual(call_args[2], content_type)
            self.assertDictEqual(call_kwargs, {
                'lifetime': None,
                'conflict_target': conflict_target,
                'metadata': metadata
            })

    def test_upload_with_dir(self):
        mock_api = Mock()
        mock_api.get_channel_file_upload.return_value = {
            "url_expires_on": "2018-05-15T19:06:05+00:00",
            "upload_url": "http://example.com/dummy_upload_url",
            "uploaded_at": None,
            "metadata": {},
            "content_type": "image/jpeg",
            "file_id": "20180515T180605-f4acc798-9afa-40a1-b500-ebce42a4fa3f"
        }

        channel = Channel(mock_api, ORGANIZATION_ID, CHANNEL_ID)
        with tempfile.NamedTemporaryFile() as tmp:
            tmp.write(json.dumps({'data': 'dummy'}).encode('utf-8'))
            tmp.seek(0)
            base_dir = '/'.join(tmp.name.split('/')[:-1])
            with self.assertRaises(IsADirectoryError):
                channel.upload_file(
                    base_dir,
                    metadata={
                        'x-abeja-meta-filename': 'dummy'},
                    content_type='application/json')
    @patch('abeja.datalake.channel.generate_path_iter')
    def test_upload_dir(self, mock_generate_path_iter):
        mock_api = Mock()
        mock_api.post_channel_file_upload.side_effect = [
            {
                "uploaded_at": None,
                "metadata": {},
                "content_type": "image/jpeg",
                "lifetime": None,
                "file_id": "20180515T180605-f4acc798-9afa-40a1-b500-ebce42a4fa3f"
            },
            {
                "uploaded_at": None,
                "metadata": {},
                "content_type": "image/jpeg",
                "lifetime": None,
                "file_id": "20180515T180605-f4acc798-9afa-40a1-b500-ebce42a4fa3f"
            }
        ]

        with tempfile.NamedTemporaryFile() as tmp1:
            tmp1.write(b'dummy1')
            tmp1.seek(0)
            with tempfile.NamedTemporaryFile() as tmp2:
                tmp2.write(b'dummy2')
                tmp2.seek(0)
                mock_generate_path_iter.return_value = [tmp1.name, tmp2.name]
                channel = Channel(mock_api, ORGANIZATION_ID, CHANNEL_ID)
                dummy_metadata = {'dummy': 'data'}
                content_type = 'image/jpeg'
                files = channel.upload_dir(
                    'dummy_path',
                    metadata=dummy_metadata,
                    content_type=content_type)
                files = list(files)
                file = files[0]
                self.assertIsInstance(file, DatalakeFile)
                self.assertEqual(
                    mock_api.post_channel_file_upload.call_count, 2)

    @patch('abeja.datalake.channel.logger')
    @patch('abeja.datalake.channel.generate_path_iter')
    def test_upload_dir_with_exception(
            self, mock_generate_path_iter, mock_logger):
        mock_api = Mock()
        dummy_exception = Exception('dummy exception')
        mock_api.post_channel_file_upload.side_effect = [
            {
                "url_expires_on": "2018-05-15T19:06:05+00:00",
                "upload_url": "http://example.com/dummy_upload_url",
                "uploaded_at": None,
                "metadata": {},
                "content_type": "image/jpeg",
                "file_id": "20180515T180605-f4acc798-9afa-40a1-b500-ebce42a4fa3f"
            },
            dummy_exception
        ]

        with tempfile.NamedTemporaryFile() as tmp1:
            tmp1.write(b'dummy1')
            tmp1.seek(0)
            with tempfile.NamedTemporaryFile() as tmp2:
                tmp2.write(b'dummy2')
                tmp2.seek(0)
                mock_generate_path_iter.return_value = [tmp1.name, tmp2.name]
                channel = Channel(mock_api, ORGANIZATION_ID, CHANNEL_ID)
                dummy_metadata = {'dummy': 'data'}
                content_type = 'image/jpeg'
                files = channel.upload_dir(
                    'dummy_path',
                    metadata=dummy_metadata,
                    content_type=content_type)
                files = list(files)
                self.assertEqual(len(files), 1)
                file = files[0]
                self.assertIsInstance(file, DatalakeFile)
                self.assertEqual(
                    mock_api.post_channel_file_upload.call_count, 2)
                mock_logger.error.assert_called_once_with(dummy_exception)

    @patch('abeja.datalake.channel.generate_path_iter')
    def test_upload_dir_without_thread(self, mock_generate_path_iter):
        mock_api = Mock()
        mock_api.post_channel_file_upload.side_effect = [
            {
                "url_expires_on": "2018-05-15T19:06:05+00:00",
                "upload_url": "http://example.com/dummy_upload_url",
                "uploaded_at": None,
                "metadata": {},
                "content_type": "image/jpeg",
                "file_id": "20180515T180605-f4acc798-9afa-40a1-b500-ebce42a4fa3f"
            },
            {
                "url_expires_on": "2018-05-15T19:06:05+00:00",
                "upload_url": "http://example.com/dummy_upload_url",
                "uploaded_at": None,
                "metadata": {},
                "content_type": "image/jpeg",
                "file_id": "20180515T180605-f4acc798-9afa-40a1-b500-ebce42a4fa3f"
            }
        ]

        with tempfile.NamedTemporaryFile() as tmp1:
            tmp1.write(b'dummy1')
            tmp1.seek(0)
            with tempfile.NamedTemporaryFile() as tmp2:
                tmp2.write(b'dummy2')
                tmp2.seek(0)
                mock_generate_path_iter.return_value = [tmp1.name, tmp2.name]
                channel = Channel(mock_api, ORGANIZATION_ID, CHANNEL_ID)
                dummy_metadata = {'dummy': 'data'}
                content_type = 'image/jpeg'
                files = channel.upload_dir(
                    'dummy_path', metadata=dummy_metadata,
                    content_type=content_type, use_thread=False)
                files = list(files)
                file = files[0]
                self.assertIsInstance(file, DatalakeFile)
                self.assertEqual(
                    mock_api.post_channel_file_upload.call_count, 2)

    # Placeholders for datasource-related tests; without the "test_" prefix
    # they are not collected by the test runner.
    def list_datasources(self):
        pass

    def add_datasource(self):
        pass

    def remove_datasource(self):
        pass

class TestChannels(TestCase):
    def test_create(self):
        mock_api = Mock()
        mock_api.create_channel.return_value = {
            "updated_at": "2017-09-12T10:11:46Z",
            "organization_name": "test-organization",
            "organization_id": ORGANIZATION_ID,
            "created_at": "2017-09-12T10:11:46Z",
            "channel": {
                "updated_at": "2018-06-03T02:57:19Z",
                "storage_type": "datalake",
                "name": CHANNEL_NAME,
                "display_name": CHANNEL_DISPLAY_NAME,
                "description": CHANNEL_DESCRIPTION,
                "created_at": "2018-06-03T02:57:19Z",
                "channel_id": CHANNEL_ID,
                "archived": False
            }
        }

        channels = Channels(api=mock_api, organization_id=ORGANIZATION_ID)
        channel = channels.create(
            name=CHANNEL_NAME, description=CHANNEL_DESCRIPTION,
            storage_type=StorageType.DATALAKE.value)
        mock_api.create_channel.assert_called_once_with(
            ORGANIZATION_ID, CHANNEL_NAME, CHANNEL_DESCRIPTION,
            StorageType.DATALAKE.value)
        self.assertIsInstance(channel, Channel)
        self.assertEqual(channel.organization_id, ORGANIZATION_ID)
        self.assertEqual(channel.channel_id, CHANNEL_ID)
        self.assertEqual(channel.name, CHANNEL_NAME)
        self.assertEqual(channel.description, CHANNEL_DESCRIPTION)
        self.assertEqual(channel.display_name, CHANNEL_DISPLAY_NAME)
        self.assertEqual(channel.storage_type, CHANNEL_STORAGE_TYPE)
        self.assertEqual(channel.archived, CHANNEL_ARCHIVED)

    def test_list(self):
        mock_api = Mock()
        mock_api.list_channels.return_value = {
            "updated_at": "2017-09-12T10:11:46Z",
            "organization_name": "abeja-internal",
            "organization_id": "1225098818583",
            "offset": 0,
            "limit": 50,
            "has_next": True,
            "created_at": "2017-09-12T10:11:46Z",
            "channels": [
                {
                    "updated_at": "2018-06-03T02:57:19Z",
                    "storage_type": "datalake",
                    "name": CHANNEL_NAME,
                    "display_name": CHANNEL_DISPLAY_NAME,
                    "description": CHANNEL_DESCRIPTION,
                    "created_at": "2018-06-03T02:57:19Z",
                    "channel_id": CHANNEL_ID,
                    "archived": False
                }
            ]
        }

        channel_collection = Channels(
            api=mock_api, organization_id=ORGANIZATION_ID)
        channels = channel_collection.list()
        channel = list(channels)[0]
        mock_api.list_channels.assert_called_once_with(
            ORGANIZATION_ID, limit=None, offset=None)
        self.assertIsInstance(channel, Channel)
        self.assertEqual(channel.organization_id, ORGANIZATION_ID)
        self.assertEqual(channel.channel_id, CHANNEL_ID)
        self.assertEqual(channel.name, CHANNEL_NAME)
        self.assertEqual(channel.description, CHANNEL_DESCRIPTION)
        self.assertEqual(channel.display_name, CHANNEL_DISPLAY_NAME)
        self.assertEqual(channel.storage_type, CHANNEL_STORAGE_TYPE)
        self.assertEqual(channel.archived, CHANNEL_ARCHIVED)

    def test_get(self):
        mock_api = Mock()
        mock_api.get_channel.return_value = {
            "updated_at": "2017-09-12T10:11:46Z",
            "organization_name": "test-organization",
            "organization_id": ORGANIZATION_ID,
            "created_at": "2017-09-12T10:11:46Z",
            "channel": {
                "updated_at": "2018-06-03T02:57:19Z",
                "storage_type": "datalake",
                "name": CHANNEL_NAME,
                "display_name": CHANNEL_DISPLAY_NAME,
                "description": CHANNEL_DESCRIPTION,
                "created_at": "2018-06-03T02:57:19Z",
                "channel_id": CHANNEL_ID,
                "archived": False
            }
        }

        channels = Channels(api=mock_api, organization_id=ORGANIZATION_ID)
        channel = channels.get(channel_id=CHANNEL_ID)
        mock_api.get_channel.assert_called_once_with(
            ORGANIZATION_ID, CHANNEL_ID)
        self.assertIsInstance(channel, Channel)
        self.assertEqual(channel.organization_id, ORGANIZATION_ID)
        self.assertEqual(channel.channel_id, CHANNEL_ID)
        self.assertEqual(channel.name, CHANNEL_NAME)
        self.assertEqual(channel.description, CHANNEL_DESCRIPTION)
        self.assertEqual(channel.display_name, CHANNEL_DISPLAY_NAME)
        self.assertEqual(channel.storage_type, CHANNEL_STORAGE_TYPE)
        self.assertEqual(channel.archived, CHANNEL_ARCHIVED)

    def test_patch(self):
        updated_name = 'updated_name'
        updated_description = 'updated description'
        mock_api = Mock()
        mock_api.patch_channel.return_value = {
            "updated_at": "2017-09-12T10:11:46Z",
            "organization_name": "test-organization",
            "organization_id": ORGANIZATION_ID,
            "created_at": "2017-09-12T10:11:46Z",
            "channel": {
                "updated_at": "2018-06-03T02:57:19Z",
                "storage_type": "datalake",
                "name": updated_name,
                "display_name": CHANNEL_DISPLAY_NAME,
                "description": updated_description,
                "created_at": "2018-06-03T02:57:19Z",
                "channel_id": CHANNEL_ID,
                "archived": False
            }
        }

        channels = Channels(api=mock_api, organization_id=ORGANIZATION_ID)
        channel = channels.patch(
            channel_id=CHANNEL_ID,
            name=updated_name,
            description=updated_description)
        mock_api.patch_channel.assert_called_once_with(
            ORGANIZATION_ID, CHANNEL_ID, updated_name, updated_description)
        self.assertIsInstance(channel, Channel)
        self.assertEqual(channel.organization_id, ORGANIZATION_ID)
        self.assertEqual(channel.channel_id, CHANNEL_ID)
        self.assertEqual(channel.name, updated_name)
        self.assertEqual(channel.description, updated_description)
        self.assertEqual(channel.display_name, CHANNEL_DISPLAY_NAME)
        self.assertEqual(channel.storage_type, CHANNEL_STORAGE_TYPE)
        self.assertEqual(channel.archived, CHANNEL_ARCHIVED)


# utils.py (repo: raynecafaro/O-Kerberos, MIT license)
from nacl import utils
from nacl import secret


def key_gen():
    return utils.random(secret.SecretBox.KEY_SIZE)


def main():
    print(key_gen())


if __name__ == '__main__':
    main()


# receiveLine.py (repo: ChawisB/LINEchatbotLogin, MIT license)
# -*- coding: utf-8 -*-
from flask import Flask, request
import json
import requests
from pymongo import MongoClient
from bson.objectid import ObjectId
from readConfig import configBot
import datetime

Authorization = configBot.AccessKey

app = Flask(__name__)


@app.route('/')
def index():
    return "Hello World!"


@app.route('/callback', methods=['POST'])
def callback():
    main_process()
    return '', 200
cluster = MongoClient("mongodb+srv://dbadmin:12345@testcluster-kzj5m.gcp.mongodb.net/test?retryWrites=true&w=majority")
db = cluster['line']
login_collection = db['login']
user_collection = db['users']
main_id = "5e75c27b1c9d44000088df21" # ID for the temp data
def main_process():
LINE_API = configBot.LineApiReply
req_dict = json.loads(request.data)
user = req_dict["events"][0]['replyToken']
print(req_dict)
user_id = req_dict["events"][0]['source']['userId']
dict_type = req_dict["events"][0]['type']
if not dict_type:
dict_type = req_dict["events"]
print(dict_type)
get_Modes = login_collection.find_one({"_id": ObjectId(main_id)})
login = get_Modes["Login_Mode"]
lockout_status = get_Modes["Lockout_status"]
current_time = datetime.datetime.now()
check_current_user = user_collection.find_one({"line_id": user_id})
if check_current_user:
check_Login = check_current_user["Logged_in"]
print(check_current_user)
if dict_type == 'follow':
print('incase')
clear_csv()
cancel_richmenu()
get_Modes = login_collection.find_one({"_id": ObjectId(main_id)})
mode = get_Modes["Mode"]
print(mode)
init_sign_in(user, "", "", "", mode)
login_collection.update_one({"_id": ObjectId(main_id)}, {"$set": {"Login_Mode": 1}})
login_collection.update_one({"_id": ObjectId(main_id)}, {"$set": {"Mode": 1}})
elif dict_type == "unfollow":
if check_current_user:
if check_Login is True:
login_collection.update_one({"_id": ObjectId(main_id)}, {"$set": {"Attempt_count": 0}})
user_collection.update_one({"_id": ObjectId(check_current_user["_id"])}, {"$set": {"Logged_in": False}})
user_collection.update_one({"_id": ObjectId(check_current_user["_id"])}, {"$set": {"line_id": ""}})
cancel_richmenu()
else:
if check_current_user:
check_Login = check_current_user["Logged_in"]
if check_Login is True:
rich_menu()
quick_reply(user)
mes = req_dict["events"][0]['message']['text']
print(mes)
if lockout_status is True:
print("incase lockout")
end_time = get_Modes["Lockout_finish_time"]
print(end_time)
print(current_time)
if current_time > end_time:
login_collection.update_one({"_id": ObjectId(main_id)}, {"$set": {"Lockout_status": False}})
else:
return loop_lockout(user, end_time)
if mes:
if mes == 'Login':
print('incase relogin')
clear_csv()
sign_in(user, "", "", "")
login_collection.update_one({"_id": ObjectId(main_id)}, {"$set": {"Login_Mode": 2}})
elif mes == 'Logout':
if check_current_user:
if check_Login is True:
login_collection.update_one({"_id": ObjectId(main_id)}, {"$set": {"Attempt_count": 0}})
user_collection.update_one({"_id": ObjectId(check_current_user["_id"])}, {"$set": {"Logged_in": False}})
user_collection.update_one({"_id": ObjectId(check_current_user["_id"])}, {"$set": {"line_id": ""}})
logout_success(user)
cancel_richmenu()
else:
logout_failed(user)
if login == 1:
mes = str(mes)
mode = get_Modes["Mode"]
extract = login_collection.find_one({"_id": ObjectId(main_id)})
print(extract)
mes1 = extract["Email"]
mes2 = extract["Org"]
mes3 = extract["Password"]
print(mes1, mes2, mes3)
if mode == 1:
print('incase mode 1')
print(type(mes1))
login_collection.update_one({"_id": ObjectId(main_id)}, {"$set": {"Email": mes}})
extract = login_collection.find_one({"_id": ObjectId(main_id)})
mes1 = extract["Email"]
print(mes1)
init_sign_in(user, mes1, mes2, mes3, mode)
login_collection.update_one({"_id": ObjectId(main_id)}, {"$set": {"Mode": 2}})
elif mode == 2:
print('incase mode 2')
print(type(mes2))
login_collection.update_one({"_id": ObjectId(main_id)}, {"$set": {"Org": mes}})
extract = login_collection.find_one({"_id": ObjectId(main_id)})
mes2 = extract["Org"]
init_sign_in(user, mes1, mes2, mes3, mode)
login_collection.update_one({"_id": ObjectId(main_id)}, {"$set": {"Mode": 3}})
elif mode == 3:
print('incase mode 3')
print(type(mes3))
login_collection.update_one({"_id": ObjectId(main_id)}, {"$set": {"Password": mes}})
extract = login_collection.find_one({"_id": ObjectId(main_id)})
mes3 = extract["Password"]
sign_in(user, mes1, mes2, mes3)
login_collection.update_one({"_id": ObjectId(main_id)}, {"$set": {"Login_Mode": 2}})
login_collection.update_one({"_id": ObjectId(main_id)}, {"$set": {"Mode": 0}})
if login == 2:
mes = str(mes)
extract = login_collection.find_one({"_id": ObjectId(main_id)})
print(extract)
mes1 = extract["Email"]
mes2 = extract["Org"]
mes3 = extract["Password"]
n = extract["Attempt_count"]
mode = extract["Mode"]  # needed by the field-edit branches further below (otherwise NameError)
print(mes1, mes2, mes3)
if mes == 'Email':
print('incase email')
loop_email(user)
login_collection.update_one({"_id": ObjectId(main_id)}, {"$set": {"Mode": 1}})
elif mes == 'Org':
print('incase org')
loop_org(user)
login_collection.update_one({"_id": ObjectId(main_id)}, {"$set": {"Mode": 2}})
elif mes == 'Password':
print('incase password')
loop_password(user)
login_collection.update_one({"_id": ObjectId(main_id)}, {"$set": {"Mode": 3}})
elif mes == 'Confirm':
print('incase confirm')
result = user_collection.find_one({"Email": mes1})
n = n + 1  # Adds 1 to attempt count
if result:  # Checks the user was found before reading its fields
    result_id = result["_id"]
    print(result_id)
    print(type(result_id))
if mes2 == result["Org"]:
if mes3 == result["Password"]:
if result["Logged_in"] is True:
user_already_logged_in(user)
else:
loop_success(user, mes1)
clear_csv()
login_collection.update_one({"_id": ObjectId(main_id)}, {"$set": {"Attempt_count": 0}})
user_collection.update_one({"_id": ObjectId(result["_id"])}, {"$set": {"Logged_in": True}})
user_collection.update_one({"_id": ObjectId(result["_id"])}, {"$set": {"line_id": user_id}})
else:
if n >= 3: # Checks if attempt no. 3 or higher
if n > 3:  # Over 3: add 5 extra minutes of lockout for each attempt beyond the third
locktime = 5 + ((n - 3) * 5)
end_time = current_time + datetime.timedelta(minutes=locktime)
login_collection.update_one({"_id": ObjectId(main_id)},
{"$set": {"Lockout_finish_time": end_time}})
login_collection.update_one({"_id": ObjectId(main_id)},
{"$set": {"Lockout_status": True}})
else:
end_time = current_time + datetime.timedelta(minutes=5)
login_collection.update_one({"_id": ObjectId(main_id)},
{"$set": {"Lockout_finish_time": end_time}})
login_collection.update_one({"_id": ObjectId(main_id)},
{"$set": {"Lockout_status": True}})
else:
loop_error(user)
sign_in(user, mes1, mes2, mes3)
login_collection.update_one({"_id": ObjectId(main_id)}, {"$set": {"Attempt_count": n}})
elif mes == 'Cancel':
print('incase cancel')
loop_cancel(user)
clear_csv() # Exit login mode
if mode == 1:
print('incase mode 1')
print(type(mes1))
login_collection.update_one({"_id": ObjectId(main_id)}, {"$set": {"Email": mes}})
extract = login_collection.find_one({"_id": ObjectId(main_id)})
mes1 = extract["Email"]
sign_in(user, mes1, mes2, mes3)
elif mode == 2:
print('incase mode 2')
print(type(mes2))
login_collection.update_one({"_id": ObjectId(main_id)}, {"$set": {"Org": mes}})
extract = login_collection.find_one({"_id": ObjectId(main_id)})
mes2 = extract["Org"]
sign_in(user, mes1, mes2, mes3)
elif mode == 3:
print('incase mode 3')
print(type(mes3))
login_collection.update_one({"_id": ObjectId(main_id)}, {"$set": {"Password": mes}})
extract = login_collection.find_one({"_id": ObjectId(main_id)})
mes3 = extract["Password"]
sign_in(user, mes1, mes2, mes3)
def clear_csv():
login_collection.update_one({"_id": ObjectId(main_id)}, {"$set": {"Login_Mode": 0}})
login_collection.update_one({"_id": ObjectId(main_id)}, {"$set": {"Mode": 0}})
login_collection.update_one({"_id": ObjectId(main_id)}, {"$set": {"Email": ""}})
login_collection.update_one({"_id": ObjectId(main_id)}, {"$set": {"Org": ""}})
login_collection.update_one({"_id": ObjectId(main_id)}, {"$set": {"Password": ""}})
def rich_menu():
print("incase rich menu")
LINE_API = "https://api.line.me/v2/bot/user/all/richmenu/richmenu-20c0e113b3d0ac3bd5fdd0209fa798d9"
headers = {
'Authorization': 'Bearer ' + Authorization
}
data = json.dumps({})
requests.post(LINE_API, headers=headers, data=data)
def cancel_richmenu():
LINE_API = "https://api.line.me/v2/bot/user/all/richmenu"
headers = {
'Authorization': 'Bearer ' + Authorization
}
requests.delete(LINE_API, headers=headers)
def add_friend(user):
LINE_API = configBot.LineApiReply
headers = {
'Content-Type': 'application/json; charset=UTF-8',
'Authorization': 'Bearer ' + Authorization
}
data = json.dumps(
{
"replyToken": user,
"messages": [
{
"type": "text",
"text": "Thank you for being my friend"
}
]
}
)
requests.post(LINE_API, headers=headers, data=data)
def init_sign_in(user, mes1, mes2, mes3, mode):
if mode == 0:
ask = "email"
elif mode == 1:
ask = "org"
elif mode == 2:
ask = "password"
LINE_API = configBot.LineApiReply
headers = {
'Content-Type': 'application/json; charset=UTF-8',
'Authorization': 'Bearer ' + Authorization
}
data_send = json.dumps(
{
"replyToken": user,
"messages": [
{
"type": "flex",
"altText": "sign in",
"backgroundColor": "#3ebb75",
"contents": {
"type": "bubble",
"size": "giga",
# "backgroundColor": "#3ebb75",
"header": {
"type": "box",
"layout": "vertical",
"backgroundColor": "#46CCFF",
"contents": [
{
"type": "box",
"position": "absolute",
"offsetTop": "60%",
"offsetBottom": "20%",
"offsetStart": "1%",
"offsetEnd": "60%",
"layout": "vertical",
"contents": [
{
"type": "image",
"url": "https://si-sawad.com/image_report/background.png",
"aspectRatio": "3:1",
"size": "md",
}
]
},
{
"type": "text",
"text": "sign in",
"gravity": "center",
"size": "lg"
}
]
},
"body": {
"type": "box",
"layout": "vertical",
"backgroundColor": "#F9FFFF",
"spacing": "md",
"contents": [
{
"type": "box",
"layout": "horizontal",
"contents": [
{
"type": "box",
"layout": "vertical",
"contents": [
{
"type": "text",
"text": "Email :" + mes1,
"wrap": True,
"size": "lg",
"flex": 1,
"gravity": "center"
},
{
"type": "separator"
},
]
}
]
},
{
"type": "box",
"layout": "horizontal",
"contents": [
{
"type": "box",
"layout": "vertical",
"contents": [
{
"type": "text",
"text": "Org :" + mes2,
"wrap": True,
"size": "lg",
"flex": 1,
"gravity": "center"
},
{
"type": "separator"
},
]
}
]
},
{
"type": "box",
"layout": "horizontal",
"contents": [
{
"type": "box",
"layout": "vertical",
"contents": [
{
"type": "text",
"text": "Password :" + mes3,
"wrap": True,
"size": "lg",
"flex": 1,
"gravity": "center"
},
{
"type": "separator"
},
]
}
]
},
],
},
}
},
{
"type": "text",
"text": "กรุณากรอก " + ask
}
]
}
)
requests.post(LINE_API, headers=headers, data=data_send)
def sign_in(user, mes1, mes2, mes3):
LINE_API = configBot.LineApiReply
headers = {
'Content-Type': 'application/json; charset=UTF-8',
'Authorization': 'Bearer ' + Authorization
}
data_send = json.dumps(
{
"replyToken": user,
"messages": [
{
"type": "flex",
"altText": "singin",
"backgroundColor": "#3ebb75",
"contents": {
"type": "bubble",
"size": "giga",
# "backgroundColor": "#3ebb75",
"header": {
"type": "box",
"layout": "vertical",
"backgroundColor": "#46CCFF",
"contents": [
{
"type": "box",
"position": "absolute",
"offsetTop": "60%",
"offsetBottom": "20%",
"offsetStart": "1%",
"offsetEnd": "60%",
"layout": "vertical",
"contents": [
{
"type": "image",
"url": "https://si-sawad.com/image_report/background.png",
"aspectRatio": "3:1",
"size": "md",
}
]
},
# {
# "type": "image",
# "url": "https://si-sawad.com/image_report/background.jpg",
# "size": "sm",
# # "aspectMode": "f",
# "aspectRatio": "1:1",
# # "offsetTop": "0px",
# # "offsetBottom": "0px",
# # "offsetStart": "0px",
# # "offsetEnd": "0px",
# "aspectMode": "cover",
# "gravity": "center"
# },
{
"type": "text",
"text": "signin",
"gravity": "center",
"size": "lg"
}
]
},
"body": {
"type": "box",
"layout": "vertical",
"backgroundColor": "#F9FFFF",
"spacing": "md",
"contents": [
{
"type": "box",
"layout": "horizontal",
"contents": [
{
"type": "box",
"layout": "vertical",
"contents": [
{
"type": "text",
"text": "Email :" + mes1,
"wrap": True,
"size": "lg",
"flex": 1,
"gravity": "center"
},
{
"type": "separator"
},
]
}
]
},
{
"type": "box",
"layout": "horizontal",
"contents": [
{
"type": "box",
"layout": "vertical",
"contents": [
{
"type": "text",
"text": "Org :" + mes2,
"wrap": True,
"size": "lg",
"flex": 1,
"gravity": "center"
},
{
"type": "separator"
},
]
}
]
},
{
"type": "box",
"layout": "horizontal",
"contents": [
{
"type": "box",
"layout": "vertical",
"contents": [
{
"type": "text",
"text": "Password :" + mes3,
"wrap": True,
"size": "lg",
"flex": 1,
"gravity": "center"
},
{
"type": "separator"
},
]
}
]
},
{
"type": "image",
"url": "https://si-sawad.com/image_report/Edit.png",
"size": "md",
"position": "absolute",
"offsetTop": "12%",
"offsetBottom": "25%",
"offsetStart": "85%",
"offsetEnd": "5%",
"aspectRatio": "3:1",
"action": {
"type": "message",
"label": "รายละเอียด",
"text": "Email"
}
},
{
"type": "image",
"url": "https://si-sawad.com/image_report/Edit.png",
"size": "md",
"position": "absolute",
"offsetTop": "50%",
"offsetBottom": "25%",
"offsetStart": "85%",
"offsetEnd": "5%",
"aspectRatio": "3:1",
"action": {
"type": "message",
"label": "รายละเอียด",
"text": "Org"
}
},
{
"type": "image",
"url": "https://si-sawad.com/image_report/Edit.png",
"size": "md",
"position": "absolute",
"offsetTop": "90%",
"offsetBottom": "10%",
"offsetStart": "85%",
"offsetEnd": "5%",
"aspectRatio": "3:1",
"action": {
"type": "message",
"label": "รายละเอียด",
"text": "Password"
}
},
# {
# "type": "separator"
# },
],
},
"footer": {
"type": "box",
"layout": "horizontal",
# "layout": "vertical",
"spacing": "md",
"contents": [
{
"type": "button",
"style": "secondary",
# "layout": "horizontal",
"color": "#3ebb75",
"action": {
"type": "message",
"label": "Confirm",
"text": "Confirm"
}
},
{
"type": "button",
"style": "secondary",
# "layout": "horizontal",
"color": "#eb766b",
"action": {
"type": "message",
"label": "Cancel",
"text": "Cancel"
}
}
]
}
}
}
]
}
)
requests.post(LINE_API, headers=headers, data=data_send)
def loop_email(user):
LINE_API = configBot.LineApiReply
headers = {
'Content-Type': 'application/json; charset=UTF-8',
'Authorization': 'Bearer ' + Authorization
}
data_send = json.dumps(
{
"replyToken": user,
"messages": [
{
"type": "text",
"text": "กรุณากรอก email"
}
]
}
)
requests.post(LINE_API, headers=headers, data=data_send)
def loop_org(user):
LINE_API = configBot.LineApiReply
headers = {
'Content-Type': 'application/json; charset=UTF-8',
'Authorization': 'Bearer ' + Authorization
}
data_send = json.dumps(
{
"replyToken": user,
"messages": [
{
"type": "text",
"text": "กรุณากรอก org"
}
]
}
)
requests.post(LINE_API, headers=headers, data=data_send)
def loop_password(user):
LINE_API = configBot.LineApiReply
headers = {
'Content-Type': 'application/json; charset=UTF-8',
'Authorization': 'Bearer ' + Authorization
}
data_send = json.dumps(
{
"replyToken": user,
"messages": [
{
"type": "text",
"text": "กรุณากรอก password"
}
]
}
)
requests.post(LINE_API, headers=headers, data=data_send)
def loop_confirm(user):
LINE_API = configBot.LineApiReply
headers = {
'Content-Type': 'application/json; charset=UTF-8',
'Authorization': 'Bearer ' + Authorization
}
data_send = json.dumps(
{
"replyToken": user,
"messages": [
{
"type": "text",
"text": "Please wait"
}
]
}
)
requests.post(LINE_API, headers=headers, data=data_send)
def loop_cancel(user):
LINE_API = configBot.LineApiReply
headers = {
'Content-Type': 'application/json; charset=UTF-8',
'Authorization': 'Bearer ' + Authorization
}
data_send = json.dumps(
{
"replyToken": user,
"messages": [
{
"type": "text",
"text": "Login canceled"
}
]
}
)
requests.post(LINE_API, headers=headers, data=data_send)
def loop_error(user):
LINE_API = configBot.LineApiReply
headers = {
'Content-Type': 'application/json; charset=UTF-8',
'Authorization': 'Bearer ' + Authorization
}
data_send = json.dumps(
{
"replyToken": user,
"messages": [
{
"type": "text",
"text": "Invalid login"
}
]
}
)
requests.post(LINE_API, headers=headers, data=data_send)
def loop_success(user, mes1):
LINE_API = configBot.LineApiReply
headers = {
'Content-Type': 'application/json; charset=UTF-8',
'Authorization': 'Bearer ' + Authorization
}
data_send = json.dumps(
{
"replyToken": user,
"messages": [
{
"type": "text",
"text": "Login successful, Welcome " + mes1
}
]
}
)
requests.post(LINE_API, headers=headers, data=data_send)
def loop_lockout(user, release_time):
time = release_time.strftime("%d-%b-%Y (%H:%M:%S.%f)")
LINE_API = configBot.LineApiReply
headers = {
'Content-Type': 'application/json; charset=UTF-8',
'Authorization': 'Bearer ' + Authorization
}
data_send = json.dumps(
{
"replyToken": user,
"messages": [
{
"type": "text",
"text": "You are currently locked out until " + time
}
]
}
)
requests.post(LINE_API, headers=headers, data=data_send)
def logout_failed(user):
LINE_API = configBot.LineApiReply
headers = {
'Content-Type': 'application/json; charset=UTF-8',
'Authorization': 'Bearer ' + Authorization
}
data_send = json.dumps(
{
"replyToken": user,
"messages": [
{
"type": "text",
"text": "You are currently not logged in"
}
]
}
)
requests.post(LINE_API, headers=headers, data=data_send)
def logout_success(user):
LINE_API = configBot.LineApiReply
headers = {
'Content-Type': 'application/json; charset=UTF-8',
'Authorization': 'Bearer ' + Authorization
}
data_send = json.dumps(
{
"replyToken": user,
"messages": [
{
"type": "text",
"text": "Logout successful"
}
]
}
)
requests.post(LINE_API, headers=headers, data=data_send)
def user_already_logged_in(user):
LINE_API = configBot.LineApiReply
headers = {
'Content-Type': 'application/json; charset=UTF-8',
'Authorization': 'Bearer ' + Authorization
}
data_send = json.dumps(
{
"replyToken": user,
"messages": [
{
"type": "text",
"text": "This user is already logged in"
}
]
}
)
requests.post(LINE_API, headers=headers, data=data_send)
def quick_reply(user):
LINE_API = configBot.LineApiReply
headers = {
'Content-Type': 'application/json; charset=UTF-8',
'Authorization': 'Bearer ' + Authorization
}
data_send = json.dumps(
{
"replyToken": user,
"messages": [
{
"type": "text",
"text": "Hello Quick Reply!",
"quickReply": {
"items": [
{
"type": "action",
"action": {
"type": "message",
"label": "รายงาน",
"text": "รายงาน"
}
},
{
"type": "action",
"action": {
"type": "message",
"label": "รายการอนุมัติ",
"text": "รายการอนุมัติ"
}
},
{
"type": "action",
"action": {
"type": "message",
"label": "สืนค้าคงเหลือ",
"text": "สืนค้าคงเหลือ"
}
},
{
"type": "action",
"action": {
"type": "message",
"label": "จบการทำงาน",
"text": "จบการทำงาน"
}
},
]
}
}
]
}
)
requests.post(LINE_API, headers=headers, data=data_send)
if __name__ == '__main__':
app.run(debug=True, port=4000, host='127.0.0.1')
| 40.997854 | 125 | 0.32693 | 2,432 | 38,210 | 4.967928 | 0.11801 | 0.019037 | 0.049495 | 0.056282 | 0.763946 | 0.750538 | 0.730839 | 0.714286 | 0.704602 | 0.689042 | 0 | 0.015758 | 0.55988 | 38,210 | 931 | 126 | 41.04189 | 0.701433 | 0.017823 | 0 | 0.531361 | 0 | 0.002367 | 0.155599 | 0.003227 | 0 | 0 | 0 | 0 | 0 | 1 | 0.024852 | false | 0.020118 | 0.008284 | 0.001183 | 0.036686 | 0.04142 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
bf6b56be609fd154d43041f26ea51d56b756a740 | 3,078 | py | Python | project/models/model_utils.py | Sanger2000/Predicting-Lung-Cancer-Disease-Progression-from-CT-reports | c95f7e666a714c9b3dc9a9a7418a7e37cef86c55 | [
"MIT"
] | null | null | null | project/models/model_utils.py | Sanger2000/Predicting-Lung-Cancer-Disease-Progression-from-CT-reports | c95f7e666a714c9b3dc9a9a7418a7e37cef86c55 | [
"MIT"
] | null | null | null | project/models/model_utils.py | Sanger2000/Predicting-Lung-Cancer-Disease-Progression-from-CT-reports | c95f7e666a714c9b3dc9a9a7418a7e37cef86c55 | [
"MIT"
] | null | null | null | import os
import sys
import torch
import torch.autograd as autograd
import torch.nn.functional as F
import torch.nn as nn
import torch.utils.data as data
import tqdm
import datetime
import pdb
class TextNet(nn.Module):
def __init__(self, args):
super(TextNet, self).__init__()
self.relu = nn.ReLU()
self.dropout = nn.Dropout(args.dropout)
self.output = nn.Linear(args.max_prog + args.max_base, 4)
self.softmax = nn.Softmax(dim=-1)
self.val_acc = 0
self.args = args
def forward(self, x):
first_out = self.output(x)
nonlinear = self.relu(first_out)
dropout = self.dropout(nonlinear)
return self.softmax(dropout)
def set_accuracy(self, acc):
self.val_acc = acc
def get_accuracy(self):
return self.val_acc
def get_args(self):
return self.args
class FeatureNet(nn.Module):
def __init__(self, args, concat_func):
super(FeatureNet, self).__init__()
self.shared_layer = nn.Linear(args.max_before+args.max_after+len(args.desired_features), args.mid_dim)
self.relu = nn.ReLU()
self.dropout = nn.Dropout(args.dropout)
self.output = nn.Linear(args.mid_dim*2, 4, bias=False)
self.softmax = nn.Softmax(dim=-1)
self.concat_func = concat_func
self.val_acc = 0
self.args = args
def forward(self, x, y):
first_base = self.relu(self.shared_layer(x))
second_base = self.concat_func(first_base, 1)
first_progress = self.relu(self.shared_layer(y))
second_progress = self.concat_func(first_progress, 1)
overall_train = torch.cat((second_base, second_progress), dim=-1)
overall_out = self.output(overall_train)
return self.softmax(overall_out)
def set_accuracy(self, acc):
self.val_acc = acc
def get_accuracy(self):
return self.val_acc
def get_args(self):
return self.args
class CombinedNet(nn.Module):
def __init__(self, args, concat_func):
super(CombinedNet, self).__init__()
self.shared_layer = nn.Linear(args.max_before+args.max_after+len(args.desired_features), args.mid_dim)
self.relu = nn.ReLU()
self.dropout = nn.Dropout(args.dropout)
self.output = nn.Linear(args.mid_dim*2 + args.max_prog + args.max_base, 4, bias=False)
self.softmax = nn.Softmax(dim=-1)
self.concat_func = concat_func
self.val_acc = 0
self.args = args
def forward(self, x, y, z):
first_base = self.relu(self.shared_layer(x))
second_base = self.concat_func(first_base, 1)
first_progress = self.relu(self.shared_layer(y))
second_progress = self.concat_func(first_progress, 1)
overall_train = torch.cat((second_base, second_progress, z), dim=-1)
overall_out = self.output(overall_train)
return self.softmax(overall_out)
def set_accuracy(self, acc):
self.val_acc = acc
def get_accuracy(self):
return self.val_acc
def get_args(self):
return self.args
| 27.981818 | 110 | 0.652047 | 434 | 3,078 | 4.398618 | 0.154378 | 0.052383 | 0.047145 | 0.037716 | 0.827135 | 0.816658 | 0.80461 | 0.783133 | 0.783133 | 0.743321 | 0 | 0.005991 | 0.240741 | 3,078 | 109 | 111 | 28.238532 | 0.810869 | 0 | 0 | 0.6375 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.1875 | false | 0 | 0.125 | 0.075 | 0.4625 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
bf717ae858ac1291cb6b62cdd4f1eaec4583ecb0 | 145 | py | Python | roles/openshift_health_checker/test/conftest.py | emailtovinod/openshift-ansible | 03ae7f76debd812264b6831a6f3314825b251abe | [
"Apache-2.0"
] | null | null | null | roles/openshift_health_checker/test/conftest.py | emailtovinod/openshift-ansible | 03ae7f76debd812264b6831a6f3314825b251abe | [
"Apache-2.0"
] | null | null | null | roles/openshift_health_checker/test/conftest.py | emailtovinod/openshift-ansible | 03ae7f76debd812264b6831a6f3314825b251abe | [
"Apache-2.0"
] | 1 | 2018-07-11T08:43:55.000Z | 2018-07-11T08:43:55.000Z | import os
import sys
# extend sys.path so that tests can import openshift_checks
sys.path.insert(1, os.path.dirname(os.path.dirname(__file__)))
| 24.166667 | 62 | 0.786207 | 25 | 145 | 4.36 | 0.6 | 0.12844 | 0.238532 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.007752 | 0.110345 | 145 | 5 | 63 | 29 | 0.837209 | 0.393103 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
bf8ab73914a413612e83a36229a249a249fe6e49 | 123 | py | Python | blog/functional_test/conftest.py | UrangUrang/flask-tdd-with-testing-goat | 4bbe1d0592bc023ce42f998c6836cebef133d533 | [
"MIT"
] | 26 | 2017-05-02T09:23:56.000Z | 2021-04-30T02:29:32.000Z | blog/functional_test/conftest.py | UrangUrang/flask_tdd_with_testing_goat | 4bbe1d0592bc023ce42f998c6836cebef133d533 | [
"MIT"
] | 1 | 2017-05-25T08:11:28.000Z | 2017-06-01T16:28:03.000Z | blog/functional_test/conftest.py | UrangUrang/flask_tdd_with_testing_goat | 4bbe1d0592bc023ce42f998c6836cebef133d533 | [
"MIT"
] | 5 | 2017-05-29T06:05:32.000Z | 2020-02-29T05:35:19.000Z | import pytest
from selenium import webdriver
@pytest.fixture(scope="module")
def browser():
return webdriver.Chrome() | 17.571429 | 31 | 0.764228 | 15 | 123 | 6.266667 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.130081 | 123 | 7 | 32 | 17.571429 | 0.878505 | 0 | 0 | 0 | 0 | 0 | 0.048387 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | true | 0 | 0.4 | 0.2 | 0.8 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 6 |
bfb187acd3d8c7e5c42f5939f53b782bae3b4df6 | 73 | py | Python | models/__init__.py | canyilmaz90/pytorch-YOLOv4 | 0217bccd2790a9c86e405a3d6d68071ff4db3775 | [
"Apache-2.0"
] | null | null | null | models/__init__.py | canyilmaz90/pytorch-YOLOv4 | 0217bccd2790a9c86e405a3d6d68071ff4db3775 | [
"Apache-2.0"
] | null | null | null | models/__init__.py | canyilmaz90/pytorch-YOLOv4 | 0217bccd2790a9c86e405a3d6d68071ff4db3775 | [
"Apache-2.0"
] | null | null | null | from .resnet import *
from .mobilenetv3 import *
from . import detection
| 18.25 | 26 | 0.767123 | 9 | 73 | 6.222222 | 0.555556 | 0.357143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.016393 | 0.164384 | 73 | 3 | 27 | 24.333333 | 0.901639 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
44ac050307bf530e72b63d1dd3c9cdc6b062639f | 38 | py | Python | pype9/simulate/neuron/cells/__init__.py | tclose/Pype9 | 23f96c0885fd9df12d9d11ff800f816520e4b17a | [
"MIT"
] | null | null | null | pype9/simulate/neuron/cells/__init__.py | tclose/Pype9 | 23f96c0885fd9df12d9d11ff800f816520e4b17a | [
"MIT"
] | null | null | null | pype9/simulate/neuron/cells/__init__.py | tclose/Pype9 | 23f96c0885fd9df12d9d11ff800f816520e4b17a | [
"MIT"
] | 1 | 2021-04-08T12:46:21.000Z | 2021-04-08T12:46:21.000Z | from .base import CellMetaClass, Cell
| 19 | 37 | 0.815789 | 5 | 38 | 6.2 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.131579 | 38 | 1 | 38 | 38 | 0.939394 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
44b5646b3e8edbc85a4d81e6ff4ae1a9dd3408f1 | 2,297 | py | Python | src/fea/inputs/__config.py | ananthsridharan/vtol_sizing | 3f754e1bd3cebdb5b5c68c8a2d84c47be1df2f02 | [
"MIT"
] | 10 | 2020-03-24T10:20:52.000Z | 2021-11-22T18:49:25.000Z | src/fea/inputs/__config.py | ananthsridharan/vtol_sizing | 3f754e1bd3cebdb5b5c68c8a2d84c47be1df2f02 | [
"MIT"
] | 4 | 2020-12-08T10:26:41.000Z | 2021-10-04T18:19:59.000Z | src/fea/inputs/__config.py | ananthsridharan/vtol_sizing | 3f754e1bd3cebdb5b5c68c8a2d84c47be1df2f02 | [
"MIT"
] | 5 | 2018-11-27T21:21:19.000Z | 2021-04-20T15:44:18.000Z | # nnodes, nelem
31 30
# node positions
0.000000e+00 0.000000e+00 0.000000e+00
3.000000e-01 0.000000e+00 0.000000e+00
6.000000e-01 0.000000e+00 0.000000e+00
9.000000e-01 0.000000e+00 0.000000e+00
1.200000e+00 0.000000e+00 0.000000e+00
1.500000e+00 0.000000e+00 0.000000e+00
-3.000000e-01 0.000000e+00 0.000000e+00
-6.000000e-01 0.000000e+00 0.000000e+00
-9.000000e-01 0.000000e+00 0.000000e+00
-1.200000e+00 0.000000e+00 0.000000e+00
-1.500000e+00 0.000000e+00 0.000000e+00
1.800000e+00 2.500000e-01 0.000000e+00
2.100000e+00 5.000000e-01 0.000000e+00
2.400000e+00 7.500000e-01 0.000000e+00
2.700000e+00 1.000000e+00 0.000000e+00
3.000000e+00 1.250000e+00 0.000000e+00
1.800000e+00 -2.500000e-01 0.000000e+00
2.100000e+00 -5.000000e-01 0.000000e+00
2.400000e+00 -7.500000e-01 0.000000e+00
2.700000e+00 -1.000000e+00 0.000000e+00
3.000000e+00 -1.250000e+00 0.000000e+00
-1.800000e+00 2.500000e-01 0.000000e+00
-2.100000e+00 5.000000e-01 0.000000e+00
-2.400000e+00 7.500000e-01 0.000000e+00
-2.700000e+00 1.000000e+00 0.000000e+00
-3.000000e+00 1.250000e+00 0.000000e+00
-1.800000e+00 -2.500000e-01 0.000000e+00
-2.100000e+00 -5.000000e-01 0.000000e+00
-2.400000e+00 -7.500000e-01 0.000000e+00
-2.700000e+00 -1.000000e+00 0.000000e+00
-3.000000e+00 -1.250000e+00 0.000000e+00
# element connectivity
1 2
2 3
3 4
4 5
5 6
1 7
7 8
8 9
9 10
10 11
6 12
12 13
13 14
14 15
15 16
6 17
17 18
18 19
19 20
20 21
11 22
22 23
23 24
24 25
25 26
11 27
27 28
28 29
29 30
30 31
# force/moment boundary condition (nforce \\ nodeid dof id || 1-Fx 2-Fy 3-Fz 4-Mx 5-My 6-Mz)
8
16 3 891.818
21 3 891.818
26 3 891.818
31 3 891.818
16 6 104.957
21 6 104.957
26 6 -104.957
31 6 -104.957
# displacement boundary condition (ndisp \\ nodeid, dof_id, displacement || 1-u 2-v 3-w 4-thx 5-thy 6-thz)
6
1 1 0.0
1 2 0.0
1 3 0.0
1 4 0.0
1 5 0.0
1 6 0.0
# element properties (Aluminum)
68.9e9 # 1 - elastic modulus
0.33d0 # 2 - poissons ratio
2700.d0 # 3 - density of material
4.1591e-3 #5.8905d-3 # 4 - cross-sectional area
2.2942e-6 #4.6019e-6 # 5 - Iy
2.2942e-6 #4.6019e-6 # 6 - Iz
4.5885e-6 #9.2039e-6 # 7 - J (polar moment)
0.d0 # 8 - ky (shear factor)
0.d0 # 9 - kz (shear factor)
4.201e-2 #0.05d0 # 10 - max y dimension
4.201e-2 #0.05d0 # 11 - max z dimension
276.e6 # 12 - Yield stress
| 23.680412 | 106 | 0.696996 | 535 | 2,297 | 2.990654 | 0.214953 | 0.286875 | 0.26875 | 0.18 | 0.6125 | 0.59875 | 0.57625 | 0.57625 | 0.57625 | 0.57625 | 0 | 0.598237 | 0.160209 | 2,297 | 96 | 107 | 23.927083 | 0.231208 | 0.249891 | 0 | 0.066667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
44c455737c946cbbe3135d8d606ddce2b7a636e3 | 140 | py | Python | py_adventure/__init__.py | ABostrom/py_adventure | 4e2765be1d05aba45c3a45280aeea0b709814d63 | [
"MIT"
] | 2 | 2020-10-12T13:33:31.000Z | 2020-10-14T12:00:40.000Z | py_adventure/__init__.py | ABostrom/py_adventure | 4e2765be1d05aba45c3a45280aeea0b709814d63 | [
"MIT"
] | null | null | null | py_adventure/__init__.py | ABostrom/py_adventure | 4e2765be1d05aba45c3a45280aeea0b709814d63 | [
"MIT"
] | 1 | 2021-04-30T08:10:34.000Z | 2021-04-30T08:10:34.000Z | from .entities import *
from .world import *
from .events import *
from .containers import *
from .io import *
from .game_manager import *
| 17.5 | 27 | 0.735714 | 19 | 140 | 5.368421 | 0.473684 | 0.490196 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.178571 | 140 | 7 | 28 | 20 | 0.886957 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
44c5946616bba66ab40e5b35365fa3d25e998f8a | 117 | py | Python | tools/dev/llvm_path.py | sineang01/hashlibcxx | 3fc50e6bd69eab0b3a33d388ff22dee5e8cfea3d | [
"BSD-3-Clause"
] | null | null | null | tools/dev/llvm_path.py | sineang01/hashlibcxx | 3fc50e6bd69eab0b3a33d388ff22dee5e8cfea3d | [
"BSD-3-Clause"
] | null | null | null | tools/dev/llvm_path.py | sineang01/hashlibcxx | 3fc50e6bd69eab0b3a33d388ff22dee5e8cfea3d | [
"BSD-3-Clause"
] | null | null | null | import sys
import os
head, tail = os.path.split(sys.argv[1])
head, tail = os.path.split(head)
sys.stdout.write(head)
| 19.5 | 39 | 0.726496 | 22 | 117 | 3.863636 | 0.5 | 0.188235 | 0.235294 | 0.329412 | 0.447059 | 0 | 0 | 0 | 0 | 0 | 0 | 0.009615 | 0.111111 | 117 | 5 | 40 | 23.4 | 0.807692 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.4 | 0 | 0.4 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
44d7090757c0e316c1a53541ca6b72ec5f3307fe | 29,463 | py | Python | networking_nec/tests/unit/nwa/l2/drivers/test_mech_necnwa.py | nec-openstack/networking-nec-nwa | 0c5a4a9fb74b6dc78b773d78755c758ed67ed777 | [
"Apache-2.0"
] | null | null | null | networking_nec/tests/unit/nwa/l2/drivers/test_mech_necnwa.py | nec-openstack/networking-nec-nwa | 0c5a4a9fb74b6dc78b773d78755c758ed67ed777 | [
"Apache-2.0"
] | null | null | null | networking_nec/tests/unit/nwa/l2/drivers/test_mech_necnwa.py | nec-openstack/networking-nec-nwa | 0c5a4a9fb74b6dc78b773d78755c758ed67ed777 | [
"Apache-2.0"
] | null | null | null | # Copyright 2015-2016 NEC Corporation. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from mock import MagicMock
from mock import patch
from neutron.common import constants
from neutron import context
from neutron.extensions import providernet as prov_net
from neutron.tests.unit import testlib_api
from oslo_config import cfg
from oslo_serialization import jsonutils
from networking_nec.nwa.common import constants as nwa_const
from networking_nec.nwa.common import exceptions as nwa_exc
from networking_nec.nwa.l2.drivers import mech_necnwa as mech
class TestMechNwa(testlib_api.SqlTestCase):
def setUp(self):
super(TestMechNwa, self).setUp()
class network_context(object):
network = MagicMock()
current = MagicMock()
_plugin = MagicMock()
_plugin_context = MagicMock()
_binding = MagicMock()
_plugin_context.session = context.get_admin_context().session
def set_binding(self, segment_id, vif_type, vif_details,
status=None):
self._binding.segment = segment_id
self._binding.vif_type = vif_type
self._binding.vif_details = vif_details
self._new_port_status = status
self.context = network_context()
self.context.network.current = {
'tenant_id': 'tenant201',
'name': 'tenant 201',
'id': '61',
'network_type': 'vlan',
'physical_network': 'Common/App/Pod3',
'segments': []
}
self.context._port = {
'binding:vif_details': {},
'binding:vif_type': 'ovs',
'binding:vnic_type': 'normal',
'id': 'uuid-port-100',
'device_owner': constants.DEVICE_OWNER_ROUTER_INTF,
'device_id': 'uuid-device_id_100',
'fixed_ips': [
{'ip_address': '192.168.120.1',
'subnet_id': '65e6bc06-09b5-4a16-b093-cbc177818b9e'}
],
'mac_address': '12:34:56:78:9a:bc'
}
self.context._binding.segment = ''
self.context._binding.vif_type = ''
self.host_agents = [
{
"binary": "neutron-openvswitch-agent",
"description": None,
"admin_state_up": True,
"topic": "N/A",
"host": "harry",
"agent_type": "Open vSwitch agent",
"id": "a01dc42f-0d15-43ff-8f80-22e15cfe715d",
"configurations": {
"tunnel_types": [],
"tunneling_ip": "",
"bridge_mappings": {
"Common/App/Pod3": "br-eth1",
"Common/KVM/Pod1-1": "br-eth2"
},
"l2_population": False,
"devices": 0
},
"alive": True
},
{
"binary": "neutron-openvswitch-agent",
"description": None,
"admin_state_up": True,
"topic": "N/A",
"host": "harry",
"agent_type": "Open vSwitch agent",
"id": "a01dc42f-0d15-43ff-8f80-22e15cfe715d",
"configurations": {
"tunnel_types": [],
"tunneling_ip": "",
"bridge_mappings": {
"Common/App/Pod4": "br-eth1"
},
"l2_population": False,
"devices": 0
},
"alive": False
}
]
self.context.host_agents = MagicMock(return_value=self.host_agents)
self.network_segments = [
{
"provider:network_type": "vlan",
"provider:physical_network": "Common/KVM/Pod1-1",
"id": "uuid-1-1",
"provider:segmentation_id": 100
},
{
"network_type": "vlan",
"physical_network": "Common/KVM/Pod1-2",
"id": "uuid-1-2",
"provider:segmentation_id": 101
},
{
"provider:network_type": "vlan",
"provider:physical_network": "Common/App/Pod3",
"id": "61",
"provider:segmentation_id": 102
}
]
self.context.network.network_segments = self.network_segments
self.context.original = MagicMock()
resource_group = [
{
"physical_network": "Common/KVM/Pod1-1",
"device_owner": "compute:AZ1",
"ResourceGroupName": "Common/KVM/Pod1-1"
},
{
"physical_network": "Common/App/Pod3",
"device_owner": "ironic:isolation",
"ResourceGroupName": "Common/App/Pod3"
},
{
"physical_network": "Common/App/Pod3",
"device_owner": "network:dhcp",
"ResourceGroupName": "Common/App/Pod3"
},
{
"physical_network": "Common/App/Pod3",
"device_owner": constants.DEVICE_OWNER_ROUTER_GW,
"ResourceGroupName": "Common/App/Pod3"
},
{
"physical_network": "Common/App/Pod3",
"device_owner": constants.DEVICE_OWNER_ROUTER_INTF,
"ResourceGroupName": "Common/App/Pod3"
}
]
fn_resource_group = self.get_temp_file_path('resource_group.json')
with open(fn_resource_group, 'w') as f:
f.write(jsonutils.dumps(resource_group))
cfg.CONF.set_override('resource_group_file', fn_resource_group,
group='NWA')
class TestNECNWAMechanismDriver(TestMechNwa):
def setUp(self):
super(TestNECNWAMechanismDriver, self).setUp()
self.driver = mech.NECNWAMechanismDriver()
self.driver.initialize()
self.rcode = MagicMock()
self.rcode.value_json = {
'CreateTenant': True,
'NWA_tenant_id': 'RegionOnetenant201'
}
def _get_nwa_tenant_binding(self, value_json):
rcode = MagicMock()
rcode.value_json = value_json
return rcode
def test_create_port_precommit_compute(self):
self.context._port['device_owner'] = 'compute:DC01_KVM01_ZONE01'
self.driver.create_port_precommit(self.context)
def test_create_port_precommit_group_not_found(self):
self.driver.resource_groups = [
{
"physical_network": "Common/App/Pod3",
"device_owner": constants.DEVICE_OWNER_ROUTER_GW,
"ResourceGroupName": "Common/App/Pod3"
}
]
self.assertRaises(nwa_exc.ResourceGroupNameNotFound,
self.driver.create_port_precommit, self.context)
@patch('networking_nec.nwa.l2.db_api.add_nwa_tenant_binding')
@patch('networking_nec.nwa.l2.db_api.get_nwa_tenant_binding')
def test_create_port_precommit_owner_router_intf(self, gntb, antb):
self.context._port['device_owner'] = constants.DEVICE_OWNER_ROUTER_INTF
gntb.return_value = self.rcode
self.driver.create_port_precommit(self.context)
self.context._port['device_owner'] = constants.DEVICE_OWNER_ROUTER_INTF
gntb.return_value = self._get_nwa_tenant_binding({
'CreateTenant': True,
'NWA_tenant_id': 'RegionOnetenant201',
'DEV_uuid-device_id_100': 'device_id',
'DEV_uuid-device_id_100_62': constants.DEVICE_OWNER_ROUTER_INTF,
'DEV_uuid-device_id_100_62_ip_address': '192.168.120.1',
'DEV_uuid-device_id_100_62_mac_address': '12:34:56:78:9a:bc'}
)
self.driver.create_port_precommit(self.context)
@patch('networking_nec.nwa.l2.db_api.add_nwa_tenant_binding')
@patch('networking_nec.nwa.l2.db_api.get_nwa_tenant_binding')
def test_create_port_precommit_owner_router_gw(self, gntb, antb):
self.context._port['device_owner'] = constants.DEVICE_OWNER_ROUTER_GW
gntb.return_value = self.rcode
self.driver.create_port_precommit(self.context)
self.context._port['device_owner'] = constants.DEVICE_OWNER_ROUTER_GW
gntb.return_value = self._get_nwa_tenant_binding({
'CreateTenant': True,
'NWA_tenant_id': 'RegionOnetenant201',
'DEV_uuid-device_id_100': 'device_id',
'DEV_uuid-device_id_100_62': constants.DEVICE_OWNER_ROUTER_GW,
'DEV_uuid-device_id_100_62_ip_address': '192.168.120.1',
'DEV_uuid-device_id_100_62_mac_address': '12:34:56:78:9a:bc'}
)
self.driver.create_port_precommit(self.context)
def test_update_port_precommit(self):
for device_owner in (constants.DEVICE_OWNER_ROUTER_INTF,
constants.DEVICE_OWNER_ROUTER_GW):
self.context._port['device_owner'] = device_owner
self.driver.update_port_precommit(self.context)
@patch('networking_nec.nwa.l2.utils.portcontext_to_nwa_info')
@patch('networking_nec.nwa.l2.db_api.get_nwa_tenant_binding')
def test_update_port_precommit_current_none(self, gntb, ptni):
self.context.current = self.context._port
self.context.current['device_id'] = None
self.context.current['device_owner'] = None
self.context.original[
'device_owner'] = constants.DEVICE_OWNER_ROUTER_INTF
self.context.original['device_id'] = 'uuid-device_id_000'
gntb.return_value = self._get_nwa_tenant_binding({
'CreateTenant': True,
'NWA_tenant_id': 'RegionOnetenant201',
'DEV_uuid-device_id_100_61': constants.DEVICE_OWNER_ROUTER_INTF,
'DEV_uuid-device_id_100_61_TYPE': nwa_const.NWA_DEVICE_TFW,
'DEV_uuid-device_id_100_61_ip_address': '192.168.120.1',
'DEV_uuid-device_id_100_61_mac_address': '12:34:56:78:9a:bc'}
)
ptni.return_value = {
'tenant_id': 'Tenant1',
'nwa_tenant_id': 'RegionOnetenant201',
'resource_group_name': 'Common/App/Pod3',
'physical_network': 'Common/App/Pod3'}
self.driver.update_port_precommit(self.context)
@patch('networking_nec.nwa.l2.db_api.set_nwa_tenant_binding')
@patch('networking_nec.nwa.l2.db_api.get_nwa_tenant_binding')
def test_delete_port_precommit_owner_router_interface(self, gntb, sntb):
self.context._port['device_owner'] = constants.DEVICE_OWNER_ROUTER_INTF
gntb.return_value = self.rcode
self.driver.delete_port_precommit(self.context)
self.context._port['device_owner'] = constants.DEVICE_OWNER_ROUTER_INTF
gntb.return_value = self._get_nwa_tenant_binding({
'CreateTenant': True,
'NWA_tenant_id': 'RegionOnetenant201',
'DEV_uuid-device_id_100_61': constants.DEVICE_OWNER_ROUTER_INTF,
'DEV_uuid-device_id_100_61_TYPE': nwa_const.NWA_DEVICE_TFW,
'DEV_uuid-device_id_100_61_ip_address': '192.168.120.1',
'DEV_uuid-device_id_100_61_mac_address': '12:34:56:78:9a:bc'}
)
self.driver.delete_port_precommit(self.context)
self.context._port['device_owner'] = constants.DEVICE_OWNER_ROUTER_INTF
gntb.return_value = self._get_nwa_tenant_binding({
'CreateTenant': True,
'NWA_tenant_id': 'RegionOnetenant201',
'DEV_uuid-device_id_100_61': constants.DEVICE_OWNER_ROUTER_INTF,
'DEV_uuid-device_id_100_61_TYPE': nwa_const.NWA_DEVICE_TFW,
'DEV_uuid-device_id_100_61_ip_address': '192.168.120.1',
'DEV_uuid-device_id_100_61_mac_address': '12:34:56:78:9a:bc'}
)
self.driver.delete_port_precommit(self.context)
self.context._port['device_owner'] = constants.DEVICE_OWNER_ROUTER_INTF
gntb.return_value = self._get_nwa_tenant_binding({
'CreateTenant': True,
'NWA_tenant_id': 'RegionOnetenant201',
'DEV_uuid-device_id_100_61': constants.DEVICE_OWNER_ROUTER_INTF,
'DEV_uuid-device_id_100_61_TYPE': nwa_const.NWA_DEVICE_TFW,
'DEV_uuid-device_id_100_61_ip_address': '192.168.120.1',
'DEV_uuid-device_id_100_61_mac_address': '12:34:56:78:9a:bc',
'DEV_uuid-device_id_100_62_ip_address': '192.168.120.2'}
)
self.driver.delete_port_precommit(self.context)
self.context._port['device_owner'] = constants.DEVICE_OWNER_ROUTER_INTF
gntb.return_value = self._get_nwa_tenant_binding({
'CreateTenant': True,
'NWA_tenant_id': 'RegionOnetenant201',
'DEV_uuid-device_id_100_61': 'net001',
'DEV_uuid-device_id_100_61_TYPE': nwa_const.NWA_DEVICE_TFW,
'DEV_uuid-device_id_100_61_ip_address': '192.168.120.1',
'DEV_uuid-device_id_100_61_mac_address': '12:34:56:78:9a:bc',
'DEV_uuid-device_id_100_62_ip_address': '192.168.120.2'}
)
self.driver.delete_port_precommit(self.context)
@patch('networking_nec.nwa.l2.db_api.set_nwa_tenant_binding')
@patch('networking_nec.nwa.l2.db_api.get_nwa_tenant_binding')
def test_delete_port_precommit_owner_network_floatingip(self, gntb, sntb):
self.context._port['device_owner'] = 'network:floatingip'
gntb.return_value = self.rcode
router_intf = constants.DEVICE_OWNER_ROUTER_INTF
self.context._port['device_owner'] = router_intf
gntb.return_value = self._get_nwa_tenant_binding({
'CreateTenant': True,
'NWA_tenant_id': 'RegionOnetenant201',
'DEV_uuid-device_id_100': 'device_id',
'DEV_uuid-device_id_100_device_owner': router_intf,
'DEV_uuid-device_id_100_61': 'net001',
'DEV_uuid-device_id_100_61_TYPE': nwa_const.NWA_DEVICE_TFW,
'DEV_uuid-device_id_100_61_ip_address': '192.168.120.1',
'DEV_uuid-device_id_100_61_mac_address': '12:34:56:78:9a:bc',
'DEV_uuid-device_id_101_61': '12'}
)
self.driver.delete_port_precommit(self.context)
self.context._port['device_owner'] = router_intf
gntb.return_value = self._get_nwa_tenant_binding({
'CreateTenant': True,
'NWA_tenant_id': 'RegionOnetenant201',
'DEV_uuid-device_id_100': 'device_id',
'DEV_uuid-device_id_100_device_owner': router_intf,
'DEV_uuid-device_id_100_61': 'net001',
'DEV_uuid-device_id_100_61_TYPE': nwa_const.NWA_DEVICE_TFW,
'DEV_uuid-device_id_100_61_ip_address': '192.168.120.1',
'DEV_uuid-device_id_100_61_mac_address': '12:34:56:78:9a:bc'}
)
self.driver.delete_port_precommit(self.context)
self.context._port['device_owner'] = router_intf
gntb.return_value = self._get_nwa_tenant_binding({
'CreateTenant': True,
'NWA_tenant_id': 'RegionOnetenant201',
'DEV_uuid-device_id_100': 'device_id',
'DEV_uuid-device_id_100_device_owner': router_intf,
'DEV_uuid-device_id_100_61': 'net001',
'DEV_uuid-device_id_100_61_ip_address': '192.168.120.1',
'DEV_uuid-device_id_100_61_mac_address': '12:34:56:78:9a:bc'}
)
self.driver.delete_port_precommit(self.context)
@patch('networking_nec.nwa.l2.db_api.set_nwa_tenant_binding')
@patch('networking_nec.nwa.l2.db_api.get_nwa_tenant_binding')
def test_delete_port_precommit_owner_router_gateway(self, gntb, sntb):
router_gw = constants.DEVICE_OWNER_ROUTER_GW
self.context._port['device_owner'] = router_gw
gntb.return_value = self._get_nwa_tenant_binding({
'CreateTenant': 1,
'CreateTenantNW': True,
'NWA_tenant_id': 'RegionOnetenant201',
'DEV_uuid-device_id_100': 'device_id',
'DEV_uuid-device_id_100_device_owner': router_gw,
'DEV_uuid-device_id_100_61_TYPE': nwa_const.NWA_DEVICE_TFW,
'DEV_uuid-device_id_100_61': 'public001',
'DEV_uuid-device_id_100_61_ip_address': '172.16.1.23',
'DEV_uuid-device_id_100_63_mac_address': '12:34:56:78:9a:bc'}
)
self.driver.delete_port_precommit(self.context)
@patch('networking_nec.nwa.l2.db_api.set_nwa_tenant_binding')
@patch('networking_nec.nwa.l2.db_api.get_nwa_tenant_binding')
def test_delete_port_precommit_owner_floatingip(self, gntb, sntb):
floatingip = constants.DEVICE_OWNER_FLOATINGIP
self.context._port['device_owner'] = floatingip
gntb.return_value = self._get_nwa_tenant_binding({
'CreateTenant': 1,
'CreateTenantNW': True,
'NWA_tenant_id': 'RegionOnetenant201',
'DEV_uuid-device_id_100': 'device_id',
'DEV_uuid-device_id_100_device_owner': floatingip,
'DEV_uuid-device_id_100_61_TYPE': nwa_const.NWA_DEVICE_TFW,
'DEV_uuid-device_id_100_61': 'public001',
'DEV_uuid-device_id_100_61_ip_address': '172.16.1.23',
'DEV_uuid-device_id_100_63_mac_address': '12:34:56:78:9a:bc'}
)
self.driver.delete_port_precommit(self.context)
@patch('networking_nec.nwa.l2.db_api.set_nwa_tenant_binding')
@patch('networking_nec.nwa.l2.db_api.get_nwa_tenant_binding')
def test_delete_port_precommit_owner_none(self, gntb, sntb):
self.context._port['device_owner'] = ''
self.context._port['device_id'] = ''
gntb.return_value = self._get_nwa_tenant_binding({
'CreateTenant': 1,
'CreateTenantNW': True,
'NWA_tenant_id': 'RegionOnetenant201',
'DEV_uuid-device_id_100': 'device_id',
'DEV_uuid-device_id_100_device_owner': '',
'DEV_uuid-device_id_100_61_TYPE': nwa_const.NWA_DEVICE_TFW,
'DEV_uuid-device_id_100_61': 'public001',
'DEV_uuid-device_id_100_61_ip_address': '172.16.1.23',
'DEV_uuid-device_id_100_63_mac_address': '12:34:56:78:9a:bc'}
)
self.driver.delete_port_precommit(self.context)
@patch('networking_nec.nwa.l2.utils.portcontext_to_nwa_info')
@patch('networking_nec.nwa.l2.db_api.get_nwa_tenant_binding')
def test_delete_port_precommit_owner_dhcp(self, gntb, ptni):
self.context._port['device_owner'] = constants.DEVICE_OWNER_DHCP
self.context._port[
'device_id'] = constants.DEVICE_ID_RESERVED_DHCP_PORT
self.context._port['binding:host_id'] = 'myhost'
# _revert_dhcp_agent_device
gntb.return_value = self._get_nwa_tenant_binding({
'CreateTenant': 1,
'CreateTenantNW': True,
'NWA_tenant_id': 'RegionOnetenant201',
'DEV_uuid-device_id_100': 'device_id',
'DEV_uuid-device_id_100_device_owner': constants.DEVICE_OWNER_DHCP,
'DEV_uuid-device_id_100_61_TYPE': nwa_const.NWA_DEVICE_TFW,
'DEV_uuid-device_id_100_61': 'public001',
'DEV_uuid-device_id_100_61_ip_address': '172.16.1.23',
'DEV_uuid-device_id_100_63_mac_address': '12:34:56:78:9a:bc'}
)
ptni.return_value = {
'tenant_id': 'Tenant1',
'nwa_tenant_id': 'RegionOnetenant201',
'device': {
'owner': constants.DEVICE_OWNER_DHCP,
'id': 'device_id'},
'resource_group_name': 'Common/App/Pod3',
'physical_network': 'Common/App/Pod3'}
self.driver.delete_port_precommit(self.context)
@patch('networking_nec.nwa.l2.db_api.set_nwa_tenant_binding')
@patch('networking_nec.nwa.l2.db_api.get_nwa_tenant_binding')
def test_delete_port_precommit_owner_compute_az(self, gntb, sntb):
# 1 net 1 port(compute:AZ1)
self.context.current = self.context._port
self.context.current['device_owner'] = 'compute:AZ1'
self.context._port['device_owner'] = 'compute:AZ1'
gntb.return_value = self._get_nwa_tenant_binding({
'CreateTenant': True,
'NWA_tenant_id': 'RegionOnetenant201',
'DEV_uuid-device_id_100': 'device_id',
'DEV_uuid-device_id_100_device_owner': 'compute:AZ1',
'DEV_uuid-device_id_100_61_TYPE': nwa_const.NWA_DEVICE_GDV,
'DEV_uuid-device_id_100_61': 'net001',
'DEV_uuid-device_id_100_61_ip_address': '192.168.120.1',
'DEV_uuid-device_id_100_61_mac_address': '12:34:56:78:9a:bc'}
)
self.driver.delete_port_precommit(self.context)
# 1 net 2 port(compute:AZ1)
self.context._port['device_owner'] = 'compute:AZ1'
gntb.return_value = self._get_nwa_tenant_binding({
'CreateTenant': True,
'NWA_tenant_id': 'RegionOnetenant201',
'DEV_uuid-device_id_100': 'device_id',
'DEV_uuid-device_id_100_device_owner': 'compute:AZ1',
'DEV_uuid-device_id_100_61_TYPE': nwa_const.NWA_DEVICE_GDV,
'DEV_uuid-device_id_100_61': 'net001',
'DEV_uuid-device_id_100_61_ip_address': '192.168.120.1',
'DEV_uuid-device_id_100_61_mac_address': '12:34:56:78:9a:bc',
'DEV_uuid-device_id_200': 'device_id',
'DEV_uuid-device_id_200_device_owner': 'compute:AZ1',
'DEV_uuid-device_id_200_61_TYPE': nwa_const.NWA_DEVICE_GDV,
'DEV_uuid-device_id_200_61': 'net001',
'DEV_uuid-device_id_200_61_ip_address': '192.168.120.2',
'DEV_uuid-device_id_200_61_mac_address': 'fe:34:56:78:9a:bc'}
)
self.driver.delete_port_precommit(self.context)
@patch('networking_nec.nwa.l2.db_api.get_nwa_tenant_binding')
def test_try_to_bind_segment_for_agent(self, gntb):
# in segment
self.context._port['device_owner'] = 'network:dhcp'
self.context._port['fixed_ips'] = []
self.context.current = self.context._port
rb = self.driver.try_to_bind_segment_for_agent(
self.context, self.network_segments[1], self.host_agents[0])
self.assertEqual(rb, True)
# in physical_network
self.context.network.current[
'provider:physical_network'] = 'Common/App/Pod3'
self.context.network.current['provider:segmentation_id'] = 199
self.context.current = self.context._port
rb = self.driver.try_to_bind_segment_for_agent(
self.context, self.network_segments[1], self.host_agents[0])
self.assertEqual(rb, True)
# not in segment
rb = self.driver.try_to_bind_segment_for_agent(
self.context, self.network_segments[1], self.host_agents[1])
self.assertEqual(rb, False)
# device_owner is router_gw
self.context._port['device_owner'] = constants.DEVICE_OWNER_ROUTER_GW
rb = self.driver.try_to_bind_segment_for_agent(
self.context, self.network_segments[1], self.host_agents[0])
self.assertEqual(rb, False)
@patch('networking_nec.nwa.l2.db_api.get_nwa_tenant_binding')
def test_try_to_bind_segment_for_agent_in_segments(self, gntb):
# in segment
self.context._port['device_owner'] = 'network:dhcp'
self.context.network.current['segments'] = self.network_segments
self.context.current = self.context._port
rb = self.driver.try_to_bind_segment_for_agent(
self.context, self.network_segments[2], self.host_agents[0])
self.assertEqual(rb, True)
def test__find_nwa_physical_network(self):
pod3_eth1 = self.host_agents[0]
physnet = self.driver._find_nwa_physical_network(self.context,
pod3_eth1)
self.assertEqual(physnet, 'Common/App/Pod3')
def test__find_nwa_physical_network_no_match(self):
physnet = self.driver._find_nwa_physical_network(self.context,
self.host_agents[1])
self.assertIsNone(physnet)
def test__find_nwa_physical_network_agent_none(self):
physnet = self.driver._find_nwa_physical_network(self.context)
self.assertEqual(physnet, 'Common/App/Pod3')
@patch('neutron.plugins.ml2.db.get_dynamic_segment')
def test__bind_segment_to_vif_type_dummy_segment_none(self, gds):
gds.return_value = None
physnet = 'Common/App/Pod3'
rd = self.driver._bind_segment_to_vif_type(self.context, physnet)
self.assertIsNone(rd)
@patch('neutron.plugins.ml2.db.get_dynamic_segment')
@patch('neutron.plugins.ml2.db.add_network_segment')
def test__bind_segment_to_vif_type_dummy_segment_exists(self, ans, gds):
gds.return_value = self.network_segments[1]
physnet = 'Common/KVM/Pod1-2'
self.driver._bind_segment_to_vif_type(self.context, physnet)
self.assertEqual(0, ans.call_count)
@patch('networking_nec.nwa.l2.db_api.get_nwa_tenant_binding')
def _test__bind_port_nwa(self, gntb):
# if prov_net.PHYSICAL_NETWORK in self.context.network.current:
self.context._port['device_owner'] = constants.DEVICE_OWNER_ROUTER_INTF
self.context.network.current[
'provider:physical_network'] = 'Common/App/Pod3'
self.context.network.current['provider:segmentation_id'] = 199
gntb.return_value = None
self.driver._bind_port_nwa(self.context)
# else:
self.context._port['device_owner'] = constants.DEVICE_OWNER_ROUTER_INTF
self.context.network.current['segments'] = self.network_segments
gntb.return_value = None
self.driver._bind_port_nwa(self.context)
self.context._port['device_owner'] = constants.DEVICE_OWNER_ROUTER_INTF
gntb.return_value = None
self.driver._bind_port_nwa(self.context)
self.context._port['device_owner'] = constants.DEVICE_OWNER_ROUTER_GW
gntb.return_value = None
self.driver._bind_port_nwa(self.context)
self.context._port['device_owner'] = constants.DEVICE_OWNER_ROUTER_INTF
gntb.return_value = self._get_nwa_tenant_binding({
'CreateTenant': True,
'NWA_tenant_id': 'RegionOnetenant201',
'DEV_uuid-device_id_100': 'device_id',
'DEV_uuid-device_id_100_61': constants.DEVICE_OWNER_ROUTER_INTF,
'DEV_uuid-device_id_100_61_ip_address': '192.168.120.1',
'DEV_uuid-device_id_100_61_mac_address': '12:34:56:78:9a:bc'}
)
self.driver._bind_port_nwa(self.context)
self.context._port['device_owner'] = constants.DEVICE_OWNER_ROUTER_INTF
gntb.return_value = self.rcode
self.driver._bind_port_nwa(self.context)
self.context._port['device_owner'] = 'network:floatingip'
gntb.return_value = self.rcode
self.driver._bind_port_nwa(self.context)
self.context._port['device_owner'] = 'ironic:isolation'
gntb.return_value = self.rcode
self.driver._bind_port_nwa(self.context)
self.context._port['device_owner'] = 'ironic:isolation'
gntb.return_value = self.rcode
self.driver._bind_port_nwa(self.context)
self.context._port['device_owner'] = 'compute:BM_001'
gntb.return_value = self.rcode
self.driver._bind_port_nwa(self.context)
self.context._port['device_owner'] = 'compute:BM_001'
gntb.return_value = self.rcode
self.driver._bind_port_nwa(self.context)
self.context._port['device_owner'] = 'compute:az_001'
gntb.return_value = self.rcode
self.driver._bind_port_nwa(self.context)
self.context._port['device_owner'] = 'compute:az_001'
gntb.return_value = self.rcode
self.driver._bind_port_nwa(self.context)
self.context._port['device_owner'] = 'compute:AZ1'
gntb.return_value = self.rcode
self.context.network.current.pop(prov_net.PHYSICAL_NETWORK)
self.driver._bind_port_nwa(self.context)
# Exception
self.context._port['device_owner'] = constants.DEVICE_OWNER_ROUTER_INTF
gntb.side_effect = Exception
self.driver._bind_port_nwa(self.context)
@patch('neutron.plugins.ml2.db.get_dynamic_segment')
@patch('neutron.plugins.ml2.db.delete_network_segment')
def test__l2_delete_segment(self, dns, gds):
gds.return_value = None
self.driver._l2_delete_segment(self.context, MagicMock())
self.assertEqual(0, dns.call_count)
dns.mock_reset()
gds.return_value = {'id': 'ID-100'}
self.driver._l2_delete_segment(self.context, MagicMock())
self.assertEqual(1, dns.call_count)
| 44.981679 | 79 | 0.638394 | 3,658 | 29,463 | 4.744396 | 0.082832 | 0.07796 | 0.066379 | 0.081245 | 0.808355 | 0.786402 | 0.744627 | 0.73391 | 0.711668 | 0.700951 | 0 | 0.051345 | 0.250382 | 29,463 | 654 | 80 | 45.050459 | 0.734447 | 0.028748 | 0 | 0.598579 | 0 | 0 | 0.308524 | 0.166066 | 0 | 0 | 0 | 0 | 0.023091 | 1 | 0.046181 | false | 0 | 0.019538 | 0 | 0.072824 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
44ee5155f3f406beb6e3ab351754127e8f4dbbe6 | 93 | py | Python | tests/conftest.py | pberba/pulsedive-py | e7b4863bd875843c97f769dac06899652ec9d910 | [
"MIT"
] | 6 | 2018-09-21T22:29:24.000Z | 2020-11-10T02:42:43.000Z | tests/conftest.py | pberba/pulsedive-py | e7b4863bd875843c97f769dac06899652ec9d910 | [
"MIT"
] | 1 | 2019-05-24T01:27:09.000Z | 2019-05-24T01:27:37.000Z | tests/conftest.py | pberba/pulsedive-py | e7b4863bd875843c97f769dac06899652ec9d910 | [
"MIT"
] | 3 | 2019-05-21T11:51:46.000Z | 2020-03-11T13:41:01.000Z | import pytest
import pulsedive
@pytest.fixture
def pud():
return pulsedive.Pulsedive()
| 11.625 | 32 | 0.752688 | 11 | 93 | 6.363636 | 0.636364 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.16129 | 93 | 7 | 33 | 13.285714 | 0.897436 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | true | 0 | 0.4 | 0.2 | 0.8 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 6 |
781a03a18e2324684269b898b2ca70056cd7d27a | 41 | py | Python | sigtestv/utils/__init__.py | castorini/sigtestv | 1eea53191196ec492b73abc1e3895ffb02daab06 | [
"MIT"
] | 11 | 2020-04-29T06:53:08.000Z | 2021-06-29T07:35:52.000Z | meanmax/utils/__init__.py | castorini/meanmax | 0ea124105eda04a00677c077b591a94c2e2b2936 | [
"MIT"
] | 2 | 2021-07-29T07:52:04.000Z | 2021-07-29T07:56:41.000Z | meanmax/utils/__init__.py | castorini/meanmax | 0ea124105eda04a00677c077b591a94c2e2b2936 | [
"MIT"
] | null | null | null | from .object import *
from .list import * | 20.5 | 21 | 0.731707 | 6 | 41 | 5 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.170732 | 41 | 2 | 22 | 20.5 | 0.882353 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
7867fb82a3141a815c10aaa038ef67f48befe330 | 13,141 | py | Python | VectorMath.py | hsspratt/Nott-Hawkeye1 | 178f4f0fef62e8699f6057d9d50adfd61a851047 | [
"MIT"
] | null | null | null | VectorMath.py | hsspratt/Nott-Hawkeye1 | 178f4f0fef62e8699f6057d9d50adfd61a851047 | [
"MIT"
] | 1 | 2021-11-11T22:15:36.000Z | 2021-11-11T22:15:36.000Z | VectorMath.py | hsspratt/Nott-Hawkeye1 | 178f4f0fef62e8699f6057d9d50adfd61a851047 | [
"MIT"
] | null | null | null | # %% Imports
%matplotlib widget
from numpy.lib.function_base import angle
from skimage import color
from sympy import Matrix, init_printing
import sympy as sym
from sympy.core.numbers import I
import sympy.printing as printing
from sympy import Integral, Matrix, pi, pprint
import numpy as np
import matplotlib.pyplot as plt
from mpl_toolkits import mplot3d
import VectorFunctions as vf
import importlib
importlib.reload(vf)
import functions as f
# %%
angles = f.import_bz2('A_angles_test')
video_camera_1 = f.import_bz2('video_camera_1')
video_camera_2 = f.import_bz2('video_camera_2')
video = f.video_read('IMG_3005.mp4')
# %% All calculations relating to the vectors
# initialise vectors for test lines
r1 = np.array([2,6,-9])
r2 = np.array([-1,-2,3])
v1 = np.array([3,4,-4])
v2 = np.array([2,-6,1])
# shortest distance between the skew lines r1 + t*v1 and r2 + s*v2,
# and the closest points c1, c2 on each line
print(vf.FindShortestDistance(r1,r2,v1,v2))
print(vf.LocShortestDistance(r1,r2,v1,v2))
c1, c2, dist = vf.LocShortestDistance(r1,r2,v1,v2)
t = np.linspace(0,2000,2000)
camera_1_theta = np.radians(45)
camera_1_phi = np.radians(10)
camera_1_r0 = np.array([0,0,0])
camera_1_r0, camera_1_vector1 = vf.Polar2Vector(camera_1_r0, camera_1_theta, axis="xz", camera="1")
camera_1_r0, camera_1_vector2 = vf.Polar2Vector(camera_1_r0, camera_1_phi, axis="yz", camera="1")
camera_1_r1 = t*camera_1_vector1.T
camera_1_r2 = t*camera_1_vector2.T
x1 = -100
z1 = 100
camera_2_theta = np.radians(-30)
camera_2_phi = np.radians(10)
camera_2_r0 = np.array([x1, 0, z1]) # np.array([-mpmath.tan(camera_2_theta)*x1,0,z1])
camera_2_r0, camera_2_vector1 = vf.Polar2Vector(camera_2_r0, camera_2_theta, axis="xz", camera="2")
camera_2_r0, camera_2_vector2 = vf.Polar2Vector(camera_2_r0, camera_2_phi, axis="yz", camera="2")
camera_2_r1 = t*camera_2_vector1.T
camera_2_r2 = t*camera_2_vector2.T
# remember y and z axis are switched
plt.figure('3D plot 2 Cameras Both Angles')
ax = plt.axes(projection='3d')
ax.plot3D(xs=camera_1_r1[0]+camera_1_r0[0],ys=camera_1_r1[2]+camera_1_r0[2],zs=np.zeros(camera_1_r1.shape[-1]))
ax.plot3D(xs=camera_2_r1[0]+camera_2_r0[0],ys=camera_2_r1[2]+camera_2_r0[2],zs=np.zeros(camera_1_r1.shape[-1]))
ax.plot3D(xs=np.zeros(camera_1_r2.shape[-1]),ys=camera_1_r2[2]+camera_1_r0[2],zs=camera_1_r2[1]+camera_1_r0[1])
ax.plot3D(xs=np.zeros(camera_2_r2.shape[-1]),ys=camera_2_r2[2]+camera_2_r0[2],zs=camera_2_r2[1]+camera_1_r0[1])
ax.set_xlabel('x')
ax.set_ylabel('z')
ax.set_zlabel('y')
ax.set_xlim([-100,100])
ax.set_ylim([0,200])
ax.set_zlim([-100,100])
ax.elev = 90
plt.figure('3D plot Camera 1 Both Angles')
ax = plt.axes(projection='3d')
ax.set_xlabel('x')
ax.set_ylabel('z')
ax.set_zlabel('y')
ax.plot3D(xs=camera_1_r1[0]+camera_1_r0[0],ys=camera_1_r1[2]+camera_1_r0[2],zs=np.zeros(camera_1_r1.shape[-1]))
ax.plot3D(xs=np.zeros(camera_1_r2.shape[-1]),ys=camera_1_r2[2]+camera_1_r0[2],zs=camera_1_r2[1]+camera_1_r0[1])
camera_1_cart = vf.sph2cart(1, camera_1_theta, camera_1_phi)
camera_2_cart = vf.sph2cart(1, camera_2_theta, camera_2_phi)
camera_1_vector = camera_1_cart - camera_1_r0
camera_2_vector = camera_2_cart - camera_2_r0
camera_1_t_vector = t*np.array([camera_1_vector]).T
camera_2_t_vector = t*np.array([camera_2_vector]).T
plt.figure('3D plot Camera 1 & 2 Cart')
ax = plt.axes(projection='3d')
ax.set_xlabel('x')
ax.set_ylabel('z')
ax.set_zlabel('y')
# ax.set_xlim([-100,100])
# ax.set_ylim([-100,100])
# ax.set_zlim([0,200])
# ax.elev = 90
ax.plot3D(xs=camera_1_t_vector[0]+camera_1_r0[0],ys=camera_1_t_vector[2]+camera_1_r0[2],zs=camera_1_t_vector[1]+camera_1_r0[1])
ax.plot3D(xs=camera_2_t_vector[0]+camera_2_r0[0],ys=camera_2_t_vector[2]+camera_2_r0[2],zs=camera_2_t_vector[1]+camera_2_r0[1])
d = vf.FindShortestDistance(camera_1_r0, camera_2_r0, camera_1_vector, camera_2_vector)
position_short_dist = vf.LocShortestDistance(camera_1_r0, camera_2_r0, camera_1_vector, camera_2_vector)
plt.show()
# %%
angles_A = np.array(f.import_bz2('A_angles_test'))
angles_B = np.array(f.import_bz2('A_angles_test'))
angles_AB = np.vstack([angles_A,angles_B])
camera1_r0 = np.array([0,0,0])
camera2_r0 = np.array([0,0,0])
cameras_r0 = [camera1_r0, camera2_r0]
# allocate once outside the loop so earlier frames are not overwritten
position_3D = np.zeros([3, np.shape(angles_AB)[-1]])
for frames in range(np.shape(angles_AB)[-1]):
    print(" ", end=f"\r frame: {frames+1} ", flush=True)
    if np.any(np.isnan(angles_AB[:, frames])):
        continue
    position_3D[:, frames] = vf.Find3DPosition(cameras_r0, angles_AB[:, frames], args="")
# %%
import matplotlib.pyplot as plt
import numpy as np
import VectorFunctions as vf
import importlib as imp
import functions as f
imp.reload(vf)
angles_BA = np.array(f.import_bz2('test_photos_angles'))
anglesB_phi = angles_BA[0]
anglesB_theta = angles_BA[1]
anglesA_phi = angles_BA[2]
anglesA_theta = angles_BA[3]
delta = np.array([np.radians(18), np.radians(38), 0])
angles_AB = np.vstack([anglesA_phi,anglesA_theta,anglesB_phi,anglesB_theta])
#original_angles = np.full([4, np.shape(angles_AB)[-1]], np.pi/2)
#angles_AB = original_angles-angles_AB
camera1_r0 = np.array([0,0,0])
camera2_r0 = np.array([-29,1,42])
cameras_r0 = [camera1_r0, camera2_r0]
position_3D = np.zeros([3, np.shape(angles_AB)[-1]])  # allocate once, before the loop
c1 = np.zeros([3, np.shape(angles_AB)[-1]])
c2 = np.zeros([3, np.shape(angles_AB)[-1]])
for frames in range(np.shape(angles_AB)[-1]):
    print(" ", end=f"\r frame: {frames+1} ", flush=True)
    if np.any(np.isnan(angles_AB[:, frames])):
        continue
    camera1_vector, camera2_vector, c1[:, frames], c2[:, frames], position_3D[:, frames] = vf.Find3DPosition(
        cameras_r0, angles_AB[:, frames], delta, args=""
    )
tlim = np.max(cameras_r0)*2
t = np.linspace(0,tlim,5000)
camera1_line = t*np.array([camera1_vector]).T[0]
camera2_line = t*np.array([camera2_vector]).T[0]
plt.figure('3D lines')
ax = plt.axes(projection='3d')
ax.set_xlabel('x')
ax.set_ylabel('z')
ax.set_zlabel('y')
ax.plot3D(camera1_line[0], camera1_line[2], camera1_line[1])
ax.plot3D(camera2_line[0], camera2_line[2], camera2_line[1])
print("Finished finding 3D position")
plt.figure('3D position of ball')
ax = plt.axes(projection='3d')
ax.set_xlabel('x')
ax.set_ylabel('z')
ax.set_zlabel('y')
ax.scatter(xs=position_3D[0,:],ys=position_3D[2,:],zs=position_3D[1,:],marker='o')
ax.scatter(xs=c1[0,:],ys=c1[2,:],zs=c1[1,:],marker='o')
ax.scatter(xs=c2[0,:],ys=c2[2,:],zs=c2[1,:],marker='o')
ax.set_xlim([-30,30])
ax.set_ylim([-0,50])
ax.set_zlim([-30,30])
ax.elev = 90
# %% Test Final run
import matplotlib.pyplot as plt
import numpy as np
import VectorFunctions as vf
import importlib as imp
import functions as f
imp.reload(vf)
angles_BA = np.array(f.import_bz2('test_photos_angles'))
camera1_r0 = np.array([0,0,0])
camera2_r0 = np.array([-29,1,42])
I = 10
anglesB_phi = angles_BA[0] / I
anglesB_theta = angles_BA[1] / 1
anglesA_phi = angles_BA[2]/I
anglesA_theta = angles_BA[3]/1
# Debug override (disabled): forcing all angles to 0 also collapses angles_AB
# below to a single column, which breaks the frames=1 indexing.
# anglesB_phi = 0
# anglesB_theta = 0
# anglesA_phi = 0
# anglesA_theta = 0
angles_AB = np.vstack([anglesA_phi,anglesA_theta,anglesB_phi,anglesB_theta])
frames = 1
cameras_r0 = [camera1_r0, camera2_r0]
cameras_angles = angles_AB[:,frames]
delta = np.array([np.radians(18), np.radians(38), 0])
camera1_r0 = cameras_r0[0]
camera2_r0 = cameras_r0[1]
dt_A, dt_B, dp_B = delta
camera1_phi = (np.pi/2) - cameras_angles[0]
camera1_theta = (np.pi/2 + dt_A) + cameras_angles[1]
camera2_phi = cameras_angles[2] - dp_B # -np.pi
camera2_theta = (np.pi/2 + dt_B) - cameras_angles[3]
camera1_cart = np.array(vf.new_sph2cart(1, camera1_theta, camera1_phi, camera1_r0))
camera2_cart = np.array(vf.new_sph2cart(1, camera2_theta, camera2_phi, camera2_r0))
#camera1_vector = vf.normalised(camera1_cart - camera1_r0)
#camera2_vector = vf.normalised(camera2_cart - camera2_r0)
camera1_vector, _ = vf.EqOfLine(camera1_r0, camera1_cart)
camera2_vector, _ = vf.EqOfLine(camera2_r0, camera2_cart)
camera1_vector = camera1_vector[0]
camera2_vector = camera2_vector[0]
tlim = np.max(cameras_r0)*2
t = np.linspace(0,tlim,50000)
camera1_line = t*np.array([camera1_vector]).T + np.array([camera1_r0]).T
camera2_line = t*np.array([camera2_vector]).T + np.array([camera2_r0]).T
n = np.cross(camera1_vector,camera2_vector)
n1 = np.cross(camera1_vector, n)
n2 = np.cross(camera2_vector, n)
c1 = camera1_r0 + camera1_vector*(vf.division(np.dot((camera2_r0-camera1_r0),n2),(np.dot(camera1_vector,n2))))
c2 = camera2_r0 + camera2_vector*(vf.division(np.dot((camera1_r0-camera2_r0),n1),(np.dot(camera2_vector,n1))))
dist = np.linalg.norm(c1-c2,axis=0)
cart_position = (c1 + c2) / 2.0
plt.figure('3D position of ball')
ax = plt.axes(projection='3d')
ax.set_xlabel('x')
ax.set_ylabel('z')
ax.set_zlabel('y')
# for i in range(len(position_3D[-1])):
# ax.scatter(xs=position_3D[0,i],ys=position_3D[2,i],zs=position_3D[1,i],marker='o')
# ax.scatter(camera1_line[0], camera1_line[2], camera1_line[1],marker='o')
# ax.scatter(camera2_line[0], camera2_line[2], camera2_line[1],marker='o')
ax.plot3D(camera1_line[0,[0,-1]], camera1_line[2,[0,-1]], camera1_line[1,[0,-1]])
ax.plot3D(camera2_line[0,[0,-1]], camera2_line[2,[0,-1]], camera2_line[1,[0,-1]])
ax.plot3D(camera1_r0[0], camera1_r0[2], camera1_r0[1],marker='x', color='k')
ax.plot3D(camera2_r0[0], camera2_r0[2], camera2_r0[1],marker='x', color='k')
ax.plot3D([-13, 13], [56, 56], [-27, -27], color='r')
ax.plot3D([13, 13], [56, 18], [-27, -27], color='r')
ax.plot3D([13, -13], [18, 18], [-27, -27], color='r')
ax.plot3D([-13, -13], [18, 56], [-27, -27], color='r')
ax.set_xlim([-30,30])
ax.set_ylim([0,60])
ax.set_zlim([-30,30])
ax.elev = 100
ax.azim = 300# 150
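The closest-approach construction used above (the `n`, `n1`, `n2` cross products and the `c1`/`c2` points) can be sanity-checked in isolation. This sketch uses two illustrative skew lines `p1 + t*d1` and `p2 + s*d2` rather than the camera data, and plain NumPy instead of the `VectorFunctions` helpers:

```python
import numpy as np

# Two skew lines: the x-axis, and a line through (0, 1, 0) along z.
p1, d1 = np.array([0.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0])
p2, d2 = np.array([0.0, 1.0, 0.0]), np.array([0.0, 0.0, 1.0])

# Same construction as in the cell above: n spans the common perpendicular,
# n1/n2 define the planes used to solve for the closest point on each line.
n = np.cross(d1, d2)
n1 = np.cross(d1, n)
n2 = np.cross(d2, n)
c1 = p1 + d1 * np.dot(p2 - p1, n2) / np.dot(d1, n2)
c2 = p2 + d2 * np.dot(p1 - p2, n1) / np.dot(d2, n1)
midpoint = (c1 + c2) / 2.0

# These lines are separated by exactly 1 along y, so the closest points are
# the two line origins and the midpoint sits halfway between them.
assert np.isclose(np.linalg.norm(c1 - c2), 1.0)
assert np.allclose(midpoint, [0.0, 0.5, 0.0])
```

The midpoint of the shortest segment is the usual triangulated position estimate when the two camera rays do not intersect exactly.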
# %% Second Final run
import matplotlib.pyplot as plt
import numpy as np
import VectorFunctions as vf
import importlib as imp
import functions as f
imp.reload(vf)
angles_BA = np.array(f.import_bz2('test_photos_angles'))
camera1_r0 = np.array([0,0,0])
camera2_r0 = np.array([-29,1,42])
I = 10
anglesB_phi = angles_BA[0] / I
anglesB_theta = angles_BA[1] / 1
anglesA_phi = angles_BA[2]/I
anglesA_theta = angles_BA[3]/1
angles_AB = np.vstack([anglesA_phi,anglesA_theta,anglesB_phi,anglesB_theta])
frames = 0
cameras_r0 = [camera1_r0, camera2_r0]
cameras_angles = angles_AB[:,frames]
delta = np.array([np.radians(18), np.radians(38), 0])
camera1_r0 = cameras_r0[0]
camera2_r0 = cameras_r0[1]
dt_A, dt_B, dp_B = delta
camera1_phi = (np.pi/2) - cameras_angles[0]
camera1_theta = (np.pi/2 + dt_A) + cameras_angles[1]
camera2_phi = -np.pi + cameras_angles[2] - dp_B
camera2_theta = (np.pi/2 + dt_B) - cameras_angles[3]
x0, y0, z0 = np.array([0,0,0]) # cameras_r0
r = 1
x1 = r * np.cos(anglesA_theta[frames]) * np.sin(anglesA_phi[frames]) + x0
z1 = r * np.sin(anglesA_theta[frames]) * np.sin(anglesA_phi[frames]) + z0
y1 = r * np.cos(anglesA_phi[frames]) + y0  # polar term uses phi so (x1, y1, z1) - r0 is a unit vector
x0, y0, z0 = np.array([-29, 1, 42])
z2 = r * np.cos(anglesB_theta[frames]) * np.sin(anglesB_phi[frames]) + z0  # offset by z0, not x0
x2 = r * np.sin(anglesB_theta[frames]) * np.sin(anglesB_phi[frames]) + x0  # offset by x0, not z0
y2 = r * np.cos(anglesB_phi[frames]) + y0
camera1_cart = np.hstack((x1,y1,z1))
camera2_cart = np.hstack((x2,y2,z2))
#camera1_cart = np.array(vf.new_sph2cart(1, camera1_theta, camera1_phi, camera1_r0))
#camera2_cart = np.array(vf.new_sph2cart(1, camera2_theta, camera2_phi, camera2_r0))
#camera1_vector = vf.normalised(camera1_cart - camera1_r0)
#camera2_vector = vf.normalised(camera2_cart - camera2_r0)
camera1_vector, _ = vf.EqOfLine(camera1_r0, camera1_cart)
camera2_vector, _ = vf.EqOfLine(camera2_r0, camera2_cart)
#camera1_vector = camera1_vector[0]
#camera2_vector = camera2_vector[0]
tlim = np.max(cameras_r0)*2
t = np.linspace(0,tlim,50000)
camera1_line = t*np.array([camera1_vector]).T + np.array([camera1_r0]).T
camera2_line = t*np.array([camera2_vector]).T + np.array([camera2_r0]).T
n = np.cross(camera1_vector,camera2_vector)
n1 = np.cross(camera1_vector, n)
n2 = np.cross(camera2_vector, n)
c1 = camera1_r0 + camera1_vector*(vf.division(np.dot((camera2_r0-camera1_r0),n2),(np.dot(camera1_vector,n2))))
c2 = camera2_r0 + camera2_vector*(vf.division(np.dot((camera1_r0-camera2_r0),n1),(np.dot(camera2_vector,n1))))
dist = np.linalg.norm(c1-c2,axis=0)
cart_position = (c1 + c2) / 2.0
plt.figure('3D position of ball')
ax = plt.axes(projection='3d')
ax.set_xlabel('x')
ax.set_ylabel('z')
ax.set_zlabel('y')
# for i in range(len(position_3D[-1])):
# ax.scatter(xs=position_3D[0,i],ys=position_3D[2,i],zs=position_3D[1,i],marker='o')
# ax.scatter(camera1_line[0], camera1_line[2], camera1_line[1],marker='o')
# ax.scatter(camera2_line[0], camera2_line[2], camera2_line[1],marker='o')
ax.plot3D(camera1_line[0,[0,-1]], camera1_line[2,[0,-1]], camera1_line[1,[0,-1]])
ax.plot3D(camera2_line[0,[0,-1]], camera2_line[2,[0,-1]], camera2_line[1,[0,-1]])
ax.plot3D(camera1_r0[0], camera1_r0[2], camera1_r0[1],marker='x', color='k')
ax.plot3D(camera2_r0[0], camera2_r0[2], camera2_r0[1],marker='x', color='k')
ax.plot3D([-13, 13], [56, 56], [-27, -27], color='r')
ax.plot3D([13, 13], [56, 18], [-27, -27], color='r')
ax.plot3D([13, -13], [18, 18], [-27, -27], color='r')
ax.plot3D([-13, -13], [18, 56], [-27, -27], color='r')
ax.set_xlim([-30,30])
ax.set_ylim([0,60])
ax.set_zlim([-30,30])
ax.elev = 90
ax.azim = 300# 150
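A quick self-contained check of the y-up spherical convention (x = cosθ·sinφ, y = cosφ, z = sinθ·sinφ): for r = 1 it always yields a unit direction vector, which is a useful invariant to assert when debugging hand-rolled conversions like the ones above. The angle values are arbitrary illustrative choices:

```python
import numpy as np

theta, phi = np.radians(30.0), np.radians(70.0)  # arbitrary test angles
v = np.array([
    np.cos(theta) * np.sin(phi),  # x
    np.cos(phi),                  # y (polar axis)
    np.sin(theta) * np.sin(phi),  # z
])

# For any theta/phi: x^2 + z^2 = sin^2(phi), and adding y^2 = cos^2(phi) gives 1.
assert np.isclose(np.linalg.norm(v), 1.0)
```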
# %%
78c2a6f3e9c3ca8a7fc091b6705c024e49e4ef67 | 49 | py | Python | bots/map_graph/__init__.py | xyla-io/raspador | 4e77234239d44a83faf5c1d3a6d022a9e3861f25 | ["MIT"]
from .map_graph_scraper import MapGraphBot as Bot
1520e4ba9aa9d76102f1fb69fe2f2849bf142c54 | 115 | py | Python | titan/api_pkg/graphqlapi/resources.py | mnieber/gen | 65f8aa4fb671c4f90d5cbcb1a0e10290647a31d9 | ["MIT"]
from dataclasses import dataclass
from moonleap import Resource
@dataclass
class GraphqlApi(Resource):
    pass
1547ecec60d4cb68a3cac8fbe6f7765a16b724dd | 113 | py | Python | data_structures/hash_table/conftest.py | jeremyCtown/data-structures-and-algorithms | d4ba8741f858fb5298f8ce560240373fb7742e20 | ["MIT"]
from hash_table import HashTable as hasher
import pytest
@pytest.fixture
def test_hasher():
    return hasher()
154a8022b63ae450abef4e85692a9b19a7a60fb8 | 15 | py | Python | T00-00/slice0.3.py | pedromigueladao/SSof-Project1920 | 91e768312f1a6560db45fd4372010cca30bae65e | ["MIT"] | 2 | 2019-11-20T19:26:07.000Z | 2019-11-22T00:42:23.000Z
a=0
q=a
z3(q)
1565e71dddf45cd99b69005365f95a6e3e3822bf | 17236 | py | Python | privx_api/authorizer.py | hokenssh/privx-sdk-for-python | 24627d25c0343f350c9b2396677344b771f8aec6 | ["Apache-2.0"] | 4 | 2020-06-15T17:14:18.000Z | 2021-12-20T12:12:56.000Z
from http import HTTPStatus
from typing import Optional
from privx_api.base import BasePrivXAPI
from privx_api.enums import UrlEnum
from privx_api.response import PrivXAPIResponse, PrivXStreamResponse
from privx_api.utils import get_value
class AuthorizerAPI(BasePrivXAPI):
    def get_authorizer_service_status(self) -> PrivXAPIResponse:
        """
        Get microservice status.
        Returns:
            PrivXAPIResponse
        """
        response_status, data = self._http_get(UrlEnum.AUTHORIZER.STATUS)
        return PrivXAPIResponse(response_status, HTTPStatus.OK, data)

    def get_authorizer_cert(
        self, access_group_id: Optional[str] = None
    ) -> PrivXAPIResponse:
        """
        Gets authorizer's root certificate.
        Returns:
            PrivXAPIResponse
        """
        response_status, data = self._http_get(
            UrlEnum.AUTHORIZER.AUTHORIZER_CERT,
            query_params={"access_group_id": access_group_id}
            if access_group_id
            else None,
        )
        return PrivXAPIResponse(response_status, HTTPStatus.OK, data)

    def download_authorizer_cert(self, cert_id: str) -> PrivXStreamResponse:
        """
        Gets authorizer's root certificate.
        Returns:
            PrivXStreamResponse
        """
        response = self._http_stream(
            UrlEnum.AUTHORIZER.AUTHORIZER_CERT_ID, path_params={"id": cert_id}
        )
        return PrivXStreamResponse(response, HTTPStatus.OK)

    def download_cert_revocation_list(self, cert_id: str) -> PrivXStreamResponse:
        """
        Gets authorizer CA's certificate revocation list.
        Returns:
            PrivXStreamResponse
        """
        response = self._http_stream(
            UrlEnum.AUTHORIZER.CERT_REVOCATION_LIST, path_params={"id": cert_id}
        )
        return PrivXStreamResponse(response, HTTPStatus.OK)

    def get_target_host_credentials(
        self,
        target_params: dict,
    ) -> PrivXAPIResponse:
        """
        Get target host credentials for the user.
        Returns:
            PrivXAPIResponse
        """
        response_status, data = self._http_post(
            UrlEnum.AUTHORIZER.TARGET_HOST,
            body=target_params,
        )
        return PrivXAPIResponse(response_status, HTTPStatus.OK, data)

    def get_principals(self) -> PrivXAPIResponse:
        """
        Get defined principals from the authorizer.
        Returns:
            PrivXAPIResponse
        """
        response_status, data = self._http_get(UrlEnum.AUTHORIZER.PRINCIPALS)
        return PrivXAPIResponse(response_status, HTTPStatus.OK, data)

    def get_principal(
        self,
        group_id: str,
        key_id: Optional[str] = None,
        filter_param: Optional[str] = None,
    ) -> PrivXAPIResponse:
        """
        Gets the principal key by its group ID.
        Returns:
            PrivXAPIResponse
        """
        search_params = self._get_search_params(
            key_id=key_id,
            filter=filter_param,
        )
        response_status, data = self._http_get(
            UrlEnum.AUTHORIZER.GROUP_PRINCIPAL_KEY,
            path_params={"group_id": group_id},
            query_params=search_params,
        )
        return PrivXAPIResponse(response_status, HTTPStatus.OK, data)
    def delete_principal_key(
        self, group_id: str, key_id: Optional[str] = None
    ) -> PrivXAPIResponse:
        """
        Deletes the principal key by its group ID.
        Returns:
            PrivXAPIResponse
        """
        search_params = self._get_search_params(key_id=key_id)
        response_status, data = self._http_delete(
            UrlEnum.AUTHORIZER.GROUP_PRINCIPAL_KEY,
            path_params={"group_id": group_id},
            query_params=search_params,
        )
        return PrivXAPIResponse(response_status, HTTPStatus.OK, data)

    def create_principal_key(
        self,
        group_id: str,
    ) -> PrivXAPIResponse:
        """
        Create a principal key pair.
        Returns:
            PrivXAPIResponse
        """
        response_status, data = self._http_post(
            UrlEnum.AUTHORIZER.CREATE_GROUP_PRINCIPAL_KEY,
            path_params={"group_id": group_id},
        )
        return PrivXAPIResponse(response_status, HTTPStatus.OK, data)

    def import_principal_key(
        self, group_id: str, principal_key_params: dict
    ) -> PrivXAPIResponse:
        """
        Import a principal key pair.
        Returns:
            PrivXAPIResponse
        """
        response_status, data = self._http_post(
            UrlEnum.AUTHORIZER.IMPORT_GROUP_PRINCIPAL_KEY,
            path_params={"group_id": group_id},
            body=principal_key_params,
        )
        return PrivXAPIResponse(response_status, HTTPStatus.OK, data)

    def sign_with_principal_key(
        self,
        group_id: str,
        key_id: str,
        sign_params: dict,
    ) -> PrivXAPIResponse:
        """
        Get a signature.
        Returns:
            PrivXAPIResponse
        """
        search_params = self._get_search_params(key_id=key_id)
        response_status, data = self._http_post(
            UrlEnum.AUTHORIZER.SIGN_GROUP_PRINCIPAL_KEY,
            path_params={"group_id": group_id},
            query_params=search_params,
            body=sign_params,
        )
        return PrivXAPIResponse(response_status, HTTPStatus.OK, data)
    def get_component_certs(
        self,
        ca_type: str,
        access_group_id: Optional[str] = None,
    ) -> PrivXAPIResponse:
        """
        Gets authorizer's CA certificates.
        ca_type should be `extender` or `icap`.
        Returns:
            PrivXAPIResponse
        """
        search_params = self._get_search_params(access_group_id=access_group_id)
        response_status, data = self._http_get(
            UrlEnum.AUTHORIZER.COMPONENT_CERTS,
            path_params={"ca_type": ca_type},
            query_params=search_params,
        )
        return PrivXAPIResponse(response_status, HTTPStatus.OK, data)

    def download_component_cert(
        self, ca_type: str, cert_id: str
    ) -> PrivXStreamResponse:
        """
        Gets authorizer's CA certificate.
        ca_type should be `extender` or `icap`.
        Returns:
            PrivXStreamResponse
        """
        response = self._http_stream(
            UrlEnum.AUTHORIZER.COMPONENT_CERT,
            path_params={"id": cert_id, "ca_type": ca_type},
        )
        return PrivXStreamResponse(response, HTTPStatus.OK)

    def download_component_cert_crl(
        self,
        ca_type: str,
        cert_id: str,
    ) -> PrivXStreamResponse:
        """
        Gets authorizer CA's certificate revocation list.
        ca_type should be `extender` or `icap`.
        Returns:
            PrivXStreamResponse
        """
        response = self._http_stream(
            UrlEnum.AUTHORIZER.COMPONENT_CERT_REVOCATION_LIST,
            path_params={"id": cert_id, "ca_type": ca_type},
        )
        return PrivXStreamResponse(response, HTTPStatus.OK)

    def create_extender_config_download_handle(
        self,
        trusted_client_id: str,
    ) -> PrivXAPIResponse:
        """
        Gets an extender-config.toml pre-configured for this PrivX installation.
        Returns:
            PrivXAPIResponse
        """
        response_status, data = self._http_post(
            UrlEnum.AUTHORIZER.EXTENDER_CONFIG_SESSION_ID,
            path_params={"trusted_client_id": trusted_client_id},
        )
        return PrivXAPIResponse(response_status, HTTPStatus.CREATED, data)

    def download_extender_config(
        self,
        trusted_client_id: str,
        session_id: str,
    ) -> PrivXStreamResponse:
        """
        Gets an extender-config.toml pre-configured for this PrivX installation.
        Returns:
            PrivXStreamResponse
        """
        response = self._http_stream(
            UrlEnum.AUTHORIZER.EXTENDER_CONFIG,
            path_params={
                "trusted_client_id": trusted_client_id,
                "session_id": session_id,
            },
        )
        return PrivXStreamResponse(response, HTTPStatus.OK)
    def create_deployment_script_download_handle(
        self, trusted_client_id: str
    ) -> PrivXAPIResponse:
        """
        Gets a deployment script pre-configured for this PrivX installation.
        Returns:
            PrivXAPIResponse
        """
        response_status, data = self._http_post(
            UrlEnum.AUTHORIZER.DEPLOYMENT_SCRIPT_SESSION_ID,
            path_params={"trusted_client_id": trusted_client_id},
        )
        return PrivXAPIResponse(response_status, HTTPStatus.CREATED, data)

    def download_deployment_script(
        self,
        trusted_client_id: str,
        session_id: str,
    ) -> PrivXStreamResponse:
        """
        Gets a deployment script pre-configured for this PrivX installation.
        Returns:
            PrivXStreamResponse
        """
        response = self._http_stream(
            UrlEnum.AUTHORIZER.DOWNLOAD_DEPLOYMENT_SCRIPT,
            path_params={
                "trusted_client_id": trusted_client_id,
                "session_id": session_id,
            },
        )
        return PrivXStreamResponse(response, HTTPStatus.OK)

    def download_principal_command_script(self) -> PrivXStreamResponse:
        """
        Gets the principals_command.sh script.
        Returns:
            PrivXStreamResponse
        """
        response = self._http_stream(
            UrlEnum.AUTHORIZER.DOWNLOAD_COMMAND_SCRIPT,
        )
        return PrivXStreamResponse(response, HTTPStatus.OK)

    def create_carrier_config_download_handle(
        self, trusted_client_id: str
    ) -> PrivXAPIResponse:
        """
        Gets a carrier-config.toml pre-configured for this PrivX installation.
        Returns:
            PrivXAPIResponse
        """
        response_status, data = self._http_post(
            UrlEnum.AUTHORIZER.CARRIER_CONFIG_SESSION_ID,
            path_params={"trusted_client_id": trusted_client_id},
        )
        return PrivXAPIResponse(response_status, HTTPStatus.CREATED, data)

    def download_carrier_config(
        self,
        trusted_client_id: str,
        session_id: str,
    ) -> PrivXStreamResponse:
        """
        Gets a carrier-config.toml pre-configured for this PrivX installation.
        Returns:
            PrivXStreamResponse
        """
        response = self._http_stream(
            UrlEnum.AUTHORIZER.DOWNLOAD_CARRIER_CONFIG,
            path_params={
                "trusted_client_id": trusted_client_id,
                "session_id": session_id,
            },
        )
        return PrivXStreamResponse(response, HTTPStatus.OK)

    def create_web_proxy_config_download_handle(
        self, trusted_client_id: str
    ) -> PrivXAPIResponse:
        """
        Gets a web-proxy-config.toml pre-configured for this PrivX installation.
        Returns:
            PrivXAPIResponse
        """
        response_status, data = self._http_post(
            UrlEnum.AUTHORIZER.WEB_PROXY_CONFIG_SESSION_ID,
            path_params={"trusted_client_id": trusted_client_id},
        )
        return PrivXAPIResponse(response_status, HTTPStatus.CREATED, data)

    def download_web_proxy_config(
        self,
        trusted_client_id: str,
        session_id: str,
    ) -> PrivXStreamResponse:
        """
        Gets a web-proxy-config.toml pre-configured for this PrivX installation.
        Returns:
            PrivXStreamResponse
        """
        response = self._http_stream(
            # NOTE: reuses the carrier-config download endpoint; a dedicated
            # web-proxy-config URL is likely intended here.
            UrlEnum.AUTHORIZER.DOWNLOAD_CARRIER_CONFIG,
            path_params={
                "trusted_client_id": trusted_client_id,
                "session_id": session_id,
            },
        )
        return PrivXStreamResponse(response, HTTPStatus.OK)
    def get_cert_auth_templates(
        self, service: Optional[str] = None
    ) -> PrivXAPIResponse:
        """
        Returns the certificate authentication templates for the service.
        Returns:
            PrivXAPIResponse
        """
        search_params = self._get_search_params(service=service)
        response_status, data = self._http_get(
            UrlEnum.AUTHORIZER.CERT_AUTH_TEMPLATES,
            query_params=search_params,
        )
        return PrivXAPIResponse(response_status, HTTPStatus.OK, data)

    def get_ssl_trust_anchor(self) -> PrivXAPIResponse:
        """
        Returns the SSL trust anchor (PrivX TLS CA certificate).
        Returns:
            PrivXAPIResponse
        """
        response_status, data = self._http_get(
            UrlEnum.AUTHORIZER.SSL_TRUST_ANCHOR,
        )
        return PrivXAPIResponse(response_status, HTTPStatus.OK, data)

    def get_extender_trust_anchor(self) -> PrivXAPIResponse:
        """
        Returns the extender trust anchor (PrivX Extender CA certificate).
        Returns:
            PrivXAPIResponse
        """
        response_status, data = self._http_get(
            UrlEnum.AUTHORIZER.EXTENDER_TRUST_ANCHOR,
        )
        return PrivXAPIResponse(response_status, HTTPStatus.OK, data)

    def get_access_groups(
        self,
        offset: Optional[int] = None,
        limit: Optional[int] = None,
        sort_key: Optional[str] = None,
        sort_dir: Optional[str] = None,
    ) -> PrivXAPIResponse:
        """
        Get access groups.
        Returns:
            PrivXAPIResponse
        """
        search_params = self._get_search_params(
            offset=offset,
            limit=limit,
            sortkey=sort_key,
            sortdir=sort_dir,
        )
        response_status, data = self._http_get(
            UrlEnum.AUTHORIZER.ACCESS_GROUPS,
            query_params=search_params,
        )
        return PrivXAPIResponse(response_status, HTTPStatus.OK, data)
    def create_access_group(self, access_group_params: dict) -> PrivXAPIResponse:
        """
        Create access group.
        Returns:
            PrivXAPIResponse
        """
        response_status, data = self._http_post(
            UrlEnum.AUTHORIZER.ACCESS_GROUPS,
            body=access_group_params,
        )
        return PrivXAPIResponse(response_status, HTTPStatus.CREATED, data)

    def search_access_groups(
        self,
        offset: Optional[int] = None,
        limit: Optional[int] = None,
        sort_key: Optional[str] = None,
        sort_dir: Optional[str] = None,
        access_group_params: Optional[dict] = None,
    ) -> PrivXAPIResponse:
        """
        Search access groups.
        Returns:
            PrivXAPIResponse
        """
        search_params = self._get_search_params(
            offset=offset,
            limit=limit,
            sortkey=sort_key,
            sortdir=sort_dir,
        )
        response_status, data = self._http_post(
            UrlEnum.AUTHORIZER.SEARCH_ACCESS_GROUPS,
            query_params=search_params,
            body=get_value(access_group_params, dict()),
        )
        return PrivXAPIResponse(response_status, HTTPStatus.OK, data)

    def get_access_group(self, access_group_id: str) -> PrivXAPIResponse:
        """
        Get access group.
        Returns:
            PrivXAPIResponse
        """
        response_status, data = self._http_get(
            UrlEnum.AUTHORIZER.ACCESS_GROUP,
            path_params={"id": access_group_id},
        )
        return PrivXAPIResponse(response_status, HTTPStatus.OK, data)

    def update_access_group(
        self, access_group_id: str, access_group_params: dict
    ) -> PrivXAPIResponse:
        """
        Update access group.
        Returns:
            PrivXAPIResponse
        """
        response_status, data = self._http_put(
            UrlEnum.AUTHORIZER.ACCESS_GROUP,
            path_params={"id": access_group_id},
            body=access_group_params,
        )
        return PrivXAPIResponse(response_status, HTTPStatus.OK, data)

    def delete_access_group(self, access_group_id: str) -> PrivXAPIResponse:
        """
        Delete access group.
        Returns:
            PrivXAPIResponse
        """
        response_status, data = self._http_delete(
            UrlEnum.AUTHORIZER.ACCESS_GROUP, path_params={"id": access_group_id}
        )
        return PrivXAPIResponse(response_status, HTTPStatus.OK, data)

    def search_certificates(
        self,
        cert_params: dict,
        offset: Optional[int] = None,
        limit: Optional[int] = None,
        sort_key: Optional[str] = None,
        sort_dir: Optional[str] = None,
    ) -> PrivXAPIResponse:
        """
        Search certificates.
        Returns:
            PrivXAPIResponse
        """
        search_params = self._get_search_params(
            offset=offset,
            limit=limit,
            sortkey=sort_key,
            sortdir=sort_dir,
        )
        response_status, data = self._http_post(
            UrlEnum.AUTHORIZER.SEARCH_CERTS,
            query_params=search_params,
            body=cert_params,
        )
        return PrivXAPIResponse(response_status, HTTPStatus.OK, data)
158954e78781556df0a260d6253c8263cc35205c | 36 | py | Python | gitlangstats/csv/__init__.py | zekroTJA/gitlangstats | cf3b1f625e2c345a107548703f1b79a5f44b2cd8 | ["MIT"]
# flake8: noqa
from .csv import *
01a8879ecb8b9fb6fe538add347b5287042f4183 | 4232 | py | Python | 2020/days/day11.py | terezaif/adventofcode | 67601f79a3b01d71434ef0236387ffd5ab7dca0f | ["MIT"] | 4 | 2020-12-06T13:11:59.000Z | 2021-12-15T11:34:34.000Z
import copy
def get_seat_count(input: list) -> int:
    changes, input = shuffle_seats(input)
    while changes > 0:
        changes, input = shuffle_seats(input)
    return get_seats(input)
def shuffle_seats(input):
    new_state = copy.deepcopy(input)
    changes = 0
    for r in range(0, len(input)):
        for c in range(0, len(input[r])):
            if input[r][c] == "L" and get_neighbors_occupied(r, c, input) == 0:
                new_state[r][c] = "#"
                changes += 1
            elif input[r][c] == "#" and get_neighbors_occupied(r, c, input) >= 4:
                new_state[r][c] = "L"
                changes += 1
    return changes, new_state


def get_neighbors_occupied(r, c, input):
    # Count occupied seats among the up-to-eight neighbours, clamping the index
    # ranges at the grid edges so they never wrap around (negative indices) or
    # run past the last row/column, and excluding the seat itself (hence the
    # part-one threshold of 4 above rather than a self-inclusive 5).
    occupied = sum(
        input[i][j] == "#"
        for i in range(max(r - 1, 0), min(r + 2, len(input)))
        for j in range(max(c - 1, 0), min(c + 2, len(input[i])))
        if (i, j) != (r, c)
    )
    return occupied
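Part one's adjacency rule counts the (up to) eight immediate neighbours. A self-contained sketch of that count with explicit edge clamping (grid values and the helper name are illustrative, separate from the functions above):

```python
def count_adjacent_occupied(grid, r, c):
    rows, cols = len(grid), len(grid[0])
    # Clamp both index ranges at the grid edges and skip the seat itself.
    return sum(
        grid[i][j] == "#"
        for i in range(max(r - 1, 0), min(r + 2, rows))
        for j in range(max(c - 1, 0), min(c + 2, cols))
        if (i, j) != (r, c)
    )

# The seat at (1, 1) sees three occupied neighbours, and the corner at
# (0, 0) sees two, with no out-of-range indices at the edges.
assert count_adjacent_occupied(["##", "#L"], 1, 1) == 3
assert count_adjacent_occupied(["##", "#L"], 0, 0) == 2
```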
def print_array(input):
    for line in input:
        print("".join(line))
def get_seat_count_2(input: list) -> int:
    # print(get_seats(input))
    # print_array(input)
    changes, input = shuffle_seats_diagonal(input)
    # print_array(input)
    # print(get_seats(input))
    while changes > 0:
        changes, input = shuffle_seats_diagonal(input)
        # print_array(input)
        # print(get_seats(input))
    return get_seats(input)
def shuffle_seats_diagonal(input):
    new_state = copy.deepcopy(input)
    changes = 0
    for r in range(0, len(input)):
        for c in range(0, len(input[r])):
            if input[r][c] == "L" and get_neighbors_occupied_diagonal(r, c, input) == 0:
                new_state[r][c] = "#"
                changes += 1
            elif (
                input[r][c] == "#" and get_neighbors_occupied_diagonal(r, c, input) >= 5
            ):
                new_state[r][c] = "L"
                changes += 1
    return changes, new_state
def get_neighbors_occupied_diagonal(r, c, input):
    # Scan outward in each of the eight directions; the first seat ("L" or "#")
    # encountered blocks the view, and only an occupied first seat counts.
    # This collapses the original eight hand-written scans into one loop with
    # identical behaviour.
    mr = len(input)
    mc = len(input[0])
    directions = [(-1, 1), (1, 1), (1, -1), (-1, -1), (0, 1), (0, -1), (1, 0), (-1, 0)]
    occupied = 0
    for dr, dc in directions:
        i, j = r + dr, c + dc
        while 0 <= i < mr and 0 <= j < mc:
            if input[i][j] in ("L", "#"):
                occupied += input[i][j] == "#"
                break
            i += dr
            j += dc
    return occupied
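Part two replaces adjacency with line of sight: in each direction the first seat ("L" or "#") blocks the view, and floor tiles (".") are skipped. A minimal single-direction sketch of that scan (the helper name and grid are illustrative, separate from the functions above):

```python
def first_seat_to_the_right(grid, r, c):
    # Walk right from (r, c); floor "." is transparent, seats block the view.
    for j in range(c + 1, len(grid[0])):
        if grid[r][j] in ("L", "#"):
            return grid[r][j]
    return None  # no seat visible in this direction

# From (1, 1) the floor tile at (1, 2) is skipped and the occupied seat at
# (1, 3) is the first one visible; from (1, 3) nothing lies to the right.
assert first_seat_to_the_right(["....", ".L.#"], 1, 1) == "#"
assert first_seat_to_the_right(["....", ".L.#"], 1, 3) is None
```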
def get_seats(matrix):
    # Count occupied seats over the whole grid; iterate each row's own length
    # so non-square grids are counted correctly.
    seats = sum(
        matrix[i][j] == "#"
        for i in range(0, len(matrix))
        for j in range(0, len(matrix[i]))
    )
    return seats
bf1adea57e13d5b11fede7e98bc6929519b9db80 | 32 | py | Python | airbus_cobot_gui/src/airbus_cobot_gui/timestamp/__init__.py | ipa320/airbus_coop | 974564807ba5d24096e237a9991311608a390da1 | ["Apache-2.0"] | 4 | 2017-10-15T23:32:24.000Z | 2019-12-26T12:31:53.000Z
from timestamp import Timestamp
| 16 | 31 | 0.875 | 4 | 32 | 7 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.125 | 32 | 1 | 32 | 32 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
bf25226cefddd855d024708d2935e20d74d8dd04 | 1,309 | py | Python | blog/migrations/0026_auto_20200218_2148.py | sonikarichamodur/website | 5f4be6325675c17870ccf6f78757047151dbf3dd | [
"Apache-2.0"
] | null | null | null | blog/migrations/0026_auto_20200218_2148.py | sonikarichamodur/website | 5f4be6325675c17870ccf6f78757047151dbf3dd | [
"Apache-2.0"
] | 36 | 2019-02-01T09:13:24.000Z | 2021-06-09T19:00:40.000Z | blog/migrations/0026_auto_20200218_2148.py | sonikarichamodur/website | 5f4be6325675c17870ccf6f78757047151dbf3dd | [
"Apache-2.0"
] | 1 | 2021-12-11T02:45:28.000Z | 2021-12-11T02:45:28.000Z | # Generated by Django 2.2.10 on 2020-02-19 02:48
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('blog', '0025_auto_20191123_1545'),
]
operations = [
migrations.AlterField(
model_name='historicalmember',
name='team',
field=models.CharField(choices=[('None', 'None'), ('Software', 'Software'), ('CAD/Manufacturing', 'CAD/Manufacturing'), ('Outreach', 'Outreach'), ('Strategy', 'Strategy'), ('Safety', 'Safety'), ('Helping Hands', 'Helping Hands'), ('Sponsorship', 'Sponsorship'), ('Graphics/Spirit', 'Graphics/Spirit'), ('Comun/Website', 'Comun/Website'), ('Visuals', 'Visuals')], default='None', max_length=255, verbose_name='team name'),
),
migrations.AlterField(
model_name='member',
name='team',
field=models.CharField(choices=[('None', 'None'), ('Software', 'Software'), ('CAD/Manufacturing', 'CAD/Manufacturing'), ('Outreach', 'Outreach'), ('Strategy', 'Strategy'), ('Safety', 'Safety'), ('Helping Hands', 'Helping Hands'), ('Sponsorship', 'Sponsorship'), ('Graphics/Spirit', 'Graphics/Spirit'), ('Comun/Website', 'Comun/Website'), ('Visuals', 'Visuals')], default='None', max_length=255, verbose_name='team name'),
),
]
| 54.541667 | 433 | 0.622613 | 129 | 1,309 | 6.248062 | 0.410853 | 0.039702 | 0.062035 | 0.07196 | 0.717122 | 0.717122 | 0.717122 | 0.717122 | 0.717122 | 0.717122 | 0 | 0.034959 | 0.169595 | 1,309 | 23 | 434 | 56.913043 | 0.706532 | 0.035141 | 0 | 0.470588 | 1 | 0 | 0.41475 | 0.018239 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.058824 | 0 | 0.235294 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
17184a09e308a2c88126d8119067d29a08106dea | 2,765 | py | Python | tests/test_losses/test_gan_loss.py | liuhd073/mmgeneration | 2e09a6b63c5f0ddee850d429c5b739ae1e0cc76d | [
"Apache-2.0"
] | null | null | null | tests/test_losses/test_gan_loss.py | liuhd073/mmgeneration | 2e09a6b63c5f0ddee850d429c5b739ae1e0cc76d | [
"Apache-2.0"
] | null | null | null | tests/test_losses/test_gan_loss.py | liuhd073/mmgeneration | 2e09a6b63c5f0ddee850d429c5b739ae1e0cc76d | [
"Apache-2.0"
] | null | null | null | # Copyright (c) OpenMMLab. All rights reserved.
import numpy.testing as npt
import pytest
import torch
from mmgen.models.losses.gan_loss import GANLoss
def test_gan_losses():
"""Test gan losses."""
with pytest.raises(NotImplementedError):
GANLoss(
'xixihaha',
loss_weight=1.0,
real_label_val=1.0,
fake_label_val=0.0)
input_1 = torch.ones(1, 1)
input_2 = torch.ones(1, 3, 6, 6) * 2
# vanilla
gan_loss = GANLoss(
'vanilla', loss_weight=2.0, real_label_val=1.0, fake_label_val=0.0)
loss = gan_loss(input_1, True, is_disc=False)
npt.assert_almost_equal(loss.item(), 0.6265233)
loss = gan_loss(input_1, False, is_disc=False)
npt.assert_almost_equal(loss.item(), 2.6265232)
loss = gan_loss(input_1, True, is_disc=True)
npt.assert_almost_equal(loss.item(), 0.3132616)
loss = gan_loss(input_1, False, is_disc=True)
npt.assert_almost_equal(loss.item(), 1.3132616)
# lsgan
gan_loss = GANLoss(
'lsgan', loss_weight=2.0, real_label_val=1.0, fake_label_val=0.0)
loss = gan_loss(input_2, True, is_disc=False)
npt.assert_almost_equal(loss.item(), 2.0)
loss = gan_loss(input_2, False, is_disc=False)
npt.assert_almost_equal(loss.item(), 8.0)
loss = gan_loss(input_2, True, is_disc=True)
npt.assert_almost_equal(loss.item(), 1.0)
loss = gan_loss(input_2, False, is_disc=True)
npt.assert_almost_equal(loss.item(), 4.0)
# wgan
gan_loss = GANLoss(
'wgan', loss_weight=2.0, real_label_val=1.0, fake_label_val=0.0)
loss = gan_loss(input_2, True, is_disc=False)
npt.assert_almost_equal(loss.item(), -4.0)
loss = gan_loss(input_2, False, is_disc=False)
npt.assert_almost_equal(loss.item(), 4)
loss = gan_loss(input_2, True, is_disc=True)
npt.assert_almost_equal(loss.item(), -2.0)
loss = gan_loss(input_2, False, is_disc=True)
npt.assert_almost_equal(loss.item(), 2.0)
# wgan
gan_loss = GANLoss(
'wgan-logistic-ns',
loss_weight=2.0,
real_label_val=1.0,
fake_label_val=0.0)
loss = gan_loss(input_2, True, is_disc=False)
assert loss.item() > 0
loss = gan_loss(input_2, False, is_disc=False)
assert loss.item() > 0
# hinge
gan_loss = GANLoss(
'hinge', loss_weight=2.0, real_label_val=1.0, fake_label_val=0.0)
loss = gan_loss(input_2, True, is_disc=False)
npt.assert_almost_equal(loss.item(), -4.0)
loss = gan_loss(input_2, False, is_disc=False)
npt.assert_almost_equal(loss.item(), -4.0)
loss = gan_loss(input_2, True, is_disc=True)
npt.assert_almost_equal(loss.item(), 0.0)
loss = gan_loss(input_2, False, is_disc=True)
npt.assert_almost_equal(loss.item(), 3.0)
| 35 | 75 | 0.667269 | 459 | 2,765 | 3.745098 | 0.124183 | 0.097731 | 0.115183 | 0.167539 | 0.786504 | 0.786504 | 0.760908 | 0.743456 | 0.712042 | 0.712042 | 0 | 0.054103 | 0.19783 | 2,765 | 78 | 76 | 35.448718 | 0.72092 | 0.033635 | 0 | 0.483871 | 0 | 0 | 0.016911 | 0 | 0 | 0 | 0 | 0 | 0.290323 | 1 | 0.016129 | false | 0 | 0.064516 | 0 | 0.080645 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
bd6cdfcb2d2baa7c9890807db8a64ffe94a651e9 | 129 | py | Python | src/Hyperactive/hyperactive/sub_packages/iota/iota.py | skn123/LDWPSO-CNN | 7f05eb1defee2e968e5b3bed53f2b444b2b48fdb | [
"MIT"
] | 6 | 2020-01-24T16:15:34.000Z | 2022-03-21T13:53:32.000Z | src/Hyperactive/hyperactive/sub_packages/iota/iota.py | skn123/LDWPSO-CNN | 7f05eb1defee2e968e5b3bed53f2b444b2b48fdb | [
"MIT"
] | 1 | 2020-06-15T04:19:32.000Z | 2020-06-15T04:19:32.000Z | src/Hyperactive/hyperactive/sub_packages/iota/iota.py | skn123/LDWPSO-CNN | 7f05eb1defee2e968e5b3bed53f2b444b2b48fdb | [
"MIT"
] | 3 | 2021-03-29T17:11:27.000Z | 2021-05-17T13:33:10.000Z | # Author: Simon Blanke
# Email: simon.blanke@yahoo.com
# License: MIT License
class Iota:
def __init__(self):
pass
| 14.333333 | 31 | 0.666667 | 17 | 129 | 4.823529 | 0.823529 | 0.268293 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.232558 | 129 | 8 | 32 | 16.125 | 0.828283 | 0.550388 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0.333333 | 0 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 6 |
bdd5e3f29772766dab26fbb7264f33fe35870786 | 30 | py | Python | cppwg/__init__.py | josephsnyder/cppwg | 265117455ed57eb250643a28ea6029c2bccf3ab3 | [
"MIT"
] | 21 | 2017-10-03T14:29:36.000Z | 2021-12-07T08:54:43.000Z | cppwg/__init__.py | josephsnyder/cppwg | 265117455ed57eb250643a28ea6029c2bccf3ab3 | [
"MIT"
] | 2 | 2017-12-29T19:17:44.000Z | 2020-03-27T14:59:27.000Z | cppwg/__init__.py | josephsnyder/cppwg | 265117455ed57eb250643a28ea6029c2bccf3ab3 | [
"MIT"
] | 6 | 2019-03-21T11:55:52.000Z | 2021-07-13T20:49:50.000Z | from cppwg.generators import * | 30 | 30 | 0.833333 | 4 | 30 | 6.25 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.1 | 30 | 1 | 30 | 30 | 0.925926 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
bdfcdb1c9d91bfe15d00aa70ba5f70d769f8a4b2 | 144 | py | Python | savvy/__init__.py | mschrader15/savvy | ae9e06195ac76c0d0a729c8a5cdd24ec36d5297c | [
"BSD-2-Clause"
] | 38 | 2016-03-19T20:14:43.000Z | 2022-01-05T22:53:22.000Z | savvy/__init__.py | mschrader15/savvy | ae9e06195ac76c0d0a729c8a5cdd24ec36d5297c | [
"BSD-2-Clause"
] | 8 | 2016-02-23T23:17:29.000Z | 2016-03-11T21:40:33.000Z | savvy/__init__.py | houghb/HDSAviz | 82e7e897ac3bef6291dbd2b7b93a78de4db64c69 | [
"BSD-2-Clause"
] | 10 | 2016-07-30T04:05:46.000Z | 2021-09-20T02:44:24.000Z | from . import data_processing
from . import interactive_plots
from . import plotting
from . import network_tools
from . import sensitivity_tools | 28.8 | 31 | 0.833333 | 19 | 144 | 6.105263 | 0.526316 | 0.431034 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.131944 | 144 | 5 | 32 | 28.8 | 0.928 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
da1f27b0f2ba1797b6dd69ecca33d6506a8c725a | 4,215 | py | Python | store/tests/test_context_processor.py | kevgathuku/compshop | 385fcb608d492921d41072d2d634fc46164ee0f8 | [
"BSD-3-Clause"
] | 1 | 2015-07-04T08:24:59.000Z | 2015-07-04T08:24:59.000Z | store/tests/test_context_processor.py | kevgathuku/compshop | 385fcb608d492921d41072d2d634fc46164ee0f8 | [
"BSD-3-Clause"
] | null | null | null | store/tests/test_context_processor.py | kevgathuku/compshop | 385fcb608d492921d41072d2d634fc46164ee0f8 | [
"BSD-3-Clause"
] | null | null | null | """
Tests for Custom context processors.
"""
import os
from django.conf import settings
from django.core.urlresolvers import reverse
from django.test import TestCase, override_settings
from .factories import ProductFactory
class ProductCategoriesContextProcessorTests(TestCase):
"""
Tests for the ``store.context_processors.product_categories`` processor.
"""
@override_settings(
TEMPLATES = [
{
'BACKEND': 'django.template.backends.django.DjangoTemplates',
'DIRS': [os.path.join(settings.BASE_DIR, 'templates')],
'APP_DIRS': True,
'OPTIONS': {
'context_processors': [
'django.template.context_processors.debug',
'django.template.context_processors.request',
'django.contrib.auth.context_processors.auth',
'django.contrib.messages.context_processors.messages',
'store.context_processors.product_categories',
],
},
},
]
)
def test_custom_context_exists_if_context_processor_included(self):
# Get the homepage
response = self.client.get(reverse('home'))
self.assertIn('product_categories', response.context)
@override_settings(
TEMPLATES = [
{
'BACKEND': 'django.template.backends.django.DjangoTemplates',
'DIRS': [os.path.join(settings.BASE_DIR, 'templates')],
'APP_DIRS': True,
'OPTIONS': {
'context_processors': [
'django.template.context_processors.debug',
'django.template.context_processors.request',
'django.contrib.auth.context_processors.auth',
'django.contrib.messages.context_processors.messages',
],
},
},
]
)
def test_custom_context_does_not_exist_if_not_included_in_settings(self):
# Get the homepage
response = self.client.get(reverse('home'))
self.assertNotIn('product_categories', response.context)
class FeaturedProductsContextProcessorTests(TestCase):
"""
Tests for the ``store.context_processors.featured_products`` processor.
"""
@override_settings(
TEMPLATES = [
{
'BACKEND': 'django.template.backends.django.DjangoTemplates',
'DIRS': [os.path.join(settings.BASE_DIR, 'templates')],
'APP_DIRS': True,
'OPTIONS': {
'context_processors': [
'django.template.context_processors.debug',
'django.template.context_processors.request',
'django.contrib.auth.context_processors.auth',
'django.contrib.messages.context_processors.messages',
'store.context_processors.featured_products',
],
},
},
]
)
def test_custom_context_exists_if_context_processor_included(self):
# Get the homepage
response = self.client.get(reverse('home'))
self.assertIn('featured', response.context)
@override_settings(
TEMPLATES = [
{
'BACKEND': 'django.template.backends.django.DjangoTemplates',
'DIRS': [os.path.join(settings.BASE_DIR, 'templates')],
'APP_DIRS': True,
'OPTIONS': {
'context_processors': [
'django.template.context_processors.debug',
'django.template.context_processors.request',
'django.contrib.auth.context_processors.auth',
'django.contrib.messages.context_processors.messages',
],
},
},
]
)
def test_custom_context_does_not_exist_if_not_included_in_settings(self):
# Get the homepage
response = self.client.get(reverse('home'))
self.assertNotIn('featured', response.context)
| 37.633929 | 78 | 0.559193 | 347 | 4,215 | 6.570605 | 0.181556 | 0.186404 | 0.073684 | 0.108772 | 0.847368 | 0.818421 | 0.818421 | 0.782456 | 0.782456 | 0.782456 | 0 | 0 | 0.337604 | 4,215 | 111 | 79 | 37.972973 | 0.816619 | 0.059312 | 0 | 0.629213 | 0 | 0 | 0.320336 | 0.248981 | 0 | 0 | 0 | 0 | 0.044944 | 1 | 0.044944 | false | 0 | 0.05618 | 0 | 0.123596 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
da2601ac6cf4ea8f41a6a6d212ed8eb6020f73cd | 160,883 | py | Python | BH3.py | MayankDAS/BH_GUI | d680a82ebc20b8b757392bc5014ca3d6d6533f22 | [
"CC0-1.0"
] | null | null | null | BH3.py | MayankDAS/BH_GUI | d680a82ebc20b8b757392bc5014ca3d6d6533f22 | [
"CC0-1.0"
] | null | null | null | BH3.py | MayankDAS/BH_GUI | d680a82ebc20b8b757392bc5014ca3d6d6533f22 | [
"CC0-1.0"
] | null | null | null | from tkinter import *
import tkinter as tk
from tkinter import ttk
import mysql.connector
from tkinter import messagebox
root = tk.Tk()
root.title("BRAND HUT")
Grid.rowconfigure(root, 0, weight=1)
Grid.rowconfigure(root, 1, weight=1)
Grid.rowconfigure(root, 2, weight=1)
Grid.columnconfigure(root, 0, weight=1)
Grid.columnconfigure(root, 2, weight=1)
Grid.columnconfigure(root, 1, weight=1)
root.minsize(300,200)
root.iconbitmap('icov1.ico')
def connectmdb():
try:
mydb = mysql.connector.connect(host="localhost", user="root", password="root123", database="purchasinglog")
connectionLabel=Label(root, text="Successfully Connected").grid(row=2, column=2)
except mysql.connector.Error as e:
connectionLabel=Label(root, text="Failed to Connect").grid(row=2, column=2)
#-----------------------------------------------PURCHASING FRAME--------------------------------------------------------------------------------------------------------------
def newpurchase():
newpur = Toplevel()
newpur.title("Purchase Entry")
newpur.minsize(500,300)
newpur.iconbitmap('icov1.ico')
def savepurchase():
mydb = mysql.connector.connect(host="localhost", user="root", password="root123", database="purchasinglog")
mycursor = mydb.cursor()
date=E1.get()
party=E2.get()
bill=E3.get()
qtym=E4.get()
ratem=E5.get()
qtys=E44.get()
rates=E55.get()
if (ratem == ""):
ratem=0
if (qtym == ""):
qtym=0
if (rates == ""):
rates=0
if (qtys == ""):
qtys=0
amntm= float(qtym)*float(ratem)
amnts= float(qtys)*float(rates)
amntTT = amntm + amnts
ratess=str(rates)
qtyss=str(qtys)
ratems=str(ratem)
qtyms=str(qtym)
amntms=str(amntm)
amntss=str(amnts)
amntTTs=str(amntTT)
if (amntm==0):
items = " Sooji "
elif (amnts==0):
items = " Maida "
else:
items =" Maida Sooji "
try:
mycursor.execute("INSERT INTO purchasingentry (Date, Bill_Number, Unit, QuantityM, AmountM, Party_Name, RateM, Item, QuantityS, RateS, AmountS) VALUES ('"+ date +"','"+ bill +"','QTL','"+ qtyms +"','"+amntms+"','"+ party +"','"+ ratems +"', '"+items+"','"+ qtyss +"','"+ ratess +"','"+ amntss +"')")
mycursor.execute("INSERT INTO partpaymentpurchase (AmountT, Party_Name, Bill_Number, BIll_Date, Balance, Balance_d) VALUES('"+amntTTs+"', '"+party+"','"+bill+"', '"+ date +"', '"+amntTTs+"', '"+amntTTs+"')")
messagebox.showinfo("showinfo", "Successfully Submitted Entry for "+items)
except mysql.connector.Error as err:
messagebox.showerror("showerror", err)
mydb.commit()
#bal=amntt - amntpd
#Lbal = Label(newpur, text=bal).grid(row=8, column=1)
Lamnt = Label(newpur, text=amntTTs).grid(row=9, column=1)
def querrybill():
L10 = Label(newpur, text=" ").grid(row=4, column=1)
L11 = Label(newpur, text=" ").grid(row=5, column=1)
L11 = Label(newpur, text=" ").grid(row=6, column=1)
L11 = Label(newpur, text=" ").grid(row=7, column=1)
L11 = Label(newpur, text=" ").grid(row=8, column=1)
mydb = mysql.connector.connect(host="localhost", user="root", password="root123", database="purchasinglog")
mycursor = mydb.cursor()
billqq=E10.get()
mycursor.execute("SELECT AmountT, Party_Name, Amount_Paid, Balance, BIll_Date, Balance_d FROM partpaymentpurchase WHERE Bill_Number = '"+ billqq +"'")
myresult = mycursor.fetchall()
for x in myresult:
amntqr = str(x[0])
partyqr= str(x[1])
pdqr= str(x[2])
balqr= str(x[3])
dateqr = str(x[4])
bald = str(x[5])
L11 = Label(frquery, text="Party: ").grid(row=3, column=0)
L10 = Label(frquery, text=partyqr).grid(row=3, column=1)
L11 = Label(frquery, text="Amount: ").grid(row=4, column=0)
L10 = Label(frquery, text=amntqr).grid(row=4, column=1)
L11 = Label(frquery, text="Paid: ").grid(row=5, column=0)
L10 = Label(frquery, text=pdqr).grid(row=5, column=1)
L11 = Label(frquery, text="Last Balance: ").grid(row=6, column=0)
L10 = Label(frquery, text=bald).grid(row=6, column=1)
L11 = Label(frquery, text="Bill Date: ").grid(row=7, column=0)
L10 = Label(frquery, text=dateqr).grid(row=7, column=1)
def updatepayment():
L10 = Label(frquery, text=" ").grid(row=5, column=1)
L10 = Label(frquery, text=" ").grid(row=6, column=1)
L13 = Label(frquery, text=" ").grid(row=11, column=1)
mainlog = mysql.connector.connect(host="localhost", user="root", password="root123", database="purchasinglog")
mycursor = mainlog.cursor()
billqq = E10.get()
partdt = E12.get()
partpay = E13.get()
partdts = str(partdt)
partpays = str(partpay)
mycursor.execute("SELECT AmountT, Party_Name, Balance, Amount_Paid, BIll_Date, Bill_Number FROM partpaymentpurchase WHERE Bill_Number = '"+ billqq +"'")
myresult = mycursor.fetchall()
for x in myresult:
amntqr = str(x[0])
partyqr= str(x[1])
balqr= str(x[2])
amntpd = str(x[3])
dateqr = str(x[4])
bill = str(x[5])
if (balqr=="None"):
balqr="0"
if (amntpd=="None"):
amntpd="0"
#upamntt = float(amntqr) - float(partpay)
#baln= float(amntqr) - float(partpay)
baln=float(balqr)-float(partpays)
upamntpd = float(amntpd) + float(partpay)
upamntpds= str(upamntpd)
upamntT=float(amntqr) - float(upamntpds)
upamntTs=str(upamntT)
try:
mycursor.execute("UPDATE partpaymentpurchase SET Balance='"+ upamntTs +"', Amount_Paid= '"+upamntpds+"' WHERE Bill_Number='"+ billqq +"' ")
mycursor.execute("INSERT INTO partpaymentpurchase (AmountT, Party_Name, Bill_Number, Part_date, Part_payment, Balance, Amount_Paid, BIll_Date, Balance_d) VALUES('"+amntqr+"', '"+partyqr+"','"+bill+"', '"+partdts+"', '"+partpays+"', '"+ upamntTs +"','"+upamntpds+"','"+dateqr+"', '"+str(baln)+"' )")
messagebox.showinfo("showinfo", "Update Successfull")
except mysql.connector.Error as err:
messagebox.showerror("showerror", err)
L10 = Label(frquery, text=upamntpds).grid(row=5, column=1)
L13 = Label(frquery, text=upamntTs).grid(row=6, column=1)
L10 = Label(frquery, text=upamntTs).grid(row=11, column=1)
mainlog.commit()
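The balance bookkeeping in `updatepayment` reduces to two lines of arithmetic; a sketch of pulling it into a pure helper (hypothetical name `apply_part_payment`) that can be tested without a database:

```python
def apply_part_payment(total, paid_so_far, part_payment):
    """Pure arithmetic of the part-payment update: returns the new
    cumulative amount paid and the remaining balance on the bill."""
    new_paid = paid_so_far + part_payment
    new_balance = total - new_paid
    return new_paid, new_balance
```

The UPDATE and INSERT statements would then just persist the two returned numbers.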
def savepack():
mainlog = mysql.connector.connect(host="localhost", user="root", password="root123", database="purchasinglog")
mycursor = mainlog.cursor()
date=E1.get()
party=E2.get()
bill=E3.get()
qtypouch=E17.get()
ratepouch=E18.get()
qtybag=E19.get()
ratebag=E20.get()
qtycart=E21.get()
ratecart=E22.get()
if (qtypouch==""):
qtypouch="0"
if (ratepouch==""):
ratepouch="0"
if (qtybag==""):
qtybag="0"
if (ratebag==""):
ratebag="0"
if (qtycart==""):
qtycart="0"
if (ratecart==""):
ratecart="0"
amntpouch = float(qtypouch) * float(ratepouch)
amntbag = float(qtybag) * float(ratebag)
amntcart = float(qtycart) * float(ratecart)
amntpouchs=str(amntpouch)
amntbags=str(amntbag)
amntcarts=str(amntcart)
thpack = float(qtypouch) * 0.1497
thpacks =str(thpack)
L20=Label(frpack, text=amntpouchs).grid(row=6, column=1)
L21=Label(frpack, text=amntbags).grid(row=7, column=1)
L22=Label(frpack, text=amntcarts).grid(row=8, column=1)
try:
mycursor.execute("INSERT INTO packingpurchase (Date, Party_Name, Bill_Number, QtyP, RateP, AmountP, QtyB, RateB, AmountB, QtyC, RateC, AmountC, packingth) VALUES('"+ date +"', '"+ party +"', '"+ bill +"', '"+ qtypouch +"', '"+ ratepouch +"', '"+ amntpouchs +"', '"+ qtybag +"', '"+ ratebag +"', '"+ amntbags +"', '"+ qtycart +"', '"+ ratecart +"', '"+ amntcarts +"', '"+ thpacks +"') ")
messagebox.showinfo("showinfo", "Packing Entry Saved Successfully")
except mysql.connector.Error as err:
messagebox.showerror("showerror", err)
mainlog.commit()
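Every `execute` call here builds SQL by string concatenation, which breaks on quotes in party names (e.g. `O'Brien`) and is open to SQL injection. `mysql.connector` supports `%s` placeholders with a separate parameter tuple; a sketch using a subset of the `packingpurchase` columns from above (the function takes any DB-API cursor, so no live database is needed to exercise it):

```python
def insert_packing(cursor, date, party, bill, qty_pouch, rate_pouch):
    """Insert a packing row via parameterized placeholders (%s) so the
    driver escapes values itself instead of string concatenation."""
    amount_pouch = float(qty_pouch) * float(rate_pouch)
    cursor.execute(
        "INSERT INTO packingpurchase "
        "(Date, Party_Name, Bill_Number, QtyP, RateP, AmountP) "
        "VALUES (%s, %s, %s, %s, %s, %s)",
        (date, party, bill, qty_pouch, rate_pouch, amount_pouch),
    )
```

The same pattern applies to every INSERT/UPDATE/SELECT in this file.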
#____________________________NEW ENTRY____________________________________________________
frnew = LabelFrame(newpur, text="New Bill Entry", padx=10, pady=10)
frnew.grid(row=0, column=0, padx=10, pady=10)
frquery = LabelFrame(newpur, text="Query and Update", padx=10, pady=10)
frquery.grid(row=0, column=1, padx=10, pady=10)
frpack = LabelFrame(newpur, text="Packing Materials", padx=10, pady=10)
frpack.grid(row=0, column=2, padx=10, pady=10)
L1 = Label(frnew, text="Date")
L1.grid(row = 1, column = 0)
E1 = Entry(frnew)
E1.grid(row=1, column=1)
L2 = Label(frnew, text="Vendor")
L2.grid(row=2, column=0)
E2 = Entry(frnew)
E2.grid(row=2, column=1)
L3 = Label(frnew, text="Bill No.")
L3.grid(row=3, column=0)
E3 = Entry(frnew)
E3.grid(row=3, column=1)
L4 = Label(frnew, text="Maida Qty.")
L4.grid(row=5, column=0)
E4 = Entry(frnew)
E4.grid(row=6, column=0)
L5 = Label(frnew, text="Maida Rate")
L5.grid(row=7, column=0)
E5 = Entry(frnew)
E5.grid(row=8, column=0)
L4 = Label(frnew, text="Sooji Qty.")
L4.grid(row=5, column=1)
E44 = Entry(frnew)
E44.grid(row=6, column=1)
L5 = Label(frnew, text="Sooji Rate")
L5.grid(row=7, column=1)
E55 = Entry(frnew)
E55.grid(row=8, column=1)
L6 = Label(frnew, text="Amount")
L6.grid(row=9, column=0)
L8 = Label(frnew, text="Balance")
L8.grid(row=10, column=0)
SavePur = Button(frnew, text="Save Entry", command=savepurchase).grid(row=11, column=1)
#___________________________________PACKING___________________________________________________________________________ ____________________
LLp=Label(frpack, text=" ").grid(row=0, column=0)
L17 = Label(frpack, text="Packing Pouch (Panni):").grid(row=0, column=0)
L17 = Label(frpack, text="Qty (Kgs)").grid(row=0, column=1)
L17 = Label(frpack, text="Rate").grid(row=0, column=2)
E17 =Entry(frpack)
E17.grid(row=1, column=1)
E18 =Entry(frpack)
E18.grid(row=1, column=2)
L18 = Label(frpack, text="Packing Bags (Bori):").grid(row=2, column=0)
L17 = Label(frpack, text="Qty (No.)").grid(row=2, column=1)
L17 = Label(frpack, text="Rate").grid(row=2, column=2)
E19 =Entry(frpack)
E19.grid(row=3, column=1)
E20 =Entry(frpack)
E20.grid(row=3, column=2)
L19 = Label(frpack, text="Empty Carton (Gatte):").grid(row=4, column=0)
L17 = Label(frpack, text="Qty (No.)").grid(row=4, column=1)
L17 = Label(frpack, text="Rate").grid(row=4, column=2)
E21 =Entry(frpack)
E21.grid(row=5, column=1)
E22 =Entry(frpack)
E22.grid(row=5, column=2)
L20=Label(frpack, text="Amount Pouch:").grid(row=6, column=0)
L21=Label(frpack, text="Amount Bags:").grid(row=7, column=0)
L22=Label(frpack, text="Amount Carton:").grid(row=8, column=0)
Bsavepack = Button(frpack, text="Save Packings", command=savepack).grid(row=9, column=1)
# ___________________________ QUERY and UPDATE ___________________________________________________________________
Lab = Label(frquery, text=" ").grid(row=0, column=2)
Lsb = Label(frquery, text="Search Bill:").grid(row=1, column=0)
E10 = Entry(frquery)
E10.grid(row=1, column=1)
Bquerry = Button(frquery, text="Search",command=querrybill).grid(row=2, column=1)
L12 = Label(frquery, text=" ").grid(row=8, column=0)
L12 = Label(frquery, text="Enter Part Payment Date:").grid(row=9, column=0)
E12 = Entry(frquery)
E12.grid(row=9, column=1)
L13 = Label(frquery, text="Part Payment:").grid(row=10, column=0)
E13 = Entry(frquery)
E13.grid(row=10, column=1)
L13 = Label(frquery, text="Updated Balance:").grid(row=11, column=0)
Bupdate = Button(frquery, text="Update Payment", command=updatepayment).grid(row=12, column=0)
#-----------------------------------------------PURCHASING FRAME ENDS--------------------------------------------------------------------------------------------------------------
#-----------------------------------------------PRODUCTION FRAME STARTS--------------------------------------------------------------------------------------------------------------
def newproduction():
newprod = Toplevel()
newprod.title("Production Entry")
newprod.minsize(700,400)
newprod.iconbitmap('icov1.ico')
def savecons():
mainlog = mysql.connector.connect(host="localhost", user="root", password="root123", database="productionlog")
mycursor = mainlog.cursor()
date=E1.get()
qtyM=E2.get()
qtyS=E3.get()
        # Reject a fully empty form before defaulting blanks to "0",
        # otherwise this check can never fire.
        if (qtyM == "" and qtyS == ""):
            messagebox.showerror("showerror", "No entries!!")
            return
        if (qtyM == ""):
            qtyM = "0"
        if (qtyS == ""):
            qtyS = "0"
        upfactorM = E11.get()
        upfactorN = E10.get()
        thNoo = float(upfactorN) * float(qtyM)
        thNoos = str(thNoo)
        thMac = float(upfactorM) * float(qtyS)
        thMacs = str(thMac)
try:
mycursor.execute("INSERT INTO consumption (Date, QtyM, QtyS, ThNoodleProd, ThMacProd) VALUES('"+ date +"', '"+ qtyM +"', '"+ qtyS +"', '"+ thNoos +"', '"+ thMacs +"') ")
messagebox.showinfo("showinfo", "Consumption Entry Saved Successfully")
except mysql.connector.Error as err:
messagebox.showerror("showerror", err)
mainlog.commit()
def saveprods():
mainlog = mysql.connector.connect(host="localhost", user="root", password="root123", database="productionlog")
mycursor = mainlog.cursor()
date=E1.get()
qtyNoodle=E7.get()
qtyMac=E8.get()
qtyPasta=E9.get()
if (qtyNoodle==""):
qtyNoodle="0"
if (qtyMac==""):
qtyMac="0"
if (qtyPasta==""):
qtyPasta="0"
qtypk = int(qtyNoodle) * 25
qtypks=str(qtypk)
qtycts=qtyNoodle
qtybgs=qtyMac
try:
mycursor.execute("INSERT INTO production (Date, qtyNoodle, qtyMac, qtyPasta, QtyP, QtyB, QtyC) VALUES('"+ date +"', '"+ qtyNoodle +"', '"+ qtyMac +"', '"+ qtyPasta +"', '"+ qtypks +"','"+ qtybgs +"','"+ qtycts +"') ")
messagebox.showinfo("showinfo", "Production Entry Saved Successfully")
except mysql.connector.Error as err:
messagebox.showerror("showerror", err)
mainlog.commit()
#______________________________________ CONSUPTION ________________________________________________________________________
L1 = Label(newprod, text="Today's Date").grid(row = 0, column = 0, padx=10, pady=10, sticky='W')
E1 = Entry(newprod)
E1.grid(row=0, column=0, padx=10, pady=10)
frcons = LabelFrame(newprod, text="Consumption", padx=20, pady=20, height=400, width=400)
frcons.grid(row=1, column=0, padx=20, pady=20)
frprod = LabelFrame(newprod, text="Production", padx=20, pady=20,height=400, width=400)
frprod.grid(row=1, column=1, padx=20, pady=20)
L1 = Label(frcons, text="Maida Qty (Bori) :")
L1.grid(row = 0, column = 0, padx=5, pady=5)
E2 = Entry(frcons)
E2.grid(row=0, column=1, padx=5, pady=5)
L1 = Label(frcons, text="Sooji Qty (Bori) :").grid(row = 1, column = 0, padx=5, pady=5)
E3 = Entry(frcons)
E3.grid(row=1, column=1, padx=5, pady=5)
L2= Label(frcons, text=" ").grid(row=0, column=2)
L2= Label(frcons, text=" ").grid(row=2, column=0)
factorN =StringVar(newprod, value='2.6')
factorM =StringVar(newprod, value='2.4')
L1 = Label(frcons, text="Noodle Factor :")
L1.grid(row = 3, column = 0, padx=5, pady=5)
E10 = Entry(frcons, textvariable=factorN)
E10.grid(row=3, column=1, padx=5, pady=5)
L1 = Label(frcons, text="Macroni Factor :")
L1.grid(row = 4, column = 0, padx=5, pady=5)
E11 = Entry(frcons, textvariable=factorM)
E11.grid(row=4, column=1, padx=5, pady=5)
Bsavecons = Button(frcons, text="Save Consumption", command=savecons).grid(row=6, column = 1, padx=5, pady=5)
#__________________________________________ PRODUCTION ________________________________________________________________________
L1 = Label(frprod, text="Noodle Qty :")
L1.grid(row = 0, column = 0, padx=5, pady=5)
E7 = Entry(frprod)
E7.grid(row=0, column=1, padx=5, pady=5)
L1 = Label(frprod, text="Macroni Qty :")
L1.grid(row = 1, column = 0, padx=5, pady=5)
E8 = Entry(frprod)
E8.grid(row=1, column=1, padx=5, pady=5)
L1 = Label(frprod, text="Pasta Qty :")
L1.grid(row = 2, column = 0, padx=5, pady=5)
E9 = Entry(frprod)
E9.grid(row=2, column=1, padx=5, pady=5)
Bsaveprod = Button(frprod, text="Save Production", command=saveprods).grid(row=6, column = 1, padx=5, pady=5)
#-----------------------------------------------PRODUCTION FRAME ENDS--------------------------------------------------------------------------------------------------------------
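The `if (x == ""): x = "0"` chains recur in every save handler above. A small helper (hypothetical name `to_float`) would centralize the empty-Entry defaulting:

```python
def to_float(entry_text, default=0.0):
    """Convert a Tkinter Entry's .get() text to float, treating an
    empty field as `default`.  Replaces the repeated
    `if x == "": x = "0"` chains before each rate * quantity product."""
    text = entry_text.strip()
    return float(text) if text else default
```

A handler could then write `amntm = to_float(E4.get()) * to_float(E5.get())` directly.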
#-----------------------------------------------SALES FRAME STARTS--------------------------------------------------------------------------------------------------------------
def newsales():
newsale = Toplevel()
newsale.title("Sales Entry")
newsale.minsize(300,400)
newsale.iconbitmap('icov1.ico')
def savesales():
mydb = mysql.connector.connect(host="localhost", user="root", password="root123", database="newsales")
mycursor = mydb.cursor()
date=E1.get()
party=E2.get()
bill=E3.get()
qtyn=E4.get()
raten=E5.get()
qtym=E44.get()
ratem=E55.get()
qtyp=E6.get()
ratep=E7.get()
if (ratem == ""):
ratem=0
if (qtym == ""):
qtym=0
if (raten == ""):
raten=0
if (qtyn == ""):
qtyn=0
if (ratep == ""):
ratep=0
if (qtyp == ""):
qtyp=0
amntm= float(qtym)*float(ratem)
amntn= float(qtyn)*float(raten)
amntp= float(qtyp)*float(ratep)
amntTT = amntm + amntn + amntp
ratens=str(raten)
qtyns=str(qtyn)
ratems=str(ratem)
qtyms=str(qtym)
rateps=str(ratep)
qtyps=str(qtyp)
amntms=str(amntm)
amntns=str(amntn)
amntps=str(amntp)
amntTTs=str(amntTT)
        # Compare the numeric amounts directly rather than their string
        # forms; comparing against the literal "0.0" is fragile.
        items = ""
        if amntn != 0:
            items += " Noodle "
        if amntm != 0:
            items += " Macroni "
        if amntp != 0:
            items += " Pasta "
try:
mycursor.execute("INSERT INTO salesentry (Date, Bill_Number, Party_Name, Item, RateM, QtyM, AmountM, QtyN, RateN, AmountN, QtyP, RateP, AmountP) VALUES ('"+ date +"','"+ bill +"','"+ party +"', '"+ items +"', '"+ ratems +"', '"+ qtyms +"','"+amntms+"','"+ qtyns +"','"+ ratens +"','"+ amntns +"','"+ qtyps +"','"+ rateps +"','"+ amntps +"') ")
mycursor.execute("INSERT INTO paymentsales (AmountT, Party_Name, Bill_Number, Bill_Date, Balance_d, Balance) VALUES('"+amntTTs+"', '"+party+"','"+bill+"', '"+ date +"', '"+ amntTTs +"', '"+ amntTTs +"')")
messagebox.showinfo("showinfo", "Successfully Submitted Sales Entry")
except mysql.connector.Error as err:
messagebox.showerror("showerror", err)
mydb.commit()
L6 = Label(frsale, text=amntTTs)
L6.grid(row=9, column=1)
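The per-item arithmetic in `savesales` (quantity × rate per item, summed into `amntTT`) can be sketched as a standalone function (hypothetical name `bill_total`), treating empty Entry text as zero:

```python
def bill_total(lines):
    """Given (qty, rate) string pairs from Entry widgets, return the
    per-line amounts and the bill total; empty fields count as zero."""
    total = 0.0
    amounts = []
    for qty, rate in lines:
        amount = float(qty or 0) * float(rate or 0)
        amounts.append(amount)
        total += amount
    return amounts, total
```

`savesales` would call it with the noodle, macroni, and pasta (qty, rate) pairs.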
def querrysales():
L10 = Label(frpay, text=" ").grid(row=4, column=1)
L11 = Label(frpay, text=" ").grid(row=5, column=1)
L11 = Label(frpay, text=" ").grid(row=6, column=1)
L11 = Label(frpay, text=" ").grid(row=7, column=1)
L11 = Label(frpay, text=" ").grid(row=8, column=1)
mydb = mysql.connector.connect(host="localhost", user="root", password="root123", database="newsales")
mycursor = mydb.cursor()
billqq=E10.get()
mycursor.execute("SELECT AmountT, Party_Name, Amount_Paid, Balance, Bill_Date, Balance_d FROM paymentsales WHERE Bill_Number = '"+ billqq +"'")
myresult = mycursor.fetchall()
for x in myresult:
amntqr = str(x[0])
partyqr= str(x[1])
pdqr= str(x[2])
balqr= str(x[3])
dateqr = str(x[4])
bald = str(x[5])
L11 = Label(frpay, text="Party: ").grid(row=4, column=0)
L10 = Label(frpay, text=partyqr).grid(row=4, column=1)
L11 = Label(frpay, text="Amount: ").grid(row=5, column=0)
L10 = Label(frpay, text=amntqr).grid(row=5, column=1)
L11 = Label(frpay, text="Paid: ").grid(row=6, column=0)
L10 = Label(frpay, text=pdqr).grid(row=6, column=1)
L11 = Label(frpay, text="Last Balance: ").grid(row=7, column=0)
L10 = Label(frpay, text=bald).grid(row=7, column=1)
L11 = Label(frpay, text="Bill Date: ").grid(row=8, column=0)
L10 = Label(frpay, text=dateqr).grid(row=8, column=1)
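querrysales stringifies each fetched column individually before displaying it; a hypothetical helper (not in the original code) condenses that pattern into one dict comprehension:

```python
def row_to_fields(row, names):
    """Pair column names with stringified values, ready for Label display."""
    return {n: str(v) for n, v in zip(names, row)}
```

For example, `row_to_fields(x, ["AmountT", "Party_Name", ...])` would replace the six manual `str(x[i])` assignments.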
def uppaysale():
L10 = Label(frpay, text=" ").grid(row=7, column=1)
L10 = Label(frpay, text=" ").grid(row=6, column=1)
mainlog = mysql.connector.connect(host="localhost", user="root", password="root123", database="newsales")
mycursor = mainlog.cursor()
billqq = E10.get()
partdt = E12.get()
partpay = E13.get()
partdts = str(partdt)
partpays = str(partpay)
mycursor.execute("SELECT AmountT, Party_Name, Balance, Amount_Paid, Bill_Date, Bill_Number, Balance_d FROM paymentsales WHERE Bill_Number = %s", (billqq,))
myresult = mycursor.fetchall()
for x in myresult:
amntqr = str(x[0])
partyqr= str(x[1])
balqr= str(x[2])
amntpd = str(x[3])
dateqr = str(x[4])
bill = str(x[5])
bald = str(x[6])  # Balance_d is the seventh selected column; x[5] is the Bill_Number
if (balqr=="None"):
balqr="0"
if (amntpd=="None"):
amntpd="0"
#upamntt = float(amntqr) - float(partpay)
#baln= float(amntqr) - float(partpay)
baln=float(balqr)-float(partpays)
#upbal=float(baln) - float(partpays)
#upbals=str(upbal)
upamntpd = float(amntpd) + float(partpay)
upamntpds= str(upamntpd)
upamntT = float(amntqr) - upamntpd
upamntTs=str(upamntT)
try:
mycursor.execute("UPDATE paymentsales SET Balance=%s, Amount_Paid=%s WHERE Bill_Number=%s", (upamntTs, upamntpds, billqq))
mycursor.execute("INSERT INTO paymentsales (AmountT, Party_Name, Bill_Number, Part_date, Part_Pay, Balance, Amount_Paid, Bill_Date, Balance_d) VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s)", (amntqr, partyqr, bill, partdts, partpays, upamntTs, upamntpds, dateqr, str(baln)))
messagebox.showinfo("showinfo", "Update Successful")
except mysql.connector.Error as err:
messagebox.showerror("showerror", err)
L10 = Label(frpay, text=upamntpds).grid(row=6, column=1)
L10 = Label(frpay, text=upamntTs).grid(row=12, column=1)
L10 = Label(frpay, text=upamntTs).grid(row=7, column=1)
mainlog.commit()
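The part-payment arithmetic in uppaysale (Amount_Paid accumulates, Balance is the bill total minus everything paid so far, NULL columns treated as zero) can be expressed as one pure function; this is a sketch with a hypothetical name, not the original implementation:

```python
def apply_part_payment(total, paid_so_far, part_payment):
    """Return (updated_paid, updated_balance) after a part payment.

    NULL (None) Amount_Paid is treated as zero, as the GUI code does.
    """
    paid = float(paid_so_far or 0)
    new_paid = paid + float(part_payment)
    return new_paid, float(total) - new_paid
```

Keeping the arithmetic separate from the cursor calls also makes it unit-testable without a database.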
frsale = LabelFrame(newsale, text="New Bill Entry", padx=20, pady=20, height=400, width=400)
frsale.grid(row=1, column=0, padx=20, pady=20)
frpay = LabelFrame(newsale, text="Part Payment Update", padx=20, pady=20,height=400, width=400)
frpay.grid(row=1, column=1, padx=20, pady=20)
#__________________________________________ New Bill Entry ________________________________________________________________________
L1 = Label(frsale, text="Date")
L1.grid(row = 1, column = 0, padx=5, pady=5)
E1 = Entry(frsale)
E1.grid(row=1, column=1, padx=5, pady=5)
L2 = Label(frsale, text="Party")
L2.grid(row=2, column=0, padx=5, pady=5)
E2 = Entry(frsale)
E2.grid(row=2, column=1, padx=5, pady=5)
L3 = Label(frsale, text="Bill No.")
L3.grid(row=3, column=0, padx=5, pady=5)
E3 = Entry(frsale)
E3.grid(row=3, column=1, padx=5, pady=5)
L4 = Label(frsale, text="Noodle Qty.")
L4.grid(row=5, column=0, padx=5, pady=5)
E4 = Entry(frsale)
E4.grid(row=6, column=0, padx=5, pady=5)
L5 = Label(frsale, text="Noodle Rate")
L5.grid(row=7, column=0, padx=5, pady=5)
E5 = Entry(frsale)
E5.grid(row=8, column=0, padx=5, pady=5)
L4 = Label(frsale, text="Macaroni Qty.")
L4.grid(row=5, column=1, padx=5, pady=5)
E44 = Entry(frsale)
E44.grid(row=6, column=1, padx=5, pady=5)
L5 = Label(frsale, text="Macaroni Rate")
L5.grid(row=7, column=1, padx=5, pady=5)
E55 = Entry(frsale)
E55.grid(row=8, column=1, padx=5, pady=5)
L4 = Label(frsale, text="Pasta Qty.")
L4.grid(row=5, column=2, padx=5, pady=5)
E6 = Entry(frsale)
E6.grid(row=6, column=2, padx=5, pady=5)
L5 = Label(frsale, text="Pasta Rate")
L5.grid(row=7, column=2, padx=5, pady=5)
E7 = Entry(frsale)
E7.grid(row=8, column=2, padx=5, pady=5)
L6 = Label(frsale, text="Amount")
L6.grid(row=9, column=0)
Bsavesale = Button(frsale, text="Save New Bill", command=savesales).grid(row=11, column = 1, padx=5, pady=5)
#__________________________________________ Payment Update ________________________________________________________________________
Lsb = Label(frpay, text="Search Bill:").grid(row=1, column=0)
E10 = Entry(frpay)
E10.grid(row=1, column=1)
Bquerry = Button(frpay, text="Search",command=querrysales).grid(row=2, column=1)
L12 = Label(frpay, text=" ").grid(row=3, column=0)
L12 = Label(frpay, text="Enter Part Payment Date:").grid(row=10, column=0)
E12 = Entry(frpay)
E12.grid(row=10, column=1)
L13 = Label(frpay, text="Part Payment:").grid(row=11, column=0)
E13 = Entry(frpay)
E13.grid(row=11, column=1)
L13 = Label(frpay, text="Updated Balance:").grid(row=12, column=0)
Bupdate = Button(frpay, text="Update Payment", command=uppaysale).grid(row=13, column=1)
#-----------------------------------------------SALES FRAME ENDS--------------------------------------------------------------------------------------------------------------
#-----------------------------------------------EXPENDITURE FRAME STARTS--------------------------------------------------------------------------------------------------------------
def expend():
newexp = Toplevel()
newexp.title("Expenditure Entry")
newexp.minsize(400,300)
newexp.iconbitmap('icov1.ico')
def saveregx():
mainlog = mysql.connector.connect(host="localhost", user="root", password="root123", database="expenditure")
mycursor = mainlog.cursor()
date=E1.get()
salary=E2.get()
diesel=E3.get()
electricity=E4.get()
if (salary == ""):
salary="0"
if (diesel == ""):
diesel="0"
if (electricity == ""):
electricity="0"
texp = float(salary) + float(diesel) + float(electricity)
texps = str(texp)
try:
mycursor.execute("INSERT INTO expreg (Date, Salary, Diesel, Electricity, expT) VALUES (%s, %s, %s, %s, %s)", (date, salary, diesel, electricity, texps))
messagebox.showinfo("showinfo", "Expenditures Added")
except mysql.connector.Error as err:
messagebox.showerror("showerror", err)
mainlog.commit()
LT=Label(frregx, text="Total expenditure :").grid(row=6, column=0)
LT=Label(frregx, text=texps).grid(row=6, column=1)
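saveregx normalises each blank expenditure field to "0" with a separate if statement; a small hypothetical helper (not in the original code) handles any number of fields at once:

```python
def blanks_to_zero(*fields):
    """Replace blank Entry strings with "0" so later float() calls never fail."""
    return tuple(f if f.strip() else "0" for f in fields)
```

With it, the three guards collapse to `salary, diesel, electricity = blanks_to_zero(salary, diesel, electricity)`.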
def saveiregx():
mainlog = mysql.connector.connect(host="localhost", user="root", password="root123", database="expenditure")
mycursor = mainlog.cursor()
date=E5.get()
name=E6.get()
amount=E7.get()
try:
mycursor.execute("INSERT INTO expireg (Date, Name, Amount) VALUES (%s, %s, %s)", (date, name, amount))
messagebox.showinfo("showinfo", "Expenditures Added")
except mysql.connector.Error as err:
messagebox.showerror("showerror", err)
mainlog.commit()
def clrirx():
# Clear the existing Entry widgets; the old version stacked fresh empty
# Entries on top of them, leaving the originals (and their text) behind.
E5.delete(0, END)
E6.delete(0, END)
E7.delete(0, END)
#__________________________________________ REGULAR EXP________________________________________________________________________
frregx = LabelFrame(newexp, text="Regular Expenditures", padx=10, pady=10, height=400, width=400)
frregx.grid(row=0, column=0, padx=10, pady=10)
friregx = LabelFrame(newexp, text="Irregular Expenditures", padx=10, pady=10, height=400, width=400)
friregx.grid(row=0, column=1, padx=10, pady=10)
L1 = Label(frregx, text="Date")
L1.grid(row = 0, column = 0, padx=5, pady=5)
E1 = Entry(frregx)
E1.grid(row=0, column=1, padx=5, pady=5)
L1 = Label(frregx, text="Rs.")
L1.grid(row = 1, column = 1, padx=5, pady=5, sticky="W")
L1 = Label(frregx, text="Salary :")
L1.grid(row = 2, column = 0, padx=5, pady=5)
E2 = Entry(frregx)
E2.grid(row=2, column=1, padx=5, pady=5)
L1 = Label(frregx, text="Diesel :")
L1.grid(row = 3, column = 0, padx=5, pady=5)
E3 = Entry(frregx)
E3.grid(row=3, column=1, padx=5, pady=5)
L1 = Label(frregx, text="Electricity :")
L1.grid(row = 4, column = 0, padx=5, pady=5)
E4 = Entry(frregx)
E4.grid(row=4, column=1, padx=5, pady=5)
B1 = Button(frregx, text="Save", command=saveregx).grid(row=7, column=1)
#__________________________________________ IRREGULAR EXP________________________________________________________________________
L1 = Label(friregx, text="Date")
L1.grid(row = 0, column = 0, padx=5, pady=5)
E5 = Entry(friregx)
E5.grid(row=0, column=1, padx=5, pady=5)
L1 = Label(friregx, text="Name of Expenditure :")
L1.grid(row = 1, column = 0, padx=5, pady=5)
E6 = Entry(friregx)
E6.grid(row=1, column=1, padx=5, pady=5)
L1 = Label(friregx, text="Amount")
L1.grid(row = 2, column = 0, padx=5, pady=5)
E7 = Entry(friregx)
E7.grid(row=2, column=1, padx=5, pady=5)
B2 = Button(friregx, text="Save", command=saveiregx).grid(row=3, column=1)
B3 = Button(friregx, text="Clear", command=clrirx).grid(row=3, column=0)
#-----------------------------------------------EXPENDITURE FRAME ENDS--------------------------------------------------------------------------------------------------------------
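The INSERT statements throughout this file build SQL by string concatenation, which breaks on values containing quotes and is open to SQL injection. The safer placeholder pattern is sketched below with the stdlib sqlite3 driver standing in for mysql.connector (which uses %s placeholders instead of ?; table layout here mirrors expreg):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE expreg (Date TEXT, Salary TEXT, Diesel TEXT, Electricity TEXT, expT TEXT)")

# The driver quotes each value itself, so input like "O'Brien" is handled safely.
row = ("2021-04-01", "5000", "1200", "800", "7000")
cur.execute("INSERT INTO expreg (Date, Salary, Diesel, Electricity, expT) VALUES (?, ?, ?, ?, ?)", row)
conn.commit()
```

The same shape applies to every execute() call in this file: keep the SQL constant and pass the Entry values as the second argument.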
#-----------------------------------------------RESULTS FRAME STARTS--------------------------------------------------------------------------------------------------------------
def results():
result = Toplevel()
result.title("Results And Queries")
result.minsize(400,300)
result.iconbitmap('icov1.ico')
#__________________________________________________________VEIW LOGS______________________________________________________________________________________________
def viewlistbox():
def onFrameConfigure(canvas):
'''Reset the scroll region to encompass the inner frame'''
canvas.configure(scrollregion=canvas.bbox("all"))
Lb1 = Toplevel()
Lb1.title("LOGS")
#Lb1.minsize(300,300)
Grid.rowconfigure(Lb1, 0, weight=1)
Grid.columnconfigure(Lb1, 0, weight=1)
Grid.columnconfigure(Lb1, 1, weight=1)
dfrom=E1.get()
dto=E2.get()
d1 = drop1.get()
d2 = drop2.get()
canvas = Canvas(Lb1)
canvas.grid(row=0, column=0, sticky=N+E+W+S)
frlogs = Frame(canvas)
#frlogs.configure(scrollregion=frlogs.bbox("all"))
if (d1=="Purchase"):
mainlog = mysql.connector.connect(host="localhost", user="root", password="root123", database="purchasinglog")
mycursor = mainlog.cursor()
if (d2=="Maida"):
mycursor.execute("SELECT Date, Bill_Number, Party_Name, QuantityM, RateM, AmountM FROM purchasingentry WHERE Date BETWEEN %s AND %s AND Item LIKE %s ORDER BY Date DESC", (dfrom, dto, "%Maida%"))
myresult = mycursor.fetchall()
rc=len(myresult)
Lbb=Label(frlog, text=" ").grid(row=3, column=1)
Lbb=Label(frlog, text=" ").grid(row=4, column=1)
Lbb=Label(frlog, text=" ").grid(row=5, column=1)
Lbb=Label(frlog, text=" ").grid(row=3, column=0)
Lbb=Label(frlog, text=" ").grid(row=4, column=0)
Lbb=Label(frlog, text=" ").grid(row=5, column=0)
Label(frlogs, text="Date").grid(row=0, column=0, sticky=N+W+E)
Label(frlogs, text="Bill").grid(row=0, column=1, sticky=N+W+E)
Label(frlogs, text="Party").grid(row=0, column=2, sticky=N+W+E)
Label(frlogs, text="Qty").grid(row=0, column=3, sticky=N+W+E)
Label(frlogs, text="Rate").grid(row=0, column=4, sticky=N+W+E)
Label(frlogs, text="Amount").grid(row=0, column=5, sticky=N+W+E)
Label(frlogs, text="Sr. No.").grid(row=0, column=6, sticky=N+W+E)
#Lb1.insert(1, "Date Bill_Number Party Qty Rate Amount")
date = [lis[0] for lis in myresult]
bill = [lis[1] for lis in myresult]
party = [lis[2] for lis in myresult]
qtym = [lis[3] for lis in myresult]
ratem = [lis[4] for lis in myresult]
amntm = [lis[5] for lis in myresult]
#print(res[2])
sumamnt=0
for y in range(rc):
Lb=Label(frlogs, text=str(date[y])).grid(row=y+1, column=0)
Lb=Label(frlogs, text=str(bill[y])).grid(row=y+1, column=1)
Lb=Label(frlogs, text=str(party[y])).grid(row=y+1, column=2)
Lb=Label(frlogs, text=str(qtym[y])).grid(row=y+1, column=3)
Lb=Label(frlogs, text=str(ratem[y])).grid(row=y+1, column=4)
Lb=Label(frlogs, text=str(amntm[y])).grid(row=y+1, column=5)
Label(frlogs, text=str(y+1)).grid(row=y+1, column=6)
if (str(amntm[y])=="None"):
amntm[y] = 0
sumamnt = sumamnt + float(str(amntm[y]))
Lbb=Label(frlog, text="Total Purchasing :").grid(row=3, column=0)
Lbb=Label(frlog, text=str(sumamnt)).grid(row=3, column=1)
Lbb=Label(frlogs, text="Total Purchasing :").grid(row=rc+1, column=0)
Lbb=Label(frlogs, text=str(sumamnt)).grid(row=rc+1, column=1)
if (d2=="Sooji"):
mycursor.execute("SELECT Date, Bill_Number, Party_Name, QuantityS, RateS, AmountS FROM purchasingentry WHERE Date BETWEEN %s AND %s AND Item LIKE %s ORDER BY Date DESC", (dfrom, dto, "%Sooji%"))
myresult = mycursor.fetchall()
rc=len(myresult)
Lbb=Label(frlog, text=" ").grid(row=3, column=1)
Lbb=Label(frlog, text=" ").grid(row=4, column=1)
Lbb=Label(frlog, text=" ").grid(row=5, column=1)
Lbb=Label(frlog, text=" ").grid(row=3, column=0)
Lbb=Label(frlog, text=" ").grid(row=4, column=0)
Lbb=Label(frlog, text=" ").grid(row=5, column=0)
Label(frlogs, text="Date").grid(row=0, column=0, sticky=N+W+E)
Label(frlogs, text="Bill").grid(row=0, column=1, sticky=N+W+E)
Label(frlogs, text="Party").grid(row=0, column=2, sticky=N+W+E)
Label(frlogs, text="Qty").grid(row=0, column=3, sticky=N+W+E)
Label(frlogs, text="Rate").grid(row=0, column=4, sticky=N+W+E)
Label(frlogs, text="Amount").grid(row=0, column=5, sticky=N+W+E)
Label(frlogs, text="Sr. No.").grid(row=0, column=6, sticky=N+W+E)
#Lb1.insert(1, "Date Bill_Number Party Qty Rate Amount")
date = [lis[0] for lis in myresult]
bill = [lis[1] for lis in myresult]
party = [lis[2] for lis in myresult]
qtym = [lis[3] for lis in myresult]
ratem = [lis[4] for lis in myresult]
amntm = [lis[5] for lis in myresult]
#print(res[2])
sumamnt=0
for y in range(rc):
Lb=Label(frlogs, text=str(date[y])).grid(row=y+1, column=0)
Lb=Label(frlogs, text=str(bill[y])).grid(row=y+1, column=1)
Lb=Label(frlogs, text=str(party[y])).grid(row=y+1, column=2)
Lb=Label(frlogs, text=str(qtym[y])).grid(row=y+1, column=3)
Lb=Label(frlogs, text=str(ratem[y])).grid(row=y+1, column=4)
Lb=Label(frlogs, text=str(amntm[y])).grid(row=y+1, column=5)
Label(frlogs, text=str(y+1)).grid(row=y+1, column=6)
if (str(amntm[y])=="None"):
amntm[y] = 0
sumamnt = sumamnt + float(str(amntm[y]))
Lbb=Label(frlog, text="Total Purchasing :").grid(row=3, column=0)
Lbb=Label(frlog, text=str(sumamnt)).grid(row=3, column=1)
Lbb=Label(frlogs, text="Total Purchasing :").grid(row=rc+1, column=0)
Lbb=Label(frlogs, text=str(sumamnt)).grid(row=rc+1, column=1)
if (d2=="Maida and Sooji"):
mycursor.execute("SELECT Date, Bill_Number, Party_Name, QuantityS, RateS, AmountS, QuantityM, RateM, AmountM FROM purchasingentry WHERE Date BETWEEN %s AND %s ORDER BY Date DESC", (dfrom, dto))
myresult = mycursor.fetchall()
rc=len(myresult)
Lbb=Label(frlog, text=" ").grid(row=3, column=1)
Lbb=Label(frlog, text=" ").grid(row=4, column=1)
Lbb=Label(frlog, text=" ").grid(row=5, column=1)
Lbb=Label(frlog, text=" ").grid(row=3, column=0)
Lbb=Label(frlog, text=" ").grid(row=4, column=0)
Lbb=Label(frlog, text=" ").grid(row=5, column=0)
Label(frlogs, text="Date").grid(row=0, column=0, sticky=N+W+E)
Label(frlogs, text="Bill").grid(row=0, column=1, sticky=N+W+E)
Label(frlogs, text="Party").grid(row=0, column=2, sticky=N+W+E)
Label(frlogs, text="QtyM").grid(row=0, column=3, sticky=N+W+E)
Label(frlogs, text="RateM").grid(row=0, column=4, sticky=N+W+E)
Label(frlogs, text="AmountM").grid(row=0, column=5, sticky=N+W+E)
Label(frlogs, text="QtyS").grid(row=0, column=6, sticky=N+W+E)
Label(frlogs, text="RateS").grid(row=0, column=7, sticky=N+W+E)
Label(frlogs, text="AmountS").grid(row=0, column=8, sticky=N+W+E)
Label(frlogs, text="Sr. No.").grid(row=0, column=9, sticky=N+W+E)
#Lb1.insert(1, "Date Bill_Number Party Qty Rate Amount")
date = [lis[0] for lis in myresult]
bill = [lis[1] for lis in myresult]
party = [lis[2] for lis in myresult]
qtys = [lis[3] for lis in myresult]
rates = [lis[4] for lis in myresult]
amnts = [lis[5] for lis in myresult]
qtym = [lis[6] for lis in myresult]
ratem = [lis[7] for lis in myresult]
amntm = [lis[8] for lis in myresult]
#print(res[2])
sumamntm=0
sumamnts=0
for y in range(rc):
Lb=Label(frlogs, text=str(date[y])).grid(row=y+1, column=0)
Lb=Label(frlogs, text=str(bill[y])).grid(row=y+1, column=1)
Lb=Label(frlogs, text=str(party[y])).grid(row=y+1, column=2)
Lb=Label(frlogs, text=str(qtym[y])).grid(row=y+1, column=3)
Lb=Label(frlogs, text=str(ratem[y])).grid(row=y+1, column=4)
Lb=Label(frlogs, text=str(amntm[y])).grid(row=y+1, column=5)
Lb=Label(frlogs, text=str(qtys[y])).grid(row=y+1, column=6)
Lb=Label(frlogs, text=str(rates[y])).grid(row=y+1, column=7)
Lb=Label(frlogs, text=str(amnts[y])).grid(row=y+1, column=8)
Lb=Label(frlogs, text=str(y+1)).grid(row=y+1, column=9)
if (str(amntm[y])=="None"):
amntm[y] = 0
if (str(amnts[y])=="None"):
amnts[y] = 0
sumamntm = sumamntm + float(str(amntm[y]))
sumamnts = sumamnts + float(str(amnts[y]))
sumamntt =sumamnts + sumamntm
Lbb=Label(frlog, text="Total Purchasing :").grid(row=3, column=0)
Lbb=Label(frlog, text=str(sumamntt)).grid(row=3, column=1)
Lbb=Label(frlogs, text="Total Purchasing :").grid(row=rc+1, column=0)
Lbb=Label(frlogs, text=str(sumamntt)).grid(row=rc+1, column=1)
if (d2=="Pouches"):
mycursor.execute("SELECT Date, Bill_Number, Party_Name, QtyP, RateP, AmountP FROM packingpurchase WHERE Date BETWEEN %s AND %s ORDER BY Date DESC", (dfrom, dto))
myresult = mycursor.fetchall()
rc=len(myresult)
Lbb=Label(frlog, text=" ").grid(row=3, column=1)
Lbb=Label(frlog, text=" ").grid(row=4, column=1)
Lbb=Label(frlog, text=" ").grid(row=5, column=1)
Lbb=Label(frlog, text=" ").grid(row=3, column=0)
Lbb=Label(frlog, text=" ").grid(row=4, column=0)
Lbb=Label(frlog, text=" ").grid(row=5, column=0)
Label(frlogs, text="Date").grid(row=0, column=0, sticky=N+W+E)
Label(frlogs, text="Bill").grid(row=0, column=1, sticky=N+W+E)
Label(frlogs, text="Party").grid(row=0, column=2, sticky=N+W+E)
Label(frlogs, text="Qty").grid(row=0, column=3, sticky=N+W+E)
Label(frlogs, text="Rate").grid(row=0, column=4, sticky=N+W+E)
Label(frlogs, text="Amount").grid(row=0, column=5, sticky=N+W+E)
Label(frlogs, text="Sr. No.").grid(row=0, column=6, sticky=N+W+E)
#Lb1.insert(1, "Date Bill_Number Party Qty Rate Amount")
date = [lis[0] for lis in myresult]
bill = [lis[1] for lis in myresult]
party = [lis[2] for lis in myresult]
qtyp = [lis[3] for lis in myresult]
ratep = [lis[4] for lis in myresult]
amntp = [lis[5] for lis in myresult]
#print(res[2])
sumamnt=0
for y in range(rc):
Lb=Label(frlogs, text=str(date[y])).grid(row=y+1, column=0)
Lb=Label(frlogs, text=str(bill[y])).grid(row=y+1, column=1)
Lb=Label(frlogs, text=str(party[y])).grid(row=y+1, column=2)
Lb=Label(frlogs, text=str(qtyp[y])).grid(row=y+1, column=3)
Lb=Label(frlogs, text=str(ratep[y])).grid(row=y+1, column=4)
Lb=Label(frlogs, text=str(amntp[y])).grid(row=y+1, column=5)
Lb=Label(frlogs, text=str(y+1)).grid(row=y+1, column=6)
if (str(amntp[y])=="None"):
amntp[y] = 0
sumamnt = sumamnt + float(str(amntp[y]))
Lbb=Label(frlog, text="Total Purchasing :").grid(row=3, column=0)
Lbb=Label(frlog, text=str(sumamnt)).grid(row=3, column=1)
Lbb=Label(frlogs, text="Total Purchasing :").grid(row=rc+1, column=0)
Lbb=Label(frlogs, text=str(sumamnt)).grid(row=rc+1, column=1)
if (d2=="Bags"):
mycursor.execute("SELECT Date, Bill_Number, Party_Name, QtyB, RateB, AmountB FROM packingpurchase WHERE Date BETWEEN %s AND %s ORDER BY Date DESC", (dfrom, dto))
myresult = mycursor.fetchall()
rc=len(myresult)
Lbb=Label(frlog, text=" ").grid(row=3, column=1)
Lbb=Label(frlog, text=" ").grid(row=4, column=1)
Lbb=Label(frlog, text=" ").grid(row=5, column=1)
Lbb=Label(frlog, text=" ").grid(row=3, column=0)
Lbb=Label(frlog, text=" ").grid(row=4, column=0)
Lbb=Label(frlog, text=" ").grid(row=5, column=0)
Label(frlogs, text="Date").grid(row=0, column=0, sticky=N+W+E)
Label(frlogs, text="Bill").grid(row=0, column=1, sticky=N+W+E)
Label(frlogs, text="Party").grid(row=0, column=2, sticky=N+W+E)
Label(frlogs, text="Qty").grid(row=0, column=3, sticky=N+W+E)
Label(frlogs, text="Rate").grid(row=0, column=4, sticky=N+W+E)
Label(frlogs, text="Amount").grid(row=0, column=5, sticky=N+W+E)
Label(frlogs, text="Sr. No.").grid(row=0, column=6, sticky=N+W+E)
#Lb1.insert(1, "Date Bill_Number Party Qty Rate Amount")
date = [lis[0] for lis in myresult]
bill = [lis[1] for lis in myresult]
party = [lis[2] for lis in myresult]
qtyp = [lis[3] for lis in myresult]
ratep = [lis[4] for lis in myresult]
amntp = [lis[5] for lis in myresult]
#print(res[2])
sumamnt=0
for y in range(rc):
Lb=Label(frlogs, text=str(date[y])).grid(row=y+1, column=0)
Lb=Label(frlogs, text=str(bill[y])).grid(row=y+1, column=1)
Lb=Label(frlogs, text=str(party[y])).grid(row=y+1, column=2)
Lb=Label(frlogs, text=str(qtyp[y])).grid(row=y+1, column=3)
Lb=Label(frlogs, text=str(ratep[y])).grid(row=y+1, column=4)
Lb=Label(frlogs, text=str(amntp[y])).grid(row=y+1, column=5)
Lb=Label(frlogs, text=str(y+1)).grid(row=y+1, column=6)
if (str(amntp[y])=="None"):
amntp[y] = 0
sumamnt = sumamnt + float(str(amntp[y]))
Lbb=Label(frlog, text="Total Purchasing :").grid(row=3, column=0)
Lbb=Label(frlog, text=str(sumamnt)).grid(row=3, column=1)
Lbb=Label(frlogs, text="Total Purchasing :").grid(row=rc+1, column=0)
Lbb=Label(frlogs, text=str(sumamnt)).grid(row=rc+1, column=1)
if (d2=="Cartons"):
mycursor.execute("SELECT Date, Bill_Number, Party_Name, QtyC, RateC, AmountC FROM packingpurchase WHERE Date BETWEEN %s AND %s ORDER BY Date DESC", (dfrom, dto))
myresult = mycursor.fetchall()
rc=len(myresult)
Lbb=Label(frlog, text=" ").grid(row=3, column=1)
Lbb=Label(frlog, text=" ").grid(row=4, column=1)
Lbb=Label(frlog, text=" ").grid(row=5, column=1)
Lbb=Label(frlog, text=" ").grid(row=3, column=0)
Lbb=Label(frlog, text=" ").grid(row=4, column=0)
Lbb=Label(frlog, text=" ").grid(row=5, column=0)
Label(frlogs, text="Date").grid(row=0, column=0, sticky=N+W+E)
Label(frlogs, text="Bill").grid(row=0, column=1, sticky=N+W+E)
Label(frlogs, text="Party").grid(row=0, column=2, sticky=N+W+E)
Label(frlogs, text="Qty").grid(row=0, column=3, sticky=N+W+E)
Label(frlogs, text="Rate").grid(row=0, column=4, sticky=N+W+E)
Label(frlogs, text="Amount").grid(row=0, column=5, sticky=N+W+E)
Label(frlogs, text="Sr. No.").grid(row=0, column=6, sticky=N+W+E)
#Lb1.insert(1, "Date Bill_Number Party Qty Rate Amount")
date = [lis[0] for lis in myresult]
bill = [lis[1] for lis in myresult]
party = [lis[2] for lis in myresult]
qtyp = [lis[3] for lis in myresult]
ratep = [lis[4] for lis in myresult]
amntp = [lis[5] for lis in myresult]
#print(res[2])
sumamnt=0
for y in range(rc):
Lb=Label(frlogs, text=str(date[y])).grid(row=y+1, column=0)
Lb=Label(frlogs, text=str(bill[y])).grid(row=y+1, column=1)
Lb=Label(frlogs, text=str(party[y])).grid(row=y+1, column=2)
Lb=Label(frlogs, text=str(qtyp[y])).grid(row=y+1, column=3)
Lb=Label(frlogs, text=str(ratep[y])).grid(row=y+1, column=4)
Lb=Label(frlogs, text=str(amntp[y])).grid(row=y+1, column=5)
Lb=Label(frlogs, text=str(y+1)).grid(row=y+1, column=6)
if (str(amntp[y])=="None"):
amntp[y] = 0
sumamnt = sumamnt + float(str(amntp[y]))
Lbb=Label(frlog, text="Total Purchasing :").grid(row=3, column=0)
Lbb=Label(frlog, text=str(sumamnt)).grid(row=3, column=1)
Lbb=Label(frlogs, text="Total Purchasing :").grid(row=rc+1, column=0)
Lbb=Label(frlogs, text=str(sumamnt)).grid(row=rc+1, column=1)
if (d2=="All Packings"):
mycursor.execute("SELECT Date, Bill_Number, Party_Name, QtyP, RateP, AmountP, QtyB, RateB, AmountB, QtyC, RateC, AmountC FROM packingpurchase WHERE Date BETWEEN %s AND %s ORDER BY Date DESC", (dfrom, dto))
myresult = mycursor.fetchall()
rc=len(myresult)
Lbb=Label(frlog, text=" ").grid(row=3, column=1)
Lbb=Label(frlog, text=" ").grid(row=4, column=1)
Lbb=Label(frlog, text=" ").grid(row=5, column=1)
Lbb=Label(frlog, text=" ").grid(row=3, column=0)
Lbb=Label(frlog, text=" ").grid(row=4, column=0)
Lbb=Label(frlog, text=" ").grid(row=5, column=0)
Label(frlogs, text="Date").grid(row=0, column=0, sticky=N+W+E)
Label(frlogs, text="Bill").grid(row=0, column=1, sticky=N+W+E)
Label(frlogs, text="Party").grid(row=0, column=2, sticky=N+W+E)
Label(frlogs, text="QtyP").grid(row=0, column=3, sticky=N+W+E)
Label(frlogs, text="RateP").grid(row=0, column=4, sticky=N+W+E)
Label(frlogs, text="AmountP").grid(row=0, column=5, sticky=N+W+E)
Label(frlogs, text="QtyB").grid(row=0, column=6, sticky=N+W+E)
Label(frlogs, text="RateB").grid(row=0, column=7, sticky=N+W+E)
Label(frlogs, text="AmountB").grid(row=0, column=8, sticky=N+W+E)
Label(frlogs, text="QtyC").grid(row=0, column=9, sticky=N+W+E)
Label(frlogs, text="RateC").grid(row=0, column=10, sticky=N+W+E)
Label(frlogs, text="AmountC").grid(row=0, column=11, sticky=N+W+E)
Label(frlogs, text="Sr. No.").grid(row=0, column=12, sticky=N+W+E)
#Lb1.insert(1, "Date Bill_Number Party Qty Rate Amount")
date = [lis[0] for lis in myresult]
bill = [lis[1] for lis in myresult]
party = [lis[2] for lis in myresult]
qtyp = [lis[3] for lis in myresult]
ratep = [lis[4] for lis in myresult]
amntp = [lis[5] for lis in myresult]
qtyb = [lis[6] for lis in myresult]
rateb = [lis[7] for lis in myresult]
amntb = [lis[8] for lis in myresult]
qtyc = [lis[9] for lis in myresult]
ratec = [lis[10] for lis in myresult]
amntc = [lis[11] for lis in myresult]
#print(res[2])
sumamntp=0
sumamntb=0
sumamntc=0
sumamnt=0
for y in range(rc):
Lb=Label(frlogs, text=str(date[y])).grid(row=y+1, column=0)
Lb=Label(frlogs, text=str(bill[y])).grid(row=y+1, column=1)
Lb=Label(frlogs, text=str(party[y])).grid(row=y+1, column=2)
Lb=Label(frlogs, text=str(qtyp[y])).grid(row=y+1, column=3)
Lb=Label(frlogs, text=str(ratep[y])).grid(row=y+1, column=4)
Lb=Label(frlogs, text=str(amntp[y])).grid(row=y+1, column=5)
Lb=Label(frlogs, text=str(qtyb[y])).grid(row=y+1, column=6)
Lb=Label(frlogs, text=str(rateb[y])).grid(row=y+1, column=7)
Lb=Label(frlogs, text=str(amntb[y])).grid(row=y+1, column=8)
Lb=Label(frlogs, text=str(qtyc[y])).grid(row=y+1, column=9)
Lb=Label(frlogs, text=str(ratec[y])).grid(row=y+1, column=10)
Lb=Label(frlogs, text=str(amntc[y])).grid(row=y+1, column=11)
Lb=Label(frlogs, text=str(y+1)).grid(row=y+1, column=12)
if (str(amntp[y])=="None"):
amntp[y] = 0
if (str(amntb[y])=="None"):
amntb[y] = 0
if (str(amntc[y])=="None"):
amntc[y] = 0
sumamntp = sumamntp +float(str(amntp[y]))
sumamntb = sumamntb +float(str(amntb[y]))
sumamntc = sumamntc +float(str(amntc[y]))
sumamnt = sumamntp + sumamntc + sumamntb
Lbb=Label(frlog, text="Total Purchasing :").grid(row=3, column=0)
Lbb=Label(frlog, text=str(sumamnt)).grid(row=3, column=1)
Lbb=Label(frlogs, text="Total Purchasing :").grid(row=rc+1, column=0)
Lbb=Label(frlogs, text=str(sumamnt)).grid(row=rc+1, column=1)
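Every log branch above repeats the same totalling idiom: check whether a column value stringifies to "None" (SQL NULL), replace it with 0, then add `float(str(value))`. A hypothetical helper (not in the original code) expresses that in one line, assuming the raw column list is available:

```python
def sum_column(values):
    """Sum one result-set column, treating SQL NULL (None) as zero."""
    return sum(float(v) for v in values if v is not None)
```

Each branch's running total then becomes e.g. `sumamnt = sum_column(amntp)`, with no in-loop None checks.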
if (d1=="Consumption"):
mainlog = mysql.connector.connect(host="localhost", user="root", password="root123", database="productionlog")
mycursor = mainlog.cursor()
if (d2=="Maida"):
mycursor.execute("SELECT Date, QtyM, ThNoodleProd FROM consumption WHERE Date BETWEEN %s AND %s ORDER BY Date DESC", (dfrom, dto))
myresult = mycursor.fetchall()
Lbb=Label(frlog, text=" ").grid(row=3, column=1)
Lbb=Label(frlog, text=" ").grid(row=4, column=1)
Lbb=Label(frlog, text=" ").grid(row=5, column=1)
Lbb=Label(frlog, text=" ").grid(row=3, column=0)
Lbb=Label(frlog, text=" ").grid(row=4, column=0)
Lbb=Label(frlog, text=" ").grid(row=5, column=0)
rc=len(myresult)
Label(frlogs, text="Date").grid(row=0, column=0, sticky=N+W+E)
Label(frlogs, text="Quantity Consumed").grid(row=0, column=1, sticky=N+W+E)
Label(frlogs, text="Theoretical Production Noodle").grid(row=0, column=2, sticky=N+W+E)
Label(frlogs, text="Sr. No.").grid(row=0, column=3, sticky=N+W+E)
date = [lis[0] for lis in myresult]
qtym = [lis[1] for lis in myresult]
thqtym = [lis[2] for lis in myresult]
sumcon=0
sumconth=0
for y in range(rc):
Lb=Label(frlogs, text=str(date[y])).grid(row=y+1, column=0)
Lb=Label(frlogs, text=str(qtym[y])).grid(row=y+1, column=1)
Lb=Label(frlogs, text=str(thqtym[y])).grid(row=y+1, column=2)
Lb=Label(frlogs, text=str(y+1)).grid(row=y+1, column=3)
if (str(qtym[y])=="None"):
qtym[y] = 0
if (str(thqtym[y])=="None"):
thqtym[y] = 0
sumcon = sumcon + float(str(qtym[y]))
sumconth = sumconth + float(str(thqtym[y]))
Lbb=Label(frlog, text="Total Consumption :").grid(row=3, column=0)
Lbb=Label(frlog, text=str(sumcon)).grid(row=3, column=1)
Lbb=Label(frlog, text="Theoretical Noodles Produced :").grid(row=4, column=0)
Lbb=Label(frlog, text=str(sumconth)).grid(row=4, column=1)
Lbb=Label(frlogs, text="Total Consumption :").grid(row=rc+1, column=0)
Lbb=Label(frlogs, text=str(sumcon)).grid(row=rc+1, column=1)
Lbb=Label(frlogs, text="Theoretical Noodles Produced :").grid(row=rc+2, column=0)
Lbb=Label(frlogs, text=str(sumconth)).grid(row=rc+2, column=1)
# Lbb=Label(frlog, text="Wastage :").grid(row=5, column=0)
# Lbb=Label(frlog, text=str(sumconth-sumcon)).grid(row=5, column=1)
if (d2=="Sooji"):
mycursor.execute("SELECT Date, QtyS, ThMacProd FROM consumption WHERE Date BETWEEN %s AND %s ORDER BY Date DESC", (dfrom, dto))
myresult = mycursor.fetchall()
Lbb=Label(frlog, text=" ").grid(row=3, column=1)
Lbb=Label(frlog, text=" ").grid(row=4, column=1)
Lbb=Label(frlog, text=" ").grid(row=5, column=1)
Lbb=Label(frlog, text=" ").grid(row=3, column=0)
Lbb=Label(frlog, text=" ").grid(row=4, column=0)
Lbb=Label(frlog, text=" ").grid(row=5, column=0)
rc=len(myresult)
Label(frlogs, text="Date").grid(row=0, column=0, sticky=N+W+E)
Label(frlogs, text="Quantity Consumed").grid(row=0, column=1, sticky=N+W+E)
Label(frlogs, text="Theoretical Production Macaroni").grid(row=0, column=2, sticky=N+W+E)
Label(frlogs, text="Sr. No.").grid(row=0, column=3, sticky=N+W+E)
date = [lis[0] for lis in myresult]
qtym = [lis[1] for lis in myresult]
thqtym = [lis[2] for lis in myresult]
sumcon=0
sumconth=0
for y in range(rc):
Lb=Label(frlogs, text=str(date[y])).grid(row=y+1, column=0)
Lb=Label(frlogs, text=str(qtym[y])).grid(row=y+1, column=1)
Lb=Label(frlogs, text=str(thqtym[y])).grid(row=y+1, column=2)
Lb=Label(frlogs, text=str(y+1)).grid(row=y+1, column=3)
if (str(qtym[y])=="None"):
qtym[y] = 0
if (str(thqtym[y])=="None"):
thqtym[y] = 0
sumcon = sumcon + float(str(qtym[y]))
sumconth = sumconth + float(str(thqtym[y]))
Lbb=Label(frlog, text="Total Consumption :").grid(row=3, column=0)
Lbb=Label(frlog, text=str(sumcon)).grid(row=3, column=1)
Lbb=Label(frlog, text="Theoretical Production Macaroni :").grid(row=4, column=0)
Lbb=Label(frlog, text=str(sumconth)).grid(row=4, column=1)
Lbb=Label(frlogs, text="Total Consumption :").grid(row=rc+1, column=0)
Lbb=Label(frlogs, text=str(sumcon)).grid(row=rc+1, column=1)
Lbb=Label(frlogs, text="Theoretical Production Macaroni :").grid(row=rc+2, column=0)
Lbb=Label(frlogs, text=str(sumconth)).grid(row=rc+2, column=1)
#Lbb=Label(frlog, text="Wastage :").grid(row=5, column=0)
#Lbb=Label(frlog, text=str(sumconth-sumcon)).grid(row=5, column=1)
if (d2=="Maida and Sooji"):
mycursor.execute("SELECT Date, QtyM, ThNoodleProd, QtyS, ThMacProd FROM consumption WHERE Date BETWEEN %s AND %s ORDER BY Date DESC", (dfrom, dto))
myresult = mycursor.fetchall()
rc=len(myresult)
Lbb=Label(frlog, text=" ").grid(row=3, column=1)
Lbb=Label(frlog, text=" ").grid(row=4, column=1)
Lbb=Label(frlog, text=" ").grid(row=5, column=1)
Lbb=Label(frlog, text=" ").grid(row=3, column=0)
Lbb=Label(frlog, text=" ").grid(row=4, column=0)
Lbb=Label(frlog, text=" ").grid(row=5, column=0)
Label(frlogs, text="Date").grid(row=0, column=0, sticky=N+W+E)
Label(frlogs, text="QtyMaida").grid(row=0, column=1, sticky=N+W+E)
Label(frlogs, text="Theoretical Production Noodle").grid(row=0, column=2, sticky=N+W+E)
Label(frlogs, text="QtySooji").grid(row=0, column=3, sticky=N+W+E)
Label(frlogs, text="Theoretical Production Mac").grid(row=0, column=4, sticky=N+W+E)
Label(frlogs, text="Sr. No.").grid(row=0, column=5, sticky=N+W+E)
date = [lis[0] for lis in myresult]
qtym = [lis[1] for lis in myresult]
thn = [lis[2] for lis in myresult]
qtys = [lis[3] for lis in myresult]
thm = [lis[4] for lis in myresult]
sumamntm=0
sumamnts=0
sumconthm=0
sumconthn=0
for y in range(rc):
Lb=Label(frlogs, text=str(date[y])).grid(row=y+1, column=0)
Lb=Label(frlogs, text=str(qtym[y])).grid(row=y+1, column=1)
Lb=Label(frlogs, text=str(thn[y])).grid(row=y+1, column=2)
Lb=Label(frlogs, text=str(qtys[y])).grid(row=y+1, column=3)
Lb=Label(frlogs, text=str(thm[y])).grid(row=y+1, column=4)
Lb=Label(frlogs, text=str(y+1)).grid(row=y+1, column=5)
if (str(qtym[y])=="None"):
qtym[y] = 0
if (str(qtys[y])=="None"):
qtys[y] = 0
if (str(thn[y])=="None"):
thn[y] = 0
if (str(thm[y])=="None"):
thm[y] = 0
sumamntm = sumamntm + float(str(qtym[y]))
sumamnts = sumamnts + float(str(qtys[y]))
sumconthm = sumconthm + float(str(thm[y]))
sumconthn = sumconthn + float(str(thn[y]))
sumamntt =sumamnts + sumamntm
sumamntth = sumconthm + sumconthn
Lbb=Label(frlog, text="Total Consumption :").grid(row=3, column=0)
Lbb=Label(frlog, text=str(sumamntt)).grid(row=3, column=1)
Lbb=Label(frlog, text="Total Theoretical Production :").grid(row=4, column=0)
Lbb=Label(frlog, text=str(sumamntth)).grid(row=4, column=1)
Lbb=Label(frlogs, text="Total Consumption :").grid(row=rc+1, column=0)
Lbb=Label(frlogs, text=str(sumamntt)).grid(row=rc+1, column=1)
Lbb=Label(frlogs, text="Total Theoretical Production :").grid(row=rc+2, column=0)
Lbb=Label(frlogs, text=str(sumamntth)).grid(row=rc+2, column=1)
# Lbb=Label(frlog, text="Total Wastage :").grid(row=5, column=0)
# Lbb=Label(frlog, text=str(sumamntth-sumamntt)).grid(row=5, column=1)
if (d2=="Pouches"):
mycursor.execute("SELECT Date, QtyP FROM production WHERE Date BETWEEN '"+ dfrom +"' AND '"+ dto +"' ORDER BY Date DESC ")
myresult = mycursor.fetchall()
Lbb=Label(frlog, text=" ").grid(row=3, column=1)
Lbb=Label(frlog, text=" ").grid(row=4, column=1)
Lbb=Label(frlog, text=" ").grid(row=5, column=1)
Lbb=Label(frlog, text=" ").grid(row=3, column=0)
Lbb=Label(frlog, text=" ").grid(row=4, column=0)
Lbb=Label(frlog, text=" ").grid(row=5, column=0)
rc=len(myresult)
date = [lis1[0] for lis1 in myresult]
qtyp = [lis1[1] for lis1 in myresult]
mainlog2 = mysql.connector.connect(host="localhost", user="root", password="root123", database="purchasinglog")
mycursor2 = mainlog2.cursor()
mycursor2.execute("SELECT packingth FROM packingpurchase WHERE Date BETWEEN '"+ dfrom +"' AND '"+ dto +"' ORDER BY Date DESC ")
myresult2 = mycursor2.fetchall()
thpp = [lis[0] for lis in myresult2]
Label(frlogs, text="Date").grid(row=0, column=0, sticky=N+W+E)
Label(frlogs, text="Qty Produced").grid(row=0, column=1, sticky=N+W+E)
Label(frlogs, text="Qty Purchased (Th)").grid(row=0, column=2, sticky=N+W+E)
Label(frlogs, text="Sr. No.").grid(row=0, column=3, sticky=N+W+E)
sumamnt=0
sumconthp=0
for y in range(rc):
Lb=Label(frlogs, text=str(date[y])).grid(row=y+1, column=0)
Lb=Label(frlogs, text=str(qtyp[y])).grid(row=y+1, column=1)
Lb=Label(frlogs, text=str(thpp[y])).grid(row=y+1, column=2)
Lb=Label(frlogs, text=str(y+1)).grid(row=y+1, column=3)
if (str(qtyp[y])=="None"):
qtyp[y] = 0
sumamnt = sumamnt + float(str(qtyp[y]))
if (str(thpp[y])=="None"):
thpp[y] = 0
sumconthp = sumconthp + float(str(thpp[y]))
Lbb=Label(frlog, text="Total Produced Packings :").grid(row=3, column=0)
Lbb=Label(frlog, text=str(sumamnt)).grid(row=3, column=1)
Lbb=Label(frlog, text="Total Purchased Packings :").grid(row=4, column=0)
Lbb=Label(frlog, text=str(sumconthp)).grid(row=4, column=1)
Lbb=Label(frlog, text="Packing in Stock :").grid(row=5, column=0)
Lbb=Label(frlog, text=str(sumconthp-sumamnt)).grid(row=5, column=1)
Lbb=Label(frlogs, text="Total Produced Packings :").grid(row=rc+1, column=0)
Lbb=Label(frlogs, text=str(sumamnt)).grid(row=rc+1, column=1)
Lbb=Label(frlogs, text="Total Purchased Packings :").grid(row=rc+2, column=0)
Lbb=Label(frlogs, text=str(sumconthp)).grid(row=rc+2, column=1)
Lbb=Label(frlogs, text="Packing in Stock :").grid(row=rc+3, column=0)
Lbb=Label(frlogs, text=str(sumconthp-sumamnt)).grid(row=rc+3, column=1)
if (d2=="Bags"):
mycursor.execute("SELECT Date, QtyB FROM production WHERE Date BETWEEN '"+ dfrom +"' AND '"+ dto +"' ORDER BY Date DESC ")
myresult = mycursor.fetchall()
Lbb=Label(frlog, text=" ").grid(row=3, column=1)
Lbb=Label(frlog, text=" ").grid(row=4, column=1)
Lbb=Label(frlog, text=" ").grid(row=5, column=1)
Lbb=Label(frlog, text=" ").grid(row=3, column=0)
Lbb=Label(frlog, text=" ").grid(row=4, column=0)
Lbb=Label(frlog, text=" ").grid(row=5, column=0)
rc=len(myresult)
date = [lis[0] for lis in myresult]
qtyp= [lis[1] for lis in myresult]
mainlog2 = mysql.connector.connect(host="localhost", user="root", password="root123", database="purchasinglog")
mycursor2 = mainlog2.cursor()
mycursor2.execute("SELECT QtyB FROM packingpurchase WHERE Date BETWEEN '"+ dfrom +"' AND '"+ dto +"' ORDER BY Date DESC ")
myresult2 = mycursor2.fetchall()
thpp = [lis[0] for lis in myresult2]
Label(frlogs, text="Date").grid(row=0, column=0, sticky=N+W+E)
Label(frlogs, text="Qty Produced").grid(row=0, column=1, sticky=N+W+E)
Label(frlogs, text="Qty Purchased").grid(row=0, column=2, sticky=N+W+E)
Label(frlogs, text="Sr. No.").grid(row=0, column=3, sticky=N+W+E)
sumamnt=0
sumconthp=0
for y in range(rc):
Lb=Label(frlogs, text=str(date[y])).grid(row=y+1, column=0)
Lb=Label(frlogs, text=str(qtyp[y])).grid(row=y+1, column=1)
Lb=Label(frlogs, text=str(thpp[y])).grid(row=y+1, column=2)
Lb=Label(frlogs, text=str(y+1)).grid(row=y+1, column=3)
if (str(qtyp[y])=="None"):
qtyp[y] = 0
sumamnt = sumamnt + float(str(qtyp[y]))
if (str(thpp[y])=="None"):
thpp[y] = 0
sumconthp = sumconthp + float(str(thpp[y]))
Lbb=Label(frlog, text="Total Produced Bags :").grid(row=3, column=0)
Lbb=Label(frlog, text=str(sumamnt)).grid(row=3, column=1)
Lbb=Label(frlog, text="Total Purchased Bags :").grid(row=4, column=0)
Lbb=Label(frlog, text=str(sumconthp)).grid(row=4, column=1)
Lbb=Label(frlog, text="Bags in Stock :").grid(row=5, column=0)
Lbb=Label(frlog, text=str(sumconthp-sumamnt)).grid(row=5, column=1)
Lbb=Label(frlogs, text="Total Produced Bags :").grid(row=rc+1, column=0)
Lbb=Label(frlogs, text=str(sumamnt)).grid(row=rc+1, column=1)
Lbb=Label(frlogs, text="Total Purchased Bags :").grid(row=rc+2, column=0)
Lbb=Label(frlogs, text=str(sumconthp)).grid(row=rc+2, column=1)
Lbb=Label(frlogs, text="Bags in Stock :").grid(row=rc+3, column=0)
Lbb=Label(frlogs, text=str(sumconthp-sumamnt)).grid(row=rc+3, column=1)
if (d2=="Cartons"):
mycursor.execute("SELECT Date, QtyC FROM production WHERE Date BETWEEN '"+ dfrom +"' AND '"+ dto +"' ORDER BY Date DESC ")
myresult = mycursor.fetchall()
Lbb=Label(frlog, text=" ").grid(row=3, column=1)
Lbb=Label(frlog, text=" ").grid(row=4, column=1)
Lbb=Label(frlog, text=" ").grid(row=5, column=1)
Lbb=Label(frlog, text=" ").grid(row=3, column=0)
Lbb=Label(frlog, text=" ").grid(row=4, column=0)
Lbb=Label(frlog, text=" ").grid(row=5, column=0)
rc=len(myresult)
date = [lis[0] for lis in myresult]
qtyp= [lis[1] for lis in myresult]
mainlog2 = mysql.connector.connect(host="localhost", user="root", password="root123", database="purchasinglog")
mycursor2 = mainlog2.cursor()
# NOTE: this Cartons branch reuses the bag-purchase column QtyB; if packingpurchase has a dedicated carton column, it likely belongs here instead.
mycursor2.execute("SELECT QtyB FROM packingpurchase WHERE Date BETWEEN '"+ dfrom +"' AND '"+ dto +"' ORDER BY Date DESC ")
myresult2 = mycursor2.fetchall()
thpp = [lis[0] for lis in myresult2]
Label(frlogs, text="Date").grid(row=0, column=0, sticky=N+W+E)
Label(frlogs, text="Cartons Produced").grid(row=0, column=1, sticky=N+W+E)
Label(frlogs, text="Cartons Purchased").grid(row=0, column=2, sticky=N+W+E)
Label(frlogs, text="Sr. No.").grid(row=0, column=3, sticky=N+W+E)
sumamnt=0
sumconthp=0
for y in range(rc):
Lb=Label(frlogs, text=str(date[y])).grid(row=y+1, column=0)
Lb=Label(frlogs, text=str(qtyp[y])).grid(row=y+1, column=1)
Lb=Label(frlogs, text=str(thpp[y])).grid(row=y+1, column=2)
Lb=Label(frlogs, text=str(y+1)).grid(row=y+1, column=3)
if (str(qtyp[y])=="None"):
qtyp[y] = 0
sumamnt = sumamnt + float(str(qtyp[y]))
if (str(thpp[y])=="None"):
thpp[y] = 0
sumconthp = sumconthp + float(str(thpp[y]))
Lbb=Label(frlog, text="Total Produced Cartons :").grid(row=3, column=0)
Lbb=Label(frlog, text=str(sumamnt)).grid(row=3, column=1)
Lbb=Label(frlog, text="Total Purchased Cartons :").grid(row=4, column=0)
Lbb=Label(frlog, text=str(sumconthp)).grid(row=4, column=1)
Lbb=Label(frlog, text="Cartons in Stock :").grid(row=5, column=0)
Lbb=Label(frlog, text=str(sumconthp-sumamnt)).grid(row=5, column=1)
Lbb=Label(frlogs, text="Total Produced Cartons :").grid(row=rc+1, column=0)
Lbb=Label(frlogs, text=str(sumamnt)).grid(row=rc+1, column=1)
Lbb=Label(frlogs, text="Total Purchased Cartons :").grid(row=rc+2, column=0)
Lbb=Label(frlogs, text=str(sumconthp)).grid(row=rc+2, column=1)
Lbb=Label(frlogs, text="Cartons in Stock :").grid(row=rc+3, column=0)
Lbb=Label(frlogs, text=str(sumconthp-sumamnt)).grid(row=rc+3, column=1)
if (d2=="All Packings"):
messagebox.showerror("Error", "Invalid Request")
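# The summary blocks above write every caption/value pair twice, once to frlog
# and once to frlogs.  A small refactor sketch that flattens the pairs into grid
# placements (function and variable names here are illustrative, not from the
# original source):
def summary_rows(start_row, pairs):
    # Flatten (caption, value) pairs into (row, column, text) grid placements.
    cells = []
    for offset, (caption, value) in enumerate(pairs):
        cells.append((start_row + offset, 0, caption))
        cells.append((start_row + offset, 1, str(value)))
    return cells
# Usage sketch:
#   for r, c, t in summary_rows(rc + 1, [("Total Consumption :", sumcon)]):
#       Label(frlogs, text=t).grid(row=r, column=c)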
if (d1=="Production"):
mainlog = mysql.connector.connect(host="localhost", user="root", password="root123", database="productionlog")
mycursor = mainlog.cursor()
if (d2=="Noodle"):
mycursor.execute("SELECT Date, qtyNoodle FROM production WHERE Date BETWEEN '"+ dfrom +"' AND '"+ dto +"' ORDER BY Date DESC ")
myresult = mycursor.fetchall()
Lbb=Label(frlog, text=" ").grid(row=3, column=1)
Lbb=Label(frlog, text=" ").grid(row=4, column=1)
Lbb=Label(frlog, text=" ").grid(row=5, column=1)
Lbb=Label(frlog, text=" ").grid(row=3, column=0)
Lbb=Label(frlog, text=" ").grid(row=4, column=0)
Lbb=Label(frlog, text=" ").grid(row=5, column=0)
rc=len(myresult)
date = [lis[0] for lis in myresult]
qtyp= [lis[1] for lis in myresult]
mycursor.execute("SELECT ThNoodleProd FROM consumption WHERE Date BETWEEN '"+ dfrom +"' AND '"+ dto +"' ORDER BY Date DESC ")
myresult2 = mycursor.fetchall()
thpp = [lis[0] for lis in myresult2]
Label(frlogs, text="Date").grid(row=0, column=0, sticky=N+W+E)
Label(frlogs, text="Qty Produced").grid(row=0, column=1, sticky=N+W+E)
Label(frlogs, text="TH Production").grid(row=0, column=2, sticky=N+W+E)
Label(frlogs, text="Sr. No.").grid(row=0, column=3, sticky=N+W+E)
sumamnt=0
sumconthp=0
for y in range(rc):
Lb=Label(frlogs, text=str(date[y])).grid(row=y+1, column=0)
Lb=Label(frlogs, text=str(qtyp[y])).grid(row=y+1, column=1)
Lb=Label(frlogs, text=str(thpp[y])).grid(row=y+1, column=2)
Lb=Label(frlogs, text=str(y+1)).grid(row=y+1, column=3)
if (str(qtyp[y])=="None"):
qtyp[y] = 0
sumamnt = sumamnt + float(str(qtyp[y]))
if (str(thpp[y])=="None"):
thpp[y] = 0
sumconthp = sumconthp + float(str(thpp[y]))
Lbb=Label(frlog, text="Total Production of Noodle :").grid(row=3, column=0)
Lbb=Label(frlog, text=str(sumamnt)).grid(row=3, column=1)
Lbb=Label(frlog, text="Total Theoretical Production :").grid(row=4, column=0)
Lbb=Label(frlog, text=str(sumconthp)).grid(row=4, column=1)
Lbb=Label(frlog, text="Wastage of Maida :").grid(row=5, column=0)
Lbb=Label(frlog, text=str(sumconthp-sumamnt)).grid(row=5, column=1)
Lbb=Label(frlogs, text="Total Production of Noodle :").grid(row=rc+1, column=0)
Lbb=Label(frlogs, text=str(sumamnt)).grid(row=rc+1, column=1)
Lbb=Label(frlogs, text="Total Theoretical Production :").grid(row=rc+2, column=0)
Lbb=Label(frlogs, text=str(sumconthp)).grid(row=rc+2, column=1)
Lbb=Label(frlogs, text="Wastage of Maida :").grid(row=rc+3, column=0)
Lbb=Label(frlogs, text=str(sumconthp-sumamnt)).grid(row=rc+3, column=1)
if (d2=="Macroni"):
mycursor.execute("SELECT Date, qtyMac FROM production WHERE Date BETWEEN '"+ dfrom +"' AND '"+ dto +"' ORDER BY Date DESC ")
myresult = mycursor.fetchall()
Lbb=Label(frlog, text=" ").grid(row=3, column=1)
Lbb=Label(frlog, text=" ").grid(row=4, column=1)
Lbb=Label(frlog, text=" ").grid(row=5, column=1)
Lbb=Label(frlog, text=" ").grid(row=3, column=0)
Lbb=Label(frlog, text=" ").grid(row=4, column=0)
Lbb=Label(frlog, text=" ").grid(row=5, column=0)
rc=len(myresult)
date = [lis[0] for lis in myresult]
qtyp= [lis[1] for lis in myresult]
mycursor.execute("SELECT ThMacProd FROM consumption WHERE Date BETWEEN '"+ dfrom +"' AND '"+ dto +"' ORDER BY Date DESC ")
myresult2 = mycursor.fetchall()
thpp = [lis[0] for lis in myresult2]
Label(frlogs, text="Date").grid(row=0, column=0, sticky=N+W+E)
Label(frlogs, text="Qty Produced").grid(row=0, column=1, sticky=N+W+E)
Label(frlogs, text="TH Production").grid(row=0, column=2, sticky=N+W+E)
Label(frlogs, text="Sr. No.").grid(row=0, column=3, sticky=N+W+E)
sumamnt=0
sumconthp=0
for y in range(rc):
Lb=Label(frlogs, text=str(date[y])).grid(row=y+1, column=0)
Lb=Label(frlogs, text=str(qtyp[y])).grid(row=y+1, column=1)
Lb=Label(frlogs, text=str(thpp[y])).grid(row=y+1, column=2)
Lb=Label(frlogs, text=str(y+1)).grid(row=y+1, column=3)
if (str(qtyp[y])=="None"):
qtyp[y] = 0
sumamnt = sumamnt + float(str(qtyp[y]))
if (str(thpp[y])=="None"):
thpp[y] = 0
sumconthp = sumconthp + float(str(thpp[y]))
Lbb=Label(frlog, text="Total Production of Macroni :").grid(row=3, column=0)
Lbb=Label(frlog, text=str(sumamnt)).grid(row=3, column=1)
Lbb=Label(frlog, text="Total Theoretical Production :").grid(row=4, column=0)
Lbb=Label(frlog, text=str(sumconthp)).grid(row=4, column=1)
Lbb=Label(frlog, text="Wastage of Sooji :").grid(row=5, column=0)
Lbb=Label(frlog, text=str(sumconthp-sumamnt)).grid(row=5, column=1)
Lbb=Label(frlogs, text="Total Production of Macroni :").grid(row=rc+1, column=0)
Lbb=Label(frlogs, text=str(sumamnt)).grid(row=rc+1, column=1)
Lbb=Label(frlogs, text="Total Theoretical Production :").grid(row=rc+2, column=0)
Lbb=Label(frlogs, text=str(sumconthp)).grid(row=rc+2, column=1)
Lbb=Label(frlogs, text="Wastage of Sooji :").grid(row=rc+3, column=0)
Lbb=Label(frlogs, text=str(sumconthp-sumamnt)).grid(row=rc+3, column=1)
if (d2=="Pasta"):
mycursor.execute("SELECT Date, qtyPasta FROM production WHERE Date BETWEEN '"+ dfrom +"' AND '"+ dto +"' ORDER BY Date DESC ")
myresult = mycursor.fetchall()
Lbb=Label(frlog, text=" ").grid(row=3, column=1)
Lbb=Label(frlog, text=" ").grid(row=4, column=1)
Lbb=Label(frlog, text=" ").grid(row=5, column=1)
Lbb=Label(frlog, text=" ").grid(row=3, column=0)
Lbb=Label(frlog, text=" ").grid(row=4, column=0)
Lbb=Label(frlog, text=" ").grid(row=5, column=0)
rc=len(myresult)
date = [lis[0] for lis in myresult]
qtyp= [lis[1] for lis in myresult]
mycursor.execute("SELECT ThMacProd FROM consumption WHERE Date BETWEEN '"+ dfrom +"' AND '"+ dto +"' ORDER BY Date DESC ")
myresult2 = mycursor.fetchall()
thpp = [lis[0] for lis in myresult2]
Label(frlogs, text="Date").grid(row=0, column=0, sticky=N+W+E)
Label(frlogs, text="Qty Produced").grid(row=0, column=1, sticky=N+W+E)
Label(frlogs, text="TH Mac Production").grid(row=0, column=2, sticky=N+W+E)
Label(frlogs, text="Sr. No.").grid(row=0, column=3, sticky=N+W+E)
sumamnt=0
sumconthp=0
for y in range(rc):
Lb=Label(frlogs, text=str(date[y])).grid(row=y+1, column=0)
Lb=Label(frlogs, text=str(qtyp[y])).grid(row=y+1, column=1)
Lb=Label(frlogs, text=str(thpp[y])).grid(row=y+1, column=2)
Lb=Label(frlogs, text=str(y+1)).grid(row=y+1, column=3)
if (str(qtyp[y])=="None"):
qtyp[y] = 0
sumamnt = sumamnt + float(str(qtyp[y]))
if (str(thpp[y])=="None"):
thpp[y] = 0
sumconthp = sumconthp + float(str(thpp[y]))
Lbb=Label(frlog, text="Total Production of Pasta :").grid(row=3, column=0)
Lbb=Label(frlog, text=str(sumamnt)).grid(row=3, column=1)
Lbb=Label(frlog, text="Total Theoretical Production Mac :").grid(row=4, column=0)
Lbb=Label(frlog, text=str(sumconthp)).grid(row=4, column=1)
Lbb=Label(frlog, text="Wastage of Sooji acc Mac :").grid(row=5, column=0)
Lbb=Label(frlog, text=str(sumconthp-sumamnt)).grid(row=5, column=1)
Lbb=Label(frlogs, text="Total Production of Pasta :").grid(row=rc+1, column=0)
Lbb=Label(frlogs, text=str(sumamnt)).grid(row=rc+1, column=1)
Lbb=Label(frlogs, text="Total Theoretical Production Mac :").grid(row=rc+2, column=0)
Lbb=Label(frlogs, text=str(sumconthp)).grid(row=rc+2, column=1)
Lbb=Label(frlogs, text="Wastage of Sooji acc Mac :").grid(row=rc+3, column=0)
Lbb=Label(frlogs, text=str(sumconthp-sumamnt)).grid(row=rc+3, column=1)
if (d2=="All"):
messagebox.showerror("Error", "Invalid Request")
if (d1=="Sale"):
mainlog = mysql.connector.connect(host="localhost", user="root", password="root123", database="newsales")
mycursor = mainlog.cursor()
if (d2=="Noodle"):
mycursor.execute("SELECT Date, Bill_Number, Party_Name, QtyN, RateN, AmountN FROM salesentry WHERE Date BETWEEN '"+ dfrom +"' AND '"+ dto +"' AND Item LIKE '%Noodle%' ORDER BY Date DESC ")
myresult = mycursor.fetchall()
Lbb=Label(frlog, text=" ").grid(row=3, column=1)
Lbb=Label(frlog, text=" ").grid(row=4, column=1)
Lbb=Label(frlog, text=" ").grid(row=5, column=1)
Lbb=Label(frlog, text=" ").grid(row=3, column=0)
Lbb=Label(frlog, text=" ").grid(row=4, column=0)
Lbb=Label(frlog, text=" ").grid(row=5, column=0)
rc=len(myresult)
date = [lis[0] for lis in myresult]
bill= [lis[1] for lis in myresult]
party= [lis[2] for lis in myresult]
qtyn= [lis[3] for lis in myresult]
raten= [lis[4] for lis in myresult]
amntn= [lis[5] for lis in myresult]
Label(frlogs, text="Date").grid(row=0, column=0, sticky=N+W+E)
Label(frlogs, text="Bill No").grid(row=0, column=1, sticky=N+W+E)
Label(frlogs, text="Party").grid(row=0, column=2, sticky=N+W+E)
Label(frlogs, text="Qty").grid(row=0, column=3, sticky=N+W+E)
Label(frlogs, text="Rate").grid(row=0, column=4, sticky=N+W+E)
Label(frlogs, text="Amount").grid(row=0, column=5, sticky=N+W+E)
Label(frlogs, text="Sr. No.").grid(row=0, column=6, sticky=N+W+E)
sumamnt=0
sumqty=0
for y in range(rc):
Lb=Label(frlogs, text=str(date[y])).grid(row=y+1, column=0)
Lb=Label(frlogs, text=str(bill[y])).grid(row=y+1, column=1)
Lb=Label(frlogs, text=str(party[y])).grid(row=y+1, column=2)
Lb=Label(frlogs, text=str(qtyn[y])).grid(row=y+1, column=3)
Lb=Label(frlogs, text=str(raten[y])).grid(row=y+1, column=4)
Lb=Label(frlogs, text=str(amntn[y])).grid(row=y+1, column=5)
Lb=Label(frlogs, text=str(y+1)).grid(row=y+1, column=6)
if (str(qtyn[y])=="None"):
qtyn[y] = 0
if (str(amntn[y])=="None"):
amntn[y] = 0
sumamnt = sumamnt + float(str(amntn[y]))
sumqty=sumqty + float(str(qtyn[y]))
Lbb=Label(frlog, text="Total Sale of Noodle :").grid(row=3, column=0)
Lbb=Label(frlog, text=str(sumqty)).grid(row=3, column=1)
Lbb=Label(frlog, text="Total Amount :").grid(row=4, column=0)
Lbb=Label(frlog, text=str(sumamnt)).grid(row=4, column=1)
Lbb=Label(frlogs, text="Total Sale of Noodle :").grid(row=rc+1, column=0)
Lbb=Label(frlogs, text=str(sumqty)).grid(row=rc+1, column=1)
Lbb=Label(frlogs, text="Total Amount :").grid(row=rc+2, column=0)
Lbb=Label(frlogs, text=str(sumamnt)).grid(row=rc+2, column=1)
if (d2=="Macroni"):
mycursor.execute("SELECT Date, Bill_Number, Party_Name, QtyM, RateM, AmountM FROM salesentry WHERE Date BETWEEN '"+ dfrom +"' AND '"+ dto +"' AND Item LIKE '%Macroni%' ORDER BY Date DESC ")
myresult = mycursor.fetchall()
Lbb=Label(frlog, text=" ").grid(row=3, column=1)
Lbb=Label(frlog, text=" ").grid(row=4, column=1)
Lbb=Label(frlog, text=" ").grid(row=5, column=1)
Lbb=Label(frlog, text=" ").grid(row=3, column=0)
Lbb=Label(frlog, text=" ").grid(row=4, column=0)
Lbb=Label(frlog, text=" ").grid(row=5, column=0)
rc=len(myresult)
date = [lis[0] for lis in myresult]
bill= [lis[1] for lis in myresult]
party= [lis[2] for lis in myresult]
qtyn= [lis[3] for lis in myresult]
raten= [lis[4] for lis in myresult]
amntn= [lis[5] for lis in myresult]
Label(frlogs, text="Date").grid(row=0, column=0, sticky=N+W+E)
Label(frlogs, text="Bill No").grid(row=0, column=1, sticky=N+W+E)
Label(frlogs, text="Party").grid(row=0, column=2, sticky=N+W+E)
Label(frlogs, text="Qty").grid(row=0, column=3, sticky=N+W+E)
Label(frlogs, text="Rate").grid(row=0, column=4, sticky=N+W+E)
Label(frlogs, text="Amount").grid(row=0, column=5, sticky=N+W+E)
Label(frlogs, text="Sr. No.").grid(row=0, column=6, sticky=N+W+E)
sumamnt=0
sumqty=0
for y in range(rc):
Lb=Label(frlogs, text=str(date[y])).grid(row=y+1, column=0)
Lb=Label(frlogs, text=str(bill[y])).grid(row=y+1, column=1)
Lb=Label(frlogs, text=str(party[y])).grid(row=y+1, column=2)
Lb=Label(frlogs, text=str(qtyn[y])).grid(row=y+1, column=3)
Lb=Label(frlogs, text=str(raten[y])).grid(row=y+1, column=4)
Lb=Label(frlogs, text=str(amntn[y])).grid(row=y+1, column=5)
Lb=Label(frlogs, text=str(y+1)).grid(row=y+1, column=6)
if (str(qtyn[y])=="None"):
qtyn[y] = 0
if (str(amntn[y])=="None"):
amntn[y] = 0
sumamnt = sumamnt + float(str(amntn[y]))
sumqty=sumqty + float(str(qtyn[y]))
Lbb=Label(frlog, text="Total Sale of Macroni :").grid(row=3, column=0)
Lbb=Label(frlog, text=str(sumqty)).grid(row=3, column=1)
Lbb=Label(frlog, text="Total Amount :").grid(row=4, column=0)
Lbb=Label(frlog, text=str(sumamnt)).grid(row=4, column=1)
Lbb=Label(frlogs, text="Total Sale of Macroni :").grid(row=rc+1, column=0)
Lbb=Label(frlogs, text=str(sumqty)).grid(row=rc+1, column=1)
Lbb=Label(frlogs, text="Total Amount :").grid(row=rc+2, column=0)
Lbb=Label(frlogs, text=str(sumamnt)).grid(row=rc+2, column=1)
if (d2=="Pasta"):
mycursor.execute("SELECT Date, Bill_Number, Party_Name, QtyP, RateP, AmountP FROM salesentry WHERE Date BETWEEN '"+ dfrom +"' AND '"+ dto +"' AND Item LIKE '%Pasta%' ORDER BY Date DESC ")
myresult = mycursor.fetchall()
Lbb=Label(frlog, text=" ").grid(row=3, column=1)
Lbb=Label(frlog, text=" ").grid(row=4, column=1)
Lbb=Label(frlog, text=" ").grid(row=5, column=1)
Lbb=Label(frlog, text=" ").grid(row=3, column=0)
Lbb=Label(frlog, text=" ").grid(row=4, column=0)
Lbb=Label(frlog, text=" ").grid(row=5, column=0)
rc=len(myresult)
date = [lis[0] for lis in myresult]
bill= [lis[1] for lis in myresult]
party= [lis[2] for lis in myresult]
qtyn= [lis[3] for lis in myresult]
raten= [lis[4] for lis in myresult]
amntn= [lis[5] for lis in myresult]
Label(frlogs, text="Date").grid(row=0, column=0, sticky=N+W+E)
Label(frlogs, text="Bill No").grid(row=0, column=1, sticky=N+W+E)
Label(frlogs, text="Party").grid(row=0, column=2, sticky=N+W+E)
Label(frlogs, text="Qty").grid(row=0, column=3, sticky=N+W+E)
Label(frlogs, text="Rate").grid(row=0, column=4, sticky=N+W+E)
Label(frlogs, text="Amount").grid(row=0, column=5, sticky=N+W+E)
Label(frlogs, text="Sr. No.").grid(row=0, column=6, sticky=N+W+E)
sumamnt=0
sumqty=0
for y in range(rc):
Lb=Label(frlogs, text=str(date[y])).grid(row=y+1, column=0)
Lb=Label(frlogs, text=str(bill[y])).grid(row=y+1, column=1)
Lb=Label(frlogs, text=str(party[y])).grid(row=y+1, column=2)
Lb=Label(frlogs, text=str(qtyn[y])).grid(row=y+1, column=3)
Lb=Label(frlogs, text=str(raten[y])).grid(row=y+1, column=4)
Lb=Label(frlogs, text=str(amntn[y])).grid(row=y+1, column=5)
Lb=Label(frlogs, text=str(y+1)).grid(row=y+1, column=6)
if (str(qtyn[y])=="None"):
qtyn[y] = 0
if (str(amntn[y])=="None"):
amntn[y] = 0
sumamnt = sumamnt + float(str(amntn[y]))
sumqty=sumqty + float(str(qtyn[y]))
Lbb=Label(frlog, text="Total Sale of Pasta :").grid(row=3, column=0)
Lbb=Label(frlog, text=str(sumqty)).grid(row=3, column=1)
Lbb=Label(frlog, text="Total Amount :").grid(row=4, column=0)
Lbb=Label(frlog, text=str(sumamnt)).grid(row=4, column=1)
Lbb=Label(frlogs, text="Total Sale of Pasta :").grid(row=rc+1, column=0)
Lbb=Label(frlogs, text=str(sumqty)).grid(row=rc+1, column=1)
Lbb=Label(frlogs, text="Total Amount :").grid(row=rc+2, column=0)
Lbb=Label(frlogs, text=str(sumamnt)).grid(row=rc+2, column=1)
if (d2=="All"):
mycursor.execute("SELECT Date, Bill_Number, Party_Name, QtyN, RateN, AmountN, QtyM, RateM, AmountM, QtyP, RateP, AmountP FROM salesentry WHERE Date BETWEEN '"+ dfrom +"' AND '"+ dto +"' ORDER BY Date DESC ")
myresult = mycursor.fetchall()
Lbb=Label(frlog, text=" ").grid(row=3, column=1)
Lbb=Label(frlog, text=" ").grid(row=4, column=1)
Lbb=Label(frlog, text=" ").grid(row=5, column=1)
Lbb=Label(frlog, text=" ").grid(row=3, column=0)
Lbb=Label(frlog, text=" ").grid(row=4, column=0)
Lbb=Label(frlog, text=" ").grid(row=5, column=0)
rc=len(myresult)
date = [lis[0] for lis in myresult]
bill= [lis[1] for lis in myresult]
party= [lis[2] for lis in myresult]
qtyn= [lis[3] for lis in myresult]
raten= [lis[4] for lis in myresult]
amntn= [lis[5] for lis in myresult]
qtym= [lis[6] for lis in myresult]
ratem= [lis[7] for lis in myresult]
amntm= [lis[8] for lis in myresult]
qtyp= [lis[9] for lis in myresult]
ratep= [lis[10] for lis in myresult]
amntp= [lis[11] for lis in myresult]
Label(frlogs, text="Date").grid(row=0, column=0, sticky=N+W+E)
Label(frlogs, text="Bill No").grid(row=0, column=1, sticky=N+W+E)
Label(frlogs, text="Party").grid(row=0, column=2, sticky=N+W+E)
Label(frlogs, text="QtyN").grid(row=0, column=3, sticky=N+W+E)
Label(frlogs, text="RateN").grid(row=0, column=4, sticky=N+W+E)
Label(frlogs, text="AmountN").grid(row=0, column=5, sticky=N+W+E)
Label(frlogs, text="QtyM").grid(row=0, column=6, sticky=N+W+E)
Label(frlogs, text="RateM").grid(row=0, column=7, sticky=N+W+E)
Label(frlogs, text="AmountM").grid(row=0, column=8, sticky=N+W+E)
Label(frlogs, text="QtyP").grid(row=0, column=9, sticky=N+W+E)
Label(frlogs, text="RateP").grid(row=0, column=10, sticky=N+W+E)
Label(frlogs, text="AmountP").grid(row=0, column=11, sticky=N+W+E)
Label(frlogs, text="Sr. No.").grid(row=0, column=12, sticky=N+W+E)
sumamntm=0
sumamntn=0
sumamntp=0
for y in range(rc):
Lb=Label(frlogs, text=str(date[y])).grid(row=y+1, column=0)
Lb=Label(frlogs, text=str(bill[y])).grid(row=y+1, column=1)
Lb=Label(frlogs, text=str(party[y])).grid(row=y+1, column=2)
Lb=Label(frlogs, text=str(qtyn[y])).grid(row=y+1, column=3)
Lb=Label(frlogs, text=str(raten[y])).grid(row=y+1, column=4)
Lb=Label(frlogs, text=str(amntn[y])).grid(row=y+1, column=5)
Lb=Label(frlogs, text=str(qtym[y])).grid(row=y+1, column=6)
Lb=Label(frlogs, text=str(ratem[y])).grid(row=y+1, column=7)
Lb=Label(frlogs, text=str(amntm[y])).grid(row=y+1, column=8)
Lb=Label(frlogs, text=str(qtyp[y])).grid(row=y+1, column=9)
Lb=Label(frlogs, text=str(ratep[y])).grid(row=y+1, column=10)
Lb=Label(frlogs, text=str(amntp[y])).grid(row=y+1, column=11)
Lb=Label(frlogs, text=str(y+1)).grid(row=y+1, column=12)
if (str(amntm[y])=="None"):
amntm[y] = 0
if (str(amntn[y])=="None"):
amntn[y] = 0
if (str(amntp[y])=="None"):
amntp[y] = 0
sumamntn = sumamntn + float(str(amntn[y]))
sumamntm = sumamntm + float(str(amntm[y]))
sumamntp = sumamntp + float(str(amntp[y]))
sumtt=sumamntp+sumamntm+sumamntn
Lbb=Label(frlog, text="Total Sale Amount :").grid(row=4, column=0)
Lbb=Label(frlog, text=str(sumtt)).grid(row=4, column=1)
Lbb=Label(frlogs, text="Total Sale Amount :").grid(row=rc+1, column=0)
Lbb=Label(frlogs, text=str(sumtt)).grid(row=rc+1, column=1)
if (d1=="Expenditure"):
mainlog = mysql.connector.connect(host="localhost", user="root", password="root123", database="expenditure")
mycursor = mainlog.cursor()
if (d2=="Regular"):
mycursor.execute("SELECT Date, Salary, Diesel, Electricity, expT FROM expreg WHERE Date BETWEEN '"+ dfrom +"' AND '"+ dto +"' ORDER BY Date DESC ")
myresult = mycursor.fetchall()
Lbb=Label(frlog, text=" ").grid(row=3, column=1)
Lbb=Label(frlog, text=" ").grid(row=4, column=1)
Lbb=Label(frlog, text=" ").grid(row=5, column=1)
Lbb=Label(frlog, text=" ").grid(row=3, column=0)
Lbb=Label(frlog, text=" ").grid(row=4, column=0)
Lbb=Label(frlog, text=" ").grid(row=5, column=0)
rc=len(myresult)
date = [lis[0] for lis in myresult]
bill= [lis[1] for lis in myresult]
party= [lis[2] for lis in myresult]
qtyn= [lis[3] for lis in myresult]
amntn= [lis[4] for lis in myresult]
Label(frlogs, text="Date").grid(row=0, column=0, sticky=N+W+E)
Label(frlogs, text="Salary").grid(row=0, column=1, sticky=N+W+E)
Label(frlogs, text="Diesel").grid(row=0, column=2, sticky=N+W+E)
Label(frlogs, text="Electricity").grid(row=0, column=3, sticky=N+W+E)
Label(frlogs, text="Total").grid(row=0, column=4, sticky=N+W+E)
Label(frlogs, text="Sr. No.").grid(row=0, column=5, sticky=N+W+E)
sumamnt=0
for y in range(rc):
Lb=Label(frlogs, text=str(date[y])).grid(row=y+1, column=0)
Lb=Label(frlogs, text=str(bill[y])).grid(row=y+1, column=1)
Lb=Label(frlogs, text=str(party[y])).grid(row=y+1, column=2)
Lb=Label(frlogs, text=str(qtyn[y])).grid(row=y+1, column=3)
Lb=Label(frlogs, text=str(amntn[y])).grid(row=y+1, column=4)
Lb=Label(frlogs, text=str(y+1)).grid(row=y+1, column=5)
if (str(amntn[y])=="None"):
amntn[y] = 0
sumamnt = sumamnt + float(str(amntn[y]))
Lbb=Label(frlog, text="Total Expenditure :").grid(row=3, column=0)
Lbb=Label(frlog, text=str(sumamnt)).grid(row=3, column=1)
Lbb=Label(frlogs, text="Total Expenditure :").grid(row=rc+1, column=0)
Lbb=Label(frlogs, text=str(sumamnt)).grid(row=rc+1, column=1)
if (d2=="Irregular"):
mycursor.execute("SELECT Date, Name, Amount FROM expireg WHERE Date BETWEEN '"+ dfrom +"' AND '"+ dto +"' ORDER BY Date DESC ")
myresult = mycursor.fetchall()
Lbb=Label(frlog, text=" ").grid(row=3, column=1)
Lbb=Label(frlog, text=" ").grid(row=4, column=1)
Lbb=Label(frlog, text=" ").grid(row=5, column=1)
Lbb=Label(frlog, text=" ").grid(row=3, column=0)
Lbb=Label(frlog, text=" ").grid(row=4, column=0)
Lbb=Label(frlog, text=" ").grid(row=5, column=0)
rc=len(myresult)
date = [lis[0] for lis in myresult]
name= [lis[1] for lis in myresult]
amnt= [lis[2] for lis in myresult]
Label(frlogs, text="Date").grid(row=0, column=0, sticky=N+W+E)
Label(frlogs, text="Name").grid(row=0, column=1, sticky=N+W+E)
Label(frlogs, text="Amount").grid(row=0, column=2, sticky=N+W+E)
Label(frlogs, text="Sr. No.").grid(row=0, column=3, sticky=N+W+E)
sumamnt=0
for y in range(rc):
Lb=Label(frlogs, text=str(date[y])).grid(row=y+1, column=0)
Lb=Label(frlogs, text=str(name[y])).grid(row=y+1, column=1)
Lb=Label(frlogs, text=str(amnt[y])).grid(row=y+1, column=2)
Lb=Label(frlogs, text=str(y+1)).grid(row=y+1, column=3)
if (str(amnt[y])=="None"):
amnt[y] = 0
sumamnt = sumamnt + float(str(amnt[y]))
Lbb=Label(frlog, text="Total Expenditure :").grid(row=3, column=0)
Lbb=Label(frlog, text=str(sumamnt)).grid(row=3, column=1)
Lbb=Label(frlogs, text="Total Expenditure :").grid(row=rc+1, column=0)
Lbb=Label(frlogs, text=str(sumamnt)).grid(row=rc+1, column=1)
scrollbar = Scrollbar(Lb1, orient="vertical", command = canvas.yview )
scrollbar.grid( row=0, column = 1, sticky=N+S)
canvas.configure(yscrollcommand=scrollbar.set)
#canvas.create_window((4,4), window=frlogs, anchor="nw")
canvas.create_window((100,50), window=frlogs, anchor=tk.NW)
frlogs.update_idletasks() # Needed to make bbox info available.
bbox = canvas.bbox(tk.ALL)
#frlogs.bind("<Configure>", lambda event, canvas=canvas: onFrameConfigure(canvas))
canvas.configure(scrollregion=bbox, width=700, height=300)
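# All of the date-range queries above splice dfrom/dto directly into the SQL
# string, which is fragile and open to SQL injection.  A minimal sketch of the
# same query built with the driver's %s placeholders instead (table/column
# names taken from the queries above; the helper name is illustrative):
def date_range_query(table, columns, dfrom, dto):
    # Return (sql, params); mysql.connector substitutes the dates safely.
    sql = ("SELECT " + ", ".join(columns) + " FROM " + table +
           " WHERE Date BETWEEN %s AND %s ORDER BY Date DESC")
    return sql, (dfrom, dto)
# Usage sketch:
#   mycursor.execute(*date_range_query("production", ["Date", "QtyP"], dfrom, dto))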
def binddrop1(event):
list1item=drop1.get()
if (list1item=="Purchase" or list1item=="Consumption"):
drop2['values'] = ('Maida', 'Sooji', 'Maida and Sooji', 'Pouches', 'Bags', 'Cartons', 'All Packings')
drop2.current()
if (list1item=="Production" or list1item=="Sale"):
drop2['values'] = ('Noodle', 'Macroni', 'Pasta', 'All')
drop2.current()
if (list1item=="Expenditure"):
drop2['values'] = ('Regular', 'Irregular')
drop2.current()
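# Every summing loop above repeats the same None-to-zero coercion
# (if str(x)=="None": x = 0) before calling float().  A hedged refactor sketch
# of that pattern as one helper (the name _as_float is illustrative, not from
# the original source):
def _as_float(value):
    # Treat SQL NULL (Python None, or its string form) as zero.
    if value is None or str(value) == "None":
        return 0.0
    return float(str(value))
# Usage sketch:
#   sumamnt = sum(_as_float(v) for v in qtyp)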
#__________________________________________________________VIEW LOGS ENDS__________________________________________________________________________________________
#__________________________________________________________VIEW BILLS_________________________________________________________________________________________
    def clearedbill():
        def onFrameConfigure(canvas):
            '''Reset the scroll region to encompass the inner frame'''
            canvas.configure(scrollregion=canvas.bbox("all"))
        Lb1 = Toplevel()
        Lb1.title("LOGS")
        Grid.rowconfigure(Lb1, 0, weight=1)
        Grid.columnconfigure(Lb1, 0, weight=1)
        Grid.columnconfigure(Lb1, 1, weight=1)
        canvas = Canvas(Lb1)
        canvas.grid(row=0, column=0, sticky=N+E+W+S)
        frlogs = Frame(canvas)
        dfrom = E11.get()
        dto = E22.get()
        qtype = drop3.get()
        if qtype == "Purchasing":
            mainlog = mysql.connector.connect(host="localhost", user="root", password="root123", database="purchasinglog")
            mycursor = mainlog.cursor()
            # %s placeholders let the driver escape the user-entered dates instead of concatenating them into the SQL.
            mycursor.execute(
                "SELECT DISTINCT BIll_Date, Bill_Number, Party_Name, AmountT "
                "FROM partpaymentpurchase "
                "WHERE BIll_Date BETWEEN %s AND %s AND Balance = '0.00' "
                "ORDER BY BIll_Date DESC", (dfrom, dto))
            myresult = mycursor.fetchall()
            rc = len(myresult)
            Label(frlogs, text="Date").grid(row=0, column=0, sticky=N+W+E)
            Label(frlogs, text="Bill").grid(row=0, column=1, sticky=N+W+E)
            Label(frlogs, text="Party").grid(row=0, column=2, sticky=N+W+E)
            Label(frlogs, text="Amount").grid(row=0, column=3, sticky=N+W+E)
            Label(frlogs, text="Sr. No.").grid(row=0, column=4, sticky=N+W+E)
            date = [lis[0] for lis in myresult]
            bill = [lis[1] for lis in myresult]
            party = [lis[2] for lis in myresult]
            amntm = [lis[3] for lis in myresult]
            sumamnt = 0
            for y in range(rc):
                Label(frlogs, text=str(date[y])).grid(row=y+1, column=0)
                Label(frlogs, text=str(bill[y])).grid(row=y+1, column=1)
                Label(frlogs, text=str(party[y])).grid(row=y+1, column=2)
                Label(frlogs, text=str(amntm[y])).grid(row=y+1, column=3)
                Label(frlogs, text=str(y+1)).grid(row=y+1, column=4)
                if amntm[y] is None:  # NULL amounts count as zero.
                    amntm[y] = 0
                sumamnt = sumamnt + float(amntm[y])
            Label(frlogs, text="-" * 154).grid(row=rc+1, column=0, columnspan=7)
            Label(frlogs, text="Total Cleared Amount :").grid(row=rc+2, column=0, columnspan=2)
            Label(frlogs, text=str(sumamnt)).grid(row=rc+2, column=3)
        if qtype == "Sales":
            mainlog = mysql.connector.connect(host="localhost", user="root", password="root123", database="newsales")
            mycursor = mainlog.cursor()
            mycursor.execute(
                "SELECT DISTINCT Bill_Date, Bill_Number, Party_Name, AmountT "
                "FROM paymentsales "
                "WHERE Bill_Date BETWEEN %s AND %s AND Balance = '0.00' "
                "ORDER BY Bill_Date DESC", (dfrom, dto))
            myresult = mycursor.fetchall()
            rc = len(myresult)
            Label(frlogs, text="Date").grid(row=0, column=0, sticky=N+W+E)
            Label(frlogs, text="Bill").grid(row=0, column=1, sticky=N+W+E)
            Label(frlogs, text="Party").grid(row=0, column=2, sticky=N+W+E)
            Label(frlogs, text="Amount").grid(row=0, column=3, sticky=N+W+E)
            Label(frlogs, text="Sr. No.").grid(row=0, column=4, sticky=N+W+E)
            date = [lis[0] for lis in myresult]
            bill = [lis[1] for lis in myresult]
            party = [lis[2] for lis in myresult]
            amntm = [lis[3] for lis in myresult]
            sumamnt = 0
            for y in range(rc):
                Label(frlogs, text=str(date[y])).grid(row=y+1, column=0)
                Label(frlogs, text=str(bill[y])).grid(row=y+1, column=1)
                Label(frlogs, text=str(party[y])).grid(row=y+1, column=2)
                Label(frlogs, text=str(amntm[y])).grid(row=y+1, column=3)
                Label(frlogs, text=str(y+1)).grid(row=y+1, column=4)
                if amntm[y] is None:
                    amntm[y] = 0
                sumamnt = sumamnt + float(amntm[y])
            Label(frlogs, text="-" * 154).grid(row=rc+1, column=0, columnspan=7)
            Label(frlogs, text="Total Cleared Amount :").grid(row=rc+2, column=0, columnspan=2)
            Label(frlogs, text=str(sumamnt)).grid(row=rc+2, column=3)
        scrollbar = Scrollbar(Lb1, orient="vertical", command=canvas.yview)
        scrollbar.grid(row=0, column=1, sticky=N+S)
        canvas.configure(yscrollcommand=scrollbar.set)
        canvas.create_window((100, 50), window=frlogs, anchor=tk.NW)
        frlogs.update_idletasks()  # Needed to make bbox info available.
        bbox = canvas.bbox(tk.ALL)
        canvas.configure(scrollregion=bbox, width=700, height=300)
    def pendingbill():
        def onFrameConfigure(canvas):
            '''Reset the scroll region to encompass the inner frame'''
            canvas.configure(scrollregion=canvas.bbox("all"))
        Lb1 = Toplevel()
        Lb1.title("LOGS")
        Grid.rowconfigure(Lb1, 0, weight=1)
        Grid.columnconfigure(Lb1, 0, weight=1)
        Grid.columnconfigure(Lb1, 1, weight=1)
        canvas = Canvas(Lb1)
        canvas.grid(row=0, column=0, sticky=N+E+W+S)
        frlogs = Frame(canvas)
        dfrom = E11.get()
        dto = E22.get()
        qtype = drop3.get()
        if qtype == "Purchasing":
            mainlog = mysql.connector.connect(host="localhost", user="root", password="root123", database="purchasinglog")
            mycursor = mainlog.cursor()
            mycursor.execute(
                "SELECT DISTINCT BIll_Date, Bill_Number, Party_Name, AmountT, Balance, Amount_Paid "
                "FROM partpaymentpurchase "
                "WHERE BIll_Date BETWEEN %s AND %s AND Balance != '0.00' "
                "ORDER BY BIll_Date DESC", (dfrom, dto))
            myresult = mycursor.fetchall()
            rc = len(myresult)
            Label(frlogs, text="Date").grid(row=0, column=0, sticky=N+W+E)
            Label(frlogs, text="Bill").grid(row=0, column=1, sticky=N+W+E)
            Label(frlogs, text="Party").grid(row=0, column=2, sticky=N+W+E)
            Label(frlogs, text="Amount").grid(row=0, column=3, sticky=N+W+E)
            Label(frlogs, text="Amount_Paid").grid(row=0, column=4, sticky=N+W+E)
            Label(frlogs, text="Last Balance").grid(row=0, column=5, sticky=N+W+E)
            Label(frlogs, text="Sr. No.").grid(row=0, column=6, sticky=N+W+E)
            date = [lis[0] for lis in myresult]
            bill = [lis[1] for lis in myresult]
            party = [lis[2] for lis in myresult]
            amntm = [lis[3] for lis in myresult]
            bal = [lis[4] for lis in myresult]
            amntpdd = [lis[5] for lis in myresult]
            sumamnt = 0
            sumpd = 0  # Was never initialized in this branch, so the += below raised a NameError.
            for y in range(rc):
                Label(frlogs, text=str(date[y])).grid(row=y+1, column=0)
                Label(frlogs, text=str(bill[y])).grid(row=y+1, column=1)
                Label(frlogs, text=str(party[y])).grid(row=y+1, column=2)
                Label(frlogs, text=str(amntm[y])).grid(row=y+1, column=3)
                Label(frlogs, text=str(amntpdd[y])).grid(row=y+1, column=4)
                Label(frlogs, text=str(bal[y])).grid(row=y+1, column=5)
                Label(frlogs, text=str(y+1)).grid(row=y+1, column=6)
                if amntpdd[y] is None:
                    amntpdd[y] = 0
                if amntm[y] is None:
                    amntm[y] = 0
                sumamnt = sumamnt + float(amntm[y])
                sumpd = sumpd + float(amntpdd[y])
            Label(frlogs, text="-" * 154).grid(row=rc+1, column=0, columnspan=7)
            Label(frlogs, text="Total:").grid(row=rc+2, column=0)
            Label(frlogs, text=str(sumamnt)).grid(row=rc+2, column=3)
            Label(frlogs, text=str(sumpd)).grid(row=rc+2, column=4)
            Label(frlogs, text=str(sumamnt - sumpd)).grid(row=rc+2, column=5)
        if qtype == "Sales":
            mainlog = mysql.connector.connect(host="localhost", user="root", password="root123", database="newsales")
            mycursor = mainlog.cursor()
            mycursor.execute(
                "SELECT DISTINCT Bill_Date, Bill_Number, Party_Name, AmountT, Balance, Amount_Paid "
                "FROM paymentsales "
                "WHERE Bill_Date BETWEEN %s AND %s AND Balance != '0.00' "
                "ORDER BY Bill_Date DESC", (dfrom, dto))
            myresult = mycursor.fetchall()
            rc = len(myresult)
            Label(frlogs, text="Date").grid(row=0, column=0, sticky=N+W+E)
            Label(frlogs, text="Bill").grid(row=0, column=1, sticky=N+W+E)
            Label(frlogs, text="Party").grid(row=0, column=2, sticky=N+W+E)
            Label(frlogs, text="Amount").grid(row=0, column=3, sticky=N+W+E)
            Label(frlogs, text="Amount_Paid").grid(row=0, column=4, sticky=N+W+E)
            Label(frlogs, text="Last Balance").grid(row=0, column=5, sticky=N+W+E)
            Label(frlogs, text="Sr. No.").grid(row=0, column=6, sticky=N+W+E)
            date = [lis[0] for lis in myresult]
            bill = [lis[1] for lis in myresult]
            party = [lis[2] for lis in myresult]
            amntm = [lis[3] for lis in myresult]
            bal = [lis[4] for lis in myresult]
            amntpdd = [lis[5] for lis in myresult]
            sumpd = 0
            sumamnt = 0
            for y in range(rc):
                Label(frlogs, text=str(date[y])).grid(row=y+1, column=0)
                Label(frlogs, text=str(bill[y])).grid(row=y+1, column=1)
                Label(frlogs, text=str(party[y])).grid(row=y+1, column=2)
                Label(frlogs, text=str(amntm[y])).grid(row=y+1, column=3)
                Label(frlogs, text=str(amntpdd[y])).grid(row=y+1, column=4)
                Label(frlogs, text=str(bal[y])).grid(row=y+1, column=5)
                Label(frlogs, text=str(y+1)).grid(row=y+1, column=6)
                if amntpdd[y] is None:
                    amntpdd[y] = 0
                if amntm[y] is None:
                    amntm[y] = 0
                sumamnt = sumamnt + float(amntm[y])
                sumpd = sumpd + float(amntpdd[y])
            Label(frlogs, text="-" * 204).grid(row=rc+1, column=0, columnspan=10)
            Label(frlogs, text="Total:").grid(row=rc+2, column=0)
            Label(frlogs, text=str(sumamnt)).grid(row=rc+2, column=3)
            Label(frlogs, text=str(sumpd)).grid(row=rc+2, column=4)
            Label(frlogs, text=str(sumamnt - sumpd)).grid(row=rc+2, column=5)
        scrollbar = Scrollbar(Lb1, orient="vertical", command=canvas.yview)
        scrollbar.grid(row=0, column=1, sticky=N+S)
        canvas.configure(yscrollcommand=scrollbar.set)
        canvas.create_window((100, 50), window=frlogs, anchor=tk.NW)
        frlogs.update_idletasks()  # Needed to make bbox info available.
        bbox = canvas.bbox(tk.ALL)
        canvas.configure(scrollregion=bbox, width=700, height=300)
    def partybills():
        def onFrameConfigure(canvas):
            '''Reset the scroll region to encompass the inner frame'''
            canvas.configure(scrollregion=canvas.bbox("all"))
        Lb1 = Toplevel()
        Lb1.title("LOGS")
        Grid.rowconfigure(Lb1, 0, weight=1)
        Grid.columnconfigure(Lb1, 0, weight=1)
        Grid.columnconfigure(Lb1, 1, weight=1)
        canvas = Canvas(Lb1)
        canvas.grid(row=0, column=0, sticky=N+E+W+S)
        frlogs = Frame(canvas)
        dfrom = E11.get()
        dto = E22.get()
        qtype = drop3.get()
        partyname = E3.get()
        if qtype == "Purchasing":
            mainlog = mysql.connector.connect(host="localhost", user="root", password="root123", database="purchasinglog")
            mycursor = mainlog.cursor()
            mycursor.execute(
                "SELECT BIll_Date, Bill_Number, Party_Name, AmountT, Balance_d, Part_date, Part_payment, Amount_Paid, Balance "
                "FROM partpaymentpurchase "
                "WHERE BIll_Date BETWEEN %s AND %s AND Party_Name = %s "
                "ORDER BY BIll_Date DESC", (dfrom, dto, partyname))
            myresult = mycursor.fetchall()
            rc = len(myresult)
            Label(frlogs, text="Bill_Date").grid(row=0, column=0, sticky=N+W+E)
            Label(frlogs, text="Bill").grid(row=0, column=1, sticky=N+W+E)
            Label(frlogs, text="Party").grid(row=0, column=2, sticky=N+W+E)
            Label(frlogs, text="Amount").grid(row=0, column=3, sticky=N+W+E)
            Label(frlogs, text="Part_Date").grid(row=0, column=4, sticky=N+W+E)
            Label(frlogs, text="Part_Payment").grid(row=0, column=5, sticky=N+W+E)
            Label(frlogs, text="Balance").grid(row=0, column=6, sticky=N+W+E)
            Label(frlogs, text="Sr. No.").grid(row=0, column=7, sticky=N+W+E)
            date = [lis[0] for lis in myresult]
            bill = [lis[1] for lis in myresult]
            party = [lis[2] for lis in myresult]
            amntm = [lis[3] for lis in myresult]
            ptdat = [lis[5] for lis in myresult]
            ptpay = [lis[6] for lis in myresult]
            bald = [lis[8] for lis in myresult]
            cursor2 = mainlog.cursor()
            # One Balance row per bill: the outstanding balance of each bill in the range.
            cursor2.execute(
                "SELECT Balance FROM partpaymentpurchase "
                "WHERE BIll_Date BETWEEN %s AND %s AND Party_Name = %s "
                "GROUP BY Bill_Number ORDER BY BIll_Date DESC", (dfrom, dto, partyname))
            result2 = cursor2.fetchall()
            bal2 = [lis2[0] for lis2 in result2]
            sumbal2 = 0
            for b in bal2:
                sumbal2 = sumbal2 + float(b)
            Label(frlogs, text="-" * 204).grid(row=rc+1, column=0, columnspan=10)
            Label(frlogs, text="Total Pending Balance: ").grid(row=rc+2, column=0, columnspan=2)
            Label(frlogs, text=str(sumbal2)).grid(row=rc+2, column=6)
            for y in range(rc):
                Label(frlogs, text=str(date[y])).grid(row=y+1, column=0)
                Label(frlogs, text=str(bill[y])).grid(row=y+1, column=1)
                Label(frlogs, text=str(party[y])).grid(row=y+1, column=2)
                Label(frlogs, text=str(amntm[y])).grid(row=y+1, column=3)
                Label(frlogs, text=str(ptdat[y])).grid(row=y+1, column=4)
                Label(frlogs, text=str(ptpay[y])).grid(row=y+1, column=5)
                Label(frlogs, text=str(bald[y])).grid(row=y+1, column=6)
                Label(frlogs, text=str(y+1)).grid(row=y+1, column=7)
        if qtype == "Sales":
            mainlog = mysql.connector.connect(host="localhost", user="root", password="root123", database="newsales")
            mycursor = mainlog.cursor()
            mycursor.execute(
                "SELECT Bill_Date, Bill_Number, Party_Name, AmountT, Balance, Part_Date, Part_Pay, Amount_Paid, Balance_d "
                "FROM paymentsales "
                "WHERE Bill_Date BETWEEN %s AND %s AND Party_Name = %s "
                "ORDER BY Bill_Date DESC", (dfrom, dto, partyname))
            myresult = mycursor.fetchall()
            rc = len(myresult)
            Label(frlogs, text="Bill_Date").grid(row=0, column=0, sticky=N+W+E)
            Label(frlogs, text="Bill").grid(row=0, column=1, sticky=N+W+E)
            Label(frlogs, text="Party").grid(row=0, column=2, sticky=N+W+E)
            Label(frlogs, text="Amount").grid(row=0, column=3, sticky=N+W+E)
            Label(frlogs, text="Part_Date").grid(row=0, column=4, sticky=N+W+E)
            Label(frlogs, text="Part_Payment").grid(row=0, column=5, sticky=N+W+E)
            Label(frlogs, text="Balance").grid(row=0, column=6, sticky=N+W+E)
            Label(frlogs, text="Sr. No.").grid(row=0, column=7, sticky=N+W+E)
            date = [lis[0] for lis in myresult]
            bill = [lis[1] for lis in myresult]
            party = [lis[2] for lis in myresult]
            amntm = [lis[3] for lis in myresult]
            ptdat = [lis[5] for lis in myresult]
            ptpay = [lis[6] for lis in myresult]
            bald = [lis[8] for lis in myresult]
            cursor2 = mainlog.cursor()
            cursor2.execute(
                "SELECT Balance FROM paymentsales "
                "WHERE Bill_Date BETWEEN %s AND %s AND Party_Name = %s "
                "GROUP BY Bill_Number ORDER BY Bill_Date DESC", (dfrom, dto, partyname))
            result2 = cursor2.fetchall()
            bal2 = [lis2[0] for lis2 in result2]
            sumbal2 = 0
            for b in bal2:
                sumbal2 = sumbal2 + float(b)
            Label(frlogs, text="-" * 204).grid(row=rc+1, column=0, columnspan=10)
            Label(frlogs, text="Total Pending Balance: ").grid(row=rc+2, column=0, columnspan=2)
            Label(frlogs, text=str(sumbal2)).grid(row=rc+2, column=6)
            for y in range(rc):
                Label(frlogs, text=str(date[y])).grid(row=y+1, column=0)
                Label(frlogs, text=str(bill[y])).grid(row=y+1, column=1)
                Label(frlogs, text=str(party[y])).grid(row=y+1, column=2)
                Label(frlogs, text=str(amntm[y])).grid(row=y+1, column=3)
                Label(frlogs, text=str(ptdat[y])).grid(row=y+1, column=4)
                Label(frlogs, text=str(ptpay[y])).grid(row=y+1, column=5)
                Label(frlogs, text=str(bald[y])).grid(row=y+1, column=6)
                Label(frlogs, text=str(y+1)).grid(row=y+1, column=7)
        scrollbar = Scrollbar(Lb1, orient="vertical", command=canvas.yview)
        scrollbar.grid(row=0, column=1, sticky=N+S)
        canvas.configure(yscrollcommand=scrollbar.set)
        canvas.create_window((100, 50), window=frlogs, anchor=tk.NW)
        frlogs.update_idletasks()  # Needed to make bbox info available.
        bbox = canvas.bbox(tk.ALL)
        canvas.configure(scrollregion=bbox, width=700, height=300)
    def billpart():
        def onFrameConfigure(canvas):
            '''Reset the scroll region to encompass the inner frame'''
            canvas.configure(scrollregion=canvas.bbox("all"))
        Lb1 = Toplevel()
        Lb1.title("LOGS")
        Grid.rowconfigure(Lb1, 0, weight=1)
        Grid.columnconfigure(Lb1, 0, weight=1)
        Grid.columnconfigure(Lb1, 1, weight=1)
        canvas = Canvas(Lb1)
        canvas.grid(row=0, column=0, sticky=N+E+W+S)
        frlogs = Frame(canvas)
        dfrom = E11.get()
        dto = E22.get()
        qtype = drop3.get()
        billquery = E4.get()
        if qtype == "Purchasing":
            mainlog = mysql.connector.connect(host="localhost", user="root", password="root123", database="purchasinglog")
            mycursor = mainlog.cursor()
            mycursor.execute(
                "SELECT BIll_Date, Bill_Number, Party_Name, AmountT, Balance, Part_date, Part_payment, Amount_Paid, Balance_d "
                "FROM partpaymentpurchase "
                "WHERE BIll_Date BETWEEN %s AND %s AND Bill_Number = %s "
                "ORDER BY BIll_Date DESC", (dfrom, dto, billquery))
            myresult = mycursor.fetchall()
            rc = len(myresult)
            Label(frlogs, text="Bill_Date").grid(row=0, column=0, sticky=N+W+E)
            Label(frlogs, text="Bill").grid(row=0, column=1, sticky=N+W+E)
            Label(frlogs, text="Party").grid(row=0, column=2, sticky=N+W+E)
            Label(frlogs, text="Amount").grid(row=0, column=3, sticky=N+W+E)
            Label(frlogs, text="Part_Date").grid(row=0, column=4, sticky=N+W+E)
            Label(frlogs, text="Part_Payment").grid(row=0, column=5, sticky=N+W+E)
            Label(frlogs, text="Balance").grid(row=0, column=6, sticky=N+W+E)
            Label(frlogs, text="Sr. No.").grid(row=0, column=7, sticky=N+W+E)
            date = [lis[0] for lis in myresult]
            bill = [lis[1] for lis in myresult]
            party = [lis[2] for lis in myresult]
            amntm = [lis[3] for lis in myresult]
            ptdat = [lis[5] for lis in myresult]
            ptpay = [lis[6] for lis in myresult]
            bald = [lis[8] for lis in myresult]
            for y in range(rc):
                Label(frlogs, text=str(date[y])).grid(row=y+1, column=0)
                Label(frlogs, text=str(bill[y])).grid(row=y+1, column=1)
                Label(frlogs, text=str(party[y])).grid(row=y+1, column=2)
                Label(frlogs, text=str(amntm[y])).grid(row=y+1, column=3)
                Label(frlogs, text=str(ptdat[y])).grid(row=y+1, column=4)
                Label(frlogs, text=str(ptpay[y])).grid(row=y+1, column=5)
                Label(frlogs, text=str(bald[y])).grid(row=y+1, column=6)
                Label(frlogs, text=str(y+1)).grid(row=y+1, column=7)
        if qtype == "Sales":
            mainlog = mysql.connector.connect(host="localhost", user="root", password="root123", database="newsales")
            mycursor = mainlog.cursor()
            mycursor.execute(
                "SELECT Bill_Date, Bill_Number, Party_Name, AmountT, Balance, Part_Date, Part_Pay, Amount_Paid, Balance_d "
                "FROM paymentsales "
                "WHERE Bill_Date BETWEEN %s AND %s AND Bill_Number = %s "
                "ORDER BY Bill_Date DESC", (dfrom, dto, billquery))
            myresult = mycursor.fetchall()
            rc = len(myresult)
            Label(frlogs, text="Bill_Date").grid(row=0, column=0, sticky=N+W+E)
            Label(frlogs, text="Bill").grid(row=0, column=1, sticky=N+W+E)
            Label(frlogs, text="Party").grid(row=0, column=2, sticky=N+W+E)
            Label(frlogs, text="Amount").grid(row=0, column=3, sticky=N+W+E)
            Label(frlogs, text="Part_Date").grid(row=0, column=4, sticky=N+W+E)
            Label(frlogs, text="Part_Payment").grid(row=0, column=5, sticky=N+W+E)
            Label(frlogs, text="Balance").grid(row=0, column=6, sticky=N+W+E)
            Label(frlogs, text="Sr. No.").grid(row=0, column=7, sticky=N+W+E)
            date = [lis[0] for lis in myresult]
            bill = [lis[1] for lis in myresult]
            party = [lis[2] for lis in myresult]
            amntm = [lis[3] for lis in myresult]
            ptdat = [lis[5] for lis in myresult]
            ptpay = [lis[6] for lis in myresult]
            bald = [lis[8] for lis in myresult]
            for y in range(rc):
                Label(frlogs, text=str(date[y])).grid(row=y+1, column=0)
                Label(frlogs, text=str(bill[y])).grid(row=y+1, column=1)
                Label(frlogs, text=str(party[y])).grid(row=y+1, column=2)
                Label(frlogs, text=str(amntm[y])).grid(row=y+1, column=3)
                Label(frlogs, text=str(ptdat[y])).grid(row=y+1, column=4)
                Label(frlogs, text=str(ptpay[y])).grid(row=y+1, column=5)
                Label(frlogs, text=str(bald[y])).grid(row=y+1, column=6)
                Label(frlogs, text=str(y+1)).grid(row=y+1, column=7)
        scrollbar = Scrollbar(Lb1, orient="vertical", command=canvas.yview)
        scrollbar.grid(row=0, column=1, sticky=N+S)
        canvas.configure(yscrollcommand=scrollbar.set)
        canvas.create_window((100, 50), window=frlogs, anchor=tk.NW)
        frlogs.update_idletasks()  # Needed to make bbox info available.
        bbox = canvas.bbox(tk.ALL)
        canvas.configure(scrollregion=bbox, width=700, height=300)
    #__________________________________________________________VIEW LOGS FRAME__________________________________________________________________________________________
    frlog = LabelFrame(result, text="View Logs", padx=10, pady=10, height=400, width=400, font=12)
    frlog.grid(row=0, column=0, padx=10, pady=10)
    frbill = LabelFrame(result, text="View Bills", padx=10, pady=10, height=400, width=400, font=12)
    frbill.grid(row=0, column=1, padx=10, pady=10)
    L1 = Label(frlog, text="From ").grid(row=0, column=0, padx=3, pady=3)
    E1 = Entry(frlog)
    E1.grid(row=0, column=1, padx=3, pady=3)
    L1 = Label(frlog, text=" To ").grid(row=0, column=2, padx=3, pady=3)
    E2 = Entry(frlog)
    E2.grid(row=0, column=3, padx=3, pady=3)
    n = StringVar()
    m = StringVar()
    drop1 = ttk.Combobox(frlog, width=30, textvariable=n)
    drop2 = ttk.Combobox(frlog, width=30, textvariable=m)
    # Adding combobox drop down list
    drop1['values'] = ('Purchase', 'Consumption', 'Production', 'Sale', 'Expenditure')
    drop1.grid(row=1, column=0, padx=5, pady=5, columnspan=2)
    drop1.current(0)  # Pre-select the first entry; current() with no index is only a getter.
    drop1.bind("<<ComboboxSelected>>", binddrop1)
    drop2.grid(row=1, column=2, padx=5, pady=5, columnspan=2)
    Bvlog = Button(frlog, text="View Logs", command=viewlistbox).grid(row=2, column=0, padx=3, pady=3)
    #__________________________________________________________VIEW BILLS FRAME__________________________________________________________________________________________
    L1 = Label(frbill, text="From ").grid(row=0, column=0, padx=3, pady=3)
    E11 = Entry(frbill)
    E11.grid(row=0, column=1, padx=3, pady=3)
    L1 = Label(frbill, text=" To ").grid(row=0, column=2, padx=3, pady=3)
    E22 = Entry(frbill)
    E22.grid(row=0, column=3, padx=3, pady=3)
    L1 = Label(frbill, text="Search For :").grid(row=1, column=0, padx=3, pady=3)
    p = StringVar()
    drop3 = ttk.Combobox(frbill, width=30, textvariable=p)
    drop3['values'] = ('Purchasing', 'Sales')
    drop3.grid(row=1, column=1, padx=5, pady=5, columnspan=2)
    drop3.current(0)
    Bcleared = Button(frbill, text="View Cleared Bills", command=clearedbill).grid(row=2, column=0, padx=3, pady=3)
    Bpending = Button(frbill, text="View Pending Bills", command=pendingbill).grid(row=2, column=1, padx=3, pady=3)
    L1 = Label(frbill, text="Search by Party:").grid(row=3, column=0, padx=3, pady=3)
    E3 = Entry(frbill)
    E3.grid(row=3, column=1, padx=3, pady=3)
    L1 = Label(frbill, text="Search by Bill:").grid(row=4, column=0, padx=3, pady=3)
    E4 = Entry(frbill)
    E4.grid(row=4, column=1, padx=3, pady=3)
    BBB = Button(frbill, text="View Party Bills", command=partybills).grid(row=3, column=2, padx=3, pady=3)
    BB3B = Button(frbill, text="View Part Payments by Bill", command=billpart).grid(row=4, column=2, padx=3, pady=3)
#-----------------------------------------------RESULTS FRAME ENDS--------------------------------------------------------------------------------------------------------------
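# The reports above repeat the same None-to-zero summing loop for every money
# column. That pattern can be collapsed into one small helper. This is an
# illustrative sketch, not part of the original program ("sum_amounts" is a
# hypothetical name); it assumes, as the reports do, that NULL columns arrive
# from MySQL as None.

```python
def sum_amounts(values):
    """Sum one fetched column, counting NULL (None) entries as zero."""
    return sum(float(v) for v in values if v is not None)
```

# A report loop could then compute e.g. sum_amounts(amntm) in one call.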
#-----------------------------------------------ANALYSIS FRAME STARTS--------------------------------------------------------------------------------------------------------------
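# The handlers in this file interpolate user-entered dates, party names, and
# bill numbers directly into SQL strings. A safer pattern is to build the
# statement with %s placeholders and hand the values to the driver separately.
# The sketch below is illustrative only ("bills_between" is a hypothetical
# name, not part of the original program); the table name must come from
# trusted code, since identifiers cannot be parameterized.

```python
def bills_between(table, dfrom, dto, party=None):
    """Return (sql, params) for a date-range bill query using placeholders."""
    sql = ("SELECT DISTINCT Bill_Date, Bill_Number, Party_Name, AmountT FROM "
           + table + " WHERE Bill_Date BETWEEN %s AND %s")
    params = [dfrom, dto]
    if party is not None:
        sql += " AND Party_Name = %s"
        params.append(party)
    sql += " ORDER BY Bill_Date DESC"
    return sql, tuple(params)
```

# Usage with mysql.connector: mycursor.execute(*bills_between("paymentsales", dfrom, dto)).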
def analysis():
    newan = Toplevel()
    newan.title("Analysis")  # The title said "Expenditure Entry", copied from another window.
    newan.minsize(400, 300)
    newan.iconbitmap('icov1.ico')
    def graphs():
        dfrom = E1.get()
        dto = E2.get()
        totalsalep = 0
        totalsalea = 0
        totalpurma = 0
        totalpurmp = 0
        totalexp = 0
        totalpurB = 0
        totalpurC = 0
        totalpurP = 0
        salelog = mysql.connector.connect(host="localhost", user="root", password="root123", database="newsales")
        cursorsale = salelog.cursor()
        cursorsale.execute(
            "SELECT DISTINCT AmountT, Amount_Paid FROM paymentsales "
            "WHERE Bill_Date BETWEEN %s AND %s", (dfrom, dto))
        for amnt, amntpd in cursorsale.fetchall():
            # NULL columns arrive as None; count them as zero.
            totalsalep = totalsalep + float(amnt or 0)
            totalsalea = totalsalea + float(amntpd or 0)
        purlog = mysql.connector.connect(host="localhost", user="root", password="root123", database="purchasinglog")
        cursorpur = purlog.cursor()
        cursorpur.execute(
            "SELECT DISTINCT AmountT, Amount_Paid FROM partpaymentpurchase "
            "WHERE BIll_Date BETWEEN %s AND %s", (dfrom, dto))
        for amnt, amntpd in cursorpur.fetchall():
            totalpurmp = totalpurmp + float(amnt or 0)
            totalpurma = totalpurma + float(amntpd or 0)
        cursorpur.execute(
            "SELECT DISTINCT AmountP, AmountB, AmountC FROM packingpurchase "
            "WHERE Date BETWEEN %s AND %s", (dfrom, dto))
        for amntP, amntB, amntC in cursorpur.fetchall():
            totalpurP = totalpurP + float(amntP or 0)
            totalpurB = totalpurB + float(amntB or 0)
            totalpurC = totalpurC + float(amntC or 0)
        totalPPpur = totalpurP + totalpurC + totalpurB
        explog = mysql.connector.connect(host="localhost", user="root", password="root123", database="expenditure")
        cursorx = explog.cursor()
        cursorx.execute("SELECT DISTINCT expT FROM expreg WHERE Date BETWEEN %s AND %s", (dfrom, dto))
        for (expT,) in cursorx.fetchall():
            totalexp = totalexp + float(expT or 0)
        costp = totalpurmp + totalPPpur + totalexp
        costa = totalpurma + totalPPpur + totalexp
        profitp = totalsalep - costp
        profita = totalsalea - costa
        # Guard the cost base to avoid ZeroDivisionError on an empty date range.
        profitpp = profitp / costp if costp else 0
        profitap = profita / costa if costa else 0
        Label(newan, text="Projected Profit: ").grid(row=2, column=0, padx=3, pady=3)
        Label(newan, text=profitp).grid(row=2, column=1, padx=3, pady=3)
        Label(newan, text="Projected Profit %: ").grid(row=2, column=2, padx=3, pady=3)
        Label(newan, text=profitpp).grid(row=2, column=3, padx=3, pady=3)
        Label(newan, text="Actual Profit: ").grid(row=3, column=0, padx=3, pady=3)
        Label(newan, text=profita).grid(row=3, column=1, padx=3, pady=3)  # Was profitp: the "Actual" row showed projected figures.
        Label(newan, text="Actual Profit %: ").grid(row=3, column=2, padx=3, pady=3)
        Label(newan, text=profitap).grid(row=3, column=3, padx=3, pady=3)
    #__________________________________________________________VIEW Analysis FRAME__________________________________________________________________________________________
    L1 = Label(newan, text="From ").grid(row=0, column=0, padx=3, pady=3)
    E1 = Entry(newan)
    E1.grid(row=0, column=1, padx=3, pady=3)
    L1 = Label(newan, text=" To ").grid(row=0, column=2, padx=3, pady=3)
    E2 = Entry(newan)
    E2.grid(row=0, column=3, padx=3, pady=3)
    BBB = Button(newan, text="View Graphs", command=graphs).grid(row=1, column=0, padx=3, pady=3)
#xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx R O O T xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
frame1 = LabelFrame(root, text="Material Purchasing", padx=20, pady=20, bg="light pink", relief=RIDGE, height=60, width=60, font=10)
frame1.grid(row=0, column=0, padx=10, pady=10, sticky=N+E+S+W)
bpur = Button(frame1, text="Open Purchasing Entry", command=newpurchase)
bpur.grid(row=0, column=0)
frame2 = LabelFrame(root, text="Consumption and Production", padx=20, pady=20, bg="light blue", relief=RIDGE, height=60, width=60, font=10)
frame2.grid(row=0, column=1, padx=10, pady=10, sticky=N+E+S+W)
bprod = Button(frame2, text="Open Production Entry", command=newproduction)
bprod.grid(row=0, column=0)
frame3 = LabelFrame(root, text="Sales", padx=20, pady=20, bg="light green", relief=RIDGE, height=60, width=60, font=10)
frame3.grid(row=0, column=2, padx=10, pady=10, sticky=N+E+S+W)
bsale = Button(frame3, text="New Entry and Payment", command=newsales)
bsale.grid(row=0, column=0)
frame4 = LabelFrame(root, text="Expenditure Entry", padx=20, pady=20, bg="light green", relief=RIDGE, height=60, width=60, font=10)
frame4.grid(row=1, column=0, padx=10, pady=10, sticky=N+E+S+W)
bexpn = Button(frame4, text=" New Expenditures ", command=expend)
bexpn.grid(row=0, column=0)
frame5 = LabelFrame(root, text="Results", padx=20, pady=20, bg="light blue", relief=RIDGE, height=60, width=60, font=10)
frame5.grid(row=1, column=1, padx=10, pady=10, sticky=N+E+S+W)
brslt = Button(frame5, text=" View Results and Logs ", command=results)
brslt.grid(row=0, column=0)
frame6 = LabelFrame(root, text="Analysis", padx=20, pady=20, bg="light pink", relief=RIDGE, height=60, width=60, font=10)
frame6.grid(row=1, column=2, padx=10, pady=10, sticky=N+E+S+W)
ConB = Button(root, text="Connect", command=connectmdb)
ConB.grid(row=2, column=0)
Lab333 = Label(root, text="Test Connection").grid(row=2, column =1, sticky=N+E+S+W)
banls = Button(frame6, text="View Graphs and Analysis", command=analysis)
banls.grid(row=0, column=0, sticky=N+E+S+W)
root.mainloop()
e52c6b0465ba4c3eff48d11c5fdf12bdaccfdf62 | 81 | py | Python | crowdsourcing/annotation_types/classification/__init__.py | sbranson/online_crowdsourcing | d1f7c814bb60aae9cf5e76e0b299713246f98ce3 | [
"MIT"
] | 4 | 2019-08-14T21:14:18.000Z | 2021-11-04T09:32:37.000Z | crowdsourcing/annotation_types/classification/__init__.py | sbranson/online_crowdsourcing | d1f7c814bb60aae9cf5e76e0b299713246f98ce3 | [
"MIT"
] | null | null | null | crowdsourcing/annotation_types/classification/__init__.py | sbranson/online_crowdsourcing | d1f7c814bb60aae9cf5e76e0b299713246f98ce3 | [
"MIT"
] | 1 | 2019-11-09T08:20:27.000Z | 2019-11-09T08:20:27.000Z | from binary import *
from binary_cv_predictor import *
#from multiclass import *
| 20.25 | 33 | 0.802469 | 11 | 81 | 5.727273 | 0.545455 | 0.31746 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.148148 | 81 | 3 | 34 | 27 | 0.913043 | 0.296296 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
# File: tests/test_service.py
# Repo: christophevg/py-service (MIT license)
# TODO :-(
def test_placeholder():
    assert True
# File: temboo/core/Library/CorpWatch/Lists/__init__.py
# Repo: jordanemedlock/psychtruths (Apache-2.0 license)
from temboo.Library.CorpWatch.Lists.ListCountryCodes import ListCountryCodes, ListCountryCodesInputSet, ListCountryCodesResultSet, ListCountryCodesChoreographyExecution
from temboo.Library.CorpWatch.Lists.ListIndustryCodes import ListIndustryCodes, ListIndustryCodesInputSet, ListIndustryCodesResultSet, ListIndustryCodesChoreographyExecution
from temboo.Library.CorpWatch.Lists.ListLocations import ListLocations, ListLocationsInputSet, ListLocationsResultSet, ListLocationsChoreographyExecution
from temboo.Library.CorpWatch.Lists.ListNames import ListNames, ListNamesInputSet, ListNamesResultSet, ListNamesChoreographyExecution
# File: pewn/__init__.py
# Repo: 5elenay/pewn (MIT license)
from pewn.classes import *
from pewn.http import *
# File: sqlserver_ado/models/__init__.py
# Repo: BangC/django-mssql (MIT license)
from __future__ import unicode_literals
from sqlserver_ado.models.manager import RawStoredProcedureManager # NOQA
from sqlserver_ado.models.query import RawStoredProcedureQuerySet # NOQA
# File: problems/current_time.py
# Repo: rndmized/python_fundamentals (MIT license)
import datetime as dt
# Print the current date and time
print("Current Date/Time:", dt.datetime.now())
# File: saleor/rest/serializers/checkout/__init__.py
# Repo: Chaoslecion123/Diver (BSD-3-Clause license)
from .checkout import CheckoutSerializer
from .checkout_line import CheckoutLineSerializer
# File: tests/components/wake_on_lan/test_switch.py
# Repo: pcaston/core (Apache-2.0 license)
"""The tests for the wake on lan switch platform."""
import platform
import subprocess
from unittest.mock import patch
import pytest
import openpeerpower.components.switch as switch
from openpeerpower.const import (
ATTR_ENTITY_ID,
SERVICE_TURN_OFF,
SERVICE_TURN_ON,
STATE_OFF,
STATE_ON,
)
from openpeerpower.setup import async_setup_component
from tests.common import async_mock_service
@pytest.fixture(autouse=True)
def mock_send_magic_packet():
"""Mock magic packet."""
with patch("wakeonlan.send_magic_packet") as mock_send:
yield mock_send
async def test_valid_hostname(opp):
"""Test with valid hostname."""
assert await async_setup_component(
opp,
switch.DOMAIN,
{
"switch": {
"platform": "wake_on_lan",
"mac": "00-01-02-03-04-05",
"host": "validhostname",
}
},
)
await opp.async_block_till_done()
state = opp.states.get("switch.wake_on_lan")
assert state.state == STATE_OFF
with patch.object(subprocess, "call", return_value=0):
await opp.services.async_call(
switch.DOMAIN,
SERVICE_TURN_ON,
{ATTR_ENTITY_ID: "switch.wake_on_lan"},
blocking=True,
)
state = opp.states.get("switch.wake_on_lan")
assert state.state == STATE_ON
await opp.services.async_call(
switch.DOMAIN,
SERVICE_TURN_OFF,
{ATTR_ENTITY_ID: "switch.wake_on_lan"},
blocking=True,
)
state = opp.states.get("switch.wake_on_lan")
assert state.state == STATE_ON
async def test_valid_hostname_windows(opp):
"""Test with valid hostname on windows."""
assert await async_setup_component(
opp,
switch.DOMAIN,
{
"switch": {
"platform": "wake_on_lan",
"mac": "00-01-02-03-04-05",
"host": "validhostname",
}
},
)
await opp.async_block_till_done()
state = opp.states.get("switch.wake_on_lan")
assert state.state == STATE_OFF
with patch.object(subprocess, "call", return_value=0), patch.object(
platform, "system", return_value="Windows"
):
await opp.services.async_call(
switch.DOMAIN,
SERVICE_TURN_ON,
{ATTR_ENTITY_ID: "switch.wake_on_lan"},
blocking=True,
)
state = opp.states.get("switch.wake_on_lan")
assert state.state == STATE_ON
async def test_broadcast_config_ip_and_port(opp, mock_send_magic_packet):
"""Test with broadcast address and broadcast port config."""
mac = "00-01-02-03-04-05"
broadcast_address = "255.255.255.255"
port = 999
assert await async_setup_component(
opp,
switch.DOMAIN,
{
"switch": {
"platform": "wake_on_lan",
"mac": mac,
"broadcast_address": broadcast_address,
"broadcast_port": port,
}
},
)
await opp.async_block_till_done()
state = opp.states.get("switch.wake_on_lan")
assert state.state == STATE_OFF
with patch.object(subprocess, "call", return_value=0):
await opp.services.async_call(
switch.DOMAIN,
SERVICE_TURN_ON,
{ATTR_ENTITY_ID: "switch.wake_on_lan"},
blocking=True,
)
mock_send_magic_packet.assert_called_with(
mac, ip_address=broadcast_address, port=port
)
async def test_broadcast_config_ip(opp, mock_send_magic_packet):
"""Test with only broadcast address."""
mac = "00-01-02-03-04-05"
broadcast_address = "255.255.255.255"
assert await async_setup_component(
opp,
switch.DOMAIN,
{
"switch": {
"platform": "wake_on_lan",
"mac": mac,
"broadcast_address": broadcast_address,
}
},
)
await opp.async_block_till_done()
state = opp.states.get("switch.wake_on_lan")
assert state.state == STATE_OFF
with patch.object(subprocess, "call", return_value=0):
await opp.services.async_call(
switch.DOMAIN,
SERVICE_TURN_ON,
{ATTR_ENTITY_ID: "switch.wake_on_lan"},
blocking=True,
)
mock_send_magic_packet.assert_called_with(mac, ip_address=broadcast_address)
async def test_broadcast_config_port(opp, mock_send_magic_packet):
"""Test with only broadcast port config."""
mac = "00-01-02-03-04-05"
port = 999
assert await async_setup_component(
opp,
switch.DOMAIN,
{"switch": {"platform": "wake_on_lan", "mac": mac, "broadcast_port": port}},
)
await opp.async_block_till_done()
state = opp.states.get("switch.wake_on_lan")
assert state.state == STATE_OFF
with patch.object(subprocess, "call", return_value=0):
await opp.services.async_call(
switch.DOMAIN,
SERVICE_TURN_ON,
{ATTR_ENTITY_ID: "switch.wake_on_lan"},
blocking=True,
)
mock_send_magic_packet.assert_called_with(mac, port=port)
async def test_off_script(opp):
"""Test with turn off script."""
assert await async_setup_component(
opp,
switch.DOMAIN,
{
"switch": {
"platform": "wake_on_lan",
"mac": "00-01-02-03-04-05",
"host": "validhostname",
"turn_off": {"service": "shell_command.turn_off_target"},
}
},
)
await opp.async_block_till_done()
calls = async_mock_service(opp, "shell_command", "turn_off_target")
state = opp.states.get("switch.wake_on_lan")
assert state.state == STATE_OFF
with patch.object(subprocess, "call", return_value=0):
await opp.services.async_call(
switch.DOMAIN,
SERVICE_TURN_ON,
{ATTR_ENTITY_ID: "switch.wake_on_lan"},
blocking=True,
)
state = opp.states.get("switch.wake_on_lan")
assert state.state == STATE_ON
assert len(calls) == 0
with patch.object(subprocess, "call", return_value=2):
await opp.services.async_call(
switch.DOMAIN,
SERVICE_TURN_OFF,
{ATTR_ENTITY_ID: "switch.wake_on_lan"},
blocking=True,
)
state = opp.states.get("switch.wake_on_lan")
assert state.state == STATE_OFF
assert len(calls) == 1
async def test_invalid_hostname_windows(opp):
"""Test with invalid hostname on windows."""
assert await async_setup_component(
opp,
switch.DOMAIN,
{
"switch": {
"platform": "wake_on_lan",
"mac": "00-01-02-03-04-05",
"host": "invalidhostname",
}
},
)
await opp.async_block_till_done()
state = opp.states.get("switch.wake_on_lan")
assert state.state == STATE_OFF
with patch.object(subprocess, "call", return_value=2):
await opp.services.async_call(
switch.DOMAIN,
SERVICE_TURN_ON,
{ATTR_ENTITY_ID: "switch.wake_on_lan"},
blocking=True,
)
state = opp.states.get("switch.wake_on_lan")
assert state.state == STATE_OFF
async def test_no_hostname_state(opp):
"""Test that the state updates if we do not pass in a hostname."""
assert await async_setup_component(
opp,
switch.DOMAIN,
{
"switch": {
"platform": "wake_on_lan",
"mac": "00-01-02-03-04-05",
}
},
)
await opp.async_block_till_done()
state = opp.states.get("switch.wake_on_lan")
assert state.state == STATE_OFF
with patch.object(subprocess, "call", return_value=0):
await opp.services.async_call(
switch.DOMAIN,
SERVICE_TURN_ON,
{ATTR_ENTITY_ID: "switch.wake_on_lan"},
blocking=True,
)
state = opp.states.get("switch.wake_on_lan")
assert state.state == STATE_ON
await opp.services.async_call(
switch.DOMAIN,
SERVICE_TURN_OFF,
{ATTR_ENTITY_ID: "switch.wake_on_lan"},
blocking=True,
)
state = opp.states.get("switch.wake_on_lan")
assert state.state == STATE_OFF
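The tests above mock `wakeonlan.send_magic_packet`; the magic packet itself has a simple, well-known layout: six 0xFF synchronization bytes followed by the target MAC address repeated sixteen times. A minimal sketch, independent of the `wakeonlan` library (the helper name is illustrative, not part of the tested code):

```python
def build_magic_packet(mac: str) -> bytes:
    """Build a Wake-on-LAN magic packet for a MAC like '00-01-02-03-04-05'."""
    mac_bytes = bytes.fromhex(mac.replace("-", "").replace(":", ""))
    if len(mac_bytes) != 6:
        raise ValueError("MAC address must be exactly 6 bytes")
    # Six synchronization bytes of 0xFF, then the MAC repeated 16 times.
    return b"\xff" * 6 + mac_bytes * 16


packet = build_magic_packet("00-01-02-03-04-05")
assert len(packet) == 102  # 6 sync bytes + 16 * 6 MAC bytes
```

The packet would normally be sent as a UDP broadcast datagram, which is what the mocked `send_magic_packet` call does internally.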
# File: upman/views/__init__.py
# Repo: marcsello/upman (MIT license)
from .report_view import ReportView
from .reporter_view import ReporterView
# File: subshifter_files/constants/__init__.py
# Repo: stecik/subtitle_shifter (MIT license)
from .constants import *
# File: pynode/helpers/signatures.py
# Repo: JasonGhostDev/Eternity (MIT license)
# Copyright (c) 2021 Eternity Devs
import ecdsa
def verify_signature(pub_key, signature, message):
    """Verify an ECDSA signature against the given verifying key."""
    return pub_key.verify(signature, message)
# File: src/admin_extra_urls/api.py
# Repo: saxix/django-admin-extra-urls (BSD license)
from .button import UrlButton  # noqa: F401
from .decorators import button, href, link, url # noqa: F401
from .mixins import ExtraUrlMixin, confirm_action # noqa: F401
# File: optimization/examples/sa.py
# Repo: gianmarcodonetti/optimization (MIT license)
import numpy as np
import matplotlib.pyplot as plt
from functools import partial
from optimization.examples.objfunctions import sin_amplitude_obj, sin_amplitude_basic_obj, neighbour_sin_amplitude
from optimization.heuristic import simulatedannealing
def basic():
x = np.linspace(-100, 100, 1000)
y = sin_amplitude_basic_obj(x)
_ = plt.figure(figsize=(14, 10))
plt.plot(x, y)
plt.xlabel('X')
plt.ylabel('Y')
plt.title("Objective Function")
plt.show()
h = np.random.randint(10 ** 4, 10 ** 7)
obj_func = lambda s: - sin_amplitude_basic_obj(s)
nb_func = partial(neighbour_sin_amplitude, k=10, e=10)
minimization = True
h_final, cache = simulatedannealing.simulated_annealing(
h, obj_func, nb_func, t_initial=80, t_final=1e-100, ctr_max=150, alpha=0.98, minimization=minimization,
verbose=True
)
print("Final solution:")
print("f({}) = {}".format(h_final, -obj_func(h_final)))
print("Len cache:", len(cache))
_ = plt.figure(figsize=(14, 10))
plt.plot([sin_amplitude_basic_obj(c) for c in cache[::20]])
plt.xlabel('iteration')
plt.ylabel('value of objective function')
plt.title("Simulated Annealing searching the optimum")
plt.show()
_ = plt.figure(figsize=(14, 10))
plt.plot(x, y)
plt.scatter(h_final, sin_amplitude_basic_obj(h_final), c='r', label='end')
plt.scatter(h, sin_amplitude_basic_obj(h), c='g', label='start')
plt.xlabel('X')
plt.ylabel('Y')
plt.title("Optimal value found")
plt.legend()
plt.show()
return
def advance():
x = np.linspace(-100, 100, 10000)
y = sin_amplitude_obj(x)
_ = plt.figure(figsize=(14, 10))
plt.plot(x, y)
plt.xlabel('X')
plt.ylabel('Y')
plt.title("Objective Function")
plt.show()
h = np.random.randint(10 ** 4, 10 ** 7)
obj_func = lambda s: - sin_amplitude_obj(s)
nb_func = partial(neighbour_sin_amplitude, k=10, e=10)
minimization = True
h_final, cache = simulatedannealing.simulated_annealing(
h, obj_func, nb_func, t_initial=80, t_final=1e-100, ctr_max=150, alpha=0.98, minimization=minimization,
verbose=True
)
print("Final solution:")
print("f({}) = {}".format(h_final, -obj_func(h_final)))
print("Len cache:", len(cache))
_ = plt.figure(figsize=(14, 10))
plt.plot([sin_amplitude_obj(c) for c in cache[::20]])
plt.xlabel('iteration')
plt.ylabel('value of objective function')
plt.title("Simulated Annealing searching the optimum")
plt.show()
_ = plt.figure(figsize=(14, 10))
plt.plot(x, y)
plt.scatter(h_final, sin_amplitude_obj(h_final), c='r', label='end')
plt.scatter(h, sin_amplitude_obj(h), c='g', label='start')
plt.xlabel('X')
plt.ylabel('Y')
plt.title("Optimal value found")
plt.legend()
plt.show()
return
if __name__ == '__main__':
print("Starting the Simulated Annealing examples...")
basic()
advance()
input("Please, press Enter to end...")
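Both examples above call `simulatedannealing.simulated_annealing`, whose source is not shown here. The following is a generic, self-contained sketch of the same idea — Metropolis acceptance with a geometric cooling schedule — with every name illustrative rather than taken from the package:

```python
import math
import random


def anneal(x0, objective, neighbour, t_initial=80.0, t_final=1e-3,
           alpha=0.98, ctr_max=150):
    """Minimize `objective` via Metropolis acceptance and geometric cooling."""
    random.seed(42)  # deterministic run for the example
    current = best = x0
    t = t_initial
    while t > t_final:
        for _ in range(ctr_max):
            candidate = neighbour(current)
            delta = objective(candidate) - objective(current)
            # Always accept improvements; accept worse moves with
            # probability exp(-delta / t), which shrinks as t cools.
            if delta < 0 or random.random() < math.exp(-delta / t):
                current = candidate
            if objective(current) < objective(best):
                best = current
        t *= alpha  # geometric cooling schedule
    return best


# Toy run: minimize f(x) = x**2 starting far from the optimum at 0.
result = anneal(
    x0=10.0,
    objective=lambda x: x * x,
    neighbour=lambda x: x + random.uniform(-1.0, 1.0),
)
```

The acceptance rule is what lets the search escape the local maxima of the sine-amplitude objective plotted above.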
# File: gif2mat.py
# Repo: William-An/2016-HiMCM-Source-Code (MIT license)
from PIL import Image
# File: django_test/wxtest/models.py
# Repo: yanhuaijun/test01 (Unlicense)
from django.db import models
# Create your models here.
class UserType(models.Model):
    name = models.CharField(max_length=32)
class wxuser(models.Model):  # WeChat user table
    unionid = models.CharField(max_length=32)
    openid = models.CharField(max_length=32)
    stats = models.CharField(max_length=32)
    userid = models.CharField(max_length=32)
    create_time = models.CharField(max_length=32)
    update_time = models.CharField(max_length=32)
class UserInfo(models.Model):  # user info table
    username = models.CharField(max_length=32)
    pwd = models.CharField(max_length=32)
    email = models.CharField(max_length=32)
    user_type = models.ForeignKey('UserType', on_delete=models.CASCADE)
    create_by = models.CharField(max_length=32)
    create_date = models.CharField(max_length=32)
    update_by = models.CharField(max_length=32)
    update_date = models.CharField(max_length=32)
class User(models.Model):  # user table
    username = models.CharField(max_length=32)
    pwd = models.CharField(max_length=32)
    email = models.CharField(max_length=32)
    mobile_phone = models.CharField(max_length=32)
    create_by = models.CharField(max_length=32)
    create_date = models.CharField(max_length=32)
    update_by = models.CharField(max_length=32)
    update_date = models.CharField(max_length=32)
class mood(models.Model):  # mood articles table
    title = models.CharField(max_length=500)
    content = models.CharField(max_length=50000)
    create_by = models.CharField(max_length=32)
    create_date = models.CharField(max_length=32)
    update_by = models.CharField(max_length=32)
    update_date = models.CharField(max_length=32)
class Love(models.Model):  # love articles table
    title = models.CharField(max_length=500)
    content = models.CharField(max_length=50000)
    create_by = models.CharField(max_length=32)
    create_date = models.CharField(max_length=32)
    update_by = models.CharField(max_length=32)
    update_date = models.CharField(max_length=32)
class Chicken_soup(models.Model):  # inspirational ("chicken soup") articles table
    title = models.CharField(max_length=500)
    content = models.CharField(max_length=50000)
    create_by = models.CharField(max_length=32)
    create_date = models.CharField(max_length=32)
    update_by = models.CharField(max_length=32)
    update_date = models.CharField(max_length=32)
class Sentimental(models.Model):  # sentimental articles table
    title = models.CharField(max_length=500)
    content = models.CharField(max_length=50000)
    create_by = models.CharField(max_length=32)
    create_date = models.CharField(max_length=32)
    update_by = models.CharField(max_length=32)
    update_date = models.CharField(max_length=32)
class Struggle(models.Model):  # striving/motivation articles table
    title = models.CharField(max_length=500)  # title
    content = models.CharField(max_length=50000)  # content
    create_by = models.CharField(max_length=32)  # creator
    create_date = models.CharField(max_length=32)  # creation time
    update_by = models.CharField(max_length=32)  # last modifier
    update_date = models.CharField(max_length=32)  # modification time
class music(models.Model):  # music table
    title = models.CharField(max_length=500)  # title
    author = models.CharField(max_length=500)  # artist
    play_url = models.CharField(max_length=50000)  # playback URL
    picture_url = models.CharField(max_length=50000)  # cover image URL
c1a0ec83811bc920a27f05c49d861eb4459e970a | 260 | py | Python | core/views.py | ThalesLeal/appdjangoecommercee | 816f31117996c23d87d224f3eebe1195ee215400 | [
"CC0-1.0"
] | null | null | null | core/views.py | ThalesLeal/appdjangoecommercee | 816f31117996c23d87d224f3eebe1195ee215400 | [
"CC0-1.0"
] | null | null | null | core/views.py | ThalesLeal/appdjangoecommercee | 816f31117996c23d87d224f3eebe1195ee215400 | [
"CC0-1.0"
] | null | null | null | # coding=utf-8
from django.shortcuts import render
from django.http import HttpResponse
from catalog.models import Category
def index(request):
return render(request, 'index.html')
def contact(request):
return render(request, 'contact.html')
| 17.333333 | 42 | 0.746154 | 34 | 260 | 5.705882 | 0.558824 | 0.103093 | 0.195876 | 0.268041 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004587 | 0.161538 | 260 | 14 | 43 | 18.571429 | 0.885321 | 0.046154 | 0 | 0 | 0 | 0 | 0.089796 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.285714 | false | 0 | 0.428571 | 0.285714 | 1 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 6 |
de11e416e5dfa9011e58bb04b5e5c2b813e0a7f8 | 27 | py | Python | lime/optimize/__init__.py | choderalab/gin | 9082431d8b664699a898c1e2fa490a18737d6e2d | [
"MIT"
] | 24 | 2019-07-20T22:37:09.000Z | 2021-07-07T07:13:56.000Z | lime/optimize/__init__.py | choderalab/gin | 9082431d8b664699a898c1e2fa490a18737d6e2d | [
"MIT"
] | 3 | 2021-05-10T05:29:59.000Z | 2022-02-10T00:15:05.000Z | lime/optimize/__init__.py | kuano-ai/gimlet | 9082431d8b664699a898c1e2fa490a18737d6e2d | [
"MIT"
] | 8 | 2019-08-09T17:30:20.000Z | 2021-12-01T13:27:46.000Z | import lime.optimize.dummy
| 13.5 | 26 | 0.851852 | 4 | 27 | 5.75 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.074074 | 27 | 1 | 27 | 27 | 0.92 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
de14f9a63c445f3b5fae02f97772c29f26fd8af4 | 34 | py | Python | newpy.py | sunilks/firstBinder | 0a91931dfdbf5fb89bc733b24b837af3fbb36b6c | [
"Apache-2.0"
] | null | null | null | newpy.py | sunilks/firstBinder | 0a91931dfdbf5fb89bc733b24b837af3fbb36b6c | [
"Apache-2.0"
] | null | null | null | newpy.py | sunilks/firstBinder | 0a91931dfdbf5fb89bc733b24b837af3fbb36b6c | [
"Apache-2.0"
] | null | null | null | print("Hello Sunil from Binder!")
| 17 | 33 | 0.735294 | 5 | 34 | 5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.117647 | 34 | 1 | 34 | 34 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0.705882 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
a9b03a3f2e25f99416f76c1c1a8e6960eab305c1 | 32 | py | Python | assemblyline/al_ui/helper/__init__.py | dendisuhubdy/grokmachine | 120a21a25c2730ed356739231ec8b99fc0575c8b | [
"BSD-3-Clause"
] | 46 | 2017-05-15T11:15:08.000Z | 2018-07-02T03:32:52.000Z | assemblyline/al_ui/helper/__init__.py | dendisuhubdy/grokmachine | 120a21a25c2730ed356739231ec8b99fc0575c8b | [
"BSD-3-Clause"
] | null | null | null | assemblyline/al_ui/helper/__init__.py | dendisuhubdy/grokmachine | 120a21a25c2730ed356739231ec8b99fc0575c8b | [
"BSD-3-Clause"
] | 24 | 2017-05-17T03:26:17.000Z | 2018-07-09T07:00:50.000Z | from al_ui.helper.core import *
| 16 | 31 | 0.78125 | 6 | 32 | 4 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.125 | 32 | 1 | 32 | 32 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
e72ea1d33552e9db55ae0bcc113fecaa325f5958 | 2,343 | py | Python | data_to_mongodb.py | Aldridgexia/sleepinfo | 7586561588633eda46f8bc15f5a00543fd7d7174 | [
"MIT"
] | null | null | null | data_to_mongodb.py | Aldridgexia/sleepinfo | 7586561588633eda46f8bc15f5a00543fd7d7174 | [
"MIT"
] | null | null | null | data_to_mongodb.py | Aldridgexia/sleepinfo | 7586561588633eda46f8bc15f5a00543fd7d7174 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
from datetime import *
import pymongo
client = pymongo.MongoClient()
db = client.sleepinfo
post = db.dayinfo
today = datetime.today()
dayinfo = {}
is_today = str(raw_input("Do you want to enter today's data: "))
if is_today in ['y', 'Y', 'yes', 'Yes']:
dayinfo['date'] = today
    # Read wake-up time
getup_hour = int(raw_input("Please enter get-up-hour: "))
getup_minute = int(raw_input("Please enter get-up-minute: "))
getup_time_actual = datetime(today.year, today.month, today.day, getup_hour, getup_minute, 0)
dayinfo['getup_time'] = getup_time_actual
print getup_time_actual
#读入入睡时间
bed_hour = int(raw_input("Please enter bed-hour: "))
bed_minute = int(raw_input("Please enter bed-minute: "))
if bed_hour < 12:
bedtime_actual = datetime(today.year, today.month, today.day, bed_hour, bed_minute, 0)
else:
    bedtime_actual = datetime(today.year, today.month, today.day, bed_hour, bed_minute, 0) - timedelta(days=1)
dayinfo['bedtime'] = bedtime_actual
print bedtime_actual
#读入mt time
mt_time_today = int(raw_input('Please enter mt time: '))
dayinfo['mt_time'] = mt_time_today
print mt_time_today
else:
data_date = str(raw_input("Please enter your data's date: "))
    data_date = datetime.strptime(data_date, '%Y-%m-%d')
dayinfo['date'] = data_date
#读入起床时间
getup_hour = int(raw_input("Please enter get-up-hour: "))
getup_minute = int(raw_input("Please enter get-up-minute: "))
getup_time_actual = datetime(data_date.year, data_date.month, data_date.day, getup_hour, getup_minute, 0)
dayinfo['getup_time'] = getup_time_actual
print getup_time_actual
#读入入睡时间
bed_hour = int(raw_input("Please enter bed-hour: "))
bed_minute = int(raw_input("Please enter bed-minute: "))
if bed_hour < 12:
bedtime_actual = datetime(data_date.year, data_date.month, data_date.day, bed_hour, bed_minute, 0)
else:
        bedtime_actual = datetime(data_date.year, data_date.month, data_date.day, bed_hour, bed_minute, 0) - timedelta(days=1)
dayinfo['bedtime'] = bedtime_actual
print bedtime_actual
#读入mt time
mt_time_today = int(raw_input('Please enter mt time: '))
dayinfo['mt_time'] = mt_time_today
print mt_time_today
#每日数据插入数据库
print dayinfo
post.insert_one(dayinfo)
print('Data Insert successfully!')
| 35.5 | 109 | 0.696116 | 347 | 2,343 | 4.45245 | 0.178674 | 0.067314 | 0.099676 | 0.135275 | 0.757929 | 0.757929 | 0.757929 | 0.757929 | 0.742395 | 0.711327 | 0 | 0.006757 | 0.178831 | 2,343 | 65 | 110 | 36.046154 | 0.796258 | 0.03073 | 0 | 0.55102 | 0 | 0 | 0.181698 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.040816 | null | null | 0.163265 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
e7550a03c4bea3896b0687d58e8b395e83de1aad | 135 | py | Python | pardal/authserver/run.py | anapaulagomes/pardal-python | d67ba1e0e67677320f37cab1288881ad7845612b | [
"MIT"
] | 2 | 2019-01-30T22:35:44.000Z | 2019-01-30T22:41:06.000Z | pardal/authserver/run.py | anapaulagomes/pardal-python | d67ba1e0e67677320f37cab1288881ad7845612b | [
"MIT"
] | null | null | null | pardal/authserver/run.py | anapaulagomes/pardal-python | d67ba1e0e67677320f37cab1288881ad7845612b | [
"MIT"
] | null | null | null | from pardal.authserver.app import app
from pardal.authserver import endpoints # noqa: F401
if __name__ == '__main__':
app.run()
| 19.285714 | 53 | 0.733333 | 18 | 135 | 5.055556 | 0.666667 | 0.21978 | 0.43956 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.026786 | 0.17037 | 135 | 6 | 54 | 22.5 | 0.785714 | 0.074074 | 0 | 0 | 0 | 0 | 0.065041 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
e792c6c645436c7a05772cecae4604d091c1da07 | 24 | py | Python | src/sound_lib/effects/__init__.py | Oire/TheQube | fcfd8a68b15948e0740642d635db24adef8cc314 | [
"MIT"
] | 21 | 2015-08-02T21:26:14.000Z | 2019-12-27T09:57:44.000Z | src/sound_lib/effects/__init__.py | Oire/TheQube | fcfd8a68b15948e0740642d635db24adef8cc314 | [
"MIT"
] | 34 | 2015-01-12T00:38:14.000Z | 2020-08-31T11:19:37.000Z | src/sound_lib/effects/__init__.py | Oire/TheQube | fcfd8a68b15948e0740642d635db24adef8cc314 | [
"MIT"
] | 15 | 2015-03-24T15:42:30.000Z | 2020-09-24T20:26:42.000Z | from tempo import Tempo
| 12 | 23 | 0.833333 | 4 | 24 | 5 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.166667 | 24 | 1 | 24 | 24 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
e7bfc296c5fead67e1e6480b5af208da378c999c | 38 | py | Python | python/file-tests/testTraceback3.py | CodeSteak/write-your-python-program | 13a10f1e8f4fe23c66a8762854c8bb12f0fb9ff4 | [
"BSD-3-Clause"
] | 1 | 2021-09-30T10:17:57.000Z | 2021-09-30T10:17:57.000Z | python/file-tests/testTraceback3.py | CodeSteak/write-your-python-program | 13a10f1e8f4fe23c66a8762854c8bb12f0fb9ff4 | [
"BSD-3-Clause"
] | 47 | 2020-11-16T14:02:52.000Z | 2022-03-18T12:44:38.000Z | python/file-tests/testTraceback3.py | CodeSteak/write-your-python-program | 13a10f1e8f4fe23c66a8762854c8bb12f0fb9ff4 | [
"BSD-3-Clause"
] | 4 | 2020-10-28T13:54:44.000Z | 2022-01-20T17:36:24.000Z | from wypp import *
print([1,2,3][10])
| 12.666667 | 18 | 0.631579 | 8 | 38 | 3 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.151515 | 0.131579 | 38 | 2 | 19 | 19 | 0.575758 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0.5 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 6 |
e7c957015197e6abe0e71e878f4eb17a778ad18d | 38 | py | Python | jinahub/indexers/storage/LMDBStorage/__init__.py | albertocarpentieri/executors | 3b025b6106fca9dba3c2569b0e60da050273fa6e | [
"Apache-2.0"
] | 29 | 2021-07-26T07:16:38.000Z | 2022-03-27T15:10:34.000Z | jinahub/indexers/storage/LMDBStorage/__init__.py | albertocarpentieri/executors | 3b025b6106fca9dba3c2569b0e60da050273fa6e | [
"Apache-2.0"
] | 176 | 2021-07-23T08:30:21.000Z | 2022-03-14T12:29:06.000Z | jinahub/indexers/storage/LMDBStorage/__init__.py | albertocarpentieri/executors | 3b025b6106fca9dba3c2569b0e60da050273fa6e | [
"Apache-2.0"
] | 16 | 2021-07-26T20:55:40.000Z | 2022-03-18T15:32:17.000Z | from .lmdb_storage import LMDBStorage
| 19 | 37 | 0.868421 | 5 | 38 | 6.4 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.105263 | 38 | 1 | 38 | 38 | 0.941176 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
e7da276e3dade22d7288072fde5a46b27de25651 | 109 | py | Python | python/atizer/__init__.py | tjgiese/atizer | b8cdb8f4bac7cedfb566d766acee5fe0cc7a7bd3 | [
"MIT"
] | null | null | null | python/atizer/__init__.py | tjgiese/atizer | b8cdb8f4bac7cedfb566d766acee5fe0cc7a7bd3 | [
"MIT"
] | null | null | null | python/atizer/__init__.py | tjgiese/atizer | b8cdb8f4bac7cedfb566d766acee5fe0cc7a7bd3 | [
"MIT"
] | null | null | null | #!/usr/bin/env python
#__all__ = [ "base" ]
from .base import *
from .m4 import *
from .licenses import *
| 12.111111 | 23 | 0.642202 | 15 | 109 | 4.4 | 0.666667 | 0.30303 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.011494 | 0.201835 | 109 | 8 | 24 | 13.625 | 0.747126 | 0.366972 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
8242ded30f2d252bec9693139738875a4c8f99c4 | 121 | py | Python | python/13_regex_and_parsing/15_validatingcreditcardnumbers.py | jaimiles23/hacker_rank | 0580eac82e5d0989afabb5c2e66faf09713f891b | [
"Apache-2.0"
] | null | null | null | python/13_regex_and_parsing/15_validatingcreditcardnumbers.py | jaimiles23/hacker_rank | 0580eac82e5d0989afabb5c2e66faf09713f891b | [
"Apache-2.0"
] | null | null | null | python/13_regex_and_parsing/15_validatingcreditcardnumbers.py | jaimiles23/hacker_rank | 0580eac82e5d0989afabb5c2e66faf09713f891b | [
"Apache-2.0"
] | 3 | 2021-09-22T11:06:58.000Z | 2022-01-25T09:29:24.000Z | Solution to [Validating Credit Card Numbers](https://www.hackerrank.com/challenges/validating-credit-card-number/problem) | 121 | 121 | 0.834711 | 16 | 121 | 6.3125 | 0.8125 | 0.316832 | 0.39604 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.041322 | 121 | 1 | 121 | 121 | 0.87069 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
68b722972d50199855d95fa8c8cc5307c9726c12 | 34 | py | Python | reproject/healpix/__init__.py | williyamshoe/reproject | 6d9b69a8b7965f4223749ca761703c33d051a9ac | [
"BSD-3-Clause"
] | 39 | 2019-05-11T19:23:33.000Z | 2022-02-28T12:25:37.000Z | reproject/healpix/__init__.py | williyamshoe/reproject | 6d9b69a8b7965f4223749ca761703c33d051a9ac | [
"BSD-3-Clause"
] | 121 | 2018-12-06T16:36:05.000Z | 2022-03-31T23:52:40.000Z | reproject/healpix/__init__.py | williyamshoe/reproject | 6d9b69a8b7965f4223749ca761703c33d051a9ac | [
"BSD-3-Clause"
] | 17 | 2018-12-05T04:14:48.000Z | 2021-12-09T22:29:54.000Z | from .high_level import * # noqa
| 17 | 33 | 0.705882 | 5 | 34 | 4.6 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.205882 | 34 | 1 | 34 | 34 | 0.851852 | 0.117647 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
6b60a3a3cf7bb03244f9dfc289589613b6beb418 | 45 | py | Python | fnote/blueprints/note/__init__.py | carlegbert/wholenote | 710b6d9d90b35e6f35ac66e9312ae91fbb2ddfd3 | [
"MIT"
] | 1 | 2017-04-27T08:58:24.000Z | 2017-04-27T08:58:24.000Z | fnote/blueprints/note/__init__.py | carlegbert/wholenote | 710b6d9d90b35e6f35ac66e9312ae91fbb2ddfd3 | [
"MIT"
] | null | null | null | fnote/blueprints/note/__init__.py | carlegbert/wholenote | 710b6d9d90b35e6f35ac66e9312ae91fbb2ddfd3 | [
"MIT"
] | null | null | null | from fnote.blueprints.note.views import note
| 22.5 | 44 | 0.844444 | 7 | 45 | 5.428571 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.088889 | 45 | 1 | 45 | 45 | 0.926829 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 6 |
6b6b7b4ca6a18f84eda289ea0b26d2108a596982 | 165 | py | Python | MetioTube/main_app/templatetags/split_get_first_filter.py | Sheko1/MetioTube | c1c36d00ea46fc37cc7f3c0c9c0cae6e89b2113c | [
"MIT"
] | null | null | null | MetioTube/main_app/templatetags/split_get_first_filter.py | Sheko1/MetioTube | c1c36d00ea46fc37cc7f3c0c9c0cae6e89b2113c | [
"MIT"
] | null | null | null | MetioTube/main_app/templatetags/split_get_first_filter.py | Sheko1/MetioTube | c1c36d00ea46fc37cc7f3c0c9c0cae6e89b2113c | [
"MIT"
] | null | null | null | from django.template import Library
register = Library()
@register.filter(name='split_get_first')
def split_get_first(value, sep):
return value.split(sep)[0]
| 18.333333 | 40 | 0.757576 | 24 | 165 | 5.041667 | 0.666667 | 0.247934 | 0.214876 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.006897 | 0.121212 | 165 | 8 | 41 | 20.625 | 0.827586 | 0 | 0 | 0 | 0 | 0 | 0.090909 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.2 | 0.2 | 0.6 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 6 |
6be355a4204210661a0b7b8b59048638ae41fbe1 | 24 | py | Python | webserver/__init__.py | mdatsev/webserver | 325339596971b0f3f6a273d7dc6b39353f5661ef | [
"Unlicense"
] | null | null | null | webserver/__init__.py | mdatsev/webserver | 325339596971b0f3f6a273d7dc6b39353f5661ef | [
"Unlicense"
] | null | null | null | webserver/__init__.py | mdatsev/webserver | 325339596971b0f3f6a273d7dc6b39353f5661ef | [
"Unlicense"
] | null | null | null | from .server import main | 24 | 24 | 0.833333 | 4 | 24 | 5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.125 | 24 | 1 | 24 | 24 | 0.952381 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
6bf2b939cfe40bb456e2fe6b1dafa78ad1cda34c | 8,590 | py | Python | effmass/dos.py | brlec/effmass | cd0add6a40a16ad8f0250f8e5e7a2e07c284a497 | [
"MIT"
] | 34 | 2018-10-05T14:52:42.000Z | 2022-03-29T14:23:11.000Z | effmass/dos.py | brlec/effmass | cd0add6a40a16ad8f0250f8e5e7a2e07c284a497 | [
"MIT"
] | 43 | 2018-07-06T16:31:27.000Z | 2022-03-29T11:00:31.000Z | effmass/dos.py | brlec/effmass | cd0add6a40a16ad8f0250f8e5e7a2e07c284a497 | [
"MIT"
] | 31 | 2018-06-26T12:10:00.000Z | 2022-03-29T14:23:12.000Z | #! /usr/bin/env python3
"""
A module for analysing DOSCAR data.
"""
import numpy as np
def _check_integrated_dos_loaded(Data):
"""Helper function to check if :attr:`~effmass.inputs.Data.integrated_dos`
is loaded.
Args:
Data (Data): Instance of the :class:`Data` class.
Returns:
None.
"""
assert Data.integrated_dos != [], "Data.integrated_dos is empty. Please set attribute, perhaps using Data.parse_DOSCAR, and try again."
return
def _check_dos_loaded(Data):
"""Helper function to check if :attr:`~effmass.inputs.Data.dos` is loaded.
Args:
Data (Data): Instance of the :class:`Data` class.
Returns:
None.
"""
    assert Data.dos != [], "Data.dos is empty. Please set attribute, perhaps using Data.parse_DOSCAR, and try again."
return
def find_dos_VBM_index(Data):
"""Finds the lowest index of the
:attr:`~effmass.inputs.Data.integrated_dos` array where the energy exceeds
:attr:`~effmass.inputs.Data.VBM`.
Args:
Data (Data): Instance of the Data class.
Returns:
int: the lowest index of the :attr:`~effmass.inputs.Data.integrated_dos` array where the energy exceeds :attr:`~effmass.inputs.Data.VBM`.
"""
_check_integrated_dos_loaded(Data)
for i in range(len(Data.integrated_dos)):
if Data.VBM < Data.integrated_dos[i][0]:
if Data.dos[i][1] == 0.0:
return i
def find_dos_CBM_index(Data):
"""Finds the highest index of the
:attr:`~effmass.inputs.Data.integrated_dos` array where the energy is less
than :attr:`~effmass.inputs.Data.CBM`.
Args:
Data (Data): Instance of the Data class.
Returns:
int: the highest index of the :attr:`~effmass.inputs.Data.integrated_dos` array where the energy is less than :attr:`~effmass.inputs.Data.CBM`.
"""
_check_integrated_dos_loaded(Data)
for i in range(len(Data.integrated_dos))[::-1]:
if Data.CBM > Data.integrated_dos[i][0]:
if Data.dos[i][1] == 0.0:
return i
def electron_fill_level(Data, volume, concentration, CBM_index):
r"""
Finds the energy to which a given electron concentration will fill the density of states in :attr:`~effmass.inputs.Data.integrated_dos`.
Uses linear interpolation to estimate the energy between two points given in the DOSCAR.
Args:
Data (Data): Instance of the :class:`Data` class.
volume (float): volume of the unit cell in angstrom :math:`^3`.
concentration (float): electron concentration in cm :math:`^{-3}`.
CBM_index (int): highest index of the :attr:`~effmass.inputs.Data.integrated_dos` array where the energy is less than :attr:`~effmass.inputs.Data.CBM`.
Returns:
float: the energy (eV, referenced from the CBM) to which the electrons will fill. For the case where the concentration specified would fill all states specified by :attr:`~effmass.inputs.Data.integrated_dos`, None is returned.
Notes:
The precision of the result will depend upon the energy resolution in the DOSCAR.
"""
_check_integrated_dos_loaded(Data)
states_per_unit_cell = volume * 1E-30 * concentration * 1E6
assert (
states_per_unit_cell < np.absolute(Data.integrated_dos[-1][1] -
Data.integrated_dos[CBM_index][1])
), "the concentration specified would fill all available energy states"
upper_index = len(Data.integrated_dos) - 1
lower_index = CBM_index
# this function is made a little more complicated because the dos can be a step function
# therefore, we cannot interpolate between consecutive indices, but need to find the range of the step.
for i in range(CBM_index + 1, len(Data.integrated_dos)):
if states_per_unit_cell < np.absolute(
Data.integrated_dos[i][1] - Data.integrated_dos[CBM_index][1]):
upper_index = i # marks the maximum energy for this concentration
break
for i in range(1, upper_index - CBM_index):
if Data.integrated_dos[upper_index -
1][1] - Data.integrated_dos[upper_index - 1 -
i][1] != 0:
lower_index = upper_index - i # marks the minimum energy for this concentration
break
# linear interpolation
proportion = (states_per_unit_cell - (Data.integrated_dos[lower_index][1] -
Data.integrated_dos[CBM_index][1])
) / np.absolute(Data.integrated_dos[upper_index][1] -
(Data.integrated_dos[lower_index][1]))
energy = ((Data.integrated_dos[lower_index][0] -
Data.integrated_dos[CBM_index][0]) +
(Data.integrated_dos[upper_index][0] -
Data.integrated_dos[lower_index][0]) * proportion)
# where lower index has been calculated to be below the CBM, as the concentration is smaller than the first step so set lower energy equal to CBM (0eV)
if Data.integrated_dos[lower_index][0] - Data.integrated_dos[CBM_index][0] < 0:
energy = (Data.integrated_dos[upper_index][0] -
Data.integrated_dos[CBM_index][0]) * proportion
return energy
def hole_fill_level(Data, volume, concentration, VBM_index):
r"""
Finds the energy to which a given hole concentration will fill the density of states in :attr:`~effmass.inputs.Data.integrated_dos`.
Uses linear interpolation to estimate the energy between two points given in the DOSCAR.
Args:
Data (Data): Instance of the :class:`Data` class.
volume (float): volume of the unit cell in angstrom :math:`^3`.
concentration: hole concentration in cm :math:`^{-3}`.
VBM_index (int): lowest index of the :attr:`~effmass.inputs.Data.integrated_dos` array where the energy is more than than :attr:`~effmass.inputs.Data.VBM`.
Returns:
float: the energy (eV, referenced from the VBM) to which the holes will fill. For the case where the concentration specified would fill all states specified by :attr:`~effmass.inputs.Data.integrated_dos`, None is returned.
Notes:
The precision of the result will depend upon the energy resolution in the DOSCAR.
"""
_check_integrated_dos_loaded(Data)
states_per_unit_cell = volume * 1E-30 * concentration * 1E6
assert (
states_per_unit_cell < np.absolute(Data.integrated_dos[0][1] -
Data.integrated_dos[VBM_index][1])
), "the concentration specified would fill all available energy states"
upper_index = 0
lower_index = VBM_index
# this function is made a little more complicated because the dos can be a step function
# therefore, we cannot interpolate between consecutive indices, but need to find the range of the step.
for i in range(0, VBM_index)[::-1]:
if states_per_unit_cell < np.absolute(
Data.integrated_dos[i][1] -
Data.integrated_dos[VBM_index - 1][1]):
upper_index = i # marks the maximum hole energy for this concentration
break
for i in range(1, VBM_index):
if Data.integrated_dos[upper_index +
1][1] - Data.integrated_dos[upper_index + 1 +
i][1] != 0:
lower_index = upper_index + i # marks the minimum hole energy for this concentration
break
# linear interpolation
proportion = (states_per_unit_cell -
np.absolute(Data.integrated_dos[lower_index][1] -
Data.integrated_dos[VBM_index][1])
) / np.absolute(Data.integrated_dos[upper_index][1] -
(Data.integrated_dos[lower_index][1]))
energy = ((Data.integrated_dos[lower_index][0] -
Data.integrated_dos[VBM_index][0]) -
(Data.integrated_dos[lower_index][0] -
Data.integrated_dos[upper_index][0]) * proportion)
# where lower index has been calculated to be above the CBM, as the concentration is smaller than the first step and so set lower energy equal to VBM (0eV)
if Data.integrated_dos[lower_index][0] - Data.integrated_dos[VBM_index][0] > 0:
energy = (Data.integrated_dos[upper_index][0] -
Data.integrated_dos[VBM_index][0]) * proportion
return energy
| 42.95 | 234 | 0.644936 | 1,163 | 8,590 | 4.615649 | 0.128117 | 0.147727 | 0.177347 | 0.070417 | 0.931818 | 0.887109 | 0.875186 | 0.853018 | 0.832899 | 0.785209 | 0 | 0.012606 | 0.261234 | 8,590 | 199 | 235 | 43.165829 | 0.833281 | 0.444936 | 0 | 0.449438 | 0 | 0 | 0.070795 | 0 | 0 | 0 | 0 | 0 | 0.044944 | 1 | 0.067416 | false | 0 | 0.011236 | 0 | 0.146067 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
2e0eb3e018651e30a0b6a06f422aa283bac1235c | 2,308 | py | Python | fb-spambot.py | mah-hacker/Facebook-Spambot | 7898387e6d7028977d52b05bf0499a8592f3232f | [
"MIT"
] | null | null | null | fb-spambot.py | mah-hacker/Facebook-Spambot | 7898387e6d7028977d52b05bf0499a8592f3232f | [
"MIT"
] | null | null | null | fb-spambot.py | mah-hacker/Facebook-Spambot | 7898387e6d7028977d52b05bf0499a8592f3232f | [
"MIT"
] | null | null | null | from selenium import webdriver
import time
from getpass import getpass
print('_____ _ ____ _____ ____ ___ ___ _ __ ____ ____ _ ___ ___ ____ ___ _____')
print('| ___/ \ / ___| ____| __ ) / _ \ / _ \| |/ / / ___|| _ \ / \ | \/ | | __ ) / _ \_ _|')
print("| |_ / _ \| | | _| | _ \| | | | | | | ' / \___ \| |_) / _ \ | |\/| | | _ \| | | || |")
print('| _/ ___ \ |___| |___| |_) | |_| | |_| | . \ ___) | __/ ___ \| | | | | |_) | |_| || |')
print('|_|/_/ \_\____|_____|____/ \___/ \___/|_|\_\ |____/|_| /_/ \_\_| |_| |____/ \___/ |_|')
print("\n")
username = input("Enter in your username: ")
password = getpass("Enter your password: ")
friend = input("Enter your friend name to message: ")
message = input("Enter your message to your friend: ")
mah = webdriver.Chrome("PASTE YOUR WEBDRIVER PATH HERE") #Example ("E:\WebDrivers\chromedriver.exe")
mah.implicitly_wait(15)
mah.get('https://www.facebook.com/')
username_textbox = mah.find_element_by_name("email")
username_textbox.send_keys(username)
password_textbox = mah.find_element_by_name("pass")
password_textbox.send_keys(password)
time.sleep(2)
login = mah.find_element_by_name("login")
login.click()
time.sleep(5)
msg = mah.find_element_by_xpath("/html/body/div[1]/div/div[1]/div[1]/div[2]/div[4]/div[1]/div[2]/span/div/div[1]")
msg.click()
time.sleep(3)
search = mah.find_element_by_xpath('/html/body/div[1]/div/div[1]/div[1]/div[2]/div[4]/div[2]/div/div[1]/div[1]/div[1]/div/div/div/div/div/div/div[1]/div/div[1]/div[1]/div[2]/div[1]/div/div/div/label/input')
search.send_keys(friend)
time.sleep(5)
select = mah.find_element_by_xpath("/html/body/div[1]/div/div[1]/div[1]/div[2]/div[4]/div[2]/div/div[1]/div[1]/div[2]/div/div/div[1]/div[1]/div/div/div[1]/ul/div[1]/li[1]")
select.click()
i=0
while i<= 1:
enter = mah.find_element_by_xpath("/html/body/div[1]/div/div[1]/div[1]/div[5]/div[1]/div[1]/div[1]/div/div/div/div/div/div/div[2]/div/div[2]/form/div/div[3]/div[2]/div[1]/div/div/div/div/div/div/div/div/div")
enter.send_keys(message)
send = mah.find_element_by_xpath("/html/body/div[1]/div/div[1]/div[1]/div[5]/div[1]/div[1]/div[1]/div/div/div/div/div/div/div[2]/div/div[2]/form/div/div[3]/span[2]/div")
send.click()
| 44.384615 | 211 | 0.611785 | 345 | 2,308 | 3.556522 | 0.194203 | 0.220049 | 0.193969 | 0.176039 | 0.496333 | 0.455583 | 0.409128 | 0.383863 | 0.375713 | 0.355338 | 0 | 0.034219 | 0.151646 | 2,308 | 51 | 212 | 45.254902 | 0.592441 | 0.018198 | 0 | 0.052632 | 0 | 0.236842 | 0.602529 | 0.309395 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.105263 | 0.078947 | 0 | 0.078947 | 0.157895 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 6 |
2e6cbf3ee9e5e6dfa9c02c2c8cbd4c9dc6832bd3 | 24,915 | py | Python | tests/test_predator_prey_communication.py | Leonardo767/Abmarl | 9fada5447b09174c6a70b6032b4a8d08b66c4589 | [
"Apache-2.0"
] | null | null | null | tests/test_predator_prey_communication.py | Leonardo767/Abmarl | 9fada5447b09174c6a70b6032b4a8d08b66c4589 | [
"Apache-2.0"
] | null | null | null | tests/test_predator_prey_communication.py | Leonardo767/Abmarl | 9fada5447b09174c6a70b6032b4a8d08b66c4589 | [
"Apache-2.0"
] | null | null | null | import numpy as np
from abmarl.sim.predator_prey import PredatorPreySimulation, Predator, Prey
from abmarl.sim.wrappers import CommunicationHandshakeWrapper
def test_communication():
np.random.seed(24)
agents = [
Predator(id='predator0', view=1, attack=1),
Predator(id='predator1', view=8, attack=0),
Prey(id='prey1', view=4),
Prey(id='prey2', view=5)
]
sim = PredatorPreySimulation.build(
{'agents': agents, 'observation_mode': PredatorPreySimulation.ObservationMode.DISTANCE}
)
sim = CommunicationHandshakeWrapper(sim)
sim.reset()
sim.sim.agents['prey1'].position = np.array([1, 1])
sim.sim.agents['prey2'].position = np.array([1, 4])
sim.sim.agents['predator0'].position = np.array([2, 3])
sim.sim.agents['predator1'].position = np.array([0, 7])
np.testing.assert_array_equal(
sim.get_obs('predator0')['obs']['predator1'], np.array([0, 0, 0])
)
np.testing.assert_array_equal(
sim.get_obs('predator0')['obs']['prey1'], np.array([0, 0, 0])
)
np.testing.assert_array_equal(
sim.get_obs('predator0')['obs']['prey2'], np.array([-1, 1, 1])
)
assert sim.get_obs('predator0')['message_buffer'] == \
{'predator1': False, 'prey1': False, 'prey2': False}
np.testing.assert_array_equal(
sim.get_obs('predator1')['obs']['predator0'], np.array([2, -4, 2])
)
np.testing.assert_array_equal(
sim.get_obs('predator1')['obs']['prey1'], np.array([1, -6, 1])
)
np.testing.assert_array_equal(
sim.get_obs('predator1')['obs']['prey2'], np.array([1, -3, 1])
)
assert sim.get_obs('predator1')['message_buffer'] == \
{'predator0': False, 'prey1': False, 'prey2': False}
np.testing.assert_array_equal(
sim.get_obs('prey1')['obs']['predator0'], np.array([1, 2, 2])
)
np.testing.assert_array_equal(
sim.get_obs('prey1')['obs']['predator1'], np.array([0, 0, 0])
)
np.testing.assert_array_equal(
sim.get_obs('prey1')['obs']['prey2'], np.array([0, 3, 1])
)
assert sim.get_obs('prey1')['message_buffer'] == \
{'predator0': False, 'predator1': False, 'prey2': False}
np.testing.assert_array_equal(
sim.get_obs('prey2')['obs']['predator0'], np.array([1, -1, 2])
)
np.testing.assert_array_equal(
sim.get_obs('prey2')['obs']['predator1'], np.array([-1, 3, 2])
)
np.testing.assert_array_equal(
sim.get_obs('prey2')['obs']['prey1'], np.array([0, -3, 1])
)
assert sim.get_obs('prey2')['message_buffer'] == \
{'predator0': False, 'predator1': False, 'prey1': False}
action1 = {
'predator0': {
'action': {'move': np.zeros(2), 'attack': 1},
'send': {'predator1': False, 'prey1': False, 'prey2': False},
'receive': {'predator1': False, 'prey1': False, 'prey2': True},
},
'predator1': {
'action': {'move': np.zeros(2), 'attack': 0},
'send': {'predator0': True, 'prey1': False, 'prey2': False},
'receive': {'predator0': True, 'prey1': True, 'prey2': True},
},
'prey1': {
'action': np.array([-1, 0]),
'send': {'predator0': False, 'predator1': False, 'prey2': False},
'receive': {'predator0': False, 'predator1': False, 'prey2': False},
},
'prey2': {
'action': np.array([0, 1]),
'send': {'predator0': False, 'predator1': False, 'prey1': True},
'receive': {'predator0': True, 'predator1': True, 'prey1': True},
}
}
sim.step(action1)
np.testing.assert_array_equal(
sim.get_obs('predator0')['obs']['predator1'], np.array([0, 0, 0])
)
np.testing.assert_array_equal(
sim.get_obs('predator0')['obs']['prey1'], np.array([0, 0, 0])
)
np.testing.assert_array_equal(
sim.get_obs('predator0')['obs']['prey2'], np.array([0, 0, 0])
)
assert sim.get_obs('predator0')['message_buffer'] == \
{'predator1': True, 'prey1': False, 'prey2': False}
assert sim.get_reward('predator0') == 100
assert not sim.get_done('predator0')
np.testing.assert_array_equal(
sim.get_obs('predator1')['obs']['predator0'], np.array([2, -4, 2])
)
np.testing.assert_array_equal(
sim.get_obs('predator1')['obs']['prey1'], np.array([0, -6, 1])
)
np.testing.assert_array_equal(
sim.get_obs('predator1')['obs']['prey2'], np.array([0, 0, 0])
)
assert sim.get_obs('predator1')['message_buffer'] == \
{'predator0': False, 'prey1': False, 'prey2': False}
assert sim.get_reward('predator1') == 0
assert not sim.get_done('predator1')
np.testing.assert_array_equal(
sim.get_obs('prey1')['obs']['predator0'], np.array([2, 2, 2])
)
np.testing.assert_array_equal(
sim.get_obs('prey1')['obs']['predator1'], np.array([0, 0, 0])
)
np.testing.assert_array_equal(
sim.get_obs('prey1')['obs']['prey2'], np.array([0, 0, 0])
)
assert sim.get_obs('prey1')['message_buffer'] == \
{'predator0': False, 'predator1': False, 'prey2': True}
assert sim.get_reward('prey1') == -1
assert not sim.get_done('prey1')
np.testing.assert_array_equal(
sim.get_obs('prey2')['obs']['predator0'], np.array([1, -1, 2])
)
np.testing.assert_array_equal(
sim.get_obs('prey2')['obs']['predator1'], np.array([-1, 3, 2])
)
np.testing.assert_array_equal(
sim.get_obs('prey2')['obs']['prey1'], np.array([-1, -3, 1])
)
assert sim.get_obs('prey2')['message_buffer'] == \
{'predator0': False, 'predator1': False, 'prey1': False}
assert sim.get_reward('prey2') == -100
assert sim.get_done('prey2')
assert not sim.get_all_done()
action2 = {
'predator0': {
'action': {'move': np.array([-1, 0]), 'attack': 0},
'send': {'predator1': False, 'prey1': False, 'prey2': False},
'receive': {'predator1': False, 'prey1': False, 'prey2': False}
},
'predator1': {
'action': {'move': np.array([1, -1]), 'attack': 0},
'send': {'predator0': True, 'prey1': False, 'prey2': False},
'receive': {'predator0': True, 'prey1': True, 'prey2': False}
},
'prey1': {
'action': np.array([1, -1]),
'send': {'predator0': False, 'predator1': False, 'prey2': True},
'receive': {'predator0': False, 'predator1': False, 'prey2': True}
},
}
sim.step(action2)
np.testing.assert_array_equal(
sim.get_obs('predator0')['obs']['predator1'], np.array([0, 0, 0])
)
np.testing.assert_array_equal(
sim.get_obs('predator0')['obs']['prey1'], np.array([0, 0, 0])
)
np.testing.assert_array_equal(
sim.get_obs('predator0')['obs']['prey2'], np.array([0, 0, 0])
)
assert sim.get_obs('predator0')['message_buffer'] == \
{'predator1': True, 'prey1': False, 'prey2': False}
assert sim.get_reward('predator0') == -1
assert not sim.get_done('predator0')
np.testing.assert_array_equal(
sim.get_obs('predator1')['obs']['predator0'], np.array([0, -3, 2])
)
np.testing.assert_array_equal(
sim.get_obs('predator1')['obs']['prey1'], np.array([0, -6, 1])
)
np.testing.assert_array_equal(
sim.get_obs('predator1')['obs']['prey2'], np.array([0, 0, 0])
)
assert sim.get_obs('predator1')['message_buffer'] == \
{'predator0': False, 'prey1': False, 'prey2': False}
assert sim.get_reward('predator1') == -1
assert not sim.get_done('predator1')
np.testing.assert_array_equal(
sim.get_obs('prey1')['obs']['predator0'], np.array([0, 3, 2])
)
np.testing.assert_array_equal(
sim.get_obs('prey1')['obs']['predator1'], np.array([0, 0, 0])
)
np.testing.assert_array_equal(
sim.get_obs('prey1')['obs']['prey2'], np.array([0, 0, 0])
)
assert sim.get_obs('prey1')['message_buffer'] == \
{'predator0': False, 'predator1': False, 'prey2': False}
assert sim.get_reward('prey1') == -1
assert not sim.get_done('prey1')
action3 = {
'predator0': {
'action': {'move': np.array([0, -1]), 'attack': 0},
'send': {'predator1': False, 'prey1': False, 'prey2': False},
'receive': {'predator1': False, 'prey1': False, 'prey2': False},
},
'predator1': {
'action': {'move': np.array([1, -1]), 'attack': 0},
'send': {'predator0': True, 'prey1': False, 'prey2': False},
            'receive': {'predator0': True, 'prey1': True, 'prey2': False}
},
'prey1': {
'action': np.array([1, 0]),
'send': {'predator0': False, 'predator1': False, 'prey2': True},
'receive': {'predator0': False, 'predator1': False, 'prey2': True}
}
}
sim.step(action3)
np.testing.assert_array_equal(
sim.get_obs('predator0')['obs']['predator1'], np.array([0, 0, 0])
)
np.testing.assert_array_equal(
sim.get_obs('predator0')['obs']['prey1'], np.array([0, 0, 0])
)
np.testing.assert_array_equal(
sim.get_obs('predator0')['obs']['prey2'], np.array([0, 0, 0])
)
assert sim.get_obs('predator0')['message_buffer'] == \
{'predator1': True, 'prey1': False, 'prey2': False}
assert sim.get_reward('predator0') == -1
assert not sim.get_done('predator0')
np.testing.assert_array_equal(
sim.get_obs('predator1')['obs']['predator0'], np.array([-1, -3, 2])
)
np.testing.assert_array_equal(
sim.get_obs('predator1')['obs']['prey1'], np.array([0, -5, 1])
)
np.testing.assert_array_equal(
sim.get_obs('predator1')['obs']['prey2'], np.array([0, 0, 0])
)
assert sim.get_obs('predator1')['message_buffer'] == \
{'predator0': False, 'prey1': False, 'prey2': False}
assert sim.get_reward('predator1') == -1
assert not sim.get_done('predator1')
np.testing.assert_array_equal(
sim.get_obs('prey1')['obs']['predator0'], np.array([-1, 2, 2])
)
np.testing.assert_array_equal(
sim.get_obs('prey1')['obs']['predator1'], np.array([0, 0, 0])
)
np.testing.assert_array_equal(
sim.get_obs('prey1')['obs']['prey2'], np.array([0, 0, 0])
)
assert sim.get_obs('prey1')['message_buffer'] == \
{'predator0': False, 'predator1': False, 'prey2': False}
assert sim.get_reward('prey1') == -1
assert not sim.get_done('prey1')
action4 = {
'predator0': {
'action': {'move': np.array([0, 0]), 'attack': 1},
'send': {'predator1': False, 'prey1': False, 'prey2': False},
'receive': {'predator1': False, 'prey1': True, 'prey2': False},
},
'predator1': {
'action': {'move': np.array([1, -1]), 'attack': 0},
'send': {'predator0': True, 'prey1': False, 'prey2': False},
'receive': {'predator0': True, 'prey1': True, 'prey2': False}
},
'prey1': {
'action': np.array([1, 0]),
'send': {'predator0': False, 'predator1': False, 'prey2': False},
'receive': {'predator0': True, 'predator1': True, 'prey2': True}
},
}
sim.step(action4)
np.testing.assert_array_equal(
sim.get_obs('predator0')['obs']['predator1'], np.array([0, 0, 0])
)
np.testing.assert_array_equal(
sim.get_obs('predator0')['obs']['prey1'], np.array([0, 0, 0])
)
np.testing.assert_array_equal(
sim.get_obs('predator0')['obs']['prey2'], np.array([0, 0, 0])
)
assert sim.get_obs('predator0')['message_buffer'] == \
{'predator1': True, 'prey1': False, 'prey2': False}
assert sim.get_reward('predator0') == -10
assert not sim.get_done('predator0')
np.testing.assert_array_equal(
sim.get_obs('predator1')['obs']['predator0'], np.array([-2, -2, 2])
)
np.testing.assert_array_equal(
sim.get_obs('predator1')['obs']['prey1'], np.array([0, -4, 1])
)
np.testing.assert_array_equal(
sim.get_obs('predator1')['obs']['prey2'], np.array([0, 0, 0])
)
assert sim.get_obs('predator1')['message_buffer'] == \
{'predator0': False, 'prey1': False, 'prey2': False}
assert sim.get_reward('predator1') == -1
assert not sim.get_done('predator1')
np.testing.assert_array_equal(
sim.get_obs('prey1')['obs']['predator0'], np.array([-2, 2, 2])
)
np.testing.assert_array_equal(
sim.get_obs('prey1')['obs']['predator1'], np.array([0, 4, 2])
)
np.testing.assert_array_equal(
sim.get_obs('prey1')['obs']['prey2'], np.array([0, 0, 0])
)
assert sim.get_obs('prey1')['message_buffer'] == \
{'predator0': False, 'predator1': False, 'prey2': False}
assert sim.get_reward('prey1') == -1
assert not sim.get_done('prey1')
action5 = {
'predator0': {
'action': {'move': np.zeros(2), 'attack': 0},
'send': {'predator1': True, 'prey1': False, 'prey2': False},
'receive': {'predator1': True, 'prey1': False, 'prey2': False},
},
'predator1': {
'action': {'move': np.array([1, 0]), 'attack': 0},
'send': {'predator0': True, 'prey1': False, 'prey2': False},
'receive': {'predator0': True, 'prey1': True, 'prey2': False},
},
'prey1': {
'action': np.array([0, -1]),
'send': {'predator0': False, 'predator1': False, 'prey2': False},
'receive': {'predator0': False, 'predator1': False, 'prey2': False},
},
}
sim.step(action5)
np.testing.assert_array_equal(
sim.get_obs('predator0')['obs']['predator1'], np.array([3, 2, 2])
)
np.testing.assert_array_equal(
sim.get_obs('predator0')['obs']['prey1'], np.array([2, -2, 1])
)
np.testing.assert_array_equal(
sim.get_obs('predator0')['obs']['prey2'], np.array([0, 0, 0])
)
assert sim.get_obs('predator0')['message_buffer'] == \
{'predator1': True, 'prey1': False, 'prey2': False}
assert sim.get_reward('predator0') == 0
assert not sim.get_done('predator0')
np.testing.assert_array_equal(
sim.get_obs('predator1')['obs']['predator0'], np.array([-3, -2, 2])
)
np.testing.assert_array_equal(
sim.get_obs('predator1')['obs']['prey1'], np.array([-1, -4, 1])
)
np.testing.assert_array_equal(
sim.get_obs('predator1')['obs']['prey2'], np.array([0, 0, 0])
)
assert sim.get_obs('predator1')['message_buffer'] == \
{'predator0': True, 'prey1': False, 'prey2': False}
assert sim.get_reward('predator1') == -1
assert not sim.get_done('predator1')
np.testing.assert_array_equal(
sim.get_obs('prey1')['obs']['predator0'], np.array([-2, 2, 2])
)
np.testing.assert_array_equal(
sim.get_obs('prey1')['obs']['predator1'], np.array([1, 4, 2])
)
np.testing.assert_array_equal(
sim.get_obs('prey1')['obs']['prey2'], np.array([0, 0, 0])
)
assert sim.get_obs('prey1')['message_buffer'] == \
{'predator0': False, 'predator1': False, 'prey2': False}
assert sim.get_reward('prey1') == -10
assert not sim.get_done('prey1')
action6 = {
'predator0': {
'action': {'move': np.array([1, 0]), 'attack': 0},
'send': {'predator1': True, 'prey1': False, 'prey2': False},
'receive': {'predator1': True, 'prey1': False, 'prey2': False}
},
'predator1': {
'action': {'move': np.array([1, 0]), 'attack': 0},
'send': {'predator0': True, 'prey1': False, 'prey2': False},
'receive': {'predator0': True, 'prey1': True, 'prey2': False}
},
'prey1': {
'action': np.array([1, 1]),
'send': {'predator0': False, 'predator1': False, 'prey2': False},
'receive': {'predator0': False, 'predator1': False, 'prey2': False},
}
}
sim.step(action6)
np.testing.assert_array_equal(
sim.get_obs('predator0')['obs']['predator1'], np.array([3, 2, 2])
)
np.testing.assert_array_equal(
sim.get_obs('predator0')['obs']['prey1'], np.array([2, -1, 1])
)
np.testing.assert_array_equal(
sim.get_obs('predator0')['obs']['prey2'], np.array([0, 0, 0])
)
assert sim.get_obs('predator0')['message_buffer'] == \
{'predator1': True, 'prey1': False, 'prey2': False}
assert sim.get_reward('predator0') == -1
assert not sim.get_done('predator0')
np.testing.assert_array_equal(
sim.get_obs('predator1')['obs']['predator0'], np.array([-3, -2, 2])
)
np.testing.assert_array_equal(
sim.get_obs('predator1')['obs']['prey1'], np.array([-1, -3, 1])
)
np.testing.assert_array_equal(
sim.get_obs('predator1')['obs']['prey2'], np.array([0, 0, 0])
)
assert sim.get_obs('predator1')['message_buffer'] == \
{'predator0': True, 'prey1': False, 'prey2': False}
assert sim.get_reward('predator1') == -1
assert not sim.get_done('predator1')
np.testing.assert_array_equal(
sim.get_obs('prey1')['obs']['predator0'], np.array([-2, 1, 2])
)
np.testing.assert_array_equal(
sim.get_obs('prey1')['obs']['predator1'], np.array([1, 3, 2])
)
np.testing.assert_array_equal(
sim.get_obs('prey1')['obs']['prey2'], np.array([0, 0, 0])
)
assert sim.get_obs('prey1')['message_buffer'] == \
{'predator0': False, 'predator1': False, 'prey2': False}
assert sim.get_reward('prey1') == -1
assert not sim.get_done('prey1')
action7 = {
'predator0': {
'action': {'move': np.array([1, 0]), 'attack': 0},
'send': {'predator1': True, 'prey1': False, 'prey2': False},
'receive': {'predator1': True, 'prey1': False, 'prey2': False}
},
'predator1': {
'action': {'move': np.array([1, 1]), 'attack': 0},
'send': {'predator0': True, 'prey1': False, 'prey2': False},
'receive': {'predator0': True, 'prey1': True, 'prey2': False}
},
'prey1': {
'action': np.array([1, 1]),
'send': {'predator0': False, 'predator1': False, 'prey2': False},
'receive': {'predator0': False, 'predator1': False, 'prey2': False},
}
}
sim.step(action7)
np.testing.assert_array_equal(
sim.get_obs('predator0')['obs']['predator1'], np.array([3, 3, 2])
)
np.testing.assert_array_equal(
sim.get_obs('predator0')['obs']['prey1'], np.array([2, 0, 1])
)
np.testing.assert_array_equal(
sim.get_obs('predator0')['obs']['prey2'], np.array([0, 0, 0])
)
assert sim.get_obs('predator0')['message_buffer'] == \
{'predator1': True, 'prey1': False, 'prey2': False}
assert sim.get_reward('predator0') == -1
assert not sim.get_done('predator0')
np.testing.assert_array_equal(
sim.get_obs('predator1')['obs']['predator0'], np.array([-3, -3, 2])
)
np.testing.assert_array_equal(
sim.get_obs('predator1')['obs']['prey1'], np.array([-1, -3, 1])
)
np.testing.assert_array_equal(
sim.get_obs('predator1')['obs']['prey2'], np.array([0, 0, 0])
)
assert sim.get_obs('predator1')['message_buffer'] == \
{'predator0': True, 'prey1': False, 'prey2': False}
assert sim.get_reward('predator1') == -1
assert not sim.get_done('predator1')
np.testing.assert_array_equal(
sim.get_obs('prey1')['obs']['predator0'], np.array([-2, 0, 2])
)
np.testing.assert_array_equal(
sim.get_obs('prey1')['obs']['predator1'], np.array([1, 3, 2])
)
np.testing.assert_array_equal(
sim.get_obs('prey1')['obs']['prey2'], np.array([0, 0, 0])
)
assert sim.get_obs('prey1')['message_buffer'] == \
{'predator0': False, 'predator1': False, 'prey2': False}
assert sim.get_reward('prey1') == -1
assert not sim.get_done('prey1')
action8 = {
'predator0': {
'action': {'move': np.array([1, 0]), 'attack': 0},
'send': {'predator1': True, 'prey1': False, 'prey2': False},
'receive': {'predator1': True, 'prey1': False, 'prey2': False}
},
'predator1': {
'action': {'move': np.array([-1, -1]), 'attack': 0},
'send': {'predator0': True, 'prey1': False, 'prey2': False},
'receive': {'predator0': True, 'prey1': True, 'prey2': False}
},
'prey1': {
'action': np.array([0, 1]),
'send': {'predator0': False, 'predator1': False, 'prey2': False},
'receive': {'predator0': False, 'predator1': False, 'prey2': False},
}
}
sim.step(action8)
np.testing.assert_array_equal(
sim.get_obs('predator0')['obs']['predator1'], np.array([1, 2, 2])
)
np.testing.assert_array_equal(
sim.get_obs('predator0')['obs']['prey1'], np.array([1, 1, 1])
)
np.testing.assert_array_equal(
sim.get_obs('predator0')['obs']['prey2'], np.array([0, 0, 0])
)
assert sim.get_obs('predator0')['message_buffer'] == \
{'predator1': True, 'prey1': False, 'prey2': False}
assert sim.get_reward('predator0') == -1
assert not sim.get_done('predator0')
np.testing.assert_array_equal(
sim.get_obs('predator1')['obs']['predator0'], np.array([-1, -2, 2])
)
np.testing.assert_array_equal(
sim.get_obs('predator1')['obs']['prey1'], np.array([0, -1, 1])
)
np.testing.assert_array_equal(
sim.get_obs('predator1')['obs']['prey2'], np.array([0, 0, 0])
)
assert sim.get_obs('predator1')['message_buffer'] == \
{'predator0': True, 'prey1': False, 'prey2': False}
assert sim.get_reward('predator1') == -1
assert not sim.get_done('predator1')
np.testing.assert_array_equal(
sim.get_obs('prey1')['obs']['predator0'], np.array([-1, -1, 2])
)
np.testing.assert_array_equal(
sim.get_obs('prey1')['obs']['predator1'], np.array([0, 1, 2])
)
np.testing.assert_array_equal(
sim.get_obs('prey1')['obs']['prey2'], np.array([0, 0, 0])
)
assert sim.get_obs('prey1')['message_buffer'] == \
{'predator0': False, 'predator1': False, 'prey2': False}
assert sim.get_reward('prey1') == -1
assert not sim.get_done('prey1')
action9 = {
'predator0': {
'action': {'move': np.array([0, 0]), 'attack': 1},
'send': {'predator1': True, 'prey1': False, 'prey2': False},
'receive': {'predator1': True, 'prey1': False, 'prey2': False}
},
'predator1': {
'action': {'move': np.array([0, 0]), 'attack': 0},
'send': {'predator0': True, 'prey1': False, 'prey2': False},
'receive': {'predator0': True, 'prey1': True, 'prey2': False}
},
'prey1': {
'action': np.array([-1, 1]),
'send': {'predator0': False, 'predator1': False, 'prey2': False},
'receive': {'predator0': False, 'predator1': False, 'prey2': False},
}
}
sim.step(action9)
np.testing.assert_array_equal(
sim.get_obs('predator0')['obs']['predator1'], np.array([1, 2, 2])
)
np.testing.assert_array_equal(
sim.get_obs('predator0')['obs']['prey1'], np.array([0, 0, 0])
)
np.testing.assert_array_equal(
sim.get_obs('predator0')['obs']['prey2'], np.array([0, 0, 0])
)
assert sim.get_obs('predator0')['message_buffer'] == \
{'predator1': True, 'prey1': False, 'prey2': False}
assert sim.get_reward('predator0') == 100
assert not sim.get_done('predator0')
np.testing.assert_array_equal(
sim.get_obs('predator1')['obs']['predator0'], np.array([-1, -2, 2])
)
np.testing.assert_array_equal(
sim.get_obs('predator1')['obs']['prey1'], np.array([0, 0, 0])
)
np.testing.assert_array_equal(
sim.get_obs('predator1')['obs']['prey2'], np.array([0, 0, 0])
)
assert sim.get_obs('predator1')['message_buffer'] == \
{'predator0': True, 'prey1': False, 'prey2': False}
assert sim.get_reward('predator1') == 0
assert not sim.get_done('predator1')
np.testing.assert_array_equal(
sim.get_obs('prey1')['obs']['predator0'], np.array([-1, -1, 2])
)
np.testing.assert_array_equal(
sim.get_obs('prey1')['obs']['predator1'], np.array([0, 1, 2])
)
np.testing.assert_array_equal(
sim.get_obs('prey1')['obs']['prey2'], np.array([0, 0, 0])
)
assert sim.get_obs('prey1')['message_buffer'] == \
{'predator0': False, 'predator1': False, 'prey2': False}
assert sim.get_reward('prey1') == -100
assert sim.get_done('prey1')
assert sim.get_all_done()
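The observation vectors asserted throughout this test are consistent with a relative-position encoding: the first two entries track the other agent's grid offset from the observer (for example, predator1's view of predator0 shifts from [0, -3] to [-1, -3] when the net move difference is [-1, 0]), the third entry behaves like a fixed per-agent value (2 for predators, 1 for prey), and [0, 0, 0] marks an unobserved agent. A minimal sketch of that encoding — inferred from the assertions, not taken from the simulator's source:

```python
import numpy as np

def relative_obs(observer_pos, other_pos, other_value, view_range):
    """Hypothetical observation encoding inferred from the assertions above:
    [row offset, col offset, other agent's value], masked to all zeros when
    the other agent lies outside the observer's (Chebyshev) view range."""
    diff = np.asarray(other_pos) - np.asarray(observer_pos)
    if np.abs(diff).max() > view_range:
        return np.zeros(3, dtype=int)
    return np.array([diff[0], diff[1], other_value])
```

Under this reading, the all-zero rows above mean "out of view" rather than "co-located".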
# ---- s3_uri_to_url/__init__.py (its-dron/s3-uri-to-url, Apache-2.0) ----
from .main import uri2url
# ---- test/run/t150.py (timmartin/skulpt, MIT) ----
print str(range(-4))[:5]
print len(range(-4))
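These two lines use Python 2 print statements (this Skulpt test targets Python 2 semantics, where `range` returns a list). For reference, a Python 3 equivalent of the same checks:

```python
# In Python 3, range(-4) is a lazy object; list() makes the emptiness visible.
print(str(list(range(-4)))[:5])  # prints "[]"
print(len(range(-4)))            # prints 0
```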
# ---- tests/__init__.py (pylipp/robber.py, MIT) ----
__all__ = [
'fixtures', 'util', 'chain_matcher'
]
from tests.fixtures import *  # noqa: F403
from tests.util import *  # noqa: F403
from tests.chain_matcher import *  # noqa: F403
# ---- pydfs_lineup_optimizer/sites/__init__.py (bmjjr/pydfs-lineup-optimizer, MIT) ----
from pydfs_lineup_optimizer.sites.sites_registry import SitesRegistry
from pydfs_lineup_optimizer.sites.draftkings import * # type: ignore
from pydfs_lineup_optimizer.sites.fanball import * # type: ignore
from pydfs_lineup_optimizer.sites.fanduel import * # type: ignore
from pydfs_lineup_optimizer.sites.fantasy_draft import * # type: ignore
from pydfs_lineup_optimizer.sites.yahoo import * # type: ignore
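The star imports here exist for their side effects: importing each site module registers its settings classes with SitesRegistry, so importing the package makes every site available by name. A minimal sketch of that self-registration pattern (hypothetical names, not the library's actual implementation):

```python
class Registry:
    sites = {}

    @classmethod
    def register(cls, settings_cls):
        # Called at class-definition time, i.e. when the module is imported.
        cls.sites[settings_cls.site_name] = settings_cls
        return settings_cls


@Registry.register
class DraftKingsSettings:
    site_name = "DRAFTKINGS"
```

Merely importing the module that defines `DraftKingsSettings` is enough to populate the registry.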
# ---- scripts/test.py (how2/object_centric_VAD, MIT) ----
# from sklearn import svm
import argparse
import os
import pickle
import sys
import time

# Make the repo-root packages importable before the project imports below.
sys.path.append("../")

import numpy as np
from sklearn.externals import joblib  # on scikit-learn >= 0.23, use `import joblib`

from models.CAE import CAE_encoder, CAE  # CAE itself is needed by test_CAE below
from scripts import inference
from utils import util
from utils import evaluate
from utils.paths import PATHS

prefix = PATHS.get_dataset_dir_path()
def arg_parse():
parser = argparse.ArgumentParser()
    parser.add_argument("-g", "--gpu", type=str, default="0", help="Which GPU to use")
    parser.add_argument("-d", "--dataset", type=str, help="Which dataset to test on")
parser.add_argument(
"-b", "--bn", type=bool, default=False, help="whether to use BN layer"
)
parser.add_argument(
"--model_path", type=str, help="Path to saved tensorflow CAE model"
)
parser.add_argument(
"--graph_path", type=str, help="Path to saved detection frozen graph model"
)
parser.add_argument("--svm_model", type=str, help="Path to saved svm model")
    parser.add_argument("--dataset_folder", type=str, help="Dataset folder path")
parser.add_argument(
"-c",
"--class_add",
type=bool,
default=False,
        help="Whether to add class one-hot embedding to the feature",
)
parser.add_argument(
"-n",
"--norm",
type=int,
default=0,
help="Whether to use Normalization to the Feature and the normalization level",
)
parser.add_argument(
"--test_CAE", type=bool, default=False, help="Whether to test CAE"
)
parser.add_argument(
"--matlab",
type=bool,
default=False,
help="Whether to use matlab weights and biases to test",
)
args = parser.parse_args()
return args
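One caveat about the parser above: `type=bool` does not parse booleans the way it reads. argparse passes the raw string to `bool()`, and any non-empty string — including "False" — is truthy, so `--bn False` still enables BN. A short demonstration of the pitfall and the usual `store_true` fix:

```python
import argparse

buggy = argparse.ArgumentParser()
buggy.add_argument("--bn", type=bool, default=False)
# bool("False") is truthy, so the flag cannot actually be turned off this way:
assert buggy.parse_args(["--bn", "False"]).bn is True

fixed = argparse.ArgumentParser()
fixed.add_argument("--bn", action="store_true")
assert fixed.parse_args([]).bn is False
assert fixed.parse_args(["--bn"]).bn is True
```

The same applies to `--class_add`, `--test_CAE`, and `--matlab` above; only an empty string (`--bn ""`) would yield False under `type=bool`.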
def test(CAE_model_path, OVR_SVM_path, args, gap=2, score_threshold=0.4):
# to get the image paths
image_folder = args.dataset_folder
vids_paths = util.get_vids_paths(image_folder)
# to set gpu visible
os.environ["CUDA_VISIBLE_DEVICES"] = args.gpu
import tensorflow as tf
# to load the ssd fpn model, and get related tensor
object_detection_graph = inference.load_frozen_graph(args.graph_path)
with object_detection_graph.as_default():
ops = object_detection_graph.get_operations()
all_tensor_names = {output.name for op in ops for output in op.outputs}
tensor_dict = {}
for key in [
"num_detections",
"detection_boxes",
"detection_scores",
"detection_classes",
]:
tensor_name = key + ":0"
if tensor_name in all_tensor_names:
tensor_dict[key] = object_detection_graph.get_tensor_by_name(
tensor_name
)
image_tensor = object_detection_graph.get_tensor_by_name("image_tensor:0")
former_batch = tf.placeholder(
dtype=tf.float32, shape=[1, 64, 64, 1], name="former_batch"
)
gray_batch = tf.placeholder(
dtype=tf.float32, shape=[1, 64, 64, 1], name="gray_batch"
)
back_batch = tf.placeholder(
dtype=tf.float32, shape=[1, 64, 64, 1], name="back_batch"
)
# grad1_x, grad1_y = tf.image.image_gradients(former_batch)
# grad1=tf.concat([grad1_x,grad1_y],axis=-1)
grad1 = tf.math.abs(tf.math.subtract(former_batch, gray_batch))
# grad2_x,grad2_y=tf.image.image_gradients(gray_batch)
# grad3_x, grad3_y = tf.image.image_gradients(back_batch)
# grad3=tf.concat([grad3_x,grad3_y],axis=-1)
grad3 = tf.math.abs(tf.math.subtract(back_batch, gray_batch))
# grad_dis_1 = tf.sqrt(tf.square(grad1_x) + tf.square(grad1_y))
# grad_dis_2 = tf.sqrt(tf.square(grad3_x) + tf.square(grad3_y))
former_feat = CAE_encoder(grad1, "former", bn=args.bn, training=False)
gray_feat = CAE_encoder(gray_batch, "gray", bn=args.bn, training=False)
back_feat = CAE_encoder(grad3, "back", bn=args.bn, training=False)
# [batch_size,3072]
feat = tf.concat(
[
tf.layers.flatten(former_feat),
tf.layers.flatten(gray_feat),
tf.layers.flatten(back_feat),
],
axis=1,
)
var_list = tf.get_collection(
tf.GraphKeys.TRAINABLE_VARIABLES, scope="former_encoder"
)
var_list.extend(
tf.get_collection(tf.GraphKeys.TRAINABLE_VARIABLES, scope="gray_encoder")
)
var_list.extend(
tf.get_collection(tf.GraphKeys.TRAINABLE_VARIABLES, scope="back_encoder")
)
g_list = tf.get_collection(
tf.GraphKeys.GLOBAL_VARIABLES, scope="former_encoder"
)
g_list.extend(
tf.get_collection(tf.GraphKeys.GLOBAL_VARIABLES, scope="gray_encoder")
)
g_list.extend(
tf.get_collection(tf.GraphKeys.GLOBAL_VARIABLES, scope="back_encoder")
)
bn_list = [
g for g in g_list if "moving_mean" in g.name or "moving_variance" in g.name
]
var_list += bn_list
restorer = tf.train.Saver(var_list=var_list)
(image_height, image_width) = util.image_size_map[args.dataset]
# image_height,image_width=640,640
if not args.matlab:
clf = joblib.load(OVR_SVM_path)
else:
import scipy.io as io
W = io.loadmat("../matlab_files/{}_weights.mat".format(args.dataset))["W"]
B = io.loadmat("../matlab_files/{}_biases.mat".format(args.dataset))["B"]
anomaly_scores_records = []
timestamp = time.time()
num_videos = len(vids_paths)
total = 0
with tf.Session() as sess:
if args.bn:
# restorer.restore(sess, CAE_model_path+'_bn')
model_file = tf.train.latest_checkpoint(f"{CAE_model_path}_bn")
else:
# restorer.restore(sess,CAE_model_path)
model_file = tf.train.latest_checkpoint(CAE_model_path)
restorer.restore(sess, model_file)
for frame_paths in vids_paths:
anomaly_scores = np.empty(shape=(len(frame_paths),), dtype=np.float32)
for frame_iter in range(gap, len(frame_paths) - gap):
img = np.expand_dims(
util.data_preprocessing(
frame_paths[frame_iter], target_size=640
),
axis=0,
)
output_dict = sess.run(tensor_dict, feed_dict={image_tensor: img})
# all outputs are float32 numpy arrays, so convert types as appropriate
output_dict["num_detections"] = int(
output_dict["num_detections"][0]
)
output_dict["detection_classes"] = output_dict["detection_classes"][
0
].astype(np.int8)
output_dict["detection_boxes"] = output_dict["detection_boxes"][0]
output_dict["detection_scores"] = output_dict["detection_scores"][0]
_temp_anomaly_scores = []
_temp_anomaly_score = 10000.0
for score, box in zip(
output_dict["detection_scores"], output_dict["detection_boxes"]
):
if score >= score_threshold:
box = [
int(box[0] * image_height),
int(box[1] * image_height),
int(box[2] * image_height),
int(box[3] * image_width),
]
img_gray = util.box_image_crop(frame_paths[frame_iter], box)
img_former = util.box_image_crop(
frame_paths[frame_iter - gap], box
)
img_back = util.box_image_crop(
frame_paths[frame_iter + gap], box
)
_feat = sess.run(
feat,
feed_dict={
former_batch: np.expand_dims(img_former, 0),
gray_batch: np.expand_dims(img_gray, 0),
back_batch: np.expand_dims(img_back, 0),
},
)
if args.norm != 0:
_feat = util.norm_(_feat, l=args.norm)
if args.class_add:
_temp = np.zeros(90, dtype=np.float32)
_temp[output_dict["detection_classes"][0] - 1] = 1
result = np.concatenate((_feat[0], _temp), axis=0)
_feat = np.expand_dims(result, 0)
if not args.matlab:
scores = clf.decision_function(_feat)
else:
scores = np.dot(_feat, W) + B
# print(scores[0])
_temp_anomaly_scores.append(-max(scores[0]))
if _temp_anomaly_scores.__len__() != 0:
_temp_anomaly_score = max(_temp_anomaly_scores)
print(
"video = {} / {}, i = {} / {}, score = {:.6f}".format(
frame_paths[0].split("/")[-2],
num_videos,
frame_iter,
len(frame_paths),
_temp_anomaly_score,
)
)
anomaly_scores[frame_iter] = _temp_anomaly_score
anomaly_scores[:gap] = anomaly_scores[gap]
anomaly_scores[-gap:] = anomaly_scores[-gap - 1]
min_score = min(anomaly_scores)
for i, _s in enumerate(anomaly_scores):
if _s == 10000.0:
anomaly_scores[i] = min_score
anomaly_scores_records.append(anomaly_scores)
total += len(frame_paths)
# use the evaluation functions from github.com/StevenLiuWen/ano_pred_cvpr2018
result_dict = {
"dataset": args.dataset,
"psnr": anomaly_scores_records,
"flow": [],
"names": [],
"diff_mask": [],
}
used_time = time.time() - timestamp
print("total time = {}, fps = {}".format(used_time, total / used_time))
# TODO specify what's the actual name of ckpt.
if not args.bn:
# pickle_path = '/home/' + args.machine + '/anomaly_scores/' + args.dataset + '.pkl'
pickle_path = f"{PATHS.get_anomaly_scores_pickle_path()}/{args.dataset}.pkl"
else:
# pickle_path = '/home/' + args.machine + '/anomaly_scores/' + args.dataset + '_bn' + '.pkl'
pickle_path = f"{PATHS.get_anomaly_scores_pickle_path()}/{args.dataset}_bn.pkl"
with open(pickle_path, "wb") as writer:
pickle.dump(result_dict, writer, pickle.HIGHEST_PROTOCOL)
results = evaluate.evaluate_all(pickle_path, reverse=True, smoothing=True)
print(results)
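The per-object anomaly score above is the negated maximum of the one-vs-rest decision values (`clf.decision_function` in the sklearn branch, `feat · W + B` in the --matlab branch): an object that matches some "normal" training cluster well gets a large maximum decision value and hence a low score. A toy numpy illustration of that scoring rule (made-up weights, not the trained model):

```python
import numpy as np

# Toy linear OVR model: 4-dim features, 3 "normal" object clusters.
W = np.array([[ 1.0,  0.0, -1.0],
              [ 0.0,  1.0,  0.0],
              [-1.0,  0.0,  1.0],
              [ 0.0, -1.0,  0.0]])
B = np.zeros(3)

def anomaly_score(feat):
    # One decision value per cluster; negate the best match, so objects
    # that fit no cluster well come out with the highest scores.
    return -(feat @ W + B).max()

strong_match = np.array([2.0, 0.0, -2.0, 0.0])  # aligned with cluster 0
weak_match = np.array([0.1, 0.1, 0.1, 0.1])     # fits no cluster well
```

Here `anomaly_score(weak_match)` comes out higher than `anomaly_score(strong_match)`, matching the `-max(scores[0])` convention in the loop above.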
def test_CAE(CAE_model_path, args, gap=2, score_threshold=0.4):
image_folder = prefix + args.dataset + "/testing/frames/"
vids_paths = util.get_vids_paths(image_folder)
# to set gpu visible
os.environ["CUDA_VISIBLE_DEVICES"] = args.gpu
import tensorflow as tf
# to load the ssd fpn model, and get related tensor
object_detection_graph = inference.load_frozen_graph()
with object_detection_graph.as_default():
ops = object_detection_graph.get_operations()
all_tensor_names = {output.name for op in ops for output in op.outputs}
tensor_dict = {}
for key in [
"num_detections",
"detection_boxes",
"detection_scores",
"detection_classes",
]:
tensor_name = key + ":0"
if tensor_name in all_tensor_names:
tensor_dict[key] = object_detection_graph.get_tensor_by_name(
tensor_name
)
image_tensor = object_detection_graph.get_tensor_by_name("image_tensor:0")
former_batch = tf.placeholder(
dtype=tf.float32, shape=[1, 64, 64, 1], name="former_batch"
)
gray_batch = tf.placeholder(
dtype=tf.float32, shape=[1, 64, 64, 1], name="gray_batch"
)
back_batch = tf.placeholder(
dtype=tf.float32, shape=[1, 64, 64, 1], name="back_batch"
)
# grad1_x, grad1_y = tf.image.image_gradients(former_batch)
# grad1=tf.concat([grad1_x,grad1_y],axis=-1)
grad1 = tf.math.abs(tf.math.subtract(former_batch, gray_batch))
# grad2_x,grad2_y=tf.image.image_gradients(gray_batch)
# grad3_x, grad3_y = tf.image.image_gradients(back_batch)
# grad3=tf.concat([grad3_x,grad3_y],axis=-1)
grad3 = tf.math.abs(tf.math.subtract(back_batch, gray_batch))
# grad_dis_1 = tf.sqrt(tf.square(grad1_x) + tf.square(grad1_y))
# grad_dis_2 = tf.sqrt(tf.square(grad3_x) + tf.square(grad3_y))
former_output = CAE(grad1, "former", bn=args.bn, training=False)
gray_output = CAE(gray_batch, "gray", bn=args.bn, training=False)
back_output = CAE(grad3, "back", bn=args.bn, training=False)
        outputs = tf.concat([former_output, gray_output, back_output], axis=1)
        # Reconstruction targets: the two gradient maps and the gray crop.
        # (The original referenced grad_dis_1/grad_dis_2, which are only
        # defined in the commented-out gradient variant above.)
        targets = tf.concat([grad1, gray_batch, grad3], axis=1)
        L2_dis = tf.reduce_sum(tf.square(outputs - targets))
var_list = tf.get_collection(
tf.GraphKeys.TRAINABLE_VARIABLES, scope="former_encoder"
)
var_list.extend(
tf.get_collection(tf.GraphKeys.TRAINABLE_VARIABLES, scope="gray_encoder")
)
var_list.extend(
tf.get_collection(tf.GraphKeys.TRAINABLE_VARIABLES, scope="back_encoder")
)
var_list.extend(
tf.get_collection(tf.GraphKeys.TRAINABLE_VARIABLES, scope="former_decoder")
)
var_list.extend(
tf.get_collection(tf.GraphKeys.TRAINABLE_VARIABLES, scope="gray_decoder")
)
var_list.extend(
tf.get_collection(tf.GraphKeys.TRAINABLE_VARIABLES, scope="back_decoder")
)
if args.bn:
g_list = tf.get_collection(
tf.GraphKeys.GLOBAL_VARIABLES, scope="former_encoder"
)
g_list.extend(
tf.get_collection(tf.GraphKeys.GLOBAL_VARIABLES, scope="gray_encoder")
)
g_list.extend(
tf.get_collection(tf.GraphKeys.GLOBAL_VARIABLES, scope="back_encoder")
)
g_list.extend(
tf.get_collection(tf.GraphKeys.GLOBAL_VARIABLES, scope="former_decoder")
)
g_list.extend(
tf.get_collection(tf.GraphKeys.GLOBAL_VARIABLES, scope="gray_decoder")
)
g_list.extend(
tf.get_collection(tf.GraphKeys.GLOBAL_VARIABLES, scope="back_decoder")
)
bn_list = [
g
for g in g_list
if "moving_mean" in g.name or "moving_variance" in g.name
]
var_list += bn_list
restorer = tf.train.Saver(var_list=var_list)
(image_height, image_width) = util.image_size_map[args.dataset]
# image_height,image_width=640,640
anomaly_scores_records = []
timestamp = time.time()
num_videos = len(vids_paths)
total = 0
with tf.Session() as sess:
if args.bn:
restorer.restore(sess, CAE_model_path + "_bn")
else:
restorer.restore(sess, CAE_model_path)
for frame_paths in vids_paths:
anomaly_scores = np.empty(shape=(len(frame_paths),), dtype=np.float32)
for frame_iter in range(gap, len(frame_paths) - gap):
img = np.expand_dims(
util.data_preprocessing(
frame_paths[frame_iter], target_size=640
),
axis=0,
)
output_dict = sess.run(tensor_dict, feed_dict={image_tensor: img})
# all outputs are float32 numpy arrays, so convert types as appropriate
output_dict["num_detections"] = int(
output_dict["num_detections"][0]
)
output_dict["detection_classes"] = output_dict["detection_classes"][
0
].astype(np.int8)
output_dict["detection_boxes"] = output_dict["detection_boxes"][0]
output_dict["detection_scores"] = output_dict["detection_scores"][0]
_temp_anomaly_scores = []
_temp_anomaly_score = 10000.0
for score, box in zip(
output_dict["detection_scores"], output_dict["detection_boxes"]
):
if score >= score_threshold:
box = [
int(box[0] * image_height),
int(box[1] * image_height),
int(box[2] * image_height),
int(box[3] * image_width),
]
img_gray = util.box_image_crop(frame_paths[frame_iter], box)
img_former = util.box_image_crop(
frame_paths[frame_iter - gap], box
)
img_back = util.box_image_crop(
frame_paths[frame_iter + gap], box
)
l2_dis = sess.run(
L2_dis,
feed_dict={
former_batch: np.expand_dims(img_former, 0),
gray_batch: np.expand_dims(img_gray, 0),
back_batch: np.expand_dims(img_back, 0),
},
)
_temp_anomaly_scores.append(l2_dis)
                if _temp_anomaly_scores:
_temp_anomaly_score = max(_temp_anomaly_scores)
print(
"video = {} / {}, i = {} / {}, score = {:.6f}".format(
frame_paths[0].split("/")[-2],
num_videos,
frame_iter,
len(frame_paths),
_temp_anomaly_score,
)
)
anomaly_scores[frame_iter] = _temp_anomaly_score
anomaly_scores[:gap] = anomaly_scores[gap]
anomaly_scores[-gap:] = anomaly_scores[-gap - 1]
min_score = np.min(anomaly_scores)
for i, _s in enumerate(anomaly_scores):
if _s == 10000.0:
anomaly_scores[i] = min_score
anomaly_scores_records.append(anomaly_scores)
total += len(frame_paths)
# use the evaluation functions from github.com/StevenLiuWen/ano_pred_cvpr2018
result_dict = {
"dataset": args.dataset,
"psnr": anomaly_scores_records,
"flow": [],
"names": [],
"diff_mask": [],
}
used_time = time.time() - timestamp
print("total time = {}, fps = {}".format(used_time, total / used_time))
# TODO specify what's the actual name of ckpt.
if not args.bn:
# pickle_path = '/home/' + args.machine + '/anomaly_scores/' + args.dataset + '_CAE_only' + '.pkl'
pickle_path = f"{PATHS.get_anomaly_scores_pickle_path()}/{args.dataset}_CAE_only.pkl"
else:
# pickle_path = '/home/' + args.machine + '/anomaly_scores/' + args.dataset + '_CAE_only' + '_bn' + '.pkl'
pickle_path = f"{PATHS.get_anomaly_scores_pickle_path()}/{args.dataset}_CAE_only_bn.pkl"
with open(pickle_path, "wb") as writer:
pickle.dump(result_dict, writer, pickle.HIGHEST_PROTOCOL)
results = evaluate.evaluate_all(pickle_path, reverse=True, smoothing=True)
print(results)
if __name__ == "__main__":
args = arg_parse()
if args.dataset_folder:
PATHS.set_dataset_dir_path(args.dataset_folder)
if not args.test_CAE:
test(args.model_path, args.svm_model, args, score_threshold=0.4)
else:
test_CAE(args.model_path, args)
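The per-video score post-processing above (copy the nearest computed score into the first and last `gap` frames, then replace the 10000.0 sentinel left by frames with no detections with the minimum real score) can be sketched without NumPy; `postprocess` is a hypothetical helper name, not part of the script:

```python
def postprocess(scores, gap, sentinel=10000.0):
    # work on a copy so the caller's list is untouched
    s = list(scores)
    # edge frames never get a computed score; borrow the nearest computed one
    s[:gap] = [s[gap]] * gap
    s[-gap:] = [s[-gap - 1]] * gap
    # frames whose detections all fell below the threshold kept the sentinel;
    # map them to the minimum observed score instead
    lowest = min(s)
    return [lowest if v == sentinel else v for v in s]
```

This mirrors the NumPy version step for step; note that `min` (like `np.min` in the original) only ignores the sentinel because it is larger than every real score.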
# --- mypublicapi/admin.py (repo: bleemeo/mypublicapi, license: MIT) ---
from django.contrib import admin
from .models import Metric, Server, ServerProperty
admin.site.register(Server)
admin.site.register(ServerProperty)
admin.site.register(Metric)
# --- dqn/dl.py (repo: ShuminKong/RL-Restore, license: MIT) ---
import os
import torchvision.transforms as T
from torch.utils.data import DataLoader
from .pietorch import data_convertors
def get_dataloader(ds, data_dir, noise=10, crop=256, jpeg_quality=40):
transform = T.ToTensor()
if ds == 'RealNoiseHKPoly':
crop_size, with_data_aug = 128, False
test_root, test_list_pth = os.path.join(data_dir, ds, 'test1/'), os.path.join('split', ds, 'test1_list.txt')
test_convertor = data_convertors.ConvertImageSet(test_root, test_list_pth, ds,
transform=transform,crop_size=crop_size)
data_root, imlist_pth = os.path.join(data_dir, ds, 'OriginalImages/'), os.path.join('split', ds, 'train_list.txt')
convertor = data_convertors.ConvertImageSet(data_root, imlist_pth, ds,
transform=transform, is_train=True,
with_aug=with_data_aug, crop_size=crop_size)
dataloader, test_dataloader = DataLoader(convertor, batch_size=32, shuffle=False), DataLoader(test_convertor, batch_size=32, shuffle=False)
elif ds == 'GoPro':
transform = [jpeg_quality, T.ToTensor(), noise]
crop_size, with_data_aug = crop, False
test_root, test_list_pth = os.path.join(data_dir, ds, 'test/'), os.path.join('split', ds, 'test_list.txt')
test_convertor = data_convertors.ConvertImageSet(test_root, test_list_pth, ds,
transform=transform, resize_to=(640, 360), crop_size=crop_size)
data_root, imlist_pth = os.path.join(data_dir, ds, 'train/'), os.path.join('split', ds, 'train_list.txt')
convertor = data_convertors.ConvertImageSet(data_root, imlist_pth, ds,
transform=transform, is_train=True,
with_aug=with_data_aug, resize_to=(640, 360), crop_size=crop_size)
dataloader, test_dataloader = DataLoader(convertor, batch_size=1, shuffle=False), DataLoader(test_convertor, batch_size=32, shuffle=False)
elif ds == 'RainDrop':
test_set = 'test_a'
crop_size, with_data_aug = crop, False
transform = [jpeg_quality, T.ToTensor(), noise]
test_root, test_list_pth = os.path.join(data_dir, ds, test_set, test_set), os.path.join('split', ds, test_set+'_list.txt')
test_convertor = data_convertors.ConvertImageSet(test_root, test_list_pth, ds, transform=transform, crop_size=crop_size)
data_root, imlist_pth = os.path.join(data_dir, ds, 'train', 'train/'), os.path.join('split', ds, 'train_list.txt')
convertor = data_convertors.ConvertImageSet(data_root, imlist_pth, ds,
transform=transform, is_train=True,
with_aug=with_data_aug, crop_size=crop_size)
dataloader, test_dataloader = DataLoader(convertor, batch_size=32, shuffle=False), DataLoader(test_convertor, batch_size=32, shuffle=False)
elif ds == 'RESIDE':
        test_root, test_list_pth = os.path.join(data_dir, ds, 'test_a', 'test_a'), os.path.join('split', ds, 'test_a_list.txt')
test_convertor = data_convertors.ConvertImageSet(test_root, test_list_pth, ds, transform=transform)
data_root, imlist_pth = os.path.join(data_dir, ds, 'train', 'train/'), os.path.join('split', ds, 'train_list.txt')
crop_size, with_data_aug = 256, False
convertor = data_convertors.ConvertImageSet(data_root, imlist_pth, ds,
transform=transform, is_train=True,
with_aug=with_data_aug, crop_size=crop_size)
dataloader, test_dataloader = DataLoader(convertor, batch_size=32, shuffle=False), DataLoader(test_convertor, batch_size=32, shuffle=False)
elif ds == 'DIV2K':
transform = [jpeg_quality, T.ToTensor(), noise]
crop_size, with_data_aug = 128, False
test_root, test_list_pth = os.path.join(data_dir, ds), os.path.join('split', ds, 'test_list.txt')
test_convertor = data_convertors.ConvertImageSet(test_root, test_list_pth, ds, crop_size=crop_size, transform=transform)
data_root, imlist_pth = os.path.join(data_dir, ds), os.path.join('split', ds, 'train_list.txt')
convertor = data_convertors.ConvertImageSet(data_root, imlist_pth, ds,
transform=transform, is_train=True,
with_aug=with_data_aug, crop_size=crop_size)
dataloader, test_dataloader = DataLoader(convertor, batch_size=1, shuffle=False), DataLoader(test_convertor, batch_size=1, shuffle=False)
    else:
        raise ValueError('Unknown dataset: %s' % ds)
    return dataloader, test_dataloader
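Each `elif` branch above rebuilds the same convertor/DataLoader pair with different constants. One way to factor that is a lookup table; `DATASET_CFG` and `dataset_cfg` are hypothetical names, and this sketch keeps only crop size and train batch size (the real branches also differ in roots, list files, resizing and transforms):

```python
# per-dataset constants, mirroring the values hard-coded in get_dataloader
DATASET_CFG = {
    'RealNoiseHKPoly': {'crop_size': 128, 'train_batch': 32},
    'GoPro': {'crop_size': 256, 'train_batch': 1},
    'RainDrop': {'crop_size': 256, 'train_batch': 32},
    'RESIDE': {'crop_size': 256, 'train_batch': 32},
    'DIV2K': {'crop_size': 128, 'train_batch': 1},
}

def dataset_cfg(ds):
    # fail loudly instead of falling through with undefined loaders
    try:
        return DATASET_CFG[ds]
    except KeyError:
        raise ValueError('Unknown dataset: %s' % ds)
```

A table like this keeps the branching in one place and makes an unknown dataset name an explicit error rather than a `NameError` at the final `return`.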
# --- SLpackage/private/pacbio/pythonpkgs/pbsvtools/lib/python2.7/site-packages/pbsv1/__init__.py (repo: fanglab/6mASCOPE, license: BSD-3-Clause) ---
from __future__ import absolute_import
def get_version():
return '0.2.0' # don't forget to update setup.py too
__version__ = get_version()
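The comment in `get_version` points at the usual duplication problem: `setup.py` carries its own copy of the version string. A common stdlib-only alternative is for `setup.py` to parse the version out of this file instead of repeating it; `read_version` is a hypothetical helper:

```python
import re

def read_version(source_text):
    # pull the quoted version out of a line like "return '0.2.0'"
    match = re.search(r"return\s+'([^']+)'", source_text)
    if match is None:
        raise RuntimeError('version string not found')
    return match.group(1)
```

With this, `setup.py` could read the `__init__.py` source once and there would be a single place to bump.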
# --- src/test/resources/rocks/juergen/maven/jythonplugin/helloworld.py (repo: edelbluth/jython-maven-plugin, license: Apache-2.0) ---
print 'Hello, World - from file!'
# --- src/models/resnet.py (repo: joaopfonseca/remote_sensing, license: MIT) ---
import os
import numpy as np
from keras.utils import np_utils
from keras.layers import (
    Input,
    Flatten,
    Dense,
    Dropout
)
from keras.models import Model, load_model
from keras.optimizers import Adam
from keras.callbacks import ModelCheckpoint
from keras.applications import resnet
class ResNet50:
def __init__(self, input_shape, output_units, filepath='best_model.hdf5'):
"""input_shape: (height, width, num_bands)"""
self.height, self.width, self.num_bands = input_shape
self.output_units = output_units
## input layer
self.input_layer = Input(
(
self.height,
self.width,
self.num_bands
)
)
self.base_model = resnet.ResNet50(weights=None, include_top=False, input_shape= input_shape)(self.input_layer)
self.flatten_layer = Flatten()(self.base_model)
########################################################################
# fully connected layers
########################################################################
dense_layer1 = Dense(
units=256,
activation='relu'
)(self.flatten_layer)
self.dense_layer1 = Dropout(0.4)(dense_layer1)
dense_layer2 = Dense(
units=128,
activation='relu'
)(self.dense_layer1)
self.dense_layer2 = Dropout(0.4)(dense_layer2)
self.output_layer = Dense(
units=self.output_units,
activation='softmax'
)(self.dense_layer2)
self.model = Model(inputs=self.input_layer, outputs=self.output_layer)
self.adam = Adam(lr=0.001, decay=1e-06)
self.model.compile(loss='categorical_crossentropy', optimizer=self.adam, metrics=['accuracy'])
self.model.summary()
abspath = os.path.abspath('.')
self.filepath = os.path.abspath(os.path.join(abspath,filepath))
checkpoint = ModelCheckpoint(self.filepath, monitor='accuracy', verbose=1, save_best_only=True, mode='max')
self.callbacks_list = [checkpoint]
def load_weights(self, filepath):
self.filepath = filepath
self.model = load_model(filepath)
self.model.compile(loss='categorical_crossentropy', optimizer=self.adam, metrics=['accuracy'])
def fit(self, X, y, batch_size=256, epochs=100):
# transform matrices to correct format
self.num_bands = X.shape[-1]
self.X = X.reshape(
-1,
self.height,
self.width,
self.num_bands
)
self.y = np_utils.to_categorical(y, num_classes=self.output_units)
self.history = self.model.fit(
x=self.X,
y=self.y,
batch_size=batch_size,
epochs=epochs,
callbacks=self.callbacks_list
)
def predict(self, X, filepath=None):
# assert: self.filepath or filepath must exist
if filepath:
self.load_weights(filepath)
self.model.compile(loss='categorical_crossentropy', optimizer=self.adam, metrics=['accuracy'])
#else:
# self.load_model(self.filepath)
#self.model.compile(loss='categorical_crossentropy', optimizer=self.adam, metrics=['accuracy'])
X = X.reshape(
-1,
self.height,
self.width,
self.num_bands
)
y_pred = np.argmax(self.model.predict(X), axis=1)
return y_pred
class PixelBasedResNet50:
def __init__(self, input_shape, output_units, filepath='best_model.hdf5'):
"""input_shape: (height, width)"""
self.input_shape = input_shape
self.height_pad = 32-input_shape[0]
self.width_pad = 32-input_shape[1]
self.height, self.width = (32, 32)
self.output_units = output_units
## input layer
self.input_layer = Input(
(
self.height,
self.width,
1
)
)
self.base_model = resnet.ResNet50(
weights=None,
include_top=False,
input_shape=(32, 32, 1)
)(self.input_layer)
self.flatten_layer = Flatten()(self.base_model)
########################################################################
# fully connected layers
########################################################################
dense_layer1 = Dense(
units=256,
activation='relu'
)(self.flatten_layer)
self.dense_layer1 = Dropout(0.4)(dense_layer1)
dense_layer2 = Dense(
units=128,
activation='relu'
)(self.dense_layer1)
self.dense_layer2 = Dropout(0.4)(dense_layer2)
self.output_layer = Dense(
units=self.output_units,
activation='softmax'
)(self.dense_layer2)
self.model = Model(inputs=self.input_layer, outputs=self.output_layer)
self.adam = Adam(lr=0.001, decay=1e-06)
self.model.compile(loss='categorical_crossentropy', optimizer=self.adam, metrics=['accuracy'])
self.model.summary()
abspath = os.path.abspath('.')
self.filepath = os.path.abspath(os.path.join(abspath,filepath))
checkpoint = ModelCheckpoint(self.filepath, monitor='accuracy', verbose=1, save_best_only=True, mode='max')
self.callbacks_list = [checkpoint]
def load_weights(self, filepath):
self.filepath = filepath
self.model = load_model(filepath)
self.model.compile(loss='categorical_crossentropy', optimizer=self.adam, metrics=['accuracy'])
def fit(self, X, y, batch_size=256, epochs=100):
# transform matrices to correct format
self.num_bands = X.shape[-1]
X = X.reshape(
-1,
self.input_shape[0],
self.input_shape[1]
)
X = np.expand_dims(X, -1)
X = np.pad(X, ((0,0), (self.height_pad,0), (self.width_pad,0), (0,0)))
y = np_utils.to_categorical(y, num_classes=self.output_units)
self.history = self.model.fit(
x=X,
y=y,
batch_size=batch_size,
epochs=epochs,
callbacks=self.callbacks_list
)
def predict(self, X, filepath=None):
# assert: self.filepath or filepath must exist
if filepath:
self.load_weights(filepath)
self.model.compile(loss='categorical_crossentropy', optimizer=self.adam, metrics=['accuracy'])
#else:
# self.load_model(self.filepath)
#self.model.compile(loss='categorical_crossentropy', optimizer=self.adam, metrics=['accuracy'])
X = X.reshape(
-1,
self.input_shape[0],
self.input_shape[1]
)
X = np.expand_dims(X, -1)
X = np.pad(X, ((0,0), (self.height_pad,0), (self.width_pad,0), (0,0)))
y_pred = np.argmax(self.model.predict(X), axis=1)
return y_pred
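`PixelBasedResNet50` prepends zeros so every `(height, width)` patch becomes 32x32 before entering the network (`np.pad` with `((self.height_pad, 0), (self.width_pad, 0))` on the spatial axes). The same padding on a plain nested list, with the hypothetical name `pad_patch_to_32`:

```python
def pad_patch_to_32(patch):
    # patch: height x width nested list of floats; zeros go BEFORE the
    # existing rows and columns, matching np.pad(..., ((hp, 0), (wp, 0)))
    height, width = len(patch), len(patch[0])
    hp, wp = 32 - height, 32 - width
    # pad each row on the left, then stack all-zero rows on top
    padded_rows = [[0.0] * wp + list(row) for row in patch]
    return [[0.0] * 32 for _ in range(hp)] + padded_rows
```

The original data therefore ends up in the bottom-right corner of the 32x32 input, which is worth keeping in mind when interpreting learned spatial filters.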
# --- OohsBay.py (repo: parv-joshi/Online-Shopping-Portal-Project, license: MIT) ---
# OOHSbay.com - Online Shopping Portal
"""
This program is about an online shopping portal which asks the user to enter the number of products the user wants to buy. It then asks the number of products
he/she wants to buy. The it aks the user to choose the genre of product, brand, budget and the product he/she wants to buy. It also gives choice for the
shipping details and the finally prints the bill.
"""
def genre_of_product(): # This prints menu for the genre of the products
print "\n"
    genre_list=['Home Appliances', 'Electronics', 'Clothing', 'Stationery']
print "Printing the genre list"
print "SL NO.", "\t\t", "PRODUCT"
for i in range(4):
print i+1, "\t\t", genre_list[i]
global choice # makes the later created variable 'choice' global scope
choice=input("Enter your choice: ") # Allows user to enter choice of the genre of products listed
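`genre_of_product` and the functions below all repeat the same print-a-numbered-table pattern. It could be built once and reused; `render_menu` is a hypothetical helper (shown in Python 3 string syntax, unlike the Python 2 script around it):

```python
def render_menu(header, options):
    # builds the numbered table this script prints before every input()
    lines = [header, "SL NO.\tITEM"]
    for number, name in enumerate(options, 1):
        lines.append("%d\t%s" % (number, name))
    return "\n".join(lines)
```

Printing the returned string reproduces the serial-number/item layout used throughout the script, so each menu function would shrink to a list of option names plus an `input` call.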
def product_menu(choice):
print "\n"
global product_choice # makes the later created variable 'product_choice' global scope
if choice==1: # prints menu for the category of the products for choice=1
home_appliances_list=["Vaccum Cleaners", "Juicers", "Mixers"]
print "SL NO.", "\t\t", "ITEMS"
for i in range(3):
print i+1, "\t\t", home_appliances_list[i]
product_choice=input("Enter your item choice: ") # allows user to choose the category of product from the given list
elif choice==2: # prints menu for the category of the products for choice=2
electronics_list=["Televisions", "Mobiles", "Laptops"]
print "SL NO.", "\t\t", "ITEMS"
for i in range(3):
print i+1, "\t\t", electronics_list[i]
product_choice=input("Enter your item choice: ") # allows user to choose the category of product from the given list
elif choice==3: # prints menu for the category of the products for choice=3
clothing_list=["Shirts", "Pants", "Hoodies"]
print "SL NO.", "\t\t", "ITEMS"
for i in range(3):
print i+1, "\t\t", clothing_list[i]
product_choice=input("Enter your item choice: ") # allows user to choose the category of product from the given list
elif choice==4: # prints menu for the category of the products for choice=4
stationary_list=["Pencils", "Pens", "Erasers"]
print "SL NO.", "\t\t", "ITEMS"
for i in range(3):
print i+1, "\t\t", stationary_list[i]
product_choice=input("Enter your item choice: ") # allows user to choose the category of product from the given list
else:
print "Invalid Choice"
def brand(choice, product_choice): # Allows user to choose the product brand
print "\n"
global brand_choice # makes the later created variable 'brand_choice' global scope
if choice==1: # prints menu for the product brand for choice=1
if product_choice==1: # prints menu for the product brand for product_choice=1
print "Choose the item brand: "
print "1. BOSCH"
print "2. SIEMENS"
print "3. SHARP"
brand_choice=input("Enter the brand serial number: ") # Allows user to choose product brand
if brand_choice>3 or brand_choice<1:
print "Invalid Choice"
elif product_choice==2: # prints menu for the product brand for product_choice=2
print "Choose the item brand: "
print "1. LG"
print "2. PANASONIC"
print "3. KENWOOD"
brand_choice=input("Enter the brand serial number: ") # Allows user to choose product brand
if brand_choice>3 or brand_choice<1:
print "Invalid Choice"
elif product_choice==3: # prints menu for the product brand for product_choice=3
print "Choose the item brand: "
print "1. KENWOOD"
print "2. PANASONIC"
print "3. PHILIPS"
brand_choice=input("Enter the brand serial number: ") # Allows user to choose product brand
if brand_choice>3 or brand_choice<1:
print "Invalid Choice"
elif choice==2: # prints menu for the product brand for choice=2
if product_choice==1: # prints menu for the product brand for product_choice=1
print "Choose the item brand: "
print "1. SONY"
print "2. SAMSUNG"
print "3. LG"
brand_choice=input("Enter the brand serial number: ") # Allows user to choose product brand
if brand_choice>3 or brand_choice<1:
print "Invalid Choice"
elif product_choice==2: # prints menu for the product brand for product_choice=2
print "Choose the item brand: "
print "1. APPLE"
print "2. SAMSUNG"
print "3. HTC"
brand_choice=input("Enter the brand serial number: ") # Allows user to choose product brand
if brand_choice>3 or brand_choice<1:
print "Invalid Choice"
elif product_choice==3: # prints menu for the product brand for product_choice=3
print "Choose the item brand: "
print "1. LENOVO"
print "2. HP"
print "3. DELL"
brand_choice=input("Enter the brand serial number: ") # Allows user to choose product brand
if brand_choice>3 or brand_choice<1:
print "Invalid Choice"
elif choice==3: # prints menu for the product brand for choice=3
if product_choice==1: # prints menu for the product brand for product_choice=1
print "Choose the item brand: "
print "1. OCTAVE"
print "2. GIORDANO"
print "3. LEE COOPER"
brand_choice=input("Enter the brand serial number: ") # Allows user to choose product brand
if brand_choice>3 or brand_choice<1:
print "Invalid Choice"
elif product_choice==2: # prints menu for the product brand for product_choice=2
print "Choose the item brand: "
print "1. SPLASH"
print "2. BEING HUMAN"
print "3. MAX"
brand_choice=input("Enter the brand serial number: ") # Allows user to choose product brand
if brand_choice>3 or brand_choice<1:
print "Invalid Choice"
elif product_choice==3: # prints menu for the product brand for product_choice=3
print "Choose the item brand: "
print "1. MAX"
brand_choice=input("Enter the brand serial number: ") # Allows user to choose product brand
if brand_choice>1 or brand_choice<1:
print "Invalid Choice"
elif choice==4: # prints menu for the product brand for choice=4
if product_choice==1: # prints menu for the product brand for product_choice=1
print "Choose the item brand: "
print "1. HELIX"
brand_choice=input("Enter the brand serial number: ") # Allows user to choose product brand
if brand_choice>1 or brand_choice<1:
print "Invalid Choice"
elif product_choice==2: # prints menu for the product brand for product_choice=2
print "Choose the item brand: "
print "1. UNI BALL"
print "2. FABER CASTELL"
brand_choice=input("Enter the brand serial number: ")
if brand_choice>2 or brand_choice<1: # Allows user to choose product brand
print "Invalid Choice"
elif product_choice==3: # prints menu for the product brand for product_choice=3
print "Choose the item brand: "
print "1. FABER CASTELL"
brand_choice=input("Enter the brand serial number: ") # Allows user to choose product brand
if brand_choice>1 or brand_choice<1:
print "Invalid Choice"
def budget(choice, product_choice):
print "\n"
global budget_choice # makes the later created variable 'budget_choice' global scope
if choice==1: # prints menu for the product budget range for choice=1
if product_choice==1: # prints menu for the product budget range for product_choice=1
print "Select Budget Choice: "
print "1. AED 30 - AED 40"
print "2. AED 40 - AED 55"
budget_choice=input("Enter your Budget Serial Number: ")
if budget_choice>2 or budget_choice<1:
print "Invalid Choice"
elif product_choice==2: # prints menu for the product budget range for product_choice=2
print "Select Budget Choice: "
print "1. AED 100 - AED 135"
print "2. AED 135 - AED 150"
budget_choice=input("Enter your Budget Serial Number: ")
if budget_choice>2 or budget_choice<1:
print "Invalid Choice"
elif product_choice==3: # prints menu for the product budget range for product_choice=3
print "Select Budget Choice: "
print "1. AED 30 - AED 40"
print "2. AED 40 - AED 55"
budget_choice=input("Enter your Budget Serial Number: ")
if budget_choice>2 or budget_choice<1:
print "Invalid Choice"
elif choice==2: # prints menu for the product budget range for choice=2
if product_choice==1: # prints menu for the product budget range for product_choice=1
print "Select Budget Choice: "
print "1. AED 4850 - AED 5000"
print "2. AED 5000 - AED 5290"
budget_choice=input("Enter your Budget Serial Number: ")
if budget_choice>2 or budget_choice<1:
print "Invalid Choice"
elif product_choice==2: # prints menu for the product budget range for product_choice=2
print "Select Budget Choice: "
print "1. AED 1699 - AED 1945"
print "2. AED 1945 - AED 2235"
budget_choice=input("Enter your Budget Serial Number: ")
if budget_choice>2 or budget_choice<1:
print "Invalid Choice"
elif product_choice==3: # prints menu for the product budget range for product_choice=3
print "Select Budget Choice: "
print "1. AED 2450 - AED 3000"
print "2. AED 3000 - AED 3569"
budget_choice=input("Enter your Budget Serial Number: ")
if budget_choice>2 or budget_choice<1:
print "Invalid Choice"
elif choice==3: # prints menu for the product budget range for choice=3
if product_choice==1: # prints menu for the product budget range for product_choice=1
print "Select Budget Choice: "
print "1. AED 35 - AED 55"
print "2. AED 55 - AED 70"
budget_choice=input("Enter your Budget Serial Number: ")
if budget_choice>2 or budget_choice<1:
print "Invalid Choice"
elif product_choice==2: # prints menu for the product budget range for product_choice=2
print "Select Budget Choice: "
print "1. AED 35 - AED 50"
print "2. AED 50 - AED 75"
budget_choice=input("Enter your Budget Serial Number: ")
if budget_choice>2 or budget_choice<1:
print "Invalid Choice"
elif product_choice==3: # prints menu for the product budget range for product_choice=3
print "Select Budget Choice: "
print "1. AED 45 - AED 60"
budget_choice=input("Enter your Budget Serial Number: ")
if budget_choice>1 or budget_choice<1:
print "Invalid Choice"
elif choice==4: # prints menu for the product budget range for choice=4
if product_choice==1: # prints menu for the product budget range for product_choice=1
print "Select Budget Choice: "
print "1. AED 1 - AED 2"
budget_choice=input("Enter your Budget Serial Number: ")
if budget_choice>1 or budget_choice<1:
print "Invalid Choice"
elif product_choice==2: # prints menu for the product budget range for product_choice=2
print "Select Budget Choice: "
print "1. AED 1 - AED 3"
print "2. AED 3 - AED 5"
budget_choice=input("Enter your Budget Serial Number: ")
if budget_choice>3 or budget_choice<1:
print "Invalid Choice"
elif product_choice==3: # prints menu for the product budget range for product_choice=3
print "Select Budget Choice: "
print "1. AED 1 - AED 2"
budget_choice=input("Enter your Budget Serial Number: ")
if budget_choice!=1:
print "Invalid Choice"
def specific_item(choice, product_choice, brand_choice, budget_choice):
print "\n"
global item_choice # makes the later created variable 'item_choice' global scope
global item # makes the later created variable 'item' global scope
global sum # makes the later created variable 'sum' global scope
global warranty # makes the later created variable 'warranty' global scope
global k # makes the later created variable 'k' global scope
global s # makes the later created variable 's' global scope
global d # makes the later created variable 'd' global scope
global x # makes the later created variable 'x' global scope
item=warranty="" # creates empty string named 'item' and 'warranty'
sum=0
if choice==1:
if product_choice==1:
if brand_choice==1:
if budget_choice==1: # createss Vaccum Cleaner product list for brand choice 1 budget choice 1
print "SL NO.", "\t\t", "ITEMS", "\t\t\t\t", "Warranty", "\t\t", "Price"
print "1", "\t\t", "Bosch Model A03 red","\t\t", "2 Years", "\t\t", "AED 32"
print "2", "\t\t", "Bosch Model A03 blue","\t\t", "2 Years", "\t\t", "AED 32"
print "3", "\t\t", "Bosch Model C35 white","\t\t", "1 Year", "\t\t\t", "AED 38"
item_choice=input("Enter your item choice: ")
if item_choice==1:
sum+=32
item+="Bosch Model A03 red"
warranty+="2 Years"
elif item_choice==2:
sum+=32
item+="Bosch Model A03 blue"
warranty+="2 Years"
elif item_choice==3:
sum+=38
item+="Bosch Model C35 White"
warranty+="1 Year"
elif budget_choice==2: # creates Vaccum Cleaner product list for for brand choice 1 budget choice 2
print "SL NO.", "\t\t", "ITEMS", "\t\t\t\t", "Warranty", "\t\t", "Price"
print "1", "\t\t", "Bosch Model F23 green","\t\t", "1 Year", "\t\t", "AED 45"
print "2", "\t\t", "Bosch Model A12 blue","\t\t", "1 Year", "\t\t", "AED 52"
print "3", "\t\t", "Bosch Model A12 white","\t\t", "1 Year", "\t\t", "AED 52"
item_choice=input("Enter your item choice: ")
if item_choice==1:
sum+=45
item+="Bosch Model F23 green"
warranty+="1 Year"
elif item_choice==2:
sum+=52
item+="Bosch Model A12 blue"
warranty+="1 Year"
elif item_choice==3:
sum+=52
item+="Bosch Model A12 white"
warranty+="1 Year"
elif brand_choice==2:
if budget_choice==1: # creates Vaccum Cleaner product list for for brand choice 2 budget choice 1
print "SL NO.", "\t\t", "ITEMS", "\t\t\t\t", "Warranty", "\t\t", "Price"
print "1", "\t\t", "Siemens Model E34 red","\t\t", "2 Years", "\t\t", "AED 34"
print "2", "\t\t", "Siemens Model B45 blue","\t\t", "1 Year", "\t\t", "AED 34"
print "3", "\t\t", "Siemens Model B45 black","\t\t", "1 Year", "\t\t", "AED 37"
item_choice=input("Enter your item choice: ")
if item_choice==1:
sum+=34
item+="Siemens Model E34 red"
warranty+="2 Years"
                    elif item_choice==2:
sum+=34
item+="Siemens Model B45 blue"
warranty+="1 Year"
                    elif item_choice==3:
sum+=37
item+="Siemens Model B45 black"
warranty+="1 Year"
elif budget_choice==2: # creates Vaccum Cleaner product list for for brand choice 2 budget choice 2
print "SL NO.", "\t\t", "ITEMS", "\t\t\t\t", "Warranty", "\t\t", "Price"
print "1", "\t\t", "Siemens Model J27 red","\t\t", "2 Years", "\t\t", "AED 49"
print "2", "\t\t", "Siemens Model J27 blue","\t\t", "2 Years", "\t\t", "AED 49"
print "3", "\t\t", "Siemens Model W12 white","\t\t", "1 Year", "\t\t", "AED 53"
item_choice=input("Enter your item choice: ")
if item_choice==1:
sum+=49
item+="Siemens Model J27 red"
warranty+="2 Years"
elif item_choice==2:
sum+=49
item+="Siemens Model J27 blue"
warranty+="2 Years"
elif item_choice==3:
sum+=53
item+="Siemens Model W12 white"
warranty+="1 Year"
elif brand_choice==3:
if budget_choice==1: # creates Vaccum Cleaner product list for for brand choice 3 budget choice 1
print "SL NO.", "\t\t", "ITEMS", "\t\t\t\t", "Warranty", "\t\t", "Price"
print "1", "\t\t", "Sharp Model E34 red","\t\t", "1 Year", "\t\t", "AED 38"
print "2", "\t\t", "Sharp Model B45 blue","\t\t", "2 Years", "\t\t", "AED 37"
print "3", "\t\t", "Sharp Model B45 black","\t\t", "2 Years", "\t\t", "AED 37"
item_choice=input("Enter your item choice: ")
if item_choice==1:
sum+=38
item+="Sharp Model E34 red"
warranty+="1 Year"
elif item_choice==2:
sum+=37
item+="Sharp Model B45 blue"
warranty+="2 Years"
elif item_choice==3:
sum+=37
item+="Sharp Model B45 black"
warranty+="2 Years"
elif budget_choice==2: # creates Vacuum Cleaner product list for brand choice 3, budget choice 2
print "SL NO.", "\t\t", "ITEMS", "\t\t\t\t", "Warranty", "\t\t", "Price"
print "1", "\t\t", "Sharp Model J27 red","\t\t", "1 Year", "\t\t", "AED 47"
print "2", "\t\t", "Sharp Model J27 blue","\t\t", "1 Year", "\t\t", "AED 47"
print "3", "\t\t", "Sharp Model W12 white","\t\t", "2 Years", "\t\t", "AED 51"
item_choice=input("Enter your item choice: ")
if item_choice==1:
sum+=47
item+="Sharp Model J27 red"
warranty+="1 Year"
elif item_choice==2:
sum+=47
item+="Sharp Model J27 blue"
warranty+="1 Year"
elif item_choice==3:
sum+=51
item+="Sharp Model W12 white"
warranty+="2 Years"
elif product_choice==2:
if brand_choice==1:
if budget_choice==1: # creates Juicer product list for brand choice 1, budget choice 1
print "SL NO.", "\t\t", "ITEMS", "\t\t\t\t", "Warranty", "\t\t", "Price"
print "1", "\t\t", "LG Model A03 red","\t\t", "2 Years", "\t\t", "AED 120"
print "2", "\t\t", "LG Model A03 blue","\t\t", "2 Years", "\t\t", "AED 120"
print "3", "\t\t", "LG Model C35 white","\t\t", "1 Year", "\t\t", "AED 117"
item_choice=input("Enter your item choice: ")
if item_choice==1:
sum+=120
item+="LG Model A03 red"
warranty+="2 Years"
elif item_choice==2:
sum+=120
item+="LG Model A03 blue"
warranty+="2 Years"
elif item_choice==3:
sum+=117
item+="LG Model C35 white"
warranty+="1 Year"
elif budget_choice==2: # creates Juicer product list for brand choice 1, budget choice 2
print "SL NO.", "\t\t", "ITEMS", "\t\t\t\t", "Warranty", "\t\t", "Price"
print "1", "\t\t", "LG Model I23 blue","\t\t", "1 Year", "\t\t", "AED 140"
print "2", "\t\t", "LG Model K82 red","\t\t", "1 Year", "\t\t", "AED 145"
print "3", "\t\t", "LG Model K82 white","\t\t", "1 Year", "\t\t", "AED 145"
item_choice=input("Enter your item choice: ")
if item_choice==1:
sum+=140
item+="LG Model I23 blue"
warranty+="1 Year"
elif item_choice==2:
sum+=145
item+="LG Model K82 red"
warranty+="1 Year"
elif item_choice==3:
sum+=145
item+="LG Model K82 white"
warranty+="1 Year"
elif brand_choice==2:
if budget_choice==1: # creates Juicer product list for brand choice 2, budget choice 1
print "SL NO.", "\t\t", "ITEMS", "\t\t\t\t" "Warranty", "\t\t" "Price"
print "1", "\t\t", "Panasonic Model G38 white","\t\t", "2 Years", "\t\t", "AED 119"
print "2", "\t\t", "Panasonic Model H45 blue","\t\t", "1 Year", "\t\t", "AED 124"
print "3", "\t\t", "Panasonic Model H45 black","\t\t", "1 Year", "\t\t", "AED 124"
item_choice=input("Enter your item choice: ")
if item_choice==1:
sum+=119
item+="Panasonic Model G38 white"
warranty+="2 Years"
elif item_choice==2:
sum+=124
item+="Panasonic Model H45 blue"
warranty+="1 Year"
elif item_choice==3:
sum+=124
item+="Panasonic Model H45 black"
warranty+="1 Year"
elif budget_choice==2: # creates Juicer product list for brand choice 2, budget choice 2
print "SL NO.", "\t\t", "ITEMS", "\t\t\t\t", "Warranty", "\t\t", "Price"
print "1", "\t\t", "Panasonic Model L07 red","\t\t", "2 Years", "\t\t", "AED 145"
print "2", "\t\t", "Panasonic Model L07 blue","\t\t", "2 Years", "\t\t", "AED 145"
print "3", "\t\t", "Panasonic Model M52 white","\t\t", "1 Year", "\t\t", "AED 138"
item_choice=input("Enter your item choice: ")
if item_choice==1:
sum+=145
item+="Panasonic Model L07 red"
warranty+="2 Years"
elif item_choice==2:
sum+=145
item+="Panasonic Model L07 blue"
warranty+="2 Years"
elif item_choice==3:
sum+=138
item+="Panasonic Model M52 white"
warranty+="1 Year"
elif brand_choice==3:
if budget_choice==1: # creates Juicer product list for brand choice 3, budget choice 1
print "SL NO.", "\t\t", "ITEMS", "\t\t\t\t", "Warranty", "\t\t", "Price"
print "1", "\t\t", "Kenwood Model U84 red","\t\t", "2 Years", "\t\t", "AED 120"
print "2", "\t\t", "Kenwood Model E95 blue","\t\t", "1 Year", "\t\t", "AED 125"
print "3", "\t\t", "Kenwood Model E95 black","\t\t", "1 Year", "\t\t", "AED 125"
item_choice=input("Enter your item choice: ")
if item_choice==1:
sum+=120
item+="Kenwood Model U84 red"
warranty+="2 Years"
elif item_choice==2:
sum+=125
item+="Kenwood Model E95 blue"
warranty+="1 Year"
elif item_choice==3:
sum+=125
item+="Kenwood Model E95 black"
warranty+="1 Year"
elif budget_choice==2: # creates Juicer product list for brand choice 3, budget choice 2
print "SL NO.", "\t\t", "ITEMS", "\t\t\t\t", "Warranty", "\t\t", "Price"
print "1", "\t\t", "Kenwood Model Y47 red","\t\t", "1 Year", "\t\t", "AED 150"
print "2", "\t\t", "Kenwood Model Y47 blue","\t\t", "1 Year", "\t\t", "AED 150"
print "3", "\t\t", "Kenwood Model Q17 white","\t\t", "2 Years", "\t\t", "AED 140"
item_choice=input("Enter your item choice: ")
if item_choice==1:
sum+=150
item+="Kenwood Model Y47 red"
warranty+="1 Year"
elif item_choice==2:
sum+=150
item+="Kenwood Model Y47 blue"
warranty+="1 Year"
elif item_choice==3:
sum+=140
item+="Kenwood Model Q17 white"
warranty+="2 Years"
elif product_choice==3:
if brand_choice==1:
if budget_choice==1: # creates Mixer product list for brand choice 1, budget choice 1
print "SL NO.", "\t\t", "ITEMS", "\t\t\t\t", "Warranty", "\t\t", "Price"
print "1", "\t\t", "Kenwood Model A03 red","\t\t", "2 Years", "\t\t", "AED 120"
print "2", "\t\t", "Kenwood Model A03 blue","\t\t", "2 Years", "\t\t", "AED 120"
print "3", "\t\t", "Kenwood Model C35 white","\t\t", "1 Year", "\t\t", "AED 135"
item_choice=input("Enter your item choice: ")
if item_choice==1:
sum+=120
item+="Kenwood Model A03 red"
warranty+="2 Years"
elif item_choice==2:
sum+=120
item+="Kenwood Model A03 blue"
warranty+="2 Years"
elif item_choice==3:
sum+=135
item+="Kenwood Model Q17 white"
warranty+="1 Year"
elif budget_choice==2:
print "SL NO.", "\t\t", "ITEMS", "\t\t\t\t", "Warranty", "\t\t", "Price"
print "1", "\t\t", "Kenwood Model I23 blue","\t\t", "1 Years", "\t\t", "AED 140"
print "2", "\t\t", "Kenwood Model K82 red","\t\t", "1 Years", "\t\t", "AED 145"
print "3", "\t\t", "Kenwood Model K82 white","\t\t", "1 Year", "\t\t", "AED 145"
item_choice=input("Enter your item choice: ")
if item_choice==1:
sum+=140
item+="Kenwood Model A03 red"
warranty+="1 Year"
elif item_choice==2:
sum+=140
item+="Kenwood Model A03 blue"
warranty+="1 Year"
elif item_choice==3:
sum+=145
item+="Kenwood Model Q17 white"
warranty+="1 Year"
elif brand_choice==2:
if budget_choice==1:
print "SL NO.", "\t\t", "ITEMS", "\t\t\t\t", "Warranty", "\t\t", "Price"
print "1", "\t\t", "Panasonic Model G38 white","\t\t", "2 Years", "\t\t", "AED 119"
print "2", "\t\t", "Panasonic Model H45 blue","\t\t", "1 Years", "\t\t", "AED 124"
print "3", "\t\t", "Panasonic Model H45 black","\t\t", "1 Year", "\t\t", "AED 124"
item_choice=input("Enter your item choice: ")
if item_choice==1:
sum+=140
item+="Panasonic Model G38 white"
warranty+="2 Years"
elif item_choice==2:
sum+=140
item+="Panasonic Model H45 blue"
warranty+="1 Year"
elif item_choice==3:
sum+=145
item+="Panasonic Model H45 black"
warranty+="1 Year"
elif budget_choice==2:
print "SL NO.", "\t\t", "ITEMS", "\t\t\t\t", "Warranty", "\t\t", "Price"
print "1", "\t\t", "Panasonic Model L07 red","\t\t", "2 Years", "\t\t", "AED 145"
print "2", "\t\t", "Panasonic Model L07 blue","\t\t", "2 Years", "\t\t", "AED 145"
print "3", "\t\t", "Panasonic Model M52 white","\t\t", "1 Year", "\t\t", "AED 150"
item_choice=input("Enter your item choice: ")
if item_choice==1:
sum+=145
item+="Panasonic Model L07 red"
warranty+="2 Years"
elif item_choice==2:
sum+=145
item+="Panasonic Model L07 blue"
warranty+="2 Years"
elif item_choice==3:
sum+=150
item+="Panasonic Model M52 white"
warranty+="1 Year"
elif brand_choice==3:
if budget_choice==1:
print "SL NO.", "\t\t", "ITEMS", "\t\t\t\t", "Warranty", "\t\t", "Price"
print "1", "\t\t", "Philips Model U84 red","\t\t", "2 Years", "\t\t", "AED 120"
print "2", "\t\t", "Philips Model E95 blue","\t\t", "1 Year", "\t\t", "AED 125"
print "3", "\t\t", "Philips Model E95 black","\t\t", "1 Year", "\t\t", "AED 125"
item_choice=input("Enter your item choice: ")
if item_choice==1:
sum+=120
item+="Philips Model U84 red"
warranty+="2 Years"
elif item_choice==2:
sum+=125
item+="Philips Model E95 blue"
warranty+="1 Year"
elif item_choice==3:
sum+=125
item+="Philips Model E95 black"
warranty+="1 Year"
elif budget_choice==2:
print "SL NO.", "\t\t", "ITEMS", "\t\t\t\t", "Warranty", "\t\t", "Price"
print "1", "\t\t", "Philips Model Y47 red","\t\t", "1 Year", "\t\t", "AED 150"
print "2", "\t\t", "Philips Model Y47 blue","\t\t", "1 Year", "\t\t", "AED 150"
print "3", "\t\t", "Philips Model Q17 white","\t\t", "2 Years", "\t\t", "AED 143"
if item_choice==1:
sum+=150
item+="Philips Model Y47 red"
warranty+="1 Year"
elif item_choice==2:
sum+=150
item+="Philips Model Y47 blue"
warranty+="1 Year"
elif item_choice==3:
sum+=143
item+="Philips Model Q17 white"
warranty+="2 Years"
elif choice==2:
if product_choice==1:
if brand_choice==1:
if budget_choice==1:
print "SL NO.", "\t\t", "ITEMS", "\t\t\t\t", "Warranty", "\t\t", "Price"
print "1", "\t\t", "Sony TV Black","\t\t", "2 Years", "\t\t", "AED 4999"
print "2", "\t\t", "Sony TV Grey","\t\t", "2 Years", "\t\t", "AED 4999"
item_choice=input("Enter your item choice: ")
if item_choice==1:
sum+=4999
item+="Sony TV Black"
warranty+="2 Years"
elif item_choice==2:
sum+=4999
item+="Sony TV Grey"
warranty+="2 Years"
elif budget_choice==2:
print "SL NO.", "\t\t", "ITEMS", "\t\t\t\t", "Warranty", "\t\t", "Price"
print "1", "\t\t", "Sony HD TV Black","\t\t", "1 Year", "\t\t", "AED 5000"
print "2", "\t\t", "Sony Curve TV Black","\t\t", "1 Year", "\t\t", "AED 5199"
print "3", "\t\t", "Sony Curve TV Grey","\t\t", "1 Year", "\t\t", "AED 5199"
item_choice=input("Enter your item choice: ")
if item_choice==1:
sum+=5000
item+="Sony HD TV Black"
warranty+="1 Year"
elif item_choice==2:
sum+=5199
item+="Sony Curve TV Black"
warranty+="1 Year"
elif item_choice==3:
sum+=5199
item+="Sony Curve TV Grey"
warranty+="1 Year"
elif brand_choice==2:
if budget_choice==1:
print "SL NO.", "\t\t", "ITEMS", "\t\t\t\t", "Warranty", "\t\t", "Price"
print "1", "\t\t", "Samsung TV Black","\t\t", "2 Years", "\t\t", "AED 4850"
print "2", "\t\t", "Samsung TV Grey","\t\t", "1 Year", "\t\t", "AED 4850"
item_choice=input("Enter your item choice: ")
if item_choice==1:
sum+=4850
item+="Samsung TV Black"
warranty+="2 Years"
elif item_choice==2:
sum+=4850
item+="Samsung TV Grey"
warranty+="1 Year"
elif budget_choice==2:
print "SL NO.", "\t\t", "ITEMS", "\t\t\t\t", "Warranty", "\t\t", "Price"
print "1", "\t\t", "Samsung HD TV Black","\t\t", "1 Year", "\t\t", "AED 5290"
print "2", "\t\t", "Samsung Curve TV Black","\t\t", "1 Year", "\t\t", "AED 5000"
print "3", "\t\t", "Samsung Curve TV Grey","\t\t", "2 Years", "\t\t", "AED 5000"
item_choice=input("Enter your item choice: ")
if item_choice==1:
sum+=5290
item+="Samsung HD TV Black"
warranty+="1 Year"
elif item_choice==2:
sum+=5000
item+="Samsung Curve TV Black"
warranty+="1 Year"
elif item_choice==3:
sum+=5000
item+="Samsung Curve TV Grey"
warranty+="2 Years"
elif brand_choice==3:
if budget_choice==1:
print "SL NO.", "\t\t", "ITEMS", "\t\t\t\t", "Warranty", "\t\t", "Price"
print "1", "\t\t", "LG TV Black","\t\t", "1 Year", "\t\t", "AED 4999"
print "2", "\t\t", "LG TV Grey","\t\t", "2 Years", "\t\t", "AED 4999"
item_choice=input("Enter your item choice: ")
if item_choice==1:
sum+=4999
item+="LG TV Black"
warranty+="1 Year"
elif item_choice==2:
sum+=4999
item+="LG TV Grey"
warranty+="2 Years"
elif budget_choice==2:
print "SL NO.", "\t\t", "ITEMS", "\t\t\t\t", "Warranty", "\t\t", "Price"
print "1", "\t\t", "LG HD TV Black","\t\t", "1 Years", "\t\t", "AED 5100"
print "2", "\t\t", "LG Curve TV Black","\t\t", "1 Years", "\t\t", "AED 5200"
print "3", "\t\t", "LG Curve TV Grey","\t\t", "2 Year", "\t\t", "AED 5200"
item_choice=input("Enter your item choice: ")
if item_choice==1:
sum+=5100
item+="LG HD TV Black"
warranty+="1 Year"
elif item_choice==2:
sum+=5200
item+="LG Curve TV Black"
warranty+="1 Year"
elif item_choice==3:
sum+=5200
item+="LG Curve TV Grey"
warranty+="2 Years"
elif product_choice==2:
if brand_choice==1:
if budget_choice==1:
print "SL NO.", "\t\t", "ITEMS", "\t\t\t\t", "Warranty", "\t\t", "Price"
print "1", "\t\t", "Apple iPhone 4","\t\t", "2 Years", "\t\t", "AED 1799"
print "2", "\t\t", "Apple iPhone 4s","\t\t", "2 Years", "\t\t", "AED 1899"
print "3", "\t\t", "Apple iPhone 5s","\t\t", "1 Year", "\t\t", "AED 1945"
item_choice=input("Enter your item choice: ")
if item_choice==1:
sum+=1799
item+="Apple iPhone 4"
warranty+="2 Years"
elif item_choice==2:
sum+=1899
item+="Apple iPhone 4s"
warranty+="2 Years"
elif item_choice==3:
sum+=1945
item+="Apple iPhone 5s"
warranty+="1 Year"
elif budget_choice==2:
print "SL NO.", "\t\t", "ITEMS", "\t\t\t\t", "Warranty", "\t\t", "Price"
print "1", "\t\t", "Apple iPhone 6","\t\t", "1 Year", "\t\t", "AED 2000"
print "2", "\t\t", "Apple iPhone 6+","\t\t", "1 Year", "\t\t", "AED 2100"
print "3", "\t\t", "Apple iPhone 7","\t\t", "1 Year", "\t\t", "AED 2235"
item_choice=input("Enter your item choice: ")
if item_choice==1:
sum+=2000
item+="Apple iPhone 6"
warranty+="1 Year"
elif item_choice==2:
sum+=2100
item+="Apple iPhone 6+"
warranty+="1 Year"
elif item_choice==3:
sum+=2235
item+="Apple iPhone 7"
warranty+="1 Year"
elif brand_choice==2:
if budget_choice==1:
print "SL NO.", "\t\t", "ITEMS", "\t\t\t\t", "Warranty", "\t\t", "Price"
print "1", "\t\t", "Samsung galaxy S4","\t\t", "2 Years", "\t\t", "AED 1845"
print "2", "\t\t", "Samsung galaxy S5","\t\t", "1 Years", "\t\t", "AED 1945"
item_choice=input("Enter your item choice: ")
if item_choice==1:
sum+=1845
item+="Samsung galaxy S4"
warranty+="2 Years"
elif item_choice==2:
sum+=1945
item+="Samsung galaxy S5"
warranty+="1 Year"
elif budget_choice==2:
print "SL NO.", "\t\t", "ITEMS", "\t\t\t\t", "Warranty", "\t\t", "Price"
print "1", "\t\t", "Samsung galaxy S6","\t\t", "1 Years", "\t\t", "AED 2000"
print "2", "\t\t", "Samsung galaxy S7","\t\t", "1 Years", "\t\t", "AED 2135"
print "3", "\t\t", "Samsung galaxy S7 Edge","\t\t", "2 Year", "\t\t", "AED 2235"
item_choice=input("Enter your item choice: ")
if item_choice==1:
sum+=2000
item+="Samsung galaxy S6"
warranty+="1 Year"
elif item_choice==2:
sum+=2135
item+="Samsung galaxy S7"
warranty+="1 Year"
elif item_choice==3:
sum+=2235
item+="Samsung galaxy S7 Edge"
warranty+="2 Years"
elif brand_choice==3:
if budget_choice==1:
print "SL NO.", "\t\t", "ITEMS", "\t\t\t\t", "Warranty", "\t\t", "Price"
print "1", "\t\t", "HTC 10 Silver","\t\t", "2 Years", "\t\t", "AED 1799"
print "2", "\t\t", "HTC One X9","\t\t", "1 Year", "\t\t", "AED 1899"
item_choice=input("Enter your item choice: ")
if item_choice==1:
sum+=1799
item+="HTC 10 Silver"
warranty+="2 Years"
elif item_choice==2:
sum+=1899
item+="HTC One X9"
warranty+="1 Year"
elif budget_choice==2:
print "SL NO.", "\t\t", "ITEMS", "\t\t\t\t", "Warranty", "\t\t", "Price"
print "1", "\t\t", "HTC One A9","\t\t", "1 Year", "\t\t", "AED 1995"
print "2", "\t\t", "HTC One A9s","\t\t", "1 Year", "\t\t", "AED 2135"
item_choice=input("Enter your item choice: ")
if item_choice==1:
sum+=1995
item+="HTC One A9"
warranty+="1 Year"
elif item_choice==2:
sum+=2135
item+="HTC One A9s"
warranty+="1 Year"
elif product_choice==3:
if brand_choice==1:
if budget_choice==1:
print "SL NO.", "\t\t", "ITEMS", "\t\t\t\t", "Warranty", "\t\t", "Price"
print "1", "\t\t", "Lenovo ideapad 300","\t\t", "2 Years", "\t\t", "AED 2550"
print "2", "\t\t", "Lenovo ideapad 500","\t\t", "2 Years", "\t\t", "AED 2570"
item_choice=input("Enter your item choice: ")
if item_choice==1:
sum+=2550
item+="Lenovo ideapad 300"
warranty+="2 Years"
elif item_choice==2:
sum+=2570
item+="Lenovo ideapad 500"
warranty+="2 Years"
elif budget_choice==2:
print "SL NO.", "\t\t", "ITEMS", "\t\t\t\t", "Warranty", "\t\t", "Price"
print "1", "\t\t", "Lenovo ideapad 700","\t\t", "1 Year", "\t\t", "AED 5190"
print "2", "\t\t", "Lenovo Thinkpad Yoga","\t\t", "2 Years", "\t\t", "AED 5290"
item_choice=input("Enter your item choice: ")
if item_choice==1:
sum+=5190
item+="Lenovo ideapad 700"
warranty+="1 Year"
elif item_choice==2:
sum+=5290
item+="Lenovo Thinkpad Yoga"
warranty+="2 Years"
elif brand_choice==2:
if budget_choice==1:
print "SL NO.", "\t\t", "ITEMS", "\t\t\t\t", "Warranty", "\t\t", "Price"
print "1", "\t\t", "HP Pavilion X2","\t\t", "2 Years", "\t\t", "AED 2450"
print "2", "\t\t", "HP 250 Notebook","\t\t", "1 Years", "\t\t", "AED 2850"
item_choice=input("Enter your item choice: ")
if item_choice==1:
sum+=2450
item+="HP Pavilion X2"
warranty+="2 Years"
elif item_choice==2:
sum+=2850
item+="HP 250 Notebook"
warranty+="1 Year"
elif budget_choice==2:
print "SL NO.", "\t\t", "ITEMS", "\t\t\t\t", "Warranty", "\t\t", "Price"
print "1", "\t\t", "HP Stream 13","\t\t", "2 Years", "\t\t", "AED 3100"
print "2", "\t\t", "HP Probook 450","\t\t", "2 Years", "\t\t", "AED 3569"
item_choice=input("Enter your item choice: ")
if item_choice==1:
sum+=3100
item+="HP Stream 13"
warranty+="2 Years"
elif item_choice==2:
sum+=3569
item+="HP Probook 450"
warranty+="2 Years"
elif brand_choice==3:
if budget_choice==1:
print "SL NO.", "\t\t", "ITEMS", "\t\t\t\t", "Warranty", "\t\t", "Price"
print "1", "\t\t", "Dell Inspiron 3558","\t\t", "2 Years", "\t\t", "AED 2699"
print "2", "\t\t", "Dell Inspiron 5559","\t\t", "1 Years", "\t\t", "AED 2899"
item_choice=input("Enter your item choice: ")
if item_choice==1:
sum+=2699
item+="Dell Inspiron 3558"
warranty+="2 Years"
elif item_choice==2:
sum+=2899
item+="Dell Inspiron 5559"
warranty+="1 Year"
elif budget_choice==2:
print "SL NO.", "\t\t", "ITEMS", "\t\t\t\t", "Warranty", "\t\t", "Price"
print "1", "\t\t", "Dell Inspiron 7559","\t\t", "1 Year", "\t\t", "AED 3000"
print "2", "\t\t", "Dell Inspiron 7568","\t\t", "1 Year", "\t\t", "AED 3200"
item_choice=input("Enter your item choice: ")
if item_choice==1:
sum+=3000
item+="Dell Inspiron 7559"
warranty+="1 Year"
elif item_choice==2:
sum+=3200
item+="Dell Inspiron 7568"
warranty+="1 Year"
elif choice==3:
if product_choice==1:
if brand_choice==1:
if budget_choice==1:
print "SL NO.", "\t\t", "ITEMS", "\t\t\t\t", "Price"
print "1", "\t\t", "Octave Cowboy Shirt black","\t\t", "AED 37"
print "2", "\t\t", "Octave Cowboy Shirt yellow","\t\t", "AED 37"
print "3", "\t\t", "Octave blue t-shirt","\t\t", "AED 41"
item_choice=input("Enter your item choice: ")
if item_choice==1:
sum+=37
item+="Octave Cowboy Shirt black"
warranty+="-"
elif item_choice==2:
sum+=37
item+="Octave Cowboy Shirt yellow"
warranty+="-"
elif item_choice==3:
sum+=41
item+="Octave blue t-shirt"
warranty+="-"
elif budget_choice==2:
print "SL NO.", "\t\t", "ITEMS", "\t\t\t\t", "Price"
print "1", "\t\t", "Octave full sleves Shirt Blue","\t\t", "AED 65"
print "2", "\t\t", "Octave brown t-shirt","\t\t", "AED 52"
item_choice=input("Enter your item choice: ")
if item_choice==1:
sum+=65
item+="Octave full sleves Shirt Blue"
warranty+="-"
elif item_choice==2:
sum+=52
item+="Octave brown t-shirt"
warranty+="-"
elif brand_choice==2:
if budget_choice==1:
print "SL NO.", "\t\t", "ITEMS", "\t\t\t\t", "Price"
print "1", "\t\t", "Giordano Cowboy Shirt black","\t\t", "AED 44"
print "2", "\t\t", "Giordano Cowboy Shirt brown","\t\t", "AED 44"
item_choice=input("Enter your item choice: ")
if item_choice==1:
sum+=44
item+="Giordano Cowboy Shirt black"
warranty+="-"
elif item_choice==2:
sum+=44
item+="Giordano Cowboy Shirt brown"
warranty+="-"
elif budget_choice==2:
print "SL NO.", "\t\t", "ITEMS", "\t\t\t\t", "Price"
print "1", "\t\t", "Giordano full sleves Shirt Orange","\t", "AED 62"
print "2", "\t\t", "Giordano brown t-shirt","\t\t", "AED 53"
print "3", "\t\t", "Giordano blue t-shirt","\t\t", "AED 53"
item_choice=input("Enter your item choice: ")
if item_choice==1:
sum+=62
item+="Giordano full sleves Shirt Orange"
warranty+="-"
elif item_choice==2:
sum+=53
item+="Giordano brown t-shirt"
warranty+="-"
elif item_choice==3:
sum+=53
item+="Giordano blue t-shirt"
warranty+="-"
elif brand_choice==3:
if budget_choice==1:
print "SL NO.", "\t\t", "ITEMS", "\t\t\t\t", "Price"
print "1", "\t\t", "Lee Cooper Cowboy Shirt black","\t\t", "AED 38"
print "2", "\t\t", "Lee Cooper Cowboy Shirt brown","\t\t", "AED 37"
print "3", "\t\t", "Lee Cooper brown t-shirt","\t", "AED 37"
item_choice=input("Enter your item choice: ")
if item_choice==1:
sum+=38
item+="Lee Cooper Cowboy Shirt black"
warranty+="-"
elif item_choice==2:
sum+=37
item+="Lee Cooper Cowboy Shirt brown"
warranty+="-"
elif item_choice==3:
sum+=37
item+="Lee Cooper brown t-shirt"
warranty+="-"
elif budget_choice==2:
print "SL NO.", "\t\t", "ITEMS", "\t\t\t\t", "Price"
print "1", "\t\t", "Lee Cooper full sleves Shirt Red","\t\t", "AED 59"
print "2", "\t\t", "Lee Cooper blue t-shirt","\t\t", "AED 66"
item_choice=input("Enter your item choice: ")
if item_choice==1:
sum+=59
item+="Lee Cooper full sleves Shirt red"
warranty+="-"
elif item_choice==2:
sum+=66
item+="Lee Cooper blue t-shirt"
warranty+="-"
elif product_choice==2:
if brand_choice==1:
if budget_choice==1:
print "SL NO.", "\t\t", "ITEMS", "\t\t\t\t", "Price"
print "1", "\t\t", "Splash Jeans Black","\t\t", "AED 40"
print "2", "\t\t", "Splash Jeans Blue","\t\t", "AED 40"
item_choice=input("Enter your item choice: ")
if item_choice==1:
sum+=40
item+="Splash Jeans Black"
warranty+="-"
elif item_choice==2:
sum+=40
item+="Splash Jeans Blue"
warranty+="-"
elif budget_choice==2:
print "SL NO.", "\t\t", "ITEMS", "\t\t\t\t", "Price"
print "1", "\t\t", "Splash pant Black","\t\t", "AED 51"
print "2", "\t\t", "Splash pant Brown","\t\t", "AED 51"
print "3", "\t\t", "Splash pant Blue","\t\t" "AED 51"
item_choice=input("Enter your item choice: ")
if item_choice==1:
sum+=51
item+="Splash pant Black"
warranty+="-"
elif item_choice==2:
sum+=51
item+="Splash pant Brown"
warranty+="-"
elif item_choice==3:
sum+=51
item+="Splash pant Blue"
warranty+="-"
elif brand_choice==2:
if budget_choice==1:
print "SL NO.", "\t\t", "ITEMS", "\t\t\t\t", "Price"
print "1", "\t\t", "Being Human Jeans Black","\t\t", "AED 45"
print "2", "\t\t", "Being Human Jeans Blue","\t\t", "AED 45"
item_choice=input("Enter your item choice: ")
if item_choice==1:
sum+=45
item+="Being Human Jeans Black"
warranty+="-"
elif item_choice==2:
sum+=45
item+="Being Human Jeans Blue"
warranty+="-"
elif budget_choice==2:
print "SL NO.", "\t\t", "ITEMS", "\t\t\t\t", "Price"
print "1", "\t\t", "Being Human pant black","\t\t", "AED 59"
print "2", "\t\t", "Being Human pant Brown","\t\t", "AED 59"
print "3", "\t\t", "Being Human pant Blue","\t\t" "AED 59"
item_choice=input("Enter your item choice: ")
if item_choice==1:
sum+=59
item+="Being Human pant Black"
warranty+="-"
elif item_choice==2:
sum+=59
item+="Being Human pant Brown"
warranty+="-"
elif item_choice==3:
sum+=59
item+="Being Human pant Blue"
warranty+="-"
elif brand_choice==3:
if budget_choice==1:
print "SL NO.", "\t\t", "ITEMS", "\t\t\t\t", "Price"
print "1", "\t\t", "Max Jeans Black","\t\t", "AED 43"
print "2", "\t\t", "Max Jeans Blue","\t\t", "AED 43"
item_choice=input("Enter your item choice: ")
if item_choice==1:
sum+=43
item+="Max Jeans Black"
warranty+="-"
elif item_choice==2:
sum+=43
item+="Max Human Jeans Blue"
warranty+="-"
elif budget_choice==2:
print "SL NO.", "\t\t", "ITEMS", "\t\t\t\t", "Price"
print "1", "\t\t", "Max pant black","\t\t", "AED 57"
print "2", "\t\t", "Max pant Brown","\t\t", "AED 57"
print "3", "\t\t", "Max pant Blue","\t\t" "AED 57"
item_choice=input("Enter your item choice: ")
if item_choice==1:
sum+=57
item+="Max pant Black"
warranty+="-"
elif item_choice==2:
sum+=57
item+="Max pant Brown"
warranty+="-"
elif item_choice==3:
sum+=57
item+="Max pant Blue"
warranty+="-"
elif product_choice==3:
if brand_choice==1:
if budget_choice==1:
print "SL NO.", "\t\t", "ITEMS", "\t\t\t\t", "Price"
print "1", "\t\t", "Max Hoodie Blue","\t\t", "AED 60"
print "2", "\t\t", "Max Hoodie White","\t\t" "AED 60"
print "3", "\t\t", "Max Hoodie Black","\t\t", "AED 60"
item_choice=input("Enter your item choice: ")
if item_choice==1:
sum+=60
item+="Max Hoodie Blue"
warranty+="-"
elif item_choice==2:
sum+=60
item+="Max Hoodie White"
warranty+="-"
elif item_choice==3:
sum+=60
item+="Max Hoodie Black"
warranty+="-"
elif choice==4:
if product_choice==1:
if brand_choice==1:
if budget_choice==1:
print "SL NO.", "\t\t", "ITEMS", "\t\t\t\t", "Price"
print "1", "\t\t", "Helix HP break proof pencil", "\t\t" "AED 2"
print "2", "\t\t", "Helix HP pencil","\t\t" "AED 1"
item_choice=input("Enter your item choice: ")
if item_choice==1:
sum+=2
item+="Helix HP break proof pencil"
warranty+="-"
elif item_choice==2:
sum+=1
item+="Helix HP pencil"
warranty+="-"
elif product_choice==2:
if brand_choice==1:
if budget_choice==1:
print "SL NO.", "\t\t", "ITEMS", "\t\t\t\t", "Price"
print "1", "\t\t", "Uniball Red 0.5 mm","\t\t", "AED 1"
print "2", "\t\t", "Uniball Blue 0.5 mm","\t\t" "AED 1"
print "3", "\t\t", "Uniball Black 0.5 mm","\t\t", "AED 1"
item_choice=input("Enter your item choice: ")
if item_choice==1:
sum+=1
item+="Uniball Red 0.5 mm"
warranty+="-"
elif item_choice==2:
sum+=1
item+="Uniball Blue 0.5 mm"
warranty+="-"
elif item_choice==3:
sum+=1
item+="Uniball Black 0.5 mm"
warranty+="-"
elif budget_choice==2:
print "SL NO.", "\t\t", "ITEMS", "\t\t\t\t", "Price"
print "1", "\t\t", "Uniball Red 0.7 mm","\t\t", "AED 3"
print "2", "\t\t", "Uniball Blue 0.7 mm","\t\t", "AED 3"
print "3", "\t\t", "Uniball Black 0.7 mm","\t\t", "AED 3"
item_choice=input("Enter your item choice: ")
if item_choice==1:
sum+=3
item+="Uniball Red 0.7 mm"
warranty+="-"
elif item_choice==2:
sum+=3
item+="Uniball Blue 0.7 mm"
warranty+="-"
elif item_choice==3:
sum+=3
item+="Uniball Black 0.7 mm"
warranty+="-"
elif brand_choice==2:
if budget_choice==1:
print "SL NO.", "\t\t", "ITEMS", "\t\t\t\t", "Price"
print "1", "\t\t", "Faber Castell Red 0.5 mm","\t\t", "AED 1"
print "2", "\t\t", "Faber Castell Blue 0.5 mm","\t\t" "AED 1"
print "3", "\t\t", "Faber Castell Black 0.5 mm","\t\t", "AED 1"
item_choice=input("Enter your item choice: ")
if item_choice==1:
sum+=1
item+="Faber Castell Red 0.5 mm"
warranty+="-"
elif item_choice==2:
sum+=1
item+="Faber Castell Blue 0.5 mm"
warranty+="-"
elif item_choice==3:
sum+=1
item+="Faber Castell Black 0.5 mm"
warranty+="-"
elif budget_choice==2:
print "SL NO.", "\t\t", "ITEMS", "\t\t\t\t", "Price"
print "1", "\t\t", "Faber Castell Red 0.7 mm","\t\t", "AED 3"
print "2", "\t\t", "Faber Castell Blue 0.7 mm","\t\t", "AED 3"
print "3", "\t\t", "Faber Castell 0.7 mm","\t", "AED 3"
item_choice=input("Enter your item choice: ")
if item_choice==1:
sum+=3
item+="Faber Castell Red 0.7 mm"
warranty+="-"
elif item_choice==2:
sum+=3
item+="Faber Castell Blue 0.7 mm"
warranty+="-"
elif item_choice==3:
sum+=3
item+="Faber Castell Black 0.7 mm"
warranty+="-"
elif product_choice==3:
if brand_choice==1:
if budget_choice==1:
print "SL NO.", "\t\t", "ITEMS", "\t\t\t\t", "Price"
print "1", "\t\t", "Faber Castell Eraser", "\t\t", "AED 1"
item_choice=input("Enter your item choice: ")
if item_choice==1:
sum+=1
item+="Faber Castell Eraser"
warranty+="-"
k=item # selected item name
s=warranty # selected warranty period
d=str(sum) # price as a string for the bill listing
x=sum # numeric price for the running total
def process():
print "\n"
print "Choose your item: "
genre_of_product()
product_menu(choice)
brand(choice, product_choice)
budget(choice, product_choice)
specific_item(choice, product_choice, brand_choice, budget_choice)
def shipping_detials():
print "\n"
global reference_number
global shipping_choice
import random
print "How do you want to get your product(s)?"
print "1. Cash on delivery"
print "2. Come and collect"
shipping_choice=input("Enter your choice: ")
if shipping_choice==1:
address=raw_input("Enter you Address(in one line): ")
reference_number=random.randint(1,10000)
print "Your Extra Charge for delivery is AED 60"
elif shipping_choice==2:
reference_number=random.randint(1,10000)
print "Please give this booking reference number when you get the delivery"
print "\nYour Booking Reference Number is", reference_number
def main():
print "Welcome to 'OOHSbay.com'"
print "\n"
n=input("Enter the number of items you want to buy: ")
total_price=0
overall_items=[]
overall_warranty=[]
overall_price=[]
for i in range(n):
process()
overall_items.append(k)
overall_warranty.append(s)
overall_price.append(d)
total_price+=x
shipping_detials()
print "\n\n\n"
print "NEXT -"
print "1. Print Bill"
print "\nEnter '1' to print your bill"
print "\nREMEMBER: SHOW PRINTED COPY OF YOUR BILL WHEN YOU SHIP YOUR PRODUCTS"
bill_choice=input("Enter your choice: ")
print "SLNO", "\t\t\t\t\t", "ITEM", "\t\t\t\t\t", "WARRANTY", "\t\t", "PRICE"
for i in range(n):
print " ", i+1, "\t\t\t\t\t", overall_items[i], "\t\t\t\t", overall_warranty[i], "\t\t\t", overall_price[i],"Dhs"
print "Your Booking Reference Number is :",reference_number
if shipping_choice==1:
print "Your total cost is", total_price+60,"Dhs."
print"Your product will arrive in 3 days!"
elif shipping_choice==2:
print "Your total cost is", total_price,"Dhs."
main()
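# The nested if/elif ladder above repeats one pattern for every
# (choice, product, brand, budget) combination: print a menu, read an item
# number, then accumulate name/warranty/price. A table-driven sketch of the
# same idea is shown below; the catalog keys and the two sample entries are
# illustrative assumptions, not the program's full product list.

```python
# Sketch only: each key is (choice, product_choice, brand_choice, budget_choice);
# each value lists (item name, warranty, price) tuples in menu order.
CATALOG = {
    (1, 1, 1, 1): [
        ("Bosch Model F23 green", "1 Year", 45),
        ("Bosch Model A12 blue", "1 Year", 52),
        ("Bosch Model A12 white", "1 Year", 52),
    ],
    (2, 1, 1, 1): [
        ("Sony TV Black", "2 Years", 4999),
        ("Sony TV Grey", "2 Years", 4999),
    ],
}

def pick_item(choice, product, brand, budget, item_choice):
    """Return (name, warranty, price) for a 1-based menu selection."""
    items = CATALOG[(choice, product, brand, budget)]
    return items[item_choice - 1]
```

# With the data in one place, the menu printing and the price totalling each
# become a single loop instead of hundreds of near-identical branches.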
# --- test/test_general.py, from jwodder/apachelogs (MIT license) ---
from pathlib import Path
import pytest
from apachelogs import (
COMBINED,
VHOST_COMBINED,
InvalidEntryError,
LogEntry,
LogParser,
parse,
parse_lines,
)
def mkentry(entry, format, **attrs):
logentry = LogEntry(entry, format, [], [])
logentry.__dict__.update(attrs)
return logentry
VHOST_COMBINED_LOG_ENTRIES = [
mkentry(
'www.varonathe.org:80 203.62.1.80 - - [06/May/2019:06:28:20 +0000] "GET / HTTP/1.1" 301 577 "-" "Mozilla/5.0 (Windows NT 6.1; Win64; x64; rv:58.0) Gecko/20100101 Firefox/58.0"',
VHOST_COMBINED,
virtual_host="www.varonathe.org",
server_port=80,
remote_host="203.62.1.80",
remote_logname=None,
remote_user=None,
request_time=datetime(2019, 5, 6, 6, 28, 20, tzinfo=timezone.utc),
request_time_fields={
"timestamp": datetime(2019, 5, 6, 6, 28, 20, tzinfo=timezone.utc),
},
request_line="GET / HTTP/1.1",
final_status=301,
bytes_out=577,
headers_in={
"Referer": None,
"User-Agent": "Mozilla/5.0 (Windows NT 6.1; Win64; x64; rv:58.0) Gecko/20100101 Firefox/58.0",
},
directives={
"%v": "www.varonathe.org",
"%p": 80,
"%h": "203.62.1.80",
"%l": None,
"%u": None,
"%t": datetime(2019, 5, 6, 6, 28, 20, tzinfo=timezone.utc),
"%r": "GET / HTTP/1.1",
"%>s": 301,
"%O": 577,
"%{Referer}i": None,
"%{User-Agent}i": "Mozilla/5.0 (Windows NT 6.1; Win64; x64; rv:58.0) Gecko/20100101 Firefox/58.0",
},
),
mkentry(
'www.varonathe.org:80 203.62.1.80 - - [06/May/2019:06:28:20 +0000] "GET /robots.txt HTTP/1.1" 301 596 "-" "Mozilla/5.0 (Windows NT 6.1; Win64; x64; rv:58.0) Gecko/20100101 Firefox/58.0"',
VHOST_COMBINED,
virtual_host="www.varonathe.org",
server_port=80,
remote_host="203.62.1.80",
remote_logname=None,
remote_user=None,
request_time=datetime(2019, 5, 6, 6, 28, 20, tzinfo=timezone.utc),
request_time_fields={
"timestamp": datetime(2019, 5, 6, 6, 28, 20, tzinfo=timezone.utc),
},
request_line="GET /robots.txt HTTP/1.1",
final_status=301,
bytes_out=596,
headers_in={
"Referer": None,
"User-Agent": "Mozilla/5.0 (Windows NT 6.1; Win64; x64; rv:58.0) Gecko/20100101 Firefox/58.0",
},
directives={
"%v": "www.varonathe.org",
"%p": 80,
"%h": "203.62.1.80",
"%l": None,
"%u": None,
"%t": datetime(2019, 5, 6, 6, 28, 20, tzinfo=timezone.utc),
"%r": "GET /robots.txt HTTP/1.1",
"%>s": 301,
"%O": 596,
"%{Referer}i": None,
"%{User-Agent}i": "Mozilla/5.0 (Windows NT 6.1; Win64; x64; rv:58.0) Gecko/20100101 Firefox/58.0",
},
),
mkentry(
'www.varonathe.org:80 203.62.1.80 - - [06/May/2019:06:28:21 +0000] "POST /App6079ec68.php HTTP/1.1" 301 606 "-" "Mozilla/5.0 (Windows NT 6.1; Win64; x64; rv:58.0) Gecko/20100101 Firefox/58.0"',
VHOST_COMBINED,
virtual_host="www.varonathe.org",
server_port=80,
remote_host="203.62.1.80",
remote_logname=None,
remote_user=None,
request_time=datetime(2019, 5, 6, 6, 28, 21, tzinfo=timezone.utc),
request_time_fields={
"timestamp": datetime(2019, 5, 6, 6, 28, 21, tzinfo=timezone.utc),
},
request_line="POST /App6079ec68.php HTTP/1.1",
final_status=301,
bytes_out=606,
headers_in={
"Referer": None,
"User-Agent": "Mozilla/5.0 (Windows NT 6.1; Win64; x64; rv:58.0) Gecko/20100101 Firefox/58.0",
},
directives={
"%v": "www.varonathe.org",
"%p": 80,
"%h": "203.62.1.80",
"%l": None,
"%u": None,
"%t": datetime(2019, 5, 6, 6, 28, 21, tzinfo=timezone.utc),
"%r": "POST /App6079ec68.php HTTP/1.1",
"%>s": 301,
"%O": 606,
"%{Referer}i": None,
"%{User-Agent}i": "Mozilla/5.0 (Windows NT 6.1; Win64; x64; rv:58.0) Gecko/20100101 Firefox/58.0",
},
),
mkentry(
'www.varonathe.org:80 203.62.1.80 - - [06/May/2019:06:28:21 +0000] "GET /webdav/ HTTP/1.1" 301 554 "-" "Mozilla/5.0"',
VHOST_COMBINED,
virtual_host="www.varonathe.org",
server_port=80,
remote_host="203.62.1.80",
remote_logname=None,
remote_user=None,
request_time=datetime(2019, 5, 6, 6, 28, 21, tzinfo=timezone.utc),
request_time_fields={
"timestamp": datetime(2019, 5, 6, 6, 28, 21, tzinfo=timezone.utc),
},
request_line="GET /webdav/ HTTP/1.1",
final_status=301,
bytes_out=554,
headers_in={
"Referer": None,
"User-Agent": "Mozilla/5.0",
},
directives={
"%v": "www.varonathe.org",
"%p": 80,
"%h": "203.62.1.80",
"%l": None,
"%u": None,
"%t": datetime(2019, 5, 6, 6, 28, 21, tzinfo=timezone.utc),
"%r": "GET /webdav/ HTTP/1.1",
"%>s": 301,
"%O": 554,
"%{Referer}i": None,
"%{User-Agent}i": "Mozilla/5.0",
},
),
mkentry(
'www.varonathe.org:80 203.62.1.80 - - [06/May/2019:06:28:21 +0000] "GET /help.php HTTP/1.1" 301 592 "-" "Mozilla/5.0 (Windows NT 5.2) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/58.0.3029.110 Safari/537.36 SE 2.X MetaSr 1.0"',
VHOST_COMBINED,
virtual_host="www.varonathe.org",
server_port=80,
remote_host="203.62.1.80",
remote_logname=None,
remote_user=None,
request_time=datetime(2019, 5, 6, 6, 28, 21, tzinfo=timezone.utc),
request_time_fields={
"timestamp": datetime(2019, 5, 6, 6, 28, 21, tzinfo=timezone.utc),
},
request_line="GET /help.php HTTP/1.1",
final_status=301,
bytes_out=592,
headers_in={
"Referer": None,
"User-Agent": "Mozilla/5.0 (Windows NT 5.2) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/58.0.3029.110 Safari/537.36 SE 2.X MetaSr 1.0",
},
directives={
"%v": "www.varonathe.org",
"%p": 80,
"%h": "203.62.1.80",
"%l": None,
"%u": None,
"%t": datetime(2019, 5, 6, 6, 28, 21, tzinfo=timezone.utc),
"%r": "GET /help.php HTTP/1.1",
"%>s": 301,
"%O": 592,
"%{Referer}i": None,
"%{User-Agent}i": "Mozilla/5.0 (Windows NT 5.2) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/58.0.3029.110 Safari/537.36 SE 2.X MetaSr 1.0",
},
),
mkentry(
'www.varonathe.org:80 203.62.1.80 - - [06/May/2019:06:28:22 +0000] "GET /java.php HTTP/1.1" 301 592 "-" "Mozilla/5.0 (Windows NT 5.2) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/58.0.3029.110 Safari/537.36 SE 2.X MetaSr 1.0"',
VHOST_COMBINED,
virtual_host="www.varonathe.org",
server_port=80,
remote_host="203.62.1.80",
remote_logname=None,
remote_user=None,
request_time=datetime(2019, 5, 6, 6, 28, 22, tzinfo=timezone.utc),
request_time_fields={
"timestamp": datetime(2019, 5, 6, 6, 28, 22, tzinfo=timezone.utc),
},
request_line="GET /java.php HTTP/1.1",
final_status=301,
bytes_out=592,
headers_in={
"Referer": None,
"User-Agent": "Mozilla/5.0 (Windows NT 5.2) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/58.0.3029.110 Safari/537.36 SE 2.X MetaSr 1.0",
},
directives={
"%v": "www.varonathe.org",
"%p": 80,
"%h": "203.62.1.80",
"%l": None,
"%u": None,
"%t": datetime(2019, 5, 6, 6, 28, 22, tzinfo=timezone.utc),
"%r": "GET /java.php HTTP/1.1",
"%>s": 301,
"%O": 592,
"%{Referer}i": None,
"%{User-Agent}i": "Mozilla/5.0 (Windows NT 5.2) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/58.0.3029.110 Safari/537.36 SE 2.X MetaSr 1.0",
},
),
]

@pytest.mark.parametrize("end", ["", "\n", "\r", "\r\n"])
def test_parse_general(end):
ENTRY = '209.126.136.4 - - [01/Nov/2017:07:28:29 +0000] "GET / HTTP/1.1" 301 521 "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/57.0.2987.133 Safari/537.36"'
parser = LogParser(COMBINED, encoding="utf-8")
assert parser.format == COMBINED
parsed = parser.parse(ENTRY + end)
assert parsed.remote_host == "209.126.136.4"
assert parsed.remote_logname is None
assert parsed.remote_user is None
assert parsed.request_time == datetime(2017, 11, 1, 7, 28, 29, tzinfo=timezone.utc)
assert parsed.request_line == "GET / HTTP/1.1"
assert parsed.final_status == 301
assert parsed.bytes_sent == 521
assert parsed.headers_in == {
"Referer": None,
"User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/57.0.2987.133 Safari/537.36",
}
assert (
parsed.headers_in["User-Agent"]
== parsed.headers_in["USER-AGENT"]
== parsed.headers_in["user-agent"]
== "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/57.0.2987.133 Safari/537.36"
)
assert parsed.entry == ENTRY
assert parsed.format == COMBINED
assert parsed.request_time_fields == {
"timestamp": datetime(2017, 11, 1, 7, 28, 29, tzinfo=timezone.utc)
}
assert parsed.directives == {
"%h": "209.126.136.4",
"%l": None,
"%u": None,
"%t": datetime(2017, 11, 1, 7, 28, 29, tzinfo=timezone.utc),
"%r": "GET / HTTP/1.1",
"%>s": 301,
"%b": 521,
"%{Referer}i": None,
"%{User-Agent}i": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/57.0.2987.133 Safari/537.36",
}

def test_parse_lines_invalid():
with (Path(__file__).with_name("data") / "vhost_combined.log").open() as fp:
entries = parse_lines(VHOST_COMBINED, fp)
assert next(entries) == VHOST_COMBINED_LOG_ENTRIES[0]
assert next(entries) == VHOST_COMBINED_LOG_ENTRIES[1]
assert next(entries) == VHOST_COMBINED_LOG_ENTRIES[2]
assert next(entries) == VHOST_COMBINED_LOG_ENTRIES[3]
with pytest.raises(InvalidEntryError) as excinfo:
next(entries)
assert str(excinfo.value) == (
"Could not match log entry 'Bad line'"
" against log format {!r}".format(VHOST_COMBINED)
)
assert excinfo.value.entry == "Bad line"
assert excinfo.value.format == VHOST_COMBINED

def test_parse_lines_ignore_invalid():
with (Path(__file__).with_name("data") / "vhost_combined.log").open() as fp:
entries = parse_lines(VHOST_COMBINED, fp, ignore_invalid=True)
assert list(entries) == VHOST_COMBINED_LOG_ENTRIES

def test_parse_default_enc(mocker):
m = mocker.patch("apachelogs.LogParser", spec=LogParser)
r = parse("%s", "200")
m.assert_called_once_with("%s", encoding="iso-8859-1", errors=None)
m.return_value.parse.assert_called_once_with("200")
assert r is m.return_value.parse.return_value

def test_parse_custom_enc(mocker):
m = mocker.patch("apachelogs.LogParser", spec=LogParser)
r = parse("%s", "200", encoding="utf-8", errors="surrogateescape")
m.assert_called_once_with("%s", encoding="utf-8", errors="surrogateescape")
m.return_value.parse.assert_called_once_with("200")
assert r is m.return_value.parse.return_value

def test_parse_lines_default_enc(mocker):
m = mocker.patch("apachelogs.LogParser", spec=LogParser)
r = parse_lines("%s", ["200"])
m.assert_called_once_with("%s", encoding="iso-8859-1", errors=None)
m.return_value.parse_lines.assert_called_once_with(["200"], False)
assert r is m.return_value.parse_lines.return_value

def test_parse_lines_custom_enc(mocker):
m = mocker.patch("apachelogs.LogParser", spec=LogParser)
r = parse_lines("%s", ["200"], encoding="utf-8", errors="surrogateescape")
m.assert_called_once_with("%s", encoding="utf-8", errors="surrogateescape")
m.return_value.parse_lines.assert_called_once_with(["200"], False)
assert r is m.return_value.parse_lines.return_value

def test_case_insensitive_dicts():
entry = parse(
"%{USER}e|%{Content-Type}i|%{flavor}C|%{ssl-secure-reneg}n"
"|%{Content-Type}o|%{Foo}^ti|%{Baz}^to|%{HTTP_USER_AGENT}x|%{errcode}c",
"www-data|application/x-www-form-urlencoded|chocolate|1|text/html"
'|Bar|Quux|Web "Browsy" Browser|-',
)
assert entry.env_vars == {"USER": "www-data"}
assert (
entry.env_vars["USER"]
== entry.env_vars["user"]
== entry.env_vars["User"]
== "www-data"
)
assert entry.headers_in == {"Content-Type": "application/x-www-form-urlencoded"}
assert (
entry.headers_in["Content-Type"]
== entry.headers_in["CONTENT-TYPE"]
== entry.headers_in["content-type"]
== "application/x-www-form-urlencoded"
)
assert entry.cookies == {"flavor": "chocolate"}
assert (
entry.cookies["flavor"]
== entry.cookies["FLAVOR"]
== entry.cookies["Flavor"]
== "chocolate"
)
assert entry.notes == {"ssl-secure-reneg": "1"}
assert (
entry.notes["ssl-secure-reneg"]
== entry.notes["SSL-SECURE-RENEG"]
== entry.notes["SSL-Secure-Reneg"]
== "1"
)
assert entry.headers_out == {"Content-Type": "text/html"}
assert (
entry.headers_out["Content-Type"]
== entry.headers_out["CONTENT-TYPE"]
== entry.headers_out["content-type"]
== "text/html"
)
assert entry.trailers_in == {"Foo": "Bar"}
assert (
entry.trailers_in["Foo"]
== entry.trailers_in["FOO"]
== entry.trailers_in["foo"]
== "Bar"
)
assert entry.trailers_out == {"Baz": "Quux"}
assert (
entry.trailers_out["Baz"]
== entry.trailers_out["BAZ"]
== entry.trailers_out["baz"]
== "Quux"
)
assert entry.variables == {"HTTP_USER_AGENT": 'Web "Browsy" Browser'}
assert (
entry.variables["HTTP_USER_AGENT"]
== entry.variables["http_user_agent"]
== entry.variables["Http_User_Agent"]
== 'Web "Browsy" Browser'
)
assert entry.cryptography == {"errcode": None}
assert (
entry.cryptography["errcode"]
is entry.cryptography["ERRCODE"]
is entry.cryptography["Errcode"]
is None
)
| 38.622108 | 236 | 0.567425 | 2,036 | 15,024 | 4.073183 | 0.107073 | 0.008682 | 0.023876 | 0.036657 | 0.854094 | 0.837815 | 0.799831 | 0.775714 | 0.722055 | 0.703123 | 0 | 0.114849 | 0.263978 | 15,024 | 388 | 237 | 38.721649 | 0.635106 | 0 | 0 | 0.502717 | 0 | 0.0625 | 0.322351 | 0.026824 | 0 | 0 | 0 | 0 | 0.141304 | 1 | 0.024457 | false | 0 | 0.01087 | 0 | 0.038043 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
5c66192f871007c001f42715a227e887cdaf22df | 24 | py | Python | dqlttt/__init__.py | SepehrV/DQL_TicTacToe | 82f9b9ad8831e50498b7cec42629ac3be6d02fda | [
"MIT"
] | 2 | 2016-06-23T01:03:31.000Z | 2017-04-24T20:12:12.000Z | dqlttt/__init__.py | SepehrV/DQL_TicTacToe | 82f9b9ad8831e50498b7cec42629ac3be6d02fda | [
"MIT"
] | null | null | null | dqlttt/__init__.py | SepehrV/DQL_TicTacToe | 82f9b9ad8831e50498b7cec42629ac3be6d02fda | [
"MIT"
] | null | null | null | from . import TicTacToe
| 12 | 23 | 0.791667 | 3 | 24 | 6.333333 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.166667 | 24 | 1 | 24 | 24 | 0.95 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
5c7edd9e6b804d56f01c6469d94c017a2fdd6836 | 107 | py | Python | src/aio_dtls/dtls/record_layer.py | businka/aio_dtls | 0dba40d425b443e5ceb516011aadf58f573a4dc8 | [
"MIT"
] | null | null | null | src/aio_dtls/dtls/record_layer.py | businka/aio_dtls | 0dba40d425b443e5ceb516011aadf58f573a4dc8 | [
"MIT"
] | null | null | null | src/aio_dtls/dtls/record_layer.py | businka/aio_dtls | 0dba40d425b443e5ceb516011aadf58f573a4dc8 | [
"MIT"
] | null | null | null | from ..tls.record_layer import RecordLayer as TlsRecordLayer

class RecordLayer(TlsRecordLayer):
pass
| 17.833333 | 60 | 0.803738 | 12 | 107 | 7.083333 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.140187 | 107 | 5 | 61 | 21.4 | 0.923913 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.333333 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 6 |
5cb77ddcf639b34e71bb014a96e840e5398c6d26 | 184 | py | Python | profiles_api/admin.py | APPIAHTECH/profiles-rest-api | 32ac950ea06071cbb2a9f0ec67bdb4cdcced5074 | [
"MIT"
] | null | null | null | profiles_api/admin.py | APPIAHTECH/profiles-rest-api | 32ac950ea06071cbb2a9f0ec67bdb4cdcced5074 | [
"MIT"
] | null | null | null | profiles_api/admin.py | APPIAHTECH/profiles-rest-api | 32ac950ea06071cbb2a9f0ec67bdb4cdcced5074 | [
"MIT"
] | null | null | null | from django.contrib import admin
from profiles_api import models
# Register your models here.
admin.site.register( models.UserProfile )
admin.site.register( models.ProfileFeedItem ) | 23 | 45 | 0.815217 | 24 | 184 | 6.208333 | 0.583333 | 0.120805 | 0.228188 | 0.308725 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.11413 | 184 | 8 | 45 | 23 | 0.91411 | 0.141304 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
7a2b8319bae34a98fb7edf9c641ce9dc40cbfc15 | 1,028 | py | Python | alpyro_msgs/turtle_actionlib/shapeactionfeedback.py | rho2/alpyro_msgs | b5a680976c40c83df70d61bb2db1de32a1cde8d3 | [
"MIT"
] | 1 | 2020-12-13T13:07:10.000Z | 2020-12-13T13:07:10.000Z | alpyro_msgs/turtle_actionlib/shapeactionfeedback.py | rho2/alpyro_msgs | b5a680976c40c83df70d61bb2db1de32a1cde8d3 | [
"MIT"
] | null | null | null | alpyro_msgs/turtle_actionlib/shapeactionfeedback.py | rho2/alpyro_msgs | b5a680976c40c83df70d61bb2db1de32a1cde8d3 | [
"MIT"
] | null | null | null | from typing import Final
from alpyro_msgs import RosMessage
from alpyro_msgs.actionlib_msgs.goalstatus import GoalStatus
from alpyro_msgs.std_msgs.header import Header
from alpyro_msgs.turtle_actionlib.shapefeedback import ShapeFeedback

class ShapeActionFeedback(RosMessage):
__msg_typ__ = "turtle_actionlib/ShapeActionFeedback"
__msg_def__ = "c3RkX21zZ3MvSGVhZGVyIGhlYWRlcgogIHVpbnQzMiBzZXEKICB0aW1lIHN0YW1wCiAgc3RyaW5nIGZyYW1lX2lkCmFjdGlvbmxpYl9tc2dzL0dvYWxTdGF0dXMgc3RhdHVzCiAgdWludDggUEVORElORz0wCiAgdWludDggQUNUSVZFPTEKICB1aW50OCBQUkVFTVBURUQ9MgogIHVpbnQ4IFNVQ0NFRURFRD0zCiAgdWludDggQUJPUlRFRD00CiAgdWludDggUkVKRUNURUQ9NQogIHVpbnQ4IFBSRUVNUFRJTkc9NgogIHVpbnQ4IFJFQ0FMTElORz03CiAgdWludDggUkVDQUxMRUQ9OAogIHVpbnQ4IExPU1Q9OQogIGFjdGlvbmxpYl9tc2dzL0dvYWxJRCBnb2FsX2lkCiAgICB0aW1lIHN0YW1wCiAgICBzdHJpbmcgaWQKICB1aW50OCBzdGF0dXMKICBzdHJpbmcgdGV4dAp0dXJ0bGVfYWN0aW9ubGliL1NoYXBlRmVlZGJhY2sgZmVlZGJhY2sKCg=="
__md5_sum__ = "aae20e09065c3809e8a8e87c4c8953fd"
header: Header
status: GoalStatus
feedback: ShapeFeedback
| 64.25 | 578 | 0.92607 | 53 | 1,028 | 17.528302 | 0.45283 | 0.043057 | 0.06028 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.081967 | 0.050584 | 1,028 | 15 | 579 | 68.533333 | 0.869877 | 0 | 0 | 0 | 0 | 0 | 0.610895 | 0.610895 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.416667 | 0 | 1 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
7a2fc93c3456f3785874e73bfa8a259aae9bf9d5 | 32 | py | Python | thedatamine/__init__.py | TheDataMine/thedatamine_py | 05c0302040a0abfa9d80f7ef8cf3ba18bbdd88ea | [
"Apache-2.0"
] | null | null | null | thedatamine/__init__.py | TheDataMine/thedatamine_py | 05c0302040a0abfa9d80f7ef8cf3ba18bbdd88ea | [
"Apache-2.0"
] | 1 | 2021-11-30T21:23:35.000Z | 2021-11-30T21:23:35.000Z | thedatamine/__init__.py | TheDataMine/thedatamine_py | 05c0302040a0abfa9d80f7ef8cf3ba18bbdd88ea | [
"Apache-2.0"
] | null | null | null | from .core import hello_datamine | 32 | 32 | 0.875 | 5 | 32 | 5.4 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.09375 | 32 | 1 | 32 | 32 | 0.931034 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |